Activision edges out Sony and Nintendo in August's TV ad spend – VentureBeat

Gaming brands upped their outlay on TV advertising in August by 26.66% compared to July, for an estimated spend of $22.5 million. There was almost a three-way tie for top-spending brands, with Activision edging out longtime chart leader PlayStation. In total, 11 brands aired 43 spots over 5,000 times, resulting in 1.1 billion TV ad impressions. Aside from Nintendo, each of the top brands targeted sports programming, especially NBA and MLB games, for ads during the month.

GamesBeat has partnered with iSpot.tv, the always-on TV ad measurement and attribution platform, to bring you a monthly report on how gaming brands are spending. The results below are for the top five gaming-industry brands in August, ranked by estimated national TV ad spend.

Activision spent an estimated $6.2 million airing a single spot for Call of Duty: Warzone, "User Reviews," 627 times, resulting in 215.3 million TV ad impressions. The brand prioritized reaching a sports-loving audience: top programming by outlay included the NBA, NHL, and MLB, while top networks included TNT, NBC Sports, and Fox.

PlayStation takes second place with an estimated spend of $5.8 million on four ads that ran 754 times, generating 214.7 million TV ad impressions. Most of the spend and impressions occurred in the second half of the month. The spot with the biggest spend (estimated at $3.8 million) was "Cannot Be Controlled," promoting the Marvel's Avengers game. ESPN, Adult Swim, and Comedy Central were three of the networks with the biggest outlay, while top programming included MLB, NBA, and South Park.

At No. 3: Nintendo, with an estimated spend of $4.9 million on 20 commercials that aired over 1,900 times, resulting in 355.8 million TV ad impressions. The top spot by spend (estimated at $677,351) was "She's My Favorite: Animal Crossing." Programs with the biggest outlay included SpongeBob SquarePants, The Loud House, and The Amazing World of Gumball; top networks included Nick, Cartoon Network, and Bravo.

Fourth place goes to Crystal Dynamics, which hadn't advertised on TV at all this year until August 20. The brand spent an estimated $3.2 million airing two ads, both for the Marvel's Avengers game, 397 times, generating 116.6 million TV ad impressions. "It's Time to Assemble" had the biggest outlay, an estimated $1.8 million. Three of the top programs by spend were the NBA, South Park, and MLB; top networks included ESPN, Adult Swim, and Comedy Central.

Rounding out the ranking is MLB Advanced Media Video Games with an estimated outlay of $825,253 on two spots that aired 323 times, resulting in 49.5 million TV ad impressions. "Home Runs," advertising R.B.I. Baseball 20, had the most spend (estimated at $729,251). Most of its outlay went to MLB games, but Ancient Top 10 and Baseball Tonight: Sunday Night Countdown were also in the mix. On the network side, the brand prioritized Fox Sports 1, ESPN, and Fox.

For more about iSpot's attention and conversion analytics, visit iSpot.tv.


All you need to know about the Indian AI Stack – MediaNama.com

A committee under the Department of Telecommunications has released a draft framework of the Indian Artificial Intelligence Stack, which seeks to remove impediments to AI deployment and essentially proposes setting up a six-layered stack, with each layer handling different functions, including consent gathering, storage, and AI/Machine Learning (AI/ML) analytics. Once developed, the stack will be applied across all sectors and is meant to ensure data protection, data minimisation, open algorithm frameworks, defined data structures, trustworthiness and digital rights, and data federation (a single database source for front-end applications), among other things. The paper also noted that there is no uniform definition of AI.

This committee, the AI Standardisation Committee, had, in October last year, invited papers on Artificial Intelligence addressing different aspects of AI, such as functional network architecture, AI architecture, and the data structures required, among other things. At the time, the DoT had said that as the proliferation of AI increases, there is a need to develop an Indian AI stack to bring interoperability, among other things. Here is a summary of the draft Indian AI Stack; comments on it can be emailed to aigroup-dot@gov.in or diradmnap-dot@gov.in until October 3.

The stack will be made up of five main horizontal layers, and one vertical layer:

This is the root layer of the Indian AI stack, over which the entire AI functionality is built. The layer will ensure the setting up of a common data controller and will involve multi-cloud scenarios spanning both private and public clouds. This is where the infrastructure for data collection will be defined. The multilayer cloud services model will define the relations between cloud service models and the other functional layers:

This layer will have to define the protocols and interfaces for storing hot data, cold data, and warm data (all three defined below). The paper called this the most important layer in the stack, regardless of the size and type of data, since value can only be derived from data once it is processed, and data can only be processed efficiently when it is stored properly. "It is important to store data safely for a very long time while managing all factors of seasonality and trends, ensuring that it is easily accessible and shareable on any device," the paper said.

The paper has created three subcategories of data depending on the relevance of data and its usability:

Categories of data
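The hot/warm/cold split described above can be sketched as a simple tiering rule. Note that the paper defines the three categories in a figure not reproduced here, so the access-age thresholds below are illustrative assumptions, not values from the draft:

```python
# Hypothetical sketch of the hot/warm/cold data tiers used by the storage
# layer. Thresholds are illustrative assumptions, not taken from the paper.
from datetime import datetime, timedelta
from typing import Optional

def storage_tier(last_accessed: datetime, now: Optional[datetime] = None) -> str:
    """Classify data by how recently it was accessed."""
    now = now or datetime.utcnow()
    age = now - last_accessed
    if age <= timedelta(days=7):
        return "hot"    # frequently accessed; kept on fast storage
    if age <= timedelta(days=90):
        return "warm"   # occasionally accessed
    return "cold"       # rarely accessed; archival storage
```

In practice, the storage layer would route each tier to different storage classes (for example, in-memory or SSD for hot data, object or tape storage for cold data).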

This layer, through a set of defined protocols and templates, ensures an open algorithm framework. The AI/ML process could involve Natural Language Processing (NLP), deep learning, and neural networks. This layer will also define data analytics, which includes data engineering, focusing on practical applications of data collection and analysis, apart from scaling and data ingestion. Technology mapping and rule execution will also be part of this layer.

The paper acknowledged the need for a proper data protection framework: the compute layer involves analysis to mine vast troves of personal data and find correlations, which will then be used for various computations. This raises various privacy issues, as well as broader issues of lack of due process, discrimination, and consumer protection.

"The data so collected can shed light on most aspects of individuals' lives. It can also provide information on their interactions and patterns of movement across physical and networked spaces and even on their personalities. The mining of such large troves of data to seek out new correlations creates many potential uses for Big Personal Data. Hence, there is a need to define a proper data protection mechanism in this layer along with suitable data encryption and minimisation." (from the paper)

The compute layer will also define a new way to build and deploy enterprise service-oriented architectures, along with providing a transparent computing architecture over which the industry could develop its own analytics. It will have to provide for a distinction between public, shared, and private data sources, so that machine learning algorithms can be applied against the relevant data fields.

The report also said that the NITI Aayog has proposed an AI-specific cloud compute infrastructure, which will facilitate research and solution development using high-performance, high-throughput AI-specific supercomputing technologies. The broad specifications for this proposed cloud controller architecture may include:

Proposed architecture of the AI-specific controller

The paper described this as a purpose-built layer through which software and applications can be hosted and executed as a service. This layer will support various backend services for the processing of data and will provide a proper service framework for the AI engine to function. It will also keep track of all transactions across the stack, helping in logging and auditing activities.

This layer will define the end-customer experience through defined data structures and proper interfaces and protocols. It will have to support a proper consent framework for access to data by or for the customer. Provision for consent can be for individual data fields or for collective fields. This layer will also host gateway services. "Typically, different tiers of consent will be made available to accommodate different tiers of permissions," the paper said.
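A tiered, per-field consent check of the kind described here could look like the following sketch. The tier names and data fields are assumptions made for illustration; the draft does not enumerate them:

```python
# Illustrative sketch of tiered, per-field consent for the application layer.
# Tier names and field names are hypothetical, not taken from the draft paper.

CONSENT_TIERS = {
    "none": set(),
    "basic": {"name", "email"},
    "extended": {"name", "email", "location"},
    "full": {"name", "email", "location", "health_records"},
}

def may_access(field: str, granted_tier: str) -> bool:
    """Return True if the consent tier granted by the user covers the field."""
    return field in CONSENT_TIERS.get(granted_tier, set())
```

Here each higher tier is a superset of the lower ones, mirroring the paper's idea that consent can be given for individual fields or for collective groups of fields.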

This layer also needs to ensure that ethical standards are followed to ensure digital rights. In the absence of a clear data protection law in the country, the EU's General Data Protection Regulation (GDPR) or any such law can be applied; this will serve as an interim measure until Indian laws are formalised, the paper said.

This layer will ensure the process of security and governance for all the preceding five horizontal layers. There will be an overwhelming flow of data through the stack, which is why there is a need to ensure encryption at different levels, the paper said. This may require setting up the ability for handling multiple queries in an encrypted environment, among other things. Cryptographic support is also an important dimension of the security layer, the paper said.

Why this layer is important, per the paper: data aggregated, transmitted, stored, and used by various stakeholders may increase the potential for discriminatory practices and pose substantial privacy and cybersecurity challenges. The data processed and stored in many cases includes geolocation information, product-identifying data, and personal information related to use or owner identity, such as biometric data, health information, or smart-home metrics.

"Data storage in backend systems can present challenges in the protection of data from cyberattacks. In addition to personal-information privacy concerns, there could be data used in system operation which may not typically be personal information. Cyber attackers could misuse these data by compromising data availability or changing data, causing data integrity issues, and could use big data insights to reinforce or create discriminatory outcomes. When data is not available, causing a system to fail, it can result in damage; for example, a smart home's furnace overheats or an individual's medical device cannot function when required." (from the paper)

What the proposed AI stack looks like

According to the report, the key benefits of this proposed AI stack are:

This is how the paper proposes data will flow through the stack:

Proposed AI flowchart

In AI, the thrust is on how efficiently data is used, the paper said, noting that if the data is garbage, then the output will be garbage too. For example, if programmers or AI trainers transfer their biases to the AI, the system will become biased, the paper said. There is a need for evolving ethical standards, trustworthiness, and a consent framework to get data validation from users, the paper suggested.

The risks of passive adoption of AI that automates human decision-making are also severe. Such delegation can lead to harmful, unintended consequences, especially when it involves sensitive decisions or tasks and excludes human supervision, the paper said. It cited Microsoft's Twitter chatbot Tay as an example of what can happen when garbage data is fed into an AI system: Tay started tweeting racist and misogynist remarks within 24 hours of its launch.

Need for openness in AI algorithms: The paper said it was necessary to have an open AI algorithm framework, along with clearly defined data structures. It referenced how the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) software, used by some US courts to predict the likelihood of recidivism in criminal defendants, was demonstrated to be biased, since the AI black box was proprietary.

As AI learns to address societal problems, it also develops its own hidden biases. "The self-learning nature of AI means the distorted data the AI discovers in search engines, perhaps based upon unconscious and institutional biases and other prejudices, is codified into a matrix that will make decisions for years to come. In the pursuit of being the best at its task, the AI may make decisions it considers the most effective or efficient for its given objective, but because of the wrong data, it becomes unfair to humans," the report said.

Need to centrally control data: Right after the paper made a pitch for openness in AI algorithms, it proposed that the data fed into the AI system should be controlled centrally. "The data from which the AI learns can itself be flawed or biased, leading to flawed automated AI decisions. This is certainly not the intention of algorithmised decision-making, which is perhaps a good-faith attempt to remove unbridled discretion and its inherent biases. There is thus a need to ensure that the data is centrally controlled, including using a single or multiple cloud controllers," the report said.

Proper storage frameworks for AI: An important factor aiding biases in AI systems is contamination of data, per the paper, which includes missing information, inconsistent data, or simply errors. This could be because of unstructured storage of data. Thus, there is a need to ensure proper storage frameworks for AI, it said.

Changing the culture of coders and developers: There is a need to change the culture so that coders and developers themselves recognise the harmful and consequential implications of biases, the paper said, adding that this goes beyond standardisation of the type of algorithmic code and focuses on the programmers of the code. Since much coding is outsourced, this would place the onus on the company developing the software product to enforce such standards. "Such a comprehensive approach would tackle the problem across the industry as a whole, and enable AI software to make fair decisions based on unbiased data, in a transparent manner," it added.

In the near future, AI will have huge implications for the country's security, its economic activities, and society. The risks are unpredictable and unprecedented. Therefore, it is imperative for all countries, including India, to develop a stack that fits into a standard model, one which protects customers, users, business establishments, and the government.

Economic impact: AI will have a major impact mainly on four sectors, per the paper: manufacturing industries, professional services, financial services, and wholesale and retail. The paper also charted out how AI could be used in some specific sectors. For instance, in healthcare, it said that in rural areas, which suffer from limited availability of healthcare professionals and facilities, AI could be used for diagnostics, personalised treatment, early identification of potential pandemics, and imaging diagnostics, among other uses.

Similarly, in the banking and financial services sector, AI can be used for things like the development of credit scores through analysis of bank history or social media data, and fraud analytics for proactive monitoring and prevention of various instances of fraud, money laundering, and malpractice, as well as the prediction of potential risks, according to the report.

Uses for the government: For governments, for example, cybersecurity attacks can be rectified within hours rather than months, and national spending patterns can be monitored in real time to instantly gauge inflation levels whilst collecting indirect taxes.


Sudbury to hold first drive-in concert – Sherwood Park News

Despite there being no summer festival in July, the Northern Lights Festival Boréal team has been steadily planning new ways to bring live music experiences to Sudbury.

Last week, the organization was thrilled to announce NLFB #49, a diverse and exciting presentation of festival programming in alternative formats.

The long-running music and arts festival has been re-introducing live music in ways that are safe, responsible and fun. This special festival #49 programming culminates in the regions first-ever drive-in concert event, featuring some past festival favourites, as well as a few new faces.

The Sept. 19 concert will include Canadian roots-pop icon Serena Ryder, dynamic songwriter/performer Hawksley Workman, Toronto roots/folk/soul artist Julian Taylor, as well as locals Martine Fortin and Maxwell José.

The event will take place in partnership with Horizon Drive-in, at the New Sudbury Centre parking lot (1349 Lasalle Blvd.). Tickets are available online only at nlfb.ca/tickets.

Ryder is an artist adored by fans, peers and critics alike, in part due to her raw and earnest songwriting, and beautifully electric live performances. She has received numerous accolades, including six prestigious Juno Awards, a MuchMusic Video Award for "Stompa," and a Canadian Screen Award for Achievement in Music: Original Song.

Before her chart-smashing album Harmony (2013), she also enjoyed success with the previous releases If Your Memory Serves You Well (2007) and Is it O.K. (2009), both achieving Gold-selling status.

In 2012, her single "Weak In The Knees" also achieved Gold certification. Ryder's Christmas Kisses was named one of the Top 5 Christmas records of 2018 by Rolling Stone. She has also received the 2018 Margaret Trudeau Mental Health Advocacy Award and has been the face of the Bell Let's Talk campaign for multiple years.

A staple of the Canadian arts scene for almost 20 years, Hawksley Workman boasts a catalogue of 15 solo releases showcasing his now-signature spectrum of sonic influence, from cabaret to electro-pop to anthemic rock and plenty in between.

The accolades amassed include JUNO nods and wins, and widespread critical acclaim. As a producer, his fingerprints grace releases by Juno and Polaris Prize nominees and winners like Tegan and Sara, Sarah Slean, Serena Ryder, Hey Rosetta!, and Great Big Sea.

He's also penned melodies with a myriad of artists, from Oscar winner Marion Cotillard (La Vie en Rose, Inception) to French rock icon Johnny Hallyday.

Hawksley's touring career has seen him play nearly a thousand shows worldwide. He's headlined prestigious venues like Massey Hall in Toronto and The Olympia in Paris, and opened for heroes Morrissey, David Bowie, and The Cure.

Julian Taylor doesn't fit in a box. He never has, and more power to him. A Toronto music scene staple and a musical chameleon, Taylor is used to shaking it up over the course of 10 albums in the last two decades.

Of West Indian and Mohawk descent, Taylor first made his name as frontman of Staggered Crossing, a Canadian rock radio staple in the early 2000s. These days, however, the soulful singer/guitarist might be on stage one night playing with his eponymous band, spilling out electrified rhythm and blues glory, and the next he'll be performing at a folk festival, delivering a captivating solo singer-songwriter set.

Martine Fortin is a bilingual singer-songwriter from Sudbury and a past winner of NLFB's annual Meltdown Competition. Her music is a blend of pop, jazz, blues, soul, and rock, combined with intimate, introspective lyrics and moving piano melodies. She will perform a few of her songs near the start of the evening.

Walking the line between country and folk, Maxwell José's songs draw from his experience growing up both in the North, on Lake Superior, and in southern Illinois. Anxiety, growing pains, and some good old-fashioned storytelling are key elements of his tunes. He will open the event by sharing a few of these songs.

Gates open at 6 p.m., and vehicles are asked to arrive at that time to ensure vehicle placement for showtime. Tickets are $30 in advance and $40 at the gate.

Due to safety protocols around COVID-19 and general health and safety, concert-goers must remain in their vehicles during the show. For any questions regarding tickets, protocols, or the event in general, contact the NLFB team at marketing@nlfb.ca or 705-674-5512.

sud.editorial@sunmedia.ca

Twitter: @SudburyStar


Shadow banning and its role in modern day censorship – Cherwell Online

It is no secret that algorithms dominate our online social lives. It is not as if we aren't making our own decisions about who we talk to or what media we consume, but it would be wilfully ignorant to ignore how these systems have been programmed to categorise, collect, and suggest data based on our likes and follows. This exposes us to content, people, and ideas that we would not have found on our own, but it begs the question: how much control do these systems have in restricting what we see?

This brings us to shadow banning.

Shadow banning is the decision of a social media platform to partially or wholly obstruct a person's content from being interacted with. Preventing new people from finding your content in search, ensuring you do not appear under hashtags, or limiting how often you are suggested as a person to follow are just a few ways this can be achieved. Platforms such as Instagram and TikTok rarely acknowledge claims of this nature, instead pointing to their right to remove posts that do not align with their Community Guidelines, and noting that agreeing to use the platform is consenting to their power to do so.

In the grand scheme of things, having your videos taken down or fewer people finding and engaging with your content is not the greatest detriment to the world, but there is a significant pattern to who is being shadow banned. To refer back to TikTok's community guidelines, the platform claims to scrap videos created to facilitate harm to others, but within the guidelines it makes an effort to reiterate that it allows "educational, historical, satirical, artistic, and other content that can be clearly identified as counterspeech or aims to raise awareness of the harm caused by dangerous individuals and/or organisations." This quote, and TikTok's statement of support for the Black Lives Matter movement, will come as a surprise, especially to the many black creators who have seen their engagement rates fall and their videos taken down on the app.

Instagram has shown itself to be just as complicit: there has been significant backlash from sex workers, sex educators, and often queer-inclusive, sex-positive spaces on the app. Chante Joseph, in her Guardian piece, exposed the grey area, not as clearly defined as Instagram's no-nudity policy, in which administrators can flag content as "sexually suggestive." Many people argue that this is necessary to ensure children are not exposed to inappropriate content, but rather than parents taking accountability, or social media platforms at least attempting to introduce any form of age restriction, the onus is placed on creators. Consider, for example, LGBTQIA+ creators: their accounts provide information that young people, who may not even have come out to themselves, would not otherwise be able to access, letting them process and understand their feelings in a healthy space that wasn't available just a decade ago. In essence, these guidelines about what a person is allowed to share are being defined by some arbitrary moral standard where discussions of sex, specifically those outside the realm of the heteronormative, are something to be protected from, even though there are very few spaces that allow for them in real life either.

Instagram, Twitter, TikTok, Facebook: all are often steeped in their reputation for being superficial and resting on the self-gratification of people wanting to be seen (which isn't even a bad thing in itself), but beyond that they can be used to share ideas, political thoughts, and knowledge. So when black creators attempting to inform the masses are restricted from sharing information, or when sex workers' messages on misogyny are inaccessible because their page is considered too "sexually suggestive" (a term not defined and therefore difficult to avoid), the silence is deafening. Shadow banning is a threat to us because it maintains the illusion of control. Yet the whole idea is synonymous with censorship and the obstruction of information. Further, this obstruction is dictated by what platforms see as appropriate, so the power we assumed we had in our voices can still be silenced.


What’s the state of quantum computing? Led by IBM & Amazon it’s developing rapidly – WRAL Tech Wire

Editor's note: Stephanie Long is Senior Analyst with Technology Business Research.

HAMPTON, N.H. – As it did with its Selectric typewriters in the 1960s, IBM is successfully weaving its quantum computing thread through myriad aspects of the greater quantum ecosystem, underpinned by strategic sponsorships and the inclusion of partners in the IBM Quantum Experience.

Amazon Web Services (AWS) is pushing back on this approach by offering a vendor-agnostic view of quantum cloud computing.

Academia has also thrown its hat into the ring with ongoing innovation and advancements in quantum computing.

The competitive landscape of quantum computing has begun to take on the look and feel of the early classical computing world; however, the modern industry has addressed the mistakes made with classical computing, and therefore progress can be more formulaic and swift.

August 2020 developments are starting to tie pieces of investments together to show a glimpse of when the post-quantum world may come, and as advancements continue the future state appears closer on the horizon than previously thought.


If you would like more detailed information around the quantum computing market, please inquire about TBR's Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our most recent version, which focused on services, was released in June. Look for our next iteration in December, focused on middleware.

(C) TBR


Global Scale of the Quantum Computing Opportunity – Quantaneo, the Quantum Computing Source

The quantum computing economy is real and growing

IBM (NYSE: IBM) is a headline sponsor of London Tech Week, with Bob Sutor, VP IBM Quantum Ecosystem Development, IBM Research, emphasising the collaborative approach of IBM's Q Network towards continued development of the quantum computing ecosystem. Archer is a member of the global IBM Q Network and, as part of an agreement with IBM, plans to use Qiskit as the software stack for its 12CQ qubit processors. Archer aims to build the 12CQ chip for quantum computing operation at room temperature and integration onboard modern electronic devices.

Sutor sent a clear message to sceptics of quantum computing, highlighting some extraordinary stats on the rapid user uptake of IBM's quantum tech solutions: in four years, IBM's Qiskit quantum development platform has grown to 250,000+ registered users, and over 1 billion quantum hardware circuits are now being run on IBM's quantum computers each day.

Other giants are also involved in the quantum economy. Daniel Franke from Merck Ventures, the strategic corporate venture capital arm of the pharmaceutical giant Merck (NYSE: MRK), updated delegates on its efforts to integrate with the emerging global quantum research ecosystem. Merck's approach saw the formation of numerous partnerships with start-ups, industry peers, and academia, with over 50 staff dedicated to a quantum computing taskforce focused on what they dubbed "performance materials" in the life sciences and pharmaceutical arena.

A positive quantum disruption to entire economies

UK Parliamentary Under-Secretary of State for Science, Research and Innovation Amanda Solloway highlighted the UK's National Quantum Technology Programme, which is set to attract more than £1 billion (A$1.8 billion) of public and private investment over its 10-year duration. Much of this investment over the next five years is intended to boost the UK's thriving technology ecosystem post-COVID-19 with infrastructure that is quantum best-in-class globally, to develop the UK's first commercially available quantum computer, and to build new infrastructure including the National Quantum Computing Centre (NQCC).

Quantum hardware: the new Smart Tech

There was a bold consensus among panellists, involving UK-based start-ups and a number of global players in the quantum computing space: a move to hybrid computing over the next five years and full quantum computing over the next ten years. These time horizons come with the caveat of the need to progress quantum computing technology, including potential solutions to practical quantum computing, e.g. overcoming the commercial limitations posed by the excessive cooling requirements of current quantum computers.

Progress in technology development a key market catalyst

A year ago, delegates (including Archer) at the Quantum.Tech conference in Boston, USA, heard a myriad of venture capitalist concerns about a "quantum winter" and the inconvenience of quantum technology's deep-tech time-to-market, all compounded by uncertainties in market size. Now, at the Quantum Summit, corporate venture challenges appear to be shifting towards a potential need to reframe a 1-to-2-year risk appetite as a deep-tech, value-driven 5-to-10-year framework. This is to better capitalise on the global scale of opportunity that quantum computing is now beginning to rapidly validate. It is clear that quantum computing is not just a faster computer. Even though early-stage quantum computing applications are not yet general purpose, examples of disruptive enterprise-scale solutions span the globally relevant industries of life sciences, finance, and telecommunications.
We are excited to participate in the upcoming sessions of London Tech Week, particularly as invited delegates of the Virtual Mission (Australian companies), which begins tonight, and I look forward to updating our shareholders on key outcomes at the conclusion of London Tech Week.


The Quantum Dream: Are We There Yet? – Toolbox

The emergence of quantum computing has led industry heavyweights to fast-track their research and innovations. This week, Google conducted the largest chemical simulation on a quantum computer to date. The U.S. Department of Energy, for its part, launched five new Quantum Information Science (QIS) Research Centers. Will this accelerate quantum computing's progress?

Quantum technology is the next big wave in the tech landscape. In traditional computers, all information (emails, tweets, YouTube videos, and Facebook photos) is encoded as streams of electrical pulses representing binary digits, 1s and 0s. Quantum computers instead rely on quantum bits, or qubits, to store information. Qubits are subatomic particles, such as electrons or photons, that can exist in a superposition of states; in effect, they can be 1 and 0 at the same time. This enables quantum computers to run multiple complex computational tasks simultaneously and faster than digital computers, mainframes, and servers.
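The superposition idea can be sketched with plain linear algebra: a qubit's state is a two-component complex vector, and the probability of measuring 0 or 1 is the squared magnitude of each amplitude (the Born rule). This is a minimal NumPy illustration, not a quantum SDK:

```python
import numpy as np

# Equal superposition of |0> and |1>, the state a Hadamard gate produces from |0>
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Born rule: measurement probabilities are the squared magnitudes of the amplitudes
probabilities = np.abs(state) ** 2

# Both outcomes are equally likely (0.5 each): until measured, the qubit
# behaves as if it were "1 and 0 at the same time"
```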

Introduced in the 1980s, quantum computing can unlock complexities across different industries much faster than traditional computers. A quantum computer could decipher complex encryption systems, which would impact the digital banking, cryptocurrency, and e-commerce sectors that depend heavily on encrypted data. Quantum computers can expedite the discovery of new medicines, aid in tackling climate change, power AI, transform logistics, and design new materials. In the U.S., technology giants including IBM, Google, Honeywell, Microsoft, Intel, IonQ, and Rigetti Computing are leading the race to build quantum computers and gain a foothold in the quantum computing space, whereas Alibaba, Baidu, and Huawei are the leading companies in China.

For a long time, the U.S. and its allies, such as Japan and Germany, had been working hard to compete with China to dominate the quantum technology space. In 2018, the U.S. government released the National Strategy Overview for Quantum Information Science to reduce technical skills gaps and accelerate quantum computing research and development.

In 2019, Google claimed quantum supremacy when the company's Sycamore processor performed a specific task in 200 seconds that would have taken a supercomputer 10,000 years to complete. In the same year, Intel rolled out Horse Ridge, a cryogenic quantum control chip, to reduce quantum computing complexity and accelerate quantum practicality.

What's 2020 Looking Like for Quantum Computing?

In July 2020, IBM announced a research partnership with Japanese businesses and academia to advance quantum computing innovation. The alliance will deepen ties between the two countries and build an ecosystem to improve quantum skills and advance research and development.

A month earlier, in June 2020, Honeywell announced the development of what it called the world's highest-performing quantum computer. AWS, Microsoft, and several other cloud providers have announced quantum cloud services to advance quantum computing adoption. In August 2020, AWS announced the general availability of Amazon Braket, a quantum cloud service that allows developers to design, develop, test, and run quantum algorithms.

Since last year, auto manufacturers such as Daimler and Volkswagen have been leveraging quantum computers to identify new ways to improve electric-vehicle battery performance. Pharmaceutical companies are also using the technology to develop new medicines and drugs.

Last week, the Google AI Quantum team used its Sycamore processor to simulate changes in the configuration of the chemical molecule diazene. During the process, the computer accurately described the changes in the positions of the hydrogen atoms, and it also gave an accurate description of the binding energy of hydrogen in bigger chains.

If quantum computers develop the ability to predict chemical processes, they would advance the development of a wide range of new materials with as-yet unknown properties. Unfortunately, current quantum computers lack the scale required for such a task. Although today's machines are not ready for that challenge, computer scientists hope to get there in the near future as tech giants like Google invest in quantum computing research.

It therefore came as a relief to many computer scientists when the U.S. Department of Energy announced an investment of $625 million over the next five years in five newly formed Quantum Information Science (QIS) Research Centers in the U.S. The new hubs are an amalgam of research universities, national labs, and tech titans in quantum computing. The five hubs are led, respectively, by the Energy Department's Argonne, Oak Ridge, Brookhaven, Fermi, and Lawrence Berkeley National Laboratories, with industry partners including Microsoft, IBM, Intel, Rigetti, and ColdQuanta. The partnership aims to advance the commercialization of quantum computing.

Chetan Nayak, general manager of Quantum Hardware at Microsoft, says: "While quantum computing will someday have a profound impact, today's quantum computing systems are still nascent technologies. To scale these systems, we must overcome a number of scientific challenges. Microsoft has been tackling these challenges head-on through our work towards developing topological qubits, classical information processing devices for quantum control, new quantum algorithms, and simulations."

At the start of this year, Daniel Newman, principal analyst and founding partner at Futurum Research, predicted that 2020 would be a big year for investors and Silicon Valley to put money into quantum computing companies: "It will be incredibly impactful over the next decade, and 2020 should be a big year for advancement and investment."

Quantum computing is still in the development phase, and the shortage of suppliers and skilled researchers could slow its establishment. But if tech giants and researchers continue to collaborate at scale, quantum technology can turbocharge innovation.

Originally posted here:
The Quantum Dream: Are We There Yet? - Toolbox

How Amazon Quietly Powers The Internet – Forbes

Amazon (AMZN)

What was the last thing you heard about Amazon (AMZN)?

Let me guess. Its battle with Walmart? Or was it the FAA's approval of Amazon's delivery drones? Most of this news about Amazon's store is just noise that distracts investors from Amazon's real force.

As I'll show, Amazon is running an operating system that powers some of today's most important technologies, such as virtual reality, machine learning, and even quantum computing. Behind the scenes, it is used by over a million companies, including tech giants Apple, Netflix, and Facebook.

This is Amazon's key and ever-growing moneymaker, the one that has been driving Amazon stock to the moon. But before I pull back the curtain, let's step back for a moment.

First, how Amazon makes money, for real

For all the online shopping fuss, Amazon doesn't earn much from its store. Yes, Amazon.com flips hundreds of billions of dollars' worth of products every year, and its revenues are on a tear. But Amazon turns only a sliver of that into profits.

In the past year, Amazon's store generated a record $282 billion in revenue from Amazon.com. That translated into just $5.6 billion in profits, and keep in mind that was Amazon.com's most profitable year ever.

Meanwhile, most of Amazon's profits came from the lesser-known side of its business, Amazon Web Services (AWS), as you can see below:

[Chart: Amazon's profits from AWS vs. Amazon.com]

It's Amazon's cloud arm that is serving over a million companies across the world. You may have heard that AWS has something to do with storing data in the cloud. But it's much, much more than that.

AWS is the operating system of the internet

To get an idea of how AWS works, take your computer as an example.

Like every other computer, it runs on an operating system, such as Windows or macOS, which comes with a set of programs. This software puts your computer's resources to use and helps you carry out daily tasks, such as sending emails or sorting your files.

Now, think of AWS as an operating system that's running not one but hundreds of thousands of big computers (in tech lingo: servers). It gives companies nearly unlimited computing power and storage, as well as tools to build and run their software on the internet.

The difference is that these big computers sit in Amazon's data centers, and companies work on them remotely, via the cloud. In other words, AWS is like the operating system of the internet.

Amazon's operating system now powers AI, blockchain, and other next-gen technologies

When AWS first started out in the mid-2000s, it offered only a couple of basic cloud services, such as storage and message queuing. Today, it offers an unmatched set of 175+ tools that help companies build software harnessing today's top technologies.

The list includes blockchain, VR, machine learning (AI), quantum computing, augmented reality (AR), and other technologies that are the building blocks of today's internet.

For example, Netflix uses AWS for more than simply storing and streaming its shows on the internet. It also employs AWS machine learning technology to recommend movies and shows to you.

You've also probably heard of Slack (WORK), the most popular messaging app for business. Slack recently announced it will use Amazon's media technology to introduce video and audio calls in its app.

And it's not just tech companies that are utilizing Amazon's AWS tools.

Take GE Power. The world's energy leader is using AWS analytics technology to store and sift through avalanches of data from its plants. Or Fidelity: America's mutual fund giant is experimenting with Amazon's VR technology to build VR chat rooms for its clients.

In a picture, Amazon's AWS works like this:

[Chart: How Amazon's AWS powers the internet]

Amazon's AWS is earning more and more... and more

Amazon is not the only company running a cloud service. Google, Microsoft, Alibaba, IBM, and other tech giants are all duking it out for a slice of this lucrative business. But Amazon's is the biggest and most feature-rich.

Today, Amazon controls 33% of the market, leaving its closest competitors, Microsoft (second, with 18%) and Google (third, with 9%), far behind in the dust. That means nearly one-third of the internet is running on Amazon's AWS.

And it doesn't appear that Amazon will step down from its cloud throne anytime soon. Amazon's sales from AWS soared 10X in the past six years. And last year, Amazon reported a bigger sales gain from AWS (dollar-wise) than any other cloud company.

Here's the main takeaway for investors

If you are looking into Amazon stock, don't get caught up in the online shopping fuss.

For years, AWS has been the linchpin of Amazon's business. This invisible side of Amazon is where its largest gears turn.

The problem is, AWS is like a black box. Amazon reports very little on its operations. So if you want to dig deeper, you'll have to do your own research.

You'll also have to weigh a couple of risks before putting your money into Amazon stock:

Other than that, Amazon is an outstanding stock, killing it in one of the most lucrative businesses on the planet. And it's proven resilient to Covid, whose spread could hit the markets again.

Go here to read the rest:
How Amazon Quietly Powers The Internet - Forbes

Which cybersecurity failures cost companies the most and which defenses have the highest ROI? – Help Net Security

Massachusetts Institute of Technology (MIT) scientists have created a cryptographic platform that allows companies to securely share data on the cyber attacks they have suffered and the monetary cost of their cybersecurity failures, without revealing sensitive information to competitors or damaging their own reputation.

The SCRAM platform allows defenders to learn from past attacks and provides insight into which cyber-risk control areas require additional scrutiny or investment.

In the past, the only way to aggregate and share information about cyber attacks was through a trusted third party, explained the students, economists, and cryptography and internet policy experts who worked on this project under the auspices of MIT's Computer Science and Artificial Intelligence Lab (CSAIL).

But that third party could be breached and the data stolen and disclosed, or the data could be leaked accidentally. For these reasons, companies have often refused to participate in such schemes and share information about their losses.

SCRAM (Secure Cyber Risk Aggregation and Measurement) has, according to its creators, solved that longstanding cyber-security problem.

"SCRAM mimics the traditional aggregation technique but works exclusively on encrypted data that it cannot see. The system takes in encrypted data from the participants, runs a blind computation on it, and returns an encrypted result that must be unlocked by each participant separately before anyone can see the answer," they explained.

The security of the system comes from the requirement that keys from all participants are needed to unlock any of the data. Each participant guarantees its own security by agreeing to unlock only the final result with its privately held key.
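
The blind-sum idea can be illustrated with additive secret sharing, a simplified stand-in for SCRAM's actual cryptography (which the paper describes in full; the loss figures below are made up):

```python
import random

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(secret, n_parties):
    """Split a value into n random shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

# Each company's loss figure is split so that no single share reveals it.
losses = [1_200_000, 450_000, 3_000_000]          # private inputs
all_shares = [share(loss, len(losses)) for loss in losses]

# The aggregator sums one share from each participant per column; every
# partial sum looks random, and only the full combination is meaningful.
column_sums = [sum(col) % PRIME for col in zip(*all_shares)]
total = sum(column_sums) % PRIME
print(total)  # 4650000: the aggregate, with no individual loss exposed
```

Real multi-party computation adds threshold keys, malicious-party protections, and richer computations than a sum, but the core trick, computing on data no single party can see, is the same.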

More technical details about the process and the platform, which consists of a central server, software clients, and a communication network to pass encrypted data between the clients and the server, can be found in this paper.

The researchers recruited seven large companies, each with a high level of security sophistication and a CISO, to test the platform, i.e., to contribute encrypted information about their network defenses along with a list of all monetary losses from cyber attacks and the associated defensive failures over a two-year period.

"Firms of this size would have the technological expertise and resources to nominate people on their team to work with us to design the appropriate questions and to perform the internal data collection," the scientists said, explaining the rationale behind their decision to focus on larger companies.

SCRAM returned information about adopted defenses and pointed out which security failures cost the companies the most money.

"These results provide a compelling proof of concept for how cyber intrusion data can be shared. Our next step will be to increase the number of incidents in future rounds to produce more robust estimates, more complex analyses, and more generalizable results," the scientists noted.

"With a larger data sample, we will also be able to explore loss distribution approaches that cover both the frequency and severity of losses. A larger sample size will also reduce the chance of outliers or single incidents leaking the magnitude of an individual event."

In the meantime, though, they have been able to demonstrate to companies that sensitive cyber attack data can be shared and used without actually being disclosed.

"What this effectively means is that new cryptographic platforms such as SCRAM can gain access to previously untouchable data that can then be used to inform market participants and meet important challenges," they added.

"Many of the target firms for this multi-party computation were interested in participating, but they wanted to see the results of the first computation before contributing their own data. From a cybersecurity standpoint, this represents a new opportunity to create cybersecurity aggregation pools with greater reach and precision than ever before."

Read more:
Which cybersecurity failures cost companies the most and which defenses have the highest ROI? - Help Net Security

Quantum Cryptography Market Research Analysis Including Growth Factors, Types And Application By Regions From 2024 – Kentucky Journal 24

Overview:

Quantum cryptography is a new method for secret communications that provides assurance of the security of digital data. It is primarily based on the use of individual particles/waves of light (photons) and their essential quantum properties to develop an unbreakable cryptosystem, primarily because it is impossible to measure the quantum state of any system without disturbing that system.

It is theoretically possible that other particles could be used, but photons offer all the necessary qualities: their behavior is comparatively well understood, and they are the information carriers in optical fiber cables, the most promising medium for very high-bandwidth communications.

Quantum computing is a growing computer technology built on quantum theory, which describes the nature and behavior of energy and matter at the quantum level. Quantum mechanics is gaining fame in cryptography because it is being used extensively in the encryption of information. Quantum cryptography allows the most critical data to be transmitted at the most secure level, which, in turn, propels the growth of the quantum computing market. Quantum computing has a huge array of applications.
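
The photon-based scheme described above is typically the BB84 protocol; a classical simulation of its basis-sifting step gives the flavor (no real photons here, and the eavesdropping-detection step is omitted for brevity):

```python
import random

N = 32  # number of photons Alice sends

# Alice picks a random bit and a random polarization basis for each photon:
# "+" for rectilinear, "x" for diagonal.
alice_bits = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]

# Bob measures each photon in a randomly chosen basis. When his basis
# matches Alice's, he reads her bit exactly; otherwise quantum mechanics
# makes his result random.
bob_bases = [random.choice("+x") for _ in range(N)]
bob_bits = [
    bit if a_basis == b_basis else random.randint(0, 1)
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)
]

# They publicly compare bases (never bits) and keep matching positions.
key = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
print(len(key))  # on average ~N/2 sifted key bits
```

The unbreakability claim rests on the measurement-disturbance property: an eavesdropper who measures photons in the wrong basis introduces detectable errors, which the full protocol checks for before the key is used.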

Market Analysis:

According to Infoholic Research, the global quantum cryptography market is expected to reach $1.53 billion by 2023, growing at a CAGR of around 26.13% during the forecast period. The market is growing due to increasing data security and privacy concerns, and the growing adoption of cloud storage and computing technologies is driving it further. However, low customer awareness of quantum cryptography is hindering growth. Rising demand for security solutions across different verticals is expected to create lucrative opportunities for the market.
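
As a sanity check on those figures, the CAGR relation is end = start * (1 + r)**years; working backward from the $1.53 billion 2023 projection at 26.13% (assuming a 2017 base year, which the report's regional section implies but does not state):

```python
end_value = 1.53e9   # projected 2023 market size, USD
cagr = 0.2613        # 26.13% compound annual growth rate
years = 6            # assumed 2017 -> 2023 forecast window

# Invert the compound-growth formula to find the implied starting size.
implied_base = end_value / (1 + cagr) ** years
print(f"${implied_base / 1e9:.2f}B")  # roughly a $0.38B implied 2017 base
```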

Market Segmentation Analysis:

The report provides a wide-ranging evaluation of the market, with in-depth qualitative insights, historical data, and supportable projections and assumptions about the market size. The projections featured in the report have been derived using proven research methodologies and assumptions based on the vendors' portfolios, blogs, whitepapers, and presentations. The report thus covers every side of the market and is segmented by regional market, type, application, and end-user.

Countries and Vertical Analysis:

The report contains an in-depth analysis of vendor profiles, including financial health, business units, key business priorities, SWOT, strategy, and views, as well as the competitive landscape. The prominent vendors covered include ID Quantique, MagiQ Technologies, Nucrypt, Infineon Technologies, Qutools, QuintessenceLabs, Crypta Labs, PQ Solutions, and Qubitekk, among others. The vendors have been identified based on portfolio, geographical presence, marketing and distribution channels, revenue generation, and significant investments in R&D.

Competitive Analysis

The report covers and analyzes the global quantum cryptography market. Various strategies, such as joint ventures, partnerships, collaborations, and contracts, have been considered. In addition, as customers search for better solutions, a rising number of strategic partnerships for better product development is expected, along with an increase in mergers, acquisitions, and strategic partnerships during the forecast period.

Companies such as Nucrypt, Crypta Labs, Qutools, and MagiQ Technologies are the key players in the global quantum cryptography market. Nucrypt has developed technologies for emerging applications in metrology and communication, and has produced and manufactured electronic and optical pulsers. Crypta Labs deals in application security for devices, offering Quantum Random Number Generator products and solutions for the Internet of Things (IoT); the major sectors the company is targeting are transport, military, and medical.

The report offers a complete insight into the industry and aims to give emerging and established players an understanding of market trends, the current scenario, government initiatives, and the latest technologies related to the market. It also helps venture capitalists understand the companies better and make informed decisions.

Regional Analysis

The Americas held the largest share of the market in 2017 and are expected to dominate the quantum cryptography market during the forecast period. The region has always been a hub for high investment in research and development (R&D), contributing to the development of new technologies. Growing concerns over the security of IT infrastructure and complex data have led enterprises in the region to adopt quantum cryptography and reliable authentication solutions.

Benefits

The report provides an in-depth analysis of the global quantum cryptography market, a technology that aims to reduce time to market for products and services, cut operational costs, and improve accuracy and operational performance. With quantum cryptography, organizations can secure crucial information and increase productivity and efficiency, and the solutions are proven to be reliable and to improve scalability. The report discusses the types, applications, and regions related to this market, and details the major challenges impacting market growth.

More here:
Quantum Cryptography Market Research Analysis Including Growth Factors, Types And Application By Regions From 2024 - Kentucky Journal 24