The Interplay between Quantum Theory And Artificial Intelligence – Analytics India Magazine


Machine Learning Developers Summit (MLDS 2021) is one of the biggest gatherings of machine learning developers in India. With more than 1,500 machine learning developers and 60 speakers from around 200 organisations, the conference corrals India's leading machine learning innovators and practitioners to share their ideas about machine learning tools, advanced development and more.

Anish Agarwal, Director, Data & Analytics, India at NatWest Group, talked about The Interplay between Quantum Theory And Artificial Intelligence at MLDS 2021.

The session opened with an introduction to emerging technologies such as artificial intelligence, a brief overview of quantum computing, the different forms of quantum technology used in military and civilian applications, how quantum computers differ from classical computers, and how quantum computing plays a vital role in the advancement of artificial intelligence.

Within the field of quantum computing, Agarwal discussed quantum artificial intelligence, how it can be used to run machine learning algorithms, and what makes the technology unique.

Quantum AI can help achieve results that are impossible with classical computers. He said that, as per reports, 25 percent of Fortune Global 500 companies will have gained a competitive edge from quantum computing by 2023. Tech giants like Google and Microsoft are doubling down on quantum computing.

He then explained the possibilities of applying quantum computing in AI.

He said quantum machine learning (QML) is not one settled, homogeneous field, because machine learning itself is quite diverse in nature. He added, "Quantum machine learning is simply the field exploring the connections between quantum computing and quantum physics on one hand, and machine learning and related fields on the other."

Agarwal then deliberated on Quantum Game Theory and compared it with classical game theory. He said quantum game theory can be used to overcome critical problems in quantum communications.

He also discussed the advantages of quantum AI.

Agarwal concluded the session by touching upon the key applications of quantum artificial intelligence. Lastly, he mentioned some of the critical milestones for quantum AI and busted a few myths related to quantum computing techniques.




Jet Suit Testing by the British Royal Navy and Gravity Industries – OODA Loop

Ever since Star Wars Episode VI: Return of the Jedi, when Boba Fett busts his jet suit on Jabba the Hutt's sail barge during the Battle of the Great Pit of Carkoon, this writer was hooked. Jet packs have since been depicted in media and sci-fi, most notably in the dystopian scenario of Spielberg's 2002 Minority Report (an adaptation of a 1956 science fiction novella by Philip K. Dick). The Guardian offers a thorough history of jet packs.

Technological fact now mirrors science fiction: the British Royal Navy has recently been testing jet suit technology to board ships. A video was recently released by UK-based Gravity Industries, which manufactures the jet suits.



NWA funding for taking quantum technology to the public – Bits&Chips

1 December

The Quantum Inspire consortium has received a 4.5-million-euro grant from the Dutch Research Council (NWO) to bring quantum technology closer to potential users. "We hope that different people from all parts of society will interact with Quantum Inspire, so that we can work together to figure out the full range of possibilities offered to our society by quantum computing, including which societal challenges it will be able to solve," said Lieven Vandersypen, coordinator of the grant application and research director of QuTech.

Quantum technology is expected to find applications in many different fields, such as energy, food supply, security and health care. Being an emerging technology, however, not many people in these fields are actively investigating its potential yet. And even if they wanted to, where would they go? Getting access to a quantum computer is not exactly easy.

This is why Quantum Inspire was started: people can run their own quantum algorithms on Quantum Inspire's simulators or hardware backends and experience the possibilities of quantum computing. QuTech launched a first version of Quantum Inspire in April 2020, and the grant will allow the consortium to develop it further.

Quantum Inspire's capital infusion is funded by the Dutch National Research Agenda (NWA) program "Research along routes by consortia" (NWA-ORC). In total, NWO distributed 93 million euros over 21 interdisciplinary research projects.


Fidelity Investments leaps back to the future in an experiment to restore active management to its lofty perch, using technology that is still more…

The Boston giant is renting a special corner of Amazon's cloud to remake Monte Carlo and do hyper-quant investing like an AI Peter Lynch, with no experience as a golf caddy.

Brooke's Note: Passive indexing is done by computers that mostly make sure that they bet on nothing but the diversity implicit in any given index of securities. It's an approach where the wisdom of knowing how little you know -- and executing it with mechanical precision -- mostly beats market timing done by younger, smarter computers, never mind smarter, or dumber, people. The passive approach now attracts the most dollars because it is cheaper and better, or better because it's cheaper. But it's easy to see why smart people with smart computers wouldn't want to accept this new odd reality lying down, and Fidelity's people, it seems, are among them. The logic to its FCAT quantum project with Amazon is that a tipping point back to active managers beating passive ones is bound to come along if computers keep getting smarter. Of course, active managers eventually outsmart each other, which blunts any advantages, so it's key to be first. Fidelity Investments is trying to do just that by playing the quantum revolution.

Fidelity Investments is exploring a path out of the drab world of passive investing back to the greener pastures of active management, using a technology that, until recently, was more science fiction than fact.

That path is being charted deep within the bowels of the Boston giant at its Center for Applied Technology (FCAT). It's on a never-ending mission to find "breakthrough achievements in research and tech," according to its website.

And it thinks it's found one in the latest advances in quantum computing. It promises to revitalize active management, where the fees are fat and the returns are -- hopefully -- fatter.

Fidelity's latest research project runs FCAT-developed quantum algorithms through Amazon's Braket, a recently launched cloud service that provides access to quantum hardware from three providers: D-Wave, IonQ and Rigetti.

Quantum computers are able to solve certain computational problems dramatically faster than classical computers. They have four major potential benefits for financial firms.

It speeds up market forecasting, cryptography and data gathering, and makes them more precise, says Fidelity head of emerging technology Adam Schouela.

It is a quest for the proverbial quantum leap.

"We're looking for those technologies that truly have that potential to displace technologies we're using today," he says."That's where quantum computing fits in."

In August, Fidelity completed a quantum computing proof-of-concept in conjunction with Amazon that promises faster and more accurate asset pricing, investment analytics, trading and Monte Carlo analyses.
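Monte Carlo analysis is the workload most often cited for near-term quantum advantage in finance: quantum amplitude estimation can, in principle, reach a target accuracy with quadratically fewer samples than classical sampling. As an illustration of the classical baseline (the parameters below are hypothetical, not Fidelity's), here is a minimal Monte Carlo estimate of a European call option price:

```python
import math
import random

def mc_european_call(s0, strike, rate, vol, maturity, n_paths, seed=0):
    """Classical Monte Carlo price of a European call under geometric
    Brownian motion. Quantum amplitude estimation targets the same
    estimate with quadratically fewer samples."""
    rng = random.Random(seed)
    drift = (rate - 0.5 * vol ** 2) * maturity
    diffusion = vol * math.sqrt(maturity)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)                      # standard normal draw
        s_t = s0 * math.exp(drift + diffusion * z)   # terminal stock price
        total += max(s_t - strike, 0.0)              # call payoff
    return math.exp(-rate * maturity) * total / n_paths  # discounted mean

price = mc_european_call(s0=100, strike=100, rate=0.01, vol=0.2,
                         maturity=1.0, n_paths=100_000)
print(f"Estimated call price: {price:.2f}")
```

With 100,000 paths the estimate lands near the closed-form Black-Scholes value for these parameters (roughly 8.4); an amplitude-estimation routine would aim for the same answer with on the order of the square root of that many quantum samples.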

"Active investing is in Fidelitys DNA,"says Will Trout, director of wealth management at Pleasanton, Calif.-based consultancy, Javelin Strategy & Research, via email.

"Whether supported by the boffins or cutting-edge technology ... avenues where it's still possible to outperform and get paid ... will remain on the Fidelity road map," he explains.

FCAT's latest project created a security not unlike an index ETF that tracked a synthesized index in close to real time with a lower rate of error than currently possible. By further crunching the data, it yielded near real-time asset pricing, inclusive of options trades.

That said, Schouela is careful to temper expectations.

"I wouldn't necessarily call it a gamble but I wouldn't call it a 'will'... as in will potentially."

Fidelity is also one of the few firms with deep enough pockets to pull off a project like this in such a nascent technology, says Lex Sokolin, global fintech co-head at New York City blockchain software company ConsenSys, via email.

"With mutual fund AUM over [$3.5] trillion, Fidelity is able to partner and have a meaningful conversation with [firms like] Amazon ... these are big fixed-cost projects, and technology firms need to find a use case that works for millions." See:Fidelity Investments takes another leap into the future, enlisting Amazon to turn advisors into virtual reality avatars, but some say it's pie-in-the-sky.

Although quantum computing has potential long-term benefits for the financial industry, the field itself remains closer to the whiteboard than the shop floor.

Between 1977 and 1990, when Peter Lynch managed Fidelity's Magellan Fund, he averaged a 29.2% annual return, increasing assets from $18 million to $14 billion.

The legendary investor got recruited by a Fidelity exec who saw promise in his caddy -- an approach to capturing lucrative decision-making capabilities regarded by most HR departments as too hit-or-miss in 2020.

Indeed, many of FCAT's own staff have yet to come to terms with quantum computing.

The firm uses a mix of workshops and virtual reality to get its employees thinking about the "mental shift" quantum design requires.

"Quantum computing is in the very early stages of considering commercialization," Sokolin explains.

"This hardware is important, as are its uses, but I expect the discussion to stay in innovation labs for another few years. Much of what is happening today is finding the problems that fit the types of solutions that quantum computation can provide."

But a developmental leap is in the offing, similar to the shift from hexadecimal machine code to programming in English-like script, and Fidelity intends to capitalize, says Schouela.

"There are these layers of abstraction [that have] started to form for quantum computing [and] as soon as the technology is viable, we have the ability to leverage it to the benefit of Fidelity."

Typically, the now 20-year-old FCAT spends between three and seven years working on a project before it gets tucked into Fidelity or spun off.

Fidelity's ability to succeed depends on its ability to make a portfolio of bets where failure or cold storage is an option.

"We shelve stuff all the time," Schouela says. " [and] sometimes the markets not ready for something yet it's an exploration."

'Incidental' pairing

The Fidelity-Amazon quantum partnership is also the fourth time the two firms have worked together since an early attempt at joint distribution in 2006.

In May 2018, Fidelity developed a chatbot, Cora, built on AWS Sumerian, a VR design tool; later, Fidelity strengthened its VR partnership with Amazon as it pursued VR advice and training systems, the latter of which are now in use.

Then, in late 2019, Amazon chose Fidelity as its new 401(k) vendor. See: Fidelity wrests high-profile Amazon 401(k) business from Vanguard.

But the two firms' continued partnering is merely "incidental" rather than strategic, says Schouela, who worked on Fidelity's VR projects.

"It's completely different folks [at Amazon this time], so it is a little bit more on the incidental side ... we're [also] actively working with lots of different people in this space."

Fidelity's ownership of the algorithm-based short-selling asset manager Geode -- a 2003 Fidelity spin-off -- and its stake in ESG investment manager Ethic are examples of the firm's continued interest in active management. See: Fidelity Investments inks deal with $180-million startup.

"Fidelity, led by [CEO] Abby Johnson in this context has unlimited thirst for advantage," saysSteve Gresham, managing principal of NYC consultancy, The Execution Project, via email.

Johnson has, for example, pushed the family firm toward cryptocurrency. See: Fidelity Investments applies its proven Peter Jubber to its unproven bitcoin unit, and its launch of Fidelity Digital Funds signals it's all in on blockchain currency.

Yet Amazon will win downstream, says Sokolin.

"If it can help financial firms, whether Fidelity, hedge funds, or market makers more efficiently price financial instruments at scale [for] the entire market in real time, then it can become the de facto analytics engine for financial services."

"This would again mean that technology firms become more powerful relative to the existingfinancial ecosystem," he adds.

Amazon uses neutral language.

"Our goal for Braket is to be a catalyst,"says AWS vice president for technology, Bill Vass,in a release.


Quantum Computing Market Size 2021 By Analysis, Manufacturers, Regions, Type and Application, and Forecasts to 2027 – Jumbo News

Fort Collins, Colorado -- This report on the Quantum Computing Market covers the key characteristics of the global market, including investment analysis, population analysis, companies planning mergers and acquisitions, and new or established vendors. The Quantum Computing report by QY Research provides a comprehensive market study covering overview, production, manufacturers, dimensions, revenue, price, consumption, growth rate, sales, imports, sourcing, exports, future plans and technological advancement. In addition to inexpensive ready-made reports, tailor-made research by a team of experts is available. This report primarily focuses on the consumer and retail sectors.

The global Quantum Computing Market was valued at USD 193.68 million in 2019 and is projected to reach USD 1,379.67 million by 2027, growing at a CAGR of 30.02% from 2020 to 2027.
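For readers who want to sanity-check such projections, the standard compound-annual-growth-rate relationship is end = start × (1 + CAGR)^years. A small sketch, using only the figures quoted above:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by growing `start` to `end` over `years`."""
    return (end / start) ** (1 / years) - 1

def project(start, rate, years):
    """Future value after compounding `start` at `rate` for `years`."""
    return start * (1 + rate) ** years

# Report figures: USD 193.68M (2019) -> USD 1,379.67M (2027), quoted CAGR 30.02%.
implied = cagr(193.68, 1379.67, 2027 - 2019)
print(f"Implied CAGR over the 2019-2027 window: {implied:.2%}")
```

Note that the implied rate depends on which window you compound over (2019–2027 versus the quoted 2020–2027 forecast period), which is worth checking whenever a report pairs a CAGR with endpoint values.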

The Quantum Computing Market report comprises various chapters listing the participants that play a significant role in global Quantum Computing Market growth. This section of the report displays statistics on major players in the international market, including company profile, product specification, market share, and production value. The main segmentation mentioned in this report is the commercial and residential category. Based on extensive historical data, a well-thought-out study of the estimated period for the expansion of the Quantum Computing market globally is produced.

Request a Discount on the report @ https://reportsglobe.com/ask-for-discount/?rid=32953

Market Segments and Sub-segments Covered in the Report are as per below:

Quantum Computing Market, By Offering

Consulting solutions
Systems

Quantum Computing Market, By Application

Machine Learning
Optimization
Material Simulation

Quantum Computing Market, By End-User

Automotive
Healthcare
Space and Defense
Banking and Finance
Others

It also provides accurate calculations and sales reports of the segments in terms of volume and value. The report introduces the industrial chain analysis, downstream buyers, and raw material sources along with the accurate insights of market dynamics. The report also studies the individual sales, revenue, and market share of every prominent vendor of the Quantum Computing Market. It majorly focuses on manufacturing analysis including the raw materials, cost structure, process, operations, and manufacturing cost strategies. The report delivers detailed data of big companies with information about their revenue margins, sales data, upcoming innovations and development, business models, strategies, investments, and business estimations.

The Quantum Computing Market reports deliver information about the industry competition between vendors through regional segmentation of markets in terms of revenue generation potential, business opportunities, demand & supply comparison taking place in the future. Understanding the Global perspective, the Quantum Computing Market report introduces an aerial view by analyzing historical data and future growth rate.

Request customization of the report @https://reportsglobe.com/need-customization/?rid=32953

Quantum Computing Market: By Region

North America
Europe
The Asia Pacific
Latin America
The Middle East and Africa

The objectives of the Quantum Computing Global Market Study are:

Split the breakdown data by region, type, manufacturer, and application.
Identify trends, drivers, and key influencing factors around the world and in the regions.
Analyze and study global Quantum Computing status and future forecast, including production, sales, consumption, history, and forecast.
Analyze the potential and advantages, opportunities and challenges, limitations, and risks of the global market and key regions.
Analyze competitive developments such as expansions, agreements, product launches, and acquisitions in the market.
Introduce the major Quantum Computing manufacturers, their production, sales, market share, and recent developments.

To learn more about the report, visit @ https://reportsglobe.com/product/global-quantum-computing-market/

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, Europe, or Asia.

How Reports Globe is different than other Market Research Providers

The inception of Reports Globe has been backed by providing clients with a holistic view of market conditions and future possibilities/opportunities to reap maximum profits out of their businesses and assist in decision making. Our team of in-house analysts and consultants works tirelessly to understand your needs and suggest the best possible solutions to fulfill your research requirements.

Our team at Reports Globe follows a rigorous process of data validation, which allows us to publish reports from publishers with minimum or no deviations. Reports Globe collects, segregates, and publishes more than 500 reports annually that cater to products and services across numerous domains.

Contact us:

Mr. Mark Willams

Account Manager

US: +1-970-672-0390

Email:[emailprotected]

Web:reportsglobe.com


New York needs to be reimagined with technology and job training – Crain’s New York Business

Our response to Covid-19 offers a similar opportunity. Although there's no doubt we must focus on addressing immediate problems (schools, contact tracing, saving small businesses), we also should put thought into New York's future. Repairing is one thing, but designing a foundation is another. The new street grid, transit reforms and development policies that came out of 9/11 attest to the importance of the latter.

New York leaders should therefore take a few steps to chart the 21st century. In addition to controlling the virus and helping people in need, we must develop a grand strategy that recognizes the economic changes that were already happening before the pandemic, and leverage them in a way that benefits everyone.

Step one: capitalizing on emerging industries. Here the tech sector is a good starting point. Not only will tech companies continue to grow, but so too will tech aid and fuel the growth of every other kind of business. The areas that we should invest in include cybersecurity, quantum computing, artificial intelligence, transportation and smart manufacturing. Not only are they slated to create many jobs, but they also will increasingly undergird every other industry. A recent study on the projected impact of quantum computing on the New York economy, for instance, found that more than 57,000 new jobs will be generated in this area during the next five years, with that number expected to continue to grow as the technology advances. Policymakers and entrepreneurs need to work together to ensure that momentum keeps moving into the next decade, and create the right business conditions for New York to become an emerging tech hub.

Another way of putting this is reinvention by necessity. With more and more of our lives happening in a virtual world, the safety and efficiency challenges facing organizations have changed. Cyber threats, for example, are now a regular vulnerability for businesses and governments alike. Companies need rapid data processing like never before. Quantum computing and advanced malware detection are crucial for the economy. Not only will emerging tech generate growth, but it will also be a necessary component for the economy of tomorrow.

The next steps are doubling down on workforce development and ensuring that people can actually break into the sectors. Job openings in AI and cybersecurity don't mean much if New Yorkers aren't qualified for them. We therefore need to expand our roster of digital skills programming, which includes computer science in the classroom, boot camps for aspiring coders, and a bevy of private training classes for entrepreneurs and workers. If the tech economy is to be inclusive, we'll need to put as much emphasis on teaching people the requisite skills as we do teaching them arithmetic.

Closing the digital divide is another step. Before Covid-19, we were already spending a lot of time online. In the midst of the pandemic, that trend has been amplified. People now need speedy, affordable internet connections to do their job, go to school, pay bills and get through each day. The fact that there are disparities in internet access is an impediment to the economy and only exacerbates existing inequalities. A strong 5G network throughout the city and state would help solve that issue and ultimately allow workers to take the necessary steps to move into the tech sector.

The good news is we already have parts of the foundation. New York has nearly unlimited investment resources, and state and local leaders have shown their appreciation for what tech can do.

The key is tying all the parts together and creating a new economy that offers opportunities to all.

Lynn McMahon is the managing director of Accenture's metro New York office. Julie Samuels is the executive director of Tech:NYC.


Quantum Computing and the Cryptography Conundrum – CXOToday.com

By: Anand Patil

On October 23, 2019, researchers from Google made an official announcement of a major breakthrough, one that scientists compared to the Wright brothers' first flight, or even man's first moon landing. They claimed to have achieved Quantum Supremacy, meaning that they had created a Quantum Computer that could perform a calculation considered impossible for the classical computers of today. The announcement was a landmark, highlighting the possibilities of Quantum Computing.

The concept of Quantum Computing itself isn't new. It is a field that has been a point of interest for physicists and computer researchers since the 1980s. Google's announcement, however, has brought it to the mainstream and shone a spotlight on the promise this niche field of innovation holds. Of course, as someone once said, with great power comes great responsibility, so this field isn't without complexities.

The Possibilities of Quantum Computing

Quantum Computing is a branch of computer science focused on leveraging the principles of quantum physics to develop computer technology. Quantum Computers hold the promise to power major advances in various fields that require complex calculations, from materials science and pharmaceuticals to aerospace and artificial intelligence (AI).

So far, Quantum Computers have been nothing more than fancy laboratory experiments, large and expensive, but they have successfully demonstrated that the underlying principles are sound and have the potential to transform industries and accelerate innovation like never before. This has spurred scientific and industrial interest in this nascent field, giving rise to multiple projects across the world in pursuit of creating a viable, general-use Quantum Computer. That said, it may still be many years before Quantum Computers are commercially and generally available.

So Why Does It Matter Today?

The possibility of Quantum Computers poses a serious challenge to the cryptographic algorithms deployed widely today. Today's key-exchange algorithms, like RSA, Diffie-Hellman, and others, rely for their security on very difficult mathematical problems, such as prime factorization, which a Quantum Computer would be able to solve much faster than a classical computer.

For example, it would take a classical computer centuries, or even longer, to break modern algorithms like DH or RSA-2048 using brute-force methods. However, given the power and efficiency of quantum machines in calculations such as finding the prime factors of large numbers, it may be possible for a quantum computer to break current asymmetric algorithms in a matter of days.
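The asymmetry described here is easy to see in miniature. Classical trial division takes on the order of √n steps, i.e., time exponential in the bit length of n, which is why brute-forcing an RSA-2048 modulus is infeasible, while Shor's algorithm on a quantum computer factors in polynomial time. A toy sketch of the classical side:

```python
def trial_division(n):
    """Classical factoring by trial division: O(sqrt(n)) steps,
    i.e. exponential in the bit length of n. Shor's algorithm on a
    quantum computer solves the same problem in polynomial time."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f   # smallest prime factor and cofactor
        f += 1
    return n, 1                # n itself is prime

# Toy semiprime built from two known primes (the 10,000th and 100,000th).
p, q = trial_division(104729 * 1299709)
print(p, q)  # -> 104729 1299709
```

This runs instantly for a toy semiprime, but a 2048-bit RSA modulus would need on the order of 2^1024 trial divisions, and even the best known classical algorithms (such as the general number field sieve) remain super-polynomial.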

So, while the encrypted internet is not at risk at the moment, all that a bad actor has to do is capture the encrypted data today, including the initial key exchange, and then wait until a powerful enough quantum computer is available to decrypt it. This is particularly a problem for organizations that have large amounts of sensitive data to protect over the long term, such as banks, governments and defense agencies.

What Can I Do Now?

For organizations that could be at risk in the future, this is the best time to start evaluating post-quantum cryptography. Simply put, this means moving to algorithms and/or keys that are a lot more robust and can withstand a brute-force attack by a quantum computer, i.e., that are quantum-resistant.

The National Institute of Standards and Technology (NIST) in the US is leading the effort towards the standardization of post-quantum secure algorithms. However, given the lengthy process involved, this may take many years to fructify.

An alternative is to use Quantum Key Distribution (QKD) techniques with existing algorithms that are considered quantum-safe. This involves using a dedicated optical channel to exchange keys using the quantum properties of photons. Any attempt to tap this secure channel will change the quantum state of the photons and can be immediately detected, so the key cannot be intercepted unnoticed. One limitation of this method is the need for a dedicated optical channel that cannot span more than 50 km between the two terminals. Of course, it also means that the existing encryption devices or routers must be capable of ingesting such quantum-generated keys.
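The tap-detection property described above is the heart of the BB84 protocol, the canonical QKD scheme. The toy simulation below (an illustrative sketch, not a model of any specific QKD product) shows the mechanism: when an eavesdropper measures photons in randomly guessed bases, about a quarter of the sifted key bits no longer match, which the two parties can detect by publicly comparing a sample:

```python
import random

def bb84(n_bits, eavesdrop=False, seed=1):
    """Toy BB84 sketch: Alice encodes bits in random bases, Bob measures in
    random bases, and they keep only positions where bases matched (sifting).
    An eavesdropper measuring in random bases disturbs ~25% of sifted bits."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]
    bob_bases = [rng.randint(0, 1) for _ in range(n_bits)]

    channel = []
    for bit, basis in zip(alice_bits, alice_bases):
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            if eve_basis != basis:      # wrong-basis measurement randomizes the bit
                bit = rng.randint(0, 1)
            basis = eve_basis           # photon is re-sent in Eve's basis
        channel.append((bit, basis))

    sifted_alice, sifted_bob = [], []
    for i, (bit, basis) in enumerate(channel):
        measured = bit if bob_bases[i] == basis else rng.randint(0, 1)
        if bob_bases[i] == alice_bases[i]:   # sifting: keep matching-basis positions
            sifted_alice.append(alice_bits[i])
            sifted_bob.append(measured)
    errors = sum(a != b for a, b in zip(sifted_alice, sifted_bob))
    return errors / max(len(sifted_alice), 1)

print("error rate, no Eve: ", bb84(2000))
print("error rate, with Eve:", bb84(2000, eavesdrop=True))
```

In this idealized noiseless model, the sifted keys agree exactly without an eavesdropper, while an intercept-and-resend attack pushes the error rate to around 25%, far above any plausible channel noise floor.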

Post-Quantum Cryptography and Cisco

Cisco is an active contributor to the efforts to standardize post-quantum algorithms. However, recognizing that an implementable standard may be some years away, work is ongoing to ensure that organizations can implement quantum-resistant encryption techniques in the interim that leverage existing network devices, like routers, which are most commonly used as encryptors.

To start with, a team of veteran technical leaders and cryptography experts from Cisco US (David McGrew, Scott Fluhrer and Lionel Florit) and the engineering team in Cisco India, led by Amjad Inamdar and Ramas Rangaswamy, developed an API interface called the Secure Key Import Protocol (SKIP) through which Cisco routers can securely ingest keys from an external post-quantum key source. This allows existing Cisco routers to be quantum-ready with just the addition of an external QKD system. Going forward, the team is working on a way to deliver quantum-safe encryption keys without the need for short-range point-to-point connections.

The advantage of this method is that organizations can integrate post-quantum key sources with existing networking gear in a modular fashion without the need to replace anything already installed. In this manner, you could create a quantum-ready network for all traffic with minimal effort.

Getting Ready for the Post-Quantum World

Quantum supremacy is the demonstration that a quantum machine can solve a problem that no classical computer can solve in a feasible amount of time. This race has gathered momentum in the recent past, with several companies joining in and some even claiming to have achieved it.

There is an unprecedented amount of attention focused on making a commercially viable quantum computer. Many believe it is inevitable and only a question of time. When it does happen, currently used cryptography techniques will become vulnerable and can no longer be relied upon for security. The good news is that there are methods available today to adopt strong encryption techniques that will remain secure even after quantum computers are generally available.

If you are an organization that wants to protect its sensitive data over the long term, you should start to evaluate post-quantum secure encryption techniques today. By leveraging existing networking infrastructure and adding suitable post-quantum key distribution techniques, it is possible to take a quantum leap in securing your data.

(The author is Director, Systems Engineering, Cisco India and SAARC and the views expressed in this article are his own)

See the article here:
Quantum Computing and the Cryptography Conundrum - CXOToday.com

The Future of Computing: Hype, Hope, and Reality – CIOReview

Bill Reichert, Partner, Pegasus Tech Ventures

For roughly 75 years, the fundamental architecture of computers has not changed much. Certainly, the hardware has changed tremendously, and software has evolved accordingly. But the basic idea of storing instructions and data in binary code, and using on/off digital hardware to execute mathematical and logical operations, has remained roughly the same for decades.

All that is changing.

The same advances in semiconductor fabrication technology that powered Moore's Law (the exponential increase in the power of computers over the last several decades) have enabled hardware engineers to develop new architectures that promise to transform the computing landscape over the coming decades.

At the same time, software engineering is also progressing. Marc Andreessen has famously said, "Software is eating the world." What he did not make clear, though, is that virtually all the progress in computing over the past 30 years has been thanks to hardware, not software.

Heterogeneous Computing

New architectures, however, require that both software engineers and hardware engineers work together. A new class of hardware is emerging that takes advantage of what is called heterogeneous computing: multi-core chips that incorporate multiple different co-processors, each optimized for a specialized task. Writing software that takes full advantage of these new chips is extremely challenging, so companies like SambaNova Systems are developing operating systems and software compilers that optimize application code automatically and allocate resources to compute tasks dynamically, in real time, as computing demands change.

AI Chips

With the emergence of deep neural network software, engineers realized that Graphics Processing Units, an architecture commercialized by Nvidia, were nicely designed for doing the massive matrix calculations required by neural network models. But GPUs are not exactly optimized for AI, and so there has been an explosion of startups seeking to develop chips that offer 10x or 100x the performance and power efficiency of GPUs. On the server side, companies like Cerebras Systems and Graphcore, and more recently SambaNova, are promising order-of-magnitude improvements. And on the edge, companies like Gyrfalcon Technology, Syntiant, and Blaize are promising even greater improvements in performance and power efficiency.


Edge Computing

The second half of the 20th century was all about moving computing from centralized mainframe computers to desktop and laptop distributed computers. With the development of a high-speed Internet, the thinking shifted, and an application could sit in the cloud and support thousands, even millions, of users. But as the Internet of Things took off and enabled data collection from literally billions of devices, moving all that data up to the cloud in order to crunch it has become a challenge. Now companies are looking to process data at the edge, at the point of collection, rather than sending it up to the cloud, thereby reducing latency and cutting bandwidth and storage costs. At its simplest level, edge computing filters out unimportant data and sends only the most important data to the cloud. For more complex tasks, such as autonomous driving, edge computing requires processing massive AI models and making very accurate judgments in milliseconds. For these tasks, the new special-purpose chips discussed above and below are fighting for design wins.

Analog Computing

As brilliant as binary code is for enabling absolutely precise calculations, the real world is analog, not digital, and many compute tasks could be done more efficiently if we could operate with analog values rather than having to digitize them. But analog computing is imprecise, and most computing problems require exact values, not approximate values. (How much money do you have in your bank account?) Some problems, like AI inference and monitoring sensor data, do not need six-sigma precision to get the right answer or make the right decision. Companies like Mythic, Analog Inference, and Aspinity are incorporating analog computing architectures into their chips to make them up to 100x more efficient at solving problems involving data from our analog world.

Photonic Computing

Light has been used for digital communications and computer networks for decades, but using photons to do the math and putting photonic processors on a chip are extremely challenging. That is what several startups are trying to do. Spinning technologies out of MIT and Princeton, three companies, Lightelligence, Lightmatter, and Luminous Computing, are racing to commercialize the first photonic chip for doing AI inference at the edge.

Neuromorphic Computing

In spite of what the media portrays as the imminent cyber-apocalypse where robots rebel against their human masters and take over the world, we are a long way away from the science fiction world imagined in popular culture. The fact is that the human brain is still massively more powerful and efficient than the most powerful supercomputers on earth. But computer scientists think there is a path to create an artificial brain. The branch of artificial intelligence that uses neural network mathematical frameworks to compute information in a manner similar to the human brain is sometimes referred to as neuromorphic, because it mimics human neuro-biology. But researchers have been working on models that even more closely mimic the human brain in its design and efficiency. The brain sends signals as electrochemical spikes, not digital bytes, and the brain's roughly 86 billion neurons are interconnected in a way that is very different from transistors on a chip. Researchers at Stanford, Intel, IBM, and several startup companies, such as Rain Neuromorphics and BrainChip, are trying to develop hardware and software that uses neuromorphic principles to deliver very high-power computing on very small semiconductor chips.

Quantum Computing

Almost certainly the most radical initiative in computing is the attempt to harness the potential of quantum computing. At the subatomic level, particles of matter behave in extraordinary and wonderful ways: they can exist in more than one state simultaneously, and they can entangle with one another across a distance without any apparent physical connection. It turns out that electronic devices like transistors and diodes wouldn't even work if the universe were strictly Newtonian. If we can figure out how to control the quantum properties of light and matter the way we figured out how to use gears to make adding machines and transistors to make computers, we will be able to make quantum computers that are as superior to current supercomputers as supercomputers are to adding machines.

Some people say we are still a long way away from quantum supremacy, when quantum computers can solve problems that no classical computer can solve. But recent advances indicate that we may not be that far away from quantum advantage, when quantum computers can solve certain specialized problems faster than classical computers.

Already big players like IBM, Google, Intel, Honeywell, and Microsoft are demonstrating machines that can execute quantum algorithms, and startups like Rigetti Computing, IonQ, and PsiQuantum are joining the race, along with quantum software companies like QC Ware, Cambridge Quantum Computing, and Zapata Computing. Big corporations and governments are investing in projects that will take advantage of the power of quantum computing in chemistry, pharmaceuticals, finance, logistics, failure analysis, and artificial intelligence.

Each of these emerging technologies promises to significantly advance computing, and with these advances will come new technology leaders. The evolution of computing has given rise to multiple generations of spectacular success stories like IBM, Intel, Microsoft, Nvidia, Google, and Amazon Web Services. Most of these companies are trying to reinvent themselves to catch the next wave of computing technology, but certainly new companies will emerge in these new sectors, and some famous names will founder and go the way of the dinosaurs, like Univac, Digital Equipment, MIPS, and Silicon Graphics. Meanwhile, corporate CIOs will have to decide where to place their bets and start investing in these new technologies, if they haven't already.

More here:
The Future of Computing: Hype, Hope, and Reality - CIOReview

Forum Teratec 2020 Gathered Experts in Simulation, HPC, Big Data and AI – HPCwire

Oct. 19, 2020 – Held in digital format on October 13 and 14, 2020, given the circumstances of COVID-19, Forum Teratec gathered over 1,200 experts in Simulation, HPC, Big Data and Artificial Intelligence. It brought together industrialists, users, suppliers and political decision-makers around the essential issue of digital. As President of Teratec Daniel Verwaerde said in his introduction: "This crisis demonstrates the fundamental importance of digital, and especially HPC and HPDA, in our lives and in our economy."

The Forum Teratec 2020 was up to the challenge of previous years' editions, welcoming more than 1,200 participants. It brought together major European decision-makers virtually, including Thierry Breton, the European Commissioner, Florence Parly, the French Minister of the Armed Forces, and many industrialists. More than sixty companies and innovative projects presented their latest results, with the ability for participants to share experiences during business meetings. In addition, six thematic workshops attended by national and international experts provided an opportunity to review the latest technological advances in the fields of digital twins in medicine, quantum computing, satellite data and the environment, AI and scientific computing, Cloud computing and HPC, and Exascale.

One strategic stake, both political and economic

In all economic fields, these technologies will be essential, and companies able to master them will be the leaders of tomorrow. Thierry Breton, European Commissioner for the Internal Market, clearly stated: "High-Performance Computing represents a major strategic challenge for Europe, as much industrial and technological as, of course, scientific. It is also one of the pillars of our digital autonomy."

Digital autonomy for European States will require the implementation of a network of supercomputers on their territory for all users in industry, research and the public sector.

The European Commission has identified HPC as one of the key pillars of the digital decade and decided to invest, together with Member States and industry, more than €8 billion in new-generation supercomputers under the EuroHPC Joint Undertaking.

Beyond supercomputers, European sovereignty is also conditioned by Europe's ability to produce processors of the best global standard, in order to reduce its dependence in this strategic area. Europe is also in the process of bringing together all the players involved (research organizations, small and large enterprises, public authorities) within digital ecosystems capable of mastering the technologies that will guarantee its competitiveness in the global economy.

Key technologies for all economic sectors

For Florence Parly, French Minister of the Armed Forces: "Artificial Intelligence, High-Performance Computing, Quantum computing and, more generally, breakthrough innovations linked to data are subjects of prime importance for the Ministry of the Armed Forces. They are therefore at the heart of innovation and investment strategies, with the aim of devoting €1 billion a year to them from 2022."

HPC in the COVID-19 era

During the roundtable discussion "How can digital technology serve health in the age of COVID-19?", major sponsors of the Forum Teratec discussed the contribution of HPC and HPDA to the health sector, with an obvious focus on the COVID-19 pandemic. They were thus able to demonstrate the value of these technologies in the management of the pandemic and in research for treatments and vaccines.

Innovation at the core of the 6th Trophies for Simulation and AI 2020

The 6th Simulation and AI 2020 Trophies, organized with L'Usine Digitale in partnership with Ansys, the CEA, Inria and Smart 4D, rewarded innovative projects or companies that have carried out an outstanding operation in the field of digital simulation, high-performance computing, Big Data or AI, or their application to healthcare. For each category, the winners are:

Closing the Forum Teratec, Daniel Verwaerde concluded: "The Forum Teratec 2020 has shown the major importance of HPC and HPDA for the management of the health crisis and for industrial recovery. I would like to thank the more than 1,200 participants who made it a remarkable success, and I look forward to seeing them again at the Forum Teratec 2021 next June 22 and 23."

https://teratec.eu/forum

Source: Teratec

Read the original:
Forum Teratec 2020 Gathered Experts in Simulation, HPC, Big Data and AI - HPCwire

Fellow at Sandia Labs Appointed to National Quantum Computing Advisory Committee – HPCwire

ALBUQUERQUE, N.M., Oct. 13, 2020 Sandia National Laboratories Fellow Gil Herrera has been appointed to the newly established U.S. National Quantum Initiative Advisory Committee.

Herrera is one of two committee members representing the Department of Energy national laboratories. He joins 20 others from government, industry and academia tasked with advising the nation's highest offices on matters concerning quantum information science. His appointment is for three years.

Quantum information science, a broad field of study that includes quantum computing, concerns machines that accomplish extraordinary tasks by manipulating matter at the smallest scales.

"Quantum computing represents both an exceptional opportunity and a dire threat," Herrera said. "On the positive side, when useful quantum computers can be built, they could solve molecular chemistry problems that could significantly reduce worldwide energy consumption or facilitate the rapid development of pharmaceuticals. On a more negative note, a quantum computer threatens the public key encryption that protects almost all secure web communications."

In August, Sandia and more than a dozen collaborators, collectively called the Quantum Systems Accelerator, were selected as one of five national quantum research centers.

The national advisory committee, established on Aug. 28, advises offices such as those of the president and the secretary of energy about how to maintain U.S. leadership in this area of technology.

"To me, leadership means that U.S. companies have the highest performing quantum computers, from qubits through apps, and the best quantum sensors and communication systems," Herrera said. "Of equal importance, the U.S. quantum information technologies are not reliant on supply chains or intellectual property outside of the U.S., and the benefits of the U.S. government investments in quantum information science extend to all Americans, including those who manufacture quantum computing, sensing and communications systems."

A qubit is the basic processing unit of a quantum computer, analogous to a bit in a conventional computer.

Of his new appointment, which he will hold concurrently with his position at Sandia, Herrera said, "I hope to help the program achieve a balance between the needs of scientific advancement, commercial interests of U.S. businesses, and national security interests."

Herrera has recently been coordinating COVID-19 research efforts across Sandia's 14,000-strong workforce. A Sandia fellow since 2018, he also has spearheaded efforts to expand discovery research, served on an independent review team for a U.S. Department of Defense microelectronics program and has mentored staff members ranging from new hires to directors.

He previously served as the director of Sandia's Microsystems Engineering, Science and Applications complex, which researches and produces quantum technology in addition to its main mission of producing specialized microelectronics for the nation's nuclear stockpile.

Herrera has been director of the Laboratory for Physical Sciences, a joint University of Maryland and U.S. government research institute, and served at the White House Office of Science and Technology Policy as an American Association for the Advancement of Science/Sloan Fellow under President George H.W. Bush, where he worked on semiconductor and technology-transfer policies.

He has received numerous awards for his service, including three Civilian Service medals from the Pentagon and the National Security Agency Research Medallion, and has received two distinguished alumni awards from the University of New Mexico.

Herrera earned his master's degree in electrical engineering from the University of California, Berkeley. An Albuquerque native, he received his bachelor's degree in computer engineering from UNM.

Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy's National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

Source: Sandia National Laboratories

Read the original:
Fellow at Sandia Labs Appointed to National Quantum Computing Advisory Committee - HPCwire

Semiconductor Industry Announces Research and Funding Priorities to Sustain U.S. Leadership in Chip Technology – goskagit.com

WASHINGTON, Oct. 15, 2020 /PRNewswire/ -- The Semiconductor Industry Association (SIA) and the Semiconductor Research Corporation (SRC) today released a preview of their upcoming "Decadal Plan for Semiconductors," a report outlining chip research and funding priorities over the next decade that will help strengthen U.S. semiconductor technology and spur growth in emerging technologies such as artificial intelligence, quantum computing, and advanced wireless communications. The Decadal Plan, developed with contributions from a broad cross-section of leaders in academia, government, and industry, identifies five "seismic shifts" shaping the future of chip technology and calls for an annual $3.4 billion federal investment over the next decade to fund semiconductor R&D across these five areas.

[DOWNLOAD THE INTERIM REPORT | ONE-PAGER]

"Federal government and private sector investments in semiconductor R&D have propelled the rapid pace of innovation in the U.S. semiconductor industry, spurring tremendous growth throughout the U.S. and global economies," said John Neuffer, SIA president and CEO. "As we enter a new era, however, a renewed focus on public-private research partnerships is necessary to address the seismic shifts facing chip technology. The federal government must invest ambitiously in semiconductor research to keep America on top in semiconductors and the game-changing future technologies they enable."

The Decadal Plan's proposed additional federal investment of $3.4 billion annually would strengthen the U.S. semiconductor industry's global leadership position, add $161 billion to U.S. GDP, and create half a million U.S. jobs in the next 10 years, according to findings from an earlier SIA study. The Decadal Plan makes specific recommendations on how this increased funding should be allocated, identifying the following seismic shifts that require a renewed focus on semiconductor research:

"The future holds unlimited potential for semiconductor technology, with emerging applications such as artificial intelligence, quantum computing, and advanced wireless technologies promising incalculable societal benefit," said Dr. Todd Younkin, SRC president and CEO. "The Decadal Plan provides a blueprint for how we can convert this potential into a reality. Working together, we can boost semiconductor technology to keep it strong, competitive, and at the tip of the spear of innovation."

The full Decadal Plan is scheduled to be published in December 2020. SIA and SRC will host a virtual workshop coinciding with the release of the full report. Learn more and download the interim report at http://www.src.org/decadalplan.

Media Contacts

Dan RossoSemiconductor Industry Association240-305-4738drosso@semiconductors.org

David HenshallSemiconductor Research Corporation919-941-9440david.henshall@src.org

About SIA

The Semiconductor Industry Association (SIA) is the voice of the semiconductor industry, one of America's top export industries and a key driver of America's economic strength, national security, and global competitiveness. Semiconductors, the tiny chips that enable modern technologies, power incredible products and services that have transformed our lives and our economy. The semiconductor industry directly employs nearly a quarter of a million workers in the United States, and U.S. semiconductor company sales totaled $193 billion in 2019. SIA represents 95 percent of the U.S. semiconductor industry by revenue and nearly two-thirds of non-U.S. chip firms. Through this coalition, SIA seeks to strengthen leadership of semiconductor manufacturing, design, and research by working with Congress, the Administration, and key industry stakeholders around the world to encourage policies that fuel innovation, propel business, and drive international competition. Learn more at www.semiconductors.org.

About SRC

Semiconductor Research Corporation (SRC), a world-renowned, high technology-based consortium, serves as a crossroads of collaboration between technology companies, academia, government agencies, and SRC's highly regarded engineers and scientists. Through its interdisciplinary research programs, SRC plays an indispensable part in addressing global challenges, using research and development strategies, advanced tools and technologies. Members of SRC work together synergistically, gaining access to research results, fundamental IP, and highly experienced students to compete in the global marketplace and build the workforce of tomorrow. Learn more at http://www.src.org.

See the original post here:
Semiconductor Industry Announces Research and Funding Priorities to Sustain U.S. Leadership in Chip Technology - goskagit.com

4 Reasons Why Now Is the Best Time to Start With Quantum Computing – Medium

Quantum computing is a rapidly developing field, with everyone trying to build the perfect hardware, find new applications for current algorithms, or even develop new algorithms. Because of that, demand for quantum programmers and researchers will increase in the near future.

Many governmental and industrial institutions have set aside substantial funds to develop quantum technologies. The Quantum Daily (TQD) estimated the current market for quantum computing to be around $235 million. This number is predicted to grow substantially to $6.25 billion by 2025.

This influx of funds is leading to an increase in the number of positions in academia, government, and industry. Almost all technology companies are adapting their business models in anticipation of the impact quantum technology will make.

TQD also adds that the U.S. Bureau of Labor Statistics estimates that in 2020 there are around 1.4 million more software development jobs than applicants who can fill them.

In 2019, MIT published an article called "Q&A: The talent shortage in quantum computing" that addressed the different challenges the field faces right now. Afterward, it developed MIT xPRO, a program addressing the reality that students aren't the only people interested in learning about the different aspects of quantum information.

More here:
4 Reasons Why Now Is the Best Time to Start With Quantum Computing - Medium

What is Quantum Computing, and How does it Help Us? – Analytics Insight

The term quantum computing gained momentum in the late 20th century. Quantum computers aim to exploit quantum-mechanical phenomena to become highly efficient, using quantum bits, or qubits, instead of the simple manipulation of ones and zeros in existing binary-based computers. Qubits can also exist in a third state, called superposition, that simultaneously represents both a one and a zero. Instead of analyzing a one or a zero sequentially, superposition allows two qubits to represent four scenarios at the same time. So we are at the cusp of a computing revolution where future systems have capability beyond mathematical calculations and algorithms.
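The "four scenarios at the same time" claim can be made concrete with a minimal NumPy sketch (a generic illustration, assuming NumPy is available): a classical 2-bit register holds exactly one of 00, 01, 10, 11, whereas a two-qubit state carries an amplitude for each of the four outcomes simultaneously.

```python
import numpy as np

# Hadamard gate: puts a single qubit into an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

zero = np.array([1.0, 0.0])   # one qubit, definitely |0>
plus = H @ zero               # equal superposition of 0 and 1

# Two such qubits: a vector of 2**2 = 4 amplitudes, one per outcome
# (00, 01, 10, 11), all present at once.
state = np.kron(plus, plus)
probs = np.abs(state) ** 2

print(state)   # four equal amplitudes of 0.5
print(probs)   # each outcome has probability 0.25
```

Each added qubit doubles the length of this vector, which is the source of quantum computing's potential (and, as discussed below, of the difficulty of simulating it classically).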

Quantum computers also follow the principle of entanglement, which Albert Einstein referred to as "spooky action at a distance." Entanglement refers to the observation that the state of particles from the same quantum system cannot be described independently of each other. Even when they are separated by great distances, they are still part of the same system.
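A small simulation (a generic NumPy illustration, not tied to any system mentioned here) shows what that correlation looks like: sampling joint measurements of the Bell state (|00> + |11>)/√2 only ever yields 00 or 11, so learning one qubit's result immediately fixes the other's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2): amplitudes over outcomes 00,01,10,11.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2      # [0.5, 0.0, 0.0, 0.5]

# Simulated joint measurements: 01 and 10 never occur, so the two
# qubits' outcomes are perfectly correlated however far apart
# the measurements are made.
shots = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
print(sorted(set(shots)))      # ['00', '11']
```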

Several nations, giant tech firms, universities, and startups are currently exploring quantum computing and its range of potential applications. IBM, Google, Microsoft, Amazon, and other companies are investing heavily in developing large-scale quantum computing hardware and software. Google and UCSB have a partnership to develop a 50-qubit computer; its full state would span 2^50 (roughly a quadrillion) numbers, which would take petabyte-scale memory on a modern computer to store. A petabyte is the unit above a terabyte and represents 1,024 terabytes, roughly equivalent to 4,000 digital photos taken every day for your entire life. Meanwhile, names like Rigetti Computing, D-Wave Systems, 1QBit Information Technologies, Quantum Circuits, QC Ware, and Zapata Computing are emerging as bigger players in quantum computing.
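The petabyte figure is easy to check with back-of-the-envelope arithmetic: a full n-qubit state vector has 2**n complex amplitudes, and at 16 bytes per amplitude (two 64-bit floats) 50 qubits already needs 16 PiB.

```python
# Memory needed to hold the full state vector of an n-qubit register
# on a classical machine, at 16 bytes per complex amplitude.
BYTES_PER_AMPLITUDE = 16

def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (40, 50):
    pib = statevector_bytes(n) / 1024 ** 5   # pebibytes
    print(f"{n} qubits: 2**{n} = {2 ** n:.3e} amplitudes -> {pib:g} PiB")
```

Every additional qubit doubles the requirement, which is why classical simulation of quantum machines stalls in the mid-40s of qubits even on the largest supercomputers.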

The IEEE Standards Association Quantum Computing Working Group is developing two technical standards for quantum computing. One covers quantum computing definitions and nomenclature, so we can all speak the same language. The other addresses performance metrics and benchmarking to measure quantum computers' performance against classical computers and, ultimately, each other. If required, new standards will be added over time.

The rapid growth in the quantum tech sector over the past five years has been exciting, because quantum computing presents immense potential. For instance, a quantum system can help scientists conduct virtual experiments and sift through vast amounts of data. Quantum algorithms can exploit quantum parallelism to perform a large number of computations simultaneously, while quantum interference combines their results into something meaningful that can be measured according to the laws of quantum mechanics. Even Chinese scientists are looking to develop a quantum internet, a more secure communication system in which information is stored and transmitted with advanced cryptography.
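The interference idea above can be seen in the smallest possible example (a generic NumPy sketch, not a specific algorithm from the article): a Hadamard gate puts a qubit into superposition, and applying it a second time makes the two computational paths to |1> cancel, so the qubit returns to |0> with certainty.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
zero = np.array([1.0, 0.0])                    # qubit starts in |0>

plus = H @ zero   # superposition: amplitudes ~[0.707, 0.707]
back = H @ plus   # interference: the +1/2 and -1/2 paths to |1> cancel

print(plus)       # both outcomes equally likely
print(back)       # ~[1, 0]: measuring now yields 0 with certainty
```

Real quantum algorithms arrange this cancellation so that wrong answers interfere destructively and the desired answer is left with high probability.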

Researchers at Case Western Reserve University used quantum algorithms to transform MRI scans for cancer, allowing the scans to be performed three times faster and to improve their quality by 30%. In practice, this can mean patients won't need to be sedated to stay still for the length of an MRI, and physicians could track the success of chemotherapy at the earliest stages of treatment.

The Laboratoire de Photonique Numérique et Nanosciences in France has built a hybrid device that pairs a quantum accelerometer with a classical one and uses a high-pass filter to subtract the classical data from the quantum data. This has the potential to offer a highly precise quantum compass that would eliminate the bias and scale-factor drifts commonly associated with gyroscopic components. Meanwhile, the University of Bristol has developed a quantum solution to increasing security threats. Researchers at the University of Virginia School of Medicine are working to uncover the potential quantum computers hold to help understand genetic diseases. Scientists are also using quantum computing to find a vaccine for COVID-19 and treatments for other life-threatening diseases.

In July 2017, in collaboration with commercial photonics tools provider M Squared, QuantIC demonstrated how a quantum gravimeter detects the presence of deeply hidden objects by measuring disturbances in the gravitational field. If such a device becomes practical and portable, the team believes it could become invaluable in an early-warning system for predicting seismic events and tsunamis.

Excerpt from:
What is Quantum Computing, and How does it Help Us? - Analytics Insight

Are You Ready for the Quantum Computing Revolution? – Harvard Business Review

Executive Summary

The quantum race is already underway. Governments and private investors all around the world are pouring billions of dollars into quantum research and development. Satellite-based quantum key distribution for encryption has been demonstrated, laying the groundwork for a potential quantum-security-based global communication network. IBM, Google, Microsoft, Amazon, and other companies are investing heavily in developing large-scale quantum computing hardware and software. Nobody is quite there yet. Even so, business leaders should consider developing strategies to address three main areas: 1) planning for quantum security, 2) identifying use cases for quantum computing, and 3) thinking through responsible design. By planning responsibly, while also embracing future uncertainty, businesses can improve their odds of being ready for the quantum future.

Quantum physics has already changed our lives. Thanks to the invention of the laser and the transistor, both products of quantum theory, almost every electronic device we use today is an example of quantum physics in action. We may now be on the brink of a second quantum revolution as we attempt to harness even more of the power of the quantum world. Quantum computing and quantum communication could impact many sectors, including healthcare, energy, finance, security, and entertainment. Recent studies predict a multibillion-dollar quantum industry by 2030. However, significant practical challenges need to be overcome before this level of large-scale impact is achievable.

Although quantum theory is over a century old, the current quantum revolution is based on the more recent realization that uncertainty, a fundamental property of quantum particles, can be a powerful resource. At the level of individual quantum particles, such as electrons or photons (particles of light), it's impossible to precisely know every property of the particle at any given moment in time. For example, the GPS in your car can tell you your location and your speed and direction all at once, and precisely enough to get you to your destination. But a quantum GPS could not simultaneously and precisely display all those properties of an electron, not because of faulty design, but because the laws of quantum physics forbid it. In the quantum world, we must use the language of probability, rather than certainty. And in the context of computing based on binary digits (bits) of 0s and 1s, this means that quantum bits (qubits) have some likelihood of being a 1 and some likelihood of being 0 at the same time.
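The superposition idea can be made concrete with a few lines of linear algebra. This illustrative sketch (not from the article) represents a qubit as a vector of two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

```python
import numpy as np

# A hypothetical single-qubit state: complex amplitudes for |0> and |1>.
# Measurement probabilities are the squared magnitudes of the amplitudes.
state = np.array([1, 1j]) / np.sqrt(2)   # an equal superposition

probs = np.abs(state) ** 2
assert np.isclose(probs.sum(), 1.0)      # probabilities always sum to 1
print(probs)  # [0.5, 0.5]: equal likelihood of measuring 0 or 1
```

A measurement forces the qubit into one definite outcome, with these probabilities; before measurement, both possibilities coexist.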

Such imprecision is at first disconcerting. In our everyday classical computers, 0s and 1s are associated with switches and electronic circuits turning on and off. Not knowing if they are exactly on or off wouldn't make much sense from a computing point of view. In fact, that would lead to errors in calculations. But the revolutionary idea behind quantum information processing is that quantum uncertainty, a fuzzy in-between superposition of 0 and 1, is actually not a bug, but a feature. It provides new levers for more powerful ways to communicate and process data.

One outcome of the probabilistic nature of quantum theory is that quantum information cannot be precisely copied. From a security lens, this is game-changing. Hackers trying to copy quantum keys used for encrypting and transmitting messages would be foiled, even if they had access to a quantum computer, or other powerful resources. This fundamentally unhackable encryption is based on the laws of physics, and not on the complex mathematical algorithms used today. While mathematical encryption techniques are vulnerable to being cracked by powerful enough computers, cracking quantum encryption would require violating the laws of physics.
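The article does not name a specific protocol, but the best-known quantum key distribution scheme, BB84, illustrates how unclonable quantum keys are agreed. The sketch below simulates only the classical bookkeeping of BB84 (random bits and bases, then "sifting"), assuming an ideal channel with no eavesdropper or noise:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 40  # number of photons Alice sends

# Alice encodes random bits in randomly chosen bases
# (0 = rectilinear, 1 = diagonal polarization).
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

# Bob measures each photon in his own random basis. When his basis
# matches Alice's he recovers her bit; otherwise his result is random.
bob_bases = rng.integers(0, 2, n)
bob_bits = np.where(bob_bases == alice_bases,
                    alice_bits,
                    rng.integers(0, 2, n))

# Sifting: publicly compare bases (not bits) and keep only the
# positions where they agreed; those bits form the shared key.
match = alice_bases == bob_bases
key_alice = alice_bits[match]
key_bob = bob_bits[match]
assert np.array_equal(key_alice, key_bob)  # identical keys, no eavesdropper
```

An eavesdropper measuring the photons in transit would unavoidably disturb them (she cannot copy them), raising the error rate in the sifted key and revealing her presence.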

Just as quantum encryption is fundamentally different from current encryption methods based on mathematical complexity, quantum computers are fundamentally different from current classical computers. The two are as different as a car and a horse and cart. A car is based on harnessing different laws of physics compared to a horse and cart. It gets you to your destination faster and to new destinations previously out of reach. The same can be said for a quantum computer compared to a classical computer. A quantum computer harnesses the probabilistic laws of quantum physics to process data and perform computations in a novel way. It can complete certain computing tasks faster, and can perform new, previously impossible tasks, such as quantum teleportation, where information encoded in quantum particles disappears in one location and is exactly (but not instantaneously) recreated in another location far away. While that sounds like sci-fi, this new form of data transmission could be a vital component of a future quantum internet.

A particularly important application of quantum computers might be to simulate and analyze molecules for drug development and materials design. A quantum computer is uniquely suited for such tasks because it would operate on the same laws of quantum physics as the molecules it is simulating. Using a quantum device to simulate quantum chemistry could be far more efficient than using the fastest classical supercomputers today.

Quantum computers are also ideally suited for solving complex optimization tasks and performing fast searches of unsorted data. This could be relevant for many applications, from sorting climate data or health or financial data, to optimizing supply chain logistics, or workforce management, or traffic flow.
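The search speed-up alluded to here is usually attributed to Grover's algorithm, which finds a marked item among N unsorted entries in roughly sqrt(N) steps instead of N. This toy state-vector simulation (an illustration, not production quantum code) shows the core oracle-plus-diffusion loop:

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Simulate Grover's algorithm on a classical state vector and
    return the most probable index after the optimal iteration count."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))           # uniform superposition
    iterations = int(round(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state[marked] *= -1                      # oracle: flip marked amplitude
        state = 2 * state.mean() - state         # diffusion: invert about mean
    return int(np.argmax(np.abs(state) ** 2))

print(grover_search(8, marked=93))  # locates item 93 among 256 in ~13 steps
```

A classical search over 256 unsorted entries needs on average 128 probes; the simulated Grover loop concentrates nearly all probability on the marked item after about sqrt(256) iterations.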

The quantum race is already underway. Governments and private investors all around the world are pouring billions of dollars into quantum research and development. Satellite-based quantum key distribution for encryption has been demonstrated, laying the groundwork for a potential quantum security-based global communication network. IBM, Google, Microsoft, Amazon, and other companies are investing heavily in developing large-scale quantum computing hardware and software. Nobody is quite there yet. While small-scale quantum computers are operational today, a major hurdle to scaling up the technology is the issue of dealing with errors. Compared to bits, qubits are incredibly fragile. Even the slightest disturbance from the outside world is enough to destroy quantum information. That's why most current machines need to be carefully shielded in isolated environments operating at temperatures far colder than outer space. While a theoretical framework for quantum error correction has been developed, implementing it in an energy- and resource-efficient manner poses significant engineering challenges.

Given the current state of the field, it's not clear when or if the full power of quantum computing will be accessible. Even so, business leaders should consider developing strategies to address three main areas:

The rapid growth in the quantum tech sector over the past five years has been exciting. But the future remains unpredictable. Luckily, quantum theory tells us that unpredictability is not necessarily a bad thing. In fact, two qubits can be locked together in such a way that individually they remain undetermined, but jointly they are perfectly in sync: either both qubits are 0 or both are 1. This combination of joint certainty and individual unpredictability, a phenomenon called entanglement, is a powerful fuel that drives many quantum computing algorithms. Perhaps it also holds a lesson for how to build a quantum industry. By planning responsibly, while also embracing future uncertainty, businesses can improve their odds of being ready for the quantum future.
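The combination of joint certainty and individual randomness described above can be seen numerically. In this sketch, sampling measurement outcomes from a Bell state shows each qubit alone behaving like a fair coin while the pair always agrees:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2): amplitudes over outcomes 00, 01, 10, 11.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2                  # [0.5, 0, 0, 0.5]

# Sample joint measurements: only 00 and 11 ever occur.
samples = rng.choice(4, size=1000, p=probs)
q0, q1 = samples // 2, samples % 2         # split into the two qubits

assert np.array_equal(q0, q1)              # the pair is always in sync
print(q0.mean())  # close to 0.5: each qubit alone looks like a coin flip
```

No outcome of 01 or 10 is ever observed, yet neither qubit's individual result can be predicted, which is exactly the entanglement property the article describes.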

Read more from the original source:
Are You Ready for the Quantum Computing Revolution? - Harvard Business Review

OSFI’s Consultation on Technology: Understanding the risks inherent in the technologies that power the financial industry – Lexology

INTRODUCTION

On September 15, 2020, the Office of the Superintendent of Financial Institutions (OSFI) released a discussion paper regarding technology risks in the financial sector. The paper, Developing financial sector resilience in a digital world: Selected themes in technology and related risks, focuses on digital risks arising from cybersecurity, data analytics, third party ecosystems and data. Today, technology and data are central to the operations of federally regulated entities (FREs). In the paper, OSFI focuses on several of these technologies, including quantum computing, artificial intelligence, cloud computing, and data. OSFI poses questions in areas that it wishes to investigate further, potentially signaling OSFI's interest in collaborating with stakeholders to develop guidance that balances the safety and soundness of the Canadian financial sector against the needs of the sector to innovate.

The paper is something that should not be taken lightly or ignored. OSFI has requested stakeholder comments on the paper by December 15, 2020. These comments will likely form the basis for further consultations before OSFI tables any firm proposals. Any new guidance from OSFI purporting to regulate technology and related risks could therefore have wide ranging impacts on the financial sector, including in connection with the following:

Financial institutions have long been seen to be powered by and dependent on a vast array of digital technologies. The ability of financial institutions to reliably deliver critical products and services during the COVID-19 pandemic is but one recent example of how financial institutions are successfully harnessing the power of digital technologies to deliver flexible, reliable and powerful products and services. With that said, this increasing reliance on digital technologies could trigger or amplify operational and financial risks to financial institutions. OSFI indicates that it is assessing the merits of a focus on operational resilience objectives with respect to technology and related risks and believes that a holistic view of operational risk management and operational resilience is warranted.

This consultation is a continuation of earlier work by OSFI to identify and mitigate risks presented from digital technologies, including:

PRIORITY TECHNOLOGY RISK AREAS IDENTIFIED BY OSFI

The discussion paper focuses on principles related to three priority areas: cyber security, advanced analytics and third party ecosystems. As data is foundational to each of these areas, the discussion paper also includes a separate discussion on data risk. OSFI intends on using these principles as a basis for building out more specific regulatory expectations in these areas going forward.

Cyber Security

The cyber security principle focuses on the confidentiality, integrity and availability of information. This builds on the existing work from OSFI related to cyber security, including the 2013 Cyber Security Self-Assessment Guidance, the 2019 advisory regarding cyber incident reporting and the ongoing circulation of Intelligence Bulletins and Technology Risk Bulletins that are intended to complement OSFI's guidelines and advisories. OSFI notes that it continues to observe gaps in many financial institutions' cyber security policies, procedures and capabilities, and many opportunities exist for improvement.

As part of this principle, OSFI flags two specific points of focus:

Advanced Analytics

OSFI notes that advanced analytics, and in particular the use of artificial intelligence (AI) and machine learning (ML) models, present a novel set of opportunities and risks. OSFI intends on using the stakeholder feedback received from this discussion paper to inform the development of regulatory and supervisory frameworks that address the risks resulting from the use of AI and ML. OSFI has identified soundness, explainability and accountability as being core principles to manage elevated risks associated with advanced analytics, including AI and ML. Through the consultation, OSFI seeks feedback on whether these three principles appropriately capture such elevated risks or whether there are any additional principles or risks that should be considered.

Third Party Ecosystems

OSFI has long sought to manage the risks presented by reliance by financial institutions on third party ecosystems, most notably through Guideline B-10. OSFI notes that while the existing principles in Guideline B-10 remain relevant, those guidelines and expectations require review. Areas of specific interest that are noted include:

OSFI will be undertaking a separate consultation process related to the expectations contained in Guideline B-10 which will be informed by the findings of this consultation.

Data

The overarching concept of data is the final area covered by the discussion paper, and in particular how to maintain sound data management and governance throughout the data lifecycle. The areas of focus highlighted are:

Originally posted here:
OSFI's Consultation on Technology: Understanding the risks inherent in the technologies that power the financial industry - Lexology

The Hyperion-insideHPC Interviews: ORNL Distinguished Scientist Travis Humble on Coupling Classical and Quantum Computing – insideHPC

Oak Ridge National Lab's Travis Humble has worked at the headwaters of quantum computing research for years. In this interview, he talks about his particular areas of interest, including integration of quantum computing with classical HPC systems. "We've already recognized that we can accelerate solving scientific applications using quantum computers," he says. "These demonstrations are just early examples of how we expect quantum computers can take us to the most challenging problems for scientific discovery."

In This Update. From the HPC User Forum Steering Committee

By Steve Conway and Thomas Gerard

After the global pandemic forced Hyperion Research to cancel the April 2020 HPC User Forum planned for Princeton, New Jersey, we decided to reach out to the HPC community in another way: by publishing a series of interviews with leading members of the worldwide HPC community. Our hope is that these seasoned leaders' perspectives on HPC's past, present and future will be interesting and beneficial to others. To conduct the interviews, Hyperion Research engaged insideHPC Media. We welcome comments and questions addressed to Steve Conway, sconway@hyperionres.com, or Earl Joseph, ejoseph@hyperionres.com.

This interview is with Travis Humble, Deputy Director at the Department of Energy's Quantum Science Center, a Distinguished Scientist at Oak Ridge National Laboratory, and director of the lab's Quantum Computing Institute. Travis is leading the development of new quantum technologies and infrastructure to impact the DOE mission of scientific discovery through quantum computing. He is editor-in-chief for ACM Transactions on Quantum Computing, Associate Editor for Quantum Information Processing, and co-chair of the IEEE Quantum Initiative. Travis also holds a joint faculty appointment with the University of Tennessee Bredesen Center for Interdisciplinary Research and Graduate Education, where he works with students in developing energy-efficient computing solutions. He received his doctorate in theoretical chemistry from the University of Oregon before joining ORNL in 2005.

The HPC User Forum was established in 1999 to promote the health of the global HPC industry and address issues of common concern to users. More than 75 HPC User Forum meetings have been held in the Americas, Europe and the Asia-Pacific region since the organization's founding in 2000.

Doug Black: Hi, everyone. I'm Doug Black. I'm editor-in-chief at insideHPC, and today we are talking with Dr. Travis Humble. He is a distinguished scientist at Oak Ridge National Lab, where he is director of the lab's Quantum Computing Institute. Dr. Humble, welcome. Thanks for joining us today.

Travis Humble: Thanks for having me on, Doug.

Black: Travis, tell us, if you would, the area of quantum computing that you're working in and the research that you're doing that you're most excited about, that has what you would regard as the greatest potential.

Humble: Quantum computing is a really exciting area, so it's really hard to narrow it down to just one example. This is the intersection of quantum information (quantum mechanics) with computer science.

We've already recognized that we can accelerate solving scientific applications using quantum computers. At Oak Ridge, for example, we have already demonstrated examples in chemistry, material science and high-energy physics, where we can use quantum computers to solve problems in those areas. These demonstrations are just early examples of how we expect quantum computers can take us to the most challenging problems for scientific discovery.

My own research is actually focused on how we could integrate quantum computers with high-performance computing systems. Of course, we are adopting an accelerator model at Oak Ridge, where we are thinking about using quantum processors to offload the most challenging computational tasks. Now, this seems like an obvious approach; the best of both worlds. But the truth is that there are a lot of challenges in bringing those two systems together.

Black: It sounds like sort of a hybrid approach, almost a CPU/GPU, only we're talking about systems writ large. Tell us about DOE's and Oak Ridge's overall quantum strategy and how the Quantum Computing Institute works with vendors and academic institutions on quantum technology development.

Humble: The Oak Ridge National Laboratory has played an important role within the DOE's national laboratory system, a leading role in both research and infrastructure. In 2018, the President announced the National Quantum Initiative, which is intended to accelerate the development of quantum science and technology in the United States. Oak Ridge has taken the lead in the development of research, especially software applications and hardware, for how quantum computing can address scientific discovery.

At the same time, we've helped DOE establish a quantum computing user program; something we call QCUP. This is administered through the Oak Ridge Leadership Computing Facility and it looks for the best of the best in terms of approaches to how quantum computers could be used for scientific discovery. We provide access to the users through the user program in order for them to test and evaluate how quantum computers might be used to solve problems in basic energy science, nuclear physics, and other areas.

Black: Okay, great. So how far would you say we are from practical quantum computing and from what is referred to as quantum advantage, where quantum systems can run workloads faster than conventional or classical supercomputers?

Humble: This is such a great question. Quantum advantage, of course, is the idea that a quantum computer would be able to outperform any other conventional computing system on the planet. Very early in this fiscal year, back in October, there was an announcement from Google where they actually demonstrated an example of quantum advantage using their quantum computing hardware processor. Oak Ridge was part of that announcement, because we used our Summit supercomputer system as the baseline to compare that calculation.

But here's the rub: the Google demonstration was primarily a diagnostic check that their processor was behaving as expected, and the Summit supercomputer actually couldn't keep up with that type of diagnostic check. But when we look at the practical applications of quantum computing, still focusing on problems in chemistry, material science and other scientific disciplines, we appear to still be a few years away from demonstrating a quantum advantage for those applications. This is one of the hottest topics in the field at the moment, though. Once somebody can identify that, we expect to see a great breakthrough in how quantum computers can be used in these practical areas.

Black: Okay. So, how did you become involved in quantum in the first place? Tell us a little bit about your background in technology.

Humble: I started early on studying quantum mechanics through chemistry. My focus, early on in research, was on theoretical chemistry and understanding how molecules behave quantum mechanically. What has turned out to be one of the greatest ironies of my career is that quantum computers are actually significant opportunities to solve chemistry problems using quantum mechanics.

So I got involved in quantum computing relatively early. Certainly, the last 15 years or so have been a roller coaster ride, mainly going uphill in terms of developing quantum computers and looking at the question of how they can intersect with high-performance computing. Being at Oak Ridge, that's just a natural question for me to come across. I work every day with people who are using some of the world's fastest supercomputers in order to solve the same types of problems that we think quantum computers would be best at. So for me, the intersection between those two areas just seems like a natural path to go down.

Black: I see. Are there any other topics related to all this that you'd like to add?

Humble: I think that quantum computing has a certain mystique around it. It's an exciting area and it relies on a field of physics that many people don't yet know about, but I certainly anticipate that in the future that's going to change. This is a topic that is probably going to impact everyone's lives. Maybe it's 10 years from now, maybe it's 20 years, but it's certainly something that I think we should start preparing for in the long term, and Oak Ridge is really happy to be one of the places that is helping to lead that change.

Black: Thanks so much for your time. It was great to talk with you.

Humble: Thanks so much, Doug. It was great to talk with you, too.

Read more from the original source:
The Hyperion-insideHPC Interviews: ORNL Distinguished Scientist Travis Humble on Coupling Classical and Quantum Computing - insideHPC

Putting the Quantum in Security – Optics & Photonics News

Grégoire Ribordy [Image: Courtesy of ID Quantique]

In the second day of OSA's Quantum 2.0 conference, the focus shifted from quantum computing to other aspects of quantum technology, particularly quantum communications and quantum sensing. On that note, Grégoire Ribordy, the founder of the Switzerland-based quantum crypto firm ID Quantique, looked at how quantum technologies are being employed for the long-term challenges in data security posed by quantum computing itself.

ID Quantique has a long pedigree in quantum technology; the company has been in business since 2001. "In retrospect," Ribordy said, "we were really crazy to start a company in quantum technology in 2001. It was way too early." But the firm forged ahead and has now developed a suite of applications in the data-security space.

Ribordy stressed that, especially over the past few months, it's become increasingly clear that digital security, and protecting digital information against hacking, is extremely important. Classical cryptography assembles a set of techniques for hiding information from unauthorized users, which Ribordy compared to building a castle around the data.

The problem, however, is that after quantum computers become reality, one application for them will be to crack the cryptography systems that are currently in use. When that happens, said Ribordy, the walls we have today won't be able to protect the data anymore. The best cryptography techniques for avoiding that baleful outcome, he suggested, are those that themselves rely on quantum technology, and that can provide robust protection while still allowing the convenience of the prevailing classical private-key encryption systems.

[Image: Grégoire Ribordy/OSA Quantum 2.0 Conference]

Just how much one should worry about all of this now, when quantum computers powerful enough to do this sort of cracking still lie years in the future, depends, according to Ribordy, on three factors. One, which he labeled factor x, is how long you need current data to be encrypted: perhaps only a short time for some kinds of records, decades for other kinds. The second, y, is the time that it will take to retool the current infrastructure to be transformed into something quantum-safe. And the third, z, is how long it will actually take for large-scale, encryption-breaking quantum computers to be built.

If x and/or y are longer than z, he suggested, we have a problem, and there's a lot of debate today surrounding just what the values of these variables are. One of ID Quantique's services is to take clients through a quantum risk assessment that attempts to suss out how long they need to protect their data, and what the implications are for their cryptography approach.
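Ribordy's x/y/z framing reduces to a one-line check. The sketch below encodes the rule as he states it (x or y exceeding z signals trouble); the function and variable names are illustrative, and a common stricter variant in the literature, Mosca's inequality, instead flags x + y > z:

```python
def at_risk(x, y, z):
    """Ribordy's rule of thumb: trouble if x (years the data must stay
    secret) or y (years to migrate infrastructure to quantum-safe
    crypto) exceeds z (years until encryption-breaking quantum
    computers exist).  Mosca's stricter variant would test x + y > z."""
    return x > z or y > z

# Example: medical records must stay confidential for 25 years, but a
# code-breaking quantum computer may arrive in 15 -> exposed today.
print(at_risk(x=25, y=5, z=15))   # True
print(at_risk(x=5, y=3, z=15))    # False
```

The intuition: data harvested and stored today can be decrypted later, so long-lived secrets are already at risk even before such machines exist.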

Ribordy cited three key components to effective long-term quantum encryption. One, and perhaps the oldest, is quantum random number generation (QRNG) to build security keys, whether classical or quantum. A second is something that Ribordy called crypto-agility. ("You don't hard-code cryptography," he explained. "Instead, you want to upgrade it whenever a new advance comes.") And the third component is quantum key distribution (QKD), which is a technique still under active development, but which is already being deployed in some cases.

On the first component, Ribordy noted that ID Quantique has been active in QRNG since 2014, when the idea arose of using mobile-phone camera sensors as a source for QRNs. These arrays of pixels, he said, can provide both large rates of raw entropy (an obvious necessity for true randomness) and an industry-compatible interface. He walked the audience through the company's efforts to create a low-cost (CMOS-based), low-power, security-compliant chip for QRNG, beginning with early experiments using a Nokia phone and moving through the required efforts at miniaturization, engineering for stability and consistency, and avoiding such pitfalls as correlations between the different camera pixels, which would degrade the randomness of the output.
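One standard post-processing step for sensor-derived randomness (an illustration, not necessarily the technique ID Quantique uses) is von Neumann debiasing, which turns biased but independent raw bits, such as thresholded pixel noise, into unbiased output bits:

```python
def von_neumann_extract(bits):
    """Debias a stream of possibly biased (but independent) raw bits:
    read non-overlapping pairs, emit the first bit of each unequal
    pair ('01' -> 0, '10' -> 1), and drop '00' and '11' entirely.
    The output is unbiased whenever the input bits are i.i.d."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

raw = [1, 1, 0, 1, 1, 0, 0, 0, 0, 1]   # e.g. thresholded camera noise
print(von_neumann_extract(raw))         # [0, 1, 0]
```

The cost of the debiasing is throughput: at best a quarter of the raw bits survive, which is one reason high raw entropy rates, like those of a pixel array, matter.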

The result, Ribordy said, is a QRNG chip that has recently been added to a new Samsung mobile phone, appropriately named the Galaxy A71 Quantum, that is now available in the Republic of Korea. And the chip is not just window dressing: a Korean software company partnered with Samsung to create apps for pay services, cryptocurrency services and other features that rely on random numbers, and that use the ID Quantique chip to get high-quality instances of them.

Grégoire Ribordy, at the OSA Quantum 2.0 conference.

"We think this is very important," said Ribordy, "because it shows that quantum technologies can be industrialized and integrated into applications."

In terms of such industrialization, another security application, quantum key distribution (QKD), is not as advanced as QRNG, according to Ribordy, but he argued that the experience of QRNG bodes well for QKD's commercialization path. One issue for QKD is the short distance over which such secure links can exist in fiber before quantum bit error rates become too high, though Ribordy pointed to a recent paper in Nature Photonics in which practical QKD was demonstrated across a fiber link of 307 km.

Ribordy noted a number of areas of particular activity in the QKD sphere. One active area of interest, for example, is developing network topologies that play especially well with QKD. ID Quantique is also working with SK Telecom in the Republic of Korea on how QKD can be integrated into the optical networks underlying next-generation, 5G wireless. In these circumstances, the proverbial last mile, operating at radio frequencies, can only be secured with traditional cryptography, but using QKD on the optical part of the communication chain will make the network as a whole more secure.

A number of other projects are in the works as well, Ribordy said, including a European project, Open QKD, the goal of which is to prepare the next generation of QKD deployment in Europe. And large-scale deployment projects are afoot in China as well.

The presence of these diverging global efforts prompted a question in the Q&A session that followed Ribordy's talk: just how open are these QKD markets? Ribordy noted that, in the near term, they are closing down. Since quantum is a new industry, every country or region would like to be a player. The Chinese QKD ecosystem, he suggested, is completely cut off; there is kind of a Galapagos effect, and Europe also is starting to become a more closed ecosystem in the QKD arena. Ribordy views this as a sign of market immaturity, however, and believes things will become more open again in the future with efforts toward certification and standardization.

See the original post:
Putting the Quantum in Security - Optics & Photonics News

Assistant Professor in Computer Science job with Indiana University | 286449 – The Chronicle of Higher Education

The Luddy School of Informatics, Computing, and Engineering at Indiana University (IU) Bloomington invites applications for a tenure-track assistant professor position in Computer Science to begin in Fall 2021. We are particularly interested in candidates with research interests in formal models of computation, algorithms, information theory, and machine learning with connection to quantum computation, quantum simulation, or quantum information science. The successful candidate will also be a Quantum Computing and Information Science Faculty Fellow, supported in part for the first three years by an NSF-funded program that aims to grow academic research capacity in the computing and information science fields to support advances in quantum computing and/or communication over the long term. For additional information about the NSF award please visit: https://www.nsf.gov/awardsearch/showAward?AWD_ID=1955027&HistoricalAwards=false. The position allows the faculty member to collaborate actively with colleagues from a variety of outside disciplines, including the departments of physics, chemistry, mathematics and intelligent systems engineering, under the umbrella of the Indiana University funded "quantum science and engineering center" (IU-QSEc). We seek candidates prepared to contribute to our commitment to diversity and inclusion in higher education, especially those with experience in teaching or working with diverse student populations. Duties will include research, teaching multi-level courses both online and in person, participating in course design and assessment, and service to the School. Applicants should have a demonstrable potential for excellence in research and teaching and a PhD in Computer Science or a related field expected before August 2021. Candidates should review application requirements, learn more about the Luddy School and apply online at: https://indiana.peopleadmin.com/postings/9841. For full consideration, submit the online application by December 1, 2020.
Applications will be considered until the positions are filled. Questions may be sent to sabry@indiana.edu. Indiana University is an equal employment and affirmative action employer and a provider of ADA services. All qualified applicants will receive consideration for employment without regard to age, ethnicity, color, race, religion, sex, sexual orientation, gender identity or expression, genetic information, marital status, national origin, disability status or protected veteran status.

Link:
Assistant Professor in Computer Science job with Indiana University | 286449 - The Chronicle of Higher Education

Spin-Based Quantum Computing Breakthrough: Physicists Achieve Tunable Spin Wave Excitation – SciTechDaily

Magnon excitation. Credit: Daria Sokol/MIPT Press Office

Physicists from MIPT and the Russian Quantum Center, joined by colleagues from Saratov State University and Michigan Technological University, have demonstrated new methods for controlling spin waves in nanostructured bismuth iron garnet films via short laser pulses. Presented in Nano Letters, the solution has potential for applications in energy-efficient information transfer and spin-based quantum computing.

A particle's spin is its intrinsic angular momentum, which always has a direction. In magnetized materials, the spins all point in one direction. A local disruption of this magnetic order is accompanied by the propagation of spin waves, whose quanta are known as magnons.

Unlike an electrical current, spin wave propagation does not involve a transfer of matter. As a result, using magnons rather than electrons to transmit information leads to much smaller thermal losses. Data can be encoded in the phase or amplitude of a spin wave and processed via wave interference or nonlinear effects.

Simple logical components based on magnons are already available as sample devices. However, one of the challenges of implementing this new technology is the need to control certain spin wave parameters. In many regards, exciting magnons optically is more convenient than by other means, with one of the advantages presented in the recent paper in Nano Letters.

The researchers excited spin waves in a nanostructured bismuth iron garnet. Even without nanopatterning, that material has unique optomagnetic properties. It is characterized by low magnetic attenuation, allowing magnons to propagate over large distances even at room temperature. It is also highly optically transparent in the near infrared range and has a high Verdet constant.

The film used in the study had an elaborate structure: a smooth lower layer with a one-dimensional grating formed on top, with a 450-nanometer period (Fig. 1). This geometry enables the excitation of magnons with a very specific spin distribution, which is not possible for an unmodified film.

To excite magnetization precession, the team used linearly polarized pump laser pulses, whose characteristics affected spin dynamics and the type of spin waves generated. Importantly, wave excitation resulted from optomagnetic rather than thermal effects.

Schematic representation of spin wave excitation by optical pulses. The laser pump pulse generates magnons by locally disrupting the ordering of spins (shown as violet arrows) in bismuth iron garnet (BiIG). A probe pulse is then used to recover information about the excited magnons. GGG denotes gadolinium gallium garnet, which serves as the substrate. Credit: Alexander Chernov et al./Nano Letters

The researchers relied on 250-femtosecond probe pulses to track the state of the sample and extract spin wave characteristics. A probe pulse can be directed to any point on the sample with a desired delay relative to the pump pulse. This yields information about the magnetization dynamics at a given point, which can be processed to determine the spin wave's spectral frequency, type, and other parameters.
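How a delay scan yields a spectral frequency can be sketched in a few lines (a hedged illustration with made-up numbers, not data from the paper): the probe samples the magnetization at a series of delays, and a Fourier transform of that trace reveals the precession frequency.

```python
import numpy as np

# Simulate a pump-probe trace: damped precession at an assumed 5 GHz,
# sampled over 2 ns of probe delay. All values are illustrative.
delays = np.linspace(0.0, 2e-9, 2000)        # probe delays: 0 to 2 ns
f_spin = 5e9                                 # assumed spin-wave frequency
tau = 1e-9                                   # damping time (low attenuation)
trace = np.exp(-delays / tau) * np.cos(2 * np.pi * f_spin * delays)

# The peak of the Fourier spectrum recovers the spin-wave frequency.
spectrum = np.abs(np.fft.rfft(trace))
freqs = np.fft.rfftfreq(len(delays), d=delays[1] - delays[0])
peak = freqs[np.argmax(spectrum)]
print(f"{peak / 1e9:.1f} GHz")
```

The frequency resolution of such a scan is set by its total span (here 1/(2 ns) = 0.5 GHz), which is why long probe delays and low magnetic attenuation matter.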

Unlike the previously available methods, the new approach enables controlling the generated wave by varying several parameters of the laser pulse that excites it. In addition, the geometry of the nanostructured film allows the excitation center to be localized in a spot about 10 nanometers in size. The nanopattern also makes it possible to generate multiple distinct types of spin waves. The angle of incidence, wavelength, and polarization of the laser pulses enable resonant excitation of the sample's waveguide modes, which are determined by the nanostructure characteristics, so the type of spin wave excited can be controlled. Each of these optical-excitation parameters can be varied independently to produce the desired effect.

"Nanophotonics opens up new possibilities in the area of ultrafast magnetism," said the study's co-author, Alexander Chernov, who heads the Magnetic Heterostructures and Spintronics Lab at MIPT. "The creation of practical applications will depend on being able to go beyond the submicrometer scale, increasing operation speed and the capacity for multitasking. We have shown a way to overcome these limitations by nanostructuring a magnetic material. We have successfully localized light in a spot a few tens of nanometers across and effectively excited standing spin waves of various orders. This type of spin wave enables devices operating at high frequencies, up to the terahertz range."

The paper experimentally demonstrates an improved launch efficiency and the ability to control spin dynamics under optical excitation by short laser pulses in a specially designed nanopatterned film of bismuth iron garnet. It opens up new prospects for magnetic data processing and quantum computing based on coherent spin oscillations.

Reference: "All-Dielectric Nanophotonics Enables Tunable Excitation of the Exchange Spin Waves" by Alexander I. Chernov, Mikhail A. Kozhaev, Daria O. Ignatyeva, Evgeniy N. Beginin, Alexandr V. Sadovnikov, Andrey A. Voronov, Dolendra Karki, Miguel Levy and Vladimir I. Belotelov, 9 June 2020, Nano Letters. DOI: 10.1021/acs.nanolett.0c01528

The study was supported by the Russian Ministry of Science and Higher Education.

Source: Spin-Based Quantum Computing Breakthrough: Physicists Achieve Tunable Spin Wave Excitation – SciTechDaily

Microsoft's Big Win in Quantum Computing Was an Error After All – WIRED

Whatever happened, the Majorana drama is a setback for Microsoft's ambitions to compete in quantum computing. Leading computing companies say the technology will define the future by enabling new breakthroughs in science and engineering.

Quantum computers are built from devices called qubits that encode the 1s and 0s of data but can also use a quantum state called a superposition to perform math tricks not possible for the bits in a conventional computer. The main challenge to commercializing that idea is that quantum states are delicate and easily quashed by thermal or electromagnetic noise, making qubits error-prone.
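The superposition idea the article alludes to can be sketched in plain NumPy (a minimal illustration, not tied to any vendor's hardware or quantum SDK): a qubit's state is a normalized two-component vector, and a gate such as the Hadamard turns a definite 0 into an equal mix of 0 and 1.

```python
import numpy as np

# Basis states |0> and |1> as unit vectors.
ket0 = np.array([1.0, 0.0])

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)
state = H @ ket0

# By the Born rule, measurement probabilities are the squared amplitudes:
# both outcomes come out equally likely.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]
```

The "delicate" part is that noise scrambles the relative phase of those amplitudes, which is exactly what makes physical qubits error-prone.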

Google, IBM, and Intel have all shown off prototype quantum processors with around 50 qubits, and companies including Goldman Sachs and Merck are testing the technology. But thousands or millions of qubits are likely required for useful work. Much of a quantum computer's power would probably have to be dedicated to correcting its own glitches.
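Why correction consumes so much capacity can be seen in the simplest classical analogy, the 3-bit repetition code (real quantum codes are more subtle, but the overhead logic is the same): one logical bit is stored in three physical bits and recovered by majority vote, so any single flip is tolerated at the cost of tripling the hardware.

```python
import random

def encode(bit):
    """One logical bit -> three physical copies."""
    return [bit, bit, bit]

def noisy(bits, p):
    """Flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit despite one flip."""
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
trials, p = 10_000, 0.05  # illustrative per-bit error rate
raw_errors = sum(random.random() < p for _ in range(trials))
coded_errors = sum(decode(noisy(encode(0), p)) != 0 for _ in range(trials))
print(raw_errors, coded_errors)
```

With a per-bit error rate p, the coded error rate falls to roughly 3p², which is why error correction helps enormously yet devours most of a machine's qubits.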

Microsoft has taken a different approach, claiming qubits based on Majorana particles will be more scalable, allowing it to leap ahead. But after more than a decade of work, it does not have a single qubit.

From the fuller data, there's no doubt that there's no Majorana.

Sergey Frolov, University of Pittsburgh

Majorana fermions are named after Italian physicist Ettore Majorana, who hypothesized in 1937 that particles should exist with the odd property of being their own antiparticles. Not long after, he boarded a ship and was never seen again. Physicists wouldn't report a good glimpse of one of his eponymous particles until the next millennium, in Kouwenhoven's lab.

Microsoft got interested in Majoranas after company researchers in 2004 approached tech strategy chief Craig Mundie and said they had a way to solve one problem holding back quantum computers: the flakiness of qubits.

The researchers seized on theoretical physics papers suggesting a way to build qubits that would make them more dependable. These so-called topological qubits would be built around unusual particles, of which Majorana particles are one example, that can pop into existence in clumps of electrons inside certain materials at very low temperatures.

Microsoft created a new team of physicists and mathematicians to flesh out the theory and practice of topological quantum computing, centered on an outpost in Santa Barbara, California, christened Station Q. They collaborated with and funded leading experimental physicists hunting for the particles needed to build this new form of qubit.

Kouwenhoven, in Delft, was one of the physicists who got Microsoft's backing. His 2012 paper reporting signatures of Majorana particles inside nanowires started chatter about a future Nobel Prize for proving the elusive particles' existence. In 2016, Microsoft stepped up its investment, and the hype.


Kouwenhoven and another leading physicist, Charles Marcus, at the University of Copenhagen were hired as corporate Majorana hunters. The plan was to first detect the particles and then invent more complex devices that could control them and function as qubits. Todd Holmdahl, who previously led hardware for Microsoft's lucrative Xbox games console, took over as leader of the topological quantum computing project. Early in 2018, he told Barron's he would have a topological qubit by the end of the year. The now-disputed paper appeared a month later.

While Microsoft sought Majoranas, competitors working on established qubit technologies reported steady progress. In 2019, Google announced it had reached a milestone called quantum supremacy, showing that a chip with 53 qubits could perform a statistical calculation in minutes that would take a supercomputer millennia. Soon after, Microsoft appeared to hedge its quantum bet, announcing it would offer access to quantum hardware from other companies via its cloud service Azure. The Wall Street Journal reported that Holmdahl left the project that year after missing an internal deadline.

Microsoft has been quieter about its expected pace of progress on quantum hardware since Holmdahl's departure. Competitors in quantum computing continue to tout hardware advances and urge software developers to access prototypes over the internet, but none appear close to creating a quantum computer ready for prime time.
