Hybrid Cloud is the future of IT in the Post COVID-19 World – CXOToday.com

The pace of cloud adoption has intensified during the coronavirus pandemic as organizations move to remote working and almost everything goes digital. In such a scenario, hybrid cloud, which offers the best of both worlds thanks to its inherent flexibility, agility and efficiency, is seeing phenomenal growth. In a recent interaction with CXOToday, Gurpreet Singh, Managing Director, Arrow PC Network, explains why hybrid cloud is the future of IT in the post-COVID-19 world.

CXOToday: What are the current trends shaping the cloud landscape, both in the country and globally?

Gurpreet Singh: The increase in demand for infrastructure has influenced cloud services, and we especially see growth in hybrid cloud adoption. As hybrid is a combination of public and private cloud platforms, it provides the best of both, and with the barriers between the platforms disappearing due to technological advancement, adoption of hybrid cloud is only going to increase. Besides, cloud is also being influenced by privacy-preserving multi-party analytics in the public cloud, hardware-based security, homomorphic encryption and IoT-based services. Apart from that, serverless computing, omni-cloud, quantum computing and Kubernetes are the latest trends shaping up the cloud landscape. Globally, the influence of these trends on the cloud will only increase, allowing seamless operation of systems across the globe in organizations of all sizes. The cloud market is also betting high on mobile cloud, where mobile applications are built, operated and hosted with cloud technology.

CXOToday: Could you give your views on how cloud adoption has picked up during the ongoing pandemic and how it is impacting cloud service providers?

Gurpreet Singh: There was a time, a decade or two ago, when cloud was considered an inconsequential expense, but now it is a necessity. Due to the COVID-19 pandemic, there was a radical shift in the work environment. Many enterprises chose to downsize their offices and shifted to cloud services. This increased business for cloud service providers, as customers at all levels started adopting cloud services in different flavors. For example, organizations are using cloud automation to increase their online presence by developing commerce websites on cloud platforms. The SaaS segment, which spans service desks, accounting packages, customer relationship management, human resource management and enterprise resource planning, kept growing even during the COVID times. Healthcare systems needed scalable and secure cloud infrastructure to manage and maintain patient information with high speed and flexibility, which certainly helped during the pandemic.

CXOToday: With teams working remotely, business continuity and security are the biggest concerns for any enterprise at present. How can cloud address these issues while also ensuring flexibility and seamless operation for employees?

Gurpreet Singh: The centralized workspace is a thing of the past. Millennials love to work from anywhere, as opposed to the traditional format of working from a cubicle in an office or from home, and this trend is only set to rise. Integrating a workforce of digital natives with the traditional technological generation will certainly give an organization an edge. To meet millennial work-environment expectations, business continuity layered with security will be the topmost priority for enterprises. Cloud was born to provide exactly this, not only in terms of new work environments but also flexibility in cost and usage. Despite the lack of on-site IT personnel, enterprises were able to leverage cloud capabilities to check, maintain and monitor their server and storage installations in data centers, keeping their workforce functioning without interruption and thus ensuring business continuity. These are challenging times: businesses face uncertainty and wish to plan in months rather than years, so features like pay-as-you-use and grow-as-you-require are receiving more visibility as organizations move their expenses from CapEx to OpEx, a transition the cloud can enable.

CXOToday: With businesses fast moving to the cloud in recent months, there has also been a surge in the development of cloud applications. How is this going to impact the cloud market in the long run?

Gurpreet Singh: The global cloud market is expected to grow to $295 billion by 2021 at a CAGR of 12.5%. The evolving hybrid mode of deployment, along with the growing demand for content streaming, presents a favorable opportunity for market growth. New business ideas will leverage the cloud and multiply faster than ever. Startups with cloud applications solving big business concerns will pick up pace and play a key role in business growth in new and existing verticals. Traditional applications are also choosing to move to the cloud, in turn pushing cloud applications to develop further. There will be continuous growth in demand for cloud infrastructure services, along with expenditure on specialized software, communications equipment and telecom services. Players have adopted various growth strategies, such as partnerships and new service launches, to expand their presence in the cloud market despite the impact of COVID-19 and to broaden their customer base. The Asia-Pacific (APAC) region is expected to show the highest CAGR, attributable to the increase in demand for cloud identity and access management across multiple sectors.

CXOToday: How can the partner community benefit from this and see it as a big opportunity for growing their business?

Gurpreet Singh: As companies make strategic acquisitions and launch new services, cloud consultancies have expressed confidence in continued cloud demand. Companies are also looking to reduce their disparate solutions and focus on a partner community that meets multiple needs. Customers will look for partners who can help them in the digital transformation journey, and partners with the capability to handle end-to-end solutions for customers will favor hybrid cloud, giving it a big opportunity in the marketplace. This is the time to unlearn and relearn. Our teams need to learn new technology solutions and take them to the customer to reap the early-mover advantage.

CXOToday: What will be the future of cloud in a post-COVID-19 world?

Gurpreet Singh: Customers will consume technology in all forms depending upon their workloads, and hybrid cloud platforms will reap the greatest benefit, as this has opened up a myriad of possibilities in terms of technological advancement. Research suggests the hybrid cloud market will grow to $97.6 billion by 2023, at a CAGR of 17 percent, as hybrid not only eases the economic factors for an organization but also delivers security. The human cloud is also an emerging trend in the B2B sector, anticipating 22% year-over-year growth. Although a hybrid cloud strategy can pose a challenge for larger enterprises due to complex IT architecture and security challenges, hybrid cloud is poised to serve as a revolution for organizations due to the inherent flexibility, agility and efficiency it offers. In simple terms, hybrid cloud is the future of IT in the post-COVID-19 world.


Quantum Computing Market Overview With Detailed Analysis, Competitive Landscape, Forecast to 2026 Honeywell, Transurban, Transtoll – Weekly Wall

QY Research has Published Latest Trending Report on Global Quantum Computing Market

Los Angeles, United States: The report titled Global Quantum Computing Market is one of the most comprehensive and important additions to QY Research's archive of market research studies. It offers detailed research and analysis of key aspects of the global Quantum Computing market. The market analysts authoring this report have provided in-depth information on leading growth drivers, restraints, challenges, trends, and opportunities to offer a complete analysis of the global Quantum Computing market. Market participants can use the analysis of market dynamics to plan effective growth strategies and prepare for future challenges beforehand. Each trend of the global Quantum Computing market is carefully analyzed and researched by the market analysts.

Request Sample Report and Full Report TOC: https://www.qyresearch.com/sample-form/form/1975193/global-quantum-computing-market

The Essential Content Covered in the Global Quantum Computing Market Report:

* Top Key Company Profiles
* Main Business and Rival Information
* SWOT Analysis and PESTEL Analysis
* Production, Sales, Revenue, Price and Gross Margin
* Market Share and Size

The global Quantum Computing market is estimated to reach xxx million USD in 2020 and is projected to grow at a CAGR of xx% during 2020-2026. According to the latest report added to the online repository of QY Research, the Quantum Computing market has witnessed unprecedented growth up to 2020, and the extrapolated future growth is expected to continue at higher rates by 2025.

Top Players of the Quantum Computing Market are Studied: In 2020, the global Quantum Computing market size will be US$ 140.25 million, and it is expected to reach US$ 1061.89 million by the end of 2026, with a CAGR of 40.13% during 2020-2026. This report focuses on the global Quantum Computing status, future forecast, growth opportunities, key markets and key players, with the objective of presenting Quantum Computing development in North America, Europe, Japan, China, etc.

By Company: D-Wave Solutions, IBM, Google, Microsoft, Rigetti Computing, Intel, Origin Quantum Computing Technology, Anyon Systems Inc., Cambridge Quantum Computing Limited

Segment by Type: Hardware, Software, Cloud Service

Segment by Application: Medical, Chemistry, Transportation, Manufacturing, Others

By Region: North America, Europe, China, Japan, Others

The report provides a 6-year forecast (2020-2026) assessed based on how the Quantum Computing market is predicted to grow in major regions like the USA, Europe, Japan, China, India, Southeast Asia, South America, South Africa and others.

Segmentation by Type: Hardware, Software, Cloud Service

Segmentation by Application: Medical, Chemistry, Transportation, Manufacturing, Others

Segmentation by Region: North America, Europe, China, Japan, Others

Reasons to Buy this Report:

Table of Contents

1 Report Overview: Study Scope; Key Market Segments; Players Covered; Market Analysis by Type (Hardware, Software, Cloud Service); Market by Application (Medical, Chemistry, Transportation, Manufacturing, Others); Study Objectives; Years Considered
2 Executive Summary: Global Quantum Computing Market Size (2015-2026); Market Size, Growth Rate and Market Share by Regions (2020-2026); Industry Trends; Market Use Cases
3 Key Players: Quantum Computing Revenue by Players (2019-2020); Key Players' Headquarters and Areas Served; Date of Entry into the Quantum Computing Market; Recent Mergers & Acquisitions and Expansion Plans
4 Breakdown by Type and Application: Global Quantum Computing Market Size by Type and by Application (2020-2026)
5 North America: Market Forecast (2020-2026); Key Players; Market Size by Type and by Application
6 Europe: Market Forecast (2020-2026); Key Players; Market Size by Type and by Application
7 Japan: Market Forecast (2020-2026); Key Players; Market Size by Type and by Application
8 China: Market Analysis; Key Players; Market Size by Type and by Application
9 International Players' Profiles: D-Wave Solutions; IBM; Google; Microsoft; Rigetti Computing; Intel; Origin Quantum Computing Technology; Anyon Systems Inc.; Cambridge Quantum Computing Limited (each profile covers company details, business overview, quantum computing introduction, revenue for 2019-2020 and recent developments)
10 Market Dynamics: Drivers; Challenges; Porter's Five Forces Analysis
11 Key Findings in This Report
12 Appendix: Research Methodology (Research Approach, Data Source); Disclaimer; Author Details

About Us

QY Research is a leading global market research and consulting company. Established in 2007 in Beijing, China, QY Research focuses on management consulting, database and seminar services, IPO consulting, industry chain research and custom research, helping clients build non-linear revenue models and succeed. We are globally recognized for our expansive portfolio of services.


This algorithm could revolutionize disease diagnosis, but we can't use it yet – Digital Trends

Scientists from the University of Virginia School of Medicine have built an algorithm that may shed crucial light on genetic diseases, as well as help physicians and medical experts rapidly diagnose them. And it could be a game-changer, once someone actually builds a computer powerful enough to run it.

The algorithm in question is designed to analyze genomic data. It can be used to determine whether a test sample comes from a person with a disease or a healthy control, and to do this significantly faster than current conventional computers.

"[Our] algorithm classifies a person as having a disease or not based on the occurrence of genetic variations in the person's genome," Stefan Bekiranov, associate professor at UVA, told Digital Trends. "In principle, it could be applied to predict a patient's genetic predisposition to disease as well."

Imagine, for instance, that a middle-aged patient with memory loss goes into a clinic. Their physician and family are worried about possible early-onset Alzheimer's disease. The patient has blood drawn, and DNA and RNA are extracted and sequenced. Then they wait. And wait.

Today, this process could take weeks or even months before an answer is reached. But using the new algorithm developed by the UVA researchers, the process, which involves scanning enormous genomic and cellular databases to make the necessary predictions, could be completed in a matter of hours.

So what's the roadblock? After all, the great thing about today's over-the-air updates and constantly tweaked, cloud-based algorithms (Google alone rolls out some 500 to 600 changes to its search algorithm every year) is that they can be deployed rapidly. The problem with the UVA algorithm, however, is that it can't be called into action just yet, because the computer that's optimally equipped to run it doesn't yet exist.

That's because it's an algorithm designed for a quantum computer: a class of next-generation supercomputers currently in their relative infancy. Unlike a classical computer, which encodes information as a series of ones and zeroes, quantum computer bits (called qubits) can be a one, a zero, or both simultaneously. These qubits are composed of subatomic particles, which conform to the rules of quantum, rather than classical, mechanics.

The hope with quantum computers is that they will be able to carry out operations mind-bogglingly quickly. This is because their superposition property (in which quantum particles exist in multiple overlapping states at the same time) allows a quantum computer's qubits to take multiple guesses at a time when solving problems. That is far superior to classical computing's time-consuming, trial-and-error computations, which can take just one guess at a time.
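The "multiple guesses" intuition comes from superposition, which can be sketched in a few lines of plain Python (this is an illustrative toy model, not a real quantum SDK): a single qubit is just a pair of amplitudes, and a Hadamard gate turns a definite 0 into an equal mix of 0 and 1.

```python
import math

# A qubit's state can be modelled as a pair of amplitudes (alpha, beta),
# with alpha^2 + beta^2 = 1; measuring yields 0 or 1 with those
# squared amplitudes as probabilities.
def hadamard(state):
    """Apply a Hadamard gate, putting a basis state into equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)       # the definite |0> state
plus = hadamard(zero)   # equal superposition of |0> and |1>

prob_0 = plus[0] ** 2
prob_1 = plus[1] ** 2
print(round(prob_0, 3), round(prob_1, 3))  # 0.5 0.5
```

Measuring this state gives 0 or 1 with equal probability; until then, both answers are "in play" at once.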


Because of their problem-solving speed, quantum computers could be highly significant for difficult challenges like cryptography and particle physics. In both cases, quantum computers promise to help solve enormous computational conundrums in a fraction of the time of their classical counterparts. But this work, the first published quantum computing study funded by the National Institute of Mental Health and possibly the first using a universal quantum computer funded by the National Institutes of Health, shows how quantum computers could also prove useful in fields like biochemistry and molecular genetics.

"Our study serves as a marker that interest in quantum computing is expanding, even while it's still in a nascent stage of development," Bekiranov said.

The UVA algorithm has been tested on IBM's quantum computers, and in principle the full algorithm can be run on existing hardware. The problem is that it can only run on a toy problem, not one with anything close to the complexity that would be required in the real world.

Bekiranov noted that there are a number of current roadblocks to the algorithm being used. For starters, the quantum logic gates (the basic quantum circuit that operates on a small number of qubits) do not perform the operations with perfect fidelity, resulting in errors in the measured results and even in the predictions. The number of qubits on even the most powerful quantum computer is also severely stunted at present. This limits the researchers to low genomic resolution. In addition, asking the quantum computer to perform too many gate operations causes the quantum state to decohere in the middle of computation, thereby destroying it.
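These roadblocks compound with circuit depth, which a back-of-the-envelope model makes concrete (the 0.999 per-gate fidelity below is an illustrative assumption, not a measured figure for any real device):

```python
# Toy model of the fidelity problem: if each gate succeeds with
# probability f, a circuit of n gates runs error-free with probability
# roughly f ** n, so error rates that look tiny per gate dominate
# long computations.
f = 0.999
for n in (10, 100, 1000, 10000):
    print(n, "gates ->", round(f ** n, 3))
```

Even at 99.9% per-gate fidelity, a thousand-gate circuit succeeds only about a third of the time, which is why both gate counts and error rates matter so much.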

Finally, Bekiranov said, "and this is going to seem crazy, [but] it can take a complex set of gate operations just to input our data into the quantum computer. In fact, depending on the data, it can require so many gates to implement that it can negate the advantage of our quantum algorithm."

While that might seem disappointing, he noted that with steady progress and critical scientific breakthroughs along the way, a quantum computer able to run the algorithm properly could be here within a decade. Think of it like building an amazing app for an iPhone that won't ship until 2030. Sure, no one can run it right now, but the wait will be worth it when it finally arrives.

Just put groundbreaking genetic diagnosis tools down as one more reason to be excited about the coming quantum computing revolution.


This Is The First-Ever Photo of Quantum Entanglement – ScienceAlert

This stunning image captured last year by physicists at the University of Glasgow in Scotland is the first-ever photo of quantum entanglement - a phenomenon so strange, physicist Albert Einstein famously described it as 'spooky action at a distance'.

It might not look like much, but just stop and think about it for a second: this fuzzy grey image was the first time we'd seen the particle interaction that underpins the strange science of quantum mechanics and forms the basis of quantum computing.

Quantum entanglement occurs when two particles become inextricably linked, and whatever happens to one immediately affects the other, regardless of how far apart they are. Hence the 'spooky action at a distance' description.

This particular photo shows entanglement between two photons - two light particles. They're interacting and - for a brief moment - sharing physical states.

Paul-Antoine Moreau, first author of the paper wherein the image was unveiled back in July 2019, told the BBC the image was "an elegant demonstration of a fundamental property of nature".

To capture the incredible photo, Moreau and a team of physicists created a system that blasted out streams of entangled photons at what they described as 'non-conventional objects'.

The experiment actually involved capturing four images of the photons under four different phase transitions. You can see the full image below:

(Moreau et al., Science Advances, 2019)

What you're looking at here is actually a composite of multiple images of the photons as they go through a series of four phase transitions.

The physicists split the entangled photons up and ran one beam through a liquid crystal material known as β-barium borate, triggering four phase transitions.

At the same time, they captured photos of the other half of each entangled pair going through the same phase transitions, even though it hadn't passed through the liquid crystal.

You can see the setup below: the entangled beam of photons comes from the bottom left; one half of each entangled pair splits off to the left and passes through the four phase filters, while the others go straight ahead without passing through the filters, yet undergo the same phase changes.

(Moreau et al., Science Advances, 2019)

The camera was able to capture images of these at the same time, showing that they'd both shifted the same way despite being split. In other words, they were entangled.

While Einstein made quantum entanglement famous, the late physicist John Stewart Bell helped define it and established a test known as a 'Bell inequality'. Basically, if an experiment can violate a Bell inequality, it confirms true quantum entanglement.
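The arithmetic behind such a test can be sketched in a few lines, using the textbook quantum prediction for entangled particles in the singlet state (a standard result, not taken from this experiment): any local hidden-variable theory caps the CHSH combination at 2, while quantum mechanics reaches 2√2.

```python
import math

# For the singlet state, quantum mechanics predicts a correlation
# E(a, b) = -cos(a - b) between detectors set at angles a and b.
def E(a, b):
    return -math.cos(a - b)

# CHSH combination of four detector settings. Local hidden-variable
# theories obey |S| <= 2; the quantum prediction exceeds that bound.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(round(abs(S), 3))  # 2.828, i.e. 2*sqrt(2), violating the bound of 2
```

Measuring a value above 2, as the Glasgow team did within their images, is what rules out a classical explanation.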

"Here, we report an experiment demonstrating the violation of a Bell inequality within observed images," the team wrote in Science Advances.

"This result both opens the way to new quantum imaging schemes ... and suggests promise for quantum information schemes based on spatial variables."

The research was published in Science Advances.

A version of this article was first published in July 2019.


Ripple CTO: Quantum computers will be a threat to Bitcoin and XRP – Crypto News Flash

In an episode of the Modern CTO podcast, Ripple's CTO, David Schwartz, expressed concerns about the development of quantum computers. Schwartz believes this technology is a threat to the security of Bitcoin, XRP and other cryptocurrencies, primarily because the consensus algorithms behind cryptocurrencies rely on conventional cryptography. As Schwartz stated:

From the point of view of someone who is building systems based on conventional cryptography, quantum computing is a risk. We are not solving problems that need powerful computing, like payments and liquidity; the work that the computers do is not that incredibly complicated. But because it relies on conventional cryptography, very fast computers present a risk to the security model that we use inside the ledger.

Algorithms like SHA-2 and ECDSA (elliptic curve cryptography) are sort of esoteric things deep in the plumbing, but if they were to fail, the whole system would collapse. The system's ability to say who owns Bitcoin or who owns XRP, or whether or not a particular transaction is authorized, would be compromised.
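Schwartz's "plumbing" really is a one-line call in most languages; the sketch below uses Python's standard hashlib with the published NIST test vector for SHA-256, a member of the SHA-2 family he mentions. (For context: Grover's algorithm would only square-root the cost of attacking a hash like this, whereas Shor's algorithm would break the elliptic-curve problem underlying ECDSA outright.)

```python
import hashlib

# SHA-256 digest of the three-byte message "abc"; the expected value is
# the standard NIST test vector for this input.
digest = hashlib.sha256(b"abc").hexdigest()
print(digest)
# ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```

Everything from transaction IDs to proof of work sits on calls like this, which is why a failure of the primitive would collapse the whole system.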

Ripple's CTO said that Ripple is trying to prepare for the emergence of quantum computers and is therefore working out when the algorithms mentioned will no longer be reliable. He estimates that quantum computers will begin to pose a threat within the next 8-10 years, as Schwartz further stated:

I think we have at least eight years. I have very high confidence that it's at least a decade before quantum computing presents a threat, but you never know when there could be a breakthrough. I'm a cautious and concerned observer, I would say.

The other fear would be if some bad actor, some foreign government, secretly had quantum computing way ahead of what's known to the public. Depending on your threat model, you could also ask: what if the NSA has quantum computing? Are you worried about the NSA breaking your payment system?

Despite the above, Ripple's CTO drew an optimistic conclusion: even if a malicious actor obtains this technology, it will not be used against the average person. Therefore, Schwartz believes that most users have nothing to worry about:

While some people might really be concerned, it depends on your threat model. If you're just an average person or an average company, you're probably not going to be a victim of this. Let's say, hypothetically, some bad actor had quantum computing that was powerful enough to break things; they're probably not going to go after you unless you are a target of that type of actor.

As soon as it's clear that there's a problem, these systems will probably be frozen until they can be fixed or improved. So most people don't have to worry about it.


What is quantum computing?

Quantum computing is an area of study focused on the development of computer technologies centered on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Quantum computing uses a combination of bits to perform specific computational tasks, all at much higher efficiency than classical counterparts. The development of quantum computers marks a leap forward in computing capability, with massive performance gains for specific use cases; for example, quantum computing excels at tasks like simulations.

The quantum computer gains much of its processing power through the ability of bits to be in multiple states at one time. It can perform tasks using a combination of 1s, 0s, and both a 1 and a 0 simultaneously. Current research centers in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory. In addition, developers have begun gaining access to quantum computers through cloud services.

Quantum computing began with finding its essential elements. In 1981, Paul Benioff at Argonne National Labs came up with the idea of a computer that operated with quantum mechanical principles. It is generally accepted that David Deutsch of Oxford University provided the critical idea behind quantum computing research. In 1984, he began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, publishing a breakthrough paper a few months later.

Quantum Theory

Quantum theory's development began in 1900 with a presentation by Max Planck. The presentation was to the German Physical Society, in which Planck introduced the idea that energy and matter exists in individual units. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

The Essential Elements of Quantum Theory:

Further Developments of Quantum Theory

Niels Bohr proposed the Copenhagen interpretation of quantum theory, which asserts that a particle is whatever it is measured to be, but that it cannot be assumed to have specific properties, or even to exist, until it is measured. This relates to a principle called superposition: when we do not know what the state of a given object is, it is actually in all possible states simultaneously -- as long as we don't look to check.

To illustrate this theory, we can use the famous analogy of Schrödinger's cat. First, we place a living cat in a lead box. At this stage, there is no question that the cat is alive. We then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both alive and dead, according to quantum law -- in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

A Comparison of Classical and Quantum Computing

Classical computing relies on principles expressed by Boolean algebra, usually operating with a 3- or 7-mode logic gate principle. Data must be processed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. In addition, there is still a limit to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold at which the classical laws of physics apply.

The quantum computer operates with a two-mode logic gate: XOR and a mode called QO1 (the ability to change 0 into a superposition of 0 and 1). In a quantum computer, a number of elemental particles such as electrons or photons can be used. Each particle is given a charge, or polarization, acting as a representation of 0 and/or 1. Each particle is called a quantum bit, or qubit. The nature and behavior of these particles form the basis of quantum computing and quantum supremacy. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Superposition

Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, known as a spin-up state, or opposite to the field, known as a spin-down state. Changing the electron's spin from one state to the other is achieved by using a pulse of energy, such as from a laser. If only half a unit of laser energy is used and the particle is isolated from all external influences, it enters a superposition of states, behaving as if it were in both states simultaneously.

Each qubit utilized could take a superposition of both 0 and 1. This means the number of computations a quantum computer could undertake is 2^n, where n is the number of qubits used. A quantum computer comprised of 500 qubits would have the potential to do 2^500 calculations in a single step. For reference, 2^500 is far more than the number of atoms in the known universe. These particles all interact with each other via quantum entanglement.
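A toy state-vector calculation (a sketch, not from the article) shows where the 2^n figure comes from: describing n qubits requires tracking 2^n amplitudes, and putting every qubit into an equal superposition spreads amplitude over all 2^n basis states at once.

```python
import math

def uniform_superposition(n):
    """State vector after a Hadamard gate on each of n qubits starting
    from |00...0>: all 2**n basis states share equal amplitude."""
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

state = uniform_superposition(3)   # 3 qubits
print(len(state))                  # 8 amplitudes, i.e. 2**3
print(sum(a * a for a in state))   # probabilities sum to (approximately) 1
```

Doubling the qubit count squares the number of amplitudes, which is the exponential growth the 500-qubit example alludes to.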

In comparison to classical computing, quantum computing counts as true parallel processing. Classical computers today still only truly do one thing at a time; in classical computing, "parallel processing" simply means using two or more processors.

Entanglement

Particles (like qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle -- up or down -- gives away the spin of the other, which is in the opposite direction. In addition, due to superposition, the measured particle has no single spin direction before being measured. The spin state of the particle being measured is determined at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction. The reason why is not yet explained.

Quantum entanglement allows qubits that are separated by large distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.

Taken together, quantum superposition and entanglement create enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. As more qubits are added, capacity expands exponentially.
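As an illustration (a sketch with made-up helper names, not code from the article), a 2-qubit state vector carries one amplitude for each of the four configurations 00, 01, 10 and 11 at once; applying a Hadamard gate and then a CNOT produces an entangled Bell state, in which measuring one qubit fixes the outcome for the other.

```python
import math

# 2-qubit state vector: one amplitude per basis state |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]          # start in |00>

def apply_hadamard_q0(s):
    """Hadamard on qubit 0: |0> -> (|0>+|1>)/sqrt(2). Index = q0*2 + q1."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def apply_cnot(s):
    """CNOT with qubit 0 as control: flips qubit 1 when qubit 0 is 1."""
    return [s[0], s[1], s[3], s[2]]

bell = apply_cnot(apply_hadamard_q0(state))
# Result: (|00> + |11>)/sqrt(2) -- the register holds amplitude on two
# configurations at once, and the qubits' measurement outcomes are correlated.
print([round(a, 3) for a in bell])    # [0.707, 0.0, 0.0, 0.707]
```

Note the register never stores "a number" in the classical sense; it stores a weighting over all four configurations simultaneously.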

Quantum Programming

Quantum computing offers an ability to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of "take all the superpositions of all the prior computations." This would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers.

The first major quantum algorithm appeared in 1994, when Peter Shor developed one that could efficiently factorize large numbers.
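Shor's algorithm uses a quantum computer only for its hardest step, finding the period r of a^x mod N; the rest is classical number theory. The hedged Python sketch below (with the period found by brute force, standing in for the quantum step) shows that classical reduction for N = 15:

```python
from math import gcd

def find_period(a, N):
    """Classical brute-force stand-in for the quantum period-finding step:
    the smallest r > 0 with a**r % N == 1."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(a, N):
    """Given the period r of a mod N, derive nontrivial factors of N."""
    r = find_period(a, N)
    if r % 2 == 1:
        return None                  # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                  # trivial square root: retry
    return sorted((gcd(y - 1, N), gcd(y + 1, N)))

print(shor_classical_part(7, 15))    # [3, 5]
```

The quantum speedup comes entirely from replacing `find_period` (exponential classically) with period finding over a superposition of all inputs.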

The Problems - And Some Solutions

The benefits of quantum computing are promising, but huge obstacles remain.

There are many problems to overcome, such as how to handle security and quantum cryptography. Long-term quantum information storage has also been a problem in the past. However, breakthroughs in the last 15 years have made some form of quantum computing practical. There is still much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex material science analysis.

Read this article:
What is quantum computing?

Quantum Computing – Intel

Ongoing Development in Partnership with Industry and Academia

The challenges in developing functioning quantum computing systems are manifold and daunting. For example, qubits themselves are extremely fragile, with any disturbance, including measurement, causing them to revert from their quantum state to a classical (binary) one, resulting in data loss. Tangle Lake, Intel's 49-qubit superconducting test chip, must also operate at profoundly cold temperatures, within a small fraction of one kelvin of absolute zero.

Moreover, there are significant issues of scale, with real-world implementations at commercial scale likely requiring at least one million qubits. Given that reality, the relatively large size of quantum processors is a significant limitation in its own right; for example, Tangle Lake is about three inches square. To address these challenges, Intel is actively developing design, modeling, packaging, and fabrication techniques to enable the creation of more complex quantum processors.

Intel began collaborating with QuTech, a quantum computing organization in the Netherlands, in 2015; that involvement includes a US$50M investment by Intel in QuTech to provide ongoing engineering resources that will help accelerate developments in the field. QuTech was created as an advanced research and education center for quantum computing by the Netherlands Organisation for Applied Research and the Delft University of Technology. Combined with Intel's expertise in fabrication, control electronics, and architecture, this partnership is uniquely suited to the challenges of developing the first viable quantum computing systems.

Currently, Tangle Lake chips produced in Oregon are being shipped to QuTech in the Netherlands for analysis. QuTech has developed robust techniques for simulating quantum workloads as a means to address issues such as connecting, controlling, and measuring multiple, entangled qubits. In addition to helping drive system-level design of quantum computers, the insights uncovered through this work contribute to faster transition from design and fabrication to testing of future generations of the technology.

In addition to its collaboration with QuTech, Intel Labs is also working with other ecosystem members both on fundamental and system-level challenges on the entire quantum computing stack. Joint research being conducted with QuTech, the University of Toronto, the University of Chicago, and others builds upward from quantum devices to include mechanisms such as error correction, hardware- and software-based control mechanisms, and approaches and tools for developing quantum applications.

Beyond Superconduction: The Promise of Spin Qubits

One approach to addressing some of the challenges inherent to quantum processors such as Tangle Lake, which are based on superconducting qubits, is the investigation of spin qubits by Intel Labs and QuTech. Spin qubits function on the basis of the spin of a single electron in silicon, controlled by microwave pulses. Compared to superconducting qubits, spin qubits far more closely resemble existing semiconductor components operating in silicon, potentially taking advantage of existing fabrication techniques. In addition, this promising area of research holds the potential for advantages in the following areas:

Operating temperature: Spin qubits require extremely cold operating conditions, but to a lesser degree than superconducting qubits (approximately one kelvin compared to 20 millikelvins); because the difficulty of achieving lower temperatures increases exponentially as one gets closer to absolute zero, this difference potentially offers significant reductions in system complexity.

Stability and duration: Spin qubits are expected to remain coherent for far longer than superconducting qubits, making it far simpler at the processor level to implement them for algorithms.

Physical size: Far smaller than superconducting qubits, a billion spin qubits could theoretically fit in one square millimeter of space. In combination with their structural similarity to conventional transistors, this property of spin qubits could be instrumental in scaling quantum computing systems upward to the estimated millions of qubits that will eventually be needed in production systems.

To date, researchers have developed a spin qubit fabrication flow using Intel's 300-millimeter process technology that is enabling the production of small spin-qubit arrays in silicon. In fact, QuTech has already begun testing small-scale spin-qubit-based quantum computer systems. As a publicly shared software foundation, QuTech has also developed the Quantum Technology Toolbox, a Python package for performing measurements and calibration of spin qubits.


Ripple Executive Says Quantum Computing Will Threaten Bitcoin, XRP and Crypto Markets – Here's When – The Daily Hodl

Ripple CTO David Schwartz says quantum computing poses a serious threat to the future of cryptocurrency.

On the Modern CTO Podcast, Schwartz says quantum computing will break the cryptographic algorithms that keep cryptocurrencies like Bitcoin (BTC) and XRP as well as the internet at large secure.

From the point of view of someone who is building systems based on conventional cryptography, quantum computing is a risk. We are not solving problems that need powerful computing, like payments and liquidity; the work that the computers do is not that incredibly complicated. But because it relies on conventional cryptography, very fast computers present a risk to the security model that we use inside the ledger.

Algorithms like SHA-2 and ECDSA (elliptic curve cryptography) are sort of esoteric things deep in the plumbing, but if they were to fail, the whole system would collapse. The system's ability to say who owns Bitcoin or who owns XRP, or whether or not a particular transaction is authorized, would be compromised.

A lot of people in the blockchain space watch quantum computing very carefully, and what we're trying to do is assess how long before these algorithms are no longer reliable.

Schwartz says he thinks developers have at least eight years until the technology, which leverages the properties of quantum physics to perform fast calculations, becomes sophisticated enough to crack cryptocurrency.

I think we have at least eight years. I have very high confidence that it's at least a decade before quantum computing presents a threat, but you never know when there could be a breakthrough. I'm a cautious and concerned observer, I would say.

Schwartz says crypto coders should closely follow the latest public developments in quantum computing, but he's also concerned about private efforts from the government.

The other fear would be if some bad actor, some foreign government, secretly had quantum computing way ahead of what's known to the public. Depending on your threat model, you could also say, what if the NSA has quantum computing? Are you worried about the NSA breaking your payment system?

While some people might realistically be concerned, it depends on your threat model. If you're just an average person or an average company, you're probably not going to be a victim of this. Let's say, hypothetically, some bad actor had quantum computing that was powerful enough to break things; they're probably not going to go after you unless you are a target of that type of actor. As soon as it's clear that there's a problem, these systems will probably be frozen until they can be fixed or improved. So, most people don't have to worry about it.



GPT-3 Obsession, Python Reigns Supreme And More In This Week’s Top AI News – Analytics India Magazine

This week the machine learning community had their hands full with OpenAI's new toy, GPT-3. Many enthusiasts applied the model to various innovative uses, and a few even started startups that work on GPT-3. Apart from this, there are also reports of quarterly earnings, which saw Microsoft performing well, especially in the cloud segment. Here is what else happened in this week's top AI news.

In a recent development, GitHub moved 21TB of its open-source code and repositories, in the form of digital photosensitive archival film, into the Arctic Code Vault in Svalbard. The boxes of reels are stored in hundreds of meters of permafrost and can last for 1,000 years. The move was done in collaboration with GitHub's archive partner, Piql. This initiative, the GitHub Archive Program, aims to preserve open-source software for future generations.

D-Wave Systems, a Canadian quantum computing company, announced the expansion of its Leap cloud access and quantum application environment to India and Australia. The company claims that users in these countries will now have real-time access to a commercial quantum computer. In addition to access, Leap offers free developer plans, teaching and learning tools, code samples, demos and an emerging quantum community to help developers, forward-thinking businesses and researchers get started building and deploying quantum applications.

The race to democratise machine learning has made MLaaS a lucrative business model. The result is that, today, there are multiple APIs offering similar services, which can itself be challenging. To address this issue and establish a hassle-free ML ecosystem, a group of researchers from Stanford University introduced a predictive framework called FrugalML that assists users in switching between APIs in a smart manner. The researchers detailed their new framework in a paper titled To Call or Not to Call?

The results show that FrugalML leads to more than 50% cost reduction when using APIs from Google, Microsoft and Face++ for a facial emotion recognition task. Experiments on the FER+ dataset showed that only 33% of the cost is needed to achieve accuracies that match those of the Microsoft API.

The authors posit that the performance of FrugalML is likely because the base service's quality score is highly correlated with its prediction accuracy, and their framework only needs to call expensive services for a few difficult data points, relying on the cheaper base services for the relatively easy ones.

Microsoft on Wednesday reported earnings for its fourth fiscal quarter of 2020, including revenue of $38.0 billion, net income of $11.2 billion, and earnings per share of $1.46 (compared to revenue of $33.7 billion, net income of $13.2 billion, and earnings per share of $1.71 in Q4 2019). All three of the company's operating groups saw year-over-year revenue growth.

Organizations that build their own digital capability will recover faster and emerge from this crisis stronger.

Revenue in Intelligent Cloud was $13.4 billion, an increase of 17% (up 19% in constant currency). Server products and cloud services revenue increased 19% (up 21% in constant currency), driven by Azure revenue growth of 47% (up 50% in constant currency). Enterprise Services revenue was relatively unchanged (up 2% in constant currency).

In a recent survey conducted by IEEE Spectrum, it was found that Python has exerted sheer dominance over its contemporaries Java and C. The organisers devised 11 metrics to gauge the popularity of 55 languages. "One interpretation of Python's high ranking is that its metrics are inflated by its increasing use as a teaching language: Students are simply asking and searching for the answers to the same elementary questions over and over," stated IEEE in their blog. The rise in Python's popularity also coincides with that of fields such as machine learning, which have been increasingly introducing libraries and frameworks that cater to Python users. Given the recent trends, it looks like there are no roadblocks in sight for Python.

GPT-3, the world's largest NLP model, released by OpenAI last month, became quite popular. From generating code to writing believable stories, the model has been put to use for a wide range of applications.

Generative models can display both overt and diffuse harmful outputs, such as racist, sexist, or otherwise pernicious language. This is an industry-wide issue, making it easy for individual organizations to abdicate or defer responsibility. OpenAI will not.

The popularity rose so high that one of the founders of OpenAI, Sam Altman, had to put out a tweet warning that GPT-3 is still far from perfect. While the OpenAI team is jubilant about this rapid adoption, it has listed a set of guidelines explaining how it will work on making GPT-3 more reliable in the coming days.

DeepMind researchers released a paper that details a meta-learning approach that would allow researchers to automate the discovery of reinforcement learning algorithms, which has been a manual process so far. The paper claims that the generated algorithms performed well in video games such as Atari.

The proposed approach has the potential to dramatically accelerate the process of discovering new reinforcement learning algorithms by automating the process of discovery in a data-driven way, wrote the researchers.


According to VICE reports, four United Kingdom Uber drivers launched a lawsuit on Monday to gain access to Uber's algorithms through Europe's General Data Protection Regulation (GDPR).

The union representing the drivers said they're seeking to gain a deeper understanding of the algorithms that underpin Uber's automated decision-making system. This level of transparency, the union said, is needed to establish the level of management control Uber exerts on its drivers, allow them to calculate their true wages and benchmark themselves against other drivers, and help them build collective bargaining power.

The union also cites an information asymmetry that allows Uber to selectively share data in forms that paint it in a favorable light, usually by obscuring negative outcomes like dead mileage or arbitrary deactivation. The case is being heard in Amsterdam, and the outcome could severely impact the way Uber and other ride-hailing companies do business.

The University of Florida on Wednesday announced a public-private partnership with NVIDIA that will catapult UF's research strength to address some of the world's most formidable challenges, create unprecedented access to AI training and tools for underrepresented communities, and build momentum for transforming the future of the workforce.

The initiative is anchored by a $50 million gift: $25 million from UF alumnus Chris Malachowsky and $25 million in hardware, software, training and services from NVIDIA, the Silicon Valley-based technology company he co-founded and a world leader in AI and accelerated computing.

Along with an additional $20 million investment from UF, the initiative will create an AI-centric data center that houses the world's fastest AI supercomputer in higher education. Working closely with NVIDIA, UF will boost the capabilities of its existing supercomputer.



Commentary: America must invest in its ability to innovate – MIT News

In July of 1945, in an America just beginning to establish a postwar identity, former MIT vice president Vannevar Bush set forth a vision that guided the country to decades of scientific dominance and economic prosperity. Bush's report to the president of the United States, Science: The Endless Frontier, called on the government to support basic research in university labs. Its ideas, including the creation of the National Science Foundation (NSF), are credited with helping to make U.S. scientific and technological innovation the envy of the world.

Today, America's lead in science and technology is being challenged as never before, write MIT President L. Rafael Reif and Indiana University President Michael A. McRobbie in an op-ed published today by The Chicago Tribune. They describe a triple challenge of bolder foreign competitors, faster technological change, and a merciless race to get from lab to market.

The government's decision to adopt Bush's ideas was bold and controversial at the time, and similarly bold action is needed now, they write.

The U.S. has the fundamental building blocks for success, including many of the world's top research universities that are at the forefront of the fight against COVID-19, reads the op-ed. But without a major, sustained funding commitment, a focus on key technologies and a faster system for transforming discoveries into new businesses, products and quality jobs, in today's arena, America will not prevail.

McRobbie and Reif believe a bipartisan bill recently introduced in both chambers of Congress can help America's innovation ecosystem meet the challenges of the day. Named the Endless Frontier Act, the bill would support research focused on advancing key technologies like artificial intelligence and quantum computing. It does not seek to alter or replace the NSF, but to create new strength in parallel, they write.

The bill would also create scholarships, fellowships, and other forms of assistance to help build an American workforce ready to develop and deploy the latest technologies. And, it would facilitate experiments to help commercialize new ideas more quickly.

Today's leaders have the opportunity to display the far-sighted vision their predecessors showed after World War II: to expand and shape our institutions, and to make the investments needed to adapt to a changing world, Reif and McRobbie write.

Both university presidents acknowledge that measures such as the Endless Frontier Act require audacious choices. But if leaders take the right steps now, they write, those choices will seem, in retrospect, obvious and wise.

Now as then, our national prosperity hinges on the next generation of technical triumphs, Reif and McRobbie write. Now as then, that success is not inevitable, and it will not come by chance. But with focused funding and imaginative policy, we believe it remains in reach.
