Technology to Highlight the Next 10 Years: Quantum Computing – Somag News

Technology to Highlight the Next 10 Years According to a Strategy Expert: Quantum Computing

It is said that quantum computers and quantum computing will have an impact on human history in the coming years. A Bank of America strategist has said that quantum computing will mark the 2020s.

Bank of America strategist Haim Israel said that the revolutionary technology to emerge in the 2020s will be quantum computing. The iPhone was released in 2007, but we felt its real impact in the 2010s; likewise, we will not see the first business applications of quantum computing until the end of the next decade.

Israel stated that the effect of quantum computing on business will be more radical and revolutionary than the effect of smartphones. Let's take a closer look at quantum computing.

What Is Quantum Computing?

Quantum computing is a fairly new technology based on quantum theory in physics. Quantum theory, in the simplest terms, describes the behavior of subatomic particles and states that these particles can exist in more than one place until they are observed. Unlike today's computers, quantum computers go beyond storing just zeros and ones, and that is where their enormous computing power comes from.

In October, Google, a subsidiary of Alphabet Inc., claimed that its 53-qubit quantum processor completed in 200 seconds a calculation that would take the fastest supercomputer 10,000 years. Amazon said earlier this month that it intends to cooperate with experts to develop quantum computing technologies. IBM and Microsoft are also among the companies developing quantum computing technologies.

Quantum computing could reshape health care, the Internet of Things and cybersecurity:

Israel said quantum computing would have revolutionary implications in areas such as health care, the Internet of Things and cybersecurity. Pharmaceutical companies will be the first commercial users of these devices, he said, adding that only quantum computers can solve the pharmaceutical industry's big-data problem.

Quantum computing will also have a major impact on cybersecurity. Today's cybersecurity systems are based on cryptographic algorithms, but with quantum computing these equations could be broken in a very short time. Even the most powerful encryption algorithms will be weakened significantly by quantum computing in the future, said Okta's marketing manager, Swaroop Sham.

For investors, Israel said that the first one or two companies to develop commercially viable quantum computing could gain access to huge amounts of data, which would make their software very valuable to customers.


Original post:

Technology to Highlight the Next 10 Years: Quantum Computing - Somag News

Quantum Computing for Everyone – The Startup – Medium

Qubits are exponentially faster than bits in several computing problems, such as database searches and factoring (which, as we will discuss soon, may break your Internet encryption).

An important thing to realize is that qubits can hold much more information than bits can. One bit holds the same amount of information as one qubit: each can only hold one value. However, four bits must be used to store the same amount of information as two qubits. A two-qubit system in equal superposition holds values for four states, which on a classical computer would need at least four bits to hold. Eight bits are needed to store the same amount of information as three qubits, since a three-qubit system can store eight states: 000, 001, 010, 011, 100, 101, 110, and 111. This pattern continues.
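A short, self-contained sketch (plain Python, not tied to any quantum library) of the doubling pattern described above; the counting follows the article's simplified argument of one stored value per basis state:

```python
# Sketch: how many basis states an n-qubit register spans, following the
# article's counting argument (one classical value per basis state).

def basis_states(num_qubits: int) -> int:
    return 2 ** num_qubits

for n in range(1, 4):
    states = basis_states(n)
    labels = ', '.join(format(i, f'0{n}b') for i in range(states))
    print(f"{n} qubit(s) -> {states} states: {labels}")
```

Running it reproduces the article's progression: 2 states for one qubit, 4 for two, 8 for three.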

The graph below provides a visual for the computing power of qubits. The x-axis represents the number of qubits used to hold a certain amount of information. The blue line's y-value represents the number of bits needed to hold the same amount of information as that number of qubits, or 2 to the power of x. The red line's y-value represents the number of qubits needed to hold the same amount of information as the number of qubits on the x-axis (y = x).

Imagine the exponential speedup quantum computing can provide! A gigabyte's worth of information (8E+09 bits) can be represented with log(8E+09)/log(2) = 33 qubits (rounded up from 32.9).
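That arithmetic is easy to verify; here is the same calculation as a minimal Python check (the 8E+09 figure is the article's):

```python
import math

bits_in_a_gigabyte = 8e9  # 8 x 10**9 bits, the figure used in the article

# Qubits needed under the article's counting argument: ceil(log2(bits)).
print(math.log2(bits_in_a_gigabyte))             # ~32.897
print(math.ceil(math.log2(bits_in_a_gigabyte)))  # 33
```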

Quantum computers are also great at factoring numbers, which leads us to RSA encryption. The security protocol that secures Medium, and probably any other website you've been on, is known as RSA encryption. It relies on the fact that, with current computing resources, it would take a very, very long time to factor a 30+-digit number m that has only one factorization, namely p times q, where both p and q are large prime numbers. However, dividing m by p or q is computationally much easier, and since m divided by q returns p and vice versa, it provides a quick key-verification system.
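As a toy illustration of that asymmetry, the sketch below multiplies two deliberately tiny primes (illustrative values only, nowhere near real RSA key sizes) and then recovers them by brute-force trial division. Dividing the product by one known prime to get the other is the cheap verification step; searching for the factors from scratch is the part that blows up as the numbers grow:

```python
# Toy illustration of the RSA asymmetry: multiplying is cheap, factoring is not.
# p and q are tiny illustrative primes; real RSA uses primes hundreds of digits long.

p, q = 1009, 1013
m = p * q                 # easy: one multiplication
print(m // p)             # easy: dividing by a known factor returns the other (1013)

def factor_by_trial_division(n):
    """Brute-force search for a nontrivial factor of n (slow as n grows)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return 1, n           # n is prime

print(factor_by_trial_division(m))  # (1009, 1013), found the hard way
```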

A quantum algorithm called Shor's algorithm has shown exponential speedup in factoring numbers, which could one day break RSA encryption. But don't buy into the hype yet: as of this writing, the largest number factored by a quantum computer running Shor's algorithm is 21 (into 3 and 7). The hardware has not yet been developed for quantum computers to factor 30-digit numbers, or even 10-digit numbers. Even if quantum computers do one day break RSA encryption, a quantum key distribution protocol called BB84, which relies on quantum properties, is considered safe from quantum computers.
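For the curious, only the period-finding step of Shor's algorithm is quantum; the classical scaffolding around it is straightforward. The sketch below finds the period by brute force (the exponentially slow part a quantum computer would replace) purely to show how a period yields the factors of 21. It is an illustration of the classical reduction, not a quantum implementation:

```python
import math
import random

def find_period_classically(a, n):
    """Smallest r > 0 with a**r % n == 1. This brute-force search is the
    step Shor's algorithm replaces with a fast quantum subroutine."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_skeleton(n):
    """Classical scaffolding of Shor's algorithm (tiny n only)."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g, n // g                   # lucky guess already shares a factor
        r = find_period_classically(a, n)
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            p = math.gcd(pow(a, r // 2) - 1, n)
            if 1 < p < n:
                return p, n // p               # the period gave away the factors

print(shor_classical_skeleton(21))             # (3, 7) or (7, 3)
```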

So will quantum computers ever completely replace the classical PC? Not in the foreseeable future.

Quantum computing, while developing very rapidly, is still in an infantile stage, with research being conducted semi-competitively by large corporations like Google, Microsoft, and IBM. Much of the hardware needed to accelerate quantum computing is not currently available. There are several obstacles to a quantum future, a major one being addressing gate errors and maintaining the integrity of a qubit's state.

However, given the amount of innovation that has happened in the past few years, it seems inevitable during our lifetimes that quantum computing will make huge strides. In addition, complexity theory has shown that there are several cases where classical computers perform better than quantum computers. IBM quantum computer developers state that quantum computing will probably never completely eliminate classical computers. Instead, in the future we may see a hybrid chip that relies on quantum transistors for certain tasks and classical transistors for others, depending on which one is more appropriate.

Excerpt from:
Quantum Computing for Everyone - The Startup - Medium

The Well-matched Combo of Quantum Computing and Machine Learning – Analytics Insight

The pace of improvement in quantum computing mirrors the fast advances made in AI and machine learning. It is natural to ask whether quantum technologies could boost learning algorithms: this field of inquiry is called quantum-enhanced machine learning.

Quantum computers are devices that operate on principles from quantum physics. The computers we currently use are built using transistors, and their information is stored as binary 0s and 1s. Quantum computers are built using quantum bits, or qubits for short, realised with subatomic particles, which can be in multiple states simultaneously. The principal advantage of quantum computers is that they can perform exceptionally complex tasks at enormous speed. In this way, they can tackle problems that are not currently feasible.

The most significant advantage of quantum computers is the speed at which they can solve complex problems. While they are lightning fast at what they do, they do not provide the ability to solve problems from undecidable or NP-hard classes. There is a set of problems that quantum computing will be able to address, but it is not applicable to all computing problems.

Typically, the problem set that quantum computers are good at solving involves number or data crunching with an immense number of inputs, for example complex optimisation problems and communication-systems analysis problems: calculations that would normally take supercomputers days, years, even billions of years to brute force.

The application routinely mentioned as one that quantum computers will be able to crack quickly is strong RSA encryption. A recent report by the Microsoft Quantum Team suggests this could well be the case, calculating that it would be feasible with around a 2,330-qubit quantum computer.

Optimisation applications leading the pack makes sense, since they are currently solved largely through brute force and raw computing power. If quantum computers can rapidly examine all the potential solutions, an optimal solution can emerge much more quickly. Optimisation also stands out because it is significantly more intuitive and easier to grasp.

The community of people who can apply optimisation and robust optimisation is much bigger. In the machine learning community, the overlap between the technology and the requirements is technical and relevant mainly to researchers. What's more, there is a far smaller pool of statisticians in the world than there is of developers.

Specifically, the complexity of fitting quantum computing into the machine learning workflow presents an obstacle. For machine learning professionals and researchers, it is fairly easy to work out how to program the system. Fitting that into a machine learning workflow is more challenging, since machine learning programs are becoming very complex. However, teams have already published plenty of research on how to incorporate it sensibly into a training workflow.

Undoubtedly, ML experts at present need someone else to handle the quantum computing part: machine learning experts are looking for someone else to do the legwork of building the systems up, extending them, and demonstrating that they fit.

In any case, the intersection of these two fields goes much further than that, and it is not just AI applications that can benefit. There is a meeting point where quantum computers run machine learning algorithms and traditional machine learning methods are used to evaluate the quantum computers. This area of research is developing at such a blistering pace that it has produced a whole new field called quantum machine learning.

This interdisciplinary field is incredibly new, however. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and programming challenges are still significant, and the development of fully functional quantum computers is still far off.

The future of AI accelerated by quantum computing looks bright, with real-time, human-like behaviour almost an inevitable outcome. Quantum computing will be capable of tackling complex AI problems and obtaining multiple solutions to complex problems simultaneously. This will result in artificial intelligence performing complex tasks in more human-like ways. Likewise, robots that can make optimised decisions in real time in practical circumstances will become possible once we can use quantum computers driven by artificial intelligence.

How far away is this future? Well, considering that only a handful of the world's top organisations and universities are currently developing (genuinely enormous) quantum computers that still lack the required processing power, a multitude of human-mimicking robots running about is probably a fair way off, which may comfort some people and disappoint others. Building just one, however? Perhaps not so far away.

Quantum computing and machine learning are incredibly well matched. The features the technology offers and the requirements of the field are extremely close. For machine learning, it delivers capabilities that matter for what you have to do, capabilities that are difficult to reproduce with a traditional computer but come natively from a quantum computer. Those features cannot be coincidental. It will simply take some time for people to find the right techniques for integrating it, and then for the technology to embed itself into that space productively.

Go here to see the original:
The Well-matched Combo of Quantum Computing and Machine Learning - Analytics Insight

The $600 quantum computer that could spell the end for conventional encryption – BetaNews

Concerns that quantum computing could place current encryption techniques at risk have been around for some time.

But now cybersecurity startup Active Cypher has built a password-hacking quantum computer to demonstrate that the dangers are very real.

Using easily available parts costing just $600, Active Cypher's founder and CTO, Dan Gleason, created a portable quantum computer dubbed QUBY (named after qubits, the basic unit of quantum information). QUBY runs recently open-sourced quantum algorithms inside a quantum emulator that can execute cryptographic cracking algorithms. Calculations that would have otherwise taken years on conventional computers are now performed in seconds on QUBY.

Gleason explains, "After years of foreseeing this danger and trying to warn the cybersecurity community that current cybersecurity protocols were not up to par, I decided to take a week and move my theory to prototype. I hope that QUBY can increase awareness of how the cyberthreats of quantum computing are not reserved to billion-dollar state-sponsored projects, but can be seen on a much smaller, localized scale."

The concern is that quantum computing will lead to the sunset of AES-256 (the current encryption standard), meaning all encrypted files could one day be decrypted. "The disruption that will come about from that will be on an unprecedented, global scale. It's going to be massive," says Gleason. Modelled after the SADM, a man-portable nuclear weapon deployed in the 1960s, QUBY was downsized so that it fits in a backpack and is therefore untraceable. Low-level 'neighborhood hackers' have already been using portable devices that can surreptitiously swipe credit card information from an unsuspecting passerby. Quantum compute emulating devices will open the door for significantly more cyberthreats.

In response to the threat, Active Cypher has developed advanced dynamic cyphering encryption that is built to be quantum resilient. Gleason explains that, "Our encryption is not based on solving a mathematical problem. It's based on a very large, random key which is used in creating the obfuscated cyphertext, without any key information within the cyphertext, and is thus impossible to be derived through prime factorization -- traditional brute force attempts which use the cyphertext to extract key information from patterns derived from the key material."

Active Cypher's completely random cyphertext cannot be deciphered using even large quantum computers since the only solution to cracking the key is to try every possible combination of the key, which will produce every known possible output of the text, without knowledge of which version might be the correct one. "In other words, you'll find a greater chance of finding a specific grain of sand in a desert than cracking this open," says Gleason.
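To put that "try every possible combination" claim in perspective, here is a back-of-the-envelope sketch of the search space for a 256-bit random key; the guess rate is an arbitrary assumption chosen purely for illustration:

```python
# Rough scale of an exhaustive search over a 256-bit random key.
# The guess rate is an assumed figure for illustration, not a benchmark.

key_bits = 256
keyspace = 2 ** key_bits                # number of possible keys
guesses_per_second = 10 ** 12           # assumption: one trillion guesses per second

years = keyspace / guesses_per_second / (60 * 60 * 24 * 365)
print(f"{keyspace:.2e} possible keys, ~{years:.2e} years to try them all")
```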

Active Cypher showcased QUBY in early February at Ready -- an internal Microsoft conference held in Seattle. The prototype will also be presented at RSA in San Francisco later this month.

See more here:
The $600 quantum computer that could spell the end for conventional encryption - BetaNews

Why Quantum Computing Gets Special Attention In The Trump Administration’s Budget Proposal – Texas Standard

The Trump administration's fiscal year 2021 budget proposal includes significant increases in funding for artificial intelligence and quantum computing, while cutting overall research and development spending. If Congress agrees to it, artificial intelligence, or AI, funding would nearly double, and quantum computing would receive a 50% boost over last year's budget, doubling in 2022, to $860 million. The administration says these two fields of research are important to U.S. national security, in part because China also invests heavily in these fields.

Quantum computing uses quantum mechanics to solve highly complex problems more quickly than they can be solved by standard or classical computers. Though fully functional quantum computers don't yet exist, scientists at academic institutions, as well as at IBM, Google and other companies, are working to build such systems.

Scott Aaronson is a professor of computer science and the founding director of the Quantum Information Center at the University of Texas at Austin. He says applications for quantum computing include simulation of chemistry and physics problems. These simulations enable scientists to design new materials, drugs, superconductors and solar cells, among other applications.

Aaronson says the government's role is to support basic scientific research, the kind needed to build and perfect quantum computers.

"We do not yet know how to build a fully scalable quantum computer. The quantum version of the transistor, if you like, has not been invented yet," Aaronson says.

On the software front, researchers have not yet developed applications that take full advantage of quantum computings capabilities.

"That's often misrepresented in the popular press, where it's claimed that a quantum computer is just a black box that does everything," Aaronson says.

Competition between the U.S. and China in quantum computing revolves, in part, around the role such a system could play in breaking the encryption that makes things secure on the internet.

Truly useful quantum computing applications could be as much as a decade away, Aaronson says. Initially, these tools would be highly specialized.

"The way I put it is that we're now entering the very, very early, vacuum-tube era of quantum computers," he says.

Read this article:
Why Quantum Computing Gets Special Attention In The Trump Administration's Budget Proposal - Texas Standard

IBM Tops U.S. Patent List for 2019 With Innovations in AI, Blockchain, Cloud and Quantum Computing – Database Trends and Applications

IBM inventors received 9,262 U.S. patents in 2019, achieving a milestone of most patents ever awarded to a U.S. company, and marking the company's 27th consecutive year of U.S. patent leadership.

In 2019, IBM led the industry in the number of U.S. patents granted across key technology areas such as AI, blockchain, cloud computing, quantum computing and security.

In 2019, according to IFI CLAIMS Patent Services, provider of a global patent data platform, U.S. patent filings hit an all-time high with 333,530 patents granted, representing an unprecedented 15% increase from 2018.

IBM was awarded more than 1,800 AI patents, including a method for teaching AI systems how to understand and deduce the nuances and implications behind certain text or phrases of speech by analyzing other related content.

IBM also led in the number of blockchain patents granted, which includes several patents for improving the security of blockchain networks. One patented technique would help in resisting "replay attacks," where an attacker copies and uses signature information from one transaction on a blockchain to later perform other transactions on the blockchain that are not authorized.

IBM inventors were granted more than 2,500 patents in cloud technology, including a patent for a method to jointly manage cloud and non-cloud computing platforms. Working with a unified portal, this technique receives, organizes and streamlines incoming cloud and non-cloud tasks and requests, which could help organizations easily migrate to hybrid cloud platforms.

IBM's quantum computing program continued to grow in 2019. Quantum computing innovations by IBM included a method to scale a quantum computer to support additional qubits, as well as enabling a breakthrough approach for simulating molecules.

Read more:
IBM Tops U.S. Patent List for 2019 With Innovations in AI, Blockchain, Cloud and Quantum Computing - Database Trends and Applications

Enterprise hits and misses – quantum gets real, Koch buys Infor, and Shadow’s failed app gets lit up – Diginomica

Lead story - Quantum computing - risks, opportunities and use cases - by Chris Middleton

MyPOV: Master-of-the-edgy-think-piece Chris Middleton unfurled a meaty two-parter on the realities of quantum computing. As a quantum computing fan boy and a proud quantum-changes-everything association member curmudgeon, I was glad to see Chris take this on.

In Quantum tech - big opportunities from (very, very) little things, he reminds us that pigeonholing quantum as "computing" is a mistake:

Quantum technology embraces a host of different systems, each of which could form a fast-expanding sector of its own if investors shift their focus away from computing. These include quantum timing, metrology, and navigation, such as the development of hyper-accurate, portable atomic clocks.

Each use case carries its own risks/opportunities, and need for transparency, particularly when you combine quantum and "AI." However, based on the recent sessions he attended, Chris says we should think of quantum as enhancing our tool kit rather than replacing classic computing outright. He concludes:

In business and technology, we see a world of big objects and quantifiable opportunities, and it is far from clear how the quantum realm relates to it, though it is clear that it does. In short, investors, policymakers, and business leaders need something tangible and relatable before they reach for their credit cards.

Translation: quantum computing is so 2021 (or maybe 2025). But I find middle ground with the hypesters: we'd better start talking about the implications now. Quantum computing has a far greater inevitability than, say, enterprise blockchains.

Diginomica picks - my top stories on diginomica this week

Vendor analysis, diginomica style. Bears might be hibernating, but enterprise software vendors sure aren't napping:

Koch buys Infor: When Infor's CFO Kevin Samuelson took over the CEO role from Charles Phillips, many felt that the pending Infor IPO was in play. Well, many were wrong. Derek was on the case:

Infor to be acquired by Koch Industries - what's the likely impact? and the follow-on: Infor answers questions on Koch acquisition. The big question here, to me, isn't why Koch versus IPO. It's CloudSuite SaaS adoption. And which industries can Infor address via SaaS industry ERP? Derek's pieces give us important clues - and we'll be watching.

Google breaks out cloud earnings: ordinarily, earning reports are not watershed moments. But this was the first time "Alphabet" broke out Google Cloud (and YouTube) numbers. Google is obviously wary of the AWS and Azure comparisons. But it's not easy to break it all out anyhow (Google added GSuite revenues in also). Stuart parses it out in Google's 'challenger' cloud business hits $10 billion annual run rate as Alphabet breaks out the numbers for the first time.

SAP extends Business Suite maintenance to 2030 (with caveats): Arguably the biggest SAP story since the leadership change. Den had some questions stuck in his craw, so he unfurled a two-parter:

MyPOV: a smart move - though an expected one - for the SAP new leadership team, with the user groups heavily involved in pushing the case. However, the next smart moves will be a lot tougher.

More vendor analysis:

And if that's not enough, Brian's got a Zoho review, I filed an Acumatica use case on SaaS best-of-breed, and Stuart crunched a landmark Zendesk earnings report.

Jon's grab bag - My annual productivity post is up and out; plus I took gratuitous shots at linkbaity Slack-has-ruined-work headlines (Personal productivity 2020 - Slack and Microsoft Teams didn't ruin work - but they didn't fix work either).

Neil explains the inexplicable in The problem of AI explainability - can we overcome it? Finally, I'm glad Jerry addressed the Clearview AI bottom-feeders in Clearview AI - super crime fighter or the death of privacy as we know it? There's a special place in my personal Hades for greedy entrepreneurs who steal faces, drape their motives in totally bogus 1st amendment claims, and plan to sell said data to authoritarian regimes. These bozos make robocallers look like human rights activists.

Lead story - analyzing the wreckage of the Iowa caucus tech fail

MyPOV: This could probably just be the whiffs section. The Iowa caucus app failure is very much like this: if you and I wrote down a step-by-step plan on how to screw up a mission-critical app launch, with everything from poor user engagement to technical failure to lack of contingencies to hacking vulnerabilities (which fortunately were not exploited), we'd have this mess.

Hits/misses reader Clive reckons this is the best post-mortem: Shadow Inc. CEO Iowa Interview: 'We Feel Really Terrible' . First off, don't feel terrible, just go away. Shovel snow, or get involved in a local recycling initiative. Make a pinball app. Just stay away from the future of democracy from now on. Then there's this doozy: An 'Off-the-Shelf, Skeleton Project': Experts Analyze the App That Broke Iowa. Tell me if this sounds like something that would go smoothly:

To properly login and submit results, caucus chairs had to enter a precinct ID number, a PIN code, and a two-factor identification code, each of which were six-digits long.

Then there's the IDP, which was warned not to use the app by at least one party, and went headlong into their own abyss. Fortunately, there are a few lessons we can extract. Such as this one from Greg Miller, co-founder of the Open Source Election Technology Institute, which warned the IDP not to use the app weeks ago:

Our message is that apps like this should be developed in the sunlight and part of an open bug bounty.

An ironic message for an app developer named Shadow...

Honorable mention

I got a terrifying college flashback when I saw this one: Note targeting 'selfish' bongo player at Glastonbury Tor demands he stops playing. This prankster brought us back to the future though: Berlin artist uses 99 phones to trick Google into traffic jam alert.

In my line of work, we joke about PR hacks over-achievers pogo sticks pros "circling back", as if a second blast will somehow polish the turd of a crummy pitch as it slinkers by - well, this takes the noxious act of circling back to another level: Family Gets 55,000 Duplicate Letters from Loan Company. But hey, it's not all crash-and-burn here:

I can't let this slide another week:

I think we all realize by now that "free" services are all about data hucksters gorging themselves on the sweet nectar of our personal lives, selling us out to the highest bidder. But when an anti-virus company gets in on the action, surely the Idiocracy has been achieved: "To make matters worse, Avast seems to maintain a lukewarm stance on the issue."

I'd like to invite the Avast team to step into my fiery cauldron. The only thing that's lukewarm is your grasping business model and your mediocre adware, err, I mean, anti-virus protection. Just one question: who protects us from you? As for Liz:

I'm with ya, Ms. Miller. Hopefully this is the next best thing....

If you find an #ensw piece that qualifies for hits and misses - in a good or bad way - let me know in the comments as Clive (almost) always does. Most Enterprise hits and misses articles are selected from my curated @jonerpnewsfeed. 'myPOV' is borrowed with reluctant permission from the ubiquitous Ray Wang.

Read this article:
Enterprise hits and misses - quantum gets real, Koch buys Infor, and Shadow's failed app gets lit up - Diginomica

Nobody is behind in science and technology: Serguei Beloussov – Kuensel, Bhutan's National Newspaper

The CEO of Acronis, a reputed global technology company, talks on the future of computers and opportunities

Ugyen Penjore

Artificial Intelligence, Quantum Computing, Internet of Things, Cloud Computing. Sounds too technically sophisticated and way beyond comprehension?

If that was what most in the audience felt when invited to a talk on Future of Computing, the guest speaker, Serguei Beloussov, left many convinced that the future is in science and technology.

Serguei Beloussov is the Chief Executive Officer of Acronis, a reputed global technology company and the founder of Schaffhausen Institute of Technology (SIT), Switzerland.

Bhutan should take advantage of its smallness and use it as an opportunity to get ahead in the field of science and technology, said Serguei Beloussov, who is also a serial tech entrepreneur. Drawing on the examples of Switzerland, where SIT is located, and Singapore, where he is currently based, he said Switzerland was a small, poor farming country before it started precision manufacturing. Singapore, he said, transformed from a small, poor country into one of the wealthiest nations on the basis of science and technology.

The smallness is not a threat. It is an opportunity. Science could be the future for Bhutan, he said.

In technology and science, the number of people, he said, is not important; it is about having smart people. Albert Einstein in one year did more for science than 30 million PhD holders did in 50 years, he said. Bhutan, he added, has an advantage given its culture and ethics. In science and technology, ethics matters.

On the importance of quantum computers, Serguei Beloussov, who is also the first man to bring cyber protection to motorsports, said the world was becoming digital whether we want it or not. The world is transforming from the primarily physical world of the past to the digital world of the future. The world is now about the Internet of Things, next-generation computers, big data, virtual reality and space exploration.

On how and where Bhutan could start, Serguei Beloussov, said IT is an amazing field where everything changes every 10 years. There is no way that you are behind because there are many aspects to IT that are new. You can just start new and will be starting fresh with everyone else, he said.

Serguei Beloussov said real-world problems need computers to solve them. The problems of ageing and disease, environment and global warming, and social justice and poverty could be solved with better computers, he said. If we have the right computers, we could predict the problems of the universe. Quantum computers are a reality, and they offer amazing capabilities to solve once-unsolvable problems.

The digital world, however, is fragile and needs security. Therefore, cyber protection has become a basic need in the digital world. Without cyber protection, we cannot continue into the future, just as we cannot continue living without an immune system. Cyber protection is the immune system of the digital world.

The CEO said cybersecurity was a priority for Bhutan too. There is no choice. Whether you want to be happy or unhappy, you have to have cybersecurity once you have gone digital, he said. For Bhutan, cybersecurity is all the more important given the country's geographic location.

On the apprehension that supercomputers, or building the next generation of computers, would require resources, human and material, Serguei Beloussov said science and technology are not as difficult or complicated as many believe, although a lot of symbols and technical data are involved. Sixty years ago, when people were working with information theory and computer science, it was for scientists; today, it is for everyone. Quantum physics is actually simpler than classical physics. In fact, it is not harder to learn than arithmetic, he said. Three hundred years ago, only priests could read, and they were considered special people. Today everybody can read, write and count. This is no harder.

GNH and the digital world

Calling himself a believer in knowledge, Serguei Beloussov said that knowledge could make people happier, when asked how the drive for technology fits into the concept of Gross National Happiness. I believe that without knowledge, you will be unhappy. And so, if you are refusing technology, you are effectively refusing knowledge.

Bhutanese, he said, were a lot happier than others, but the country did not rank high on the happiness index. He pointed to issues related to unemployment. People want employment. In my country, we have 100 percent employment and people are happy. I don't think you can argue that you want to have fewer jobs, he said.

On GDP, Serguei Beloussov said the world cares about GDP. People want higher standards of living. Everybody wants to have a nice house, live a longer life, be sick less and get a better education, he said.

In my opinion, if you improve a person's life, you provide them with better education, better schools, better healthcare, better roads, better food, a better environment and cleaner forests.

IT hub in Bhutan?

At the talk, the CEO said that SIT was considering establishing a South Asia campus in Bhutan. Although it is at an initial stage, the founder of SIT said that leveraging on IT could create high-end jobs, attract high-end tourists and promote local industries and the government.

He said that just 10,000 IT jobs in the country could add about 30,000 non-IT jobs, develop Bhutanese tech companies and add about Nu 3 billion to the GDP.

He also said that SIT campuses could create leaders. The initial focus for a SIT campus in the country would be cybersecurity, artificial intelligence and machine learning, software engineering, and robotics in the field of computing.

In the field of business, the approach is on digitalising health, new-generation business management, digital sports, digital learning and education, and artificial intelligence in arts and design.

The talk on Tuesday, February 11, was organised by His Majesty's Secretariat at the Taj Hotel, Thimphu.

Read the rest here:
Nobody is behind in science and technology: Serguei Beloussov - Kuensel, Bhutan's National Newspaper

Budget 2020: Govt bets on AI, data analytics and quantum computing – Livemint

Finance Minister Nirmala Sitharaman on Saturday announced an outlay of ₹8,000 crore over the next five years for a national mission on quantum technologies, while emphasising the importance of leveraging artificial intelligence, data analytics, and the Internet of Things for digital governance.

Data centre parks will be set up in India with the help of the private sector, she said. The budget also allocated ₹6,000 crore for Bharat Net and said 1 lakh gram panchayats will get fibre-to-the-home connections under the Bharat Net scheme in one year.

"Policy on private sector building data centre parks is an exciting opportunity for fintech companies. This is also in line with the government's policy on retaining critical data within the country," said Sanjay Khan, partner, Khaitan & Co.

Maninder Bharadwaj, partner, Deloitte India said the emphasis of government on data and digitisation is clearly highlighted in this budget. Building of data centers, collection of nutritional information from 10 crore households and focus on fiber optic networks are initiatives that will propel India towards a digital journey," he added.

Artificial intelligence and machine learning featured extensively in the minister's speech, with proposals to use them in various existing and future projects, such as the proposed national policy for statistics and the Ayushman Bharat scheme.

While the government had previously set up a national portal for AI research and development, in the latest announcement, the government has continued to offer its support for tech advancements. "We appreciate the government's emphasis on promoting cutting-edge technologies in India," Atul Rai, co-founder & CEO of Staqu said in a statement.

Governments across the world have been laying emphasis on the use of AI for digital governance. As per reports, the US government intends to spend almost $1 billion on AI-related research and development in 2020.

See more here:
Budget 2020: Govt bets on AI, data analytics and quantum computing - Livemint

Could Photonic Chips Outpace the Fastest Supercomputers? – Singularity Hub

There's been a lot of talk about quantum computers being able to solve far more complex problems than conventional supercomputers. The authors of a new paper say they're on the path to showing an optical computer can do so, too.

The idea of using light to carry out computing has a long pedigree, and it has gained traction in recent years with the advent of silicon photonics, which makes it possible to build optical circuits using the same underlying technology used for electronics. The technology shows particular promise for accelerating deep learning, and is being actively pursued by Intel and a number of startups.

Now Chinese researchers have put a photonic chip to work tackling a fiendishly complex computer science challenge called the subset sum problem. It has some potential applications in cryptography and resource allocation, but primarily it's used as a benchmark to test the limits of computing.

Essentially the task is to work out whether any subset of a given selection of numbers adds up to a chosen target number. The task is NP-complete, which means the time required to solve it scales rapidly as you use a bigger selection of numbers, making it fundamentally tricky to calculate large instances of the challenge in a reasonable time using normal computing approaches.
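To make the task concrete, here is a minimal classical sketch of the subset sum check; the numbers are illustrative only, and the exhaustive search over all 2^n subsets is exactly the brute force whose cost explodes as the set grows:

```python
from itertools import chain, combinations

def has_subset_with_sum(numbers, target):
    """Brute-force subset sum check: inspect every one of the 2**n subsets."""
    all_subsets = chain.from_iterable(
        combinations(numbers, k) for k in range(len(numbers) + 1)
    )
    return any(sum(subset) == target for subset in all_subsets)

# Illustrative instance with six numbers.
print(has_subset_with_sum([3, 34, 4, 12, 5, 2], 9))   # True  (4 + 5)
print(has_subset_with_sum([3, 34, 4, 12, 5, 2], 30))  # False (no subset works)
```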

However, optical computers work very differently from standard ones, and the device built by the researchers was able to solve the problem in a way that suggests future versions could outpace even the fastest supercomputers. They even say it could be a step on the way to photonic supremacy, mimicking the term quantum supremacy used to denote the point at which quantum computers outperform classical ones.

The chip the researchers designed is quite different from a conventional processor, though, and did not rely on silicon photonics. While most chips can be reprogrammed, the ones built by the researchers can only solve a particular instance of the subset sum problem. A laser was used to etch the task into a special glass by creating a network of waveguides that channel photons through the processor, as well as a series of junctions that get the light beams to split, pass each other, or converge.

They used a laser and series of lenses and mirrors to shoot a beam of light into one end of the processor, and a light detector then picked up the output as it came out the other side. The network of channels is designed to split the light into many different beams that explore all possible combinations of numbers simultaneously in parallel.

The team created two chips designed to solve the problem for sets of three and four numbers, and they showed it could do both easily and efficiently. Problems that small aren't especially tough; you could probably do them on the back of an envelope, and conventional chips can work them out in fractions of a nanosecond.

However, the researchers say their approach could fairly simply be scaled up to much bigger instances of the problem, and that's where things get interesting. For their approach, the time it takes to compute is simply a function of the speed of light and the longest path in the network. The former doesn't change and the latter goes up fairly gradually with bigger problems, and so their calculations show computing time shouldn't shift much even scaling up to far bigger problems.

Conventional chips have to do a brute-force search of every possible combination of numbers, which expands rapidly as the problem gets bigger. The group's calculations suggest that their chip would surpass a state-of-the-art Intel i7 CPU at a problem size of just six, which they think they should be able to demonstrate in their next experiment. Their estimates also predict their approach would overtake the world's most powerful supercomputer, Summit, at a problem size of just 28.
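As a back-of-the-envelope illustration of that gap, the snippet below counts the subsets a classical brute-force search would have to inspect at the problem sizes mentioned in the article (the photonic chip's runtime, per the researchers, instead tracks the longest optical path):

```python
# Number of subsets an exhaustive classical search must consider for n items.
for n in (3, 4, 6, 28):
    print(f"n = {n:>2}: {2 ** n:>12,} subsets")
```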

Obviously, the proof is in the pudding, and until they've built much larger chips it's hard to predict if there might be unforeseen roadblocks. The fact that each chip is bespoke for a particular problem would seem to make it impractical for most applications.

While there is some prospect of mapping real-world problems onto subset problems that could be solved in this way, it's likely any practical application would use an alternative chip design. But the researchers say it's a great demonstration of the potential for photonic approaches to vastly outstrip conventional computers at some problems.


Originally posted here:
Could Photonic Chips Outpace the Fastest Supercomputers? - Singularity Hub

New Centers Lead the Way towards a Quantum Future – Energy.gov

The world of quantum is the world of the very, very small. At sizes near those of atoms and smaller, the rules of physics start morphing into something unrecognizable, at least to us in the regular world. While quantum physics seems bizarre, it offers huge opportunities.

Quantum physics may hold the key to vast technological improvements in computing, sensing, and communication. Quantum computing may be able to solve problems in minutes that would take lifetimes on today's computers. Quantum sensors could act as extremely high-powered antennas for the military. Quantum communication systems could be nearly unhackable. But we don't have the knowledge or capacity to take advantage of these benefits yet.

The Department of Energy (DOE) recently announced that it will establish Quantum Information Science Centers to help lay the foundation for these technologies. As Congress put forth in the National Quantum Initiative Act, the DOE's Office of Science will make awards for at least two and up to five centers.

These centers will draw on both quantum physics and information theory to give us a soup-to-nuts understanding of quantum systems. Teams of researchers from universities, DOE national laboratories, and private companies will run them. Their expertise in quantum theory, technology development, and engineering will help each center undertake major, cross-cutting challenges. The centers' work will range from discovery research up to developing prototypes. They'll also address a number of different technical areas. Each center must tackle at least two of these subjects: quantum communication, quantum computing and emulation, quantum devices and sensors, materials and chemistry for quantum systems, and quantum foundries for synthesis, fabrication, and integration.

The impacts won't stop at the centers themselves. Each center will have a plan in place to transfer technologies to industry or other research partners. They'll also work to leverage DOE's existing facilities and collaborate with non-DOE projects.

As the nation's largest supporter of basic research in the physical sciences, the Office of Science is thrilled to head this initiative. Although quantum physics depends on the behavior of very small things, the Quantum Information Science Centers will be a very big deal.

The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://www.energy.gov/science.

Continue reading here:
New Centers Lead the Way towards a Quantum Future - Energy.gov

AI has great potential in transforming the world: Alphabet CEO Sundar Pichai – YourStory

In recent years, artificial intelligence (AI) has become the talk of the town. No forum seems to be complete without talking about how technology is going to impact the world.

In a conversation with Professor Klaus Schwab, Founder and Executive Chairman of the World Economic Forum, Sundar Pichai, CEO of Google and Alphabet, shared some valuable insights on the age of AI, the future of the open web, and technology's impact on society at the recently concluded WEF summit in Davos, Switzerland.

While several may argue that technology is negatively impacting the world by taking away jobs and compromising the safety and security of individuals, Pichai calls himself a technology optimist and believes that despite its disadvantages, AI has great potential in transforming the world, from climate to healthcare.


Edited excerpt from the interview:

Professor Klaus Schwab (PKS) - Welcome Sundar Pichai. My first question is, you have called yourself a technology optimist, and we hear a lot of concerns about technologies. What makes you an optimist?

Sundar Pichai (SP) - What makes me a technology optimist? I think it's more about how I got introduced to technology. Growing up, I think I had to wait for a long time before I got my hands on either a telephone or a television when it came to our household. I distinctly remember how it changed our lives. TV allowed me access to world news, football, and cricket. So I always had this first-hand experience of how gaining access to technology changes people's lives.

Later on, I was inspired by the One Laptop per Child project, where the goal was giving $100 laptops to schoolchildren. They didn't quite get there. But I think it was a very inspiring goal and it drove a lot of progress in the industry. Later, we were able to make progress with Android. Each year, millions of people get access to computing for the first time. We do this with low-cost, affordable Chromebooks. And seeing the difference it has made in people's lives gives me great hope for the path ahead. And more recently with AI, just in the last month, we have seen how it can help doctors detect breast cancer with more accuracy.

We also launched a better rainfall prediction app. Over time, AI can play a role in climate change. So when you see these examples firsthand, I'm clear-eyed about the risks with technology. But the biggest risk with AI may be failing to work on it and make more progress with it because it can impact billions of people.

PKS - Can you explain what we can expect from quantum computing?

SP - It's an extraordinarily important milestone we achieved last year, something that's known in the field as quantum supremacy. It is when you can take quantum computers and they can do something which classical computers cannot. To me, nature at a fundamental level works in a quantum way. At a subatomic level, things can exist in many different states at the same time. Classical computers work in ones and zeros, so we know that's an imperfect way to simulate nature. Nature works differently. What's exciting about quantum computing, and why we are so excited about the possibilities, is that it will allow us to understand the world more deeply. We can simulate nature better. So that means simulating molecular structures to discover better drugs, understanding the climate more deeply to predict weather patterns and tackle climate change, etc. We can design better batteries and improve nitrogen fixation, the process by which we make the world's fertilisers, which accounts for two percent of carbon emissions. And that process has not changed for a long time because it's very complicated.

Quantum computers give us the hope that we can make that process more efficient. So it's very profound. In technology, we've all been dealing with the end of Moore's Law. It revolutionised the past 40 years, but it has levelled off. So when I look at the future and ask how we drive improvements, quantum will be one of the tools in our arsenal by which we can keep something like Moore's Law continuing to evolve. The potential is huge and we'll have challenges. But in five to 10 years, quantum computing will break encryption as we know it today. But we can work around it. We need to do quantum encryption. There are challenges, as always with any evolving technology. But I think the combination of AI and quantum will help us tackle some of the biggest problems we see.

PKS - And also to a certain extent, genetics. I think quantum computing and biology will have great potential positive or negative impacts.

SP - The positive one, as you're saying, rightly is to simulate molecules, protein folding, etc. It's very complex today. We cannot do it with classical computers. So with quantum computers, we can. But we have to be clear about all these powerful technologies. And this is why I think we need to be deliberate and regulate technologies like AI, and as a society, we need to engage in it.

PKS - And this leads me to the next question, actually, because in an editorial in the Financial Times, which I read just before the annual meeting, you stated, and I quote, that Google's role starts with recognising the need for a principled and regulated approach to applying artificial intelligence. What does that mean?

SP - You know, I've said this before that AI is one of the most profound things we are working on as humanity. It's more profound than fire, electricity, or any of the other bigger things we have worked on. It has tremendous positive sides to it. But it has real negative consequences. When you think about technologies like facial recognition, it can be used to benefit. It can be used to find missing people, but it can (also) be used for mass surveillance. And as democratic countries with a shared set of values, we need to build on those values and make sure when we approach AI we're doing it in a way that serves society. And that means making sure AI doesn't have a bias that we build and test it for safety. We make sure that there is a human agency that is ultimately accountable to people.

About 18 months ago, we published a set of principles under which we would develop as Google. But it's been very encouraging to see the European Commission has identified AI and sustainability as their top priorities. And the US put out a set of principles last week. And, be it the OECD or G20, they're all talking about this, which I think is very encouraging. And I think we need a common framework by which we approach AI.

PKS - How do you see Google in five years from now?

SP - We know we will do well, only if others do well along with us. That's how Google works today through search. We help users reach the information they want including businesses and businesses grow along with search. In the US, last year, we created $335 billion of economic opportunity. And that's true in every country around the world. We think with Alphabet, there's a real chance to take a long-term view and work on technology which can improve people's lives. But we won't do it alone. In many other bets, which we are working on where we can, we take outside investments. These companies are independent, so you can imagine we'll do it in partnerships with other companies. And Alphabet gives us the flexibility to have different structures for different areas in a way we need them to fix healthcare, and we can deeply partner with other companies. Today, we partner with the leading healthcare companies as we work on these efforts.

So we understand for Alphabet to do well, we inherently need to do it in a way that works with other companies, creating an ecosystem around it. This is why last year, just through our venture arm, we invested in over 100 companies. We are just investors in these companies, and they're going to be independent companies. We want them to thrive and succeed. And so, you know, that's the way we think about it. But I think it gives us a real chance to take a long-term view, be it self driving cars or AI.

PKS - So, last question. You said you are an optimist. When you wake up at night and cannot sleep anymore, what is it that worries you?

SP - You were pretty insightful. That is true. Yeah, I do wake up at night. What worries me at night? I think technology has a chance to transform society for the good, but we need to learn to harness it to work for society's good. But I do worry that we turn our backs on technology. And I worry that when people do that they get left behind too. And so to me, how do you do it inclusively? I was in Belgium and I went to MolenGeek, a startup incubator in Molenbeek. In that community, you see people who may not have gone to school, but when you give them access to digital skills, they're hungry for it. People want to learn technology and be a part of it. That's the desire you see around the world when we travel. When I go to emerging markets, it's a big source of opportunity. And so I think it's our duty and responsibility to drive this growth inclusively. And that keeps me up at night.

(Edited by Suman Singh)

Read the original:
AI has great potential in transforming the world: Alphabet CEO Sundar Pichai - YourStory

‘How can we compete with Google?’: the battle to train quantum coders – The Guardian

There is a laboratory deep within University College London (UCL) that looks like a cross between a rebel base in Star Wars and a scene imagined by Jules Verne. Hidden within the miles of cables, blinking electronic equipment and screens is a gold-coloured contraption known as a dilution refrigerator. Its job is to chill the highly sensitive equipment needed to build a quantum computer to close to absolute zero, the coldest temperature in the known universe.

Standing around the refrigerator are students from Germany, Spain and China, who are studying to become members of an elite profession that has never existed before: quantum engineering. These scientists take the developments in quantum mechanics over the past century and turn them into revolutionary real-world applications in, for example, artificial intelligence, self-driving vehicles, cryptography and medicine.

The problem is that there is now what analysts call a quantum bottleneck. Owing to the fast growth of the industry, not enough quantum engineers are being trained in the UK or globally to meet expected demand. This skills shortage has been identified as a crucial challenge and will, if unaddressed, threaten Britain's position as one of the world's top centres for quantum technologies.

"The lack of access to a pipeline of talent will pose an existential threat to our company, and others like it," says James Palles-Dimmock, commercial director of London- and Oxford-based startup Quantum Motion. "You are not going to make a quantum computer with 1,000 average people; you need 10 to 100 incredibly good people, and that'll be the case for everybody worldwide, so access to the best talent is going to define which companies succeed and which fail."

This doesn't just matter to niche companies; it affects everyone. "If the UK is to remain at the leading edge of the world economy then it has to compete with the leading technological and scientific developments," warns Professor Paul Warburton, director of the CDT in Delivering Quantum Technologies. "This is the only way we can maintain our standard of living."

This quantum bottleneck is only going to grow more acute. Data is scarce, but according to research by the Quantum Computing Report and the University of Wisconsin-Madison, on one day in June 2016 there were just 35 vacancies worldwide for commercial quantum companies advertised. By December, that figure had leapt to 283.

In the UK, Quantum Motion estimates that the industry will need another 150 to 200 quantum engineers over the next 18 months. In contrast, Bristol University's centre for doctoral training produces about 10 qualified engineers each year.

In the recent past, quantum engineers would have studied for their PhDs in small groups inside much larger physics departments. Now there are interdisciplinary centres for doctoral training at UCL and Bristol University, where graduates in such subjects as maths, engineering and computer science, as well as physics, work together. As many of the students come with limited experience of quantum technologies, the first year of their four-year course is a compulsory introduction to the subject.

"Rather than work with three or four people inside a large physics department it's really great to be working with lots of people all on quantum, whether they are computer scientists or engineers. They have a high level of knowledge of the same problems, but a different way of thinking about them because of their different backgrounds," says Bristol student Naomi Solomons.

While Solomons is fortunate to study on an interdisciplinary course, these are few and far between in the UK. "We are still overwhelmingly recruiting physicists," says Paul Warburton. "We really need to massively increase the number of PhD students from outside the physics domain to really transform this sector."

The second problem, according to Warburton, is competition with the US. "Anyone who graduates with a PhD in quantum technologies in this country is well sought after in the USA." The risk of lucrative US companies poaching UK talent is considerable. "How can we compete with Google or D-Wave if it does get into an arms race?" says Palles-Dimmock. "They can chuck $300,000-$400,000 at people to make sure they have the engineers they want."

There are parallels with the fast growth of AI. In 2015, Uber's move to gut Carnegie Mellon University's world-leading robotics lab of nearly all its staff (about 50 in total) to help it build autonomous cars showed what can happen when a shortage of engineers causes a bottleneck.

Worryingly, Doug Finke, managing editor at Quantum Computing Report, has spotted a similar pattern emerging in the quantum industry today. "The large expansion of quantum computing in the commercial space has encouraged a number of academics to leave academia and join a company, and this may create some shortages of professors to teach the next generation of students," he says.

More needs to be done to significantly increase the flow of engineers. One way is through diversity: Bristol has just held its first women in quantum event with a view to increasing its number of female students above the current 20%.

Another option is to create different levels of quantum engineers. "A master's degree or a four-year dedicated undergraduate degree could be the way to mass-produce engineers because industry players often don't need a PhD-trained individual," says Turner. "But I think you would be training more a kind of foot soldier than an industry leader."

One potential roadblock could be growing threats to the free movement of ideas and people. "Nations seem to be starting to get a bit protective about what they're doing," says Prof John Morton, founding director of Quantum Motion. "[They] are often using concocted reasons of national security to justify retaining a commercial advantage for their own companies."

Warburton says he has especially seen this in the US. This reinforces the need for the UK to train its own quantum engineers. "We can't rely on getting our technology from other nations. We need to have our own quantum technology capability."

Read more here:
'How can we compete with Google?': the battle to train quantum coders - The Guardian

University of Sheffield launches Quantum centre to develop the technologies of tomorrow – Quantaneo, the Quantum Computing Source

A new research centre with the potential to revolutionise computing, communication, sensing and imaging technologies is set to be launched by the University of Sheffield this week (22 January 2020).

The Sheffield Quantum Centre, which will be officially opened by Lord Jim O'Neill, Chair of Chatham House and University of Sheffield alumnus, is bringing together more than 70 of the University's leading scientists and engineers to develop new quantum technologies.

Quantum technologies are a broad range of new materials, devices and information technology protocols in physics and engineering. They promise unprecedented capabilities and performance by exploiting phenomena that cannot be explained by classical physics.

Quantum technologies could lead to the development of more secure communications technologies and computers that can solve problems far beyond the capabilities of existing computers.

Research into quantum technologies is a high priority for the UK and many countries around the world. The UK government has invested heavily in quantum research as part of a national programme and has committed £1 billion in funding over 10 years.

Led by the University's Department of Physics and Astronomy, Department of Electronic and Electrical Engineering and Department of Computer Science, the Sheffield Quantum Centre will join a group of northern universities that are playing a significant role in the development of quantum technologies.

The University of Sheffield has a strong presence in quantum research with world leading capabilities in crystal growth, nanometre scale device fabrication and device physics research. A spin-out company has already been formed to help commercialise research, with another in preparation.

Professor Maurice Skolnick, Director of the Sheffield Quantum Centre, said: "The University of Sheffield already has very considerable strengths in the highly topical area of quantum science and technology. I have strong expectation that the newly formed centre will bring together these diverse strengths to maximise their impact, both internally and more widely across UK universities and funding bodies."

During the opening ceremony, the Sheffield Quantum Centre will also launch its new £2.1 million Quantum Technology Capital equipment.

Funded by the Engineering and Physical Sciences Research Council (EPSRC), the equipment is a molecular beam epitaxy cluster tool designed to grow very high quality wafers of semiconductor materials – types of materials that have numerous everyday applications, such as in mobile phones and the lasers that drive the internet.

The semiconductor materials also have many new quantum applications which researchers are focusing on developing.

Professor Jon Heffernan from the University's Department of Electronic and Electrical Engineering, added: "The University of Sheffield has a 40-year history of pioneering developments in semiconductor science and technology and is host to the National Epitaxy Facility. With the addition of this new quantum technologies equipment I am confident our new research centre will lead to many new and exciting technological opportunities that can exploit the strange but powerful concepts from quantum science."

More:
University of Sheffield launches Quantum centre to develop the technologies of tomorrow - Quantaneo, the Quantum Computing Source

20 technologies that could change your life in the next decade – Economic Times

The decade that's knocking on our doors now – the 2020s – is likely to be a time when science fiction manifests itself in our homes and roads and skies as viable, everyday technologies. Cars that can drive themselves. Meat that is derived from plants. Robots that can be fantastic companions both in bed and outside.

Implanting kidneys that can be 3-D printed using your own biomaterial. Using gene editing to eradicate diseases, increase crop yield or fix genetic disorders in human beings. Inserting a swarm of nanobots that can cruise through your blood stream and monitor parameters or unblock arteries. Zipping between Delhi and New York on a hypersonic jet. All of this is likely to become possible or substantially closer to becoming a reality in the next 10 years.

Ideas that have been the staple of science fiction for decades – artificial intelligence, universal translators, sex robots, autonomous cars, gene editing and quantum computing – are at the cusp of maturity now. Many are ready to move out of labs and enter the mainstream. Expect the next decade to witness breakout years for the world of technology.

Read on:

The 2020s: A new decade promising miraculous tech innovations

Universal translators: End of language barrier

Climate interventions: Clearing the air from carbon

Personalised learning: Pedagogy gets a reboot with AI

Made in a Printer: 3-D printing is going to be a new reality

Digital money: End of cash is near, cashless currencies are in vogue

Singularity: An era where machines will out-think humans

Mach militaries: Redefining warfare in the 2020s

5G & Beyond: Ushering in a truly connected world

Technology: Solving the problem of clean water

Quantum computing: Beyond the power of classical computing

Nanotechnology: From science fiction to reality

Power Saver: Energy-storage may be the key to maximise power generation

Secret code: Gene editing could prove to be a game-changer

Love in the time of Robots: The rise of sexbots and artificial human beings

Wheels of the future: Flying cars, hyperloops and e-highways will transform how people travel

New skies, old fears: The good, bad & ugly of drones

Artificial creativity: Computer programs could soon churn out books, movies and music

Meat alternatives: Alternative meat market is expected to grow 10 times by 2029

Intelligent robots & cyborg warriors will lead the charge in battle

Why we first need to focus on the ethical challenges of artificial intelligence

It's time to reflect honestly on our motivations for innovation

India's vital role in new space age

Plastic waste: Environment-friendly packaging technologies will gain traction

The rest is here:

20 technologies that could change your life in the next decade - Economic Times

2020s — the Decade of AI and Quantum – Inside Higher Ed

Too often, we look ahead assuming that the technologies and structures of today will be in place for years to come. Yet a look back confirms that change has moved at a dramatic pace in higher education.

Reviewing the incredible progress each decade brings makes me wonder, if I knew at the beginning of the decade what was coming, how might I have better prepared?

Make no mistake, we have crossed the threshold into the fourth industrial revolution that will most markedly advance this decade through maturing artificial intelligence, ultimately driven by quantum computing. The changes will come at an ever-increasing rate as the technologies and societal demands accelerate. Digital computers advanced over the past half century at approximately the rate described by Moore's Law, with processing power doubling every two years. Now we are entering the era of Neven's Law, which predicts the speed of progress of quantum computing at a doubly exponential rate. This means change at a dizzyingly rapid rate that will leave many of us unable to comprehend the why and barely able to digest the daily advances that will describe reality. New platforms, products and processes will proliferate in this new decade.
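
To make the difference between those two growth rates concrete, here is a small, purely illustrative Python sketch; the generation counts and the starting value are arbitrary assumptions for illustration, not a model of any real processor roadmap.

```python
# Purely illustrative: compares growth that doubles each generation (Moore's-Law-style)
# with doubly exponential growth of the kind Neven's Law describes.

def doubling(generation: int) -> int:
    """Power that doubles once per generation: 2**generation."""
    return 2 ** generation

def doubly_exponential(generation: int) -> int:
    """Power that grows doubly exponentially: 2**(2**generation)."""
    return 2 ** (2 ** generation)

for g in range(1, 7):
    print(f"gen {g}: doubling = {doubling(g):>3}, doubly exponential = {doubly_exponential(g)}")
```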

That includes higher education. The centuries-old model of the faculty member at a podium addressing a class of students who are inconsistently and inaccurately taking notes on paper or laptop will seem so quaint, inefficient and impractical that it will be laughable. Observers in 2030 will wonder how any significant learning even took place in that environment.

Semesters and seat time will not survive the coming decade. Based in 19th- and 20th-century societal needs, these are long overdue to pass away. The logical and efficient structure of outcomes-based adaptive learning will quickly overtake the older methods, doing away with redundancy for the advanced students and providing developmental learning for those in need. Each student will be at the center of their learning experience, with AI algorithms fed by rich data about each student mapping progress and adjusting the pathway for each learner. This will lead to personalized learning where the courses and curriculum will be custom-made to meet the needs of the individual learner. Yet it will also serve to enhance the social experience for learners meeting face-to-face. In a report from Brookings on the topic, researchers stated that technology can help education leapfrog in a number of ways. It can provide individualized learning by tracking progress and personalizing activities to serve heterogeneous classrooms.

Early implementations of adaptive learning in the college setting have shown that this AI-driven process can result in greater equity and success for students. In addition, the faculty members see that their role has become even more important as they directly interact with individual students to enable and facilitate their learning.

Increasingly we are gathering data about our students as they enter and progress through learning at our institutions. That big data is the "food" upon which artificial intelligence thrives. Sorting through volumes and varieties of data that in prior decades we could not efficiently process, AI can now uncover cause and effect pairs and webs. It can lead us to enhancements and solutions that previously were beyond our reach. As the pool of data grows and becomes more and more diverse -- not just numbers, but also videos and anecdotes -- the role of quantum computing comes into play.

While it is unlikely we will see quantum computers physically on the desks of university faculty and staff in the coming decade, we certainly will see cloud use of quantum computers to solve increasingly complex problems and opportunities. Quantum computers will interact with digital computers to apply deep learning at an as yet unseen scale. We will be able to pose challenges such as "what learning will researchers need to best prepare for the next generation of genetic advancement?" Faster than a blink of an eye, the quantum computers will respond.

It turns out that major developments are occurring every day in the advancement of quantum computing. Johns Hopkins University researchers recently discovered a superconducting material that may more effectively host qubits in the future. And Oxford University researchers just uncovered ways in which strontium ions can be much more efficiently entangled for scaling quantum computers. Advancements such as these will pave the path to ever more powerful computers that will enable ever more effective adaptive, individualized and personalized learning.

We know that change is coming. We know the direction of that change. We know some of the actual tools that will be instrumental in that change. Armed with that knowledge, what can we do today to prepare for the decade of the 2020s? Rather than merely reacting to changes after the fact, can we take steps to anticipate and prepare for that change? Can our institutions be better configured to adapt to the changes that are on the horizon? And who will lead that preparation at your institution?

Read the rest here:
2020s -- the Decade of AI and Quantum - Inside Higher Ed

ProBeat: AWS and Azure are generating uneasy excitement in quantum computing – VentureBeat

Quantum is having a moment. In October, Google claimed to have achieved a quantum supremacy milestone. In November, Microsoft announced Azure Quantum, a cloud service that lets you tap into quantum hardware providers Honeywell, IonQ, or QCI. Last week, AWS announced Amazon Braket, a cloud service that lets you tap into quantum hardware providers D-Wave, IonQ, and Rigetti. At the Q2B 2019 quantum computing conference this week, I got a pulse for how the nascent industry is feeling.

Binary digits (bits) are the basic units of information in classical computing, while quantum bits (qubits) are the basic units in quantum computing. Bits are always in a state of 0 or 1, while qubits can be in a state of 0, 1, or a superposition of the two. Quantum computing leverages qubits to perform computations that would be much more difficult for a classical computer. Potential applications are so vast and wide (from basic optimization problems to machine learning to all sorts of modeling) that interested industries span finance, chemistry, aerospace, cryptography, and more. But it's still so early that the industry is nowhere close to reaching consensus on what the transistor for qubits should look like.
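
To make the bit/qubit distinction concrete, here is a minimal state-vector sketch in plain NumPy; it is an illustration only, not the API of Azure Quantum, Braket, or any hardware provider named above. A qubit's state is two complex amplitudes, and a Hadamard gate puts it into an equal superposition of 0 and 1.

```python
# Minimal sketch of a single qubit as a state vector of two complex amplitudes.
# The squared magnitude of each amplitude gives the probability of measuring 0 or 1.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # |0>: definitely 0
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = hadamard @ ket0                                  # equal superposition
probabilities = np.abs(superposed) ** 2                       # Born rule

print(superposed)     # [0.707...+0.j 0.707...+0.j]
print(probabilities)  # [0.5 0.5] -> 50% chance of measuring 0, 50% of measuring 1
```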

Currently, your cloud quantum computing options are limited to single hardware providers, such as those from D-Wave and IBM. Amazon and Microsoft want to change that.

Enterprises and researchers interested in testing and experimenting with quantum are excited because they will be able to use different quantum processors via the same service, at least in theory. They're uneasy, however, because the quantum processors are so fundamentally different that it's not clear how easy it will be to switch between them. D-Wave uses quantum annealing, Honeywell and IonQ use ion trap devices, and Rigetti and QCI use superconducting chips. Even the technologies that are the same have completely different architectures.

Entrepreneurs and enthusiasts are hopeful that Amazon and Microsoft will make it easier to interface with the various quantum hardware technologies. They're uneasy, however, because Amazon and Microsoft have not shared pricing and technical details. Plus, some of the quantum providers offer their own cloud services, so it will be difficult to suss out when it makes more sense to work with them directly.

The hardware providers themselves are excited because they get exposure to massive customer bases. Amazon and Microsoft are the world's biggest and second biggest cloud providers, respectively. They're uneasy, however, because the tech giants are really just middlemen, which of course poses its own problems of costs and reliance.

At least right now, it looks like this will be the new normal. Even hardware providers that haven't announced they are partnering with Amazon and/or Microsoft, like Xanadu, are in talks to do just that.

Overall at the event, excitement trumped uneasiness. If you're participating in a domain as nascent as quantum, you must be optimistic. The news this quarter all happened very quickly, but there is still a long road ahead. After all, these cloud services have only been announced. They still have to become available, gain exposure, pick up traction, become practical, prove useful, and so on.

The devil is in the details. How much are these cloud services for quantum going to cost? Amazon and Microsoft haven't said. When exactly will they be available in preview or in beta? Amazon and Microsoft haven't said. How will switching between different quantum processors work in practice? Amazon and Microsoft haven't said.

One thing is clear. Everyone at the event was talking about the impact of the two biggest cloud providers offering quantum hardware from different companies. The clear winners? Amazon and Microsoft.

ProBeat is a column in which Emil rants about whatever crosses him that week.

Read the rest here:

ProBeat: AWS and Azure are generating uneasy excitement in quantum computing - VentureBeat

Quantum expert Robert Sutor explains the basics of Quantum Computing – Packt Hub

What if we could do chemistry inside a computer instead of in a test tube or beaker in the laboratory? What if running a new experiment was as simple as running an app and having it completed in a few seconds?

For this to really work, we would want it to happen with complete fidelity. The atoms and molecules as modeled in the computer should behave exactly like they do in the test tube. The chemical reactions that happen in the physical world would have precise computational analogs. We would need a completely accurate simulation.

If we could do this at scale, we might be able to compute the molecules we want and need.

These might be for new materials for shampoos or even alloys for cars and airplanes. Perhaps we could more efficiently discover medicines that are customized to your exact physiology. Maybe we could get a better insight into how proteins fold, thereby understanding their function, and possibly creating custom enzymes to positively change our body chemistry.

Is this plausible? We have massive supercomputers that can run all kinds of simulations. Can we model molecules in the above ways today?

This article is an excerpt from the book Dancing with Qubits written by Robert Sutor. Robert helps you understand how quantum computing works and delves into the math behind it with this quantum computing textbook.

Let's start with C8H10N4O2, or 1,3,7-trimethylxanthine.

This is a very fancy name for a molecule that millions of people around the world enjoy every day: caffeine. An 8-ounce cup of coffee contains approximately 95 mg of caffeine, and this translates to roughly 2.95 × 10^20 molecules. Written out, this is

295,000,000,000,000,000,000 molecules.

A 12-ounce can of a popular cola drink has 32 mg of caffeine, the diet version has 42 mg, and energy drinks often have about 77 mg.
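
The 2.95 × 10^20 figure for a cup of coffee is easy to sanity-check with a back-of-the-envelope calculation, assuming caffeine's molar mass of roughly 194.19 g/mol and Avogadro's number; the snippet below is an illustration added here, not an excerpt from the book.

```python
# Rough check of the molecule count for 95 mg of caffeine.
AVOGADRO = 6.022e23            # molecules per mole
MOLAR_MASS_CAFFEINE = 194.19   # grams per mole for C8H10N4O2

grams_of_caffeine = 0.095      # 95 mg in an 8-ounce cup of coffee
molecules = grams_of_caffeine / MOLAR_MASS_CAFFEINE * AVOGADRO
print(f"{molecules:.2e} molecules")   # ~2.95e+20
```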

These numbers are large because we are counting physical objects in our universe, which we know is very big. Scientists estimate, for example, that there are between 10^49 and 10^50 atoms in our planet alone.

To put these values in context, one thousand = 10^3, one million = 10^6, one billion = 10^9, and so on. A gigabyte of storage is one billion bytes, and a terabyte is 10^12 bytes.

Getting back to the question I posed at the beginning of this section, can we model caffeine exactly on a computer? We don't have to model the huge number of caffeine molecules in a cup of coffee, but can we fully represent a single molecule at a single instant?

Caffeine is a small molecule and contains protons, neutrons, and electrons. In particular, if we just look at the energy configuration that determines the structure of the molecule and the bonds that hold it all together, the amount of information to describe this is staggering. In particular, the number of bits, the 0s and 1s, needed is approximately 10^48:

1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000.

And this is just one molecule! Yet somehow nature manages to deal quite effectively with all this information. It handles everything from the single caffeine molecule, to all those in your coffee, tea, or soft drink, to every other molecule that makes up you and the world around you.

How does it do this? We don't know! Of course, there are theories and these live at the intersection of physics and philosophy. However, we do not need to understand it fully to try to harness its capabilities.

We have no hope of providing enough traditional storage to hold this much information. Our dream of exact representation appears to be dashed. This is what Richard Feynman meant in his quote: "Nature isn't classical."

However, 160 qubits (quantum bits) could hold 2^160 ≈ 1.46 × 10^48 bits while the qubits were involved in a computation. To be clear, I'm not saying how we would get all the data into those qubits and I'm also not saying how many more we would need to do something interesting with the information. It does give us hope, however.
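
For anyone who wants to check the arithmetic, Python's arbitrary-precision integers make it easy to confirm that 2^160 is indeed on the order of the 10^48 bits quoted above; again, this is an illustration, not code from the book.

```python
# Verify that 2**160 is roughly 1.46 x 10**48.
n_states = 2 ** 160
print(n_states)             # 1461501637330902918203684832716283019655932542976
print(f"{n_states:.2e}")    # ~1.46e+48
print(n_states > 10 ** 48)  # True: comfortably above the ~10^48 bits estimated earlier
```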

In the classical case, we will never fully represent the caffeine molecule. In the future, with enough very high-quality qubits in a powerful quantum computing system, we may be able to perform chemistry on a computer.

I can write a little app on a classical computer that can simulate a coin flip. This might be for my phone or laptop.

Instead of heads or tails, let's use 1 and 0. The routine, which I call R, starts with one of those values and randomly returns one or the other. That is, 50% of the time it returns 1 and 50% of the time it returns 0. We have no knowledge whatsoever of how R does what it does.

When you see R, think random. This is called a fair flip. It is not weighted to slightly prefer one result over the other. Whether we can produce a truly random result on a classical computer is another question. Let's assume our app is fair.

If I apply R to 1, half the time I expect 1 and the other half 0. The same is true if I apply R to 0. I'll call these applications R(1) and R(0), respectively.

If I look at the result of R(1) or R(0), there is no way to tell if I started with 1 or 0. This is just like a secret coin flip where I can't tell whether I began with heads or tails just by looking at how the coin has landed. By secret coin flip, I mean that someone else has flipped it and I can see the result, but I have no knowledge of the mechanics of the flip itself or the starting state of the coin.

If R(1) and R(0) are randomly 1 and 0, what happens when I apply R twice?

I write this as R(R(1)) and R(R(0)). It's the same answer: a random result with an equal split. The same thing happens no matter how many times we apply R. The result is random, and we can't reverse things to learn the initial value.
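
A minimal sketch of R as described might look like the following; it is ordinary classical code, included only to make the behaviour concrete.

```python
# The classical routine R: ignore the input and return 0 or 1 with equal probability.
# Applying it once or many times gives the same 50/50 randomness, and nothing about
# the output reveals the starting value.
import random

def R(bit: int) -> int:
    return random.randint(0, 1)

samples = [R(R(1)) for _ in range(10_000)]   # R applied twice, starting from 1
print(sum(samples) / len(samples))           # ~0.5: still an even, irreversible split
```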

In the quantum case there is a counterpart to R, call it H, that looks just as random after a single application but, unlike R, returns the value you started with when applied twice in a row. There is a catch, though. You are not allowed to look at the result of what H does if you want to reverse its effect. If you apply H to 0 or 1, peek at the result, and apply H again to that, it is the same as if you had used R. If you observe what is going on in the quantum case at the wrong time, you are right back at strictly classical behavior.

To summarize using the coin language: if you flip a quantum coin and then don't look at it, flipping it again will yield the heads or tails with which you started. If you do look, you get classical randomness.
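
The coin analogy can be sketched with ordinary linear algebra. The snippet below uses the standard 2 x 2 Hadamard matrix as the quantum "flip"; it is an illustration of the idea above, not code taken from the book.

```python
# H undoes itself only if you don't look in between. Measuring after the first H
# collapses the state, and a second H then behaves like the random routine R.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # the quantum "coin flip"
ket0 = np.array([1.0, 0.0])                    # start in state |0>

# Case 1: flip twice without looking -- back to the starting value.
print(H @ (H @ ket0))                          # ~[1. 0.], i.e. |0> again (up to rounding)

# Case 2: peek after the first flip (a measurement), then flip again.
rng = np.random.default_rng()
probs_after_one_flip = np.abs(H @ ket0) ** 2          # [0.5 0.5]
peeked = rng.choice([0, 1], p=probs_after_one_flip)   # looking picks 0 or 1
collapsed = np.eye(2)[peeked]                  # the state collapses to |0> or |1>
print(np.abs(H @ collapsed) ** 2)              # [0.5 0.5]: classical randomness again
```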

A second area where quantum is different is in how we can work with simultaneous values. Your phone or laptop uses bytes as individual units of memory or storage. That's where we get phrases like megabyte, which means one million bytes of information.

A byte is further broken down into eight bits, which we've seen before. Each bit can be a 0 or 1. Doing the math, each byte can represent 2^8 = 256 different numbers composed of eight 0s or 1s, but it can only hold one value at a time. Eight qubits can represent all 256 values at the same time.

This is through superposition, but also through entanglement, the way we can tightly tie together the behavior of two or more qubits. This is what gives us the (literally) exponential growth in the amount of working memory.
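
One way to feel that exponential growth is to count what a classical simulation would need: an n-qubit state vector holds 2^n complex amplitudes. The sketch below assumes 16 bytes per amplitude (a double-precision complex number); the qubit counts are arbitrary examples.

```python
# Memory needed just to store the amplitudes of an n-qubit state vector.
BYTES_PER_AMPLITUDE = 16   # assumption: one double-precision complex number

for n in (8, 20, 30, 50):
    amplitudes = 2 ** n
    total_bytes = amplitudes * BYTES_PER_AMPLITUDE
    print(f"{n:>2} qubits -> {amplitudes:,} amplitudes -> {total_bytes:,} bytes")
# 50 qubits already require roughly 18,000 terabytes of classical memory.
```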

Artificial intelligence and one of its subsets, machine learning, are extremely broad collections of data-driven techniques and models. They are used to help find patterns in information, learn from the information, and automatically perform more intelligently. They also give humans help and insight that might have been difficult to get otherwise.

Here is a way to start thinking about how quantum computing might be applicable to large, complicated, computation-intensive systems of processes such as those found in AI and elsewhere. These three cases are in some sense the small, medium, and large ways quantum computing might complement classical techniques:

As I write this, quantum computers are not big data machines. This means you cannot take millions of records of information and provide them as input to a quantum calculation. Instead, quantum may be able to help where the number of inputs is modest but the computations blow up as you start examining relationships or dependencies in the data.

In the future, however, quantum computers may be able to input, output, and process much more data. Even if it is just theoretical now, it makes sense to ask if there are quantum algorithms that can be useful in AI someday.

To summarize, we explored how quantum computing works and different applications of artificial intelligence in quantum computing.

Get this quantum computing book Dancing with Qubits by Robert Sutor today where he has explored the inner workings of quantum computing. The book entails some sophisticated mathematical exposition and is therefore best suited for those with a healthy interest in mathematics, physics, engineering, and computer science.

Intel introduces cryogenic control chip, Horse Ridge for commercially viable quantum computing

Microsoft announces Azure Quantum, an open cloud ecosystem to learn and build scalable quantum solutions

Amazon re:Invent 2019 Day One: AWS launches Braket, its new quantum service and releases

Read the rest here:

Quantum expert Robert Sutor explains the basics of Quantum Computing - Packt Hub

Quantum Computers Are the Ultimate Paper Tiger – The National Interest Online

Google announced this fall to much fanfare that it had demonstrated quantum supremacy – that is, it performed a specific quantum computation far faster than the best classical computers could achieve. IBM promptly critiqued the claim, saying that its own classical supercomputer could perform the computation at nearly the same speed with far greater fidelity and, therefore, the Google announcement should be taken with a large dose of skepticism.

This wasn't the first time someone cast doubt on quantum computing. Last year, Michel Dyakonov, a theoretical physicist at the University of Montpellier in France, offered a slew of technical reasons why practical quantum supercomputers will never be built in an article in IEEE Spectrum, the flagship journal of electrical and computer engineering.

So how can you make sense of what is going on?

As someone who has worked on quantum computing for many years, I believe that due to the inevitability of random errors in the hardware, useful quantum computers are unlikely to ever be built.

What's a quantum computer?

To understand why, you need to understand how quantum computers work since they're fundamentally different from classical computers.

A classical computer uses 0s and 1s to store data. These numbers could be voltages on different points in a circuit. But a quantum computer works on quantum bits, also known as qubits. You can picture them as waves that are associated with amplitude and phase.

Qubits have special properties: They can exist in superposition, where they are both 0 and 1 at the same time, and they may be entangled so they share physical properties even though they may be separated by large distances. It's a behavior that does not exist in the world of classical physics. The superposition vanishes when the experimenter interacts with the quantum state.

Due to superposition, a quantum computer with 100 qubits can represent 2^100 solutions simultaneously. For certain problems, this exponential parallelism can be harnessed to create a tremendous speed advantage. Some code-breaking problems could be solved exponentially faster on a quantum machine, for example.

There is another, narrower approach to quantum computing called quantum annealing, where qubits are used to speed up optimization problems. D-Wave Systems, based in Canada, has built optimization systems that use qubits for this purpose, but critics also claim that these systems are no better than classical computers.

Regardless, companies and countries are investing massive amounts of money in quantum computing. China has developed a new quantum research facility worth US$10 billion, while the European Union has developed a €1 billion ($1.1 billion) quantum master plan. The United States National Quantum Initiative Act provides $1.2 billion to promote quantum information science over a five-year period.

Breaking encryption algorithms is a powerful motivating factor for many countries if they could do it successfully, it would give them an enormous intelligence advantage. But these investments are also promoting fundamental research in physics.

Many companies are pushing to build quantum computers, including Intel and Microsoft in addition to Google and IBM. These companies are trying to build hardware that replicates the circuit model of classical computers. However, current experimental systems have less than 100 qubits. To achieve useful computational performance, you probably need machines with hundreds of thousands of qubits.

Noise and error correction

The mathematics that underpins quantum algorithms is well established, but there are daunting engineering challenges that remain.

For computers to function properly, they must correct all small random errors. In a quantum computer, such errors arise from the non-ideal circuit elements and the interaction of the qubits with the environment around them. For these reasons the qubits can lose coherency in a fraction of a second and, therefore, the computation must be completed in even less time. If random errors – which are inevitable in any physical system – are not corrected, the computer's results will be worthless.

In classical computers, small noise is corrected by taking advantage of a concept known as thresholding. It works like the rounding of numbers. Thus, in the transmission of integers where it is known that the error is less than 0.5, if what is received is 3.45, the received value can be corrected to 3.

Further errors can be corrected by introducing redundancy. Thus if 0 and 1 are transmitted as 000 and 111, then at most one bit-error during transmission can be corrected easily: A received 001 would be interpreted as 0, and a received 101 would be interpreted as 1.
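
Both classical ideas, thresholding and the three-bit repetition code, fit in a few lines; the sketch below is only an illustration of the text above, not production error-correction code.

```python
# Classical error correction in miniature: rounding (thresholding) and a
# 3-bit repetition code decoded by majority vote.

def threshold(received: float) -> int:
    """Round a noisy value back to the nearest integer (assumes error < 0.5)."""
    return round(received)

def decode_repetition(codeword: str) -> int:
    """Majority vote over three bits: corrects any single flipped bit."""
    return 1 if codeword.count("1") >= 2 else 0

print(threshold(3.45))           # 3
print(decode_repetition("001"))  # 0 (one bit flipped in the transmitted 000)
print(decode_repetition("101"))  # 1 (one bit flipped in the transmitted 111)
```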

Quantum error correction codes are a generalization of the classical ones, but there are crucial differences. For one, the unknown qubits cannot be copied to incorporate redundancy as an error correction technique. Furthermore, errors present within the incoming data before the error-correction coding is introduced cannot be corrected.

Quantum cryptography

While the problem of noise is a serious challenge in the implementation of quantum computers, it isn't so in quantum cryptography, where people are dealing with single qubits, for single qubits can remain isolated from the environment for a significant amount of time. Using quantum cryptography, two users can exchange the very large numbers known as keys, which secure data, without anyone being able to break the key exchange system. Such key exchange could help secure communications between satellites and naval ships. But the actual encryption algorithm used after the key is exchanged remains classical, and therefore the encryption is theoretically no stronger than classical methods.

Quantum cryptography is being commercially used in a limited sense for high-value banking transactions. But because the two parties must be authenticated using classical protocols, and since a chain is only as strong as its weakest link, it's not that different from existing systems. Banks are still using a classical-based authentication process, which itself could be used to exchange keys without loss of overall security.

Quantum cryptography technology must shift its focus to quantum transmission of information if it's going to become significantly more secure than existing cryptography techniques.

Commercial-scale quantum computing challenges

While quantum cryptography holds some promise if the problems of quantum transmission can be solved, I doubt the same holds true for generalized quantum computing. Error-correction, which is fundamental to a multi-purpose computer, is such a significant challenge in quantum computers that I don't believe they'll ever be built at a commercial scale.


Subhash Kak, Regents Professor of Electrical and Computer Engineering, Oklahoma State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


See more here:

Quantum Computers Are the Ultimate Paper Tiger - The National Interest Online

Quantum Computing Market 2020 Global Overview, Growth, Size, Opportunities, Trends, Leading Company Analysis and Forecast to 2026 – Cole of Duty

1qb Information Technologies

All of the product type and application segments of the Quantum Computing market included in the report are deeply analyzed based on CAGR, market size, and other crucial factors. The segmentation study provided by the report authors could help players and investors to make the right decisions when looking to invest in certain market segments.

The Essential Content Covered in the Quantum Computing Market Report:

* Top Key Company Profiles
* Main Business and Rival Information
* SWOT Analysis and PESTEL Analysis
* Production, Sales, Revenue, Price and Gross Margin
* Market Share and Size

The report is a compilation of different studies, including regional analysis where leading regional Quantum Computing markets are comprehensively studied by market experts. Both developed and developing regions and countries are covered in the report for a 360-degree geographic analysis of the Quantum Computing market. The regional analysis section helps readers to become familiar with the growth patterns of important regional Quantum Computing markets. It also provides information on lucrative opportunities available in key regional Quantum Computing markets.


Table of Content

1 Introduction of Quantum Computing Market

1.1 Overview of the Market
1.2 Scope of Report
1.3 Assumptions

2 Executive Summary

3 Research Methodology

3.1 Data Mining
3.2 Validation
3.3 Primary Interviews
3.4 List of Data Sources

4 Quantum Computing Market Outlook

4.1 Overview
4.2 Market Dynamics
4.2.1 Drivers
4.2.2 Restraints
4.2.3 Opportunities
4.3 Porter's Five Forces Model
4.4 Value Chain Analysis

5 Quantum Computing Market, By Deployment Model

5.1 Overview

6 Quantum Computing Market, By Solution

6.1 Overview

7 Quantum Computing Market, By Vertical

7.1 Overview

8 Quantum Computing Market, By Geography

8.1 Overview
8.2 North America
8.2.1 U.S.
8.2.2 Canada
8.2.3 Mexico
8.3 Europe
8.3.1 Germany
8.3.2 U.K.
8.3.3 France
8.3.4 Rest of Europe
8.4 Asia Pacific
8.4.1 China
8.4.2 Japan
8.4.3 India
8.4.4 Rest of Asia Pacific
8.5 Rest of the World
8.5.1 Latin America
8.5.2 Middle East

9 Quantum Computing Market Competitive Landscape

9.1 Overview
9.2 Company Market Ranking
9.3 Key Development Strategies

10 Company Profiles

10.1.1 Overview
10.1.2 Financial Performance
10.1.3 Product Outlook
10.1.4 Key Developments

11 Appendix

11.1 Related Research

Get Complete Report @ https://www.verifiedmarketresearch.com/product/Quantum-Computing-Market/?utm_source=COD&utm_medium=001

About us:

Verified Market Research is a leading global research and consulting firm serving more than 5,000 customers. Verified Market Research provides advanced analytical research solutions while offering information-enriched research studies. We offer insights into strategic and growth analyses, the data necessary to achieve corporate goals, and critical revenue decisions.

Our 250 analysts and SMEs offer a high level of expertise in data collection and governance, and use industrial techniques to collect and analyse data on more than 15,000 high-impact and niche markets. Our analysts are trained to combine modern data collection techniques, superior research methodology, expertise and years of collective experience to produce informative and accurate research.

We study 14+ categories from Semiconductor & Electronics, Chemicals, Advanced Materials, Aerospace & Defence, Energy & Power, Healthcare, Pharmaceuticals, Automotive & Transportation, Information & Communication Technology, Software & Services, Information Security, Mining, Minerals & Metals, Building & construction, Agriculture industry and Medical Devices from over 100 countries.

Contact us:

Mr. Edwyne Fernandes

US: +1 (650)-781-4080
UK: +44 (203)-411-9686
APAC: +91 (902)-863-5784
US Toll Free: +1 (800)-7821768

Email: [emailprotected]



Read more:
Quantum Computing Market 2020 Global Overview, Growth, Size, Opportunities, Trends, Leading Company Analysis and Forecast to 2026 - Cole of Duty