IBM and the U. of Tokyo launch quantum computing initiative for Japan | – University Business

IBM (NYSE: IBM) and the University of Tokyo announced today an agreement to partner to advance quantum computing and make it practical for the benefit of industry, science and society.

IBM and the University of Tokyo will form the Japan IBM Quantum Partnership, a broad national partnership framework in which other universities, industry, and government can engage. The partnership will have three tracks of engagement: one focused on the development of quantum applications with industry; another on quantum computing system technology development; and the third focused on advancing the state of quantum science and education.

Under the agreement, an IBM Q System One, owned and operated by IBM, will be installed in an IBM facility in Japan. It will be the first installation of its kind in the region and only the third in the world, following the United States and Germany. The Q System One will be used to advance research in quantum algorithms, applications and software, with the goal of developing the first practical applications of quantum computing.

IBM and the University of Tokyo will also create a first-of-a-kind quantum system technology center for the development of hardware components and technologies that will be used in next-generation quantum computers. The center will include a laboratory facility to develop and test novel hardware components for quantum computing, including advanced cryogenic and microwave test capabilities.

IBM and the University of Tokyo will also directly collaborate on foundational research topics important to the advancement of quantum computing, and establish a collaboration space on the University campus to engage students, faculty, and industry researchers with seminars, workshops, and events.

"Quantum computing is one of the most crucial technologies in the coming decades, which is why we are setting up this broad partnership framework with IBM, who is spearheading its commercial application," said Makoto Gonokami, President of the University of Tokyo. "We expect this effort to further strengthen Japan's quantum research and development activities and build world-class talent."

Developed by researchers and engineers from IBM Research and Systems, the IBM Q System One is optimized for the quality, stability, reliability, and reproducibility of multi-qubit operations. IBM established the IBM Q Network, a community of Fortune 500 companies, startups, academic institutions and research labs working with IBM to advance quantum computing and explore practical applications for business and science.

"This partnership will spark Japan's quantum research capabilities by bringing together experts from industry, government and academia to build and grow a community that underpins strategically significant research and development activities to foster economic opportunities across Japan," said Dario Gil, Director of IBM Research.

Advances in quantum computing could open the door to future scientific discoveries such as new medicines and materials, improvements in the optimization of supply chains, and new ways to model financial data to better manage and reduce risk.

The University of Tokyo will lead the Japan IBM Quantum Partnership and bring academic excellence from universities and prominent research associations together with large-scale industry, small and medium enterprises, startups and industrial associations from diverse market sectors. A high priority will be placed on building quantum programming as well as application and technology development skills and expertise.

For more about IBM Q: https://www.ibm.com/quantum-computing/

About the University of Tokyo

The University of Tokyo was established in 1877 as the first national university in Japan. As a leading research university, the University of Tokyo offers courses in essentially all academic disciplines at both undergraduate and graduate levels and conducts research across the full spectrum of academic activity. The University aims to provide its students with a rich and varied academic environment that ensures opportunities for both intellectual development and the acquisition of professional knowledge and skills.


The Quantum Computing Decade Is Coming: Here's Why You Should Care – Observer

Google's Sycamore quantum processor. Erik Lucero, Research Scientist and Lead Production Quantum Hardware

Multiply 1,048,589 by 1,048,601, and you'll get 1,099,551,473,989. Does this blow your mind? It should, maybe! That 13-digit number, the product of two primes, is the largest number yet to be factored by a quantum computer, one of a series of quantum computing-related breakthroughs (or at least claimed breakthroughs) achieved over the last few months of the decade.

An IBM computer factored this very large number about two months after Google announced that it had achieved "quantum supremacy," a clunky term for the claim, disputed by rivals including IBM as well as others, that Google has a quantum machine that performed some math normal computers simply cannot.
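The arithmetic above is easy to check on a classical machine; the hard direction, which quantum factoring experiments target, is recovering the two factors from the 13-digit product. A quick sketch in plain Python (nothing quantum about it):

```python
# Verify the article's arithmetic classically. Multiplying is trivial;
# factoring the product back apart is the expensive direction.
p, q = 1_048_589, 1_048_601

def smallest_factor(n: int) -> int:
    """Trial division: fine at this size, hopeless for cryptographic sizes."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i
        i += 1
    return n  # no divisor found: n is prime

product = p * q
print(product)                   # 1099551473989
print(smallest_factor(product))  # 1048589: the product is a semiprime, not a prime
```

Even this brute-force search only has to count to about a million here; for the hundreds-of-digits numbers used in real encryption keys, the same loop would outlast the universe, which is the whole point of the quantum factoring race.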


Quantum computing is an arcane field that still exists mostly in theory, but quantum computers have done enough recently, and are commanding enough very real public and private resources, to deserve your attention. Not least because, if and when the Chinese government becomes master of all your personal data sometime in the next decade, it will be because a quantum computer cracked the encryption.

Building the quantum computer, it is said, breathlessly, is a race to be won, as important as being the first in space (though, ask the Soviet Union how that worked out) or fielding the first workable atomic weapon (seems to be going OK for the U.S.).

And so here is a post, written in terms as clear and simple as this human could muster, summing up these recent advances and repeating other experts' predictions that the 2020s appear to be the decade when quantum computers begin to contribute to your life, by both making slight improvements to your map app and powering artificial intelligence robust and savvy enough to be a real-life Skynet.

First, the requisite introduction to the concept. Normal computers, such as the device you are using to access and display this content, process information in binary. Everything is either a one or a zero, or a series of ones and zeroes. On, or off. But what if the zero was simultaneously also a one? (Please exit here for your requisite digression into quantum physics and mechanics.)

The idea that a value can be a zero, or a one, or both at the same time is the quantum principle of superposition. Each superposition is a quantum bit, or qubit. The ability to process qubits is what allows a quantum computer to perform functions a binary computer simply cannot, like computations involving 500-digit numbers. To do so quickly and on demand might allow for highly efficient traffic flow. It could also render current encryption keys mere speedbumps for a computer able to replicate them in an instant.
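To make the counting argument concrete, here is a toy illustration in Python. It is not a quantum simulator; it only counts the classical basis states an n-qubit register spans, and estimates the bits needed for the 500-digit numbers mentioned above:

```python
import itertools
import math

# An ideal n-qubit register is described by amplitudes over all 2**n
# classical bit strings at once; a classical n-bit register holds one.
n = 4
basis_states = [''.join(bits) for bits in itertools.product('01', repeat=n)]
print(len(basis_states))  # 16, i.e. 2**4

# A 500-digit decimal number needs roughly this many bits to represent:
bits_for_500_digits = math.ceil(500 * math.log2(10))
print(bits_for_500_digits)  # 1661
```

The doubling is the key: every qubit added doubles the number of basis states in play, which is why even modest qubit counts quickly outrun classical brute force.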

An artist's rendition of Google's Sycamore quantum processor mounted in a cryostat. Forest Stearns, Google AI Quantum Artist in Residence

Why hasn't this been mastered already? What's holding quantum computers back? Particles like photons only exist in quantum states if they are either compressed very, very small or made very, very cold, with analog engineering techniques. What quantum computers do exist are thus resource-intensive. Google's, for example, involves metals cooled (the verb is inadequate) to 460 degrees below zero Fahrenheit, near absolute zero, to a state in which particles behave in an erratic and random fashion akin to a quantum state.

And as Subhash Kak, the regents professor of electrical and computer engineering at Oklahoma State University and an expert in the field, recently wrote, the power of a quantum computer can be gauged by how many quantum bits, or qubits, it can process. "The machines built by Google, Microsoft, Intel, IBM and possibly the Chinese all have fewer than 100 qubits," he wrote. (In Google's case, the company claims to have created a quantum state of 53 qubits.)

To achieve useful computational performance, according to Kak, you probably need machines with hundreds of thousands of qubits. And what qubits a quantum computer can offer are notoriously unstable and prone to error. They need many of the hard-won fixes and advancements that saw binary computers morph from room-sized monstrosities spitting out punch cards into iPhones.

How fast will that happen? Can it happen at all?

Skeptics, doubters, and haters might note that Google first pledged to achieve quantum supremacy (defined as the point in time at which quantum computers are outperforming binary computers) by the end of 2017, meaning its achievement was almost two full years behind schedule. Other quantum claims, like Dario Gil of IBM's pledge that quantum computers will be useful for commercial and scientific advantage sometime next year, may likewise be dismissed, or at least subjected to deserved skepticism.

Dario Gil, director of IBM Research, stands in front of IBM's Q System One quantum computer on October 18, 2019. Misha Friedman/Getty Images

And those of us who can think only in binary may also find confusion in the dispute between quantum rivals. The calculation performed by Google's Sycamore quantum computer in 200 seconds, the company claimed, would take a normal binary supercomputer 10,000 years to solve. Not so, according to IBM, which asserted that the calculation could be done by a binary computer in two and a half days. Either way, as The New York Times wrote, quantum supremacy is still a very arcane experiment that can't necessarily be applied to other things. Google's breakthrough might be the last achievement for a while.

But everybody is trying, including the U.S. government, which is using your money to do it. Commercial spending on quantum computing research is estimated to reach hundreds of millions of dollars sometime in the next decade. A year ago, spooked and shamed by what appeared to be an unanswered flurry of quantum progress in China, Congress dedicated $1.2 billion via the National Quantum Initiative Act, money specifically intended to boost American-based quantum computing projects. According to Bloomberg, China may have already spent 10 times that.

If you walk away with nothing else, know that quantum computer spending is very real, even if the potential is theoretical.


IBM partners with the University of Tokyo on quantum computing initiative – SiliconANGLE News

IBM Corp. said today it's teaming up with the University of Tokyo to create a new Japan-IBM Quantum Partnership that will focus on advancing the adoption of quantum computers in order to benefit science, industry and society.

IBM said the partnership would have three areas of focus: the development of quantum applications for industry, the development of quantum computing hardware, and advancing the state of quantum science and education.

The initiative will also see an IBM Q System One (pictured) installed at one of the company's facilities in Japan. The system was launched in January and is said to be the world's first circuit-based commercial quantum computer.

There are currently two such machines in operation: one in the U.S. and one in Germany. Once the system is installed in Japan, IBM and University of Tokyo researchers intend to use it to aid their research into quantum algorithms and practical quantum applications.

IBM and the University of Tokyo also plan to create a quantum system technology center focused on developing and testing new quantum hardware. University of Tokyo President Makoto Gonokami said in a statement that his institution would place a much higher priority on quantum programming going forward.

"Quantum computing is one of the most crucial technologies in the coming decades, which is why we are setting up this broad partnership framework with IBM," Gonokami said. "We expect this effort to further strengthen Japan's quantum research and development activities and build world-class talent."


2020 and beyond: Tech trends and human outcomes – Accountancy Age

The next decade promises to offer both incredible opportunity and challenge for all of us. Technologies like artificial intelligence (and its close friend, machine learning) will no longer be considered new but will instead be at the heart of some huge disruptive changes that will run right through our society. In particular, AI will start to enable the automation of many things that were previously deemed too complex or even too human.

We'll see these changes at work: traditional professions like accountancy and law will, over time, see significant portions of what they do taken over by virtual robots. Vocations such as lorry driver, taxi driver and even chef may disappear as machines are introduced to perform the same function but with more consistent results and less risk.

We'll also see these changes at home, as AI brings a host of new changes to how we live. AI will help us speak any language to anyone in the world, it will help us discover and create new content, and it may even help us decide what food to eat and when we should rest (and for how long!) in order to help us live lives that are not just more healthy, but more productive and of course more fun.

We'll (hopefully) see these changes at school and in education too, when we finally realise that in the 21st century, simply knowing stuff is no longer enough. Instead we might seek to use AI to build personalised learning schemes that tailor learning for every unique student so that they can reach their true potential regardless of their background, ability to learn or particular strengths and weaknesses. This could also mean the end of exams and tests as we know them, as we move away from the unnecessary stress and futility of a single measure of knowledge taken at a single moment in time to a world of continuous assessment, where the system is able to measure progress as a by-product of the work that the student does every single day.

As for the technology itself, it's going to continue to get quicker, cheaper, more powerful and smaller. Your huge smartphone may not be so huge by the time we get to 2030; in fact it may not be a phone at all but instead a small implant inserted under your skin, just like the ones we use today for our pets.

We'll also see the introduction of game-changing new technologies like quantum computing. Don't be fooled: this is not just another computer, only faster. The power and potential quantum computing offers us is almost unimaginable. Today's quantum computers are limited, complex machines that require an extreme environment in which to run (most early quantum computers need to run at -273 degrees centigrade, so don't think you're going to see one in your office or your home any time soon). But they are important because of the scale at which they operate. In simple terms, the power of today's quantum computers is measured at around 50 qubits (a qubit, or quantum bit, is a quantum computer's basic unit of information, a bit like the digital equivalent of horsepower). Scientists believe that when we can get quantum computers to 500 qubits, those computers will be able to work with more states at once than there are atoms in the world. This is a kind of computational power that we can't even begin to imagine.
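To get a rough sense of that scale, compare the state count of a 500-qubit register against commonly cited order-of-magnitude estimates for atom counts. The atom figures below are assumptions for illustration, not numbers from this article:

```python
# Basis-state count of a 500-qubit register versus rough atom counts.
# Both atom estimates are order-of-magnitude assumptions.
states_500_qubits = 2 ** 500
atoms_on_earth = 10 ** 50             # commonly cited rough estimate
atoms_observable_universe = 10 ** 80  # commonly cited rough estimate

print(len(str(states_500_qubits)))                    # a 151-digit number
print(states_500_qubits > atoms_observable_universe)  # True
```

Even granting generous uncertainty in the atom estimates, 2 to the power 500 dwarfs them by dozens of orders of magnitude, which is the point the author is gesturing at.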

Oh, and robots too. These won't be the industrial robots you're used to seeing, and they might not even be the science-fiction-looking robots (you know, the ones that start as friends and then take over the world). These robots are going to be not just our friends; they'll be part of our families. It's already started. If you have a smart speaker at home, you've got an early ancestor of something that will end up becoming your own personal C-3PO, not just there to help you but there to provide companionship and friendship while you go about your busy lives.

But all this won't be without some risks.

Massive parts of our current labour market will be challenged by the rise of the machines. Our kids will continue to lack the skills they're going to need to thrive, and we adults are going to struggle to make sense of it all, at home and at work.

The machines won't be perfect either. Seeing as they're created by humans, they end up with some human problems as a result: algorithmic bias will be one of the defining challenges of 2020 and beyond, and it's going to take a lot of human effort to get all of us to a point where we can trust our lives to the algorithms alone.

The good news in all of this is that the end result is still ultimately down to us humans. The real answer to what 2020 will hold for technology, and how it affects us in our everyday lives, will continue to be all about how we humans choose to use it. I'm hopeful for a new era in 2020, one where we turn the corner in our relationship with technology and look not for dystopia, but instead seek to ensure everyone has the right skills and ambition to build the utopia we deserve. To get there we need to teach our kids (and ourselves!) to break free of the technology that traps and disconnects us, and instead use the same technology to elevate what we can achieve: not by replacing us, but by freeing us to do all of the amazing things that the technology alone cannot do. The best future awaits those who can combine the best of technological capability with the best of human ability.

Dave Coplin is a former Chief Envisioning Officer for Microsoft UK. He has written two books and worked all over the world with organisations, individuals and governments, all with the goal of demystifying technology and championing it as a positive transformation in our society.


Quantum Computing Market Increase In Analysis & Development Activities Is More Boosting Demands – Market Research Sheets

The report titled Global Quantum Computing Market Size, Status and Forecast 2019-2025 is a professional and in-depth study of the current state of the quantum computing industry, with a focus on market overview, classification, industry value, price, cost and gross profit. The report covers worldwide competition among top manufacturers (D-Wave Systems, Google, IBM, Intel, Microsoft, 1QB Information Technologies, Anyon Systems, Cambridge Quantum Computing, ID Quantique, IonQ, QbitLogic, QC Ware, Quantum Circuits, Qubitekk, QxBranch, Rigetti Computing), providing information such as company profiles, product picture and specification, capacity, production, cost, revenue and contact information. The report provides key statistics on the market status of quantum computing manufacturers and is a valuable source of guidance and direction for companies and individuals interested in the industry. Overall, the report provides an in-depth insight into the 2014-2025 global quantum computing market, covering all important parameters.

In-depth qualitative analyses include identification and investigation of the following aspects: quantum computing market structure, growth drivers, restraints and challenges, emerging product trends and market opportunities, and Porter's Five Forces.

Overview of the quantum computing market: Quantum computing is a technology that applies the laws of quantum mechanics to computation. It includes three states, namely 1, 0, and the superposition of 1 and 0. Superposition indicates that two states exist at the same time. These bits are known as quantum bits, or qubits. The global quantum computing market consists of the hardware that is required to develop quantum computers and its peripherals.

North America accounted for the largest share of the overall quantum computing market in 2017. On the other hand, Asia Pacific (APAC) would be the fastest growing region for quantum computing during the forecast period. This growth can be attributed to the increasing demand for quantum technology to solve the most tedious and complex problems in the defense and banking & finance industry.

On the basis of product, this report displays the sales volume, revenue (million USD), product price, market share and growth rate of each type, primarily split into:

Hardware, Software, Services

On the basis of end users/applications, this report focuses on the status and outlook for major applications/end users, with sales volume, market share and growth rate of the quantum computing market for each application, including:

Defense, Healthcare & pharmaceuticals, Chemicals, Banking & finance, Energy & power

Key queries answered within the quantum computing market report:

Analysts of the report focused on answering some key questions about the quantum computing market. This is to help readers gain clear information regarding growth within the quantum computing market, and about the ongoing changes that will diversify the market in the coming years.

What are the most recent advanced technologies adopted in quantum computing?

How are recent trends affecting growth in the global quantum computing market?

What key methods employed by players and service suppliers are expected to impact the growth of the quantum computing market?

What resources are available in various regions that attract leading players in the quantum computing market?

What was the historical price, and what will be the forecast price, of the quantum computing market?

Quantum Computing Market: Regional analysis includes:


How quantum computing could beat climate change – World Economic Forum

Imagine being able to cheaply and easily suck carbon directly out of our atmosphere. Such a capability would be hugely powerful in the fight against climate change and advance us towards the ambitious global climate goals that have been set.

Surely that's science fiction? Well, maybe not. Quantum computing may be just the tool we need to design such a clean, safe and easy-to-deploy innovation.

In 1995 I first learned that quantum computing might bring about a revolution akin to the agricultural, industrial and digital ones we've already had. Back then it seemed far-fetched that quantum mechanics could be harnessed to such momentous effect; given recent events, it seems much, much more likely.

Much excitement followed Google's recent announcement of quantum supremacy: "[T]he point where quantum computers can do things that classical computers can't, regardless of whether those tasks are useful."

The question now is whether we can develop the large-scale, error-corrected quantum computers that are required to realize profoundly useful applications.

The good news is we already know concretely how to use such fully-fledged quantum computers for many important tasks across science and technology. One such task is the simulation of molecules to determine their properties, interactions, and reactions with other molecules, a.k.a. chemistry, the very essence of the material world we live in.

While simulating molecules may seem like an esoteric pastime for scientists, it does, in fact, underpin almost every aspect of the world and our activity in it. Understanding their properties unlocks powerful new pharmaceuticals, batteries, clean-energy devices and even innovations for carbon capture.

To date, we haven't found a way to simulate large complex molecules with conventional computers, and we never will, because the problem is one that grows exponentially with the size or complexity of the molecules being simulated. Crudely speaking, if simulating a molecule with 10 atoms takes a minute, a molecule with 11 takes two minutes, one with 12 atoms takes four minutes, and so on. This exponential scaling quickly renders a traditional computer useless: simulating a molecule with just 70 atoms would take longer than the lifetime of the universe (13 billion years).
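The doubling-per-atom model above can be written down directly, and the 70-atom claim then follows. This is a sketch of the article's own toy arithmetic, not a real quantum chemistry cost model:

```python
# Toy model from the text: 10 atoms take 1 minute, and each additional
# atom doubles the simulation time.
def sim_minutes(atoms: int) -> int:
    return 2 ** (atoms - 10)

print(sim_minutes(11), sim_minutes(12))  # 2 4

# Compare 70 atoms against the article's 13-billion-year universe lifetime:
universe_minutes = 13_000_000_000 * 365 * 24 * 60
print(sim_minutes(70) > universe_minutes)  # True
```

At 70 atoms the model gives about 10 to the 18th minutes, several hundred times the 13-billion-year figure, so the conclusion survives even generous rounding.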

This is infuriating, not just because we can't simulate existing important molecules that we find (and use) in nature, including within our own bodies, and thereby understand their behaviour; but also because there is an infinite number of new molecules that we could design for new applications.

That's where quantum computers could come to our rescue, thanks to the late, great physicist Richard Feynman. Back in 1981, he recognized that quantum computers could do what would be impossible for classical computers when it comes to simulating molecules. Thanks to recent work by Microsoft and others, we now have concrete recipes for performing these simulations.

One area of urgent practical importance where quantum simulation could be hugely valuable is in meeting the SDGs, not only in health, energy, industry, innovation and infrastructure but also in climate action. Examples include room-temperature superconductors (that could reduce the 10% of energy production lost in transmission), more efficient processes to produce the nitrogen-based fertilizers that feed the world's population, and new, far more efficient batteries.

One very powerful application of molecular simulation is in the design of new catalysts that speed up chemical reactions. It is estimated that 90% of all commercially produced chemical products involve catalysts (in living systems, they're called enzymes).

Annual CO2 emissions globally in 2017

A catalyst for scrubbing carbon dioxide directly from the atmosphere could be a powerful tool in tackling climate change. Although CO2 is captured naturally, by oceans and trees, CO2 production has exceeded these natural capture rates for many decades.

The best way to tackle CO2 is not releasing more CO2; the next best thing is capturing it. While we can't literally turn back time, "[it] is a bit like rewinding the emissions clock," according to Torben Daeneke at RMIT University.

There are known catalysts for carbon capture, but most contain expensive precious metals or are difficult or expensive to produce and/or deploy. "We currently don't know many cheap and readily available catalysts for CO2 reduction," says Ulf-Peter Apfel of Ruhr-University Bochum.

Given the infinite number of candidate molecules that are available, we are right to be optimistic that there is a catalyst (or indeed many) to be found that will do the job cheaply and easily. Finding such a catalyst, however, is a daunting task without the ability to simulate the properties of candidate molecules.

And thats where quantum computing could help.

We might even find a cheap catalyst that enables efficient carbon dioxide recycling and produces useful by-products like hydrogen (a fuel) or carbon monoxide (a common source material in the chemical industry).

We can currently simulate small molecules on prototype quantum computers with up to a few dozen qubits (the quantum equivalent of classical computer bits). But scaling this to useful tasks, like discovering new CO2 catalysts, will require error correction and simulation on the order of 1 million qubits.

It's a challenge I have long believed will only be met on any human timescale, certainly by the 2030 target for the SDGs, if we use the existing manufacturing capability of the silicon chip industry.

At a meeting of the World Economic Forum's Global Future Councils last month, a team of experts from across industry, academia and beyond assembled to discuss how quantum computing can help address global challenges, as highlighted by the SDGs, and climate in particular.

As co-chair of the Global Future Council on Quantum Computing, I was excited that we were unanimous in agreeing that the world should devote more resources, including in education, to developing the powerful quantum computing capability that could help tackle climate change, meet the SDGs more widely and much more. We enthusiastically called for more international cooperation to develop this important technology on the 2030 timescale to have an impact on delivering the SDGs, in particular climate.

So the real question for me is: can we do it in time? Will we make sufficiently powerful quantum computers on that timeframe? I believe so. There are, of course, many other things we can and should do to tackle climate change, but developing large-scale, error-corrected quantum computers is a hedge we cannot afford to go without.

License and Republishing

World Economic Forum articles may be republished in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.


How Quantum Computers Work | HowStuffWorks

The massive amount of processing power generated by computer manufacturers has not yet been able to quench our thirst for speed and computing capacity. In 1947, American computer engineer Howard Aiken said that just six electronic digital computers would satisfy the computing needs of the United States. Others have made similar errant predictions about the amount of computing power that would support our growing technological needs. Of course, Aiken didn't count on the large amounts of data generated by scientific research, the proliferation of personal computers or the emergence of the Internet, which have only fueled our need for more, more and more computing power.

Will we ever have the amount of computing power we need or want? If, as Moore's Law states, the number of transistors on a microprocessor continues to double every 18 months, the year 2020 or 2030 will find the circuits on a microprocessor measured on an atomic scale. And the logical next step will be to create quantum computers, which will harness the power of atoms and molecules to perform memory and processing tasks. Quantum computers have the potential to perform certain calculations significantly faster than any silicon-based computer.
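The 18-month doubling stated above is easy to extrapolate. The baseline below, roughly a year-2000 desktop CPU at about 42 million transistors, is an assumption for illustration, not a figure from the article:

```python
# Naive Moore's Law extrapolation: transistor count doubles every
# 18 months from an assumed year-2000 baseline (~Pentium 4-class chip).
def transistors(year: int, base_year: int = 2000,
                base_count: int = 42_000_000) -> int:
    doublings = (year - base_year) / 1.5  # 18 months = 1.5 years
    return int(base_count * 2 ** doublings)

print(f"{transistors(2003):,}")  # 168,000,000 (two doublings)
print(f"{transistors(2020):,}")  # hundreds of billions, on this naive model
```

By 2020 the naive model lands in the hundreds of billions of transistors per chip, which is why the text expects circuits to hit atomic dimensions somewhere in the 2020s and why the next step would have to be something other than shrinking.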

Scientists have already built basic quantum computers that can perform certain calculations; but a practical quantum computer is still years away. In this article, you'll learn what a quantum computer is and just what it'll be used for in the next era of computing.

You don't have to go back too far to find the origins of quantum computing. While computers have been around for the majority of the 20th century, quantum computing was first theorized less than 30 years ago, by a physicist at the Argonne National Laboratory. Paul Benioff is credited with first applying quantum theory to computers in 1981. Benioff theorized about creating a quantum Turing machine. Most digital computers, like the one you are using to read this article, are based on the Turing machine model. Learn what this is in the next section.


AI, 5G, ‘ambient computing’: What to expect in tech in 2020 and beyond – USA TODAY

'Tis the end of the year, when pundits typically dust off the crystal ball and take a stab at what tech, and its impact on consumers, will look like over the next 12 months.

But we're also on the doorstep of a brand-new decade, which this time around promises further advances in 5G networks, artificial intelligence, quantum computing, self-driving vehicles and more, all of which will dramatically alter the way we live, work and play.

So what tech advances can we look forward to in the new year? Here's what we can expect to see in 2020 and in some cases beyond.


The next generation of wireless has shown up on lists like this for years now. But in 2020, 5G really will finally begin to make its mark in the U.S., with all four major national carriers (three, if the T-Mobile-Sprint merger finally goes through) continuing to build out their 5G networks across the country.

We've been hearing about the promise of 5G on the global stage for what seems like forever, and the carriers have recently launched service in select markets. Still, the rollout in most places will continue to take time, as will the payoff: blistering-fast wireless speeds and network responsiveness on our phones, improved self-driving cars and augmented reality, remote surgery, and entire smart cities.

As 2019 winds down, only a few phones can exploit the latest networks, not to mention all the remaining holes in 5G coverage. But you'll see a whole lot more 5G phone introductions in the new year, including what many of us expect will be a 5G iPhone come September.


When those holes are filled, roughly two-thirds of consumers said they'd be more willing to buy a 5G-capable smartphone, according to a mobile trends survey by Deloitte.

But Deloitte executive Kevin Westcott also said that telcos will need to manage consumer expectations about what 5G can deliver and determine what the killer apps for 5G will be.

The Deloitte survey also found that a combination of economic barriers (pricing, affordability) and a sense that current phones are good enough will continue to slow the smartphone refresh cycle.

Are you ready for all the tech around you to disappear? No, not right away. The trend toward so-called ambient computing is not going to happen overnight, nor is anyone suggesting that screens and keyboards are going to go away entirely, or that you'll stop reaching for a smartphone. But as more tiny sensors are built into walls, TVs, household appliances, fixtures, what you're wearing, and eventually even your own body, you'll be able to gesture or speak to a concealed assistant to get things done.

Steve Koenig, vice president of research at the Consumer Technology Association, likens ambient computing to Star Trek, and suggests that at some point we won't need to place Amazon Echo Dots or other smart speakers in every room of the house, since we'll just speak out loud to whatever, wherever.

Self-driving cars have been getting most of the attention. But it's not just cars that are going autonomous; try planes and boats.

Cirrus Aircraft, for example, is in the final stages of getting Federal Aviation Administration approval for a self-landing system for one of its private jets, and the tech, which I recently got to test, has real potential to save lives.

How so? If the pilot becomes incapacitated, a passenger can press a single button on the roof of the main cabin. At that moment, the plane starts acting as if the pilot were still doing things. It factors in real-time weather, wind, the terrain, how much fuel remains, and all the nearby airports where an emergency landing is possible, including the lengths of all runways, and automatically broadcasts its whereabouts to air traffic control. From there the system safely lands the plane.

Or consider the 2020 version of the Mayflower, not a Pilgrim ship, but rather a marine research vessel from IBM and a marine exploration non-profit known as Promare. The plan is to have the unmanned ship cross the Atlantic in September from Plymouth, England, to Plymouth, Massachusetts. The ship will be powered by a hybrid propulsion system, utilizing wind, solar, state-of-the-art batteries, and a diesel generator. It plans to follow the 3,220-mile route the original Mayflower took 400 years ago.

Two of America's biggest passions come together. Esports is one of the fastest-growing spectator sports around the world, and the Supreme Court cleared a path last year for legalized gambling across the states. The betting community is licking its chops at the prospect of exploiting this mostly untapped market. You'll be able to bet on esports in more places, whether at a sportsbook inside a casino or through an app on your phone.

One of the scary prospects about artificial intelligence is that it is going to eliminate all these jobs. Research out of MIT and IBM Watson suggests that while AI will for sure impact the workplace, it won't lead to a huge loss of jobs.

That's a somewhat optimistic take, given an alternate view that AI-driven automation is going to displace workers. The research suggests that AI will increasingly help us with tasks that can be automated, but will have a less direct impact on jobs that require skills such as design expertise and industrial strategy. The onus will be on bosses and employees to start adapting to new roles and to try to expand their skills, efforts the researchers say will begin in the new year.

The scary signs are still out there, however. For instance, McDonald's is already testing AI-powered drive-thrus that can recognize voice, which could reduce the need for human order-takers.

Perhaps it's more wishful thinking than a flat-out prediction, but as Westcott puts it, "I'm hoping what goes away are the 17 power cords in my briefcase." Presumably a slight exaggeration.

But the thing we all want to see is batteries that don't prematurely peter out, along with more seamless charging solutions.

We're still far off from the day when you'll be able to get ample power to last all day on your phone or other devices just by walking into a room. But over-the-air wireless charging is slowly but surely progressing. This past June, for example, Seattle company Ossia received FCC certification for a first-of-its-kind system to deliver over-the-air power at a distance. Devices with Ossia's tech built in should start appearing in the new year.

The Samsung Galaxy Fold smartphone, featuring a foldable OLED display. (Photo: Samsung)

We know how the nascent market for foldable phones unfolded in 2019: things were kind of messy. Samsung's Galaxy Fold was delayed for months following screen problems, and even when the phone finally did arrive, it cost nearly $2,000. But that doesn't mean the idea behind flexible screen technologies goes away.

Samsung is still at it, and so is Lenovo-owned Motorola with its new retro Razr. The promise remains the same: let a device fold or bend in such a way that you can take a smartphone-like form factor and morph it into a small tablet or computer. The ultimate success of such efforts will boil down to at least three of the factors that are always critical in tech: cost, simplicity, and utility.

Data scandals and privacy breaches have placed Facebook, Google and others in the government's crosshairs, and ordinary citizens are concerned. Expect some sort of reckoning, though it isn't obvious at this stage what that reckoning will look like.

Pew recently put out a report that says roughly 6 in 10 Americans believe it is not possible to go about their daily lives without having their data collected.

"The coming decade will be a period of lots of ferment around privacy policy and also around technology related to privacy," says Lee Rainie, director of internet and technology research at Pew Research Center. He says consumers will potentially have more tools to give them a bit more control over how and what data gets shared and under whatcircumstances. "And there will be a lot of debate over what the policy should be."

Open question: Will there be national privacy regulations, perhaps ones modeled after the California law that is set to go into effect in the new year?

It isn't easy to explain quantum computing or the field it harnesses, quantum mechanics. In the simplest terms, think something exponentially more powerful than what we consider conventional computing, which is expressed in the 1s and 0s of bits. Quantum computing takes a quantum leap with what are known as "qubits."
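To make the bits-versus-qubits contrast concrete, here is a minimal sketch using standard quantum-computing notation (my own illustration, not anything from the article): a qubit's state is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1, and describing n qubits takes 2^n amplitudes.

```python
import numpy as np

# A classical bit is 0 or 1. A qubit state |psi> = alpha|0> + beta|1>
# is a pair of complex amplitudes with |alpha|^2 + |beta|^2 = 1.
# An equal superposition:
alpha = beta = 1 / np.sqrt(2)
qubit = np.array([alpha, beta], dtype=complex)

# Squared magnitudes give measurement probabilities.
p0, p1 = np.abs(qubit) ** 2
print(p0, p1)  # each ~0.5: the qubit is "both" until measured

# n qubits need 2**n amplitudes -- the exponential state space behind
# the "quantum leap" the article alludes to.
n = 50
print(f"A {n}-qubit state is described by {2**n:,} amplitudes")
```

That exponential blow-up in the classical description is precisely why simulating even ~50 qubits strains conventional supercomputers.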

And while IBM, Intel, Google, Microsoft and others are all fighting for quantum supremacy, the takeaway over the next decade is that the tech may help solve problems far faster than before, from diagnosing disease to cracking forms of encryption, raising the stakes in data security.

Quantum computing: Google claims quantum computing breakthrough

What tech do you want or expect to see? Email: ebaig@usatoday.com; Follow @edbaig on Twitter.



What WON’T Happen in 2020: 5G Wearables, Quantum Computing, and Self-Driving Trucks to Name a Few – Business Wire

OYSTER BAY, N.Y.--(BUSINESS WIRE)--As 2019 winds down, predictions abound on the technology advancements and innovations expected in the year ahead. However, there are several anticipated advancements, including 5G wearables, quantum computing, and self-driving trucks, that will NOT happen in the first year of the new decade, states global tech market advisory firm, ABI Research.

In its new whitepaper, 54 Technology Trends to Watch in 2020, ABI Research's analysts have identified 35 trends that will shape the technology market and 19 others that, although attracting huge amounts of speculation and commentary, look less likely to move the needle over the next twelve months. "After a tumultuous 2019 that was beset by many challenges, both integral to technology markets and derived from global market dynamics, 2020 looks set to be equally challenging," says Stuart Carlaw, Chief Research Officer at ABI Research. "Knowing what won't happen in technology in the next year is important for end users, implementors, and vendors to properly place their investments or focus their strategies."

What won't happen in 2020?

5G Wearables: "While smartphones will dominate the 5G market in 2020, 5G wearables won't arrive in 2020, or anytime soon," says Stephanie Tomsett, 5G Devices, Smartphones & Wearables analyst at ABI Research. "To bring 5G to wearables, specific 5G chipsets will need to be designed and components will need to be reconfigured to fit in the small form factor. That won't begin to happen until 2024, at the earliest."

Quantum Computing: "Despite claims from Google in achieving quantum supremacy, the tech industry is still far away from the democratization of quantum computing technology," says Lian Jye Su, AI & Machine Learning Principal Analyst at ABI Research. "Quantum computing is definitely not even remotely close to the large-scale commercial deployment stage."

Self-Driving Trucks: "Despite numerous headlines declaring the arrival of driverless, self-driving, or robot vehicles, very little, if any, driver-free commercial usage is underway beyond closed-course operations in the United States," says Susan Beardslee, Freight Transportation & Logistics Principal Analyst at ABI Research.

A Consolidated IoT Platform Market: "For many years, there have been predictions that the IoT platform supplier market will begin to consolidate, and it just won't happen," says Dan Shey, Vice President of Enabling Platforms at ABI Research. "The simple reason is that there are more than 100 companies that offer device-to-cloud IoT platform services, and for every one that is acquired, there are always new ones that come to market."

Edge Will Not Overtake Cloud: "The accelerated growth of edge technology and the intelligent device paradigm created one of the largest industry misconceptions: edge technology will cannibalize cloud technology," says Kateryna Dubrova, M2M, IoT & IoE Analyst at ABI Research. "In fact, in the future we will see a rapid development of the edge-cloud-fog continuum, where technologies will complement each other rather than cross-cannibalize."

8K TVs: "Announcements of 8K television (TV) sets by major vendors earlier in 2019 attracted much attention and raised many questions within the industry," says Khin Sandi Lynn, Video & Cloud Services Analyst at ABI Research. "The fact is, 8K content is not available, and the price of 8K TV sets is exorbitant. The transition from high definition (HD) to 4K will continue in 2020, with very limited 8K shipments of less than 1 million worldwide."

For more trends that won't happen in 2020, and the 35 trends that will, download the 54 Technology Trends to Watch in 2020 whitepaper.

About ABI Research

ABI Research provides strategic guidance to visionaries, delivering actionable intelligence on the transformative technologies that are dramatically reshaping industries, economies, and workforces across the world. ABI Research's global team of analysts publishes groundbreaking studies, often years ahead of other technology advisory firms, empowering our clients to stay ahead of their markets and their competitors.

For more information about ABI Research's services, contact us at +1.516.624.2500 in the Americas, +44.203.326.0140 in Europe, +65.6592.0290 in Asia-Pacific, or visit http://www.abiresearch.com.


What Was The Most Important Physics Of 2019? – Forbes

So, I've been doing a bunch of talking in terms of decades in the last couple of posts, about the physics defining eras in the 20th century and the physics defining the last couple of decades. I'll most likely do another decadal post in the near future, this one looking ahead to the 2020s, but the end of a decade by definition falls at the end of a year, so it's worth taking a look at physics stories on a shorter time scale, as well.


You can, as always, find a good list of important physics stories in Physics World's Breakthrough of the Year shortlist, and there are plenty of other top science stories of 2019 lists out there. Speaking for myself, this is kind of an unusual year, and it's tough to make a call as to the top story. Most of the time, these end-of-year things are either stupidly obvious, because one story towers above all the others, or totally subjective, because there are a whole bunch of stories of roughly equal importance and the choice of a single one comes down to personal taste.

In 2019, though, I think there were two stories that are head-and-shoulders above everything else, but roughly equal to each other. Both are the culmination of many years of work, and both can also claim to be kicking off a new era for their respective subfields. And I'm really not sure how to choose between them.

US computer scientist Katherine Bouman speaks during a House Committee on Science, Space and Technology hearing on the "Event Horizon Telescope: The Black Hole Seen Round the World" in the Rayburn House office building in Washington, DC, on May 16, 2019. (Photo: Andrew Caballero-Reynolds/AFP via Getty Images)

The first of these is the more photogenic of the two, namely the release of the first image of a black hole by the Event Horizon Telescope collaboration back in April. This one made major news all over, and was one of the experiments that led me to call the 2010s the decade of black holes.

As I wrote around the time of the release, this was very much of a piece with the preceding hundred years of tests of general relativity: while many stories referred to the image as a shadow of the black hole, really it's a ring produced by light bending around the event horizon. This is the same basic phenomenon that Eddington measured in 1919, looking at the shift in the apparent position of stars near the Sun, providing confirmation of Einstein's prediction that gravity bends light. It's just that scaling up the mass a few million times produces a far more dramatic bending of spacetime (and thus light) than the gentle curve produced by our Sun.
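For a sense of the numbers, the weak-field deflection angle for light passing a mass M at impact parameter b is 4GM/(c^2 b). The sketch below is my own back-of-the-envelope check, not a calculation from the post; it uses standard values of the constants and reproduces the roughly 1.75 arcseconds Eddington confirmed for light grazing the Sun.

```python
import math

# Standard physical constants (SI units).
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def deflection_arcsec(mass_kg: float, impact_param_m: float) -> float:
    """GR weak-field light deflection 4GM/(c^2 b), in arcseconds."""
    radians = 4 * G * mass_kg / (c ** 2 * impact_param_m)
    return math.degrees(radians) * 3600

# Light grazing the Sun (solar mass and radius): ~1.75 arcseconds,
# the shift Eddington measured in 1919.
print(deflection_arcsec(1.989e30, 6.963e8))
```

Plugging in a mass millions of times larger, at a correspondingly scaled impact parameter, is what turns this gentle nudge into the dramatic ring the EHT imaged.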

This Feb. 27, 2018, photo shows electronics for use in a quantum computer in the quantum computing lab at the IBM Thomas J. Watson Research Center in Yorktown Heights, N.Y. Describing the inner workings of a quantum computer isn't easy, even for top scholars. That's because the machines process information at the scale of elementary particles such as electrons and photons, where different laws of physics apply. (AP Photo/Seth Wenig)

The other story, in very 2019 fashion, first emerged via a leak: someone at NASA accidentally posted a draft of the paper in which Google's team claimed to have achieved quantum supremacy. They demonstrated reasonably convincingly that their machine took about three and a half minutes to generate a solution to a particular problem that would take vastly longer to solve with a classical computer.

The problem they were working with was very much in the quantum simulation mode that I talked about a year earlier, when I did a high-level overview of quantum computing in general, though a singularly useless version of that. Basically, they took a set of 50-odd qubits and performed a random series of operations on them to put them in a complicated state in which each qubit was in a superposition of multiple states and also entangled with other qubits in the system. Then they measured the probability of finding specific output states.

Qubit, or quantum bit, illustration. The qubit is a unit of quantum information. As a two-state ... [+] system with superposition of both states at the same time, it is fundamental to quantum computing. The illustration shows the Bloch sphere. The north pole is equivalent to one, the south pole to zero. The other locations, anywhere on the surface of the sphere, are quantum superpositions of 0 and 1. When the qubit is measured, the quantum wave function collapses, resulting in an ordinary bit - a one or a zero - which effectively depends on the qubit's 'latitude'. The illustration shows the qubit 'emitting' a stream of wave functions (the Greek letter psi), representing the collapse of the wave function when measured.

Finding the exact distribution of possible outcomes for such a large and entangled system is extremely computationally intensive if you're using a classical computer to do the job, but it happens very naturally in the quantum computer. So they could get a good approximation of the distribution within minutes, while the classical version would take "a lot more time," where "a lot more time" ranges from thousands of years (Google's claim) down to a few days (the claim by a rival group at IBM using a different supercomputer algorithm to run the computation). If you'd like a lot more technical detail about what this did and didn't do, see Scott Aaronson.
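The random-circuit sampling scheme described above can be sketched classically for a handful of qubits. This is a toy illustration of the general idea, not Google's actual circuits: the real experiment composed one- and two-qubit gates on ~53 qubits, whereas this sketch just applies dense random unitaries to a 4-qubit register, which is exactly the brute-force approach whose cost explodes as qubits are added.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4            # toy size; the real experiment used ~53 qubits
dim = 2 ** n

# Start in |00...0>.
state = np.zeros(dim, dtype=complex)
state[0] = 1.0

def random_unitary(d: int) -> np.ndarray:
    """Random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    phases = np.diagonal(r) / np.abs(np.diagonal(r))
    return q * phases   # fix column phases so the distribution is uniform-ish

# A "random circuit": a few layers of random unitaries over the register.
for _ in range(3):
    state = random_unitary(dim) @ state

# The output distribution, then five simulated "measurements" of the register.
probs = np.abs(state) ** 2
samples = rng.choice(dim, size=5, p=probs)
print(samples, probs.sum())
```

For n = 4 the 16-entry state vector is trivial to store; at n = 53 it would have about 9 x 10^15 complex amplitudes, which is the gap the supremacy claim exploits.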

As with the EHT paper, this is the culmination of years of work by a large team of people. It's also very much of a piece with past work: quantum computing as a distinct field is a recent development, but really, the fundamental equations used to do the calculations were pretty well set by 1935.


Both of these projects also have a solid claim to be at the forefront of something new. The EHT image is the first to be produced, but won't be the last: they're crunching numbers on the Sgr A* black hole at the center of the Milky Way, and there's room to improve their imaging in the future. Along with the LIGO discovery from a few years ago, this is the start of a new era of looking directly at black holes, rather than just using them as a playground for theory.

Google's demonstration of quantum supremacy, meanwhile, is the first such result in a highly competitive field: IBM and Microsoft are also invested in similar machines, and there are smaller companies and academic labs exploring other technologies. The random-sampling problem they used is convenient for this sort of demonstration but not really useful for anything else; still, lots of people are hard at work on techniques to make a next generation of machines that will be able to do calculations where people care about the answer. There's a good long way to go yet, but a lot of activity in the field is driving things forward.

So, in the head-to-head matchup for Top Physics Story of 2019, these two are remarkably evenly matched, and it could really go either way. The EHT result has a slightly deeper history; the Google quantum computer arguably has a brighter future. My inclination would be to split the award between them; if you put a gun to my head and made me pick one, I'd go with quantum supremacy, but I'd seriously question the life choices that led you to this place, because they're both awesome accomplishments that deserve to be celebrated.
