Delta Partners with IBM to Explore Quantum Computing – Database Trends and Applications

Delta Air Lines is embarking on a multi-year collaborative effort with IBM, including joining the IBM Q Network, to explore the potential of quantum computing to transform experiences for customers and employees.

"Partnering with innovative companies like IBM is one way Delta stays on the leading edge of tech to better serve our customers and our people, while drawing the blueprints for application across our industry," said Rahul Samant, Delta's CIO. "We've done this most recently with biometrics in our international terminals, and we're excited to explore how quantum computing can be applied to address challenges across the day of travel."

The IBM Q Network is a global community of Fortune 500 companies, startups, academic institutions, and research labs working to advance quantum computing and explore practical applications.

Additionally, through the IBM Q Hub at NC State University, Delta will have access to the IBM Q Network's fleet of universal quantum computers for commercial use cases and fundamental research, including the recently announced 53-qubit system, which the company says has the most qubits of any universal quantum computer available for external access in the industry to date.

"We are very excited by the addition of Delta to our list of collaborators working with us on building practical quantum computing applications," said Dario Gil, director of IBM Research. "IBM's focus, since we put the very first quantum computer on the cloud in 2016, has been to move quantum computing beyond isolated lab experiments conducted by a handful of organizations, into the hands of tens of thousands of users. We believe a clear advantage will be awarded to early adopters in the era of quantum computing and with partners like Delta, we're already making significant progress on that mission."

For more information about the IBM Q Network, go to http://www.ibm.com/quantum-computing/network/overview

See more here:
Delta Partners with IBM to Explore Quantum Computing - Database Trends and Applications

Quantum computing, climate change, and interdependent AI: Academics and execs predict how tech will revolutionize the next decade – Business Insider

The past decade saw technological advancements that transformed how we work, live, and learn. The next one will bring even greater change as quantum computing, cloud computing, 5G, and artificial intelligence mature and proliferate. These changes will happen rapidly, and the work to manage their impact will need to keep pace.

This session at the World Economic Forum, in Davos, Switzerland, brought together industry experts to discuss how these technologies will shape the next decade, followed by a panel discussion about the challenges and benefits this era will bring and if the world can control the technology it creates.

Henry Blodget, CEO, cofounder, and editorial director, Insider Inc.

This interview is part of a partnership between Business Insider and Microsoft at the 2020 World Economic Forum. Business Insider editors independently decided on the topics broached and questions asked.

Below, find each of the panelists' most memorable contributions:

Julie Love believes global problems such as climate change can potentially be solved far more quickly and easily through developments in quantum computing.

She said: "We [Microsoft] think about problems that we're facing: problems that are caused by the destruction of the environment, by climate change, and [that require] optimization of our natural resources, [such as] global food production."

"It's quantum computing that a lot of us scientists and technologists are looking to for solving these problems. We have the promise of solving them exponentially faster, which is incredibly profound. And the reason is this: [quantum] technology speaks the language of nature."

"By computing the way that nature computes, there's so much information contained in these atoms and molecules. Nature doesn't think about a chemical reaction; nature doesn't have to do some complex computation. It's inherent in the material itself."

Love claimed that, if harnessed in this way, quantum computing could allow scientists to design a compound that could remove carbon from the air. She added that researchers will need to be "really pragmatic and practical about how we take this from science fiction into the here-and-now."

"I believe the future of AI is actually interdependence, collaboration, and cooperation between people and systems, both at the macro [and micro] levels," said Justine Cassell, who is also a faculty member of the Human-Computer Interaction Institute at Carnegie Mellon University.

"At the macro level, [look], for example, at robots on the factory floor," she said. "Today, there's been a lot of fear about how autonomous they actually are. First of all, they're often dangerous. They're so autonomous, you have to get out of their way. And it would be nice if they were more interdependent, if we could be there at the same time as they are. But also, there is no factory floor where any person is autonomous."

In Cassell's view, AI systems could also end up being built collaboratively with experts from non-tech domains, such as psychologists.

"Today, tools [for building AI systems] are mostly machine learning tools," she noted. "And they are, as you've heard a million times, black boxes. You give [the AI system] lots of examples. You say: 'This is somebody being polite. That is somebody being impolite. Learn about that.' But when they build a system that's polite, you don't know why they did that."

"What I'd like to see is systems that allow us to have these bottom-up, black-box approaches from machine learning, but also have, for example, psychologists in there, saying 'that's not actually really polite,' or 'it's polite in the way that you don't ever want to hear.'"

"One thing I constantly wish is that there was a more standardized measurement for everybody to report how much they're spending per employee on employee training, because that really doesn't exist, when you think about it," said Brad Smith, Microsoft's president and chief legal officer since 2015.

"I think, anecdotally, one can get a pretty strong sense that if you go back to the 1980s and 1990s, employers invested a huge amount in employee training around technology. It was teaching you how to use MS-DOS, or Windows, or how to use Word or Excel; interestingly, these are things that employers don't really feel obliged to teach employees today."

"Learning doesn't stop when you leave school. We're going to have to work a little bit harder. And that's true for everyone."

He added that this creates a further requirement: to make sure the skills people do pick up as they navigate life are easily recognizable by other employers.

"Ultimately, there's a wide variety of post-secondary credentials. The key is to have credentials that employers recognize as being valuable. It's why LinkedIn and others are so focused on new credentialing systems. Now, the good news is that should make things cheaper. It all should be more accessible."

"But I do think that, to go back to where I started, employers are going to have to invest more [in employee training]. And we're going to have to find some ways to do it in a manner that perhaps is a little more standardized."

Rajeev Suri, Nokia's CEO, said 5G will be able to help develop industries that go far beyond entertainment and telecoms, and will impact physical or manual industries such as manufacturing.

"The thing about 5G is that it's built for machine-type communications. When we conceived the whole idea of 5G, it was: how do we get not just human beings to interact with each other, but also large machines," he said.

"So we think that there is a large economic boost possible from 5G and 5G-enabled technologies, because it would underpin many of these other technologies, especially in the physical industries."

Suri cited manufacturing, healthcare, and agriculture as just some of the industries 5G could help become far more productive within a decade.

He added: "Yes, we'll get movies and entertainment faster, but it is about a lot of physical industries that didn't quite digitize yet. Especially in the physical industries, we [Nokia] think that the [productivity] gains could be as much as 35%, starting in the year 2028, with the US first, and then going out into other geographies, like India, China, the European Union, and so on."

Continue reading here:
Quantum computing, climate change, and interdependent AI: Academics and execs predict how tech will revolutionize the next decade - Business Insider

IBM And University Of Tokyo Launch Quantum Computing Initiative For Japan – E3zine.com

IBM and the University of Tokyo announced an agreement to partner to advance quantum computing and make it practical for the benefit of industry, science and society.

IBM and the University of Tokyo will form the Japan IBM Quantum Partnership, a broad national partnership framework in which other universities, industry, and government can engage. The partnership will have three tracks of engagement: one focused on the development of quantum applications with industry; another on quantum computing system technology development; and the third on advancing the state of quantum science and education.

Under the agreement, an IBM Q System One, owned and operated by IBM, will be installed in an IBM facility in Japan. It will be the first installation of its kind in the region and only the third in the world, following the United States and Germany. The Q System One will be used to advance research in quantum algorithms, applications, and software, with the goal of developing the first practical applications of quantum computing.

IBM and the University of Tokyo will also create a first-of-a-kind quantum system technology center for the development of hardware components and technologies that will be used in next-generation quantum computers. The center will include a laboratory facility to develop and test novel hardware components for quantum computing, including advanced cryogenic and microwave test capabilities.

IBM and the University of Tokyo will also directly collaborate on foundational research topics important to the advancement of quantum computing, and establish a collaboration space on the University campus to engage students, faculty, and industry researchers with seminars, workshops, and events.

Developed by researchers and engineers from IBM Research and Systems, the IBM Q System One is optimized for the quality, stability, reliability, and reproducibility of multi-qubit operations. IBM established the IBM Q Network, a community of Fortune 500 companies, startups, academic institutions, and research labs working with IBM to advance quantum computing and explore practical applications for business and science.

Advances in quantum computing could open the door to future scientific discoveries such as new medicines and materials, improvements in the optimization of supply chains, and new ways to model financial data to better manage and reduce risk.

The University of Tokyo will lead the Japan IBM Quantum Partnership and bring academic excellence from universities and prominent research associations together with large-scale industry, small and medium enterprises, startups, and industrial associations from diverse market sectors. A high priority will be placed on building quantum programming as well as application and technology development skills and expertise.

Follow this link:
IBM And University Of Tokyo Launch Quantum Computing Initiative For Japan - E3zine.com

Quantum networking projected to be $5.5 billion market in 2025 – TechRepublic

Several companies are working to advance the technology, according to a new report.

The market for quantum networking is projected to reach $5.5 billion by 2025, according to a new report from Inside Quantum Technology (IQT).

While all computing systems rely on the ability to store and manipulate information in individual bits, quantum computers "leverage quantum mechanical phenomena to manipulate information" and to do so requires the use of quantum bits, or qubits, according to IBM.
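The qubit idea described above can be made concrete in a few lines of code. This is an illustrative sketch, not from the report: a qubit's state is a two-component complex vector, a gate such as the Hadamard puts it into superposition, and measurement probabilities are the squared amplitudes.

```python
import numpy as np

# Basis states of a single qubit: a qubit is a unit vector in C^2.
zero = np.array([1, 0], dtype=complex)

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ zero

# Measurement probabilities are the squared amplitudes (Born rule).
probs = np.abs(state) ** 2
print(probs)  # both outcomes equally likely: [0.5, 0.5]
```

The "quantum mechanical phenomena" IBM refers to (superposition, and with more qubits, entanglement) are exactly what such state vectors capture, and what classical bits cannot.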

SEE:Quantum computing: An insider's guide (TechRepublic)

Quantum computing is widely seen as the answer to problems that today's computers are not equipped to handle.

"For problems above a certain size and complexity, we don't have enough computational power on earth to tackle them,'' IBM said. This requires a new kind of computing, and this is where quantum comes in.

IQT says that quantum networking revenue comes primarily from quantum key distribution (QKD), quantum cloud computing, and quantum sensor networks. Eventually, these strands will merge into a Quantum Internet, the report said.

Cloud access to quantum computers is core to the business models of many leading quantum computer companies, such as IBM, Microsoft, and Rigetti, as well as several leading academic institutions, according to the report.

Microsoft, for instance, designed a special programming language for quantum computers, called Q#, and released a Quantum Development Kit to help programmers create new applications, according to CBInsights.

One of Google's quantum computing projects involves working with NASA to apply the tech's optimization abilities to space travel.

The Quantum Internet network will have the same "geographical breadth of coverage as today's internet," the IQT report stated.

It will provide a powerful platform for communications among quantum computers and other quantum devices, the report said.

It will also enable a quantum version of the Internet of Things. "Finally, quantum networks can be the most secure networks ever built, completely invulnerable if constructed properly," the report said.

The report, "Quantum Networks: A Ten-Year Forecast and Opportunity Analysis," forecasts demand for quantum network equipment, software and services in both volume and value terms.

"The time has come when the rapidly developing quantum technology industry needs to quantify the opportunities coming out of quantum networking," said Lawrence Gasman, president of Inside Quantum Technology, in a statement.

Quantum Key Distribution (QKD) adds unbreakable coding of key distribution to public key encryption, making it virtually invulnerable, according to the report.
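To give a feel for what QKD involves, here is a toy, purely classical simulation of the sifting step of BB84, the best-known QKD protocol. This is a didactic sketch under stated assumptions (no eavesdropper, no channel noise, made-up bit counts), not an implementation of any real QKD product the report covers.

```python
import secrets

# Toy BB84 sifting: Alice sends random bits encoded in random bases;
# Bob measures each in a randomly chosen basis. Afterwards they publicly
# compare bases and keep only positions where the bases matched.
n = 1000
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# With no eavesdropper, Bob's result equals Alice's bit whenever bases match,
# so the sifted key is simply Alice's bits at the matching positions.
sifted_key = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
              if ab == bb]

# On average half the bases match, so the key is roughly n/2 bits long.
print(len(sifted_key))
```

The security argument, which this classical sketch cannot show, is that an eavesdropper measuring in the wrong basis disturbs the quantum states, introducing detectable errors when Alice and Bob compare a sample of their key.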

QKD is the first significant revenue source to come from the emerging Quantum Internet and will create almost $150 million in revenue in 2020, the report said.

QKD's early success is due to potential users, big financial and government organizations, having an immediate need for 100% secure encryption, the IQT report stated.

By 2025, IQT projects that revenue from "quantum clouds" will exceed $2 billion.

Although some large research and government organizations are buying quantum computers for on-premise use, the high cost of the machines coupled with the immaturity of the technology means that the majority of quantum users are accessing quantum through clouds, the report explained.

Quantum sensor networks promise enhanced navigation and positioning and more sensitive medical imaging modalities, among other use cases, the report said.

"This is a very diverse area in terms of both the range of applications and the maturity of the technology."

However, by 2025 revenue from quantum sensors is expected to reach about $1.2 billion.


Go here to read the rest:
Quantum networking projected to be $5.5 billion market in 2025 - TechRepublic

University of Sheffield launches Quantum centre to develop the technologies of tomorrow – Quantaneo, the Quantum Computing Source

A new research centre with the potential to revolutionise computing, communication, sensing and imaging technologies is set to be launched by the University of Sheffield this week (22 January 2020).

The Sheffield Quantum Centre, which will be officially opened by Lord Jim O'Neill, Chair of Chatham House and University of Sheffield alumnus, is bringing together more than 70 of the University's leading scientists and engineers to develop new quantum technologies.

Quantum technologies are a broad range of new materials, devices and information technology protocols in physics and engineering. They promise unprecedented capabilities and performance by exploiting phenomena that cannot be explained by classical physics.

Quantum technologies could lead to the development of more secure communications technologies and computers that can solve problems far beyond the capabilities of existing computers.

Research into quantum technologies is a high priority for the UK and many countries around the world. The UK government has invested heavily in quantum research as part of a national programme and has committed £1 billion in funding over 10 years.

Led by the University's Department of Physics and Astronomy, Department of Electronic and Electrical Engineering, and Department of Computer Science, the Sheffield Quantum Centre will join a group of northern universities that are playing a significant role in the development of quantum technologies.

The University of Sheffield has a strong presence in quantum research, with world-leading capabilities in crystal growth, nanometre-scale device fabrication, and device physics research. A spin-out company has already been formed to help commercialise research, with another in preparation.

Professor Maurice Skolnick, Director of the Sheffield Quantum Centre, said: "The University of Sheffield already has very considerable strengths in the highly topical area of quantum science and technology. I have strong expectation that the newly formed centre will bring together these diverse strengths to maximise their impact, both internally and more widely across UK universities and funding bodies."

During the opening ceremony, the Sheffield Quantum Centre will also launch its new £2.1 million Quantum Technology Capital equipment.

Funded by the Engineering and Physical Sciences Research Council (EPSRC), the equipment is a molecular beam epitaxy cluster tool designed to grow very high-quality wafers of semiconductor materials, the types of materials that have numerous everyday applications, such as in mobile phones and the lasers that drive the internet.

The semiconductor materials also have many new quantum applications which researchers are focusing on developing.

Professor Jon Heffernan from the University's Department of Electronic and Electrical Engineering added: "The University of Sheffield has a 40-year history of pioneering developments in semiconductor science and technology and is host to the National Epitaxy Facility. With the addition of this new quantum technologies equipment, I am confident our new research centre will lead to many new and exciting technological opportunities that can exploit the strange but powerful concepts from quantum science."

Continue reading here:
University of Sheffield launches Quantum centre to develop the technologies of tomorrow - Quantaneo, the Quantum Computing Source

5 Emerging Technologies That Will Shape this Decade – San Diego Entertainer Magazine

By John Breaux | January 22, 2020

Some say that we are in the midst of a new technological revolution, with emerging technologies taking shape to transform the world we live in. As we step into a new decade, expect to see a handful of amazing advancements in technology that will dramatically shape our society at large.

We've been told for years that self-driving cars are the future, but this decade will bring us the greatest advancements in this field yet. Companies have been researching and testing autonomous fleets of cars for years now, and some are finally gearing up to deploy them in the real world. Tesla has already released a self-driving feature in its popular electric vehicles, while Google-owned Waymo has completed a trial of autonomous taxi systems in California, where it successfully transported more than 6,000 people.

This radically powerful form of computing will continue to reach more practical applications throughout the decade. Quantum computers are capable of performing exponentially more powerful calculations than traditional computers, but the size and power required to run them make them difficult to use in a practical sense. Further research in quantum computing will allow greater application to solving real-world problems.

Augmenting our bodies with technology will become more common as wearable devices will allow us to improve everything from hearing to sight. Examples include devices and implants that will be able to enhance sensory capabilities, improve health, and contribute to a heightened quality of life and functional performance.

The advent of 5G will perhaps be one of the most impactful technologies for many, starting this year and proceeding onwards. 5G networks will have the capability of connecting us to the digital world in ways we've never had before, affording us blazing-fast speeds of nearly 10 Gb/s. The speed of 5G will allow for seamless control of vast autonomous car fleets, precise robotic surgery, and streaming of 4K video with no buffering.

Drones are already a pivotal piece of technology in areas including transportation, surveillance, and logistics. Swarm robotics will be a new multi-robot system inspired by nature that will have major potential in completing tasks with unparalleled efficiency. Applications could include providing post-disaster relief, geological surveying, and even farming. Swarm robotics will be able to accomplish tasks through cooperative behavior while adapting to situations in ways that would not be possible with a single drone.

Read the original:
5 Emerging Technologies That Will Shape this Decade - San Diego Entertainer Magazine

Deltec Bank, Bahamas says the Impact of Quantum Computing in Banking will be huge – Press Release – Digital Journal

According to Deltec Bank, "Quantum Computing can help institutions speed up their transactional activities while making sense of assets that typically seem incongruent."

Technologies based on quantum theory are coming to the financial sector. It is not a question of if, but when, banks will begin using this option to evolve current business practices.

Companies like JPMorgan Chase and Barclays have over two years of experience working with IBM's quantum computing technology. The goal of this work is to optimize portfolios for investors, but several additional benefits could come to the industry as banks learn more about it.

Benefits of Quantum Computing in Banking

Quantum computing stayed in the world of academia until recent years when technology developers opened trial opportunities. The banking sector was one of the first to start experimenting with what might be possible.

Their efforts have led to the development of four positive outcomes that can occur because of the faster processing power that quantum computing offers.

1. Big Data Analytics

The high-powered processing capabilities of this technology make it possible for banks to optimize their big data. According to Deltec Bank, "Quantum Computing can help institutions speed up their transactional activities while making sense of assets that typically seem incongruent."

2. Portfolio Analysis

Quantum computing permits high-frequency trading activities because it can appraise assets and analyze portfolios to determine individual needs. The creation of algorithms built on the full capabilities of this technology can mine more information to find new pathways to analysis and implementation.

3. Customer Service Improvements

This technology gives banks more access to artificial intelligence and machine learning opportunities. The data collected by institutions can improve customer service by focusing on consumer engagement, risk analysis, and product development. There will be more information available to develop customized financial products that meet individual needs while staying connected to core utilities.

4. Improved Security

The results of quantum computing in banking will create the next generation of encryption and safeguarding efforts to protect data. Robust measures that include encrypted individual identification keys and instant detection of anomalies can work to remove fraudulent transactions.

Privately Funded Research is Changing the Banking Industry

Although some firms are working with IBM and other major tech developers to bring quantum computing to the banking sector, it is private money that funds most of the innovations.

An example of this effort comes from Rigetti Computing. This company offers a product called Forest, which is a downloadable SDK that is useful in the writing and testing of programs using quantum technologies.

1QB Information Technologies in Canada has an SDK that offers the necessary tools to develop and test applications on quantum computers.

How the world approaches banking and finance could be very different in the future because of quantum computing. This technology might not solve every problem the industry faces today, but it can certainly put a significant dent in those issues.

Disclaimer: The author of this text, Robin Trehan, has an undergraduate degree in economics, a Master's in international business and finance, and an MBA in electronic business. Trehan is Senior VP at Deltec International (http://www.deltecbank.com). The views, thoughts, and opinions expressed in this text are solely the views of the author, and do not necessarily reflect the views of Deltec International Group, its subsidiaries, and/or employees.

About Deltec Bank

Headquartered in The Bahamas, Deltec is an independent financial services group that delivers bespoke solutions to meet clients' unique needs. The Deltec group of companies includes Deltec Bank & Trust Limited, Deltec Fund Services Limited, Deltec Investment Advisers Limited, Deltec Securities Ltd., and Long Cay Captive Management.

Media Contact
Company Name: Deltec International Group
Contact Person: Media Manager
Email: Send Email
Phone: 242 302 4100
Country: Bahamas
Website: https://www.deltecbank.com/

Read more from the original source:
Deltec Bank, Bahamas says the Impact of Quantum Computing in Banking will be huge - Press Release - Digital Journal

The Need For Computing Power In 2020 And Beyond – Forbes

Having led a Bitcoin mining firm for over two years, I've come to realize the importance of computing power. Computing power connects the real (chip energy) and virtual (algorithm) dimensions of our world. Under the condition that the ownership of the assets remains unchanged, computing power is an intangible asset that can be used and circulated. It is a commercialized technical service and a consumption investment. This is a remarkable innovation for mankind, and it is an upgrade for the digital economy.

2020 marks the birth year of the computing power infrastructure. Our world is at the beginning of a new economic and technological cycle. We have entered the digital economic civilization. This wave of technology is driven by the combination of AI, 5G, quantum computing, big data and blockchain. People have started realizing that in the age of the digital economy, computing power is the most important and innovative form of productivity.

Computing power is not just technical but also economic innovation. It's a small breakthrough at the fundamental level with impact that will be immeasurable. And people have finally seen the value of the bottom layer through the 10 years of crypto mining evolution.

However, there are two major problems faced by the entire technological landscape: First is insufficient computing power. Second is the dominance of centralized computing power, which creates a monopoly and gives rise to manipulation problems and poor data security.

How does more computing power help?

Artificial Intelligence

Mining Bitcoin has allowed my company to build the foundation of computing infrastructure, so we are planning to eventually expand into AI computing. This experience has further shown me the importance of working toward developing more computing power if tech leaders want to continue creating innovative technologies.

Consider this: For an AI system to recognize someone's voice or identify an animal or a human being, it first needs to process millions of audio, video or image samples. It then learns to differentiate between two different pitches of voices or to differentiate faces based on various facial features. To reach that level of precision, an AI model needs to be fed a tremendous amount of data.

It is only possible to do that if we have powerful computers that can process millions of data points every single second. The more the computing power, the faster we can feed the data to train the AI system, resulting in a shorter span for the AI to reach near-perfection, i.e., human-level intelligence.

The computing power required by AI has been doubling roughly every three and a half months since 2012. The need to build better AI has made it mandatory to keep up with this requirement for more computing power. Tech companies are leaving no stone unturned to rise to this demand.

It is almost as if computing power is now an asset into which investors and organizations are pouring millions of dollars. They are constantly testing and modifying their best chips to produce more productive versions of them. The results of this investment are regularly seen in the form of advanced, more compact chips capable of producing higher computing power while consuming less energy.

For new technological breakthroughs, computing power itself has become the new "production material" and "energy." Computing power is the fuel of our technologically advanced society. I've observed it is driving the development in various technological landscapes, such as AI, graphics computing, 5G and cryptocurrency.

Cryptocurrency Mining

Similar to AI, the decentralized digital economy sector also relies on high computing power. Transactions of cryptocurrencies, such as Bitcoin, are validated through a decentralized process called "mining." Mining requires miners across the world to deploy powerful computers to find the solution or the hash to a cryptographic puzzle that proves the legitimacy of each transaction requested on the blockchain.

The bad news, however, is that the reward to mine Bitcoin is halved roughly every four years. This means that following May 20, 2020, the next halving date, miners who mine Bitcoin will receive half the reward per block compared to what they do now. Two primary factors that compensate for the halving of rewards are an increase in the price of Bitcoin and advanced chips with high computing power.
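The halving schedule is simple enough to write down. A quick sketch (my own illustration, not from the article): the block subsidy started at 50 BTC and is cut in half at each halving, so the May 2020 event, the third halving, takes it from 12.5 BTC to 6.25 BTC per block.

```python
def block_reward(halvings: int) -> float:
    """Bitcoin block subsidy in BTC after a given number of halvings."""
    return 50.0 / (2 ** halvings)

# Subsidy after 0, 1, 2, 3 halvings; the third halving (May 2020)
# drops the per-block reward from 12.5 BTC to 6.25 BTC.
print([block_reward(h) for h in range(4)])  # [50.0, 25.0, 12.5, 6.25]
```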

Miners run not one but multiple high-end graphics processing units to mine Bitcoin, which is an electricity-intensive process. The only way to keep mining profitably is to invest in better chips that produce more computing power with lower electricity consumption. This helps miners process more hashes per second (i.e., the hashrate) to get to the right hash and attain the mining reward.
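The "cryptographic puzzle" behind the hashrate figures above is a brute-force hash search: try nonces until the block hash falls below a target. The following toy illustrates the structure only; the header bytes and difficulty here are made up, and real Bitcoin mining performs the same search at astronomically higher difficulty on specialized hardware.

```python
import hashlib

def mine(header: bytes, difficulty: int) -> int:
    """Find a nonce whose SHA-256 digest of header+nonce starts with
    `difficulty` zero hex digits (a stand-in for Bitcoin's target check)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra zero digit multiplies the expected number of hashes by 16,
# which is why hashes-per-second (the hashrate) decides profitability.
nonce = mine(b"example-block-header", 4)
print(nonce)
```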

So far, mining chip producers have delivered on the promise of more efficient chips, leading to an increase in the mining hashrate from 50 exahashes per second to 90 exahashes per second in the past six months. Per the reports, the efficiency of the latest chips, combined with increased Bitcoin prices, has helped keep the mining business highly profitable since the previous halving.

High computing power has become an addiction we humans are not getting rid of in the foreseeable future. With our growing fondness for faster computer applications and more humanlike AI, it's likely that we'll demand faster and more perfect versions of the systems we use today. A viable way to fulfill this would be to produce more computing power.

The two biggest challenges that lie in our way are producing clean electricity at lower cost and developing chips with a lower ratio of electricity consumed to computing power produced. The core of industrial competition today lies in the cost of producing electricity: low energy prices enable stable services. For example, there is an abundance of hydroelectric power in southwest China, and data centers are located there so they can harness the hydropower.

If we could make low-cost, clean energy available everywhere, we'd cut the cost of producing computing power. When this energy is used by power-efficient computing chips, the total cost drops even more and high computing power becomes highly affordable.

See the rest here:
The Need For Computing Power In 2020 And Beyond - Forbes

LIVE FROM DAVOS: Henry Blodget leads panel on the next decade of tech – Business Insider Nordic

The past decade saw technological advancements that transformed how we work, live, and learn. The next one will bring even greater change as quantum computing, cloud computing, 5G, and artificial intelligence mature and proliferate. These changes will happen rapidly, and the work to manage their impact will need to keep pace.

This session at the World Economic Forum in Davos, Switzerland, brought together industry experts to discuss how these technologies will shape the next decade, followed by a panel discussion about the challenges and benefits this era will bring and whether the world can control the technology it creates.

Henry Blodget, CEO, cofounder, and editorial director, Insider Inc.

This interview is part of a partnership between Business Insider and Microsoft at the 2020 World Economic Forum. Business Insider editors independently decided on the topics broached and questions asked.

Below, find each of the panelists' most memorable contributions:

Julie Love, senior director of quantum business development at Microsoft. Microsoft

Julie Love believes global problems such as climate change can potentially be solved far more quickly and easily through developments in quantum computing.

She said: "We [Microsoft] think about problems that we're facing: problems that are caused by the destruction of the environment; by climate change, and [that require] optimization of our natural resources, [such as] global food production."

"It's quantum computing that a lot of us scientists and technologists are really looking to to solve these problems. We can have the promise of solving them exponentially faster, which is incredibly profound. And the reason is this: [quantum] technology speaks the language of nature.

"By computing the way that nature computes, there's so much information contained in these atoms and molecules. Nature doesn't think about a chemical reaction; nature doesn't have to do some complex computation. It's inherent in the material itself."

Love claimed that, if harnessed in this way, quantum computing could allow scientists to design a compound that could remove carbon from the air. She added that researchers will need to be "really pragmatic and practical about how we take this from science fiction into the here-and-now."

Justine Cassell, a professor specializing in AI and linguistics. YouTube/Business Insider

"I believe the future of AI is actually interdependence, collaboration, and cooperation between people and systems, both at the macro [and micro] levels," said Cassell, who is also a faculty member of the Human-Computer Interaction Institute at Carnegie Mellon University.

"At the macro-level, [look], for example, at robots on the factory floor," she said. "Today, there's been a lot of fear about how autonomous they actually are. First of all, they're often dangerous. They're so autonomous, you have to get out of their way. And it would be nice if they were more interdependent, if we could be there at the same time as they are. But also, there is no factory floor where any person is autonomous."

In Cassell's view, AI systems could also end up being built collaboratively with experts from non-tech domains, such as psychologists.

"Today, tools [for building AI systems] are mostly machine learning tools," she noted. "And they are, as you've heard a million times, black boxes. You give [the AI system] lots of examples. You say: 'This is somebody being polite. That is somebody being impolite. Learn about that.' But when they build a system that's polite, you don't know why they did that.

"What I'd like to see is systems that allow us to have these bottom-up, black-box approaches from machine learning, but also have, for example, psychologists in there, saying 'that's not actually really polite,' or 'it's polite in the way that you don't ever want to hear.'"

Microsoft president Brad Smith. YouTube/Business Insider

"One thing I constantly wish is that there was a more standardized measurement for everybody to report how much they're spending per employee on employee training because that really doesn't exist, when you think about it," said Smith, Microsoft's president and chief legal officer since 2015.

"I think, anecdotally, one can get a pretty strong sense that if you go back to the 1980s and 1990s, employers invested a huge amount in employee training around technology. It was teaching you how to use MS-DOS, or Windows, or how to use Word or Excel; interestingly, these are things that employers don't really feel obliged to teach employees today.

"Learning doesn't stop when you leave school. We're going to have to work a little bit harder. And that's true for everyone."

He added that this creates a further requirement: to make sure the skills people do pick up as they navigate life are easily recognizable by other employers.

"Ultimately, there's a wide variety of post-secondary credentials. The key is to have credentials that employers recognize as being valuable. It's why LinkedIn and others are so focused on new credentialing systems. Now, the good news is that should make things cheaper. It all should be more accessible.

"But I do think that, to go back to where I started, employers are going to have to invest more [in employee training]. And we're going to have to find some ways to do it in a manner that perhaps is a little more standardized."

Nokia president and CEO, Rajeev Suri. YouTube/Business Insider

Suri said 5G will be able to help develop industries that go far beyond entertainment and telecoms, and will impact physical or manual industries such as manufacturing.

"The thing about 5G is that it's built for machine-type communications. When we conceived the whole idea of 5G, it was 'how do we get not just human beings to interact with each other, but also large machines?'" he said.

"So we think that there is a large economic boost possible from 5G and 5G-enabled technologies because it would underpin many of these other technologies, especially in the physical industries."

Suri cited manufacturing, healthcare, and agriculture as just some of the industries 5G could help become far more productive within a decade.

He added: "Yes, we'll get movies and entertainment faster, but it is about a lot of physical industries that haven't quite digitized yet. Especially in the physical industries, we [Nokia] think that the [productivity] gains could be as much as 35%, starting in the year 2028, with the US first, and then going out into other geographies, like India, China, the European Union, and so on."

View post:
LIVE FROM DAVOS: Henry Blodget leads panel on the next decade of tech - Business Insider Nordic

Google claims to have invented a quantum computer, but IBM begs to differ – The Conversation CA

On Oct. 23, 2019, Google published a paper in the journal Nature entitled "Quantum supremacy using a programmable superconducting processor." The tech giant announced its achievement of a much vaunted goal: quantum supremacy.

This perhaps ill-chosen term (coined by physicist John Preskill) is meant to convey the huge speedup that processors based on quantum-mechanical systems are predicted to exhibit, relative to even the fastest classical computers.

Google's benchmark was achieved on a new type of quantum processor, code-named Sycamore, consisting of 54 independently addressable superconducting junction devices (of which only 53 were working for the demonstration).

Each of these devices allows the storage of one bit of quantum information. In contrast to the bits in a classical computer, which can only store one of two states (0 or 1 in the digital language of binary code), a quantum bit, or qbit, can store information in a coherent superposition state, which can be considered to contain fractional amounts of both 0 and 1.
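That superposition can be made concrete with a toy simulation (my own illustration, not from the article): a qbit's state is a pair of complex amplitudes whose squared magnitudes sum to 1, and measuring it collapses the state to 0 or 1 with those probabilities.

```python
import random

# A qbit state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def measure(alpha: complex, beta: complex) -> int:
    p0 = abs(alpha) ** 2
    assert abs(p0 + abs(beta) ** 2 - 1.0) < 1e-9, "state must be normalized"
    return 0 if random.random() < p0 else 1

# An equal superposition (alpha = beta = 1/sqrt(2)) yields 0 or 1
# with 50/50 odds over many measurements.
amp = 2 ** -0.5
samples = [measure(amp, amp) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

The power of a real quantum processor comes from entangling many such qbits, whose joint state requires 2^53 amplitudes to describe classically for a 53-qbit device.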

Sycamore uses technology developed by the superconductivity research group of physicist John Martinis at the University of California, Santa Barbara. The entire Sycamore system must be kept cold at cryogenic temperatures using special helium dilution refrigeration technology. Because of the immense challenge involved in keeping such a large system near the absolute zero of temperature, it is a technological tour de force.

The Google researchers demonstrated that the performance of their quantum processor in sampling the output of a pseudo-random quantum circuit was vastly better than a classical computer chip like the kind in our laptops could achieve. Just how vastly became a point of contention, and the story was not without intrigue.

An inadvertent leak of the Google group's paper on the NASA Technical Reports Server (NTRS) occurred a month prior to publication, during the blackout period when Nature prohibits discussion by the authors regarding as-yet-unpublished papers. The lapse was momentary, but long enough that The Financial Times, The Verge and other outlets picked up the story.

A well-known quantum computing blog by computer scientist Scott Aaronson contained some oblique references to the leak. The reason for this obliqueness became clear when the paper was finally published online and Aaronson could at last reveal himself to be one of the reviewers.

The story had a further controversial twist when the Google group's claims were immediately countered by IBM's quantum computing group. IBM shared a preprint posted on the arXiv (an online repository for academic papers that have yet to go through peer review) and a blog post dated Oct. 21, 2019 (note the date!).

While the Google group had claimed that a classical (super)computer would require 10,000 years to simulate the same 53-qbit random quantum circuit sampling task that their Sycamore processor could do in 200 seconds, the IBM researchers showed a method that could reduce the classical computation time to a mere matter of days.

However, the IBM classical computation would have to be carried out on the world's fastest supercomputer, the IBM-developed Summit OLCF-4 at Oak Ridge National Labs in Tennessee, with clever use of secondary storage to achieve this benchmark.

While of great interest to researchers like myself working on hardware technologies related to quantum information, and important in terms of establishing academic bragging rights, the IBM-versus-Google aspect of the story is probably less relevant to the general public interested in all things quantum.

For the average citizen, the mere fact that a 53-qbit device could beat the world's fastest supercomputer (containing more than 10,000 multi-core processors) is undoubtedly impressive. Now we must try to imagine what may come next.

The reality of quantum computing today is that very impressive strides have been made on the hardware front. A wide array of credible quantum computing hardware platforms now exist, including ion traps, superconducting device arrays similar to those in Google's Sycamore system, and isolated electrons trapped in NV-centres in diamond.

These and other systems are all now in play, each with benefits and drawbacks. So far researchers and engineers have been making steady technological progress in developing these different hardware platforms for quantum computing.

What has lagged quite a bit behind are custom-designed algorithms (computer programs) designed to run on quantum computers and able to take full advantage of possible quantum speed-ups. While several notable quantum algorithms exist (Shor's algorithm for factorization, for example, which has applications in cryptography, and Grover's algorithm, which might prove useful in database search applications), the total set of quantum algorithms remains rather small.
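Grover's quadratic speed-up can be illustrated with back-of-the-envelope arithmetic. The sketch below (a query-count comparison, not a quantum simulation) contrasts the expected number of lookups for unstructured search of N items classically (about N/2) with Grover's roughly (pi/4)*sqrt(N) oracle queries:

```python
import math

# Grover's algorithm needs about (pi/4) * sqrt(N) oracle queries to
# find one marked item among N unsorted items, versus an expected N/2
# queries for a classical linear scan.
def grover_queries(n: int) -> int:
    return math.ceil(math.pi / 4 * math.sqrt(n))

def classical_queries(n: int) -> float:
    return n / 2  # expected value for unstructured search

for n in (10**3, 10**6, 10**9):
    print(f"N={n}: classical ~{classical_queries(n):.0f}, Grover ~{grover_queries(n)}")
```

The quadratic (rather than exponential) gap is why Grover's algorithm is described as "might prove useful" rather than transformative: it helps, but it does not make intractable searches easy.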

Much of the early interest (and funding) in quantum computing was spurred by the possibility of quantum-enabled advances in cryptography and code-breaking. A huge number of online interactions ranging from confidential communications to financial transactions require secure and encrypted messages, and modern cryptography relies on the difficulty of factoring large numbers to achieve this encryption.

Quantum computing could be very disruptive in this space, as Shor's algorithm could make code-breaking much faster, while quantum-based encryption methods would allow detection of any eavesdroppers.

The interest various agencies have in unbreakable codes for secure military and financial communications has been a major driver of research in quantum computing. It is worth noting that all these code-making and code-breaking applications of quantum computing ignore to some extent the fact that no system is perfectly secure; there will always be a backdoor, because there will always be a non-quantum human element that can be compromised.

More appealing for the non-espionage and non-hacker communities (in other words, the rest of us) are the possible applications of quantum computation to solve very difficult problems that are effectively unsolvable using classical computers.

Ironically, many of these problems emerge when we try to use classical computers to solve quantum-mechanical problems, such as quantum chemistry problems that could be relevant for drug design, and various challenges in condensed matter physics, including a number related to high-temperature superconductivity.

So where are we in the wonderful and wild world of quantum computation?

In recent years, we have had many convincing demonstrations that qbits can be created, stored, manipulated and read using a number of futuristic-sounding quantum hardware platforms. But the algorithms lag. So while the prospect of quantum computing is fascinating, it will likely be a long time before we have quantum equivalents of the silicon chips that power our versatile modern computing devices.


Link:
Google claims to have invented a quantum computer, but IBM begs to differ - The Conversation CA