To mitigate the impact of artificial intelligence we must harness the power of emotional intelligence – The HR Director Magazine

One of the most recognised interpretations of emotional intelligence (EI) is that of the American author and science journalist Daniel Goleman. He describes EI as how we use a combination of self-awareness, self-management and empathy to build and maintain successful relationships. Goleman suggests that EI accounts for 67% of the abilities needed to be a successful leader and that it is twice as important as technical proficiency or IQ.

Why EI is so important

The most successful leaders enjoy positive relationships with other people. They are able to do this because they are aware of and understand what they feel and why. This awareness and understanding helps them make good decisions and develop a sound moral compass.

As well as understanding their emotions, successful leaders can manage them and channel how they feel in positive ways. They understand how other people's emotions affect their own feelings and behaviour. And they bring all this together into how they manage their relationships with other people.

Of course, EI has always been important, for all of us. But the impact of technology today is making it essential. 83% of executives interviewed by Capgemini in 2019 said a highly emotionally intelligent workforce will be a prerequisite for success in years to come. And 76% said their employees need to develop their EI so they can adapt to new roles and take on tasks that can't be automated.

Claudia Crummenerl, global practice lead, people and organisation at Capgemini Invent, said: "Companies are increasingly aware of the need for emotional intelligence skills but are not moving quickly enough to invest in them."

Why the need for EI is growing

Today, we're talking to each other less and less while algorithms and AI are influencing us more and more. As a result we're losing our ability to connect, to have empathy and to understand. Our EI is suffering.

The more we depend on technology, the more we impair our EI. To counter this we must preserve and capitalise on the things we can do that technology can't. And we must recognise and value our importance as people, as more than simply cogs in a corporate machine.

The value of high EI

People with high EI understand their emotions and use them to guide how they act. They know their own strengths and weaknesses, can handle constructive feedback and use it to improve their performance and that of the people they manage.

People with high EI are better at coping with and managing change. They are more likely to hire people who perform well in areas they struggle with themselves, and in doing so improve their organisation's performance.

And people with high EI understand others and so can motivate them. This makes them more comfortable taking on a leadership role. Because they can manage their own and others' emotions, they are able to create a positive working environment.

These strengths are also the strengths of the people we in my business call innovative communicators. And it's why we base our communications training firmly in EI.

What high EI means for communication

The more we interact with other people, the more we learn to understand our own motivations and behaviours. And the more we interact with other people, the more we learn to understand their motivations and behaviours.

So the more we communicate with others, the more emotionally intelligent we become. And as we become more emotionally intelligent, so we become better, more innovative communicators.

Innovative communication is not a function, something you delegate to your human resources or communications team. It's a set of qualities anyone can develop to help them lead with confidence and drive growth. It depends on behaviours such as adaptive leadership, collaboration and delegation, all of which contribute to high EI. So it's impossible to separate high EI from effective communication skills. The best communicators will all have high EI because the two are co-dependent.

This means when you train people in communication skills you need to look at the whole human and base the training in EI. It's not about internal comms, external comms, PR, HR or marketing. It's about human beings talking to other human beings, and the wide range of skills and personality traits it takes to do that effectively, particularly today when we are so influenced by technology and social media.

The time to act is now

The demand for people with high EI and innovative communication skills is set to soar, so you should prepare your business by training your teams now. In 2019, IBM's Institute for Business Value found that, over the next three years, more than 120 million workers worldwide will need retraining in behavioural skills such as communication, teamwork, adaptability, ethics and integrity. All of these are firmly rooted in EI.

Our rhetoric, our politics and our economies are becoming increasingly divisive, which is why there has never been a better time for people in business to reconnect, through meaningful communication, to what matters most to them and to each other, and for the greater good.

Miti Ampoma, Founder and Director, Miticom Communications Training

Emotion Artificial Intelligence Market Research 2020: Currently Trending Market Strategies of Production and Applications by 2025 – Dagoretti News

The report presents a research study of the global Emotion Artificial Intelligence market based on qualitative and quantitative assessment by leading industry experts. The report throws light on the present market scenario and how it is anticipated to change in the coming future. Growth determinants, micro- and macroeconomic indicators, opportunities, developments, and key market trends likely to have a major influence on the global Emotion Artificial Intelligence market's growth are scrutinized in this report.

Market Overview

The global Emotion Artificial Intelligence market is expected to grow over the forecast period of 2020 to 2025, at a CAGR of % over that period, and is expected to reach USD million by 2025, up from USD million in 2019.

The Emotion Artificial Intelligence Market report provides a detailed analysis of global market size, regional and country-level market size, segmentation market growth, market share, competitive Landscape, sales analysis, impact of domestic and global market players, value chain optimization, trade regulations, recent developments, opportunities analysis, strategic market growth analysis, product launches, area marketplace expanding, and technological innovations.

Market segmentation

Emotion Artificial Intelligence Market is split by Type and by Application. For the period 2015-2025, the growth among segments provides accurate calculations and forecasts for sales by Type and by Application in terms of volume and value. This analysis can help you expand your business by targeting qualified niche markets.

By Type, Emotion Artificial Intelligence market has been segmented into:

By Application, Emotion Artificial Intelligence has been segmented into:

Regions and Countries Level Analysis

Regional analysis is another highly comprehensive part of the research and analysis study of the global Emotion Artificial Intelligence Market presented in the report. This section sheds light on the sales growth of different regional and country-level Emotion Artificial Intelligence markets. For the historical and forecast period 2015 to 2025, it provides detailed and accurate country-wise volume analysis and region-wise market size analysis of the global Emotion Artificial Intelligence market.

The report offers in-depth assessment of the growth and other aspects of the Emotion Artificial Intelligence market in important countries (regions), including:

Competitive Landscape and Emotion Artificial Intelligence Market Share Analysis

Emotion Artificial Intelligence competitive landscape provides details by vendors, including company overview, company total revenue (financials), market potential, global presence, Emotion Artificial Intelligence sales and revenue generated, market share, price, production sites and facilities, SWOT analysis, product launch. For the period 2015-2020, this study provides the Emotion Artificial Intelligence sales, revenue and market share for each player covered in this report.

The major players covered in Emotion Artificial Intelligence are:

Access PDF Version of this Report at:

https://www.marketresearchnest.com/reportdetail/859772/Global-Emotion-Artificial-Intelligence-Market-2020-by-Company,-Regions,-Type-and-Application,-Forecast-to-2025

Thanks for reading this article. You can contact us at [emailprotected] to explore the Emotion Artificial Intelligence market in detail.

3 ASX shares for exposure to the artificial intelligence industry – Motley Fool Australia

According to market research, the artificial intelligence (AI) market was valued at US$16.06 billion in 2017 and is expected to reach US$190.61 billion by 2025, with a compound annual growth rate of 36.6%.
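As a quick sanity check on the cited figures, the implied compound annual growth rate can be recomputed from the start and end values (a minimal sketch; any small gap versus the cited 36.6% comes from rounding in the reported figures):

```python
initial_b = 16.06   # US$ billion, 2017 valuation
final_b = 190.61    # US$ billion, 2025 forecast
years = 2025 - 2017
cagr = (final_b / initial_b) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")   # roughly 36%, consistent with the cited rate
```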

Investing in AI is a form of thematic investing; that is, gaining exposure to a niche that is expected to grow significantly over time.

AI refers to intelligence demonstrated by machines. It is a wide-ranging branch of computer science focused on building smart machines capable of performing tasks that have typically required human intelligence. AI makes it possible for machines to learn from experience, adjust to new inputs, and perform tasks that previously required human input.

Advances in machine learning, deep learning, and natural language processing have enabled rapid advances in AI over the last decade. Examples include smart assistants such as Siri and Alexa, song and TV show recommendations from Spotify and Netflix, and spam filters on email. AI is also used to provide 24/7 customer service via chatbots, to write news stories, and in self-driving cars.

Here are 3 ASX shares involved in the AI sector.

Appen provides data for use in machine learning and AI. It collects and labels images, text, speech, audio, and video data used to build and improve artificial intelligence systems at some of the world's biggest tech companies.

Appen listed on the ASX in 2015 and has grown rapidly since then. Total profit for the year ended 31 December 2014 was $1.615 million; for the year ended 31 December 2018 it was $49 million. Appen's share price has increased from 56 cents in early 2015 to more than $24 currently.

Brainchip is a provider of neuromorphic computing solutions, a type of AI inspired by the biology of the human neuron. In 2018, Brainchip announced the release of the Akida Neuromorphic System-On-Chip. The Akida is small, low cost, and low power, making it well suited for applications such as autonomous vehicles, drones, and machine vision systems.

The Akida IP was released for sale as a license in mid-2019 and has received a positive response from customers. Brainchip has some revenue; however, it is currently not sufficient to cover its expenses for R&D, marketing and other costs. The company ended the September 2019 quarter with US$9.5 million in cash and has initiated significant reductions in planned expenses for 2020.

The third pick, the ROBO ETF, tracks the ROBO Global Robotics and Automation Index. The Index is made up of shares in companies across the global value chain of robotics, automation, and artificial intelligence. The ETF returned 29.50% in the year to 31 December.

Management fees are 0.69% per annum and distributions are made annually. The ETF has 91 holdings spread across 13 countries with a weighted price-to-earnings ratio of 30.7.

The AI industry will only grow over the coming years. Whether that means Brainchip and Appen will also grow remains to be seen. In my view, the least risky choice of the 3 is likely ROBO, given the diversification it provides.

Motley Fool contributor Kate O'Brien has no position in any of the stocks mentioned. The Motley Fool Australia owns shares of Appen Ltd. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy. This article contains general investment advice only (under AFSL 400691). Authorised by Scott Phillips.

Google claims to have invented a quantum computer, but IBM begs to differ – The Conversation CA

On Oct. 23, 2019, Google published a paper in the journal Nature entitled "Quantum supremacy using a programmable superconducting processor." The tech giant announced its achievement of a much-vaunted goal: quantum supremacy.

This perhaps ill-chosen term (coined by physicist John Preskill) is meant to convey the huge speedup that processors based on quantum-mechanical systems are predicted to exhibit, relative to even the fastest classical computers.

Google's benchmark was achieved on a new type of quantum processor, code-named Sycamore, consisting of 54 independently addressable superconducting junction devices (of which only 53 were working for the demonstration).

Each of these devices allows the storage of one bit of quantum information. In contrast to the bits in a classical computer, which can only store one of two states (0 or 1 in the digital language of binary code), a quantum bit (qubit) can store information in a coherent superposition state, which can be considered to contain fractional amounts of both 0 and 1.
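The superposition idea can be made concrete with a few lines of linear algebra: a qubit is a normalized two-component complex vector, and the squared magnitudes of its amplitudes give the measurement probabilities. This is an illustrative sketch of the textbook formalism, not a model of Google's hardware:

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1> is a normalized 2-vector of
# complex amplitudes; measurement yields 0 with probability |alpha|^2
# and 1 with probability |beta|^2.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # an equal superposition
state = np.array([alpha, beta])
probs = np.abs(state) ** 2
print(probs)   # both outcomes equally likely
```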

Sycamore uses technology developed by the superconductivity research group of physicist John Martinis at the University of California, Santa Barbara. The entire Sycamore system must be kept cold at cryogenic temperatures using special helium dilution refrigeration technology. Because of the immense challenge involved in keeping such a large system near the absolute zero of temperature, it is a technological tour de force.

The Google researchers demonstrated that the performance of their quantum processor in sampling the output of a pseudo-random quantum circuit was vastly better than a classical computer chip like the kind in our laptops could achieve. Just how vastly became a point of contention, and the story was not without intrigue.

An inadvertent leak of the Google group's paper on the NASA Technical Reports Server (NTRS) occurred a month prior to publication, during the blackout period when Nature prohibits discussion by the authors of as-yet-unpublished papers. The lapse was momentary, but long enough that The Financial Times, The Verge and other outlets picked up the story.

A well-known quantum computing blog by computer scientist Scott Aaronson contained some oblique references to the leak. The reason for this obliqueness became clear when the paper was finally published online and Aaronson could at last reveal himself to be one of the reviewers.

The story had a further controversial twist when the Google group's claims were immediately countered by IBM's quantum computing group. IBM shared a preprint posted on the arXiv (an online repository for academic papers that have yet to go through peer review) and a blog post dated Oct. 21, 2019 (note the date!).

While the Google group had claimed that a classical (super)computer would require 10,000 years to simulate the same 53-qubit random quantum circuit sampling task that their Sycamore processor could do in 200 seconds, the IBM researchers showed a method that could reduce the classical computation time to a mere matter of days.
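To put the claimed gap in perspective, converting 10,000 years to seconds and dividing by Sycamore's 200 seconds gives the raw speedup factor. This is simple arithmetic on the figures quoted above:

```python
# Scale of Google's original claim: 10,000 years of classical compute
# versus 200 seconds on the Sycamore processor.
classical_s = 10_000 * 365.25 * 24 * 3600   # 10,000 years in seconds
sycamore_s = 200
speedup = classical_s / sycamore_s
print(f"claimed speedup: ~{speedup:.1e}x")   # on the order of a billion
```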

However, the IBM classical computation would have to be carried out on the world's fastest supercomputer, the IBM-developed Summit OLCF-4 at Oak Ridge National Labs in Tennessee, with clever use of secondary storage to achieve this benchmark.

While of great interest to researchers like myself working on hardware technologies related to quantum information, and important in terms of establishing academic bragging rights, the IBM-versus-Google aspect of the story is probably less relevant to the general public interested in all things quantum.

For the average citizen, the mere fact that a 53-qubit device could beat the world's fastest supercomputer (containing more than 10,000 multi-core processors) is undoubtedly impressive. Now we must try to imagine what may come next.

The reality of quantum computing today is that very impressive strides have been made on the hardware front. A wide array of credible quantum computing hardware platforms now exist, including ion traps, superconducting device arrays similar to those in Google's Sycamore system, and isolated electrons trapped in NV-centres in diamond.

These and other systems are all now in play, each with benefits and drawbacks. So far researchers and engineers have been making steady technological progress in developing these different hardware platforms for quantum computing.

What has lagged quite a bit behind are custom-designed algorithms (computer programs) designed to run on quantum computers and able to take full advantage of possible quantum speed-ups. While several notable quantum algorithms exist (Shor's algorithm for factorization, for example, which has applications in cryptography, and Grover's algorithm, which might prove useful in database search applications), the total set of quantum algorithms remains rather small.
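Grover's quadratic speedup can be illustrated with back-of-the-envelope query counts: unstructured search over N items takes about N/2 classical queries on average, versus roughly (pi/4)*sqrt(N) Grover iterations. The arithmetic below is illustrative only, not an implementation of the algorithm:

```python
import math

# Query counts for unstructured search over N items:
# classical average ~N/2, Grover ~(pi/4)*sqrt(N).
N = 1_000_000
classical_queries = N / 2
grover_queries = (math.pi / 4) * math.sqrt(N)
print(f"classical ~{classical_queries:.0f} queries, Grover ~{grover_queries:.0f}")
```

For a million items, the quantum approach needs only on the order of hundreds of queries where a classical search needs hundreds of thousands.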

Much of the early interest (and funding) in quantum computing was spurred by the possibility of quantum-enabled advances in cryptography and code-breaking. A huge number of online interactions ranging from confidential communications to financial transactions require secure and encrypted messages, and modern cryptography relies on the difficulty of factoring large numbers to achieve this encryption.

Quantum computing could be very disruptive in this space, as Shors algorithm could make code-breaking much faster, while quantum-based encryption methods would allow detection of any eavesdroppers.

The interest various agencies have in unbreakable codes for secure military and financial communications has been a major driver of research in quantum computing. It is worth noting that all these code-making and code-breaking applications of quantum computing ignore to some extent the fact that no system is perfectly secure; there will always be a backdoor, because there will always be a non-quantum human element that can be compromised.

More appealing for the non-espionage and non-hacker communities (in other words, the rest of us) are the possible applications of quantum computation to solve very difficult problems that are effectively unsolvable using classical computers.

Ironically, many of these problems emerge when we try to use classical computers to solve quantum-mechanical problems, such as quantum chemistry problems that could be relevant for drug design, and various challenges in condensed matter physics, including a number related to high-temperature superconductivity.

So where are we in the wonderful and wild world of quantum computation?

In recent years, we have had many convincing demonstrations that qubits can be created, stored, manipulated and read using a number of futuristic-sounding quantum hardware platforms. But the algorithms lag. So while the prospect of quantum computing is fascinating, it will likely be a long time before we have quantum equivalents of the silicon chips that power our versatile modern computing devices.

ASC20 Finals to be Held in Shenzhen, Tasks Include Quantum Computing Simulation and AI Language Exam – Yahoo Finance

The 2020 ASC Student Supercomputer Challenge (ASC20) has announced the tasks for the new season: using supercomputers to simulate quantum circuits and training AI models to take an English test. These tasks pose unprecedented challenges for the 300+ ASC teams from around the world. From April 25 to 29, 2020, the top 20 finalists will compete at SUSTech in Shenzhen, China.

ASC20 has set a quantum computing task for the first time. Teams will use QuEST (Quantum Exact Simulation Toolkit) running on supercomputers to simulate 30 qubits in two cases: quantum random circuits (random.c) and quantum fast Fourier transform circuits (GHZ_QFT.c). Quantum computing is a disruptive technology, considered to be the next generation of high performance computing. However, the R&D of quantum computers is lagging behind due to the unique properties of quantum systems, which makes it difficult for scientists to use real quantum computers to solve some of the most pressing problems, such as particle physics modeling, cryptography, genetic engineering, and quantum machine learning. From this perspective, the quantum computing task presented in the ASC20 challenge will hopefully inspire new algorithms and architectures in this field.
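One way to see why 30 qubits is a meaningful target for a supercomputer simulation: a full state vector holds 2^n complex amplitudes, each 16 bytes in double precision, so memory grows exponentially with qubit count. This is a rough estimate; QuEST's actual footprint depends on its configuration:

```python
# Memory needed for a full state-vector simulation of n qubits:
# 2**n complex amplitudes at 16 bytes each (double-precision real + imag).
n = 30
bytes_needed = (2 ** n) * 16
print(f"{n} qubits: {bytes_needed / 2**30:.0f} GiB")   # 16 GiB
print(f"{n + 10} qubits: {(2 ** (n + 10)) * 16 / 2**40:.0f} TiB")  # every +10 qubits is ~1000x
```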

The other task revealed is the Language Exam Challenge. Teams will take on the challenge of training AI models on an English cloze test dataset, vying to achieve the highest "test scores". The dataset covers multiple levels of English language tests in China, including the college entrance examination, College English Test Band 4 and Band 6, and others. Teaching machines to understand human language is one of the most elusive and long-standing challenges in the field of AI. The ASC20 AI task embodies this challenge, using human-oriented problems to evaluate the performance of neural networks.

Wang Endong, ASC Challenge initiator, member of the Chinese Academy of Engineering and Chief Scientist at Inspur Group, said that through these tasks, students from all over the world get to access and learn the most cutting-edge computing technologies. ASC strives to foster supercomputing and AI talent with a global vision, inspiring technical innovation.

Dr. Lu Chun, Vice President of SUSTech, host of the ASC20 Finals, commented that supercomputers are important infrastructure for scientific innovation and economic development. SUSTech is making focused efforts to develop supercomputing and is hosting ASC20 in the hope of driving the training of supercomputing talent, international exchange and cooperation, and interdisciplinary development at SUSTech.

Furthermore, during January 15-16, 2020, the ASC20 organizing committee held a competition training camp in Beijing to help student teams prepare for the ongoing competition. HPC and AI experts from the State Key Laboratory of High-end Server and Storage Technology, Inspur, Intel, NVIDIA, Mellanox, Peng Cheng Laboratory and the Institute of Acoustics of the Chinese Academy of Sciences gathered to provide on-site coaching and guidance. Previous ASC winning teams also shared their successful experiences.

About ASC

The ASC Student Supercomputer Challenge is the world's largest student supercomputer competition, sponsored and organized by the Asia Supercomputer Community in China and supported by Asian, European, and American experts and institutions. The main objectives of ASC are to encourage the exchange and training of young supercomputing talent from different countries, improve supercomputing applications and R&D capacity, boost the development of supercomputing, and promote technical and industrial innovation. The annual ASC Supercomputer Challenge was first held in 2012 and has since attracted over 8,500 undergraduates from all over the world. Learn more about ASC at https://www.asc-events.org/.

View source version on businesswire.com: https://www.businesswire.com/news/home/20200121005431/en/

Contacts

Fiona Liu, Liuxuan01@inspur.com

Toshiba says it created an algorithm that beats quantum computers using standard hardware – TechSpot

Something to look forward to: Some of the biggest problems that need solving in the enterprise world require sifting through vast amounts of data and finding the best possible solution given a number of factors and requirements, some of which are at times unknown. For years, quantum computing has been touted as the most promising jump in computational speed for certain kinds of problems, but Toshiba says revisiting classical algorithms helped it develop a new one that can leverage existing silicon-based hardware to get a faster result.

Toshiba's announcement this week claims a new algorithm it's been perfecting for years is capable of analyzing market data much more quickly and efficiently than those used in some of the world's fastest supercomputers.

The algorithm is called the "Simulated Bifurcation Algorithm," and is supposedly good enough to be used in finding accurate approximate solutions for large-scale combinatorial optimization problems. In simpler terms, it can come up with a solution out of many possible ones for a particularly complex problem.

According to its inventor, Hayato Goto, it draws inspiration from the way quantum computers can efficiently comb through many possibilities. Work on the algorithm started in 2015, when Goto noticed that adding new inputs to a complex system with 100,000 variables made it possible to solve in a matter of seconds, at a relatively small computational cost.

This essentially means that Toshiba's new algorithm could be used on standard desktop computers. To give you an idea of how important this development is, Toshiba demonstrated last year that SBA can get highly accurate solutions for an optimization problem with 2,000 connected variables in 50 microseconds, or 10 times faster than laser-based quantum computers.
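The kind of problem SBA targets can be sketched with a toy Ising model: find the spin assignment minimizing a quadratic energy over {-1, +1} variables. The brute-force search below illustrates the problem class only; it is not Toshiba's algorithm, and the couplings are random, hypothetical values. The 2^n search space is exactly why heuristic solvers matter at the scales Toshiba quotes:

```python
import itertools
import numpy as np

# Toy combinatorial optimization: minimize the Ising energy
# E(s) = -1/2 * s.J.s over spins s in {-1, +1}^n.
rng = np.random.default_rng(0)
n = 4
J = rng.normal(size=(n, n))
J = (J + J.T) / 2          # symmetric couplings
np.fill_diagonal(J, 0)     # no self-interaction

def energy(s):
    return -0.5 * s @ J @ s

# Exhaustive search over all 2**n spin configurations (fine for n=4,
# hopeless for the 2,000-variable problems in the article).
best = min((np.array(s) for s in itertools.product([-1, 1], repeat=n)),
           key=energy)
print(best, energy(best))
```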

SBA is also highly scalable, meaning it can be made to work on clusters of CPUs or FPGAs, thanks to the contributions of Kosuke Tatsumura, another of Toshiba's senior researchers, who specializes in semiconductors.

Companies like Microsoft, Google, IBM, and many others are racing to be the first with a truly viable quantum commercial system, but so far their approaches have produced limited results that live inside their labs.

Meanwhile, scientists like Goto and Tatsumura are going back to the roots by exploring ways to improve on classical algorithms. Toshiba hopes to use SBA to optimize financial operations like currency trading and rapid-fire portfolio adjustments, but it could very well be used to calculate efficient routes for delivery services and for molecular-precision drug development.

Quantum Computing Inc. Releases Its First Quantum Ready Software Product QAA The Quantum Asset Allocator to Optimize Portfolio Returns – Quantaneo,…

The target market for QAA, estimated at over $1 billion, is financial institutions that are currently addressing asset allocation problems but are looking for better tools with which to optimize portfolio performance. QAA is available both as a cloud-based software service and as an on-premises software-plus-hardware system. Both implementations are designed to quickly return optimal or near-optimal interactive solutions and analyses of financial asset allocation problems. QAA leverages a financial institution's strategy for calculating risk and expected return, based on analytical values for the various index sectors and subsectors in its investable universe. In beta tests against portfolios using traditional portfolio management techniques, QAA has been proven to enhance fund strategy by calculating the optimal portfolio mix to maximize returns.

"This is a major breakthrough for QCI," stated Robert Liscouski, CEO of Quantum Computing Inc. "We are excited to be releasing QAA, which will provide small and medium-sized funds the ability to do asset allocation that previously was the province of large brokerage firms, mutual funds and the largest quant funds. Beta tests have demonstrated superior portfolio performance using quantum-inspired techniques on both classical and existing quantum computing hardware." Liscouski stated that QCI is already working with beta clients to implement QAA in their environments.

"QCI develops and sells quantum-ready software solutions for clients who have problems that can be solved using quantum techniques to provide superior results on classical computers today. Our software is designed to also run on quantum computers when they deliver performance faster than classical computers," stated Steve Reinhardt, VP of Product Development at QCI. "This is the launch of the first of a series of products that will leverage quantum techniques to provide differentiated performance on both classical computers and on a variety of early-stage quantum computers such as D-Wave and other annealers, which are on the market today. Our applications are designed to be deployed on a client's infrastructure, on premises or in the AWS cloud."

Mark Wainger, Director of Application Development, stated: "Asset allocation is well known for being a complex calculation, with several types of constraints making it an NP-hard problem. QCI's Quantum Asset Allocator has been tested and proven to provide superior results for portfolio management, and we are excited to be working with our first clients."
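A minimal sketch of the underlying idea, classical mean-variance asset allocation, is shown below. This is illustrative only and not QCI's QAA; the expected returns, covariance matrix and risk-aversion parameter are all hypothetical:

```python
import numpy as np

# Mean-variance allocation: maximize w.mu - (risk_aversion/2) * w.cov.w.
# All inputs are hypothetical for illustration.
mu = np.array([0.08, 0.05, 0.12])    # expected annual returns of 3 assets
cov = np.diag([0.04, 0.01, 0.09])    # covariance (independent assets here)
risk_aversion = 3.0

# Unconstrained optimum solves (risk_aversion * cov) w = mu, then project
# to long-only, fully invested weights.
w = np.linalg.solve(risk_aversion * cov, mu)
w = np.clip(w, 0, None)
w /= w.sum()
print(w.round(3))
```

Real-world versions add constraints (sector caps, turnover limits, integer lot sizes) that can make the problem NP-hard, which is the motivation for heuristic and quantum-inspired solvers.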

Xanadu Receives $4.4M Investment from SDTC to Advance its Photonic Quantum Computing Technology – Quantaneo, the Quantum Computing Source

Xanadu, a Canadian quantum hardware and technology company, has received a $4.4M investment from Sustainable Development Technology Canada (SDTC). The investment will expedite the development of Xanadu's photonic quantum computers and make them available over the cloud. This project will also further the company's overall progress towards the construction of energy-efficient universal quantum computers.

"Canadian cleantech entrepreneurs are tackling problems across Canada and in every sector. I have never been more positive about the future. The quantum hardware technology that Xanadu is building will develop quantum computers with the ability to solve extremely challenging computational problems, completing chemical calculations in minutes which would otherwise require a million CPUs in a data center," said Leah Lawrence, President and CEO, Sustainable Development Technology Canada.

Despite efforts to improve the power efficiency of traditional computing methods, the rapid growth of data centres and cloud computing presents a major source of new electricity consumption. In comparison to classical computing, quantum computing systems have the benefit of performing certain tasks and algorithms at an unprecedented rate. This will ultimately reduce the requirements for electrical power and the accompanying air and water emissions associated with electricity production.

Xanadu is developing a unique type of quantum computer, based on photonic technology, which is inherently more power-efficient than electronics. Xanadu's photonic approach uses laser light to carry information through optical chips, rather than the electrons or ions used by their competitors. By using photonic technology, Xanadu's quantum computers will one day have the ability to perform calculations at room temperature, and eliminate the bulky and power-hungry cooling systems required by most other types of quantum computers.

The project will be undertaken by Xanadu's team of in-house scientists, with collaboration from the University of Toronto and Swiftride. The project will be carried out over three years and will encompass the development of Xanadu's architecture, hardware, software and client interfaces with the overall goal of expediting the development of the company's technology, and demonstrating the practical benefits of quantum computing for users and customers by the end of 2022.

"We are thrilled by the recognition and support that we are receiving from SDTC for the development of our technology. We firmly believe that our unique, photonic-based approach to quantum computing will deliver both valuable insights and tangible environmental benefits for our customers and partners," said Christian Weedbrook, CEO of Xanadu.

The rest is here:
Xanadu Receives $4.4M Investment from SDTC to Advance its Photonic Quantum Computing Technology - Quantaneo, the Quantum Computing Source

Quantum Computing market 2019 |global industry analysis by trends, size, share, company overview, growth and forecast by 2024 | latest research report…

A report on the Quantum Computing market, added by AlexaReports.com, highlights the current and upcoming development patterns of this business, together with details on the various geographies that make up the regional scope of the Quantum Computing market. The report also covers supply-and-demand analysis, industry share, growth statistics and the investments of major players.

The latest study of the Quantum Computing industry includes a comprehensive analysis of the sector alongside a detailed segmentation of this vertical. According to the report, the Quantum Computing market is anticipated to generate noteworthy returns over the forecast period, while recording an exceptional year-on-year growth rate in the coming years.

The research study analyzes the Quantum Computing market and presents estimations of profit projections, market size, sales capacity and numerous other pivotal parameters. It also assesses the business segments as well as the driving factors affecting the revenue size of this industry.

Important companies covered in this report are: D-Wave Systems Inc., QxBranch LLC, International Business Machines Corporation (IBM), Cambridge Quantum Computing Ltd, 1QB Information Technologies Inc., QC Ware Corp., MagiQ Technologies Inc., Station Q (Microsoft Corporation), Rigetti Computing and Google Inc. (Research at Google).

By revenue source: Hardware, Software, Services

By application: Simulation, Optimization, Sampling

By industry: Defense, Healthcare & Pharmaceuticals, Chemicals, Banking & Finance, Energy & Power

Elaborating the market with respect to the geographical landscape: the research report contains a fairly widespread analysis of the geographical landscape of the Quantum Computing market, which is divided into the regions of North America, Europe, Asia-Pacific, South America, and the Middle East and Africa, and includes several parameters relating to each region's contribution.

Request a sample Report of Quantum Computing Market at: https://www.alexareports.com/report-sample/140107

Fundamental insights about the sales generated by each region, as well as its registered market share, are covered in the research document. The revenues and growth rate that each region will record over the projected duration are also detailed in the report.

A brief outline of the major takeaways of the Quantum Computing market report is given below. A thorough overview of the competitive backdrop of the market, encompassing the leading firms, is elaborated in the study. A concise synopsis of all the manufacturers, the products they have developed and the scope of each product's applications has been included. The report positions the organizations on the basis of their standing in the industry as well as the sales accrued by each manufacturer. Also included are the firms' gross margins and pricing models. The market's product spectrum and its constituent types are covered as well.

Information about these products, including the market share each will accrue in the industry over the forecast period, is included in the study. The report details the sales registered by each product as well as the revenues earned over the foreseeable duration, and highlights the application landscape of the Quantum Computing market.

The report lists the market share accrued by each application segment, together with the revenues accumulated by these applications and the sales projections for the forecast timeframe. The study also addresses important factors such as competition patterns and the market concentration rate. Comprehensive information about the sales channels, both direct and indirect, that producers adopt to promote their products is provided. The evaluation of the Quantum Computing market finds that this industry is anticipated to generate substantial revenue over the projected timeframe. The report includes supplementary data on market dynamics, such as potential growth opportunities, challenges present in this vertical and the factors affecting the business sphere.

Ask for Discount on Report at: https://www.alexareports.com/check-discount/140107

Some of the major highlights of the table of contents:
Development trend analysis of the Quantum Computing market
Global market trend analysis
Global market size (volume and value) forecast, 2019-2024
Marketing channel: direct marketing; indirect marketing; Quantum Computing customers
Market dynamics: market trends; opportunities; market drivers; challenges; influence factors
Methodology/research approach: research programs and design; market size estimation; market breakdown and data triangulation; data sources

Follow us:
https://www.linkedin.com/company/alexa-report
https://www.facebook.com/alexareportsinc
https://twitter.com/Alexa_Reports

Go here to read the rest:
Quantum Computing market 2019 |global industry analysis by trends, size, share, company overview, growth and forecast by 2024 | latest research report...

U of T’s Peter Wittek, who will be remembered at Feb. 3 event, on why the future is quantum – News@UofT

In September 2019, Peter Wittek, an assistant professor at the University of Toronto, went missing during a mountaineering expedition in the Himalayas after reportedly being caught in an avalanche. A search and rescue mission was launched, but conditions were very difficult and Wittek was not found.

"Peter's loss is keenly felt," said Professor Ken Corts, acting dean of the Rotman School of Management. "He was the Founding Academic Director of the CDL Quantum Stream, a valued instructor in the MMA program, data scientist in residence with the TD Management Data and Analytics Lab, an exceptional contributor to Rotman and U of T, and a wonderful colleague."

A ceremony to remember Wittek will take place on Feb. 3 from 3 to 4:30 pm in Desautels Hall at the Rotman School of Management.

Quantum computing and quantum machine learning, an emerging field that counted Wittek as one of its few experts, was the topic of his final interview in Rotman Management Magazine. It is reprinted below:

You oversee the Creative Destruction Lab's Quantum stream, which seeks entrepreneurs pursuing commercial opportunities at the intersection of quantum computing and machine learning. What do those opportunities look like?

We've been running this stream for three years now, and we were definitely the first to do this in an organized way. However, the focus has shifted slightly. We are now interested in looking at any application of quantum computing.

These are still very early days for quantum computing. To give you a sense of where we are at, some people say it's like the state of digital computing in the 1950s, but I'd say it's more like the 1930s. We don't even agree yet on what the architecture should look like and, as a result, we are very limited with respect to the kind of applications we can build.

As a result, focusing on quantum is still quite risky. Nevertheless, so far we have had 45 companies complete our program. Not all of them survived, but a good dozen of them have raised funding. If you look at the general survival rate for AI start-ups, our record is roughly the same, and given how new this technology is, that is pretty amazing.

What are the successful start-ups doing? Can you give an example of the type of problems they're looking to solve?

At the moment I would say the main application areas are logistics and supply chain. Another promising area is life sciences, where all sorts of things can be optimized with this technology. For instance, one of our companies, ProteinQure, is folding proteins with quantum computers.

Finance is another attractive area for these applications. In the last cohort we had a company that figured out a small niche problem where they had both the data and the expertise to provide something new and innovative; they are in the process of raising money right now. The other area where quantum makes a lot of sense is in material discovery. The reason we ever even thought of building these computers was to understand quantum materials, back in the 1980s. Today, one of our companies is figuring out how to discover new materials using quantum processing units instead of traditional supercomputers.

We have a company called Agnostic, which is doing encryption and obfuscation for quantum computers. Right now, IBM, Rigetti Computing and D-Wave Systems are building quantum computers for individual users. They have access to everything that you do on the computer and can see all the data that you're sending. But if you're building a commercial application, obviously you will want to hide that. Agnostic addresses this problem by obfuscating the code you are running. One application we've seen in the life sciences is a company called EigenMed, which addresses primary care. They provide novel machine learning algorithms for primary care by using quantum-enhanced sampling algorithms.

We also seed companies that don't end up using quantum computing. They might try out a bunch of things and discover that it doesn't work for the application they have in mind, and they end up being 100 per cent classical. StratumAI is an example of this. It uses machine learning to map out the distribution of ore bodies under the ground. The mining industry is completely underserved by technology, and this company figured out that to beat the state of the art by a significant margin, it didn't even need quantum. They just used classical machine learning, and they already have million-dollar contracts.

Which industries will be most affected by this technology?

Life sciences will be huge because, as indicated, it often involves complex networks and probability distributions, and these are very difficult to analyze with classical computers. Given the way quantum computers work, this seems to be a very good fit, so that is where I expect the first killer app to come from. One company, Entropica Labs, is looking at various interactions of several genomes to identify how the combined effects cause certain types of disease. This is exactly the sort of problem that is a great fit for a quantum computer.

You touched on quantum applications in primary care. If I walked into a doctor's office, how would that affect me?

It's tricky because, like mining, primary care is vastly underserved by technology. So, if you were to use any machine learning, you would only do better. But EigenMed was actually founded by an MD. He realized that there are certain machine learning methods we don't use simply because their computational requirements are too high, but that happen to be a very good fit for primary care, because the questions you can ask the computer are similar to what a GP would ask.

For instance, if a patient walks in with a bunch of symptoms, you can ask, "What is the most likely disease?" and "What are the most likely other symptoms that I should verify to make sure it is what I suspect?" These are the kinds of probabilistic questions that are hard to ask of current neural network architectures, but they are exactly the kind of questions that probabilistic graphical models handle well.
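The two questions map directly onto inference in a probabilistic graphical model: computing a posterior over diseases given the observed symptoms, then ranking the unobserved symptoms by their probability under that posterior. A minimal, purely classical sketch using a hypothetical naive-Bayes model (all diseases, symptoms and numbers are invented for illustration, not clinical):

```python
import math

# Hypothetical toy model: priors P(disease) and likelihoods P(symptom | disease).
p_disease = {"flu": 0.3, "cold": 0.6, "covid": 0.1}
p_symptom = {                      # P(symptom present | disease)
    "fever":   {"flu": 0.9, "cold": 0.2, "covid": 0.8},
    "cough":   {"flu": 0.6, "cold": 0.7, "covid": 0.9},
    "anosmia": {"flu": 0.1, "cold": 0.1, "covid": 0.7},
}

def posterior(observed):
    """P(disease | all observed symptoms present), by Bayes' rule."""
    joint = {d: p_disease[d] * math.prod(p_symptom[s][d] for s in observed)
             for d in p_disease}
    z = sum(joint.values())
    return {d: p / z for d, p in joint.items()}

observed = ["fever"]
post = posterior(observed)

# "What is the most likely disease?"
likely = max(post, key=post.get)

# "Which other symptom should I verify?" -- the unobserved symptom that is
# most probable when marginalized over the disease posterior.
candidates = {s: sum(post[d] * p_symptom[s][d] for d in post)
              for s in p_symptom if s not in observed}
next_check = max(candidates, key=candidates.get)
print(likely, next_check)
```

Exact inference in general graphical models scales exponentially with model size, which is the computational burden Wittek alludes to and the reason quantum-enhanced sampling is attractive here; the toy model above stays tractable only because it is tiny.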

Are physicians and other health-care providers open to embracing this technology, or do they feel threatened by it?

First of all, health care is a heavily regulated market, so you need approval for everything. That's not always easy to get and, as a result, it can be very difficult to obtain data. This is the same problem that any machine learning company faces. Fine, they have this excellent piece of technology and they've mastered it, but if you don't have any good data, you don't have a company. I see that as the biggest obstacle to machine learning-based progress in health care and life sciences.

You have said that QML has the potential to bring about the next wave of technology shock. Any predictions as to what that might look like?

I think it's going to be similar to what happened with deep learning. The academic breakthrough happened about nine years ago, but it took a long time to get into the public discussion. This is currently happening with AI, which, at its core, is actually just very simple pattern recognition. It's almost embarrassing how simplistic AI is, and yet it is already changing entire industries.

Quantum is next: not just quantum machine learning but quantum computing in general. Breakthroughs are happening every day, both on the hardware side and in the kind of algorithms you can build with quantum computers. But it's going to take another 10 years until it gets into public discussions and starts to disrupt industries. The companies we are seeding today are going to be the ones that eventually disrupt industries.

Alibaba is one of the companies at the forefront of embracing quantum, having already committed $15 billion to it. What is Alibaba after?

First of all, I want to say a huge thank you to Alibaba, because the moment it made that commitment, everyone woke up and said, "Hey, look: the Chinese are getting into quantum computing!" Almost immediately, the U.S. government allocated $1.3 billion to invest in and develop quantum computers, and a new initiative is also coming together in Canada.

The world's oldest commercial quantum computing company is actually from Canada: D-Wave Systems started in 1999 in British Columbia. Over its 20-year history, it managed to raise over $200 million. Then Alibaba came along and announced it was committing $15 billion to quantum, and this completely changed the mindset. People suddenly recognized that there's a lot of potential in this area.

What does Alibaba want from quantum? You could ask the same question of Google, which is also building a quantum computer. For them, it's because they want to make their search and advertisement placement even better than it already is. Eventually, this will be integrated into their core business. I think Alibaba is looking to do something similar. As indicated, one of the main application areas for quantum is logistics and supply chain. Alibaba has a lot more traffic than Amazon. Its orders are smaller, but the volume of goods going through its warehouses is actually much larger. Any kind of improved optimization it can achieve will translate into millions of dollars in savings. My bet is that Alibaba's use of quantum will be applied to something that is critical to its core operation.

The mission of CDL's Quantum stream is that, by 2022, it will have produced more revenue-generating quantum software companies than the rest of the world combined. What is the biggest challenge you face in making that a reality?

People are really waking up to all of this. There is already a venture capital firm that focuses exclusively on quantum technologies. So, the competition is steep, but we are definitely leading in terms of the number of companies created. In Canada, the investment community is a bit slow to put money into these ventures. But every year we are recruiting better and better people and the cohorts are more and more focused and, as a result, I think we are going to see more and more success stories.

It seems like everyone is interested in quantum and thinking about investing in it, but they are all waiting for somebody else to make the first move. I'm waiting for that barrier to break and, in the meantime, we are making progress. Xanadu just raised $32 million in Series A financing, which indicates that it has shown progress in building its business model and demonstrated the potential to grow and generate revenue. ProteinQure raised a seed round of around $4 million. And another company, BlackBrane, raised $2 million. So, already, there are some very decent financing rounds happening around quantum. It will take lots of hard work, but I believe we will reach our goal.

Peter Wittek was an Assistant Professor at the Rotman School of Management and Founding Academic Director of the Creative Destruction Lab's Quantum stream. The author of Quantum Machine Learning: What Quantum Computing Means to Data Mining (Academic Press, 2016), he was also a Faculty Affiliate at the Vector Institute for Artificial Intelligence and the Perimeter Institute for Theoretical Physics.

This article appeared in the Winter 2020 issue of Rotman Management Magazine. Published by the University of Toronto's Rotman School of Management, Rotman Management explores themes of interest to leaders, innovators and entrepreneurs.

See the article here:
U of T's Peter Wittek, who will be remembered at Feb. 3 event, on why the future is quantum - News@UofT