bp Joins the IBM Quantum Network to Advance Use of Quantum Computing in Energy – HPCwire

LONDON, Feb. 15, 2021 - IBM today announced that bp has joined the IBM Quantum Network to advance the use of quantum computing in the energy industry.

By joining the IBM Quantum Network as an Industry Partner, bp will have access to IBM's quantum expertise and software, as well as cloud-based access to the most advanced quantum computers available. This includes access to a premium 65-qubit quantum computer, the largest universal quantum system available to industry today, and an important milestone on the IBM Quantum roadmap to a 1,000-plus qubit system, targeted for the end of 2023.

bp will work with IBM to explore using quantum computing to solve business and engineering challenges, and to explore potential applications for driving efficiencies and reducing carbon emissions.

"bp's ambition is to become a net zero company by 2050 or sooner and to help the world get to net zero. Next-generation computing capabilities such as quantum computing will assist in solving the science and engineering challenges we will face, enabling us to reimagine energy and design new lower carbon products," said Morag Watson, senior vice president, digital science and engineering for bp.

Quantum computing has the potential to be applied in areas such as: modelling the chemistry and build-up of various types of clay in hydrocarbon wells (a crucial factor in efficient hydrocarbon production); analyzing and managing the fluid dynamics of wind farms; optimizing autonomous robotic facility inspection; and helping create opportunities not yet imagined to deliver the clean energy the world wants and needs.

In 2020, bp announced its net zero ambition and its new strategy. By the end of this decade, it aims to have developed around 50 gigawatts of net renewable-generating capacity (a 20-fold increase), increased annual low carbon investment 10-fold to around $5 billion, and cut its oil and gas production by 40%.

Joining the IBM Quantum Network will enhance bp's ability to leverage quantum advances and applications as they emerge, and then influence how those breakthroughs can be applied to its industry and the energy transition.

"bp joins a rapidly growing number of clients working with IBM to explore quantum computing to help accelerate the discovery of solutions to some of today's biggest challenges," added Dario Gil, Senior Vice President and Director of IBM Research. "The energy industry is ripe with opportunities to see value from the use of quantum computing through the discovery of new materials designed to improve the generation, transfer, and storage of energy."

bp joins more than 130 members of the IBM Quantum Network, a global community of Fortune 500 companies, start-ups, academic institutions and research labs working to advance quantum computing and explore practical applications. Together, members of the Network and IBM Quantum teams are researching and exploring how quantum computing will help a variety of industries and disciplines, including finance, energy, chemistry, materials science, optimization and machine learning, among many others.

For more information about the IBM Quantum Network, as well as a full list of all partners, members, and hubs, visit https://www.research.ibm.com/ibm-q/network/.

IBM Quantum Network is a trademark of International Business Machines Corporation.

About bp

bp's purpose is to reimagine energy for people and our planet. It has set out an ambition to be a net zero company by 2050, or sooner, and to help the world get to net zero, and recently announced its strategy for delivering on that ambition. For more information, visit bp.com.

About IBM Quantum

IBM Quantum is an industry-first initiative to build universal quantum systems for business and science applications. For more information about IBM's quantum computing efforts, please visit www.ibm.com/ibmq.

Source: IBM

Read more here:
bp Joins the IBM Quantum Network to Advance Use of Quantum Computing in Energy - HPCwire

The Fourth Industrial Revolution: AI, Quantum, and IoT Impacts on Cybersecurity – Security Boulevard

Technology changes at a breakneck pace, and to be of any use, the security we rely on to protect that technology must change alongside it.

Cybersecurity solutions, in particular, must keep up with the evolving needs of hybrid enterprise networks that connect an ever-expanding mesh of cloud devices, on-prem legacy hardware and everything in between.

The next cybersecurity challenge lies with the advances in quantum computing that are set to revolutionize tech while simultaneously equipping threat actors with a new arsenal of cyberweapons.

The fourth industrial revolution is upon us. It's a bold claim: are we really about to usher in an era as potentially impactful as the steam engine, the age of science and mass production, and the initial rise of digital technology?

Well, yes. According to several high-profile industry experts who spoke at the Consumer Electronics Show (CES) 2021, advances in artificial intelligence (AI) and quantum computing are set to fundamentally change the way the world engages with technology.

As an emerging concept, the fourth industrial revolution has yet to receive a fully consistent definition from the technology industry, but widespread consensus points to a focus on several key elements. It will be marked by fundamental advances in, and interconnectivity between, fields like artificial intelligence (AI), quantum computing and the internet of things (IoT).

Tying them all together is quantum computing, which we can define as... well, it's not particularly simple to explain quantum computing for most of us. Even MIT, while trying to explain it like we're five years old, refers to quantum computing as technology that "harnesses some of the almost-mystical phenomena of quantum mechanics."

Still, it's good to develop a high-level understanding so that we can view the impact on cybersecurity within a more informed context. The MIT explainer referenced above offers a relatively accessible introduction, as does this Microsoft Azure guide. Without diving deep into a course on qubits, superposition and entanglement, however, we can also gain insight by considering how enterprises are already using quantum computing.
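Before those examples, the superposition idea can be seen in miniature with a few lines of linear algebra. This is a toy sketch (assuming Python with numpy), not a quantum computer: a qubit is a two-component vector of amplitudes, and a Hadamard gate puts it into an equal superposition.

import numpy as np

# A qubit is a 2-component complex vector; measurement probabilities are the
# squared magnitudes of its amplitudes (the Born rule).
ket0 = np.array([1, 0], dtype=complex)              # definite state |0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

superposed = hadamard @ ket0                        # equal superposition of |0> and |1>
probabilities = np.abs(superposed) ** 2

print(probabilities)                                # [0.5 0.5]: a 50/50 coin until measured

Real quantum hardware manipulates vast numbers of such amplitudes at once, which is where both the computational power and the difficulty of explanation come from.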

Volkswagen and Daimler, for example, are using quantum supercomputers to improve electric vehicle batteries based on chemical simulations. Simulating, at a molecular level, the behavior of matter is one way we will fundamentally change our approach to problem-solving in the age of quantum computing.

Quantum computing is based on technology we've yet to fully harness. However, the same constant remains true when it comes to bad actors: whatever the good guys understand about quantum computing, the bad guys do, too.

Unfortunately, there will always be an army of cyber criminals standing by, ready to apply their knowledge and talents to nefarious activity. It's safe to say that vulnerabilities will plague quantum systems just as they've plagued every other next-generation system.

In order for cybersecurity solutions to adequately guard quantum networks, they will need to address several key factors.

While each of these issues will require specific high-level and granular solutions, networks equipped with true self-learning AI capabilities will fare better when monitoring network activity, even as it occurs at whirlwind, quantum speeds.

MixMode's predictive, proactive, efficient AI gives organizations a fighting chance at combating modern actors. Rules-based approaches are doomed to fail against cyberthreats in the quantum space.

On one level, it's a simple matter of speed. The systems of tomorrow (and many of the systems of today) will move too quickly for modern SOCs to keep their security platforms up to date. Context-aware AI must live within enterprise systems in order to detect anomalies as they occur in such rapidly changing environments.

MixMode is ready to face quantum threats by thriving within quantum networks. MixMode is data- and feed-agnostic: it can operate effectively and independently regardless of data format and type.

That matters because systems are rapidly expanding and scaling to allow for the increased data inputs organizations will need to monitor. For example, we can expect an influx of 5G-enabled IoT sensors and increased remote connections among a workforce forever changed by the 2020 pandemic.

Because MixMode's third-wave, self-supervised AI doesn't need constant babysitting or continual rules-tweaking, the platform will protect quantum systems with an approach proven to identify threats and anomalies in network traffic, log systems, APIs, time-series, cloud data, and beyond.

Learn more about MixMode and set up a demo today.


Excerpt from:
The Fourth Industrial Revolution: AI, Quantum, and IoT Impacts on Cybersecurity - Security Boulevard

Experience: With a PhD, the plan is to expand human knowledge – The Guardian

When Zak Romaszko finished his physics degree at the University of Liverpool, a PhD in computing was his obvious next step. "I have always been fascinated with computers," says the 27-year-old. "I broke my dad's PC when I was younger and he was away in the forces, so I had to fix it myself." His interest grew from there, but Romaszko's choice of focus for his research isn't just any type of computing but the cutting-edge quantum variety.

Thought by many to be the next step in the field, and key to solving complex problems in a manageable amount of time, quantum computers use quantum bits rather than the regular bits used by standard computers.

"It will be able to solve problems that might take computers millions and billions of years in timescales that are more realistic to humans," says Romaszko. "It seemed to be that this would be the way forward in how big calculations would be done in the future."

He found an opportunity to undertake a PhD at the University of Sussex with Prof Winfried Hensinger, a subject expert, linked to making an ion trap quantum computer, the next step in the computers of the future. Romaszko, who is from Barnoldswick in Lancashire, spent four years on the project as part of the university's Ion Quantum Technology group, graduating in June 2020. He has now joined a spin-off company founded by Hensinger called Universal Quantum, which is looking to commercialise the technology to make a large-scale quantum computer.

"My PhD focused on how we would scale this technology from the level we are at now and get to the point where we need to be to make a truly useful quantum computer," he says.

It sounds like science fiction, but Romaszko explains that quantum computers could hold the key to solving some major issues in our world today. "People are looking into things like simulation of chemicals and materials, understanding how medicines interact within the body, and AI applications," he says.

While it may be difficult to grasp the scale of the computing power at work in the quantum realm, Romaszko is thrilled to be pushing the boundaries. "With a PhD you're basically learning about a field and a very narrow area of science that you just plan to push out a little bit further and expand human knowledge. It's really exciting."

Link:
Experience: With a PhD, the plan is to expand human knowledge - The Guardian

Hunger in Milwaukee and the World: What We Can Do About It – Wisconsin Public Radio News

The United Nations Association of Greater Milwaukee invites you to "Hunger in Milwaukee and the World: What We Can Do About It"

A Virtual Zoom Program Featuring a Panel Discussion with Lady Lee Thompson, David Sinclair, & Maureen Fitzgerald, Saturday, February 13, 2021, from 10 AM to 11:30 AM

Free & Open to the Public

Advance registration is required. To register, go to: https://us02web.zoom.us/meeting/register/tZ0vc-ippjsqGN0DiBvtHRhzj17v4myT7r0f. After registering, you will receive a confirmation email with information about joining the meeting. For more information, contact Jerry Rousseau at jerroldbrousseau@gmail.com (email) or 414.228.9282 (phone).

Hunger stalks Milwaukee and the world, and it is getting worse with the COVID-19 pandemic. According to the UN World Food Programme, 135 million people suffer from acute hunger, largely due to man-made conflicts, climate change, and economic downturns. The COVID-19 pandemic threatens to double that number, putting an additional 130 million people at risk of suffering acute hunger. The following panel of local and international experts and activists will share what is important to know about hunger in Milwaukee and the world, and what local and global organizations are doing to diminish food insecurity. You will hear specifics about how Milwaukee and distant places like Yemen are facing a desperate food insecurity crisis. We look forward to your questions and comments following the panel discussion.

Lady Lee Thompson is a 2020-2021 UNA-USA Global Goals Ambassador promoting the UN Sustainable Development Goal of Inclusive Economic Growth, Full and Productive Employment, and Decent Work for All. She is a global advocate for women's self-sufficiency, African diaspora inclusion, youth & women's empowerment, agricultural skill transfer, investment matchmaking for minority business enterprises, gender equality, and responding to the impact of COVID-19 on food security.

David Sinclair is a local community advocate for low-income families on Milwaukee's North Side. He is the Project Program Coordinator for the Cream City Credible Messenger Program at WestCare Foundation and manages the food pantries located at WestCare Foundation and Jeremiah Missionary Baptist Church.

Maureen Fitzgerald is the owner of Maureen Fitzgerald Consulting, a public policy and advocacy resource for nonprofits and governmental agencies. She worked as the Director of Advocacy at Hunger Task Force for the last 11 years. Prior to that, she practiced criminal defense law in Milwaukee. She is a graduate of Marquette University and Marquette University Law School.

Here is the original post:
Hunger in Milwaukee and the World: What We Can Do About It - Wisconsin Public Radio News

New EU Consortium shaping the future of Quantum Computing USA – PRNewswire

Europe has always been excellent in academic research, but over the past few decades commercializing research projects has been slow compared to international competition. This is starting to change with quantum technologies. In one of the largest efforts in Europe and worldwide, Germany announced €2 billion in funding for quantum programs in June 2020, of which €120 million is being invested in this current round of research grants.

Today, IQM announced that a quantum project consortium that includes Europe's leading startups (ParityQC, IQM), industry leaders (Infineon Technologies), research centers (Forschungszentrum Jülich), supercomputing centers (Leibniz Supercomputing Centre), and academia (Freie Universität Berlin) has been awarded €12.4 million from the German Ministry of Education and Research (BMBF) (announcement in German).

The scope of the project is to accelerate commercialization through an innovative co-design concept. The project focuses on application-specific quantum processors, which have the potential to create a fast lane to quantum advantage. The digital-analog concept used to operate the processors will further lay the foundation for commercially viable quantum computers. The project will run for four years and aims to develop a 54-qubit quantum processor.

The project is intended to support the European FET Flagship project EU OpenSuperQ, announced in 2018, which is aimed at designing, building, and operating a quantum information processing system of up to 100 qubits. By deploying digital-analog quantum computing, this consortium adds a new angle to the OpenSuperQ project and widens its scope. With efforts from Munich, Berlin and Jülich, as well as ParityQC from Austria, the project builds bridges and integrates seamlessly into the European quantum landscape.

"The grant from the Federal Ministry of Education and Research of Germanyis a huge recognition of our unique co-design approach for quantum computers. Last year when we established our office in Munich, this was one of our key objectives. The concept allows us to become a system integrator for full-stack quantum computers by bringing together all the relevant players. As Europe's leading startup in quantum technologies, this gives us confidence to further invest in Germany and other European countries" said Dr. Jan Goetz, CEO of IQM Quantum Computers.

As a European technology leader, Germany is taking several steps to lead the quantum technology race. An important part of such leadership is bringing together European startups, industry, research and academic partners. This project will give the quantum landscape in Germany an accelerated push and will create a vibrant quantum ecosystem in the region for the future.

Additional Quotes:

"DAQC is an important project for Germany and Europe. It enables us to take a leading role in the area of quantum technologies. It also allows us to bring quantum computing into one of the prime academic supercomputing centres to more effectively work on the important integration of high-performance computing and quantum computing. We are looking forward to a successful collaboration," said Prof. DrMartinSchulz, Member of the Board of Directors, Leibniz Supercomputing Centre (LRZ).

"The path towards scalable and fully programmable quantum computing will be the parallelizability of gates and building with reduced complexity in order to ensure manageable qubit control. Our ParityQC architecture is the blueprint for a fully parallelizable quantum computer, which comes with the associated ParityOS operating system. With the team of extraordinary members of the DAQC consortium this will allow us to tackle the most pressing and complex industry-relevant optimization problems." saidMagdalena Hauser & Wolfgang Lechner, CEOs & Co-founder ParityQC

"We are looking forward to exploring and realizing a tight connection between hardware and applications, and having DAQC quantum computers as a compatible alternative within the OpenSuperQ laboratory. Collaborations like this across different states, and including both public and private partners, have the right momentum to move quantum computing in Germany forward." saidProf. Frank Wilhelm-Mauch, Director, Institute for Quantum Computing Analytics, Forschungszentrum Jlich

"At Infineon, we are looking forward to collaborating with top-class scientists and leading start-ups in the field of quantum computing in Europe. We must act now if we in Germany and Europe do not want to become solely dependent on American or Asian know-how in this future technology area. We are very glad to be part of this highly innovative project and happy to contribute with our expertise in scaling and manufacturing processes." saidDr.Sebastian Luber, Senior Director Technology & Innovation, Infineon Technologies AG

"This is a hugely exciting project. It is a chance of Europe and Germany to catch up in the development of superconducting quantum computers. I am looking forward to adventures on understanding how such machines can be certified in their precise functioning." said Prof.Jens Eisert, Professor of Quantum Physics, Freie Universitt Berlin

About IQM Quantum Computers:

IQM is the European leader in superconducting quantum computers, headquartered in Espoo, Finland. Since its inception in 2018, IQM has grown to 80+ employees and has established a subsidiary in Munich, Germany, to lead the co-design approach. IQM delivers on-premises quantum computers for research laboratories and supercomputing centers and provides complete access to its hardware. For industrial customers, IQM delivers quantum advantage through a unique application-specific co-design approach. IQM has raised €71 million from VC firms and public grants, and is also building Finland's first quantum computer.

For more information, visit http://www.meetiqm.com.

Registered offices:

IQM Finland Oy
Keilaranta 19
02150 Espoo
Finland
www.meetiqm.com

IQM Germany GmbH
Nymphenburgerstr. 86
80636 München
Germany


Media Contact: Raghunath Koduvayur, Head of Marketing and Communications, [email protected], +358504876509

Photo - https://mma.prnewswire.com/media/1437806/IQM_Quantum_Computers_Founders.jpg
Photo - https://mma.prnewswire.com/media/1437807/IQM_Quantum_computer_design.jpg
Logo - https://mma.prnewswire.com/media/1121497/IQM_Logo.jpg

SOURCE IQM Finland Oy

http://meetiqm.com/contact/

See the rest here:
New EU Consortium shaping the future of Quantum Computing USA - PRNewswire

New Machine Learning Theory Raises Questions About the Very Nature of Science – SciTechDaily

A novel computer algorithm, or set of rules, that accurately predicts the orbits of planets in the solar system could be adapted to better predict and control the behavior of the plasma that fuels fusion facilities designed to harvest on Earth the fusion energy that powers the sun and stars.

The algorithm, devised by a scientist at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL), applies machine learning, the form of artificial intelligence (AI) that learns from experience, to develop the predictions. "Usually in physics, you make observations, create a theory based on those observations, and then use that theory to predict new observations," said PPPL physicist Hong Qin, author of a paper detailing the concept in Scientific Reports. "What I'm doing is replacing this process with a type of black box that can produce accurate predictions without using a traditional theory or law."

Qin (pronounced "Chin") created a computer program into which he fed data from past observations of the orbits of Mercury, Venus, Earth, Mars, Jupiter, and the dwarf planet Ceres. This program, along with an additional program known as a "serving algorithm," then made accurate predictions of the orbits of other planets in the solar system without using Newton's laws of motion and gravitation. "Essentially, I bypassed all the fundamental ingredients of physics. I go directly from data to data," Qin said. "There is no law of physics in the middle."
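As a toy illustration of that data-to-data idea (not Qin's actual discrete field theory code; the orbit, model and library choices here are assumptions for demonstration, using numpy and scikit-learn), one can train a small neural network to map each point of an orbit to the next point, with no law of gravitation anywhere in the model:

import numpy as np
from sklearn.neural_network import MLPRegressor

# Idealized circular-orbit samples stand in for observational data.
t = np.linspace(0, 20 * np.pi, 4000)
orbit = np.column_stack([np.cos(t), np.sin(t)])

X = orbit[:-1]   # position at step k ...
y = orbit[1:]    # ... maps to position at step k + 1

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X, y)

# Roll the model forward: data in, data out, no physics in the middle.
point = orbit[0]
for _ in range(5):
    point = model.predict(point.reshape(1, -1))[0]
    print(point)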

PPPL physicist Hong Qin in front of images of planetary orbits and computer code. Credit: Elle Starkman / PPPL Office of Communications

The program does not happen upon accurate predictions by accident. "Hong taught the program the underlying principle used by nature to determine the dynamics of any physical system," said Joshua Burby, a physicist at the DOE's Los Alamos National Laboratory who earned his Ph.D. at Princeton under Qin's mentorship. "The payoff is that the network learns the laws of planetary motion after witnessing very few training examples. In other words, his code really learns the laws of physics."

Machine learning is what makes computer programs like Google Translate possible. Google Translate sifts through a vast amount of information to determine how frequently one word in one language has been translated into a word in the other language. In this way, the program can make an accurate translation without actually learning either language.
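A toy version of that frequency idea (a hypothetical sketch in Python, far simpler than any real statistical translation system) just counts how often words co-occur across parallel sentence pairs:

from collections import Counter, defaultdict

pairs = [
    ("the cat sleeps", "le chat dort"),
    ("the cat eats", "le chat mange"),
    ("the dog sleeps", "le chien dort"),
]

# Count how often each English word appears alongside each French word.
counts = defaultdict(Counter)
for english, french in pairs:
    for e in english.split():
        for f in french.split():
            counts[e][f] += 1

print(counts["cat"])   # 'chat' and 'le' tie at 2; real systems discount common words

No grammar of either language appears anywhere; the "translation" emerges from counting alone.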

The process also appears in philosophical thought experiments like John Searle's Chinese Room. In that scenario, a person who did not know Chinese could nevertheless "translate" a Chinese sentence into English or any other language by using a set of instructions, or rules, that would substitute for understanding. The thought experiment raises questions about what, at root, it means to understand anything at all, and whether understanding implies that something else is happening in the mind besides following rules.

Qin was inspired in part by Oxford philosopher Nick Bostrom's thought experiment that the universe is a computer simulation. If that were true, then fundamental physical laws should reveal that the universe consists of individual chunks of space-time, like pixels in a video game. "If we live in a simulation, our world has to be discrete," Qin said. The black box technique Qin devised does not require that physicists believe the simulation conjecture literally, though it builds on this idea to create a program that makes accurate physical predictions.

The resulting pixelated view of the world, akin to what is portrayed in the movie The Matrix, is known as a discrete field theory, which views the universe as composed of individual bits and differs from the theories that people normally create. While scientists typically devise overarching concepts of how the physical world behaves, computers just assemble a collection of data points.

Qin and Eric Palmerduca, a graduate student in the Princeton University Program in Plasma Physics, are now developing ways to use discrete field theories to predict the behavior of particles of plasma in fusion experiments conducted by scientists around the world. The most widely used fusion facilities are doughnut-shaped tokamaks that confine the plasma in powerful magnetic fields.

Fusion, the power that drives the sun and stars, combines light elements in the form of plasma (the hot, charged state of matter composed of free electrons and atomic nuclei that represents 99% of the visible universe) to generate massive amounts of energy. Scientists are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

"In a magnetic fusion device, the dynamics of plasmas are complex and multi-scale, and the effective governing laws or computational models for a particular physical process that we are interested in are not always clear," Qin said. "In these scenarios, we can apply the machine learning technique that I developed to create a discrete field theory and then apply this discrete field theory to understand and predict new experimental observations."

This process opens up questions about the nature of science itself. Don't scientists want to develop physics theories that explain the world, instead of simply amassing data? Aren't theories fundamental to physics and necessary to explain and understand phenomena?

"I would argue that the ultimate goal of any scientist is prediction," Qin said. "You might not necessarily need a law. For example, if I can perfectly predict a planetary orbit, I don't need to know Newton's laws of gravitation and motion. You could argue that by doing so you would understand less than if you knew Newton's laws. In a sense, that is correct. But from a practical point of view, making accurate predictions is not doing anything less."

Machine learning could also open up possibilities for more research. "It significantly broadens the scope of problems that you can tackle because all you need to get going is data," Palmerduca said.

The technique could also lead to the development of a traditional physical theory. "While in some sense this method precludes the need of such a theory, it can also be viewed as a path toward one," Palmerduca said. "When you're trying to deduce a theory, you'd like to have as much data at your disposal as possible. If you're given some data, you can use machine learning to fill in gaps in that data or otherwise expand the data set."

Reference: "Machine learning and serving of discrete field theories" by Hong Qin, 9 November 2020, Scientific Reports. DOI: 10.1038/s41598-020-76301-0

Read this article:
New Machine Learning Theory Raises Questions About the Very Nature of Science - SciTechDaily

Using AI and Machine Learning will increase in horti industry – hortidaily.com

The expectation is that in 2021, artificial intelligence and machine learning technologies will continue to become more mainstream. Businesses that haven't traditionally viewed themselves as candidates for AI applications will embrace these technologies.

A great story of machine learning being used in an industry that is not known for its technology investments is that of Makoto Koike. Using Google's TensorFlow, Makoto initially developed a cucumber sorting system using pictures that he took of the cucumbers. With that small step, a machine learning cucumber sorting system was born.
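A minimal sketch of such an image sorter, using TensorFlow's Keras API, might look like the following. The directory layout, image size and the nine output classes are assumptions for illustration; Makoto's actual system differed in its details:

import tensorflow as tf

# Hypothetical folder of labeled cucumber photos, one subdirectory per grade.
train = tf.keras.utils.image_dataset_from_directory(
    "cucumbers/train", image_size=(96, 96), batch_size=32)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=(96, 96, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(9),   # one logit per assumed quality grade
])

model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=["accuracy"])
model.fit(train, epochs=5)

The point of the story is that nothing in this sketch requires a research lab: the tooling is the same whether the images are cucumbers or factory parts.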

Getting started with AI and machine learning is becoming increasingly accessible for organizations of all sizes. Technology-as-a-service companies including Microsoft, AWS and Google all have offerings that will get most organizations started on their AI and machine learning journeys. These technologies can be used to automate and streamline manual business processes that have historically been resource-intensive.

An article on forbes.com claims that, as business leaders continue to refine their processes to support the "new normal" of the COVID-19 pandemic, they should consider where these technologies might help reduce manual, resource-intensive or paper-based processes. Any manual process should be fair game for review for automation possibilities.

Photo source: Dreamstime.com

Follow this link:
Using AI and Machine Learning will increase in horti industry - hortidaily.com

Machine Learning in Medicine Market 2021 to Perceive Biggest Trend and Opportunity by 2028 – KSU | The Sentinel Newspaper

"Machine Learning in Medicine Market Comprehensive Study" is an expert, in-depth investigation of the current state of the worldwide Machine Learning in Medicine industry, with a focus on the global market. The report provides key statistics on the market status of Machine Learning in Medicine producers and is a valuable source of guidance and direction for companies and individuals interested in the business. Overall, the report provides in-depth insight into the 2021-2028 worldwide Machine Learning in Medicine Market, covering all significant parameters.

Free Sample Report @:

https://www.marketresearchinc.com/request-sample.php?id=28540

Key Players in This Report Include: Google, Bio Beats, Jvion, Lumiata, DreaMed, Healint, Arterys, Atomwise, Health Fidelity, Ginger

(Market Size & Forecast, Different Demand Market by Region, Main Consumer Profile etcBrief Summary of Machine Learning in Medicine:

Machine learning in medicine helps in avoiding delays in processing, turn-around time, and redundant operational costs. It is efficient in the management of entire claim administrative processes, such as adjudication, pricing, authorizations, and analytics. It provides real-time claim processing with no wait time for batch processes.

Market Drivers: the rise in the number of patients opting for medical insurance and the increase in premium costs; the surge in the geriatric population with chronic diseases

Market Trend: growth in health insurance claims

Restraints: high cost linked with machine learning in medicine

The Global Machine Learning in Medicine Market segments and market data breakdown are illuminated below: by Type (Integrated Solutions, Standalone Solutions), Application (Healthcare Payers, Healthcare Providers, Other), Delivery Mode (On-Premise, Cloud-Based), Component (Software, Services)

This research report represents a 360-degree overview of the competitive landscape of the Global Machine Learning in Medicine Market. Furthermore, it offers massive data relating to recent trends, technological advancements, tools, and methodologies. The research report analyzes the Global Machine Learning in Medicine Market in a detailed and concise manner for better insights into the businesses.

Regions Covered in the Machine Learning in Medicine Market: The Middle East and Africa (South Africa, Saudi Arabia, UAE, Israel, Egypt, etc.), North America (United States, Mexico & Canada), South America (Brazil, Venezuela, Argentina, Ecuador, Peru, Colombia, etc.), Europe (Turkey, Spain, Netherlands, Denmark, Belgium, Switzerland, Germany, Russia, UK, Italy, France, etc.), Asia-Pacific (Taiwan, Hong Kong, Singapore, Vietnam, China, Malaysia, Japan, Philippines, Korea, Thailand, India, Indonesia, and Australia).

Get Up to 40% Discount on the Report @ https://www.marketresearchinc.com/ask-for-discount.php?id=28540

The research study has taken the help of graphical presentation techniques such as infographics, charts, tables, and pictures. It provides guidelines for both established players and new entrants in the Global Machine Learning in Medicine Market.

The detailed elaboration of the Global Machine Learning in Medicine Market has been provided by applying industry analysis techniques such as SWOT and Porter's Five Forces analysis. Collectively, this research report offers a reliable evaluation of the global market to present the overall framework of businesses.

Attractions of the Machine Learning in Medicine Market Report:
- The report provides granular-level information about the market size, regional market share, historic market (2014-2018) and forecast (2021-2028)
- The report covers in-detail insights about the competitors' overview, company share analysis, key market developments, and their key strategies
- The report outlines drivers, restraints, unmet needs, and trends that are currently affecting the market
- The report tracks recent innovations, key developments and details of start-ups that are actively working in the market
- The report provides a plethora of information about market entry strategies, regulatory framework and reimbursement scenario

Enquire for customization in the Report @ https://www.marketresearchinc.com/enquiry-before-buying.php?id=28540

Key Points Covered in the Table of Content:
Chapter 1 explains the introduction, market review, market risk and opportunities, market driving force and product scope of the Machine Learning in Medicine Market;
Chapter 2 inspects the leading manufacturers (cost structure, raw material) with sales analysis, revenue analysis and price analysis of the Machine Learning in Medicine Market;
Chapter 3 shows the competitive situation among the top producers, with sales, revenue and Machine Learning in Medicine market share in 2021;
Chapter 4 displays the regional analysis of the Global Machine Learning in Medicine Market with revenue and sales of the industry from 2021 to 2028;
Chapters 5, 6 and 7 analyze the key countries (United States, China, Europe, Japan, Korea & Taiwan), with sales, revenue and market share in key regions;
Chapters 8 and 9 exhibit international and regional marketing type analysis, supply chain analysis and trade type analysis;
Chapters 10 and 11 analyze the market by product type and application/end users (industry sales, share and growth rate) from 2021 to 2028;
Chapter 12 shows the Machine Learning in Medicine Market forecast by regions, by type and by application, with revenue and sales, from 2021 to 2028;
Chapters 13, 14 and 15 specify the research findings and conclusion, appendix, methodology and data source, and the Machine Learning in Medicine market buyers, merchants, dealers and sales channel.

Browse for Full Report at @:

Machine Learning in Medicine Market research provides answers to the following key questions:
- What is the expected growth rate of the Machine Learning in Medicine Market?
- What will be the Machine Learning in Medicine Market size for the forecast period, 2021-2028?
- What are the main driving forces responsible for changing the Machine Learning in Medicine Market trajectory?
- Who are the big suppliers that dominate the Machine Learning in Medicine Market across different regions? What are their wins to stay ahead in the competition?
- What are the Machine Learning in Medicine Market trends business owners can rely upon in the coming years?
- What are the threats and challenges expected to restrict the progress of the Machine Learning in Medicine Market across different countries?

About Us

Market Research Inc is farsighted in its view and covers massive ground in global research. Local or global, we keep a close check on both markets. Trends and concurrent assessments sometimes overlap and influence each other. When we say market intelligence, we mean a deep and well-informed insight into your products, market, marketing, competitors, and customers. Market research companies are leading the way in nurturing global thought leadership. We help your product/service become the best it can be with our informed approach.

Contact Us

Market Research Inc

Kevin

51 Yerba Buena Lane, Ground Suite,

Inner Sunset San Francisco, CA 94103, USA

Call Us: +1 (628) 225-1818

Write to Us: sales@marketresearchinc.com

https://www.marketresearchinc.com

Read more here:
Machine Learning in Medicine Market 2021 to Perceive Biggest Trend and Opportunity by 2028 KSU | The Sentinel Newspaper - KSU | The Sentinel...

The head of JPMorgan’s machine learning platform explained what it’s like to work there – eFinancialCareers

For the past few years, JPMorgan has been busy building out its machine learning capability under Daryush Laqab, its San Francisco-based head of AI platform product management, who was hired from Google in 2019. Last time we looked, the bank seemed to be paying salaries of $160-$170k to new joiners on Laqab's team.

If that sounds appealing, you might want to watch the video below so that you know what you're getting into. Recorded at the AWS re:Invent conference in December, it's only just made it to YouTube. The video is billed as a day in the life of JPMorgan's machine learning data scientists, but Laqab arguably does a better job of highlighting some of the constraints data professionals at all banks have to work under.

"There are some barriers to smooth data science at JPMorgan," he explains - a bank is not the same as a large technology firm.

For example, data scientists at JPMorgan have to check that data is authorized for use, says Laqab: "They need to go through a process to log that use and make sure that they have the adequate approvals for that intent in terms of use."

They also have to deal with the legacy infrastructure issue: "We are a large organization, we have a lot of legacy infrastructure," says Laqab. "Like any other legacy infrastructure, it is built over time, it is patched over time. These are tightly integrated, so moving part or all of that infrastructure to public cloud, replacing rule-based engines with AI/ML-based engines. All of that takes time and brings inertia to the innovation."

JPMorgan's size and complexity is another source of inertia as multiple business lines in multiple regulated entities in different regulated environments need to be considered. "Making sure that those regulatory obligationsare taken care of, again, slows down data science at times," saysLaqab.

And then there are more specific regulations, such as those concerning model governance. At JPMorgan, a machine learning model can't go straight into a production environment. "It needs to go through a model review and a model governance process," says Laqab, "to make sure we have another set of eyes that looks at how that model was created, how that model was developed." And then there are software governance issues too.

Despite all these hindrances, JPMorgan has already productionized AI models and built an "Omni AI ecosystem," which Laqab heads, to help employees identify and ingest minimum viable data so that they can build models faster. Laqab says the bank saved $150m in expenses in 2019 as a result. JPMorgan's AI researchers are now working on everything from FAQ bots and chat bots, to NLP search models for the bank's own content, pattern recognition in equities markets and email processing. The breadth of work on offer is considerable. "We play in every market that is out there," says Laqab.

The bank has also learned that the best way to structure its AI team is to split people into data scientists, who train and create models, and machine learning engineers, who operationalize models, says Laqab. Before you apply, you might want to consider which you'd rather be.

Photo by NeONBRAND on Unsplash

Have a confidential story, tip, or comment you'd like to share? Contact: sbutcher@efinancialcareers.com in the first instance. Whatsapp/Signal/Telegram also available. Bear with us if you leave a comment at the bottom of this article: all our comments are moderated by human beings. Sometimes these humans might be asleep, or away from their desks, so it may take a while for your comment to appear. Eventually it will, unless it's offensive or libelous (in which case it won't).

Read this article:
The head of JPMorgan's machine learning platform explained what it's like to work there - eFinancialCareers

Mental health diagnoses and the role of machine learning – Health Europa

It is common for patients with psychosis or depression to experience symptoms of both conditions, which has meant that traditionally, mental health diagnoses have been given for a primary illness with secondary symptoms of the other.

Making an accurate diagnosis often poses difficulties for mental health clinicians, and diagnoses often do not accurately reflect the complexity of individual experience or neurobiology. For example, a patient being diagnosed with psychosis will often have depression regarded as a secondary condition, with more focus on the psychosis symptoms, such as hallucinations or delusions; this has implications for treatment decisions.

A team at the University of Birmingham's Institute for Mental Health and Centre for Human Brain Health, along with researchers at the European Union-funded PRONIA consortium, explored the possibility of using machine learning to create highly accurate models of "pure" forms of both illnesses, and of using these models to investigate the diagnostic accuracy of a cohort of patients with mixed symptoms. The results of this study have been published in Schizophrenia Bulletin.

Paris Alexandros Lalousis, lead author, explains: "The majority of patients have co-morbidities, so people with psychosis also have depressive symptoms and vice versa. That presents a big challenge for clinicians in terms of diagnosing and then delivering treatments that are designed for patients without co-morbidity. It's not that patients are misdiagnosed, but the current diagnostic categories we have do not accurately reflect the clinical and neurobiological reality."

The researchers analysed questionnaire responses and detailed clinical interviews, as well as data from structural magnetic resonance imaging, from a cohort of 300 patients taking part in the study. From this group, they identified small subgroups of patients who could be classified as suffering either from psychosis without any symptoms of depression, or from depression without any psychotic symptoms.

With the goal of developing a precise disease profile for each patient and testing it against their diagnosis to see how accurate it was, the research team used the collected data to identify machine learning models of "pure" depression and "pure" psychosis. They were then able to use machine learning methods to apply these models to patients with symptoms of both illnesses.
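Schematically, the design resembles the sketch below (invented data and features, scikit-learn assumed; the actual study drew on clinical interviews and neuroimaging, not this toy setup): a classifier is fitted on the "pure" subgroups, and patients with mixed symptoms are then scored along the resulting depression-psychosis axis.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Stand-in feature vectors for the two "pure" subgroups.
pure_depression = rng.normal(loc=-1.0, size=(40, 5))
pure_psychosis = rng.normal(loc=+1.0, size=(40, 5))

X = np.vstack([pure_depression, pure_psychosis])
y = np.array([0] * 40 + [1] * 40)   # 0 = depression, 1 = psychosis

axis = LogisticRegression().fit(X, y)

# Comorbid patients receive a probability, i.e. a position on the axis,
# rather than a hard either/or label.
mixed_patients = rng.normal(loc=-0.3, size=(10, 5))
print(axis.predict_proba(mixed_patients)[:, 1])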

The team discovered that patients with depression as a primary illness were more likely to have accurate diagnoses, whereas patients with psychosis and depression had symptoms that most frequently leaned towards the depression dimension. This may suggest that depression plays a greater part in the illness than had previously been thought.

Lalousis added: "There is a pressing need for better treatments for psychosis and depression, conditions which constitute a major mental health challenge worldwide. Our study highlights the need for clinicians to understand better the complex neurobiology of these conditions, and the role of co-morbid symptoms; in particular, considering carefully the role that depression is playing in the illness."

"In this study we have shown how using sophisticated machine learning algorithms, which take into account clinical, neurocognitive, and neurobiological factors, can aid our understanding of the complexity of mental illness. In the future, we think machine learning could become a critical tool for accurate diagnosis. We have a real opportunity to develop data-driven diagnostic methods. This is an area in which mental health is keeping pace with physical health, and it's really important that we keep up that momentum."

Continued here:
Mental health diagnoses and the role of machine learning - Health Europa

5 Ways the IoT and Machine Learning Improve Operations – BOSS Magazine

By Emily Newton

The Internet of Things (IoT) and machine learning are two of the most disruptive technologies in business today. Separately, both of these innovations can bring remarkable benefits to any company. Together, they can transform your business entirely.

The intersection of IoT devices and machine learning is a natural progression. Machine learning needs large pools of relevant data to work at its best, and the IoT can supply it. As adoption of both soars, companies should start using them in conjunction.

Here are five ways the IoT and machine learning can improve operations in any business.

Around 25% of businesses today use IoT devices, and this figure will keep climbing. As companies implement more of these sensors, they add places where they can gather data. Machine learning algorithms can then analyze this data to find inefficiencies in the workplace.

Looking at various workplace data, a machine learning program could see where a company spends an unusually high amount of time. It could then suggest a new workflow that would reduce the effort employees expend in that area. Business leaders may not have ever realized this was a problem area without machine learning.

Machine learning programs are skilled at making connections between data points that humans may miss. They can also make predictions 20 times earlier than traditional tools, and do so with more accuracy. With IoT devices feeding them more data, they'll only become faster and more accurate.

Machine learning and the IoT can also automate routine tasks. Business process automation (BPA) leverages AI to handle a range of administrative tasks so workers don't have to. As IoT devices feed more data into these programs, they become even more effective.

Over time, technology like this has contributed to a 40% productivity increase in some industries. Automating and streamlining tasks like scheduling and record-keeping frees employees to focus on other, value-adding work. BPA's potential doesn't stop there, either.

BPA can automate more than straightforward data manipulation tasks. It can talk to customers, plan and schedule events, run marketing campaigns and more. With more comprehensive IoT implementation, it would have access to more areas, becoming even more versatile.

One of the most promising areas for IoT implementation is in the supply chain. IoT sensors in vehicles or shipping containers can provide companies with critical information like real-time location data or product quality. This data alone improves supply chain visibility, but paired with machine learning, it could transform your business.

Machine learning programs can take this real-time data from IoT sensors and put it into action. They could predict possible disruptions and warn workers so they can respond accordingly. These predictive analytics could save companies the all-too-familiar headache of supply chain delays.

UPS' Orion tool is the gold standard for what machine learning can do for supply chains. The system has saved the shipping giant 10 million gallons of fuel a year by adjusting routes on the fly based on traffic and weather data.

If a company can't understand the vulnerabilities it faces, business leaders can't make fully informed decisions. IoT devices can provide the data businesses need to get a better understanding of these risks. Machine learning can take it a step further and find points of concern in this data that humans could miss.

IoT devices can gather data about the workplace or customers that machine learning programs then process. For example, Progressive has made more than 1.7 trillion observations about its customers' driving habits through Snapshot, an IoT tracking device. These analytics help the company adjust clients' insurance rates based on the dangers their driving presents.

Business risks aren't the only hazards the Internet of Things and machine learning can predict. IoT air quality sensors could alert businesses when to change HVAC filters to protect employee health. Similarly, machine learning cybersecurity programs could sense when hackers are trying to infiltrate a company's network.
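As a minimal sketch of that kind of monitoring (simulated readings and an assumed scikit-learn setup, not any particular vendor's product), an unsupervised model can be fitted on normal sensor data and then flag readings that fall outside the learned pattern:

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Simulated normal readings from an IoT feed, e.g. requests per second.
normal_traffic = rng.normal(loc=100, scale=5, size=(500, 1))
detector = IsolationForest(contamination=0.01, random_state=1).fit(normal_traffic)

new_readings = np.array([[101.0], [98.0], [250.0]])   # the last value is suspicious
print(detector.predict(new_readings))                 # 1 = normal, -1 = anomaly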

Another way the IoT and machine learning could transform your business is by eliminating waste. Data from IoT sensors can reveal where the company could be using more resources than it needs. Machine learning algorithms can then analyze this data to suggest ways to improve.

One of the most common culprits of waste in businesses is energy. Thanks to various inefficiencies, 68% of power in America ends up wasted. IoT sensors can measure where this waste is happening, and with machine learning, adjust to stop it.

Machine learning algorithms in conjunction with IoT devices could restrict energy use, so processes only use what they need. Alternatively, they could suggest new workflows or procedures that would be less wasteful. While many of these steps may seem small, they add up to substantial savings.

Without the IoT and machine learning, businesses can't reach their full potential. These technologies enable savings companies couldn't achieve otherwise. As they advance, they'll only become more effective.

The Internet of Things and machine learning are reshaping the business world. Those that don't take advantage of them now could soon fall behind.

Emily Newton is the Editor-in-Chief of Revolutionized, a magazine exploring how innovations change our world. She has over three years' experience writing articles in the industrial and tech sectors.

See the article here:
5 Ways the IoT and Machine Learning Improve Operations - BOSS Magazine

Parascript and SFORCE Partner to Leverage Machine Learning Eliminating Barriers to Automation – GlobeNewswire

Longmont, CO, Feb. 09, 2021 (GLOBE NEWSWIRE) -- Parascript, which provides document analysis software processing over 100 billion documents each year, today announced a partnership between Smart-Force (SFORCE) and Parascript to provide a digital workforce that augments operations by combining cognitive Robotic Process Automation (RPA) technology with customers' current investments, for high scalability, improved accuracy and an enhanced customer experience in Mexico and across Latin America.

"Partnering with Smart-Force means we get to help solve some of the greatest digital transformation challenges in Intelligent Document Processing instead of just the low-hanging fruit. Smart-Force is forward-thinking and committed to future-proofing their customers' processes, even with hard-to-automate, unstructured documents where the application of techniques such as NLP is often required," said Greg Council, Vice President of Marketing and Product Management at Parascript. "Smart-Force leverages bots to genuinely collaborate with staff so that the staff no longer have to spend all their time on finding information and performing data entry and verification, even for the most complex multi-page documents that you see in lending and insurance."

Smart-Force specializes in digital transformation by identifying processes in need of automation and implementing RPA to improve those processes so that they run faster without errors. SFORCE routinely enables increased productivity, improves customer satisfaction, and improves staff morale by leveraging the technology of Automation Anywhere, Inc., a leader in RPA, and now Parascript Intelligent Document Processing.

"As intelligent automation technology becomes more ubiquitous, it has created opportunities for organizations to ignite their staff towards new ways of working, freeing up time from manual tasks to focus on creative, strategic projects, what humans are meant to do," said Griffin Pickard, Director of the Technology Alliance Program at Automation Anywhere. "By creating an alliance with Parascript and Smart-Force, we have enabled customers to advance their automation strategy by leveraging ML and accelerate end-to-end business processes."

"Our focus at SFORCE is on RPA with machine learning to transform how customers are doing things. We don't replace; we complement the technology investments of our customers to improve how they are working," said Alejandro Castrejón, Founder of SFORCE. "We make processes faster and more efficient, and augment their staff capabilities. In terms of RPA processes that focus on complex document-based information, we haven't seen anything approach what Parascript can do."

"We found that Parascript does a lot more than other IDP providers. Our customers need a point-to-point RPA solution. Where Parascript software becomes essential is in extracting and verifying data from complex documents such as legal contracts. Manual data entry and review produces a lot of errors and takes time," said Barbara Mair, Partner at SFORCE. "Using Parascript software, we can significantly accelerate contract execution, customer onboarding and many other processes without introducing errors."

The ability to process simple to very complex documents, such as unstructured contracts and policies, within RPA leveraging FormXtra.AI represents real opportunities for digital transformation across the enterprise. FormXtra.AI and its Smart Learning allow for easy configuration, and by training the systems on client-specific data, the automation is rapidly deployed, with the ability to adapt to new information introduced in dynamic production environments.

About SFORCE, S.A. de C.V.

SFORCE offers services that allow customers to adopt digital transformation at whatever pace the organization needs. SFORCE is dedicated to helping customers get the most out of their existing investments in technology. SFORCE provides point-to-point solutions that combine existing technologies with next generation technology, which allows customers to transform operations, dramatically increase efficiency as well as automate manual tasks that are rote and error-prone, so that staff can focus on high-value activities that significantly increase revenue. From exploring process automation to planning a disruptive change that ensures high levels of automation, our team of specialists helps design and implement the automation of processes for digital transformation. Visit SFORCE.

About Parascript

Parascript software, driven by data science and powered by machine learning, configures and optimizes itself to automate simple and complex document-oriented tasks such as document classification, document separation and data entry for payments, lending and AP/AR processes. Every year, over 100 billion documents involved in banking, insurance, and government are processed by Parascript software. Parascript offers its technology both as software products and as software-enabled services to our partners. Visit Parascript.

Read more from the original source:
Parascript and SFORCE Partner to Leverage Machine Learning Eliminating Barriers to Automation - GlobeNewswire

There Is No Silver Bullet Machine Learning Solution – Analytics India Magazine

A recommendation engine is a class of machine learning algorithm that suggests products, services, or information to users based on analysis of data. Robust recommendation systems are a key differentiator in the operations of big companies like Netflix, Amazon, and ByteDance (TikTok's parent).

Alok Menthe, Data Scientist at Ericsson, gave an informative talk on building custom recommendation engines for real-world problems at the Machine Learning Developers Summit (MLDS) 2021. "Whenever a niche business problem comes in, it has complicated, intertwined ways of working. Standard ML techniques may be inadequate and might not serve the customer's purpose. That is where the need for a custom-made engine comes in. We were also faced with such a problem with our service network unit at Ericsson," he said.

Menthe said the unit wanted to implement a recommendation system to provide suggestions for assignment workflow: a model to delegate incoming projects to the most appropriate team or resource pool.

Credit: Alok Menthe

There were three kinds of data available:

Pool definition data: It relates to the composition of a particular resource pool (the number of people, their competence, and other metadata).

Historical demand data: This kind of data helps in establishing a relationship between the demand features and a particular resource pool.

Transactional data: It is used for operational purposes.

Menthe said building a custom recommendation system in this context involves the following steps:

[Slide listing the steps; credit: Alok Menthe]

"After building our model, the most difficult part was feature engineering, which is imperative for building an efficient system. Among the two major modules, classification and clustering, we faced challenges with respect to the latter. We had only categorical information, making it difficult to find distances between the objects. We went out of the box to see if we could do any special encoding for the data, and adopted encoding techniques including frequency-based encoding," said Menthe.

Clustering module: For this module, the team initially implemented K-modes and agglomerative clustering. However, the results were far from perfect, prompting the team to fall back on the good old K-means algorithm. Evaluation was done manually, with the help of subject matter experts.

The final model had 700 resource pools condensed to 15 pool clusters.
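A rough sketch of that clustering pipeline is below: it frequency-encodes purely categorical pool attributes and condenses 700 pools into 15 clusters with K-means, mirroring the counts from the talk. The column names and values are invented stand-ins, not Ericsson's data.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

# Hypothetical pool-definition data: 700 pools, categorical attributes only.
rng = np.random.default_rng(0)
pools = pd.DataFrame({
    "competence": rng.choice(["java", "radio", "cloud", "core", "ran"], size=700),
    "region":     rng.choice(["emea", "apac", "amer"], size=700),
    "size_band":  rng.choice(["small", "medium", "large"], size=700),
})

# Frequency-based encoding: replace each category with its relative frequency,
# giving the categorical columns a numeric distance structure K-means can use.
encoded = pools.copy()
for col in encoded.columns:
    encoded[col] = encoded[col].map(encoded[col].value_counts(normalize=True))

# Condense the 700 pools into 15 pool clusters, as reported in the talk.
labels = KMeans(n_clusters=15, n_init=10, random_state=0).fit_predict(encoded)
print(pd.Series(labels).value_counts())
```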

Classification module: For this module, three algorithms were iterated over: Random Forest, Artificial Neural Network, and XGBoost. Classification accuracy was used as the evaluation metric. Finally, trained on 5,000,000 (50 lakh) records, this module demonstrated an accuracy of 71 percent.
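The classification side can be sketched the same way. The snippet below trains a Random Forest, one of the three algorithms named, on synthetic demand records labeled with one of the 15 pool clusters, and reports plain classification accuracy as in the talk; the features, labels, and sizes are fabricated for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for historical demand data: 12 numeric demand features,
# with one of the 15 pool clusters as the target label.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 12))
y = (X[:, :4].sum(axis=1) * 2).astype(int) % 15  # fabricated labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
```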

Menthe said this recommendation model is monitored on a fortnightly basis by validating the suggested pools against the pools actually allocated to project demands.

The model has proved to be successful on three fronts.

Menthe summarised the three major takeaways from this project in his concluding remarks: the need to preserve business nuances in ML solutions; thinking beyond standard ML approaches; and understanding that there is no silver bullet ML solution.

Here is the original post:
There Is No Silver Bullet Machine Learning Solution - Analytics India Magazine

The Collision of AI’s Machine Learning and Manipulation: Deepfake Litigation Risks to Companies from a Product Liability, Privacy, and Cyber…

AI and machine-learning advances have made it possible to produce fake videos and photos that seem real, commonly known as deepfakes. Deepfake content is exploding in popularity.[i] In Star Wars: The Rise of Skywalker, for instance, a visage of Carrie Fisher graced the screen, generated through artificial intelligence models trained on historic footage. Using thousands of hours of interviews with Salvador Dalí, the Dalí Museum in Florida created an interactive exhibit featuring the artist.[ii] For Game of Thrones fans miffed over plot holes in the season finale, Jon Snow can be seen profusely apologizing in a deepfake video that looks all too real.[iii]

Deepfake technology: how does it work? From a technical perspective, deepfakes (also referred to as synthetic media) are made from artificial intelligence and machine-learning models trained on data sets of real photos or videos. These trained algorithms then produce altered media that looks and sounds just like the real deal. Behind the scenes, generative adversarial networks (GANs) power deepfake creation.[iv] With GANs, two AI algorithms are pitted against one another: one creates the forgery while the other tries to detect it, teaching itself along the way. The more data is fed into GANs, the more believable the deepfake will be. Researchers at academic institutions such as MIT, Carnegie Mellon, and Stanford University, as well as large Fortune 500 corporations, are experimenting with deepfake technology.[v] Yet deepfakes are not solely the province of technical universities or AI product development groups. Anybody with an internet connection can download publicly available deepfake software and crank out content.[vi]
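To make the adversarial setup concrete, here is a minimal GAN sketch in PyTorch. It forges samples from a one-dimensional Gaussian rather than photos or video, and every layer size and constant in it is an illustrative assumption, not a description of any production deepfake system.

```python
import torch
import torch.nn as nn

# Two adversaries: G forges samples from random noise, D tries to spot forgeries.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()
ones, zeros = torch.ones(64, 1), torch.zeros(64, 1)

for step in range(2000):
    real = 4.0 + 1.25 * torch.randn(64, 1)   # the "real" data distribution
    fake = G(torch.randn(64, 8))             # forgeries generated from noise

    # Discriminator step: learn to label real samples 1 and forgeries 0.
    loss_d = bce(D(real), ones) + bce(D(fake.detach()), zeros)
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: adjust G so that D labels its forgeries as real.
    loss_g = bce(D(fake), ones)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# After training, forged samples should cluster near the real mean of 4.0.
print(G(torch.randn(1000, 8)).mean().item())
```

Image and video deepfakes replace these two tiny networks with deep convolutional ones trained on enormous datasets, but the adversarial loop is the same.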

Deepfake risks and abuse. Deepfakes are not always fun and games. Deepfake videos can phish employees to reveal credentials or confidential information, e-commerce platforms may face deepfake circumvention of authentication technologies for purposes of fraud, and intellectual property owners may find their properties featured in videos without authorization. For consumer-facing online platforms, certain actors may attempt to leverage deepfakes to spread misinformation. Another well-documented and unfortunate abuse of deepfake technology is for purposes of revenge pornography.[vii]

In response, online platforms and consumer-facing companies have begun enforcing limitations on the use of deepfake media. Twitter, for example, announced a new policy within the last year to prohibit users from sharing "synthetic or manipulated media that are likely to cause harm." Per its policy, Twitter reserves the right to apply a label or warning to Tweets containing such media.[viii] Reddit also updated its policies to ban content that impersonates individuals or entities in a misleading or deceptive manner (while still permitting satire and parody).[ix] Others have followed. Yet social media and online platforms are not the only industries concerned with deepfakes. Companies across industry sectors, including financial services and healthcare, face growing rates of identity theft and imposter scams in government services, online shopping, and credit bureaus as deepfake media proliferates.[x]

Deepfake legal claims and litigation risks. We are seeing legal claims and litigation relating to deepfakes across multiple vectors:

1. Claims brought by those who object to their appearance in deepfakes. Victims of deepfake media sometimes pursue tort law claims for false light, invasion of privacy, defamation, and intentional infliction of emotional distress. At a high level, these overlapping tort claims typically require the person harmed by the deepfake to prove that the deepfake creator published something that gives a false or misleading impression of the subject person in a manner that (a) damages the subject's reputation, (b) would be highly offensive to a reasonable person, or (c) causes mental anguish or suffering. As more companies begin to implement countermeasures, the lack of sufficient safeguards against misleading deepfakes may give rise to a negligence claim. Companies could face negligence claims for failure to detect deepfakes, either alongside the deepfake creator or alone if the creator is unknown or unreachable.

2. Product liability issues related to deepfakes on platforms. Section 230 of the Communications Decency Act shields online companies from claims arising from user content published on the company's platform or website. The law typically bars defamation and similar tort claims. But e-commerce companies can also use Section 230 to dismiss product liability and breach of warranty claims where the underlying allegations focus on a third-party seller's representation (such as a product description or express warranty). Businesses sued for product liability or other tort claims should look to assert Section 230 immunity as a defense where the alleged harm stems from a deepfake video posted by a user. Note, however, the immunity may be lost where the host platform performs editorial functions with respect to the published content at issue. As a result, it is important for businesses to implement clear policies addressing harmful deepfake videos that apply broadly to all users and avoid wading into influencing a specific user's content.

3. Claims from consumers who suffer account compromise due to deepfakes. Multiple claims may arise where cyber criminals leverage deepfakes to compromise consumer credentials for various financial, online service, or other accounts. The California Consumer Privacy Act (CCPA), for instance, provides consumers with a private right of action to bring claims against businesses that violate the duty to implement and maintain reasonable security procedures and practices.[xi] Plaintiffs may also bring claims for negligence, invasion of privacy claims under common law or certain state constitutions, and claims under state unfair competition or false advertising statutes (e.g., California's Unfair Competition Law and Consumers Legal Remedies Act).

4. Claims available to platforms enforcing Terms of Use prohibitions of certain kinds of deepfakes. Online content platforms may be able to enforce prohibitions on abusive or malicious deepfakes through claims involving breach of contract and potential violations of the Computer Fraud and Abuse Act (CFAA), among others. These claims may turn on nuanced issues around what conduct constitutes exceeding authorized access under the CFAA, or Terms of Use assent and enforceability of particular provisions.

5. Claims related to state statutes limiting deepfakes. As malicious deepfakes proliferate, several states such as California, Texas, and Virginia have enacted statutes prohibiting their use to interfere with elections or criminalizing pornographic deepfake revenge video distribution.[xii] More such statutes are pending.

Practical tips for companies managing deepfake risks. While every company and situation is unique, companies dealing with deepfakes on their platforms, or as a potential threat vector for information security attacks, can consider several practical avenues to manage risks.

While the future of deepfakes is uncertain, it is apparent that the underlying AI and machine-learning technology is very real and here to stay, presenting both risks and opportunities for organizations across industries.

Read more here:
The Collision of AI's Machine Learning and Manipulation: Deepfake Litigation Risks to Companies from a Product Liability, Privacy, and Cyber...

Postdoctoral Research Associate in Digital Humanities and Machine Learning job with DURHAM UNIVERSITY | 246392 – Times Higher Education (THE)

Department of Computer Science

Grade 7: £33,797 - £40,322 per annum. Fixed Term, Full Time. Contract Duration: 7 months. Contracted Hours per Week: 35. Closing Date: 13-Mar-2021, 7:59:00 AM

Durham University

Durham University is one of the world's top universities with strengths across the Arts and Humanities, Sciences and Social Sciences. We are home to some of the most talented scholars and researchers from around the world who are tackling global issues and making a difference to people's lives.

The University sits in a beautiful historic city where it shares ownership of a UNESCO World Heritage Site with Durham Cathedral, the greatest Romanesque building in Western Europe. A collegiate University, Durham recruits outstanding students from across the world and offers an unmatched wider student experience.

Less than 3 hours north of London, and an hour and a half south of Edinburgh, County Durham is a region steeped in history and natural beauty. The Durham Dales, including the North Pennines Area of Outstanding Natural Beauty, are home to breathtaking scenery and attractions. Durham offers an excellent choice of city, suburban and rural residential locations. The University provides a range of benefits including pension and childcare benefits, and the University's Relocation Manager can assist with potential schooling requirements.

Durham University seeks to promote and maintain an inclusive and supportive environment for work and study that assists all members of our University community to reach their full potential. Diversity brings strength and we welcome applications from across the international, national and regional communities that we work with and serve.

The Department

The Department of Computer Science is rapidly expanding. A new building for the department (joint with Mathematical Sciences) has recently opened to house the expanded Department. The current Department has research strengths in (1) algorithms and complexity, (2) computer vision, imaging, and visualisation and (3) high-performance computing, cloud computing, and simulation. We work closely with industry and government departments. Research-led teaching is a key strength of the Department, which came 5th in the Complete University Guide. The department offers BSc and MEng undergraduate degrees and is currently redeveloping its interdisciplinary taught postgraduate degrees. The size of its student cohort has more than trebled in the past five years. The Department has an exceptionally strong External Advisory Board that provides strategic support for developing research and education, consisting of high-profile industrialists and academics. Computer Science is one of the very best UK Computer Science Departments, with an outstanding reputation for excellence in teaching, research and employability of our students.

The Role

Postdoctoral Research Associate to work on the AHRC-funded project "Visitor Interaction and Machine Curation in the Virtual Liverpool Biennial".

The project looks at virtual art exhibitions that are curated by machines, or even co-curated by humans and machines; and how audiences interact with these exhibitions in the era of online art shows. The project is in close collaboration with the 2020 (now 2021) Liverpool Biennial (http://biennial.com/). The role of the post holder is, along with the PI Leonardo Impett, to implement different strategies of user-machine interaction for virtual art exhibits; and to investigate the interaction behaviour of different types of users with such systems.

Responsibilities:

This post is fixed term until 31 August 2021, as the research project is time limited and will end on that date.

The post-holder is employed to work on a research project which will be led by another colleague. Whilst this means that the post-holder will not be carrying out independent research in their own right, the expectation is that they will contribute to the advancement of the project through the development of their own research ideas and the adaptation and development of research protocols.

Successful applicants will, ideally, be in post by February 2021.

How to Apply

For informal enquiries please contact Dr Leonardo Impett (leonardo.l.impett@durham.ac.uk). All enquiries will be treated in the strictest confidence.

We prefer to receive applications online via the Durham University Vacancies Site: https://www.dur.ac.uk/jobs/. As part of the application process, you should provide details of 3 (preferably academic/research) referees and the details of your current line manager so that we may seek an employment reference.

Applications are particularly welcome from women and black and minority ethnic candidates, who are under-represented in academic posts in the University. We are committed to equality: if for any reason you have taken a career break or periods of leave that may have impacted on your career path, such as maternity, adoption or parental leave, you may wish to disclose this in your application. The selection committee will recognise that this may have reduced the quantity of your research accordingly.

What to Submit

All applicants are asked to submit:

The Requirements

Essential:

Qualifications

Experience

Skills

Desirable:

Experience

Skills

DBS Requirement: Not Applicable.

Read more:
Postdoctoral Research Associate in Digital Humanities and Machine Learning job with DURHAM UNIVERSITY | 246392 - Times Higher Education (THE)

Microsoft's Big Win in Quantum Computing Was an Error After All – WIRED

Whatever happened, the Majorana drama is a setback for Microsoft's ambitions to compete in quantum computing. Leading computing companies say the technology will define the future by enabling new breakthroughs in science and engineering.

Quantum computers are built from devices called qubits that encode 1s and 0s of data but can also use a quantum state called a superposition to perform math tricks not possible for the bits in a conventional computer. The main challenge to commercializing that idea is that quantum states are delicate and easily quashed by thermal or electromagnetic noise, making qubits error-prone.

Google, IBM, and Intel have all shown off prototype quantum processors with around 50 qubits, and companies including Goldman Sachs and Merck are testing the technology. But thousands or millions of qubits are likely required for useful work. Much of a quantum computer's power would probably have to be dedicated to correcting its own glitches.

Microsoft has taken a different approach, claiming qubits based on Majorana particles will be more scalable, allowing it to leap ahead. But after more than a decade of work, it does not have a single qubit.

From the fuller data, there's no doubt that there's no Majorana.

Sergey Frolov, University of Pittsburgh

Majorana fermions are named after Italian physicist Ettore Majorana, who hypothesized in 1937 that particles should exist with the odd property of being their own antiparticles. Not long after, he boarded a ship and was never seen again. Physicists wouldn't report a good glimpse of one of his eponymous particles until the next millennium, in Kouwenhoven's lab.

Microsoft got interested in Majoranas after company researchers in 2004 approached tech strategy chief Craig Mundie and said they had a way to solve one problem holding back quantum computers: qubits' flakiness.

The researchers seized on theoretical physics papers suggesting a way to build qubits that would make them more dependable. These so-called topological qubits would be built around unusual particles, of which Majorana particles are one example, that can pop into existence in clumps of electrons inside certain materials at very low temperatures.

Microsoft created a new team of physicists and mathematicians to flesh out the theory and practice of topological quantum computing, centered on an outpost in Santa Barbara, California, christened Station Q. They collaborated with and funded leading experimental physicists hunting for the particles needed to build this new form of qubit.

Kouwenhoven, in Delft, was one of the physicists who got Microsoft's backing. His 2012 paper reporting signatures of Majorana particles inside nanowires started chatter about a future Nobel prize for proving the elusive particles' existence. In 2016, Microsoft stepped up its investment, and the hype.

Kouwenhoven and another leading physicist, Charles Marcus, at the University of Copenhagen were hired as corporate Majorana hunters. The plan was to first detect the particles and then invent more complex devices that could control them and function as qubits. Todd Holmdahl, who previously led hardware for Microsoft's lucrative Xbox games console, took over as leader of the topological quantum computing project. Early in 2018, he told Barron's he would have a topological qubit by the end of the year. The now-disputed paper appeared a month later.

While Microsoft sought Majoranas, competitors working on established qubit technologies reported steady progress. In 2019, Google announced it had reached a milestone called quantum supremacy, showing that a chip with 53 qubits could perform a statistical calculation in minutes that would take a supercomputer millennia. Soon after, Microsoft appeared to hedge its quantum bet, announcing it would offer access to quantum hardware from other companies via its cloud service Azure. The Wall Street Journal reported that Holmdahl left the project that year after missing an internal deadline.

Microsoft has been quieter about its expected pace of progress on quantum hardware since Holmdahl's departure. Competitors in quantum computing continue to tout hardware advances and urge software developers to access prototypes over the internet, but none appear close to creating a quantum computer ready for prime time.

Read the original post:
Microsofts Big Win in Quantum Computing Was an Error After All - WIRED

IBM and ExxonMobil are building quantum algorithms to solve this giant computing problem – ZDNet

Research teams from energy giant ExxonMobil and IBM have been working together to find quantum solutions to one of the most complex problems of our time: managing the tens of thousands of merchant ships crossing the oceans to deliver the goods that we use every day.

The scientists lifted the lid on the progress that they have made so far and presented the different strategies that they have been using to model maritime routing on existing quantum devices, with the ultimate goal of optimizing the management of fleets.

ExxonMobil was the first energy company to join IBM's Quantum Network in 2019, and has expressed a keen interest in using the technology to explore various applications, ranging from the simulation of new materials to solving optimization problems.

Now, it appears that part of the energy company's work was dedicated to tapping quantum capabilities to calculate journeys that minimize the distance and time traveled by merchant ships across the globe.

On a worldwide scale, the equation is immense: intractable, in fact, for classical computers. About 90% of world trade relies on maritime shipping, with more than 50,000 ships, themselves carrying up to 200,000 containers each, moving around every day to transport goods with a total value of $14 trillion.

The more the number of ships and journeys increases, the bigger the problem becomes. As IBM and ExxonMobil's teams put it in a blog post detailing their research: "Logistically speaking, this isn't the 'traveling salesperson problem.'"

While this type of exponentially growing problem can only be solved with simplifications and approximations on classical computers, the challenge is well-suited to quantum technologies. Quantum computers can leverage superposition, a special state taken on by quantum bits, or qubits, to run many calculations at once, meaning that even the largest problems could be resolved in much less time than is possible on a classical computer.

"We wanted to see whether quantum computers could transform how we solve such complex optimization problems and provide more accurate solutions in less computational times," said the researchers.

Although the theory behind the potential of quantum computing is well-established, it remains to be seen how quantum devices can be used in practice to solve a real-world problem such as the global routing of merchant ships. In mathematical terms, this means finding the right quantum algorithms that could be used to most effectively model the industry's routing problems on current or near-term devices.

To do so, IBM and ExxonMobil's teams started with widely used mathematical representations of the problem, which account for factors such as the routes traveled, the potential movements between port locations and the order in which each location is visited on a particular route. There are many existing ways to formulate the equation, one of which is the quadratic unconstrained binary optimization (QUBO) technique, which is often used in classical computer science.

The next question was to find out whether well-known models like QUBO can be solved with quantum algorithms, and if so, which solvers work best. Using IBM's Qiskit optimization module, which was released last year to assist developers in building quantum optimization algorithms, the team tested various quantum algorithms labeled with unbeatably exotic names: the Variational Quantum Eigensolver (VQE), the Quantum Approximate Optimization Algorithm (QAOA), and Alternating Direction Method of Multipliers (ADMM) solvers.
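To make the QUBO form concrete, here is a minimal, self-contained sketch: a two-ship, two-route toy assignment encoded as a binary quadratic objective and solved by brute-force enumeration. The route costs and penalty weight are invented for illustration, not taken from the teams' model; the quantum solvers named above attack exactly this kind of objective without enumerating all 2^n assignments.

```python
import itertools
import numpy as np

# Toy QUBO: minimize x^T Q x over binary x. Variables x0/x1 mean "ship A
# sails route 1/2" and x2/x3 mean "ship B sails route 1/2". Invented route
# costs sit on the diagonal; the penalty P enforces exactly one route per
# ship via P*(x0+x1-1)^2 + P*(x2+x3-1)^2, expanded using x_i^2 = x_i.
cost = np.array([3.0, 5.0, 4.0, 2.0])
P = 10.0
Q = np.diag(cost - P)
Q[0, 1] = Q[2, 3] = 2 * P

best = min(itertools.product([0, 1], repeat=4),
           key=lambda x: np.asarray(x) @ Q @ np.asarray(x))
print(best)  # (1, 0, 0, 1): ship A takes route 1, ship B takes route 2
```

In Qiskit's optimization module, the same matrix would be wrapped in a quadratic program and handed to a VQE-, QAOA-, or ADMM-based solver in place of the brute-force minimum.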

After running the algorithms on a simulated quantum device, the researchers found that models like QUBO could effectively be solved by quantum algorithms, and that depending on the size of the problem, some solvers showed better results than others.

In another promising finding, the team said that the experiment showed some degree of inexactness in solving QUBOs is tolerable. "This is a promising feature to handle the inherent noise affecting the quantum algorithms on real devices," said the researchers.

Of course, while the results suggest that quantum algorithms could provide real-world value, the research was carried out on devices that are still technically limited, and the experiments can only remain small-scale. The idea, however, is to develop working algorithms now, to be ready to harness the power of a fully fledged quantum computer when the technology develops.

"As a result of our joint research, ExxonMobil now has a greater understanding of the modelling possibilities, quantum solvers available, and potential alternatives for routing problems in any industry," said the researchers.

What applies to merchant ships, in effect, can also work in other settings. Routing problems are not inherent to the shipping industry, and the scientists confirmed that their findings could easily be transferred to any vehicle optimization problem that has time constraints, such as goods delivery, ride-sharing services or urban waste management.

In fact, ExxonMobil is not the first company to look at ways to use quantum computing techniques to solve optimization problems. Electronics manufacturer OTI Lumionics, for example, has been using QUBO representations to find the most optimal simulation of next-generation OLED materials. Instead of using gate-based quantum computers to run the problem, however, the company has been developing quantum-inspired algorithms to solve calculations on classical Microsoft Azure hardware, with encouraging results.

The mathematical formulas and solution algorithms are described in detail in the research paper, and the ExxonMobil/IBM team stressed that their use is not restricted. The researchers encouraged their colleagues to reproduce their findings to advance the global field of quantum solvers.

Here is the original post:
IBM and ExxonMobil are building quantum algorithms to solve this giant computing problem - ZDNet

Kangaroo Court: Quantum Computing Thinking on the Future – JD Supra

The promise of quantum computers is that certain computational tasks might be executed exponentially faster on a quantum processor than on a classical processor.

Quantum computing is a beautiful fusion of quantum physics with computer science. It incorporates some of the most stunning ideas of physics from the twentieth century into an entirely new way of thinking about computation. Quantum computers have the potential to resolve problems of a high complexity and magnitude across many different industries and applications, including finance, transportation, chemicals, and cybersecurity, solving the impossible in a few hours of computing time.

Quantum computing is often in the news: China teleported a qubit from earth to a satellite; Shor's algorithm has put our current encryption methods at risk; quantum key distribution will make encryption safe again; Grover's algorithm will speed up data searches. But what does all this really mean? How does it all work?

Today's computers operate in a very straightforward fashion: they manipulate a limited set of data with an algorithm and give you an answer. Quantum computers are more complicated. After multiple units of data are input into qubits, the qubits are manipulated to interact with other qubits, allowing for several calculations to be done simultaneously. That's where quantum computers are a lot faster than today's machines.

Quantum computers have four fundamental capabilities that differentiate them from today's classical computers.

All computations involve inputting data, manipulating it according to certain rules, and then outputting the final answer. For classical computations, the bit is the basic unit of data. For quantum computation, this unit is the quantum bit, usually shortened to qubit.

The basic unit of quantum computing is a qubit. A classical bit is either 0 or 1. If it's 0 and we measure it, we get 0. If it's 1 and we measure it, we get 1. In both cases the bit remains unchanged. The standard example is an electrical switch that can be either on or off. The situation is totally different for qubits. Qubits are volatile. A qubit can be in one of an infinite number of states, a superposition of both 0 and 1, but when we measure it, as in the classical case, we just get one of two values, either 0 or 1. And the act of measurement changes the qubit. Qubits can also become entangled: when we make a measurement of one of them, it affects the state of the other. What's more, qubits interact with other qubits, and these interactions are what make it possible to conduct multiple calculations at once.

Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as "spooky action at a distance." But it's key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.

These three things, superposition, measurement, and entanglement, are the key quantum mechanical ideas. Controlling these interactions, however, is very complicated. The volatility of qubits can cause inputs to be lost or altered, which can throw off the accuracy of results. And creating a computer of meaningful scale would require hundreds of thousands to millions of qubits to be connected coherently. The few quantum computers that exist today can handle nowhere near that number. But the good news is we're getting very, very close.
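All three ideas can be seen concretely in a few lines of state-vector simulation. The sketch below, in plain NumPy, puts one qubit into superposition with a Hadamard gate, entangles it with a second qubit via a CNOT gate to form a Bell state, and reads off the measurement probabilities; it is a toy illustration, not a model of any particular hardware.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: |0> -> (|0> + |1>)/sqrt(2)
CNOT = np.array([[1, 0, 0, 0],                 # flips the second qubit
                 [0, 1, 0, 0],                 # when the first qubit is 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

zero = np.array([1.0, 0.0])                    # the state |0>
plus = H @ zero                                # superposition of 0 and 1
bell = CNOT @ np.kron(plus, zero)              # entangled two-qubit Bell state

print(bell)               # amplitudes ~[0.707, 0, 0, 0.707] over 00, 01, 10, 11
print(np.abs(bell) ** 2)  # measurement: 00 or 11, with probability 1/2 each

# Why classical simulation breaks down: n qubits need 2**n amplitudes.
for n in (2, 10, 50, 72):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```

Measuring the first qubit of the Bell state as 0 forces the second to 0, and likewise for 1: that correlation is the entanglement described above.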

Quantum computing and classical computing are not two distinct disciplines. Quantum computing is the more fundamental form of computing: anything that can be computed classically can be computed on a quantum computer. The qubit is the basic unit of computation, not the bit. Computation, in its essence, really means quantum computing. A qubit can be represented by the spin of an electron or the polarization of a photon.

In 2019 Google achieved a level of quantum supremacy when they reported the use of a processor with programmable superconducting qubits to create quantum states on 54 qubits, corresponding to a computational state-space of dimension 2^53 (about 10^16). This incredible achievement was slightly short of their mission goal of creating quantum states of 72 qubits. What is so special about this number? Classical computers can simulate quantum computers if the quantum computer doesn't have too many qubits, but as the number of qubits increases we reach the point where that is no longer possible.

There are 8 possible three-bit combinations: 000, 001, 010, 011, 100, 101, 110, 111. The number 8 comes from 2^3: there are two choices for the first bit, two for the second and two for the third, and we multiply these three 2s together. If instead of bits we switch to qubits, each of these 8 three-bit strings is associated with a basis vector, so the vector space is 8-dimensional. If we have 72 qubits, the number of basis elements is 2^72. This is about 4,700,000,000,000,000,000,000. It is a large number and is considered to be the point at which classical computers cannot simulate quantum computers. Once quantum computers have more than 72 or so qubits we truly enter the age of quantum supremacy, when quantum computers can do computations that are beyond the ability of any classical computer.

To provide a little more perspective, let's consider a machine with 300 qubits. This doesn't seem an unreasonable number for the not-too-distant future. But 2^300 is an enormous number. It's more than the number of elementary particles in the known universe. A computation using 300 qubits would be working with 2^300 basis elements.

Some calculations required for the effective simulation of real-life scenarios are simply beyond the capability of classical computers: what are known as intractable problems. Quantum computers, with their huge computational power, are ideally suited to solving these problems. Indeed, some problems, like factoring, are hard on a classical computer but easy on a quantum computer. This creates a world of opportunities, across almost every aspect of modern life.

Healthcare: classical computers are limited in terms of size and complexity of molecules they can simulate and compare (an essential process of early drug development). Quantum computers will allow much larger molecules to be simulated. At the same time, researchers will be able to model and simulate interactions between drugs and all 20,000+ proteins encoded in the human genome, leading to greater advancements in pharmacology.

Finance: one potential application is algorithmic trading using complex algorithms to automatically trigger share dealings based on a wide variety of market variables. The advantages, especially for high-volume transactions, are significant. Another application is fraud detection. Like diagnostics in healthcare, fraud detection is reliant upon pattern recognition. Quantum computers could deliver a significant improvement in machine learning capabilities; dramatically reducing the time taken to train a neural network and improving the detection rate.

Logistics: Improved data analysis and modelling will enable a wide range of industries to optimize workflows associated with transport, logistics and supply-chain management. The calculation and recalculation of optimal routes could impact on applications as diverse as traffic management, fleet operations, air traffic control, freight and distribution.

It is, of course, impossible to predict the long-term impact of quantum computing with any accuracy. Quantum computing is now in its infancy, and the comparison to the first computers seems apt. The machines that have been constructed so far tend to be large and not very powerful, and they often involve superconductors that need to be cooled to extremely low temperatures. To minimize the interaction of quantum computers with the environment, they are always protected from light and heat. They are shielded against electromagnetic radiation, and they are cooled. One thing that can happen in cold places is that certain materials become superconductors, losing all electrical resistance, and superconductors have quantum properties that can be exploited.

Many countries are experimenting with small quantum networks using optic fiber. There is the potential of connecting these via satellite and being able to form a worldwide quantum network. This work is of great interest to financial institutions. One early impressive result involves a Chinese satellite that is devoted to quantum experiments. It's named Micius, after a Chinese philosopher who did work in optics. A team in China connected to a team in Austria, the first time that intercontinental quantum key distribution (QKD) had been achieved. Once the connection was secured, the teams sent pictures to one another. The Chinese team sent the Austrians a picture of Micius, and the Austrians sent a picture of Schrödinger to the Chinese.

To actually make practical quantum computers you need to solve a number of problems, the most serious being decoherence: the problem of your qubit interacting with something from the environment that is not part of the computation. You need to set a qubit to an initial state and keep it in that state until you need to use it. The quantum state is extremely fragile. The slightest vibration or change in temperature, disturbances known as "noise" in quantum-speak, can cause qubits to tumble out of superposition before their job has been properly done. That's why researchers are doing their best to protect qubits from the outside world in supercooled fridges and vacuum chambers.

Alan Turing is one of the fathers of the theory of computation. In his landmark paper of 1936 he carefully thought about computation. He considered what humans did as they performed computations and broke it down to its most elemental level. He showed that a simple theoretical machine, which we now call a Turing machine, could carry out any algorithm. But remember, Turing was analyzing computation based on what humans do. With quantum computation the focus changes from how humans compute to how the universe computes. Therefore, we should think of quantum computation as not a new type of computation but as the discovery of the true nature of computation.

More:
Kangaroo Court: Quantum Computing Thinking on the Future - JD Supra

New EU Consortium shaping the future of Quantum Computing USA – PR Newswire India

Europe has always been excellent in academic research, but over the past few decades commercializing research projects has been slow compared to international competition. This is starting to change with quantum technologies. As one of the largest efforts in Europe and worldwide, Germany announced €2 billion of funding for quantum programs in June 2020, of which €120 million is invested in this current round of research grants.

Today, IQM announced that a quantum project consortium including Europe's leading startups (ParityQC, IQM), industry leaders (Infineon Technologies), research centers (Forschungszentrum Jülich), supercomputing centers (Leibniz Supercomputing Centre), and academia (Freie Universität Berlin) has been awarded €12.4 million by the German Federal Ministry of Education and Research (BMBF) (announcement in German).

The scope of the project is to accelerate commercialization through an innovative co-design concept. This project focuses on application-specific quantum processors, which have the potential to create a fast lane to quantum advantage. The digital-analog concept used to operate the processors will further lay the foundation for commercially viable quantum computers. This project will run for four years and aims to develop a 54-qubit quantum processor.

The project is intended to support the European FET Flagship project EU OpenSuperQ, announced in 2018, which is aimed at designing, building, and operating a quantum information processing system of up to 100 qubits. Deploying digital-analog quantum computing, this consortium adds a new angle to the OpenSuperQ project and widens its scope. With efforts from Munich, Berlin and Jülich, as well as ParityQC from Austria, the project builds bridges and seamlessly integrates into the European quantum landscape.

"The grant from the Federal Ministry of Education and Research of Germanyis a huge recognition of our unique co-design approach for quantum computers. Last year when we established our office in Munich, this was one of our key objectives. The concept allows us to become a system integrator for full-stack quantum computers by bringing together all the relevant players. As Europe's leading startup in quantum technologies, this gives us confidence to further invest in Germany and other European countries" said Dr. Jan Goetz, CEO of IQM Quantum Computers.

As a European technology leader, Germany is taking several steps to lead the quantum technology race. An important part of such leadership is bringing together European startups, industry, research and academic partners. This project will give the quantum landscape in Germany an accelerated push and will create a vibrant quantum ecosystem in the region for the future.

Additional Quotes:

"DAQC is an important project for Germany and Europe. It enables us to take a leading role in the area of quantum technologies. It also allows us to bring quantum computing into one of the prime academic supercomputing centres to more effectively work on the important integration of high-performance computing and quantum computing. We are looking forward to a successful collaboration," said Prof. DrMartinSchulz, Member of the Board of Directors, Leibniz Supercomputing Centre (LRZ).

"The path towards scalable and fully programmable quantum computing will be the parallelizability of gates and building with reduced complexity in order to ensure manageable qubit control. Our ParityQC architecture is the blueprint for a fully parallelizable quantum computer, which comes with the associated ParityOS operating system. With the team of extraordinary members of the DAQC consortium this will allow us to tackle the most pressing and complex industry-relevant optimization problems." saidMagdalena Hauser & Wolfgang Lechner, CEOs & Co-founder ParityQC

"We are looking forward to exploring and realizing a tight connection between hardware and applications, and having DAQC quantum computers as a compatible alternative within the OpenSuperQ laboratory. Collaborations like this across different states, and including both public and private partners, have the right momentum to move quantum computing in Germany forward." saidProf. Frank Wilhelm-Mauch, Director, Institute for Quantum Computing Analytics, Forschungszentrum Jlich

"At Infineon, we are looking forward to collaborating with top-class scientists and leading start-ups in the field of quantum computing in Europe. We must act now if we in Germany and Europe do not want to become solely dependent on American or Asian know-how in this future technology area. We are very glad to be part of this highly innovative project and happy to contribute with our expertise in scaling and manufacturing processes." saidDr.Sebastian Luber, Senior Director Technology & Innovation, Infineon Technologies AG

"This is a hugely exciting project. It is a chance of Europe and Germany to catch up in the development of superconducting quantum computers. I am looking forward to adventures on understanding how such machines can be certified in their precise functioning." said Prof.Jens Eisert, Professor of Quantum Physics, Freie Universitt Berlin

About IQM Quantum Computers:

IQM is the European leader in superconducting quantum computers, headquartered in Espoo, Finland. Since its inception in 2018, IQM has grown to 80+ employees and has also established a subsidiary in Munich, Germany, to lead the co-design approach. IQM delivers on-premises quantum computers for research laboratories and supercomputing centers and provides complete access to its hardware. For industrial customers, IQM delivers quantum advantage through a unique application-specific co-design approach. IQM has raised €71 million from VC firms and public grants and is building Finland's first quantum computer.

For more information, visit http://www.meetiqm.com.

Registered offices:

IQM Finland Oy, Keilaranta 19, 02150 Espoo, FINLAND, www.meetiqm.com

IQM Germany GmbH, Nymphenburgerstr. 86, 80636 München, Germany

IQM: Facts and Figures

Founders:

Photo: https://mma.prnewswire.com/media/1437806/IQM_Quantum_Computers_Founders.jpg Photo: https://mma.prnewswire.com/media/1437807/IQM_Quantum_computer_design.jpg Logo: https://mma.prnewswire.com/media/1121497/IQM_Logo.jpg

Media Contact: Raghunath Koduvayur, Head of Marketing and Communications, [emailprotected], +358504876509

http://meetiqm.com/contact/

SOURCE IQM Finland Oy

Read more from the original source:
New EU Consortium shaping the future of Quantum Computing USA - PR Newswire India

Quantum venture funding dipped 12% in 2020, but quantum investments rose 46% – VentureBeat

Sorting through the hype surrounding quantum computing these days isn't easy for enterprises trying to figure out the right time to jump in. Skeptics say any real impact is still years away, and yet quantum startups continue to seduce venture capitalists in search of the next big thing.

A new report from CB Insights may not resolve this debate, but it does add some interesting nuance. While the number of venture capital deals for quantum computing startups rose 46% to 37 in 2020 compared to 2019, the total amount raised in this sector fell 12% to $365 million.

Looking at just the number of deals, the annual tally has ticked up steadily from just 6 deals in 2015. As for the funding total, while it was down from $417 million in 2019, it remains well above the $73 million raised in 2015.

There are a couple of conclusions to draw from this.

First, the number of startups being drawn into this space is clearly rising. As research has advanced, more entrepreneurs with the right technical chops feel the time is right to start building their startups.

Second, the average deal size for 2020 was just under $10 million. And given that the $46 million IQM raised is part of that total, the average for most other deals is squeezed down even further. That certainly demonstrates optimism, but it's far from the kind of financial gusher or valuations that would indicate any kind of quantum bubble.
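The arithmetic behind those averages can be checked from the figures quoted in this article alone (assuming IQM's round is counted among the 37 deals):

```python
total_m, deals, iqm_m = 365, 37, 46               # 2020 total ($M), deal count, IQM round ($M)
print(round(total_m / deals, 2))                  # ~9.86: just under $10M per deal
print(round((total_m - iqm_m) / (deals - 1), 2))  # ~8.86: the average with IQM set aside
```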

Finally, it's important to remember that startups are likely a tiny slice of what's happening in quantum these days. A leading indicator? Perhaps. But a large part of the agenda is still being driven by tech giants who have massive resources to invest in a technology that may have a long horizon and could be years away from generating sufficient revenues. That includes Intel, IBM, Google, Microsoft, and Amazon.

Indeed, Amazon just rolled out a new blog dedicated to quantum computing. Last year, Amazon Web Services launched Amazon Braket, a product that lets enterprises start experimenting with quantum computing. Even so, AWS quantum computing director Simone Severini wrote in the inaugural blog post that business customers are still scratching their heads over the whole phenomenon.

"We heard a recurring question, 'When will quantum computing reach its true potential?' My answer was, 'I don't know,'" she wrote. "No one does. It's a difficult question because there are still fundamental scientific and engineering problems to be solved. The uncertainty makes this area so fascinating, but it also makes it difficult to plan. For some customers, that's a real issue. They want to know if and when they should focus on quantum computing, but struggle to get the facts, to discern the signal from all the noise."

Continued here:
Quantum venture funding dipped 12% in 2020, but quantum investments rose 46% - VentureBeat