
Category Archives: Quantum Computing

Quantum Computing in Agriculture Market to Witness Stellar CAGR During the Forecast Period 2021-2026 – Northwest Diamond Notes

Posted: September 29, 2021 at 7:13 am

Executive Summary:

The data presented in the research report on the Quantum Computing in Agriculture market is compiled with the aim of providing readers with a thorough understanding of industry dynamics in this business sphere. It entails a detailed analysis of growth stimulants, opportunities, challenges, and all other critical factors that will dictate the growth trajectory of this domain.

Proceeding further, the research document analyses past records and recent data to predict market valuation and business expansion trends between 2021 and 2026. It also elaborates on the segmentations of this vertical and their respective contributions to overall growth. Besides, the competitive landscape of this industry is included to help stakeholders make the right decisions for the future.

Request Sample Copy of this Report @ https://www.nwdiamondnotes.com/request-sample/22629

Market Rundown:

Regional landscape:

Product spectrum outline:

Application scope overview

Competitive landscape review:

Influence of the Quantum Computing in Agriculture Market report:

The huge assortment of tables, graphs, diagrams, and charts presented in this market research report provides a strong basis for an in-depth analysis of the ongoing trends in the Quantum Computing in Agriculture Market. The report also looks at the latest developments and advancements among the key players in the market, such as mergers, partnerships, and achievements.

Quantum Computing in Agriculture Market Research Reports Includes PESTLE Analysis:

Quantum Computing in Agriculture Market Drivers Affecting:

In short, the Global Quantum Computing in Agriculture Market report offers a one-stop solution to all the key players covering various aspects of the industry like growth statistics, development history, industry share, Quantum Computing in Agriculture Market presence, potential buyers, consumption forecast, data sources, and beneficial conclusion.

MAJOR TOC OF THE REPORT:

Request Customization on This Report @ https://www.nwdiamondnotes.com/request-for-customization/22629


The coevolution of particle physics and computing – Symmetry magazine

Posted: at 7:13 am

In the mid-twentieth century, particle physicists were peering deeper into the history and makeup of the universe than ever before. Over time, their calculations became too complex to fit on a blackboard, or to farm out to armies of human computers doing calculations by hand.

To deal with this, they developed some of the world's earliest electronic computers.

Physics has played an important role in the history of computing. The transistor, the switch that controls the flow of electrical signals within a computer, was invented by a group of physicists at Bell Labs. The incredible computational demands of particle physics and astrophysics experiments have consistently pushed the boundaries of what is possible. They have encouraged the development of new technologies to handle tasks from dealing with avalanches of data to simulating interactions on the scales of both the cosmos and the quantum realm.

But this influence doesn't just go one way. Computing plays an essential role in particle physics and astrophysics as well. As computing has grown increasingly sophisticated, its own progress has enabled new scientific discoveries and breakthroughs.

Illustration by Sandbox Studio, Chicago with Ariel Davis

In 1973, scientists at Fermi National Accelerator Laboratory in Illinois got their first big mainframe computer: a 7-year-old hand-me-down from Lawrence Berkeley National Laboratory. Called the CDC 6600, it weighed about 6 tons. Over the next five years, Fermilab added five more large mainframe computers to its collection.

Then came the completion of the Tevatron, at the time the world's highest-energy particle accelerator, which would provide the particle beams for numerous experiments at the lab. By the mid-1990s, two four-story particle detectors would begin selecting, storing and analyzing data from millions of particle collisions at the Tevatron per second. Called the Collider Detector at Fermilab and the DZero detector, these new experiments threatened to overpower the lab's computational abilities.

In December of 1983, a committee of physicists and computer scientists released a 103-page report highlighting the urgent need for an upgrade of the laboratory's computer facilities. The report said the lab should continue the process of catching up in terms of computing ability, and that this should remain the laboratory's top computing priority for the next few years.

Instead of simply buying more large computers (which were incredibly expensive), the committee suggested a new approach: They recommended increasing computational power by distributing the burden over clusters or farms of hundreds of smaller computers.

Thanks to Intel's 1971 development of a new commercially available microprocessor the size of a domino, computers were shrinking. Fermilab was one of the first national labs to try the concept of clustering these smaller computers together, treating each particle collision as a computationally independent event that could be analyzed on its own processor.

Like many new ideas in science, it wasn't accepted without some pushback.

Joel Butler, a physicist at Fermilab who was on the computing committee, recalls, "There was a big fight about whether this was a good idea or a bad idea."

"A lot of people were enchanted with the big computers," he says. "They were impressive-looking and reliable, and people knew how to use them. And then along came this swarm of little tiny devices, packaged in breadbox-sized enclosures."

The computers were unfamiliar, and the companies building them weren't well-established. On top of that, it wasn't clear how well the clustering strategy would work.

As for Butler? "I raised my hand [at a meeting] and said, 'Good idea,' and suddenly my entire career shifted from building detectors and beamlines to doing computing," he chuckles.

Not long afterward, innovation that sparked for the benefit of particle physics enabled another leap in computing. In 1989, Tim Berners-Lee, a computer scientist at CERN, launched the World Wide Web to help CERN physicists share data with research collaborators all over the world.

To be clear, Berners-Lee didn't create the internet; that was already underway in the form of the ARPANET, developed by the US Department of Defense. But the ARPANET connected only a few hundred computers, and it was difficult to share information across machines with different operating systems.

The web Berners-Lee created was an application that ran on the internet, like email, and started as a collection of documents connected by hyperlinks. To get around the problem of accessing files between different types of computers, he developed HTML (HyperText Markup Language), a markup language that formatted and displayed files in a web browser independent of the local computer's operating system.

Berners-Lee also developed the first web browser, allowing users to access files stored on the first web server (Berners-Lee's computer at CERN). He implemented the concept of a URL (Uniform Resource Locator), specifying how and where to access desired web pages.

What started out as an internal project to help particle physicists share data within their institution fundamentally changed not just computing, but how most people experience the digital world today.

Back at Fermilab, cluster computing wound up working well for handling the Tevatron data. Eventually, it became an industry standard for tech giants like Google and Amazon.

Over the next decade, other US national laboratories adopted the idea, too. SLAC National Accelerator Laboratory, then called Stanford Linear Accelerator Center, transitioned from big mainframes to clusters of smaller computers to prepare for its own extremely data-hungry experiment, BaBar. Both SLAC and Fermilab were also early adopters of Berners-Lee's web server. The labs set up the first two websites in the United States, paving the way for this innovation to spread across the continent.

In 1989, in recognition of the growing importance of computing in physics, Fermilab Director John Peoples elevated the computing department to a full-fledged division. The head of a division reports directly to the lab director, making it easier to get resources and set priorities. Physicist Tom Nash formed the new Computing Division, along with Butler and two other scientists, Irwin Gaines and Victoria White. Butler led the division from 1994 to 1998.

"These computational systems worked well for particle physicists for a long time," says Berkeley Lab astrophysicist Peter Nugent. That is, until Moore's Law started grinding to a halt.

Moore's Law is the idea that the number of transistors in a circuit will double, making computers faster and cheaper, every two years. The term was coined in the mid-1970s, and the trend reliably proceeded for decades. But now, computer manufacturers are starting to hit the physical limit of how many tiny transistors they can cram onto a single microchip.
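The doubling rule is simple compounding arithmetic. A sketch, taking Intel's 4004 microprocessor of 1971 (roughly 2,300 transistors) as the starting point:

```python
# Moore's Law as compounding arithmetic: transistor counts doubling
# every two years, starting from Intel's 4004 microprocessor (1971),
# which held roughly 2,300 transistors.
def transistors(year, base_year=1971, base_count=2300):
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{transistors(year):,.0f}")
```

Forty years of doubling takes 2,300 transistors past the two-billion mark, which is roughly where commercial chips actually were by the early 2010s, and it is this exponential that is now running out of atoms to shrink into.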

Because of this, says Nugent, particle physicists have been looking to take advantage of high-performance computing instead.

Nugent says high-performance computing is "something more than a cluster, or a cloud-computing environment that you could get from Google or AWS, or at your local university."

"What it typically means," he says, "is that you have high-speed networking between computational nodes, allowing them to share information with each other very, very quickly." When you are computing on up to hundreds of thousands of nodes simultaneously, it massively speeds up the process.

On a single traditional computer, he says, 100 million CPU hours translates to more than 11,000 years of continuous calculations. But for scientists using a high-performance computing facility at Berkeley Lab, Argonne National Laboratory or Oak Ridge National Laboratory, 100 million hours is a typical, large allocation for one year at these facilities.
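Those figures are easy to verify with back-of-the-envelope arithmetic (the 100,000-node figure below is a round number chosen purely for illustration):

```python
# Back-of-the-envelope check of the figures quoted above.
cpu_hours = 100_000_000          # a 100-million-CPU-hour allocation
hours_per_year = 24 * 365

serial_years = cpu_hours / hours_per_year
print(f"{serial_years:,.0f} years on a single core")

nodes = 100_000                  # "hundreds of thousands of nodes"
parallel_hours = cpu_hours / nodes
print(f"{parallel_hours:,.0f} wall-clock hours across {nodes:,} nodes")
```

One core would indeed need more than 11,000 years, while the same allocation spread over a hundred thousand nodes finishes in weeks, which is the whole point of high-speed interconnects between nodes.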

Although astrophysicists have always relied on high-performance computing for simulating the birth of stars or modeling the evolution of the cosmos, Nugent says they are now using it for their data analysis as well.

This includes rapid image-processing computations that have enabled the observations of several supernovae, including SN 2011fe, captured just after it began. "We found it just a few hours after it exploded, all because we were able to run these pipelines so efficiently and quickly," Nugent says.

According to Berkeley Lab physicist Paolo Calafiura, particle physicists also use high-performance computing for simulations: for modeling not the evolution of the cosmos, but rather what happens inside a particle detector. "Detector simulation is significantly the most computing-intensive problem that we have," he says.

Scientists need to evaluate multiple possibilities for what can happen when particles collide. To properly correct for detector effects when analyzing particle detector experiments, they need to simulate more data than they collect. "If you collect 1 billion collision events a year," Calafiura says, "you want to simulate 10 billion collision events."

Calafiura says that right now, he's more worried about finding a way to store all of the simulated and actual detector data than he is about producing it, but he knows that won't last.

"When does physics push computing?" he says. "When computing is not good enough... We see that in five years, computers will not be powerful enough for our problems, so we are pushing hard with some radically new ideas, and lots of detailed optimization work."

That's why the Department of Energy's Exascale Computing Project aims to build, in the next few years, computers capable of performing a quintillion (that is, a billion billion) operations per second. The new computers will be 1,000 times faster than the current fastest computers.

The exascale computers will also be used for other applications ranging from precision medicine to climate modeling to national security.

Innovations in computer hardware have enabled astrophysicists to push the kinds of simulations and analyses they can do. For example, Nugent says, the introduction of graphics processing units has sped up astrophysicists' ability to do calculations used in machine learning, leading to an explosive growth of machine learning in astrophysics.

With machine learning, which uses algorithms and statistics to identify patterns in data, astrophysicists can simulate entire universes in microseconds.

Machine learning has been important in particle physics as well, says Fermilab scientist Nhan Tran. "[Physicists] have very high-dimensional data, very complex data," he says. "Machine learning is an optimal way to find interesting structures in that data."

The same way a computer can be trained to tell the difference between cats and dogs in pictures, it can learn how to identify particles from physics datasets, distinguishing between things like pions and photons.
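To make that analogy concrete, here is a deliberately tiny classifier in plain Python: it separates two synthetic particle classes by distance to each class's average feature vector (a nearest-centroid rule). The two features and their distributions are invented for this example; real experiments use far richer detector inputs and far more sophisticated models such as deep neural networks.

```python
import random

random.seed(0)

# Two made-up detector features per "particle" (say, shower width and
# shower depth); the features and distributions are invented for this toy.
def make_events(label, center, n=200):
    return [([random.gauss(c, 0.5) for c in center], label) for _ in range(n)]

train = make_events("photon", (1.0, 3.0)) + make_events("pion", (3.0, 1.0))

# "Training" a nearest-centroid rule: average the features of each class.
def centroid(events, label):
    feats = [f for f, lab in events if lab == label]
    return [sum(col) / len(col) for col in zip(*feats)]

centroids = {lab: centroid(train, lab) for lab in ("photon", "pion")}

# Classify a new event by whichever class centroid is closer.
def classify(features):
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda lab: dist2(centroids[lab]))

print(classify([1.1, 2.9]))  # lands near the photon centroid
```

Swapping the hand-written centroid rule for a trained neural network changes the machinery but not the task: learn a boundary between particle species from labeled examples, then apply it to new data.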

Tran says using computation this way can accelerate discovery. "As physicists, we've been able to learn a lot about particle physics and nature using non-machine-learning algorithms," he says. "But machine learning can drastically accelerate and augment that process, and potentially provide deeper insight into the data."

And while teams of researchers are busy building exascale computers, others are hard at work trying to build another type of supercomputer: the quantum computer.

Remember Moore's Law? Previously, engineers were able to make computer chips faster by shrinking the size of electrical circuits, reducing the amount of time it takes for electrical signals to travel. "Now our technology is so good that literally the distance between transistors is the size of an atom," Tran says. "So we can't keep scaling down the technology and expect the same gains we've seen in the past."

To get around this, some researchers are redefining how computation works at a fundamental level: like, really fundamental.

The basic unit of data in a classical computer is called a bit, which can hold one of two values: 1, if it has an electrical signal, or 0, if it has none. But in quantum computing, data is stored in quantum systemsthings like electrons, which have either up or down spins, or photons, which are polarized either vertically or horizontally. These data units are called qubits.

Here's where it gets weird. Through a quantum property called superposition, qubits have more than just two possible states. An electron can be up, down, or in a variety of stages in between.

What does this mean for computing? A collection of three classical bits can exist in only one of eight possible configurations: 000, 001, 010, 100, 011, 110, 101 or 111. But through superposition, three qubits can be in all eight of these configurations at once. A quantum computer can use that information to tackle problems that are impossible to solve with a classical computer.
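The counting above can be made concrete. A sketch in plain Python: give each of three qubits the equal-superposition amplitudes produced by a Hadamard gate, take the tensor product, and count the resulting configurations (a toy state-vector calculation, not a real quantum-computing library):

```python
import itertools

# One qubit after a Hadamard gate: amplitude 1/sqrt(2) for |0> and |1>.
h = [2 ** -0.5, 2 ** -0.5]

# Tensor product of three such qubits: one amplitude per 3-bit string.
amplitudes = {}
for bits in itertools.product((0, 1), repeat=3):
    label = "".join(str(b) for b in bits)
    amplitudes[label] = h[bits[0]] * h[bits[1]] * h[bits[2]]

print(sorted(amplitudes))   # '000' through '111': all 8 configurations
total = sum(a * a for a in amplitudes.values())
print(round(total, 10))     # probabilities sum to one
```

All eight configurations carry nonzero amplitude at once, which is the superposition the text describes; a classical 3-bit register, by contrast, holds exactly one of those eight strings at any moment.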

Fermilab scientist Aaron Chou likens quantum problem-solving to throwing a pebble into a pond: "The ripples move through the water in every possible direction, simultaneously exploring all of the possible things that it might encounter."

In contrast, a classical computer can only move in one direction at a time.

But this makes quantum computers faster than classical computers only when it comes to solving certain types of problems. "It's not like you can take any classical algorithm and put it on a quantum computer and make it better," says University of California, Santa Barbara physicist John Martinis, who helped build Google's quantum computer.

Although quantum computers work in a fundamentally different way than classical computers, designing and building them wouldn't be possible without traditional computing laying the foundation, Martinis says. "We're really piggybacking on a lot of the technology of the last 50 years or more."

The kinds of problems that are well suited to quantum computing are intrinsically quantum mechanical in nature, says Chou.

For instance, Martinis says, consider quantum chemistry. Solving quantum chemistry problems with classical computers is so difficult, he says, that 10 to 15% of the world's supercomputer usage is currently dedicated to the task. Quantum chemistry problems are hard for the very reason why a quantum computer is powerful: because to complete them, you have to consider all the different quantum-mechanical states of all the individual atoms involved.

Because making better quantum computers would be so useful in physics research, and because building them requires skills and knowledge that physicists possess, physicists are ramping up their quantum efforts. In the United States, the National Quantum Initiative Act of 2018 called for the National Institute of Standards and Technology, the National Science Foundation and the Department of Energy to support programs, centers and consortia devoted to quantum information science.

In the early days of computational physics, the line between who was a particle physicist and who was a computer scientist could be fuzzy. Physicists used commercially available microprocessors to build custom computers for experiments. They also wrote much of their own software, ranging from printer drivers to the software that coordinated the analysis between the clustered computers.

Nowadays, roles have somewhat shifted. Most physicists use commercially available devices and software, allowing them to focus more on the physics, Butler says. But some people, like Anshu Dubey, work right at the intersection of the two fields. Dubey is a computational scientist at Argonne National Laboratory who works with computational physicists.

When a physicist needs to computationally interpret or model a phenomenon, sometimes they will sign up a student or postdoc in their research group for a programming course or two and then ask them to write the code to do the job. Although these codes are mathematically complex, Dubey says, they arent logically complex, making them relatively easy to write.

A simulation of a single physical phenomenon can be neatly packaged within fairly straightforward code. "But the real world doesn't want to cooperate with you in terms of its modularity and encapsularity," she says.

Multiple forces are always at play, so to accurately model real-world complexity, you have to use more complex software, ideally software that doesn't become impossible to maintain as it gets updated over time. "All of a sudden," says Dubey, "you start to require people who are creative in their own right, in terms of being able to architect software."

That's where people like Dubey come in. At Argonne, Dubey develops software that researchers use to model complex multi-physics systems, incorporating processes like fluid dynamics, radiation transfer and nuclear burning.

Hiring computer scientists for research projects in physics and other fields of science can be a challenge, Dubey says. Most funding agencies specify that research money can be used for hiring students and postdocs, but not paying for software development or hiring dedicated engineers. "There is no viable career path in academia for people whose careers are like mine," she says.

In an ideal world, universities would establish endowed positions for a team of research software engineers in physics departments with a nontrivial amount of computational research, Dubey says. These engineers would write reliable, well-architected code, and their institutional knowledge would stay with a team.

Physics and computing have been closely intertwined for decades. However the two develop, toward new analyses using artificial intelligence, for example, or toward the creation of better and better quantum computers, it seems they will remain on this path together.


Quantum Predictions: Weather Forecasting with Quantum Computers – Analytics Insight

Posted: at 7:13 am

Quantum computing has the potential to change the world and disrupt every industry by providing the opportunity to solve critical problems that modern-day supercomputers just can't. That potential includes the mapping of extremely complex weather patterns. This article focuses on forecasting the weather using quantum computers. Let's see how.

Forecasting the weather is difficult. It is hard to achieve 100% accuracy in weather prediction, especially when the weather is so changeable and the information available is limited. Advance warnings of wild weather are necessary to minimize the impact of catastrophic events and the ensuing devastation and loss, but current models can only predict regional-scale weather events such as snowstorms and hurricanes, not more localized events such as thunderstorms. Predicting when a simple storm might become dangerous requires computing power that can keep an eye on the whole globe, and that power is not yet available. Many of the world's largest supercomputers are already dedicated to weather forecasting, but to achieve greater accuracy they need even more computational brute force. Here comes the emergence of quantum computing.

Every year there are hurricanes, extreme heatwaves, tornadoes, and other extreme weather events, resulting in thousands of deaths and billions of dollars in damages. Prediction of extreme weather further in advance and with increased accuracy could allow for targeted regions to be better prepared to reduce loss of life and property damage.

Granted, there has been much work undertaken in the development of advanced computational models to enhance forecasting over the years, and much progress has been made. Weather forecasting requires analyzing huge amounts of data containing several dynamic variables, such as air temperature, pressure, and density that interact in a non-trivial way. However, there are limitations to using classical computers and even supercomputers in developing numerical weather and climate prediction models. Also, the process of analyzing the weather data by traditional computers may not be fast enough to keep up with ever-changing weather conditions.

Even local weather forecasting, which is changing all the time, stands to benefit from improved prediction. Take, for example, thunderstorms: highly accurate, advance prediction through improved data analysis could minimize the resulting damage, as earlier warnings about potential power outages would increase preparedness and allow the local community to restore power faster.

Quantum computing will serve to benefit weather forecasting both on the local scale and on a grander scale, enabling more advanced and accurate warning of extreme weather events, potentially saving lives and reducing property damage annually.

Quantum computing has the potential to improve conventional numerical methods to boost tracking and predictions of meteorological conditions by handling huge amounts of data containing many variables efficiently and quickly, by harnessing the computing power of qubits, and by using quantum-inspired optimization algorithms. Moreover, pattern recognition, crucial for understanding the weather, can be enhanced utilizing quantum machine learning.

The enhancement of weather forecasting using quantum computing is already on track to become a reality in the coming years.

The UK Met Office has already heavily invested in quantum computing to help improve forecasting, while IBM Research has collaborated with The Weather Company, the University Corporation for Atmospheric Research (UCAR), and the National Center for Atmospheric Research (NCAR) in America to develop a rapidly updating, storm-scale model that could predict thunderstorms at a local level. Their model is the first to cover the entire globe and will provide high-resolution forecasts even in the most underserved areas. It employs IBM's supercomputing technology and graphics processing units and, in the future, has the potential to combine with quantum computing to help track and predict meteorological conditions in ways that classical supercomputers are unable to achieve.


PQShield and Kudelski Security Partner to Address Quantum Threat – PRNewswire

Posted: at 7:13 am

OXFORD, England, Sept. 29, 2021 /PRNewswire/ -- PQShield, the cybersecurity company specialising in post-quantum cryptography, today announces a reciprocal agreement with Kudelski Security, the cybersecurity division within the Kudelski Group (SIX:KUD.S), to work together to address data protection as we near the era of quantum computing.

Within the agreement, PQShield will offer expert training and advisory services to Kudelski's clients, alongside post-quantum cryptography solutions for both software and hardware, to make preparations to address quantum threats. In addition to advisory services, there will be a collaborative referral and co-marketing relationship to support Kudelski Security's go-to-market strategy.

Kudelski Security launched its quantum security practice in December 2020 in recognition of the threat quantum technology raises in the future. Working with PQShield will bolster the practice by expanding availability of expert advisory and applied security resources, as well as providing clients the potential to utilise PQShield's quantum-ready cryptographic solutions for both software and hardware.

Headquartered in Oxford, with additional teams in the Netherlands and France, PQShield is helping businesses prepare for the quantum threat by pioneering the development and commercial deployment of quantum-ready cryptographic solutions for hardware, software and communications. With one of the UK's highest concentrations of cryptography PhDs outside academia and the classified sector, the PQShield team is also a leading contributor to NIST's post-quantum cryptography standardisation project, now in its concluding stages, having contributed two of the seven finalist algorithms.

Several leading device manufacturers have already chosen PQShield's PQC IP for integration into hardware security solutions that will be compliant with upcoming US NIST/FIPS post-quantum cryptography standards, as well as localised standards from the likes of BSI in Germany.

Dr. Ali El Kaafarani, CEO and Founder of PQShield, said:

"Our partnership with Kudelski Security offers a great opportunity for us to educate the wider business world on the risks quantum technologies pose to secure data. We want to support Kudelski Security - whose commitment to empowering businesses to tackle the quantum threat is laudable and necessary - in advising companies looking to take a proactive approach in mitigating new risks.

"I am proud of the endorsement of PQShield's expertise in quantum-resistant cryptography that this partnership represents, and am excited to see where the relationship takes us."

Andrew Howard, chief executive officer, Kudelski Security, said:

"This partnership will build on the internal knowledge of our teams of researchers, analysts and cryptographers, and bolster them with industry-leading expertise from PQShield."

Tommaso Gagliardoni, Senior Expert in Quantum Security and Cryptography at Kudelski Security, added:

"There are certain sectors in business and government where there is a desire to secure sensitive data over a longer time span. These sectors are already starting to address the quantum threat to prevent bad actors from leveraging this technology against their encrypted data. I look forward to working alongside the experts at PQShield to educate industry and share best practice to help organisations secure their sensitive information now and for years to come."

About PQShield

PQShield is a cybersecurity startup that specialises in post-quantum cryptography, protecting information from today's attacks while readying organisations for the threat landscape of tomorrow. It is the only cybersecurity company that can demonstrate quantum-safe cryptography on chips, in applications and in the cloud. Headquartered in Oxford, with additional teams in the Netherlands and France, its quantum-secure cryptographic solutions work with companies' legacy systems to protect devices and sensitive data now and for years to come. PQShield is principally backed by Kindred Capital, Crane Venture Partners, Oxford Sciences Innovation, and InnovateUK. Its latest white papers are available to read here.

http://www.pqshield.com

Public Press contact: [emailprotected]

SOURCE PQShield



Job Hunting? The Quantum Industry is Hiring for Diverse Positions: New Assessment by the Quantum Economic Development Consortium Shows Many…

Posted: at 7:13 am

Survey shows a wide range of skills and educational levels needed to support a diversity of jobs

ARLINGTON, Va., Sept. 28, 2021 /PRNewswire/ -- The Quantum Economic Development Consortium (QED-C) recently released an assessment, based on a survey of U.S. quantum businesses, outlining the diversity of jobs in the quantum industry requiring various skills and education levels. The study provides guidance to educators, policy makers and students to help grow a quantum-ready workforce. The analysis identified skills relevant to multiple jobs, several of which are not quantum-specific.

"This study provides timely insight into the wide variety of jobs required to support the emerging quantum industry. The study results will help the U.S. grow a quantum workforce with the relevant skills," said Corey Stambaugh with the National Quantum Coordination Office in the White House Office of Science and Technology Policy.

The paper includes recommendations for educators preparing students for the quantum industry, advising that those developing new degree programs provide both quantum-specific and general science, technology, engineering and mathematics (STEM) courses. It also encourages educators to consider adding broad quantum courses for students pursuing non-quantum degree programs, equipping them for multiple quantum-related roles.

The report acknowledges that business skills will become increasingly important as the industry continues its progress from research to commercialization, and suggests universities seek ways to prepare students for roles in sales and marketing.

"The QED-C workforce study highlights the opportunities and challenges for employers and prospective students for the quantum industry. The study also provides guidance to policy makers and educators on how best to prepare the future quantum workforce," said Alan Ho, Google Quantum AI product manager and QED-C steering committee member.

Information gathered from 57 QED-C member companies detailed specific work roles expected to be filled in the next five years and for each role, the associated skills and degrees required. Respondents were representative of the entire quantum supply chain, including hardware and software developers, component suppliers and end users.


An assessment by QED-C and Hyperion Research forecasted the quantum computing industry could grow to $830 million by 2024 with an estimated compound annual growth rate of 27 percent. Such growth in quantum computing and other areas of application requires thousands of additional scientists, engineers, technicians and other employees to fill the variety of jobs, including those identified in the new survey. Skills sought by the employers surveyed include quantum algorithm development, circuit design, systems architecture, and technical sales and marketing. The preferred degree varies by job category, from PhD to associate degree.
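As a rough illustration of how such forecast figures relate, the CAGR formula can be compounded forward or inverted to recover an implied starting market size. The sketch below assumes a 2020 base year purely for illustration; the report's actual base year and base value are not stated in this excerpt.

```python
def cagr_projection(base_value, rate, years):
    """Compound a starting value forward at a constant annual growth rate."""
    return base_value * (1 + rate) ** years

def implied_base(final_value, rate, years):
    """Invert the CAGR formula to recover the implied starting value."""
    return final_value / (1 + rate) ** years

# If $830M is the 2024 figure and 27% the CAGR, an assumed 2020 base year
# (four years of compounding) implies a starting market size of roughly:
base_2020 = implied_base(830, 0.27, 4)
print(round(base_2020))  # 319 (million dollars)
```

Compounding that implied base forward at 27 percent for four years recovers the $830 million figure, which is a quick consistency check on any CAGR claim.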

Growing the quantum workforce has been identified as an enabling factor to ensure the industry's success. The new report reveals the breadth of jobs and skills needed and can aid both educators and students to prepare for careers in this emerging field.

About Quantum Economic Development Consortium The Quantum Economic Development Consortium (QED-C) is an industry-driven consortium managed by SRI International and established in response to the 2018 National Quantum Initiative Act. Membership includes more than 120 US companies from across the supply chain and more than 40 academic institutions and other stakeholders. The consortium seeks to enable and grow the quantum industry and associated supply chain. For more about QED-C, visit quantumconsortium.org and follow us on Twitter @The_QEDC.

Contact: Celia Merzbacher, QED-C Executive Director, 319822@email4pr.com

Media Contact: Shannon Blood, (949) 777-2428, 319822@email4pr.com

Amanda Tomasetti, SRI International, 319822@email4pr.com


View original content to download multimedia:https://www.prnewswire.com/news-releases/job-hunting-the-quantum-industry-is-hiring-for-diverse-positions-new-assessment-by-the-quantum-economic-development-consortium-shows-many-traditional-skills-will-be-important-301385622.html

SOURCE SRI International


IBM CIO: ‘Quantum computing will be important in the IT landscape’ – Siliconrepublic.com

Posted: September 26, 2021 at 4:51 am

The CIO of Big Blue discusses the importance of AI and hybrid cloud, along with emerging tech such as quantum computing and cryptography.

Kathryn Guarini is the CIO of multinational tech giant IBM. Big Blue has been switching its focus to AI and cloud services in recent months, following the news that it is set to undergo a major restructuring.

Guarini and her team are responsible for developing, deploying and transforming the company's internal IT, including hardware, software and services, across more than 170 countries.

Her tenure at IBM spans more than two decades. Prior to being named CIO earlier this year, she was COO of IBM Research and vice-president for Impact Science, a research team within IBM that sought to apply deep technical expertise to the most pressing global challenges facing society while advancing the underlying science.

Guarini told Siliconrepublic.com that her CIO team supports every part of the business including digital workplace services, as well as thousands of business applications used by professionals in HR, sales, marketing, finance and more.


There are many major IT initiatives that we are focused on at IBM. Let me highlight three here.

First, Kyndryl. My team is playing a key role in supporting the separation of IBM's managed infrastructure business into an independent market-leading company called Kyndryl. IT plays a critical role in ensuring Kyndryl is set up for success with robust and secure infrastructure and applications, segmented to protect data and configured to run each business.

Second, hybrid cloud. We are adopting hybrid cloud at scale in IBM. That means we are moving IBM's internal IT workload from legacy data centres into public and private cloud environments to get the benefits of hybrid cloud from faster deployments to better availability to improved sustainability. Hybrid cloud offers a unified experience with end-to-end security and observability, harnessing the power of the open community.

Third, AI. AI is critical for business agility, resilience and growth. We are applying AI to automate business processes, modernise applications, predict outcomes and secure everything. As one example, we have applied AI to personalise and automate the IT support experience for IBMers, with AI-powered voice response, chat and search that improve the user experience.

Our CIO team is made up of more than 10,000 IBMers with a wide range of technical skills and expertise required to architect, develop, modernise and run IBM's internal IT systems. Our global team is organised into empowered agile squads doing iterative development in support of specific business solutions.

We partner with IBM's businesses, which simultaneously serve as stakeholders for our IT delivery and providers of differentiating technology and services that we adopt on behalf of IBM.

We recognise that building, fostering and advancing a pipeline of diverse talent is key to our long-term success. We've made focused investments in critical skills that enable transformation, including software development, design, AI, automation, cloud computing, cybersecurity and software-defined networking.

We also have domain expertise in areas like human resources, sales, marketing and finance to help us better serve the IBM company and improve the employee experience.

Like most enterprises, IBM is on its own digital transformation journey, leveraging technologies like hybrid cloud and AI to unlock new business value and accelerate innovation. Hybrid cloud is the foundation of our cognitive enterprise. AI helps make better decisions, automate tasks, streamline processes and enable self-service across the enterprise.

Our CIO team is responsible for modernising our infrastructure and application environment and redesigning our business architecture into end-to-end intelligent workflows.

For example, we have transformed IBM's sales processes, launching a new solution that offers simplification and insight. The solution involved consolidating more than 160 different sales tools into one scalable, global platform to optimise IBM's relationship with our customers.

In another example, we transformed IBM's global client support process, providing an improved experience, operational efficiency and AI-driven insights.

I've already talked about how hybrid cloud and AI are hugely important technologies today to drive business transformation. We're using both to enable intelligent operations, modernise applications and automate insights.

What's next? Quantum computing will become an important part of the IT landscape, offering competitive advantage to those who can capitalise on the unique capabilities of this new computing paradigm.

Quantum is maturing quickly, with rapid advances in the technology, software ecosystem and use cases across industries. Along with quantum computing comes advanced quantum-safe cryptography solutions that enable encryption that even large-scale fault-tolerant quantum computers can't crack.

There's a tremendous amount of technology innovation happening here that promises to have outsized impact on our industry and the world.

Cybersecurity is a business imperative and the recent rise in sophisticated cyberattacks requires us to take innovative approaches to secure the enterprise.

We have adopted a zero-trust framework, which includes advanced identity protection, vulnerability management and threat detection.

We are adopting security-by-design approaches in the development of our IT solutions to ensure they are foundationally secure against growing threats. And we are adopting IBMs confidential computing technologies to protect sensitive data at all times.



Trailblazing Supercomputer Will Enable Scientists And Engineers To Optimize Its Hardware To Support Groundbreaking Research – Texas A&M University…

Posted: at 4:51 am

The system will let researchers perform calculations and solve problems that current supercomputers cannot handle.

Henrik5000/iStock.com

Backed by a multi-million-dollar federal grant, a research team from three major universities will soon start working on a pioneering supercomputing system that allows scientists and engineers to align its processors, accelerators, memory and other hardware components to best serve their needs.

This innovative system will operate increasingly complex levels of software while sidestepping the hardware bottlenecks that often hinder high-level computations. This system will let researchers perform calculations and solve problems that current supercomputers cannot handle.

On Oct. 1, 2021, researchers from Texas A&M University, the University of Illinois Urbana-Champaign (UIUC) and the Texas Advanced Computing Center (TACC) at The University of Texas at Austin (UT Austin) will begin collaborating on a prototype for what they call the Accelerating Computing for Emerging Sciences (ACES) system.

The National Science Foundation (NSF) will provide $5 million for ACESs development and an additional $1 million per year over five years to pay for system operation and support.

Texas A&M Interim Vice President for Research Jack Baldauf expressed gratitude to the NSF for its substantial investment in the ACES project. "We are thankful to NSF for the opportunity to lead such an important initiative and to our Texas A&M HPRC staff and collaborators at UT Austin and UIUC for making this a successful effort," Baldauf said. "Computational science is critical to our national needs and the ACES platform will not only advance research but also help educate the future workforce in this area."

The team's goal is to develop an all-inclusive system that will serve researchers across a wide range of scholarly disciplines and computer skills, according to Honggao Liu, executive director of Texas A&M's High Performance Research Computing (HPRC) and the project's principal investigator.

These disciplines include artificial intelligence and machine learning, cybersecurity, health population informatics, genomics and bioinformatics, human and agricultural life sciences, oil-and-gas simulations, new-materials design, climate modeling, molecular dynamics, quantum-computing architectures, imaging, smart and connected societies, geosciences and quantum chemistry.

"The ACES system will support the national research community through coordination systems supported by the NSF," Liu said. "In this way, the ACES system will provide invaluable support to cutting-edge projects across a broad spectrum of research disciplines in the nation. ACES will also leverage HPRC's efforts that promote science and broaden participation in computing at the K-12, collegiate and professional levels to have a transformative impact nationally by focusing on training, education and outreach."

Researchers should think of ACES as a "cyber-buffet," said Timothy M. Cockerill, director of user services, TACC at UT Austin, and a co-principal investigator on the ACES project. "They'll be able to essentially build the custom environment they require on a per-job basis and not be constrained to the contents of a physical server node," Cockerill said.

ACES will open new avenues to scientific advancement, said Shaowen Wang, head of the Department of Geography and Geographic Information Science, professor at UIUC and a co-principal investigator on the ACES project. "Exciting advances on many science frontiers will become possible by harnessing the hybrid computing resources and highly adaptable framework offered by ACES to enable increasingly complex scientific workflows driven by geospatial big data and artificial intelligence," Wang said.

Also serving as co-principal investigators are Lisa M. Perez, associate director for advanced computing enablement, and Dhruva Chakravorty, associate director for user services and research, both from HPRC at Texas A&M.

"Research that generates breakthrough discoveries will require highly advanced computer designs that can meet the challenge," Texas A&M Senior Associate Vice President for Research Costas N. Georghiades said. "With the increasing complexity of computational problems in the big-data era we live in, it is no longer sufficient to use traditional supercomputers, which rely only on optimizing the software," Georghiades said. "The ACES system will also be able to adapt hardware resources on the fly to tackle complex computational tasks more efficiently. Texas A&M is proud to lead this effort in collaboration with our university partners at UT Austin and Illinois."

Technical description

ACES leverages an innovative composable framework via PCIe (Peripheral Component Interconnect Express) Gen5 on Intel's upcoming Sapphire Rapids (SPR) processors to offer a rich accelerator testbed consisting of Intel Ponte Vecchio (PVC) GPUs (Graphics Processing Units), Intel FPGAs (Field Programmable Gate Arrays), NEC Vector Engines, NextSilicon co-processors and Graphcore IPUs (Intelligence Processing Units).

The accelerators are coupled with Intel Optane memory and DDN Lustre storage, interconnected with Mellanox NDR 400Gbps InfiniBand, to support workflows that benefit from optimized devices. ACES will allow applications and workflows to dynamically integrate the different accelerators, memory and in-network computing protocols to glean new insights by rapidly processing large volumes of data. It will also provide researchers with a unique platform to produce complex hybrid programming models for effectively supporting calculations that were not feasible before.

About Research at Texas A&M University: As one of the world's leading research institutions, Texas A&M is at the forefront in making significant contributions to scholarship and discovery, including science and technology. Research conducted at Texas A&M generated annual expenditures of more than $1.131 billion in fiscal year 2020. Texas A&M ranked in the top 25 of the most recent National Science Foundation Higher Education Research and Development survey based on expenditures of more than $952 million in fiscal year 2019. Texas A&M's research creates new knowledge that provides basic, fundamental, and applied contributions resulting in economic benefits to the state, nation, and world. research.tamu.edu


CSC-IT: Finnish businesses start cooperation to capture the benefits of quantum technologies – Science Business

Posted: at 4:51 am

As the global quantum computing market is forecast to reach over EUR 50 billion by 2030, Finnish companies have started cooperation to capture the business opportunities emerging from advances in quantum technologies.

OP Financial Group, Accenture, CSC IT Center for Science, and globally recognized quantum technology companies Bluefors and IQM are the first to join BusinessQ, a VTT-coordinated network to support businesses in the adoption and development of quantum technologies and solutions. The companies will work together to build a business roadmap for Finland around the opportunities of quantum technologies.

Bringing together companies and organisations with quantum expertise, as well as potential end-users, BusinessQ works to position Finnish businesses at the global forefront of adopting new quantum-enabled technologies.

"Developments in quantum technologies will create new opportunities for Finnish companies. At VTT, we have decades of experience in turning emerging technologies into viable business, and now we want to foster an active community and support Finnish industries and society in capturing the benefits of quantum technologies early on," says VTT's Erja Turunen, Executive Vice President, Digital technologies.

BusinessQ wants to grow and attract new companies from different industries to join the network. Cooperation and dialogue can benefit both companies and the quantum research community as it provides a better understanding of the different industry challenges that quantum-based technologies could tackle in the future.

"Discussions with our first BusinessQ partners have shown that Finnish businesses have curiosity, ambition, and an open approach to quantum technologies. We are eager to welcome more companies from different industries and want to build an active business community around the opportunities of quantum technologies," explains Himadri Majumdar, Manager of Quantum Programmes at VTT.

Finland also has an active research community that fosters innovation around quantum technologies. In April 2021, Aalto University, Helsinki University, and VTT announced InstituteQ: The Finnish Quantum Institute, which aims at raising the readiness of Finnish society for the disruptive potential and implications quantum technologies will have for society and the economy at large. In this context, it coordinates operations that foster collaboration in research, education, innovation and infrastructure in the field of quantum technologies. BusinessQ's activities share the mission of InstituteQ in strengthening Finland's growing quantum ecosystem. VTT also hosts Finland's first quantum computer, which is being built in Espoo in partnership with IQM.

This article was first published on September 22 by CSC.


IonQ Unveils The Power Of Its Next-Generation Quantum Computer Along With Quantum Finance Announcements – Forbes

Posted: September 24, 2021 at 10:38 am


This week, IonQ, Inc. (IonQ) announced the research results for two separate finance-related quantum research projects. The announcement covered one project with the Fidelity Center for Applied Technology, and one with Goldman Sachs and QC Ware.

While classical computers use bits for computation, quantum computers use quantum bits, or qubits. IonQ uses ion qubits created using precision lasers to remove an outer electron from an atom of ytterbium. IonQ has plans to evolve its existing architecture to a more advanced version sometime in the future. The power of its new hardware was demonstrated in the Goldman Sachs and QC Ware research below. Moor Insights & Strategy previously wrote about IonQ's evolution to its new architecture here.

Even though the finance industry is computationally intensive, applications containing large numbers of variables are too complex to run on classical computers. Eventually, solutions for classically intractable problems will become available using quantum computers. Most experts believe it will likely take another five to seven years before today's quantum machines have enough power to move from experimental prototypes to production environments.

When that happens, financial institutions will begin to use quantum computers for everything from the pricing of option derivatives to risk management and liquidity coverage. Even though that is still a few years away, most major finance companies have already begun to staff quantum computing research departments.

Here is a summary of IonQ's recent announcements:

IonQ and Fidelity Center for Applied Technology (FCAT)


FCAT and IonQ researchers used IonQ's cloud-based quantum computer to develop a quantum machine learning (QML) proof of concept that achieved far better results than previous research. It's important to note that this research also demonstrated that quantum computers can outperform classical computers for limited price correlation analysis in the finance industry. Technical details of the study are available here.

Historical data is heavily used for training and analysis in today's financial models. For the output to be correct and free from bias, the training data must accurately reflect the characteristics of the modeled scenario. A standard testing process called "backtesting" uses data that is separate from the training data but believed to be similar to determine whether a model produces accurate results. However, backtesting is often insufficient because it is challenging to obtain test data that accurately depicts all the market scenarios represented in an extensive training dataset.

The FCAT-IonQ team built a quantum AI model that created a new and more accurate set of synthetic data to obtain accurate data for backtesting. The synthetic data was created from samples of the same data used to train the model. This procedure is much like the uncanny ability of AI models trained on facial images to create new and authentic faces that look identical to real people.

Instead of facial images, the IonQ and FCAT teams modeled numerical relationships contained in the daily returns of Apple and Microsoft stock from 2010 to 2018. Two quantum machine learning algorithms used this data to produce a highly accurate synthetic data set for backtesting.
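The quantum model itself is not described in detail here, but the underlying backtesting idea can be illustrated classically. In the sketch below, a multivariate Gaussian stands in for the generative model (the study's quantum ML algorithms aim to capture richer structure than means and covariances), and the "historical" returns are simulated placeholders, not actual Apple or Microsoft data:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Placeholder "historical" daily returns for two stocks (rows = days).
# In the study these were Apple and Microsoft returns, 2010-2018.
historical = rng.multivariate_normal(
    mean=[0.0005, 0.0006],
    cov=[[1e-4, 6e-5], [6e-5, 1.2e-4]],
    size=2000,
)

# Fit a simple generative model to the history: estimate its mean vector
# and covariance matrix. The quantum ML model plays this role in the study.
mu = historical.mean(axis=0)
sigma = np.cov(historical, rowvar=False)

# Sample a fresh synthetic dataset to use for backtesting.
synthetic = rng.multivariate_normal(mu, sigma, size=2000)

# A useful sanity check: the synthetic data should preserve the numerical
# relationships (here, the cross-stock correlation) seen historically.
hist_corr = np.corrcoef(historical, rowvar=False)[0, 1]
synth_corr = np.corrcoef(synthetic, rowvar=False)[0, 1]
print(f"historical corr {hist_corr:.2f}, synthetic corr {synth_corr:.2f}")
```

The appeal of the approach is that the synthetic sample is new data drawn from the learned distribution rather than a reshuffling of the training set, which is what makes it usable for backtesting.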

A few considerations:

IonQ, Goldman Sachs, and QC-Ware

IonQ

Using IonQ's newest quantum computing hardware, Goldman Sachs and QC-Ware teamed up to push quantum boundaries beyond previous research. The team demonstrated a quantum algorithm developed by QC-Ware for Monte Carlo simulations on IonQ's recently announced quantum processing unit (QPU), Evaporated Glass Trap. Applications of quantum Monte Carlo methods to problems in computational finance have been the subject of several previous research papers. That research involved applying quantum Monte Carlo to specific financial problems such as pricing simple options and credit risk calculations.

According to IonQ, its new QPU has an order of magnitude better fidelity and better throughput than its current generation of quantum processors. In its press release, Peter Chapman, CEO and President of IonQ, emphasized the importance of using a combination of state-of-the-art hardware and best-in-class quantum algorithms.

In the published results, the quantum researchers also attributed the project's success to the high fidelity of IonQ's quantum hardware. The researchers also stated that similar experiments were attempted using other quantum hardware available on the cloud but obtained "considerably worse results."

Quantum computers are expected to have a major impact not only on Monte Carlo simulations, but in other areas of science and engineering as well. The Monte Carlo simulations demonstrated by this research are especially important to finance in the areas of risk and derivative pricing for such things as options. Some estimates size the derivatives market at over one quadrillion dollars. Monte Carlo simulations are usually run on classical computers and require the algorithm to be run a number of times to obtain an estimated answer with acceptable precision. The precision of estimated answers can be improved by increasing the number of samples. For example, to increase a classical computer's answer precision by one order of magnitude requires increasing sampling by 100X. For equivalent accuracy, a quantum computer would only require a sampling increase of 10X. When large fault-tolerant quantum computers become available, they will significantly reduce the amount of time needed to obtain solutions for complicated Monte Carlo problems containing a large number of variables. In finance, time is an important commodity: a few seconds in a large, fast-moving market such as stocks and options can mean the difference between a profit and a loss.
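The 100X-versus-10X comparison reflects the quadratic speedup of quantum amplitude estimation, whose error shrinks as 1/N, over classical Monte Carlo, whose error shrinks as 1/sqrt(N). A minimal sketch of what one extra order of magnitude of precision costs under each scaling law (the base sample count is an arbitrary illustration):

```python
def classical_samples(base_samples, precision_gain):
    """Classical Monte Carlo error scales as 1/sqrt(N), so a k-fold
    precision gain costs k^2 times as many samples."""
    return base_samples * precision_gain ** 2

def quantum_samples(base_samples, precision_gain):
    """Quantum amplitude estimation error scales as 1/N, so a k-fold
    precision gain costs only k times as many samples."""
    return base_samples * precision_gain

# One extra order of magnitude of precision (k = 10):
print(classical_samples(1_000, 10))  # 100000 -> 100X more samples
print(quantum_samples(1_000, 10))    # 10000  -> 10X more samples
```

The gap widens with each further order of magnitude, which is why Monte Carlo workloads are considered one of the more promising early use cases for fault-tolerant quantum computers.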

Notes:

Note: Moor Insights & Strategy writers and editors may have contributed to this article.

Moor Insights & Strategy, like all research and analyst firms, provides or has provided paid research, analysis, advising, or consulting to many high-tech companies in the industry, including 8x8, Advanced Micro Devices, Amazon, Applied Micro, ARM, Aruba Networks, AT&T, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, Calix, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Digital Optics, Dreamchain, Echelon, Ericsson, Extreme Networks, Flex, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Google (Nest-Revolve), Google Cloud, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Ion VR, Inseego, Infosys, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, MapBox, Marvell, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Mesophere, Microsoft, Mojo Networks, National Instruments, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nuvia, ON Semiconductor, ONUG, OpenStack Foundation, Oracle, Poly, Panasas, Peraso, Pexip, Pixelworks, Plume Design, Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Residio, Samsung Electronics, SAP, SAS, Scale Computing, Schneider Electric, Silver Peak, SONY, Springpath, Spirent, Splunk, Sprint, Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, TE Connectivity, TensTorrent, Tobii Technology, T-Mobile, Twitter, Unity Technologies, UiPath, Verizon Communications, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zebra, Zededa, and Zoho, which may be cited in blogs and research.


Archer Materials well-funded to advance development of 12CQ quantum computing and "lab-on-a-chip" biochip technologies – Proactive Investors USA

Posted: at 10:38 am

During FY21 the company strengthened its transformation into a pure materials technology play by disposing of mineral exploration assets.

Archer Materials made considerable progress during the financial year ended June 30, 2021, and is well-funded to drive the ongoing development of its 12CQ quantum computing and lab-on-a-chip biochip technologies.

The company is developing these advanced semiconductor devices for commercialisation in the multi-billion-dollar global quantum technology and human health industries.

It is doing so after consolidating its technology development to operate out of a world-class semiconductor research and prototyping foundry in Sydney, and linked nodes of Australian technology development facilities.

The company is well funded to advance its technology strategy with a net cash position of $6.2 million on June 30, 2021.

Archer has made considerable progress in modelling qubit behaviour and the control of qubits, and the control measurements going forward will be world-first, particularly for solid-state, non-optical quantum computing systems.

During the period, the company signed a new agreement with IBM, allowing it to retain membership to the global IBM Quantum Network and the associated IBM Quantum Startup Program, and to progress the work initiated under the previous agreement.

In addition, Archer began working with Max Kelsen, another Australian member of the IBM Quantum Network, on possible end-use cases for the 12CQ chip.

This collaboration has so far involved adapting a unique class of quantum algorithms for potential big-data-related applications of quantum computing.

The company also signed a non-binding letter of intent (LOI) with the Australian Missile Corporation Pty Ltd, a wholly-owned subsidiary of Australian Defence Prime Contractor NIOA.

By signing the LOI, Archer confirmed its interest in cooperating with the AMC to help fulfil the Australian Government's long-term vision of developing sovereign Australian defence industrial capabilities.

Archer's biochip is at an earlier stage of development than the 12CQ chip, with the biochip nestled within the product category of MEMS/sensor devices in the semiconductor industry.

Notably, biochip design principles involve using proprietary graphene-based materials to form the critical sensing elements in lab-on-a-chip technology.

One technological barrier to commercialising such devices involves nanofabrication; another is assembling a talented multidisciplinary team.

Archer's biochip technology.

As Archer moves towards commercialisation, intellectual property (IP) protection and patents will become crucial to stop others from manufacturing, using or selling its technologies in the relevant jurisdictions without the company's permission.

Promisingly, the company is working through patent application procedures in Europe, Hong Kong and Australia, after having patents granted in China, South Korea, Japan and the United States of America.

Archer's strategic focus on technology strengthened with the completion of the sale of the Leigh Creek Magnesite Project, the Kelly Tank and Jamieson Tank projects, and an agreement with iTech Minerals Ltd for the conditional sale of all remaining mineral tenements.

Upon completion of the iTech sale, the company will receive 50 million iTech shares, which it intends to pass on to shareholders through a pro-rata in-specie distribution.

Archer will not hold any iTech shares after the completion of the in-specie distribution; however, it will keep the 2.0% net smelter return (NSR) royalty granted on the Jamieson Tank and Kelly Tank projects.

The transformation has been accompanied by an increase in share price over FY21 from A$0.36 at market close on July 27, 2020, to A$0.95 on June 30, 2021. Since then the stock has hit a new high of A$3.08 in mid-August, and the market cap is now approximately A$454.3 million.

