2022-08-09 | NDAQ:WKEY | Press Release | WISeKey International Holding AG – Stockhouse

WISeKey Implementing Post-Quantum Algorithms in its Secure Semiconductors MS6001/MS6003

Geneva, August 9, 2022. WISeKey International Holding Ltd ("WISeKey") (SIX: WIHN, NASDAQ: WKEY), a leading global cybersecurity, AI, Blockchain and IoT company, announces substantial progress in the implementation of post-quantum algorithms in its Secure Semiconductors MS6001/MS6003.

Over the last two years, WISeKey has made substantial progress in developing post-quantum-resistant algorithms by establishing strategic R&D partnerships with the MINES Saint-Etienne Research Institute ("MINES Saint-Etienne"), an internationally renowned multidisciplinary university and lab created in 1816. The collaboration aims to help the international community find cryptographic algorithms that will resist future quantum-computing-based cyberattacks.

WISeKey's team of experts is working with several NIST candidate algorithms for the MS600X Common Criteria products: CRYSTALS-Kyber for the key-exchange mechanism and CRYSTALS-Dilithium for signatures. The partnership focuses on the practical implementation aspects of both algorithms, considering physical side-channel attacks and deep-learning-based analysis. This work complements the NTRU and ROLLO algorithms the team has already studied, paving the way for a complete post-quantum cryptography toolbox.
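A key-encapsulation mechanism (KEM) such as CRYSTALS-Kyber always follows the same three-step flow: key generation, encapsulation, decapsulation. The sketch below illustrates only that data flow with a toy stand-in; it has no security whatsoever and none of Kyber's lattice mathematics, and all names are illustrative.

```python
import hashlib
import os

# Toy stand-in for a KEM, showing only the three-step data flow that
# Kyber uses. NOT secure and NOT real Kyber -- illustration only.

def keygen():
    """Generate a (public, secret) key pair."""
    sk = os.urandom(32)
    pk = hashlib.sha256(sk).digest()  # real Kyber derives pk via lattice math
    return pk, sk

def encapsulate(pk):
    """Sender: derive a fresh shared secret plus a ciphertext for the receiver."""
    ss = os.urandom(32)
    ct = bytes(a ^ b for a, b in zip(ss, pk))  # placeholder for lattice encryption
    return ct, ss

def decapsulate(sk, ct):
    """Receiver: recover the shared secret from the ciphertext."""
    pk = hashlib.sha256(sk).digest()
    return bytes(a ^ b for a, b in zip(ct, pk))

pk, sk = keygen()
ct, ss_sender = encapsulate(pk)
ss_receiver = decapsulate(sk, ct)
assert ss_sender == ss_receiver  # both sides now hold the same shared secret
```

In a hybrid deployment of the kind ANSSI recommends, the post-quantum shared secret would additionally be combined with one from a classical exchange such as ECDH, so that security holds if either scheme survives.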

This post-quantum cryptography toolbox will help protect against the security threat posed by quantum computers, enabling hybrid solutions no later than 2025, as recommended by France's ANSSI. In addition, WISeKey will upgrade its PKI offering with new post-quantum features for the IoT market: secure authentication, brand protection, network communications, future FIDO ("Fast IDentity Online") evolutions, and other web-connected smart devices that collect, analyze, and process data from their surroundings.

WISeKey is also working with NIST to define recommended practices for performing trusted network-layer onboarding, which will aid in the implementation and use of trusted onboarding solutions for IoT devices at scale. WISeKey's contribution to the project will be Trust Services for credentials and secure semiconductors to keep credentials secure. Specifically, WISeKey will offer its INeS Certificate Management Service (CMS) for issuing credentials and its VaultIC secure semiconductors to provide tamperproof key storage and cryptographic acceleration.

While quantum computing promises enormous increases in computing power, hackers will also take advantage of this technology to crack cryptographic algorithms, corrupt cybersecurity and compromise the global economy. Research on quantum computing, namely how to use quantum mechanical phenomena to perform fast computation, began in the early 1980s. The prospects offered by this promising technology are so significant that many countries are sponsoring public/private R&D initiatives.

WISeKey brings its decades of expertise in designing Common Criteria EAL5+ and FIPS 140-2 Level 3 certified hardware-based secure elements (MS600x secure microcontrollers, VaultIC, etc.) and in developing hacker-resistant firmware. The new algorithms to be evaluated will first have to run practically on WISeKey's existing and new hardware architectures. The Company will also share its expertise in deep-learning AI techniques to prove the robustness of the implementations.

About WISeKey: WISeKey (NASDAQ: WKEY; SIX Swiss Exchange: WIHN) is a leading global cybersecurity company currently deploying large-scale digital identity ecosystems for people and objects using Blockchain, AI and IoT, respecting the Human as the Fulcrum of the Internet. WISeKey microprocessors secure the pervasive computing shaping today's Internet of Everything. WISeKey IoT has an installed base of over 1.5 billion microchips in virtually all IoT sectors (connected cars, smart cities, drones, agricultural sensors, anti-counterfeiting, smart lighting, servers, computers, mobile phones, crypto tokens, etc.). WISeKey is uniquely positioned at the edge of IoT, as its semiconductors produce a huge amount of Big Data that, when analyzed with Artificial Intelligence (AI), can help industrial applications predict the failure of their equipment before it happens.

Our technology is trusted by the OISTE/WISeKey Swiss-based cryptographic Root of Trust ("RoT"), which provides secure authentication and identification, in both physical and virtual environments, for the Internet of Things, Blockchain and Artificial Intelligence. The WISeKey RoT serves as a common trust anchor to ensure the integrity of online transactions among objects and between objects and people. For more information, visit http://www.wisekey.com.

Press and investor contacts:
WISeKey International Holding Ltd
Company Contact: Carlos Moreira, Chairman & CEO
Tel: +41 22 594 3000
info@wisekey.com
WISeKey Investor Relations (US)
Contact: Lena Cati, The Equity Group Inc.
Tel: +1 212 836-9611
lcati@equityny.com

Disclaimer: This communication expressly or implicitly contains certain forward-looking statements concerning WISeKey International Holding Ltd and its business. Such statements involve certain known and unknown risks, uncertainties, and other factors, which could cause the actual results, financial condition, performance, or achievements of WISeKey International Holding Ltd to be materially different from any future results, performance or achievements expressed or implied by such forward-looking statements. WISeKey International Holding Ltd is providing this communication as of this date and does not undertake to update any forward-looking statements contained herein as a result of new information, future events or otherwise. This press release does not constitute an offer to sell, or a solicitation of an offer to buy, any securities, and it does not constitute an offering prospectus within the meaning of article 652a or article 1156 of the Swiss Code of Obligations or a listing prospectus within the meaning of the listing rules of the SIX Swiss Exchange. Investors must rely on their own evaluation of WISeKey and its securities, including the merits and risks involved. Nothing contained herein is, or shall be relied on as, a promise or representation as to the future performance of WISeKey.


UArizona leads international partnership to boost development of the internet of the future – University of Arizona News

By Daniel Stolte, University Communications


The University of Arizona Center for Quantum Networks is leading a new international research and development partnership that will investigate technologies that will form the foundations of a quantum internet.

The partnership, with research centers in the Republic of Ireland and Northern Ireland, was made possible by a combined investment of $3 million from the National Science Foundation, Science Foundation Ireland and the Northern Ireland Department for the Economy.

Dubbed CoQREATE, which stands for Convergent Quantum Research Alliance in Telecommunications, the transatlantic partnership will focus on developing technology that will provide connectivity between quantum computers over short and long distances.

Quantum computers are being rapidly developed, with the first devices already commercially available. Unlike conventional computers, which use electrical charges inside semiconductors, commonly referred to as "zeros and ones," quantum computers harness quantum mechanical effects through quantum bits, or "qubits," which make them orders of magnitude faster at, and more capable of, certain enormously complex calculations.

"Because they compute using qubits, networking quantum computers will require fundamentally new communications infrastructure that is capable of transmitting packets of qubits reliably and fast over long distances, while relying on the classical internet for some of their functions," said Saikat Guha, a professor in the UArizona James C. Wyant College of Optical Sciences and director of the Center for Quantum Networks, a National Science Foundation Engineering Research Center based at the university.

Guha said the quantum internet will not replace the existing "classical" internet. Rather, the two will coexist and cooperate to allow for many new applications that are not possible today. The quantum internet will need a robust classical communications backbone to function, thereby increasing the burden on the classical network, and some of the underlying technologies of the existing internet will need to be upgraded to be compatible with quantum communications.

"CoQREATE is an effort to figure out how these two technologies can work together," Guha said.

The partnership brings together four large research centers: the Center for Quantum Networks; Science Foundation Ireland's Research Centre for Future Networks and Communications; the Irish Photonic Integration Centre, a center of excellence for research and training in photonics; and Quantum Technology at Queen's University in Northern Ireland. The project provides funding for at least 10 research positions.

Quantum computers have the potential to perform many computing tasks faster than classical computers, in some cases solving problems that are impossible for classical computers to solve with today's computing power. Quantum computing takes advantage of two quantum mechanical phenomena: superposition, which allows a qubit to represent multiple states at once, and entanglement, which correlates the states of qubits even when they are far apart.

The quantum internet will surpass the capabilities of today's internet because of the unique advantages of entanglement, which will improve the internet in at least two important ways, Guha said.

"First, it will enable physics-based communications security and privacy guarantees in multi-party transactions that cannot be compromised by any amount of computational power," he said. "Second, it will create a global network of quantum computers, processors and sensors that are fundamentally more powerful than today's technology. This will bring unprecedented advances in distributed computing and powerful long-baseline telescope systems, and enable secure access to quantum computers for the public."

Linking computers together over a quantum internet will allow for quantum computing with even greater computational power compared to that of individual computers. The new internet will be a collection of repeaters, routers, switches and other elements that allow the distribution of entanglement over large distances: across cities, countries and, eventually, continents.

"In addition to the technology aspect, the CoQREATE partnership will include research on socio-technical convergence to bring a broader social perspective to the program," Guha said. "This research will focus on societal impacts of quantum internet-enabled technologies surrounding privacy and security, unintended implicit biases embedded in the technology, equitable access and education."


Quantum Computing in Transportation Market 2022 Industry Analysis by Geographical Regions, Type and Application, Forecast to 2028 Shanghaiist -…

MarketsandResearch.biz has conducted a statistical survey of the global Quantum Computing in Transportation market covering the years 2022-2028. The research methodology aims to give a deeper picture of the ongoing, as well as upcoming, changes the Quantum Computing in Transportation market is and will be exposed to over the forecast period.

The report aims to address the needs of clients who intend to draw conclusions about the Quantum Computing in Transportation market. Accordingly, it also presents the above-mentioned key segments in a country-wise analysis.

DOWNLOAD FREE SAMPLE REPORT: https://www.marketsandresearch.biz/sample-request/239446

The report takes into account both the qualitative and quantitative aspects of the Quantum Computing in Transportation market. The qualitative section includes information about market driving forces, opportunities, and customer demands and requirements, which helps organizations develop new strategies to compete over the long haul. The quantitative part, on the other hand, contains industry data screened thoroughly by experts to support its analysts' conclusions. A range of sources is used: articles, annual reports, databases of both governments and NGOs, and data gathered from industry specialists, consultants and others. The figures presented are based on common research assumptions that vary from region to region.

The report is divided into four major segments depending on the product under study:

Key players in the market

Nations covered

Type

ACCESS FULL REPORT: https://www.marketsandresearch.biz/report/239446/global-quantum-computing-in-transportation-market-2021-by-company-regions-type-and-application-forecast-to-2026

Application

Customization of the Report:

This report can be customized to meet a client's requirements. Please connect with our sales team (sales@marketsandresearch.biz), who will ensure that you get a report that suits your needs. You can also get in touch with our executives on 1-201-465-4211 to share your research requirements.

Contact Us
Mark Stone
Head of Business Development
Phone: 1-201-465-4211
Email: sales@marketsandresearch.biz
Web: http://www.marketsandresearch.biz


USC’s Biggest Wins in Computing and AI – USC Viterbi | School of Engineering – USC Viterbi School of Engineering

USC has been an animating force for computing research since the late 1960s.

With the advent of the USC Information Sciences Institute (ISI) in 1972 and the Department of Computer Science in 1976 (born out of the Ming Hsieh Department of Electrical and Computer Engineering), USC has played a propulsive role in everything from the internet to the Oculus Rift to recent Nobel Prizes.

Here are seven of those victories, reimagined as cinemagraphs: still photographs animated by subtle yet remarkable movements.

Cinemagraph: Birth of .Com

1. The Birth of the .com (1983)

While working at ISI, Paul Mockapetris and Jon Postel pioneered the Domain Name System, which introduced the .com, .edu, .gov and .org internet naming standards.

As Wired noted on the 25th anniversary: "Without the Domain Name System, it's doubtful the internet could have grown and flourished as it has."

The DNS works like a phone book for the internet, automatically translating text names, which are easy for humans to understand and remember, to numerical addresses that computers need. For example, imagine trying to remember an IP address like 192.0.2.118 instead of simply usc.edu.
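The phone-book analogy can be made concrete in a few lines. The sketch below is a conceptual toy: the table entries are illustrative (192.0.2.x is the reserved documentation address range), and a real resolver walks a hierarchy of name servers rather than a local dictionary.

```python
# Toy "phone book" lookup illustrating what DNS does conceptually.
# Real resolution queries a hierarchy of name servers (root -> .edu -> usc.edu).
HOSTS = {
    "usc.edu": "192.0.2.118",  # illustrative addresses drawn from the
    "isi.edu": "192.0.2.7",    # reserved documentation range (TEST-NET-1)
}

def resolve(name: str) -> str:
    """Translate a human-readable name to a numerical address."""
    try:
        return HOSTS[name.lower().rstrip(".")]
    except KeyError:
        raise LookupError(f"no such name: {name}")

print(resolve("usc.edu"))  # -> 192.0.2.118
```

In real code the same job is done by the operating system's resolver, e.g. Python's `socket.gethostbyname("usc.edu")`, which performs the network queries behind the scenes.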

In a 2009 interview with NPR, Mockapetris said he believed the first domain name he ever created was isi.edu for his employer, the (USC) Information Sciences Institute. That domain name is still in use today.

Grace Park, B.S. and M.S. '22 in chemical engineering, re-creates Len Adleman's famous experiment.

2. The Invention of DNA Computing (1994)

In a drop of water, a computation took place.

In 1994, Professor Leonard Adleman, who coined the term "computer virus," invented DNA computing, which involves performing computations using biological molecules rather than traditional silicon chips.

Adleman, who received the 2002 Turing Award, often called the "Nobel Prize of computer science," saw that a computer could be something other than a laptop or a machine using electrical impulses. After visiting a USC biology lab in 1993, he recognized that the 0s and 1s of conventional computers could be replaced with the four DNA bases: A, C, G and T. As he later wrote, "a liquid computer can exist in which interacting molecules perform computations."
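The substitution Adleman noticed can be made concrete: each DNA base can carry two bits, so any binary data can be written as a strand of A, C, G and T. The mapping below is one arbitrary choice for illustration, not the encoding used in his actual experiment.

```python
BASES = "ACGT"  # one arbitrary 2-bits-per-base mapping: A=00, C=01, G=10, T=11
IDX = {b: i for i, b in enumerate(BASES)}

def to_dna(data: bytes) -> str:
    """Encode bytes as a DNA strand, 4 bases per byte (most significant bits first)."""
    return "".join(BASES[(byte >> shift) & 3]
                   for byte in data
                   for shift in (6, 4, 2, 0))

def from_dna(strand: str) -> bytes:
    """Decode a strand produced by to_dna back into bytes."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | IDX[base]
        out.append(byte)
    return bytes(out)

assert to_dna(b"\x1b") == "ACGT"           # 0x1b = 00 01 10 11
assert from_dna(to_dna(b"DNA")) == b"DNA"  # lossless round trip
```

Adleman's 1994 experiment went further than encoding: he used the chemistry of strand hybridization to search for a Hamiltonian path, so the molecules themselves performed the computation.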

As the New York Times noted in 1997: "Currently the world's most powerful supercomputer sprawls across nearly 150 square meters at the U.S. government's Sandia National Laboratories in New Mexico. But a DNA computer has the potential to perform the same breakneck-speed computations in a single drop of water."

"We've shown by these computations that biological molecules can be used for distinctly non-biological purposes," Adleman said in 2002. "They are miraculous little machines. They store energy and information, they cut, paste and copy."

Professor Maja Matarić with Blossom, a cuddly robot companion that helps people with anxiety and depression practice breathing exercises and mindfulness.

3. USC Interaction Lab Pioneers Socially Assistive Robotics (2005)

Named No. 5 on Business Insider's list of the 25 Most Powerful Women Engineers in Tech, Maja Matarić leads the USC Interaction Lab, pioneering the field of socially assistive robotics (SAR).

As defined by Matarić and her then-graduate researcher David Feil-Seifer 17 years ago, socially assistive robotics was envisioned as the intersection of assistive robotics and social robotics: a new field that focuses on providing social support to help people overcome challenges in health, wellness, education and training.

Socially assistive robots have been developed for a broad range of user communities, including infants with movement delays, children with autism, stroke patients, people with dementia and Alzheimer's disease, and otherwise healthy elderly people.

"We want these robots to make the user happier, more capable and better able to help themselves," said Matarić, the Chan Soon-Shiong Chair and Distinguished Professor of Computer Science, Neuroscience and Pediatrics at USC. "We also want them to help teachers and therapists, not remove their purpose."

The field has inspired investments from federal funding agencies and technology startups. The assistive robotics market is estimated to reach $25.16 billion by 2028.

Is the ball red or blue? Is the cat alive or dead? Professor Daniel Lidar, one of the world's top quantum influencers, demonstrates the idea of superposition.

4. First Operational Quantum Computing System in Academia (2011)

Before Google or NASA got into the game, there was the USC-Lockheed Martin Quantum Computing Center (QCC).

Led by Daniel Lidar, holder of the Viterbi Professorship in Engineering, and ISI's Robert F. Lucas (now retired), the center launched in 2011. With the world's first commercial adiabatic quantum processor, the D-Wave One, USC is the only university in the world to host and operate a commercial quantum computing system.

As USC News noted in 2018, quantum computing is "the ultimate disruptive technology": it has the potential to create the best possible investment portfolio, dissolve urban traffic jams and bring drugs to market faster. It can optimize batteries for electric cars, predictions for weather and models for climate change. Quantum computing can do this, and much more, because it can crunch massive data and variables quickly, with a growing advantage over classical computers as problems get bigger.

Recently, QCC upgraded to D-Wave's Advantage system, with more than 5,000 qubits, an order of magnitude larger than any other quantum computer. The upgrade enables QCC to host a new Advantage generation of quantum annealers from D-Wave and makes it the first Leap quantum cloud system in the United States. Today, in addition to Professor Lidar, one of the world's top quantum computing influencers, QCC is led by Research Assistant Professor Federico Spedalieri, as operations director, and Research Associate Professor Stephen Crago, associate director of ISI.

David Traum, a leader at the USC Institute for Creative Technologies (ICT), converses with Pinchas Gutter, a Holocaust survivor, as part of the New Dimensions in Testimony project.

5. USC ICT Enables Talking with the Past, in the Future (2015)

New Dimensions in Testimony, a collaboration between the USC Shoah Foundation and the USC Institute for Creative Technologies (ICT), in partnership with Conscience Display, is an initiative to record and display testimony in a way that will continue the dialogue between Holocaust survivors and learners far into the future.

The project uses ICT's Light Stage technology to record interviews using multiple high-end cameras for high-fidelity playback. The ICT Dialogue Group's natural language technology allows fluent, open-ended conversation with the recordings. The result is a compelling and emotional interactive experience that enables viewers to ask questions and hear responses in real-time, lifelike conversation, even after the survivors have passed away.

New Dimensions in Testimony debuted in the Illinois Holocaust Museum & Education Center in 2015. Since then, more than 50 survivors and other witnesses have been recorded and presented in dozens of museums around the United States and the world. It remains a powerful application of AI and graphics to preserve the stories and lived experiences of culturally and historically significant figures.

Eric Rice and Bistra Dilkina are co-directors of the Center for AI in Society (CAIS), a remarkable collaboration between the USC Dworak-Peck School of Social Work and the USC Viterbi School of Engineering.

6. Among the First AI for Good Centers in Higher Education (2016)

Launched in 2016, the Center for AI in Society (CAIS) became one of the pioneering AI for Good centers in the U.S., uniting USC Viterbi and the USC Suzanne Dworak-Peck School of Social Work.

In the past, CAIS used AI to prevent the spread of HIV/AIDS among homeless youth. In fact, a pilot study demonstrated a 40% increase in homeless youth seeking HIV/AIDS testing due to an AI-assisted intervention. In 2019, the technology was also used as part of the largest global deployment of predictive AI to thwart poachers and protect endangered animals.

Today, CAIS fuses AI, social work and engineering in unique ways, such as working with the Los Angeles Homeless Service Authority to address homelessness; battling opioid addiction; mitigating disasters like heat waves, earthquakes and floods; and aiding the mental health of veterans.

CAIS is led by co-directors Eric Rice, a USC Dworak-Peck professor of social work, and Bistra Dilkina, a USC Viterbi associate professor of computer science and the Dr. Allen and Charlotte Ginsburg Early Career Chair.

Pedro Szekely, Mayank Kejriwal and Craig Knoblock of the USC Information Sciences Institute (ISI) are at the vanguard of using computer science to fight human trafficking.

7. AI That Fights Modern Slavery (2017)

Beginning in 2017, a team of researchers at ISI led by Pedro Szekely, Mayank Kejriwal and Craig Knoblock created software called DIG that helps investigators scour the internet to identify possible sex traffickers and begin the process of capturing, charging and convicting them.

Law enforcement agencies across the country, including in New York City, have used DIG as well as other software programs spawned by Memex, a Defense Advanced Research Projects Agency (DARPA)-funded program aimed at developing internet search tools to help investigators thwart sex trafficking, among other illegal activities. The specialized software has triggered more than 300 investigations and helped secure 18 felony sex-trafficking convictions, according to Wade Shen, program manager in DARPA's Information Innovation Office and Memex program leader. It has also helped free several victims.

In 2015, Manhattan District Attorney Cyrus R. Vance Jr. announced that DIG was being used in every human trafficking case brought by the DA's office. "With technology like Memex," he said, "we are better able to serve trafficking victims and build strong cases against their traffickers."

"This is the most rewarding project I've ever worked on," said Szekely. "It's really made a difference."

Published on July 28th, 2022

Last updated on July 28th, 2022


CXL Brings Datacenter-sized Computing with 3.0 Standard, Thinks Ahead to 4.0 – HPCwire

A new version of a standard backed by major cloud providers and chip companies could change the way some of the world's largest datacenters and fastest supercomputers are built.

The CXL Consortium on Tuesday announced a new specification called CXL 3.0, also known as Compute Express Link 3.0, that eliminates more chokepoints that slow down computation in enterprise computing and datacenters.

The new spec provides a communication link between chips, memory and storage in systems, and it is twice as fast as its predecessor, CXL 2.0.

CXL 3.0 also has improvements for more fine-grained pooling and sharing of computing resources for applications such as artificial intelligence.

"CXL 3.0 is all about improving bandwidth and capacity, and can better provision and manage computing, memory and storage resources," said Kurt Lender, co-chair of the CXL marketing work group (and senior ecosystem manager at Intel), in an interview with HPCwire.

Hardware and cloud providers are coalescing around CXL, which has steamrolled other competing interconnects. This week, OpenCAPI, an IBM-backed interconnect standard, merged with CXL Consortium, following the footsteps of Gen-Z, which did the same in 2020.

The consortium released the first CXL 1.0 specification in 2019 and quickly followed it up with CXL 2.0, which supported PCIe 5.0, found in a handful of chips such as Intel's Sapphire Rapids and Nvidia's Hopper GPU.

The CXL 3.0 spec is based on PCIe 6.0, which was finalized in January. CXL 3.0 has a data transfer speed of up to 64 gigatransfers per second, the same as PCIe 6.0.
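As a back-of-the-envelope check, 64 GT/s works out as follows for a common 16-lane link; the figures below are raw signaling rates per direction, before the FLIT and FEC overhead that PCIe 6.0's encoding adds.

```python
# Raw bandwidth of a CXL 3.0 / PCIe 6.0 link (before protocol overhead).
GT_PER_S = 64  # gigatransfers per second per lane; each transfer carries 1 bit
LANES = 16     # a common x16 slot

gbits_per_lane = GT_PER_S            # 64 Gb/s per lane, per direction
raw_gb_per_s = GT_PER_S * LANES / 8  # convert bits to bytes across all lanes

print(f"{raw_gb_per_s:.0f} GB/s per direction")  # -> 128 GB/s per direction
```

The same arithmetic with PCIe 5.0's 32 GT/s gives 64 GB/s, which is why the new spec is described as twice as fast as CXL 2.0.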

The CXL interconnect can link up chips, storage and memory that are near and far from each other, and that allows system providers to build datacenters as one giant system, said Nathan Brookwood, principal analyst at Insight 64.

CXLs ability to support the expansion of memory, storage and processing in a disaggregated infrastructure gives the protocol a step-up over rival standards, Brookwood said.

Datacenter infrastructures are moving to a decoupled structure to meet the growing processing and bandwidth needs for AI and graphics applications, which require large pools of memory and storage. AI and scientific computing systems also require processors beyond just CPUs, and organizations are installing AI boxes, and in some cases, quantum computers, for more horsepower.

CXL 3.0 improves bandwidth and capacity with better switching and fabric technologies, CXL Consortiums Lender said.

"CXL 1.1 was sort of in the node, then with 2.0, you can expand a little bit more into the datacenter. And now you can actually go across racks, you can do decomposable or composable systems, with the fabric technology that we've brought with CXL 3.0," Lender said.

At the rack level, one can make CPU or memory drawers as separate systems, and improvements in CXL 3.0 provide more flexibility and options in switching resources compared to previous CXL specifications.

Typically, servers have a CPU, memory and I/O, and can be limited in physical expansion. In disaggregated infrastructure, one can take a cable to a separate memory tray through a CXL protocol without relying on the popular DDR bus.

"You can decompose or compose your datacenter as you like it. You have the capability of moving resources from one node to another, and don't have to do as much overprovisioning as we do today, especially with memory," Lender said, adding, "it's a matter of you can grow systems and sort of interconnect them now through this fabric and through CXL."

The CXL 3.0 protocol uses the electricals of the PCI-Express 6.0 protocol, along with its protocols for I/O and memory. Some improvements include support for new processors and endpoints that can take advantage of the new bandwidth. CXL 2.0 had single-level switching, while 3.0 has multi-level switching, which enables larger and more complex fabric topologies at the cost of some added latency.

"You can actually start looking at memory like storage: you could have hot memory and cold memory, and so on. You can have different tiering, and applications can take advantage of that," Lender said.

The protocol also accounts for the ever-changing infrastructure of datacenters, providing more flexibility on how system administrators want to aggregate and disaggregate processing units, memory and storage. The new protocol opens more channels and resources for new types of chips that include SmartNICs, FPGAs and IPUs that may require access to more memory and storage resources in datacenters.

"HPC composable systems: you're not bound by a box. HPC loves clusters today. And [with CXL 3.0] now you can do coherent clusters and low latency. The growth and flexibility of those nodes is expanding rapidly," Lender said.

The CXL 3.0 protocol can support up to 4,096 nodes, and has a new concept of memory sharing between different nodes. That is an improvement from a static setup in older CXL protocols, where memory could be sliced and attached to different hosts, but could not be shared once allocated.
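The difference between the old static allocation and CXL 3.0-style sharing can be modeled in a few lines. The class below is a conceptual toy for illustration only, not an actual Fabric Manager interface or CXL API.

```python
# Toy model: segments of fabric-attached memory, attached to hosts either
# exclusively (pre-3.0 pooling) or shared by several hosts (CXL 3.0 style).
class FabricPool:
    def __init__(self, n_segments: int):
        # Each segment tracks the set of hosts currently attached to it.
        self.owners = {seg: set() for seg in range(n_segments)}

    def attach(self, seg: int, host: str, shared: bool = False) -> set:
        owners = self.owners[seg]
        if owners and not shared:
            # Static pooling: once a segment is allocated, it cannot be
            # attached to another host until it is released.
            raise RuntimeError(f"segment {seg} already allocated")
        owners.add(host)
        return owners

pool = FabricPool(n_segments=4096)    # CXL 3.0 supports up to 4,096 nodes
pool.attach(0, "hostA")               # exclusive attach, pre-3.0 style
pool.attach(1, "hostB", shared=True)  # CXL 3.0: the same segment...
pool.attach(1, "hostC", shared=True)  # ...is visible to multiple hosts
assert pool.owners[1] == {"hostB", "hostC"}
```

In the real protocol, coherence hardware keeps the hosts' views of a shared segment consistent; the toy only captures the bookkeeping distinction between sliced and shared memory.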

"Now we have sharing, where multiple hosts can actually share a segment of memory. Now you can actually look at quick, efficient data movement between hosts if necessary, or if you have an AI-type application that you want to hand data from one CPU or one host to another," Lender said.

The new feature allows peer-to-peer connection between nodes and endpoints in a single domain. That sets up a wall in which traffic can be isolated to move only between nodes connected to each other. That allows for faster accelerator-to-accelerator or device-to-device data transfer, which is key in building out a coherent system.

"If you think about some of the applications and then some of the GPUs and different accelerators, they want to pass information quickly, and now they have to go through the CPU. With CXL 3.0, they don't have to go through the CPU this way, but the CPU is coherent, aware of what's going on," Lender said.

The pooling and allocation of memory resources is managed by a software called Fabric Manager. The software can sit anywhere in the system or hosts to control and allocate memory, but it could ultimately impact software developers.

"If you get to the tiering level, and when you start getting all the different latencies in the switching, that's where there will have to be some application awareness and tuning of applications. I think we certainly have that capability today," Lender said.

It could be two to four years before companies start releasing CXL 3.0 products, and the CPUs will need to be aware of CXL 3.0, Lender said. Intel built in support for CXL 1.1 in its Sapphire Rapids chip, which is expected to start shipping in volume later this year. The CXL 3.0 protocol is backward compatible with the older versions of the interconnect standard.

CXL products based on earlier protocols are slowly trickling into the market. SK Hynix this week introduced its first DDR5 DRAM-based CXL (Compute Express Link) memory samples, and will start manufacturing CXL memory modules in volume next year. Samsung has also introduced CXL DRAM earlier this year.

While products based on CXL 1.1 and 2.0 protocols are on a two-to-three-year product release cycle, CXL 3.0 products could take a little longer as it takes on a more complex computing environment.

"CXL 3.0 could actually be a little slower because of some of the Fabric Manager, the software work. They're not simple systems; when you start getting into fabrics, people are going to want to do proofs of concept and prove out the technology first. It's going to probably be a three-to-four-year timeframe," Lender said.

Some companies already started work on CXL 3.0 verification IP six to nine months ago, and are fine-tuning the tools to the final specification, Lender said.

The CXL Consortium has a board meeting in October to discuss the next steps, which could also involve CXL 4.0. The standards organization for PCIe, the PCI Special Interest Group (PCI-SIG), last month announced it was planning PCIe 7.0, which doubles the data transfer rate of PCIe 6.0 to 128 gigatransfers per second.

Lender was cautious about how PCIe 7.0 could potentially fit into a next-generation CXL 4.0. CXL has its own set of I/O, memory and cache protocols.

"CXL sits on the electricals of PCIe, so I can't commit or absolutely guarantee that [CXL 4.0] will run on 7.0. But that's the intent, to use the electricals," Lender said.

"Under that case, one of the tenets of CXL 4.0 will be to double the bandwidth by going to PCIe 7.0, but beyond that, everything else will be what we do: more fabric or different tunings," Lender said.

CXL has been on an accelerated pace, with three specification releases since its formation in 2019. There was confusion in the industry over the best high-speed, coherent I/O bus, but the focus has now coalesced around CXL.

"Now we have the fabric. There are pieces of Gen-Z and OpenCAPI that aren't even in CXL 3.0, so will we incorporate those? Sure, we'll look at doing that kind of work moving forward," Lender said.

See the original post here:
CXL Brings Datacenter-sized Computing with 3.0 Standard, Thinks Ahead to 4.0 - HPCwire

Congress Is Giving Billions to the Chip Industry. Strings Are Attached. – The New York Times

It's an embrace of industrial policy not seen in Washington for decades. Gary Hufbauer, a nonresident senior fellow at the Peterson Institute for International Economics who has surveyed U.S. industrial policy, said the bill was the most significant investment in industrial policy that the United States had made in at least 50 years.

American politicians of both parties have long hailed the economic power of free markets and free trade while emphasizing the dangers and inefficiencies of government interference. Republicans, and some Democrats, argued that the government was a poor arbiter of winners and losers in business, and that its interference in the private market was, at best, wasteful and often destructive.

But China's increasing dominance of key global supply chains, like those for rare earth metals, solar panels and certain pharmaceuticals, has generated new support among both Republicans and Democrats for the government to nurture strategic industries. South Korea, Japan, the European Union and other governments have outlined aggressive plans to woo semiconductor factories. And the production of many advanced semiconductors in Taiwan, which is increasingly at risk of invasion, has become for many an untenable security threat.

Semiconductors are necessary to power other key technologies, including quantum computing, the internet of things, artificial intelligence and fighter jets, as well as mundane items like cars, computers and coffee makers.

"The question really needs to move from why do we pursue an industrial strategy to how do we pursue one," Brian Deese, the director of the National Economic Council, said in an interview. "This will allow us to really shape the rules of where the most cutting-edge innovation happens."

Disruptions in the supply chains for essential goods during the pandemic have added to the sense of urgency to stop American manufacturing from flowing overseas. That includes semiconductors, where the U.S. share of global manufacturing fell to 12 percent in 2020 from 37 percent in 1990, according to the Semiconductor Industry Association. Chinas share of manufacturing rose to 15 percent from almost nothing in the same time period.


Ed Husic demands universities reveal Google partnership terms – The Australian Financial Review

"The government needs to be funding that kind of research, and I'm determined to develop a sovereign quantum computing capability here," Mr Husic said.

"We don't want this to be like solar technology, where we were pioneers until it went offshore and we lost much of the environmental and economic benefits."

Google Australia confirmed the intellectual property would remain in Australia.

"The universities we are working with retain ownership of any intellectual property they create," a Google spokeswoman said.

"This funding stems from the commitment Google made last year to support Australia's digital future, and that is what these collaborations seek to achieve."

Mr Husic has confirmed he plans to direct part of the $1 billion tech investment fund he pledged during the election campaign towards Australia's nascent quantum computing industry.

The tech fund is part of the $15 billion National Reconstruction Fund (NRF), a body modelled on the Clean Energy Finance Corporation, which will invest in tech and innovative manufacturing to help drive a post-pandemic economic recovery.

It is an off-budget measure providing investment support through loans, equity investments and loan guarantees for businesses in critical technologies, taking minority shareholder positions in relevant companies rather than majority ownership.

"The Australian government should be the main investment partner for these frontier technologies," Mr Husic said.

"Rather than partnering with overseas firms, we hope to be a part of the profound economic upside offered by quantum computing and Australia's growing capacity to develop it."

Google's collaboration with the universities will underpin its long-term goal to develop bigger, more sophisticated quantum algorithms that can be used in applications such as machine learning and artificial intelligence, making quantum computing useful to the company's core business of selling ads.

Australian universities are building a global reputation as developers of quantum computing.

Earlier this year, Silicon Quantum Computing, which was spun out of the University of NSW in 2017, set out to raise $130 million to continue its development of a quantum computer by 2030.

Internationally, quantum computing is part of the trilateral AUKUS partnership between Australia, the US and Britain which has established working groups to hasten the development of quantum technologies, artificial intelligence and undersea capabilities.

Seventeen trilateral working groups have begun work under the AUKUS banner, with nine focused on conventionally armed nuclear-powered submarines, and eight relating to other advanced military capabilities.


What has quantum computing got to do with AI? – Verdict

Artificial intelligence (AI) is emerging as one of the key industry trends after decades of being just a researchers' dream. From conversations with Alexa and Siri to Waymo (Google) and Tesla's vehicles driving themselves, OpenAI's GPT-3 writing prose like a human, and DeepMind's (Google) AlphaZero beating human chess grandmasters, it is becoming clear that AI is now mature enough to solve real-life problems, and is often faster and better at it than humans.

Elsewhere in the tech industry, several visionaries are working towards developing quantum computers, which seek to leverage the properties of quantum physics to perform calculations much faster than today's computers.

At this point, you cannot be blamed for wondering: what exactly has quantum computing got to do with AI?

Algorithmic complexity is a somewhat obscure mathematical concept that connects the work being done by AI researchers and quantum computing pioneers.

Computational complexity theory, a field sitting across mathematics and computer science, focuses on classifying computational problems according to their resource usages, such as space (memory) and time. In essence, a computational problem is a task that can be solved by a computer mechanically following the mathematical steps defined in an algorithm.

For instance, consider the problem of sorting the numbers in a list. One possible algorithm, called Selection Sort, consists of repeatedly finding the minimum element (in ascending order) in the unsorted part of the list (initially, all of it) and putting it at the beginning. The algorithm effectively maintains two sub-lists within the original list as it works its way through: the already sorted part and the remaining unsorted part. After several passes of this process, the outcome is a sorted list from smaller to larger. In terms of time complexity, this is expressed as O(N²), where N is the size, or number of elements, of the list. Mathematicians have come up with more efficient, albeit more complex, sorting algorithms, such as Cube Sort or Timsort, both of which have an N × log(N) complexity. Sorting a list of 100 elements is a simple task for today's computers, but sorting a list of a billion records might be less so. Therefore, the time complexity (the number of steps in the algorithm in relation to the size of the input problem) matters greatly.
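As a concrete illustration of the algorithm just described, here is a minimal Selection Sort in Python (an illustrative sketch, not code from the article):

```python
def selection_sort(items):
    """Sort a list in ascending order by repeatedly selecting the minimum."""
    items = list(items)  # work on a copy
    n = len(items)
    for i in range(n):                       # boundary between sorted and unsorted parts
        min_idx = i
        for j in range(i + 1, n):            # scan the unsorted part for the minimum
            if items[j] < items[min_idx]:
                min_idx = j
        items[i], items[min_idx] = items[min_idx], items[i]  # move it to the front
    return items

print(selection_sort([29, 10, 14, 37, 13]))  # [10, 13, 14, 29, 37]
```

The nested loops make the quadratic cost visible: for a list of N elements, the inner scan performs on the order of N²/2 comparisons in total.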

To solve a problem faster, one can either use a faster computer or find a more efficient algorithm that requires fewer operations, which is what lower time complexity means. However, for problems of polynomial or exponential complexity (for instance, N² or 2^N), the math works against you: at larger problem sizes it is not realistic to simply use faster computers. And this is precisely the situation in the field of artificial intelligence.
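A quick numerical comparison makes the point concrete (an illustrative sketch, not from the original article):

```python
import math

# Operation counts for algorithms of different complexity classes as N grows.
for n in (10, 100, 1000):
    n_log_n = round(n * math.log2(n))
    n_squared = n ** 2
    exp_digits = len(str(2 ** n))  # 2^N grows too fast to print directly
    print(f"N={n:>4}: N*log2(N)={n_log_n}, N^2={n_squared}, 2^N has {exp_digits} digits")
```

A faster computer only divides each count by a constant factor, which is negligible against a 2^N column whose entries run to hundreds of digits.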

First, we will look at the computational complexity of the artificial neural networks used by today's artificial intelligence (AI) systems. These mathematical models are inspired by the biological neural networks that constitute animal brains. They learn to identify or categorize input data by seeing many examples. They are a collection of interconnected nodes, or neurons, combined with an activation function that determines the output based on the data presented in the input layer and the weights in the interconnections.

To adjust the weights in the interconnections so that the output is useful or correct, the network can be trained by exposure to many data examples and backpropagating the output loss.

For a neural network with N inputs, M hidden layers, where the i-th hidden layer contains m_i neurons, and k output neurons, the backpropagation algorithm that adjusts the weights of all neurons has a time complexity per training example on the order of N·m_1 + m_1·m_2 + … + m_(M-1)·m_M + m_M·k.
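As a rough sanity check, this cost can be counted directly in code. The sketch below assumes the common textbook estimate that each weight matrix between consecutive layers contributes (fan-in × fan-out) multiplications, and that the backward pass roughly doubles the forward cost; the function name and the factor of two are illustrative assumptions, not figures from the article:

```python
def backprop_ops_per_example(layer_sizes):
    """Approximate multiplications for one forward-plus-backward pass.

    layer_sizes = [N, m_1, ..., m_M, k]; each weight matrix between
    consecutive layers contributes (fan_in * fan_out) multiplications,
    and backpropagation roughly doubles the forward cost.
    """
    forward = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
    return 2 * forward

# A toy network: 784 inputs, two hidden layers of 128 and 64 neurons, 10 outputs.
print(backprop_ops_per_example([784, 128, 64, 10]))  # 218368
```

Even this small network needs hundreds of thousands of multiplications per example, per pass; scaling the layer sizes into the billions shows why training takes months.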

To put things in context, OpenAI's popular GPT-3 model, which is already capable of writing original prose with fluency equivalent to that of a human, has 175 billion parameters. With a parameter count in the billions, this AI model currently takes months to train, even using powerful server computers in large cloud data centers. Furthermore, AI models are going to continue growing in size, so the situation will get worse over time.

Quantum computers are machines that use the properties of quantum physics, specifically superposition and entanglement, to store data and perform computations. The expectation is that they can execute billions of simultaneous operations, therefore providing a very material speedup for highly complex problems, including AI.

While classical computers transmit information in bits (short for binary digits), quantum computers use qubits (short for quantum bits). Like classical bits, qubits do eventually have to deliver information as a one or a zero, but they are special in that they can represent both a one and a zero at the same time. A qubit carries a probability distribution, e.g., it may be 70% likely to be a one and 30% likely to be a zero. This is what makes quantum computers special.

There are two essential properties in quantum mechanics that quantum computers take advantage of: superposition and entanglement.

When a qubit is both a one and a zero at the same time, it is said to be in a superposition. Superposition is the general name for the condition when a system is in multiple states at once and only assumes a single state when it is measured. If we pretend that a coin is a quantum object, a superposition can be imposed when the coin is flipped: there is only a probability of the coin being either heads or tails. Once the coin has landed, we have made a measurement, and we know whether the coin is heads or tails. Likewise, only when we measure the spin of an electron (similar to the coin landing) do we know what state the electron is in and whether it is a one or a zero.

Quantum particles in superposition are only useful if we have more than one of them. This brings us to our second fundamental principle of quantum mechanics: entanglement. Two (or more) particles that are entangled cannot be individually described, and their properties depend completely on one another. So, entangled qubits can affect each other. The probability distribution of a qubit (being a one or zero) depends on the probability distribution of all other qubits in the system.

Because of that, the addition of each new qubit to a system has the effect of doubling the number of states that the computer can analyze. This exponential increase in computer power contrasts with classical computing, which only scales linearly with each new bit.
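The doubling of states can be seen by simulating a statevector classically; the sketch below (an illustration, not from the article) tracks how many complex amplitudes must be stored as qubits are added:

```python
import numpy as np

# A qubit that is "70% one, 30% zero" is stored as two amplitudes
# whose squared magnitudes give those probabilities.
qubit = np.array([np.sqrt(0.3), np.sqrt(0.7)])   # amplitudes for |0> and |1>
print(np.abs(qubit) ** 2)                        # probabilities: [0.3 0.7]

def add_qubit(state):
    """Tensor a new qubit in the |0> state onto an existing statevector."""
    return np.kron(state, np.array([1.0, 0.0]))

state = np.array([1.0])                          # zero qubits: a single amplitude
for n in range(1, 6):
    state = add_qubit(state)
    print(f"{n} qubit(s): {state.size} amplitudes")  # 2, 4, 8, 16, 32
```

Each call to add_qubit doubles the array, which is exactly why classical simulation of quantum systems breaks down beyond a few dozen qubits, and why real quantum hardware is needed.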

Entangled qubits can, in theory, execute billions of simultaneous operations. Such a capability would provide a dramatic speedup to any algorithm with complexity in the range of N², 2^N, or N^N.

Because of the impressive potential of quantum computing, while hardware teams continue to work on making these systems a reality (the largest to date is IBM's 127-qubit Eagle system), software researchers are already working on new algorithms that could leverage this simultaneous computation capability, in fields such as cryptography, chemistry, materials science, systems optimization, and machine learning/AI. It is believed that Shor's factorization quantum algorithm will provide an exponential speedup over classical computers, which poses a risk to current cryptographic algorithms.

Most interestingly, it is believed quantum linear algebra will provide a polynomial speed-up, which will enormously improve the performance of our artificial neural networks. Google has launched TensorFlow Quantum, a software framework for quantum machine learning, which allows rapid prototyping of hybrid quantum-classical ML models. IBM, also a leader in quantum computing, recently announced that it has found mathematical proof of a quantum advantage for quantum machine learning. However, while the likes of IBM and Google are vertically integrated (thus developing both the hardware systems and the software algorithms) there is also a very interesting group of quantum software startups including Zapata, Riverlane, 1Qbit, and, to a certain degree, Quantinuum (since Cambridge Quantum Computing merged with Honeywell and rebranded, it is not a pure software company anymore), to name just a few.

As quantum hardware becomes more powerful and quantum machine learning algorithms are perfected, quantum computing is likely to take a significant share of the AI chips market. For a more detailed discussion on AI chips and quantum computing, please take a look at our thematic reports on AI, AI chips, and quantum computing.

View post:
What has quantum computing got to do with AI? - Verdict

New phase of matter with 2D time created in quantum computer – Cosmos

Quantum computers hold the promise of revolutionising information technology by utilising the wacky physics of quantum mechanics. But playing with strange, new machinery often throws up even more interesting and novel physics. This is precisely what has happened to quantum computing researchers in the US.

Reported in Nature, physicists who were shining a pulsing laser at atoms inside a quantum computer observed a completely new phase of matter. The new state exhibits two time dimensions despite there still being only a singular time flow.

The researchers believe the new phase of matter could be used to develop quantum computers in which stored information is far more protected against errors than other architectures.

See, what makes quantum computers great is also what makes them exceedingly tricky.

Unlike in classical computers, a quantum computer's transistor is on the quantum scale, like a single atom. This allows information to be encoded not just using zeroes and ones, but also a mixture, or superposition, of zero and one.

Hence, quantum bits (or qubits) can store multidimensional data, and quantum computers could be thousands, even millions of times faster than classical computers on some problems, performing far more efficiently.

But this same mixture of 0 and 1 states in qubits is also what makes them extremely prone to error. So a lot of quantum computing research revolves around making machines with reduced flaws in their calculations.


The mind-bending property discovered by the authors of the Nature paper was produced by pulsing a laser shone on the atoms inside the quantum computer in a sequence inspired by the Fibonacci numbers.

"Using an extra time dimension is a completely different way of thinking about phases of matter," says lead author Philipp Dumitrescu, a research fellow at the Flatiron Institute's Centre for Computational Quantum Physics in New York City, US. "I've been working on these theory ideas for over five years, and seeing them realised in experiments is exciting."

The team's quantum computer is built on ten atomic ions of ytterbium, which are manipulated by laser pulses.


Quantum mechanics tells us that superpositions break down when qubits are influenced (intentionally or not), forcing the quantum transistor to settle into either the 0 or the 1 state. This collapse is probabilistic and cannot be determined with certainty beforehand.

"Even if you keep all the atoms under tight control, they can lose their quantumness by talking to their environment, heating up, or interacting with things in ways you didn't plan," Dumitrescu says. In practice, experimental devices have many sources of error that can degrade coherence after just a few laser pulses.

So, quantum computing engineers try to make qubits more resistant to outside effects.

One way of doing this is to exploit what physicists call symmetries, which preserve properties despite certain changes. For example, a snowflake has rotational symmetry: it looks the same when rotated by a certain angle.

Time symmetry can be added using rhythmic laser pulses, but Dumitrescu's team added two time symmetries by using ordered but non-repeating laser pulses.

Other ordered but non-repeating structures include quasicrystals. Unlike typical crystals, which have a repeating structure (like honeycombs), quasicrystals have order but no repeating pattern (like Penrose tiling). Quasicrystals are actually the squished-down versions, or projections, of higher-dimensional objects. For example, a two-dimensional Penrose tiling is a projection of a five-dimensional lattice.

Could quasicrystals be emulated in time, rather than space? That's what Dumitrescu's team was able to do.

Whereas a periodic laser pulse simply alternates (A, B, A, B, A, B, etc.), the parts of the quasi-periodic laser pulse based on the Fibonacci sequence are each the sum of the two previous parts (A, AB, ABA, ABAAB, ABAABABA, etc.). Like a quasicrystal, this is a two-dimensional pattern jammed into a single dimension. Hence, there's an extra time symmetry as a boon from this time-based quasicrystal.
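The pulse pattern described here follows the same concatenation rule as the Fibonacci numbers, and is easy to generate (a small illustrative sketch, not from the article):

```python
def fibonacci_words(count):
    """Each term is the previous term followed by the one before it."""
    words = ["A", "AB"]
    while len(words) < count:
        words.append(words[-1] + words[-2])
    return words[:count]

for w in fibonacci_words(5):
    print(w)  # A, AB, ABA, ABAAB, ABAABABA
```

The word lengths (1, 2, 3, 5, 8, 13, …) are the Fibonacci numbers, and the resulting sequence is ordered but never repeats, which is what gives the pulse train its quasicrystal character.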

The team fired the Fibonacci-based laser pulse sequence at the qubits at either end of the ten-atom arrangement.

Using a strictly periodic laser pulse, these edge qubits remained in their superposition for 1.5 seconds, an impressive feat in itself given the strong interactions between qubits. But with the quasi-periodic pulses, the qubits stayed quantum for the entire length of the experiment, around 5.5 seconds.

"With this quasi-periodic sequence, there's a complicated evolution that cancels out all the errors that live on the edge," Dumitrescu explains. "Because of that, the edge stays quantum-mechanically coherent much, much longer than you'd expect."

Though the findings bear much promise, the new phase of matter still needs to be integrated into a working quantum computer. "We have this direct, tantalising application, but we need to find a way to hook it into the calculations," Dumitrescu says. "That's an open problem we're working on."


"Quantum is for everyone:" An all-girls computing camp gives young coders a head start – KGUN 9 Tucson News

TUCSON, Ariz. (KGUN) - It's the world's first-ever free all-girls quantum computing summer camp. The week-long program is being hosted in Tucson in partnership with the University of Arizona and the Girl Scouts of Southern Arizona.

Qubit by Qubit is a non-profit pushing this initiative nationwide. Program manager Gabbie Meis says, "we've taught over 14,000 students virtually since 2013." But this camp is hosted in person.

Meis says the camp is about "harnessing the power of quantum mechanics to power a whole new type of computer called quantum computers."

Experts in the field of quantum computing say this developing technology will eventually be so powerful that we will use it to solve problems traditional computers can't.

Meis says, "huge problems that deal with a big amount of data. Some examples include climate change."

Michelle Higgins, the camp organizer from the University of Arizona, says, "there will be more individualized and more precise healthcare, better communication systems, and things that perhaps don't go down when we have huge storms."

Girls are encouraged to pursue a career in quantum because it is currently a male-dominated industry.

Higgins says, "in the past there have been more men going into the field, and that's what they see now."

She adds, "quantum really has a marketing problem because we automatically think, 'oh we have to be a genius, I have to be like Einstein, I have to have my PhD to understand'."

But that's not the case. The camp is meant to do the opposite, in hopes of encouraging girls to get a start in physics early on. Meis says for anyone interested, "there are no prerequisites for the camp, so we have girls that have never even coded on a computer before, to some that have created their own game. So we like to say quantum is for everyone."

The camp will be hosted again next summer.

Heidi Alagha is an anchor and reporter for KGUN 9.
