Checkmarx acquires an Israeli startup that’s securing the open source space – Geektime

Israeli cybersecurity giant Checkmarx has acquired Israeli startup Dustico -- a SaaS platform that detects malicious attacks and backdoors in open source software supply chains. The terms of the deal have not been disclosed.

Dustico, a startup that has been bootstrapped to date, has developed a platform that analyzes open source code packages using machine learning (ML) algorithms to accurately detect supply chain attacks.

The young Israeli startup's solution has seen increased demand amid a rise in supply chain attacks, some of which have garnered extensive media coverage due to their tremendous scope. One of those attacks (and stop me if you've heard this one before) was the SolarWinds debacle, which spread malicious code through different branches of the U.S. federal government.

The Israeli startup's platform operates in three stages to ensure that the code packages it checks are legitimate. First, it examines "trust," which focuses on the identity behind the code package, as well as anyone else who contributed to the open source code. It then tests the "health" of the package, checking that its level of maintenance meets standards. Finally, the platform performs behavioral analysis of the package, searching for malicious code that may have been implanted in it through backdoors, ransomware, Trojans, or code that would enable multi-stage attacks.
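As a rough illustration (not Dustico's actual implementation, which has not been published), the three stages could be sketched as a rule-based pipeline. Every field name, threshold, and behavior label below is invented for the example:

```python
# Hypothetical sketch of a three-stage package-vetting pipeline
# (trust, health, behavior), loosely following the stages described above.

def assess_package(pkg):
    """Return (verdict, reasons) for a package described as a dict."""
    reasons = []

    # Stage 1: "trust" -- who is behind the package and its contributors?
    if pkg["maintainer_account_age_days"] < 90 or pkg["contributors"] < 2:
        reasons.append("low trust: new or single-maintainer package")

    # Stage 2: "health" -- is the package actively maintained?
    if pkg["days_since_last_release"] > 365:
        reasons.append("poor health: stale, unmaintained package")

    # Stage 3: behavioral analysis -- does the code do suspicious things?
    suspicious = {"spawns_shell", "exfiltrates_data", "runs_install_hooks"}
    flagged = suspicious & set(pkg["behaviors"])
    if flagged:
        reasons.append(f"malicious behavior suspected: {sorted(flagged)}")

    return ("block" if flagged else "warn" if reasons else "allow"), reasons

verdict, why = assess_package({
    "maintainer_account_age_days": 30,
    "contributors": 1,
    "days_since_last_release": 12,
    "behaviors": ["spawns_shell"],
})
# verdict == "block"
```

A real system would, of course, derive these signals with ML models rather than fixed thresholds, but the staged structure is the same.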

Checkmarx is expected to incorporate Dustico's platform, including its supply chain behavioral analysis, into its application security testing (AST) tool. Designed for developers looking to perform security checks on their applications, the tool will now be able to extend those checks beyond a specific application to the entire supply chain.

"Today's adversaries have zoned in on software supply chains, many of which rely heavily on open source. As the threat of tampering in third-party packages increases, development teams must operate with the proactive assumption that all code may have been maliciously manipulated," said Maty Siman, CTO of Checkmarx. "With Dustico, we're building on our mission to secure open source by enabling customers to perform vulnerability, behavioral, and reputational analysis from a single solution. This will give developers and security leaders the insights and confidence needed to choose safer code packages, and in turn, build more secure applications at speed."

"This is a very exciting time for Dustico and our community," said Tzachi Zornstain, co-founder and CEO of Dustico. "We founded Dustico to help organizations cope with the explosion in supply chain and dependency attacks and fortify their trust in open source software, and we're thrilled to join Checkmarx to further execute on this vision and bring our capabilities to a global set of customers."


Nvidia expands Omniverse with a new GPU, new collaborations – ZDNet

Nvidia on Tuesday announced a series of ways it plans to bring the Omniverse design and collaboration platform to a vastly larger audience. Those plans include new integrations with Blender and Adobe, companies that will extend the potential reach of Omniverse by millions. Nvidia is also introducing the new RTX A2000 GPU, bringing the RTX technology that powers Omniverse to a wide range of mainstream computers.

Nvidia rolled out Omniverse in open beta back in December, giving 3D designers a shared virtual world from which they can collaborate across different software applications and from different geographic locations. Earlier this year, the company introduced Omniverse Enterprise, bringing the platform to the enterprise community via a familiar licensing model.

"We are building Omniverse to be the connector of the physical and virtual worlds," Richard Kerris, VP of Omniverse for Nvidia, said to reporters last week. "We believe that there will be more content and experiences shared in virtual worlds than in physical worlds. And we believe that there will be amazing exchange markets and economic situations that will be first built in the virtual world... Omniverse is an exchange of these virtual worlds. We connect everything and everyone, through a baseline architecture that is familiar to existing tools that are out there and existing workflows."

Nvidia unveiled the RTX A2000 GPU to bring RTX technology to mainstream workstations.

In a blog post, Nvidia VP Bob Pette wrote that the new A2000 GPU "would serve as a portal" to Omniverse "for millions of designers." The A2000 is Nvidia's most compact, power-efficient GPU for standard and small-form-factor workstations.

The GPU has 6GB of memory capacity with an error correction code (ECC) to maintain data integrity -- a feature especially important for industries such as healthcare and financial services.

Based on the Nvidia Ampere architecture, it features 2nd Gen RT Cores, enabling real-time ray tracing for professional workflows. It offers up to 5x the rendering performance of the previous generation with RTX on. It also features 3rd Gen Tensor Cores to enable AI-augmented tools and applications, as well as CUDA cores with up to 2x the FP32 throughput of the previous generation.

Speaking to reporters, Pette said the A2000 would enable RTX in millions of additional mainstream computers. More designers will have access to the real-time ray tracing and AI acceleration capabilities that RTX offers. "This is the first foray of RTX into what is the largest volume segment of GPUs for Nvidia," Pette said.

Among the first customers using the RTX A2000 are Avid, Cuhaci & Peterson and Gilbane Building Company.

The A2000 desktop GPU will be available in workstations from manufacturers including ASUS, BOXX Technologies, Dell Technologies, HP and Lenovo, as well as Nvidia's global distribution partners, starting in October.

Meanwhile, Nvidia is encouraging the adoption of Omniverse by supporting Universal Scene Description (USD), an interchange framework invented by Pixar in 2012. USD was released as open-source software in 2016, providing a common language for defining, packaging, assembling and editing 3D data.

Omniverse is built on the USD framework, giving other software makers different ways to connect to the platform. Nvidia announced Tuesday that it's collaborating with Blender, the world's leading open-source 3D animation tool, to provide USD support in the upcoming release of Blender 3.0. This will give Blender's millions of users access to Omniverse production pipelines. Nvidia is contributing USD and materials support to the Blender 3.0 alpha, which will be available soon.

Nvidia has also collaborated with Pixar and Apple to define a common approach for expressing physically accurate models in USD. More specifically, they've developed a new schema for rigid-body physics, the math that describes how solids behave in the real world (for example, how marbles would roll down a ramp). This will help developers create and share realistic simulations in a standard way.

Nvidia also announced a new collaboration with Adobe on a Substance 3D plugin that will bring Substance Material support to Omniverse. This will give Omniverse and Substance 3D users new material editing capabilities.

Nvidia on Tuesday also announced that Omniverse Enterprise, currently in limited early access, will be available later this year on a subscription basis from its partner network. That includes ASUS, BOXX Technologies, Dell Technologies, HP, Lenovo, PNY and Supermicro.

The company is also extending its Developer Program to include Omniverse. This means the developer community for Nvidia will have access to Omniverse with custom extensions, microservices, source code, examples, resources and training.


Samsung returns to Wear OS with the Galaxy Watch 4 – TechCrunch

Samsung's watches have long been something of an anomaly. While the company embraced Wear OS (then Android Wear) in its earliest days with the massive Gear Live, it quickly shifted to Tizen, an open-source operating system used by Samsung largely for wearables and smart TVs.

That's no doubt been a kind of bugbear for Google, which has long struggled to capture a significant portion of the smartwatch market. Samsung, meanwhile, has had its share of success with its products while doing its own thing. But there's always more market share to be grabbed.

Third-party apps have long been an issue for basically every smartwatch maker but Apple (it's the main reason Fitbit bought Pebble, if you'll recall), and clearly Samsung saw the opportunity in reigniting its partnership with Google. The deal, first mentioned at I/O and discussed more recently at MWC, is now seeing the light of day on the brand new Galaxy Watch 4 and Galaxy Watch 4 Classic.

Image Credits: Brian Heater

The companies refer to it as the new Wear OS Powered by Samsung. What that means, practically, is that Wear OS serves as the code base. Design and other elements of Tizen live on in here, but for all practical intents and purposes, it's a custom-built version of Google's wearable operating system, which Samsung helped build out.

The company stresses that latter bit as an important clarification: it didn't just slap a new coat of paint on the OS here. The company's One UI Watch sits atop all of that, in a bid to create a unified user experience across Samsung's mobile devices and wearable line.

Per a release:

Galaxy Watch 4 Series is also the first generation of smartwatches to feature Wear OS Powered by Samsung, a new platform that elevates every aspect of the smartwatch experience. Built by Samsung and Google, this cutting-edge platform lets you tap into an expansive ecosystem right from your wrist with popular Google apps, like Google Maps, and beloved Galaxy services, like Samsung Pay, SmartThings and Bixby. The new platform also includes support for leading third-party apps, like Adidas Running, Calm, Strava and Spotify.

In a blog post this morning, Google breaks down its end of the partnership as follows:

We're taking what we've learned from Wear OS and Tizen to jointly build what smartwatch users need. Compared to previous Wear OS smartwatches, the Galaxy Watch4 features a 2.5x shorter setup experience, up to 40 hours of battery life, optimized performance with app launch times 30 percent faster than before and access to a huge ecosystem of apps and services.

And there are more ways to get more done from your wrist with Wear OS. We're introducing more capabilities and a fresh new look based on the Material You design language for the Google Maps, Messages by Google and Google Pay apps, as well as launching a YouTube Music app. There are also new apps and Tiles coming to Wear OS for quicker access to your favorites.

The software giant singles out turn-by-turn directions on Google Maps, the ability to download and listen to songs on YouTube Music and improved app discovery via Google Play. The news also finds Google Pay on Wear OS coming to 16 additional countries, including Belgium, Brazil, Chile, Croatia, the Czech Republic, Denmark, Finland, Hong Kong, Ireland, New Zealand, Norway, Slovakia, Sweden, Taiwan, Ukraine and the United Arab Emirates.

The other key focus of the line continues to be health: it's the field on which all smartwatches are currently competing. The monitoring is built around a smaller version of the company's BioActive Sensor, which measures optical heart rate, electrical heart activity (ECG) and bioelectrical impedance. The trio of sensors measures a bunch of different metrics, including blood pressure, AFib, blood oxygen and now body composition/BMI. So now, for better or worse, your watch will tell you your body fat percentage [post-pandemic grimace face emoji]. Says Samsung, "In about 15 seconds, your watch's sensor will capture 2,400 data points."

Image Credits: Brian Heater

Design is the primary distinction between the two models. The Galaxy Watch 4 is the thinner and lighter of the two, more in line with the Galaxy Watch Active. It sports a touch bezel, versus the Classic's physical spinning bezel, arguably Samsung's best innovation in the category.

Also of note: both models come in two sizes. That's always been a bit of a sticking point for me on Samsung watches. If your devices are large and only come in the one size, you're essentially knocking out a sizable portion of your customer base right off the bat. The Watch 4 comes in 40mm and 44mm, and the Classic is available in 42mm and 46mm. The models start at $250 and $350, respectively. Another $50 will get you LTE connectivity.

The watches go up for preorder today and start shipping on August 26. Preordering will get you a $50 Samsung credit. The company is also launching a limited-edition Thom Browne version of the Classic in September, which will almost certainly cost an arm and/or a leg.


QR codes: more than just a COVID thing – Tech News | Particle – Particle

The short answer: nothing unexpected, but it's a fascinating look at how code and coders store data, and why transparency matters.

QR codes have been around for decades. They were originally invented at Denso Wave, a Toyota subsidiary, to help label car parts on production lines, but they've worked their way into everyday life as well. From putting links on a bus stop ad to tracing contacts during a pandemic, storing a little bit of computer data on paper is a useful thing to be able to do.

But how do these strange little squares actually work? And what kind of data are they hiding? To find out, we talked to James Henstridge, an open-source software developer. James wrote an amazing technical breakdown of what's in WA's very own SafeWA check-in codes.

"So this is kind of like barcodes on your food, except it's designed so that it can store a lot more data," James says.

Data is still stored as a pattern of black and white spaces, but rather than lines read from left to right, it's squares dotted from right to left, bottom to top, snaking back and forth across a grid.

"And rather than just a sequence of numbers, it can be arbitrary text data or even binary," James says.

That means QR codes can store code, images, words or, as we'll see, web addresses. Because they're storing more data, they've got a few extra tricks up their sleeve to make sure that data survives.

"It's also designed with some error correction, so that if there's a smudge on your camera lens or someone's put a logo in the centre of the QR code, it can still recover all the data."

It's a bit like the check digit on a regular barcode or a credit card number, but instead of just being able to verify the data, it can actually help fix it as well.
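The credit card check digit mentioned above is typically computed with the Luhn algorithm, which can detect common typos but, unlike the Reed-Solomon codes inside a QR code, can't repair them. A minimal sketch:

```python
def luhn_valid(number: str) -> bool:
    """Luhn check: double every second digit from the right (subtracting 9
    if the result exceeds 9) and test whether the total is divisible by 10."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("79927398713"))  # True: the classic Luhn test number
print(luhn_valid("79927398710"))  # False: last digit corrupted
```

The check tells you *that* something is wrong, but not *which* digit to fix, which is exactly the distinction James draws with QR error correction.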

What caught James's eye about the SafeWA codes was their size. While they're the same type of codes as, say, our Particle posters, the SafeWA codes have a lot more dots.

"There's two things that can increase the size of the QR code. One of them is, yes, more data means bigger QR codes," he says.

"The other one is you can change the redundancy. You could have very little redundancy, which means any errors in the scanning will mean you lose data, or lots of redundancy, which will increase the size of the code but means you can fix more errors."

SafeWA codes are either storing a lot of data or using a lot of error correction; to figure out which, we need to go deeper.

Fortunately, the SafeWA app isn't the only bit of software that can read these codes. Since it's a standard QR code, we can open it in a different app, one that shows us exactly what's being encoded.

And what do we see? It's a web address, or URL, but that's not exactly surprising, according to James.

"There's a good reason to have this as a URL, because it identifies this as a SafeWA QR code rather than something else," he says. "After that, you've got a big long string of letters and numbers."

That means if you scan it with your camera app (say, if you're new to WA and don't have the app yet), you get sent to a website with more details rather than seeing just those letters and numbers. To a regular camera app, they don't mean much, but to the SafeWA app, it's the name and location of the venue you've visited.

"The SafeWA app sees that this URL is structured in a certain way, and then it extracts this venue ID and location, and it talks to the server and says you were here."
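In code, that kind of check-and-extract step might look something like the sketch below. The domain and path structure are invented for illustration, since the article doesn't reproduce the real SafeWA URL format:

```python
from urllib.parse import urlparse

# Hypothetical check-in URL; not the real SafeWA format.
code_contents = "https://safewa.example.gov.au/checkin/VENUE123-PERTH-CAFE"

def extract_venue(url: str):
    """Accept the URL only if it matches the expected host and path shape,
    then pull out the opaque venue identifier, roughly what the article
    describes the SafeWA app doing. Returns None for foreign URLs."""
    parts = urlparse(url)
    if parts.hostname != "safewa.example.gov.au":
        return None                     # not one of our QR codes
    prefix = "/checkin/"
    if not parts.path.startswith(prefix):
        return None
    return parts.path[len(prefix):]     # the "big long string"

print(extract_venue(code_contents))             # VENUE123-PERTH-CAFE
print(extract_venue("https://evil.example/x"))  # None
```

This is also why scanning with a plain camera app still does something sensible: a URL the app doesn't recognise simply opens in the browser.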

The codes are big because that web address comes out pretty long, perhaps longer than it needs to be. It turns out they've got a little bit of redundancy and error correction in there as well, which James says could easily be left off to make the code a bit smaller.

So despite the layers and layers of coding, SafeWA codes really do seem to be exactly what they claim to be. It's your check-in location, wrapped up in a way that's easy for a phone to understand. But it's important for humans to be able to understand it too.

"All these check-in systems depend on the consent of people to actually participate. So if you break that trust, then they're worthless," James says.

Part of that trust is transparency. It's easier to trust a system if you know how it works and that it does what it says. James says that's something anyone can try.

"Trying to investigate these sorts of things, a lot of it is just looking for patterns," he says.

This article was originally published on Particle. Read the original article.


SiliconArts releases Ray Tracing IP Core to Intel Solutions Marketplace, open source ray tracing APIs on Github – Design and Reuse

SiliconArts is releasing a dedicated ray tracing accelerator core into the Intel Solutions Marketplace as an Intel Partner Alliance (IPA) Gold Member.

SEOUL, SOUTH KOREA -- August 10, 2021 -- SiliconArts releases its GPU ray tracing RayCore RC1000 core into the Intel Solutions Marketplace, along with an open source release of ray tracing examples, APIs, and drivers on GitHub with an FPGA macro.

SiliconArts announces it is releasing a dedicated ray tracing accelerator core into the Intel Solutions Marketplace. As an Intel Partner Alliance (IPA) Gold Member, SiliconArts is expanding its support for FPGA design services, incorporating its leading ray tracing IP into a dedicated core that can be combined with legacy GPUs to bring ray tracing to any level of the graphics market. A listing on the Intel Partner website describes the RC1000 RayCore FPGA deliverables for evaluation and development purposes. This is especially useful for gaming and VR, as well as embedded, medical, industrial, military and professional use cases, where real-time ray tracing is a requirement that cannot be met with large dedicated GPUs and must often work alongside an existing legacy GPU.

To make evaluation of the RC1000 as simple as possible for developers wanting to try or utilize the ray tracing API extensions to OpenGL, SiliconArts has released a demonstration build on GitHub (https://github.com/siliconarts) that allows developers to access the source code for the rendering examples and execute them on an Intel PAC with an Intel Arria V GX FPGA card. The RayCore 1000 ray tracing accelerator IP core is provided as a downloadable FPGA build, along with open source releases of its APIs, driver and build files. Releasing the ray tracing APIs to the open source community will enable experimentation and innovation for ray tracing beyond the traditional markets, letting anyone program and evaluate ray tracing without dedicating a latest-generation GPU to the task. SiliconArts is working to incorporate advanced ray tracing functions into the open source developer environment, preparing for the conversion of graphics into photorealistic representations that can provide natural-looking lighting and AR visual immersion.

SiliconArts' own graphics technology, the RayCore MC-Series, enables a scalable 3D GPU rendering solution delivering from 1 GRays/sec up to 10 GRays/sec in multi-core configurations, and can be integrated into this GPU platform to provide futuristic capabilities for next-generation visualization. Higher-performance rendering platforms for dedicated and professional use cases can be scaled to hundreds of GRays/sec with multi-chip board-level designs. SiliconArts CEO Hyung-Min Yoon says, "The ray tracing movement is so critical to our computing platform's user interface. We can expect ray tracing to be a core function of all GPUs in the future."

SiliconArts

SiliconArts brings ray tracing to the mainstream GPU market with a licensable IP core that incorporates patented innovations to accelerate ray tracing across all GPU product ranges, including embedded and legacy GPUs.


ShapeShift Appoints Crypto Veteran Willy Ogorzaly as the Foundation’s Head of Decentralization – PRNewswire

DENVER, Aug. 10, 2021 /PRNewswire/ -- ShapeShift, a decentralized, non-custodial cryptocurrency platform, has named Willy Ogorzaly, ShapeShift's principal product manager, as head of decentralization within its newly forming foundation. Ogorzaly will help lead the transfer of open source code, intellectual property and operations to the ShapeShift Decentralized Autonomous Organization (DAO), act as spokesperson for the transitioning entity, and carry forward ShapeShift's mission and product value proposition to the decentralized community.

In July, ShapeShift announced that it would be dissolving all corporate structure and turning its management and operations over to a DAO run by holders of its FOX Token. The foundation's purpose is to oversee this shift and, when its mission is complete, disband. The company will announce additional appointments in the coming weeks.

"I'm so excited to help lead the decentralization of ShapeShift; I can't imagine a more worthwhile project to pour my energy into," said Ogorzaly. "The world deserves an open-source, multi-chain, community-owned interface in the growing decentralized universe, and ShapeShift is in the best position to deliver this. Leading the effort to fully decentralize ShapeShift is a dream come true, and I'm grateful to work with our talented and enthusiastic community to help do it right."

At ShapeShift, Ogorzaly has been responsible for leading product strategy, defining new features and solutions, and ensuring ShapeShift's products meet the needs of the community. Before joining ShapeShift, he co-founded Bitfract, the first tool enabling trades from bitcoin into multiple cryptocurrencies in a single transaction; the company was acquired by ShapeShift in 2018. Willy also actively contributes to Giveth, an open-source and decentralized application for donating to social impact projects.

"Willy was the natural choice for this position," said Erik Voorhees, founder and CEO of ShapeShift. "He lives and breathes DeFi and has been a helpful guide in my own learning. Willy's infectious energy will help a decentralizing ShapeShift build partnerships and community channels to make it successful. He has been an incredibly instrumental force in our team toward building stronger DeFi components into our platform, and we are very pleased to have him driving the decentralization and cross-communications efforts."

About ShapeShift

Since 2014, ShapeShift has been pioneering self-custody for digital asset trading. The company's web and mobile platforms allow users around the world to safely buy, hold, trade and interact with digital assets such as Bitcoin and Ethereum.

Learn more at ShapeShift.com.

Media Contact: Lindsay Smith [emailprotected]

SOURCE ShapeShift

https://shapeshift.com


Quantum computing: How BMW is getting ready for the next technology revolution – ZDNet

BMW has been preparing to be quantum-ready for the past four years.

Quantum computing may still be at an early stage, but BMW has been quietly ramping up plans for the moment when it reaches maturity.

Most recently, the company just launched a "quantum computing challenge", a call for talent designed to encourage external organizations to come up with solutions that will help the car manufacturer make the best use of quantum technologies.

"It's a search for hidden gems," Oliver Wick, technology scout at BMW Research and Technology, tells ZDNet.

"It's a clear message to the world that BMW is working on quantum, and if you have innovative algorithms or great hardware, then please come to us and we can check if we could use it for BMW."

SEE: What is quantum computing? Everything you need to know about the strange world of quantum computers

The challenge, which is run in partnership with Amazon's quantum computing division AWS Braket, is targeting corporations as well as startups and academics with a simple pitch: come up with quantum solutions to the problems that BMW has identified.

Specifically, explains Wick, BMW wants to see four challenges addressed. In the pre-production stage, quantum algorithms could help optimize the configuration of features for the limited number of cars that can be assembled for various tests, so that as many tests as possible can be carried out with a minimal amount of resources.

Similarly, optimization algorithms could improve sensor placement on vehicles, to make sure that the final configurations of sensors can reliably detect obstacles in different driving scenarios, something that is becoming increasingly important as autonomous driving becomes more common.

Candidates have also been invited to submit ideas for the simulation of material deformation during production, to predict costly problems in advance, as well as for the use of quantum machine learning to classify imperfections, cracks and scratches during automated quality inspection.

Participants are required to submit a concept proposal for any of the four challenges, after which a panel of experts will shortlist the most promising ideas. The successful candidates will then have a few months to build out their solutions on Amazon Braket, before pitching them next December. Winning ideas will earn a contract with BMW to implement their projects in real-life pilots.

"We are using the power of the crowd to solve our own problems inside BMW," says Wick.

The quantum challenge is only the latest development in a strategy that aims to aggressively push the company's quantum readiness.

BMW's high-performance computers are currently handling 2,000 tasks a day, ranging from high-end visualizations to crash simulations; but even today's most sophisticated systems are fast reaching their computing limits.

Quantum computers, however, could one day carry out computations exponentially faster, meaning that they could resolve problems that classical computers find intractable. For example, the amount of compute power required to optimize vehicle sensor placement is proving to be increasingly challenging for classical algorithms to take on; quantum algorithms, on the other hand, could come up with solutions in minutes. At BMW's production scale, this could mean huge business value.

Wick explains that the potential of quantum computers was identified by the company as early as 2017. A tech report promptly followed to acquire some knowledge about the technology and its key providers, before work started on proofs of concept.

At this stage, says Wick, the biggest challenge was to pin down the business case for quantum computing. "We initiated proofs of concept in optimization or scheduling, but those were activities in which no business case was included," he says. "Initially, everybody came to me asking why we even needed quantum computing."

But now proofs of concept are slowly starting to emerge as business projects. One of the company's first research proposals, for instance, looked at the use of quantum computers to calculate the optimal circuit for a robot sealing welding seams on a vehicle. More recently, BMW unveiled that it has been making progress in designing quantum algorithms for supply-chain management, which have been successfully tested on Honeywell's 10-qubit system.

SEE: Supercomputers are becoming another cloud service. Here's what it means

BMW says it has now identified over 50 challenges at various stages of the value chain where quantum computing could provide significant benefits four of which have now been delegated to the crowd thanks to the quantum challenge.

In other words, from a blue-sky type of endeavor, quantum computing is now solidly implanted in BMW's strategy. "We've now built two teams, one in the development department and one in the IT department," says Wick. "From this perspective, we have integrated quantum computing into our strategy."

Partnerships are central to this approach. Last June, BMW co-founded the Quantum Technology and Application Consortium (QUTAC), together with firms ranging from Bosch to Volkswagen. The objective, says Wick, is to come up with a set of problems shared across different industries, to join forces in finding solutions that can then be applied to each specific use case.

BMW is also providing €5.1 million ($6 million) to the University of Munich to fund a professorship, whose holder will be expected to conduct research into applying quantum technologies to industry problems such as those faced by BMW.

But just because quantum computing has become part of BMW's business strategy doesn't mean that the technology is already generating value. Quantum computers are still small-scale experimental devices that are incapable of running programs large enough to be useful. They are known as noisy intermediate-scale quantum (NISQ) computers, a term reflective of how nascent the technology remains.

"We are in the NISQ era and we will need better quantum computers," says Wick. "Personally, I think we could start having business benefits in five years. But that doesn't mean we should wait for five years, lay back, and let other companies do the work instead."

SEE: Bigger quantum computers, faster: This new idea could be the quickest route to real world apps

Preparing for large-scale quantum computers means developing partnerships with the best talent, filing patents to secure IP, but also understanding company processes very well to know how to reform them.

"You need imagination to re-think your own processes," says Wick. "I can imagine that in the next 20 years, BMW customers will sit in front of a screen and configure their own BMW in real time, for example. This is what quantum computing is for: to re-think processes and setups."

The biggest challenge for now, according to Wick, is to fully understand the ever-expanding quantum ecosystem, to make sure that the right quantum algorithms are paired with the right quantum hardware to solve the right company problem.

This is easier said than done in a field that is buzzing with activity, and where noise and reality can be hard to distinguish. Quantum computing is rapidly joining blockchain, AR, VR and others on the list of popular buzzwords, and Wick can only count on his experience as a technology scout to make sure that the company doesn't fall for the quantum hype.

In the automotive industry, BMW's competitors are getting ready for quantum computing to change business processes, too. Volkswagen, for one, was early to join the bandwagon, and has been expanding its capabilities ever since. The pressure is on not to fall behind in the race for quantum technologies, or so it would seem, and BMW is making it clear that it wants to be in the lead.


Quantum Computing Tech is Amazing. But What Does Business Think? – DesignNews

Recent scientific and technological breakthroughs in quantum computing hardware and software demonstrate the commercial viability of quantum computers. Specifically, Honeywell and Cambridge Quantum just announced three scientific and technical milestones that significantly move large-scale quantum computing into the commercial world.

These milestones include demonstrated real-time quantum error correction (QEC), doubling the quantum volume of Honeywell's System H1 to 1,024, and developing a new quantum algorithm that uses fewer qubits to solve optimization problems. Let's break each of these topical areas down into understandable bits of information.

Related: What Will it Take to Make a Successful Quantum Computing Platform? Two Things

Optical signal conditioning used on quantum computers.

Real-time quantum error correction (QEC) is used in quantum computing to protect information from errors due to decoherence and other quantum noise. Quantum decoherence is the loss of quantum coherence; it can be viewed as the leaking of information from a system into its environment. Quantum coherence is needed to perform computing on quantum information encoded in quantum states.

Related: 4 Experts Let The Cat Out Of The Box On Quantum Computing And Electronic Design

In contrast, classical error correction employs redundancy. The simplest way to achieve redundancy is to store the information multiple times in memory and then constantly compare the information to determine if corruption has occurred.

Another difference between classical and quantum error correction is one of continuity. In classical error correction, the bit is either a 1 or a 0, i.e., it is either flipped on or off. In the quantum state, however, errors are continuous: a qubit can be partially flipped, or its phase partially changed.
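The classical redundancy scheme described above can be sketched in a few lines. This is a minimal, illustrative three-bit repetition code with majority-vote decoding, not Honeywell's QEC protocol; `encode` and `decode` are hypothetical names chosen for the example.

```python
from collections import Counter

def encode(bit):
    """Classical repetition code: store the same bit three times."""
    return [bit, bit, bit]

def decode(copies):
    """Majority vote recovers the bit if at most one copy was flipped."""
    return Counter(copies).most_common(1)[0][0]

codeword = encode(1)    # [1, 1, 1]
codeword[0] ^= 1        # a single bit-flip error corrupts one copy
assert decode(codeword) == 1   # the majority vote still recovers the bit
```

The quantum analogue cannot simply copy the state (the no-cloning theorem forbids it), which is one reason Honeywell's scheme spreads one logical qubit across seven physical qubits instead.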

Honeywell researchers have addressed quantum error correction by creating a single logical qubit from seven of the ten physical qubits available on the H1 Model and then applying multiple rounds of QEC. Protected from the main types of errors that occur in a quantum computer, the logical qubit combats errors that accumulate during computations.

Quantum Volume (QV) is the other key metric used to gauge quantum computing performance. QV is a single number meant to encapsulate the performance of quantum computers, much as transistor count does for classical computers under Moore's Law.

QV is a hardware-agnostic metric that IBM initially used to measure the performance of its quantum computers. This metric was needed since a classical computer's transistor count and a quantum computer's qubit count aren't comparable. Qubits decohere, forgetting their assigned quantum information in less than a millisecond. For quantum computers to be commercially viable and useful, they must have low-error, highly connected, and scalable qubits to ensure a fault-tolerant and reliable system. That is why QV now serves as a benchmark for the progress being made by quantum computers to solve real-world problems.
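The arithmetic behind the headline number is worth spelling out. Under IBM's definition, QV is 2 to the power of the largest "square" random circuit (equal qubit count and depth) a machine can run successfully, so each doubling of QV corresponds to one more qubit and one more layer of depth:

```python
def quantum_volume(n):
    """IBM-style quantum volume: 2**n, where n is the size of the largest
    'square' random circuit (n qubits, depth n) run successfully."""
    return 2 ** n

# Honeywell's reported QV of 1,024 corresponds to square circuits of size 10;
# doubling from the previous record means going from size 9 to size 10.
assert quantum_volume(10) == 1024
assert quantum_volume(9) == 512
```

This is why QV grows so slowly in qubit terms: going from 512 to 1,024 is a big engineering step but only one extra qubit of effective circuit size.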

According to Honeywell's recent release, the System Model H1 has become the first to achieve a demonstrated quantum volume of 1,024, doubling its record from just four months ago.

The third milestone comes from Cambridge Quantum Computing, which recently merged with Honeywell: a new quantum algorithm that uses fewer qubits to solve optimization problems.

Honeywell and Cambridge Quantum Computing (CQC) have met three key quantum milestones with the Model H1 systems.

John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.

Go here to see the original:
Quantum Computing Tech is Amazing. But What Does Business Think? - DesignNews

Google says it has created a time crystal in a quantum computer, and it’s weirder than you can imagine – ZDNet

In a new research paper, Google scientists claim to have used a quantum processor for a useful scientific application: to observe a genuine time crystal.

If 'time crystals' sound pretty sci-fi, that's because they are. Time crystals are no less than a new "phase of matter", as researchers put it, which has been theorized for some years now as a new state that could potentially join the ranks of solids, liquids, gases, crystals and so on. The paper remains in pre-print and still requires peer review.

Time crystals are also hard to find. But Google's scientists now rather excitingly say that their results establish a "scalable approach" to study time crystals on current quantum processors.

SEE: What is quantum computing? Everything you need to know about the strange world of quantum computers

Understanding why time crystals are interesting requires a little background in physics, particularly the second law of thermodynamics, which states that systems naturally tend to settle in a state known as "maximum entropy".

To take an example: if you pour some milk into a coffee cup, the milk will eventually dissolve throughout the coffee, instead of sitting on the top, enabling the overall system to come to an equilibrium. This is because there are many more ways for the milk to randomly spread throughout the coffee than there are for it to sit, in a more orderly fashion, at the top of the cup.

This irresistible drive towards thermal equilibrium, as described in the second law of thermodynamics, reflects the fact that all things tend to move towards less useful, random states. As time goes on, systems inevitably degenerate into chaos and disorder, that is, entropy.

Time crystals, on the other hand, fail to settle in thermal equilibrium. Instead of slowly degenerating towards randomness, they get stuck in two high-energy configurations that they switch between, and this back-and-forth process can go on forever.

To explain this better, Curt von Keyserlingk, lecturer at the school of physics and astronomy at the University of Birmingham, who did not participate in Google's latest experiment, pulls out some slides from an introductory talk to prospective undergraduate students. "They usually pretend to understand, so it might be useful," von Keyserlingk warns ZDNet.

It starts with a thought experiment: take a box in a closed system, isolated from the rest of the universe, load it with a couple dozen coins, and shake it a million times. As the coins flip, tumble and bounce off each other, they randomly change positions and become increasingly chaotic. Upon opening the box, the expectation is that roughly half the coins will be on heads and half on tails.

It doesn't matter if the experiment started with more coins on their tails or more coins on their heads: the system forgets what the initial configuration was, and it becomes increasingly random and chaotic as it is shaken.
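The "forgetting" in this thought experiment is easy to simulate. The sketch below (an illustration only, with a hypothetical `shake` helper) starts one box all heads and another all tails; after a million random flips both end up near half-and-half, with no trace of the starting configuration:

```python
import random

def shake(coins, flips=1_000_000):
    """Flip one randomly chosen coin per shake (heads=1, tails=0)."""
    coins = list(coins)
    for _ in range(flips):
        i = random.randrange(len(coins))
        coins[i] ^= 1   # flip that coin
    return coins

random.seed(0)                 # fixed seed for reproducibility
all_heads = shake([1] * 24)    # start with every coin on heads
all_tails = shake([0] * 24)    # start with every coin on tails
# Both boxes end up near half heads, half tails: the system has
# "forgotten" which configuration it started from.
```

A time crystal is interesting precisely because, run as a quantum analogue of this experiment, it refuses to forget.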

This closed system, when it is translated into the quantum domain, is the perfect setting to try and find time crystals, and the only one known to date. "The only stable time crystals that we've envisioned in closed systems are quantum mechanical," says von Keyserlingk.

Enter Google's quantum processor, Sycamore, which is well known for having achieved quantum supremacy and is now looking for some kind of useful application for quantum computing.

A quantum processor, by definition, is a perfect tool to replicate a quantum mechanical system. In this scenario, Google's team represented the coins in the box with qubits spinning upwards and downwards in a closed system; and instead of shaking the box, they applied a set of specific quantum operations that can change the state of the qubits, which they repeated many times.

This is where time crystals defy all expectations. Looking at the system after a certain number of operations, or shakes, reveals a configuration of qubits that is not random, but instead looks rather similar to the original set up.

"The first ingredient that makes up a time crystal is that it remembers what it was doing initially. It doesn't forget," says von Keyserlingk. "The coins-in-a-box system forgets, but a time crystal system doesn't."

It doesn't stop here. Shake the system an even number of times and you'll get a configuration similar to the original one; shake it an odd number of times, and you'll get another setup, in which tails have been flipped to heads and vice versa.

And no matter how many operations are carried out on the system, it will always flip-flop, going regularly back-and-forth between those two states.

Scientists call this a break in the symmetry of time, which is why these systems are called time crystals. The operation carried out to stimulate the system is always the same, and yet the response comes only every other shake.

"In the Google experiment, they do a set of operations on this chain of spins, then they do exactly the same thing again, and again. They do the same thing at the hundredth step that they do at the millionth step, if they go that far," says von Keyserlingk.

"So they subject the system to a set of conditions that have symmetry, and yet the system responds in a manner that breaks that symmetry. It's the same every two periods instead of every period. That's what makes it literally a time crystal."
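The period-doubling bookkeeping von Keyserlingk describes can be caricatured classically. The sketch below is emphatically not a time crystal (real discrete time crystals are quantum many-body systems with imperfect drives and disorder); it only illustrates what "same operation every step, same state only every two steps" means:

```python
def drive(spins, steps):
    """Apply the *same* flip operation at every step; the state
    nevertheless repeats only every second step (period doubling)."""
    history = [spins]
    for _ in range(steps):
        spins = tuple(1 - s for s in spins)  # flip every spin up<->down
        history.append(spins)
    return history

h = drive((1, 0, 1, 1), 4)
assert h[0] == h[2] == h[4]     # even number of steps: original pattern
assert h[1] == h[3] != h[0]     # odd number of steps: flipped pattern
```

The drive has period one; the response has period two. That mismatch between the symmetry of the drive and the symmetry of the response is the signature Google's experiment looks for.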

SEE: Bigger quantum computers, faster: This new idea could be the quickest route to real world apps

The behavior of time crystals, from a scientific perspective, is fascinating: contrary to every other known system, they don't tend towards disorder and chaos. Unlike the coins in the box, which get all muddled up and settle at roughly half heads and half tails, they buck the entropy law by getting stuck in a special, time-crystal state.

In other words, they defy the second law of thermodynamics, which essentially defines the direction that all natural events take. Ponder that for a moment.

Such special systems are not easy to observe. Time crystals have been a topic of interest since 2012, when Nobel Prize-winning MIT professor Frank Wilczek started thinking about them; and the theory has been refuted, debated and contradicted many times since then.

Several attempts have been made to create and observe time crystals to date, with varying degrees of success. Only last month, a team from Delft University of Technology in the Netherlands published a pre-print showing that they had built a time crystal in a diamond processor, although a smaller system than the one claimed by Google.

The search giant's researchers used a chip with 20 qubits to serve as the time crystal: many more, according to von Keyserlingk, than have been achieved until now, or than could be simulated with a classical computer.

Using a laptop, it is fairly easy to simulate around 10 qubits, explains von Keyserlingk. Add more than that, and the limits of current hardware are soon reached: every extra qubit doubles the memory required, so the cost grows exponentially.

The scientist stops short of stating that this new experiment is a show of quantum supremacy. "They're not quite far enough for me to be able to say it's impossible to do with a classical computer, because there might be a clever way of putting it on a classical computer that I haven't thought of," says von Keyserlingk.

"But I think this is by far the most convincing experimental demonstration of a time crystal to date."

SEE: Quantum computing just took on another big challenge, one that could be as tough as steel

The scope and control of Google's experiment means that it is possible to look at time crystals for longer, do detailed sets of measurements, vary the size of the system, and so on. In other words, it is a useful demonstration that could genuinely advance science and as such, it could be key in showing the central role that quantum simulators will play in enabling discoveries in physics.

There are, of course, some caveats. Like all quantum computers, Google's processor still suffers from decoherence, which can cause a decay in the qubits' quantum states, and means that time crystals' oscillations inevitably die out as the environment interferes with the system.

The pre-print, however, argues that as the processor becomes more effectively isolated, this issue could be mitigated.

One thing is certain: time crystals won't be sitting in our living rooms any time soon, because scientists are yet to find a definitive useful application for them. It is unlikely, therefore, that Google's experiment was about exploring the business value of time crystals; rather, it shows what could potentially be another early application of quantum computing, and yet another demonstration of the company's technological prowess in a hotly contested new area of development.

The rest is here:
Google says it has created a time crystal in a quantum computer, and it's weirder than you can imagine - ZDNet

From theory to reality: Google claims to have created physics-defying ‘time crystal’ inside its quantum computer – Silicon Canals

Image credits: Google Quantum AI

As the quantum computing race heats up, many companies across countries are spending billions on different qubit technologies to stabilise and commercialise the technology. While it is too early to declare a winner in quantum computing, Google's quantum computing lab may have created something truly remarkable.

In the latest development, researchers at Google, in collaboration with physicists at Princeton, Stanford, and other universities, have created the world's first time crystal inside a quantum computer.

The time crystals developed by Google could be a landmark accomplishment for fundamental and quantum physics. Dreamt up by the Nobel Prize-winning physicist Frank Wilczek in 2012, the notion of time crystals is now moving from theory to reality.

In a recently published study, "Observation of Time-Crystalline Eigenstate Order on a Quantum Processor", the researchers claim that the time crystal is a new phase of matter that violates the second law of thermodynamics.

Well, a time crystal sounds like a complicated component of a time machine, but it is not. So, what exactly are time crystals? As per the researchers, a time crystal is a new phase of matter that alternates between two configurations without losing any energy in the process.

To put it simply, regular crystals are arrangements of molecules or atoms that form a regular, repeated pattern in space. A time crystal, on the other hand, is an arrangement of molecules or atoms that forms a regular, repeated pattern in time. Meaning, they'll sit in one pattern for a while, then flip to another, and repeat back and forth.

Explaining time crystals in layman's terms to Silicon Canals, Loïc Henriet, head of Applications and Quantum Software at Pasqal, says: "Some phases of matter are known to spontaneously break symmetries. A crystal breaks spatial translation: one finds atoms only at well-defined positions. Magnets break discrete spin symmetry: the magnetisation points in a well-defined direction. However, no known physical system was known to break one of the simplest symmetries: translation in time. Google's DTC result is the most convincing experimental evidence of the existence of non-equilibrium states of matter that break time-translation symmetry."

Further, time crystals can cycle endlessly within an isolated system without increasing entropy or expending any fuel or energy.

"Our work employs a time-reversal protocol that discriminates external decoherence from intrinsic thermalisation, and leverages quantum typicality to circumvent the exponential cost of densely sampling the eigenspectrum," say the researchers. "In addition, we locate the phase transition out of the DTC with experimental finite-size analysis. These results establish a scalable approach to study non-equilibrium phases of matter on current quantum processors."

For the demonstration, the researchers used a chip with 20 qubits to serve as the time crystal. It's worth mentioning that the experiments were performed on Google's Sycamore device, which famously solved a task in 200 seconds that would take a conventional computer 10,000 years.

According to the researchers, their experiment offers preliminary evidence that their system could create time crystals. If proven, this discovery could have profound implications for the world of quantum computing.

Henriet shares, This result is most interesting from a fundamental physics standpoint, as an identification of a novel quantum phase of matter. In itself, it will not directly impact our day-to-day life but it illustrates the richness of many-body quantum physics out-of-equilibrium. It also proves that quantum processors are now powerful enough to discover new interesting regimes for quantum matter with disruptive properties.

The consequence is amazing: You evade the second law of thermodynamics, says Roderich Moessner, director of the Max Planck Institute for the Physics of Complex Systems in Dresden, Germany, and a co-author on the Google paper.

"This is just this completely new and exciting space that we're working in now," says Vedika Khemani, a condensed matter physicist now at Stanford, who co-discovered the novel phase while she was a graduate student and co-authored the new paper with the Google team.

In 2012, Frank Wilczek came up with the idea of time crystals while teaching a class about ordinary (spatial) crystals.

"If you think about crystals in space, it's very natural also to think about the classification of crystalline behaviour in time," he told Quanta.

Google's quantum computer has certainly achieved what many thought was impossible. That said, the experiment is at a preliminary stage and requires a lot more work. Moreover, the pre-print version of the research awaits validation from the scientific community and has yet to be peer-reviewed.

"There are good reasons to think that none of those experiments completely succeeded, and a quantum computer like [Google's] would be particularly well placed to do much better than those earlier experiments," University of Oxford physicist John Chalker, who wasn't involved in the research, told Quanta.

See more here:
From theory to reality: Google claims to have created physics-defying 'time crystal' inside its quantum computer - Silicon Canals