The Prometheus League
Breaking News and Updates
- Abolition Of Work
- AI
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- CBD Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- SpaceX
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard de Chardin
- Terraforming Mars
- The Singularity
- TMS
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- WW3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Daily Archives: June 22, 2020
Why Carnival, Royal Caribbean, and Norwegian Cruise Stocks Just Got Torpedoed and Are Going Down – Motley Fool
Posted: June 22, 2020 at 2:49 pm
What happened
America's cruise lines are extending their involuntary corona-cation -- but this time they're doing so voluntarily.
Just before 2 p.m. Friday, the Cruise Lines International Association (CLIA), the trade association representing Carnival Corporation (NYSE:CCL) (NYSE:CUK), Royal Caribbean (NYSE:RCL), and Norwegian Cruise Line Holdings (NASDAQ:NCLH), among others, announced that "the association's ocean-going cruise line members will voluntarily extend the suspension of cruise operations from U.S. ports until 15 September 2020."
As of 2:25 p.m. EDT, shares of Carnival stock are down 5.1%, Norwegian Cruise Line Holdings is down 5.8%, and Royal Caribbean is suffering worst of all -- down 6.3%.
If you recall, it was way back in April that the U.S. Centers for Disease Control and Prevention (CDC) last extended its no-sail order forbidding cruise ships from sailing out of U.S. ports before July 24. To date, the CDC has not updated or extended that order.
Regardless, observing that "it is increasingly clear that more time will be needed to resolve barriers to resumption in the United States," CLIA members have agreed "to err on the side of caution" and "further extend our suspension of operations from U.S. ports until 15 September."
This "extension of suspension" will come at a cost. According to CLIA, each day the cruise industry remains shut down costs the U.S. economy about $110 million "in economic activity."
More pertinently to cruise line investors, though, Norwegian Cruise says it is burning through cash at the rate of $110 million to $150 million each month, and probably has fewer than 10 months of cash left. Royal Caribbean is burning $250 million to $275 million per month during its shutdown, and Carnival just revealed a $650 million-a-month burn rate. And now, according to CLIA, each of these cruise lines can expect to keep burning cash for nearly two months longer than the most optimistic scenario for their respective returns to service.
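Runway arithmetic makes the stakes concrete. A minimal sketch in Python (the burn rates are the article's; the cash balance is a hypothetical round number, not a reported figure):

```python
def months_of_runway(cash_millions, monthly_burn_millions):
    """Months until cash runs out at a constant monthly burn rate."""
    return cash_millions / monthly_burn_millions

# Norwegian's reported burn: $110M-$150M per month.
# With a hypothetical $1,400M on hand, the worst case is about 9 months:
print(months_of_runway(1_400, 150))  # ~9.3
print(months_of_runway(1_400, 110))  # ~12.7
```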
Long story short? We all probably suspected that the CDC would extend its no-sail order eventually, and that cruise lines would have to remain out of service for longer than that optimistic July 24 scenario.
Now, sadly, that out-of-service suspicion has been confirmed, even if it didn't actually come at the behest of the CDC.
Posted in Caribbean
To live up to the hype, quantum computers must repair their error problems – Science News
Posted: at 2:45 pm
Astronaut John Glenn was wary about trusting a computer.
It was 1962, early in the computer age, and a room-sized machine had calculated the flight path for his upcoming orbit of Earth, the first for an American. But Glenn wasn't willing to entrust his life to a newfangled machine that might make a mistake.
The astronaut requested that mathematician Katherine Johnson double-check the computer's numbers, as recounted in the book Hidden Figures. "If she says they're good," Glenn reportedly said, "then I'm ready to go." Johnson determined that the computer, an IBM 7090, was correct, and Glenn's voyage became a celebrated milestone of spaceflight (SN: 3/3/62, p. 131).
A computer that is even slightly error-prone can doom a calculation. Imagine a computer with 99 percent accuracy. Most of the time the computer tells you 1+1=2. But once every 100 calculations, it flubs: 1+1=3. Now, multiply that error rate by the billions or trillions of calculations per second possible in a typical modern computer. For complex computations, a small probability for error can quickly generate a nonsense answer. If NASA had been relying on a computer that glitchy, Glenn would have been right to be anxious.
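The compounding is easy to check. A quick sketch in Python (assuming, for illustration, that errors strike each operation independently):

```python
def p_all_correct(per_op_accuracy, n_ops):
    """Chance that every one of n_ops operations succeeds,
    assuming independent errors."""
    return per_op_accuracy ** n_ops

# A 99%-accurate machine flubs most long calculations:
print(p_all_correct(0.99, 100))     # ~0.37: only one-in-three odds of a clean run
print(p_all_correct(0.99, 1_000))   # ~0.00004: essentially guaranteed garbage
```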
Luckily, modern computers are very reliable. But the era of a new breed of powerful calculator is dawning. Scientists expect quantum computers to one day solve problems vastly too complex for standard computers (SN: 7/8/17, p. 28).
Current versions are relatively wimpy, but with improvements, quantum computers have the potential to search enormous databases at lightning speed, or quickly factor huge numbers that would take a normal computer longer than the age of the universe. The machines could calculate the properties of intricate molecules or unlock the secrets of complicated chemical reactions. That kind of power could speed up the discovery of lifesaving drugs or help slash energy requirements for intensive industrial processes such as fertilizer production.
But there's a catch: Unlike today's reliable conventional computers, quantum computers must grapple with major error woes. And the quantum calculations scientists envision are complex enough to be impossible to redo by hand, as Johnson did for Glenn's ambitious flight.
If errors aren't brought under control, scientists' high hopes for quantum computers could come crashing down to Earth.
Conventional computers, which physicists call classical computers to distinguish them from the quantum variety, are resistant to errors. In a classical hard drive, for example, the data are stored in bits, 0s or 1s that are represented by magnetized regions consisting of many atoms. That large group of atoms offers a built-in redundancy that makes classical bits resilient. Jostling one of the bit's atoms won't change the overall magnetization of the bit and its corresponding value of 0 or 1.
But quantum bits or qubits are inherently fragile. They are made from sensitive substances such as individual atoms, electrons trapped within tiny chunks of silicon called quantum dots, or small bits of superconducting material, which conducts electricity without resistance. Errors can creep in as qubits interact with their environment, potentially including electromagnetic fields, heat or stray atoms or molecules. If a single atom that represents a qubit gets jostled, the information the qubit was storing is lost.
Additionally, each step of a calculation has a significant chance of introducing error. As a result, for complex calculations, "the output will be garbage," says quantum physicist Barbara Terhal of the research center QuTech in Delft, Netherlands.
Before quantum computers can reach their much-hyped potential, scientists will need to master new tactics for fixing errors, an area of research called quantum error correction. The idea behind many of these schemes is to combine multiple error-prone qubits to form one more reliable qubit. The technique battles what seems to be a natural tendency of the universe: quantum things eventually lose their quantumness through interactions with their surroundings, a relentless process known as decoherence.
"It's like fighting erosion," says Ken Brown, a quantum engineer at Duke University. But quantum error correction provides a way to control the seemingly uncontrollable.
Quantum computers gain their power from the special rules that govern qubits. Unlike classical bits, which have a value of either 0 or 1, qubits can take on an intermediate state called a superposition, meaning they hold a value of 0 and 1 at the same time. Additionally, two qubits can be entangled, with their values linked as if they are one entity, despite sitting on opposite ends of a computer chip.
These unusual properties give quantum computers their game-changing method of calculation. Different possible solutions to a problem can be considered simultaneously, with the wrong answers canceling one another out and the right one being amplified. That allows the computer to quickly converge on the correct solution without needing to check each possibility individually.
The concept of quantum computers began gaining steam in the 1990s, when MIT mathematician Peter Shor, then at AT&T Bell Laboratories in Murray Hill, N.J., discovered that quantum computers could quickly factor large numbers (SN Online: 4/10/14). That was a scary prospect for computer security experts, because the fact that such a task is difficult is essential to the way computers encrypt sensitive information. Suddenly, scientists urgently needed to know if quantum computers could become reality.
Shor's idea was theoretical; no one had demonstrated that it could be done in practice. Qubits might be too temperamental for quantum computers to ever gain the upper hand. "It may be that the whole difference in the computational power depends on this extreme accuracy, and if you don't have this extreme accuracy, then this computational power disappears," says theoretical computer scientist Dorit Aharonov of Hebrew University of Jerusalem.
But soon, scientists began coming up with error-correction schemes that theoretically could fix the mistakes that slip into quantum calculations and put quantum computers on more solid footing.
For classical computers, correcting errors, if they do occur, is straightforward. One simple scheme goes like this: If your bit is a 1, just copy that three times for 111. Likewise, 0 becomes 000. If one of those bits is accidentally flipped (say, 111 turns into 110), the three bits will no longer match, indicating an error. By taking the majority, you can determine which bit is wrong and fix it.
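The three-copy majority vote described above fits in a few lines of Python (the helper names are ours):

```python
def encode(bit):
    """Repetition code: store one bit as three copies."""
    return [bit] * 3

def correct(bits):
    """Majority vote: the two matching copies outvote the flipped one."""
    return 1 if sum(bits) >= 2 else 0

word = encode(1)      # [1, 1, 1]
word[2] ^= 1          # a stray flip: [1, 1, 0]
print(correct(word))  # 1: the error is outvoted
```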
But for quantum computers, the picture is more complex, for several reasons. First, a principle of quantum mechanics called the no-cloning theorem says that it's impossible to copy an arbitrary quantum state, so qubits can't be duplicated.
Second, making measurements to check the values of qubits wipes out their quantum properties. If a qubit is in a superposition of 0 and 1, measuring its value will destroy that superposition. It's like opening the box that contains Schrödinger's cat. This imaginary feline of quantum physics is famously both dead and alive when the box is closed, but opening it results in a cat that's entirely dead or entirely alive, no longer in both states at once (SN: 6/25/16, p. 9).
So schemes for quantum error correction apply some work-arounds. Rather than making outright measurements of qubits to check for errors (opening the box on Schrödinger's cat), scientists perform indirect measurements, which "measure what error occurred, but leave the actual information [that] you want to maintain untouched and unmeasured," Aharonov says. For example, scientists can check if the values of two qubits agree with one another without measuring their values. It's like checking whether two cats in boxes are in the same state of existence without determining whether they're both alive or both dead.
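A classical caricature of that agreement check, in Python: the parity of each neighboring pair pinpoints which bit flipped while revealing nothing about the encoded value itself. (In a real quantum code, each parity would be read out indirectly via an extra qubit; this sketch only illustrates the logic.)

```python
def syndrome(bits):
    """Parities of neighboring pairs in a 3-bit repetition code.
    Each check reports only whether two bits agree, not their values."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

print(syndrome([1, 1, 1]))  # (0, 0): no error detected
print(syndrome([1, 0, 1]))  # (1, 1): middle bit flipped
# The checks can't tell an encoded 0 from an encoded 1:
print(syndrome([0, 0, 0]) == syndrome([1, 1, 1]))  # True
```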
And rather than directly copying qubits, error-correction schemes store data in a redundant way, with information spread over multiple entangled qubits, collectively known as a logical qubit. When individual qubits are combined in this way, the collective becomes more powerful than the sum of its parts. It's a bit like a colony of ants. Each individual ant is relatively weak, but together, they create a vibrant superorganism.
Those logical qubits become the error-resistant qubits of the final computer. If your program requires 10 qubits to run, that means it needs 10 logical qubits, which could require a quantum computer with hundreds or even hundreds of thousands of the original, error-prone physical qubits. To run a really complex quantum computation, millions of physical qubits may be necessary, more plentiful than the ants that discovered a slice of last night's pizza on the kitchen counter.
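The overhead can be roughed out with a common surface-code approximation (our sketch; the code distance d and the d-squared scaling are standard textbook figures, not numbers from this article):

```python
def physical_qubits(n_logical, d):
    """Rough surface-code cost: about d*d data qubits plus d*d - 1
    ancilla qubits per logical qubit, at code distance d."""
    return n_logical * (2 * d * d - 1)

print(physical_qubits(10, d=5))   # 490 physical qubits for 10 logical ones
print(physical_qubits(10, d=25))  # 12490: better protection, steep cost
```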
Creating that more powerful, superorganism-like qubit is the next big step in quantum error correction. Physicists have begun putting together some of the pieces needed, and hope for success in the next few years.
Massive excitement accompanied last year's biggest quantum computing milestone: quantum supremacy. Achieved by Google researchers in October 2019, it marked the first time a quantum computer was able to solve a problem that is impossible for any classical computer (SN Online: 10/23/19). But the need for error correction means there's still a long way to go before quantum computers hit their stride.
Sure, Google's computer was able to solve a problem in 200 seconds that the company claimed would have taken the best classical computer 10,000 years. But the task, related to the generation of random numbers, wasn't useful enough to revolutionize computing. And it was still based on relatively imprecise qubits. That won't cut it for the most tantalizing and complex tasks, like faster database searches. "We need a very small error rate to run these long algorithms, and you only get those with error correction in place," says physicist and computer scientist Hartmut Neven, leader of Google's quantum efforts.
So Neven and colleagues have set their sights on an error-correction technique called the surface code. The most buzzed-about scheme for error correction, the surface code is ideal for superconducting quantum computers, like the ones being built by companies including Google and IBM (the same company whose pioneering classical computer helped put John Glenn into space). The code is designed for qubits that are arranged in a 2-D grid in which each qubit is directly connected to neighboring qubits. That, conveniently, is the way superconducting quantum computers are typically laid out.
As in an ant colony with workers and soldiers, the surface code requires that different qubits have different jobs. Some are data qubits, which store information, and others are helper qubits, called ancillas. Measurements of the ancillas allow for checking and correcting of errors without destroying the information stored in the data qubits. The data and ancilla qubits together make up one logical qubit with, hopefully, a lower error rate. The more data and ancilla qubits that make up each logical qubit, the more errors that can be detected and corrected.
In 2015, Google researchers and colleagues performed a simplified version of the surface code, using nine qubits arranged in a line. That setup, reported in Nature, could correct a type of error called a bit-flip error, akin to a 0 going to a 1. A second type of error, a phase flip, is unique to quantum computers, and effectively inserts a negative sign into the mathematical expression describing the qubit's state.
Now, researchers are tackling both types of errors simultaneously. Andreas Wallraff, a physicist at ETH Zurich, and colleagues showed that they could detect bit- and phase-flip errors using a seven-qubit computer. They could not yet correct those errors, but they could pinpoint cases where errors occurred and would have ruined a calculation, the team reported in a paper published June 8 in Nature Physics. That's an intermediate step toward fixing such errors.
But to move forward, researchers need to scale up. The minimum number of qubits needed to do the real-deal surface code is 17. With that, a small improvement in the error rate could be achieved, theoretically. But in practice, it will probably require 49 qubits before there's any clear boost to the logical qubit's performance. That level of error correction should noticeably extend the time before errors overtake the qubit. With the largest quantum computers now reaching 50 or more physical qubits, quantum error correction is almost within reach.
IBM is also working to build a better qubit. In addition to the errors that accrue while calculating, mistakes can occur when preparing the qubits or reading out the results, says physicist Antonio Córcoles of IBM's Thomas J. Watson Research Center in Yorktown Heights, N.Y. He and colleagues demonstrated that they could detect errors made when preparing the qubits (the process of setting their initial values), as the team reported in 2017 in Physical Review Letters. Córcoles looks forward to a qubit that can recover from all these sorts of errors. Even if it's only a single logical qubit, that will be a major breakthrough, Córcoles says.
In the meantime, IBM, Google and other companies still aim to make their computers useful for specific applications where errors aren't deal breakers: simulating certain chemical reactions, for example, or enhancing artificial intelligence. But the teams continue to chase the error-corrected future of quantum computing.
It's been a long slog to get to the point where doing error correction is even conceivable. Scientists have been slowly building up the computers, qubit by qubit, since the 1990s. "One thing is for sure: Error correction seems to be really hard for anybody who gives it a serious try," Wallraff says. "Lots of work is being put into it, and creating the right amount of progress seems to take some time."
For error correction to work, the original, physical qubits must stay below a certain level of flakiness, called a threshold. Above this critical number, "error correction is just going to make life worse," Terhal says. Different error-correction schemes have different thresholds. One reason the surface code is so popular is that it has a high threshold for error. It can tolerate relatively fallible qubits.
Imagine you're really bad at arithmetic. To sum up a sequence of numbers, you might try adding them up several times, and picking the result that came up most often.
Let's say you do the calculation three times, and two out of three of your calculations agree. You'd assume the correct solution was the one that came up twice. But what if you were so error-prone that you accidentally picked the one that didn't agree? Trying to correct your errors could then do more harm than good, Terhal says.
The error-correction method scientists choose must not introduce more errors than it corrects, and it must correct errors faster than they pop up. But according to a concept known as the threshold theorem, discovered in the 1990s, below a certain error rate, error correction can be helpful. It won't introduce more errors than it corrects. That discovery bolstered the prospects for quantum computers.
"The fact that one can actually hope to get below this threshold is one of the main reasons why people started to think that these computers could be realistic," says Aharonov, one of several researchers who developed the threshold theorem.
The surface code's threshold demands qubits that err a bit less than 1 percent of the time. Scientists recently reached that milestone with some types of qubits, raising hopes that the surface code can be made to work in real computers.
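The threshold behavior can be illustrated with a standard rule-of-thumb scaling for surface codes (our sketch; the constants A and p_th below are illustrative placeholders, not measured values):

```python
def logical_error_rate(p, d, p_th=0.01, A=0.1):
    """Rule-of-thumb surface-code scaling:
    p_logical ~ A * (p / p_th) ** ((d + 1) // 2) at code distance d."""
    return A * (p / p_th) ** ((d + 1) // 2)

# Below threshold (p < p_th), a bigger code suppresses errors...
print(logical_error_rate(0.005, d=3), logical_error_rate(0.005, d=7))
# ...above threshold, a bigger code only makes things worse:
print(logical_error_rate(0.02, d=3), logical_error_rate(0.02, d=7))
```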
But the surface code has a problem: To improve the ability to correct errors, each logical qubit needs to be made of many individual physical qubits, like a populous ant colony. And scientists will need many of these superorganism-style logical qubits, meaning millions of physical qubits, to do many interesting computations.
Since quantum computers currently top out at fewer than 100 qubits (SN: 3/31/18, p. 13), the days of million-qubit computers are far in the future. So some researchers are looking at a method of error correction that wouldnt require oodles of qubits.
"Everybody's very excited, but there's these questions about, 'How long is it going to take to scale up to the stage where we'll have really robust computations?'" says physicist Robert Schoelkopf of Yale University. "Our point of view is that actually you can make this task much easier, but you have to be a little bit more clever and a little bit more flexible about the way you're building these systems."
Schoelkopf and colleagues use small, superconducting microwave cavities that allow particles of light, or photons, to bounce back and forth within. The numbers of photons within the cavities serve as qubits that encode the data. For example, two photons bouncing around in the cavity might represent a qubit with a value of 0, and four photons might indicate a value of 1. In these systems, the main type of error that can occur is the loss of a photon. Superconducting chips interface with those cavities and are used to perform operations on the qubits and scout for errors. Checking whether the number of photons is even or odd can detect that type of error without destroying the data.
Using this method, Schoelkopf and colleagues reported in 2016 in Nature that they can perform error correction that reaches the break-even point. The qubit is just beginning to show signs that it performs better with error correction.
"To me," Aharonov says, "whether you actually can correct errors is part of a bigger issue." The physics that occurs on small scales is vastly different from what we experience in our daily lives. Quantum mechanics seems to allow for a totally new kind of computation. Error correction is key to understanding whether that dramatically more powerful type of calculation is truly possible.
Scientists believe that quantum computers will prove themselves to be fundamentally different from the computer that helped Glenn make it into orbit during the space race. This time, the moon shot is to show that hunch is right.
See the original post:
To live up to the hype, quantum computers must repair their error problems - Science News
Posted in Quantum Computing
Honeywell Says It Has Built The Worlds Most Powerful Quantum Computer – Forbes
Posted: at 2:45 pm
Honeywell says its new quantum computer is twice as fast as any other machine.
In the race to the future of quantum computing, Honeywell has just secured a fresh lead.
The North Carolina-based conglomerate announced Thursday that it has produced the world's fastest quantum computer, at least twice as powerful as the existing computers operated by IBM and Google.
The machine, located in a 1,500-square-foot high-security storage facility in Boulder, Colorado, consists of a stainless steel chamber about the size of a basketball that is cooled by liquid helium to a temperature just above absolute zero, the point at which atoms stop vibrating. Within that chamber, individual atoms floating above a computer chip are targeted with lasers to perform calculations.
While people have studied the potential of quantum computing for decades (that is, building machines able to complete calculations beyond the limits of classical computers and supercomputers), the sector has until recently been limited to the intrigue of research groups at tech companies such as IBM and Google.
But in the past year, the race between those companies to claim supremacy and provide a commercial use for quantum computing has become heated. Honeywell's machine has achieved a Quantum Volume of 64, a metric devised by IBM that measures the capability of the machine and its error rates, but is also difficult to decipher (and, as quantum computing expert Scott Aaronson wrote in March, is potentially possible to game). By comparison, IBM announced in January that it had achieved a Quantum Volume of 32 with its newest machine, Raleigh.
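Quantum Volume compresses qubit count, connectivity and error rates into one number: it is 2 to the power n, where n is the size of the largest "square" random circuit (n qubits run for n layers) the machine can execute reliably. A quick Python sketch of what the headline figures imply:

```python
import math

def qv_circuit_size(quantum_volume):
    """Quantum Volume is 2**n for the largest n-qubit, depth-n
    random circuit the machine passes; recover n from the QV."""
    return int(math.log2(quantum_volume))

print(qv_circuit_size(64))  # 6: Honeywell's machine, 6 qubits at depth 6
print(qv_circuit_size(32))  # 5: IBM's Raleigh, 5 qubits at depth 5
```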
Google has also spent significant resources on developing its quantum capabilities, and in October said it had developed a machine that completed a calculation in just 200 seconds that would have taken a supercomputer 10,000 years to process. (IBM disputed Google's claim, saying the calculation would have taken only 2.5 days to complete.)
Honeywell has been working toward this goal for the past decade, when it began developing the technology to produce cryogenics and laser tools. In the past five years, the company assembled a team of more than 100 technologists entirely dedicated to building the machine, and in March, Honeywell announced it would do so within three months, a goal it was able to meet even as the Covid-19 pandemic turned its workforce upside down and forced some employees to work remotely. "We had to completely redesign how we work in the facilities, had to limit who was coming on the site, and put in place physical barriers," says Tony Uttley, president of Honeywell Quantum Solutions. "All of that happened at the same time we were planning on being on this race."
The advancement also means that Honeywell is opening its computer to companies looking to execute their own unimaginably large calculations, a service that can cost about $10,000 an hour, says Uttley. While it won't disclose how many customers it has, Honeywell did say that it has a contract with JPMorgan Chase, which has its own quantum experts who will use the machine to execute gargantuan tasks, such as building fraud detection models. For those companies without in-house quantum experts, queries can be made through the intermediary quantum firms Zapata Computing and Cambridge Quantum Computing.
With greater access to the technology, Uttley says, quantum computers are nearing the point where they graduate from an item of fascination to a tool for solving problems like climate change and pharmaceutical development. Going forward, Uttley says Honeywell plans to increase the Quantum Volume of its machine by a factor of 10 every year for the next five years, reaching a figure of 640,000, a capability far beyond anything imagined before.
Read more here:
Honeywell Says It Has Built The Worlds Most Powerful Quantum Computer - Forbes
Posted in Quantum Computing
Tech company uses quantum computers to help shipping and trucking industries – FreightWaves
Posted: at 2:45 pm
Ed Heinbockel, president and chief executive officer of SavantX, said he's excited about how a powerful new generation of quantum computers can bring practical solutions to industries such as trucking and cargo transport.
"With quantum computing, I'm very keen on this, because I'm a firm believer that it's a step change technology," Heinbockel said. "It's going to rewrite the way that we live and the way we work."
Heinbockel referred to recent breakthroughs such as Google's quantum supremacy demonstration, in which a programmable quantum processor solved a problem that no classical computer could feasibly solve.
In October 2019, Google's quantum processor, named Sycamore, performed a computation in 200 seconds that would have taken the world's fastest supercomputer 10,000 years to solve, according to Google.
Jackson, Wyoming-based SavantX also recently formed a partnership with D-Wave Systems Inc., a Burnaby, Canada-based company that develops and offers quantum computing systems, software and services.
With D-Wave's quantum services, SavantX has begun offering its Hyper Optimization Nodal Efficiency (HONE) technology to solve optimization problems for customers such as the Pier 300 container terminal project at the Port of Los Angeles.
The project, which began last year, is a partnership between SavantX, Blume Global and Fenix Marine Services. The project's goal is to optimize logistics on the spacing and placement of shipping containers to better integrate with inbound trucks and freight trains. The Pier 300 site handles 1.2 million container lifts per year.
"With Pier 300, when do you need trucks at the pier, and when and how do you get them scheduled optimally?" Heinbockel said. "So the appointing part of it is very important, and that is a facet of HONE technology."
Heinbockel added, "We're very excited about the Pier 300 project, because HONE is a generalized technology. Then it's a question of what other systems can we optimize? In all modes of transportation, the winners are going to be those that can minimize the energy in the systems; energy reduction. That's all about optimization."
Heinbockel co-founded SavantX in 2015 with David Ostby, the company's chief science officer. SavantX offers data collection and visualization tools for industries ranging from healthcare to nuclear energy to transportation.
Heinbockel also recently announced SavantX will be relocating its corporate research headquarters to Santa Fe, New Mexico. The new center, which could eventually include 100 employees, will be focused on the companys HONE technology and customizing it for individual clients.
Heinbockel said SavantX has been talking to trucking, transportation and aviation companies about how HONE can help solve issues such as driver retention and optimizing schedules.
"One of the problems I've been hearing consistently from trucking companies is that they hire somebody. The HR department tells the new employee, 'We'll have you home every Thursday night,'" Heinbockel said. "Then you get onto a Friday night or Saturday, and [the driver] is still not home."
Heinbockel said that if quantum computing and HONE can be used to help trucking companies with driver retention, it will make a lot of companies happy.
Heinbockel said cross-border operations could use HONE to understand what the flow patterns are like for commercial trucks crossing through different ports at various times of the day.
"You would optimize your trucking flow based on when those lax periods were at those various ports. Or you could ask yourself, is it cheaper for me to send a truck 100 miles out of the way to another port, knowing that it can get right through that port without having to sit for two or three hours in a queue?" Heinbockel said.
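The detour-versus-queue trade-off Heinbockel describes reduces to a simple cost comparison. A minimal sketch, with all cost figures assumed purely for illustration (none come from the article):

```python
# Toy cost model for the detour-vs-queue decision described above.
# All figures are illustrative assumptions, not from the article.
COST_PER_MILE = 2.00        # fuel, wear and driver time, USD per mile
COST_PER_IDLE_HOUR = 75.00  # driver pay plus opportunity cost, USD per hour

def route_cost(extra_miles, queue_hours):
    """Total incremental cost of a routing choice."""
    return extra_miles * COST_PER_MILE + queue_hours * COST_PER_IDLE_HOUR

wait_it_out = route_cost(0, 3)    # sit three hours at the congested port
take_detour = route_cost(100, 0)  # drive 100 extra miles to a free port
print(wait_it_out, take_detour)   # 225.0 200.0 -> the detour wins here
```

An optimizer like HONE would presumably weigh many such choices jointly across a whole fleet, but the per-truck decision boils down to this kind of comparison.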
Click for more FreightWaves articles by Noi Mahoney.
See more here:
Tech company uses quantum computers to help shipping and trucking industries - FreightWaves
Posted in Quantum Computing
Comments Off on Tech company uses quantum computers to help shipping and trucking industries – FreightWaves
Two-Electron Qubit Points the Way to Scaling up Quantum Computers, According to RIKEN Research – HPCwire
Posted: at 2:45 pm
June 22, 2020: The high-accuracy, resonant operation in silicon of a new type of qubit (the basic unit of data in quantum computers) has been demonstrated for the first time by an all-RIKEN team. This qubit overcomes a problem with conventional qubits in silicon that has been a roadblock to scaling up quantum computers.
Quantum computers promise to revolutionize computing as they will be able to perform certain types of calculations much faster than conventional computers.
There are various competing technologies for realizing quantum computers, all with their advantages and disadvantages. One of the most promising is the use of electron spins in silicon. It has the huge head start of being able to apply the semiconductor manufacturing techniques used today for conventional electronics.
But across all these diverse technologies, quantum computers are based on qubits, the quantum equivalent of bits in conventional computers, and use them to store information and perform calculations.
In silicon-based quantum computers, the simplest qubit is the spin of a single electron, which can be in a superposition of two possible states: up and down. However, these qubits require high-frequency microwave pulses to control them, which are hard to focus down so that they only control one qubit without disrupting its neighbors.
Now, Seigo Tarucha, Kenta Takeda and three co-workers, all at the RIKEN Center for Emergent Matter Science, have realized high-accuracy operation using a qubit that employs the spins of two electrons, which can exist in a superposition of two states: up-down and down-up.
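In standard bra-ket notation (the release itself gives no formulas), the two qubit types can be written as:

```latex
% Single-electron qubit: superposition of spin-up and spin-down
\lvert \psi \rangle = \alpha \lvert \uparrow \rangle + \beta \lvert \downarrow \rangle ,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
% Two-electron qubit: superposition of the two antiparallel configurations
\lvert \psi \rangle = \alpha \lvert \uparrow \downarrow \rangle + \beta \lvert \downarrow \uparrow \rangle
```

In both cases the qubit stores information in the complex amplitudes α and β; the difference is which physical spin configurations carry them.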
Compared to qubits based on single electrons, this qubit can be controlled by much lower frequency microwave pulses, which are easier to restrict to narrow areas. "The big advantage of our qubit is that it doesn't require high-frequency control pulses, which are usually difficult to localize and can be a problem when scaling a system up," explains Takeda. "The crosstalk caused by high-frequency signals can unintentionally rotate qubits near the target one."
While these two-electron qubits have been realized in previous studies, this is the first time the accuracy of their operation has reached 99.6%.
"Previous demonstrations of these qubits suffered from both nuclear and charge noise," Takeda notes. "In this study, we used an improved device and operation scheme to mitigate these issues and show that the control fidelity of the qubit can exceed the 99% threshold for quantum error correction."
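To see why crossing the 99% threshold matters, note that gate errors compound. Under a rough independent-error model (a back-of-the-envelope illustration, not from the release), a circuit of N gates runs error-free with probability about f to the power N, for per-gate fidelity f:

```python
# Probability that N sequential gates all succeed, given per-gate fidelity f.
# Rough model: errors are independent, so success probability ~ f**N.
def circuit_success(f, n_gates):
    return f ** n_gates

# At 99.0% fidelity, a 100-gate circuit succeeds ~37% of the time;
# at 99.6%, the same circuit succeeds ~67% of the time.
print(round(circuit_success(0.990, 100), 2))  # 0.37
print(round(circuit_success(0.996, 100), 2))  # 0.67
```

The seemingly small jump from 99.0% to 99.6% nearly doubles the success rate of a 100-gate circuit, which is why fidelity thresholds are treated as hard cutoffs for error correction.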
The team now intends to make their device even more accurate by rendering the nuclear noise negligible through employing a special type of silicon that contains only one isotope.
About RIKEN
RIKEN is Japan's largest comprehensive research institution, renowned for high-quality research in a diverse range of scientific disciplines. Founded in 1917 as a private research foundation in Tokyo, RIKEN has grown rapidly in size and scope, today encompassing a network of world-class research centers and institutes across Japan.
Source: RIKEN
Read the original here:
Posted in Quantum Computing
Comments Off on Two-Electron Qubit Points the Way to Scaling up Quantum Computers, According to RIKEN Research – HPCwire
RIKEN Physicists Develop Pseudo-2D Architecture for Quantum Computers that is Simple and Scalable – HPCwire
Posted: at 2:45 pm
June 22, 2020: A simple pseudo-2D architecture for connecting qubits, the building blocks of quantum computers, has been devised by RIKEN physicists. This promises to make it easier to construct larger quantum computers.
Quantum computers are anticipated to solve certain problems overwhelmingly faster than conventional computers, but despite rapid progress in recent years, the technology is still in its infancy. "We're still in the late 1940s or early 1950s, if we compare the development of quantum computers with that of conventional computers," notes Jaw-Shen Tsai of the RIKEN Center for Emergent Matter Science and the Tokyo University of Science.
One bottleneck to developing larger quantum computers is the problem of how to arrange qubits in such a way that they can both interact with their neighbors and be readily accessed by external circuits and devices. Conventional 2D networks suffer from the problem that, as the number of qubits increases, qubits buried deep inside the networks become difficult to access.
To overcome this problem, large companies such as Google and IBM have been exploring complex 3D architectures. "It's kind of a brute-force approach," says Tsai. "It's hard to do, and it's not clear how scalable it is."
Tsai and his team have been exploring a different tack from the big companies. "It's very hard for research institutes like RIKEN to compete with these guys if we play the same game," Tsai says. "So we tried to do something different and solve the problem they aren't solving."
Now, after about three years of work, Tsai and his co-workers have come up with a quasi-2D architecture that has many advantages over 3D ones.
Their architecture is basically a square array of qubits deformed in such a way that all the qubits are arranged in two rows (Fig. 1): a "bilinear array with cross wiring," as Tsai calls it. Since all the qubits lie on the edges, it is easy to access them.
The deformation means that some wires cross each other, but the team overcame this problem by using airbridges so that one wire passes over the other one, much like a bridge at the intersection of two roads allows traffic to flow without interruption. Tests showed that there was minimal crosstalk between wires.
The scheme is much easier to construct than 3D ones since it is simpler and can be made using conventional semiconductor fabrication methods. It also reduces the number of wires that cross each other. And importantly, it is easy to scale up.
The team now plans to use the architecture to make a 10×10 array of qubits.
Source: RIKEN
See the original post:
Posted in Quantum Computing
Comments Off on RIKEN Physicists Develop Pseudo-2D Architecture for Quantum Computers that is Simple and Scalable – HPCwire
Baidus deep-learning platform fuels the rise of industrial AI – MIT Technology Review
Posted: at 2:45 pm
Behind these smart drones are well-trained deep-learning models based on Baidu's PaddlePaddle, the first open-source deep-learning platform in China. Like mainstream AI frameworks such as Google's TensorFlow and Facebook's PyTorch, PaddlePaddle, which was open-sourced in 2016, provides software developers of all skill levels with the tools, services, and resources they need to rapidly adopt and implement deep learning at scale.
PaddlePaddle is being used by more than 1.9 million developers and 84,000 enterprises globally. Industries throughout China are using the platform to create specialized applications for their sectors, from the automotive industry's acceleration of autonomous vehicles to the health-care industry's applications for fighting covid-19.
Indeed, the coronavirus pandemic, which has spread to over 150 countries and caused a worldwide economic shock, is increasing demand for AI transformation. "Now is an unprecedented opportunity for the development of PaddlePaddle, given the rise of industrial intelligence and the acceleration of AI-powered infrastructure," says Haifeng Wang, chief technology officer at Baidu. "We will continue to embrace the open-source spirit, driving technological innovation, partnering with developers to advance deep-learning and AI technologies, and speeding up the process of industrial intelligence."
Deep-learning technologies create opportunities for revamping operations, workload management, and productivity, even in traditional industries such as manufacturing, forestry, energy, and waste management. For example, in waste management, AI is transforming refuse picking, sorting, and recycling, supporting efforts to conserve natural resources, reduce carbon emissions, and lessen waste going into landfill sites. According to a World Bank report, more than 2 billion tons of municipal solid waste are produced in the world each year. Collecting and separating it exposes waste pickers to any number of risk factors and hazards, making this a critical area for the development of innovative AI technologies.
In Europe and the US, computer-vision technology has been extensively used for detecting different types of waste, such as glass, plastic, and cardboard, to make waste sorting more efficient. But the task is not as efficient in all countries.
"Using traditional computer-vision models in China would be useless," says Zhiwen Zhang, CEO of Jinlu Technology. "The garbage in China is not compatible with what can be detected by this technology." Complications tend to arise with detection quality and with identifying diverse garbage, says Zhang.
A computer-vision veteran, Zhang turned to PaddlePaddle to develop applications for improving waste sorting in China. Although the industry lacks deep-learning expertise, with PaddlePaddle developers don't necessarily have to be deep-learning experts or build components like data-processing models from scratch.
Jinlu Technology uses a garbage-sorting robot programmed with an object-detection model to identify different types of garbage. It also uses an image-segmentation model to find garbage and do things like detect the edge of a bottle and determine its center point. The model takes just half a second to recognize an image.
For plastic bottles, Jinlu Technology trains an instance-segmentation model using PaddleDetection, a PaddlePaddle toolkit for image processing. The model runs inference on EdgeBoard (PaddlePaddle's edge-computing development platform) through Paddle Lite, PaddlePaddle's deep-learning framework tailored for lightweight models, and sends signals to robotic arms that classify the garbage. While traditional algorithms achieve screening accuracy between 60% and 90%, depending on the quality of the garbage, the deep-learning algorithms deliver an accuracy of 93% to 99%.
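The article doesn't include code; as a minimal, framework-free sketch of the "determine its center point" step, a centroid can be computed from a binary instance-segmentation mask (the mask format here is an assumption for illustration, not PaddlePaddle's actual output):

```python
def mask_centroid(mask):
    """Compute the center point of a binary segmentation mask.

    `mask` is a 2D list of 0/1 values, as an instance-segmentation
    head might emit for one detected bottle. Returns (row, col) of
    the centroid, or None for an empty mask.
    """
    count = row_sum = col_sum = 0
    for r, row in enumerate(mask):
        for c, v in enumerate(row):
            if v:
                count += 1
                row_sum += r
                col_sum += c
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)

# A toy 4x4 mask with a 2x2 "bottle" in the lower-right corner.
mask = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
print(mask_centroid(mask))  # (2.5, 2.5)
```

In a real pipeline this pixel centroid would be mapped through the camera calibration into robot-arm coordinates before a pick is attempted.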
Using AI in waste management promises further potential. "AI can not only reduce human labor by 96%, but it can also refine sorting and further identify waste that can be difficult to categorize, such as large pieces of organic matter, small pieces of metal, and other particles. Not to mention, AI can self-learn to optimize the pipeline," says Zhang.
Currently, PaddlePaddle offers 146 algorithms and more than 200 pretrained models, some with open-source code to facilitate the rapid development of industrial applications. The platform also hosts toolkits for cutting-edge research purposes, like Paddle Quantum for quantum-computing models and Paddle Graph Learning for graph-learning models.
PaddlePaddle facilitates AI development while lowering the technical burden on users, using a programmable scheme to architect neural networks. It supports both declarative and imperative programming, giving developers the flexibility to build software with different types of requirements while preserving high runtime performance. Its algorithms can automatically design neural architectures that offer better performance than those developed by human experts.
PaddlePaddle has also made breakthroughs in ultra-large-scale deep neural network training. Its platform, the first of its kind in the world, supports the training of deep neural networks with more than 100 billion features and trillions of parameters, using data sources distributed over hundreds of nodes. One of the beneficiaries is Oppo, a smartphone producer in China, which uses PaddlePaddle to boost the training efficiency of its recommendation system by 80%.
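Training at that scale typically shards data across nodes and averages gradients before each shared parameter update. A toy, framework-free sketch of one such data-parallel step (the function names are hypothetical, not PaddlePaddle's API):

```python
# Toy data-parallel step: each "node" computes a gradient on its shard,
# then gradients are averaged before the shared parameter is updated.
def local_gradient(w, shard):
    # Gradient of mean squared error for the 1-D model y = w * x.
    return sum(2 * x * (w * x - y) for x, y in shard) / len(shard)

def distributed_step(w, shards, lr=0.1):
    grads = [local_gradient(w, s) for s in shards]  # in parallel, per node
    avg = sum(grads) / len(grads)                   # all-reduce (average)
    return w - lr * avg                             # shared update

# Two "nodes", data drawn from y = 3x: w converges toward 3.
shards = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0)]]
w = 0.0
for _ in range(50):
    w = distributed_step(w, shards)
print(round(w, 2))  # 3.0
```

Real systems add asynchrony, compression and fault tolerance, but the shard-compute-average-update loop is the core of data-parallel training.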
Not only is PaddlePaddle compatible with other open-source frameworks for model training, it also accelerates the inference of deep neural networks for a variety of processors and hardware platforms. At the recent Baidu Deep Learning Developer Conference Wave Summit 2020, PaddlePaddle announced its collaboration in a hardware ecosystem that includes leading global tech companies such as Intel, NVIDIA, Arm China, Huawei, MediaTek, Cambricon, Inspur, and Sugon.
"PaddlePaddle still has room for improvement," says Baidu's corporate vice president Tian Wu. "In the future, PaddlePaddle will keep advancing large-scale distributed computing and heterogeneous computing, providing the most powerful production platform and infrastructure for developers to accelerate the development of intelligent industries."
One of the industrial applications developed from PaddlePaddle is currently in use for medical purposes to combat covid-19. The primary diagnostic tool for pneumonia, one of the severe effects of covid-19, is a chest computed-tomography (CT) scan. With limited front-line doctors and resources to read an exponentially growing number of scans quickly and accurately, CT imaging technology is crucial to helping clinicians detect and monitor infections more effectively.
LinkingMed, a Beijing-based oncology data platform and medical data analysis company, released China's first open-source AI model for pneumonia CT image analysis, powered by PaddlePaddle. The AI model can quickly detect and identify pneumonic lesions while providing a quantitative assessment for diagnosis, including the number, volume, and proportion of pneumonic lesions.
By using PaddlePaddle and its semantic segmentation toolkit PaddleSeg, LinkingMed has developed an AI-powered pneumonia screening and lesion-detection system being used in the hospital affiliated with Xiangnan University in Hunan Province. The system can pinpoint the disease in less than one minute with a detection accuracy of 92% and a recall rate of 97% on test data sets.
Robust AI will be needed to manage the increasingly complex tasks required for technological growth. Baidu is committed to developing the PaddlePaddle deep-learning platform along with AI researchers to create a better future. We're thrilled to see what we've accomplished in 2020 and look forward to new breakthroughs in the future.
Follow this link:
Baidus deep-learning platform fuels the rise of industrial AI - MIT Technology Review
Posted in Quantum Computing
Comments Off on Baidus deep-learning platform fuels the rise of industrial AI – MIT Technology Review
Global Quantum Information Processing Market Expanding Rapidly with Forecast 2025 and Top Players : 1QB Information Technologies, Airbus, Anyon…
Posted: at 2:45 pm
This research report on the Global Quantum Information Processing Market examines the market's growth factors in detail, by product segment, payment and transaction platform, service portfolio and application, along with the technological developments that shape the market's growth potential.
The report is designed to identify and interpret each market dimension and evaluate the factors likely to set the growth course of the Quantum Information Processing market. Besides presenting notable insights on these market factors, subsequent sections cover regional segmentation, region-specific developments, and leading market players' strategies for maximizing revenue and profit.
This study covers the following key players: 1QB Information Technologies, Airbus, Anyon Systems, Cambridge Quantum Computing, D-Wave Systems, Google, Microsoft, IBM, Intel, QC Ware, Quantum, Rigetti Computing, Strangeworks, Zapata Computing
Request a sample of this report @ https://www.orbismarketreports.com/sample-request/84594?utm_source=Pooja
This report on the Quantum Information Processing market is an all-in-one, ready-to-use handbook of market dynamics, offering insights on market developments, growth trajectory, dominant trends, technological sophistication, segment expansion and the competitive landscape, all of which bear strongly on the market's growth prospects.
This section of the report focuses on the strategic decisions that have contributed to market consolidation, stability and sustainable revenue pools, the ultimate touchstones for judging the potency of the Quantum Information Processing market.
Access Complete Report @ https://www.orbismarketreports.com/global-quantum-information-processing-market-growth-analysis-by-trends-and-forecast-2019-2025?utm_source=Pooja
Market segment by Type, the product can be split into: Hardware, Software
Market segment by Application, split into: BFSI, Telecommunications and IT, Retail and E-Commerce, Government and Defense, Healthcare, Manufacturing, Energy and Utilities, Construction and Engineering, Others
The report further details each segment's contribution to revenue, sustainability and long-term growth in the global Quantum Information Processing market. A thorough knowledge of these market facets is integral to decoding the market's prognosis. The report presents a deep analytical review and a concise summary of ongoing market trends that strongly influence the market's growth trajectory.
It also sheds light on the segments driving revenue maximization and steady growth in the Quantum Information Processing market. This well-versed report is crafted to arm readers with convincing market insights on how these factors propel growth despite significant bottlenecks.
Some Major TOC Points:
1 Report Overview
2 Global Growth Trends
3 Market Share by Key Players
4 Breakdown Data by Type and Application
Continued
In addition to the inputs discussed at length in the report, it covers the market's segmentation into prominent segments by type, end-use technology and region, to support well-informed business decisions in the Quantum Information Processing market.
For Enquiry before buying report @ https://www.orbismarketreports.com/enquiry-before-buying/84594?utm_source=Pooja
About Us: With unfailing market-gauging skills, we have been excelling in curating tailored business intelligence data across industry verticals. Constantly striving to expand our skill development, our strength lies in dedicated intellectuals with dynamic problem-solving intent, ever willing to mold boundaries to scale heights in market interpretation.
Contact Us: Hector Costello, Senior Manager, Client Engagements, 4144 N Central Expressway, Suite 600, Dallas, Texas 75204, U.S.A. Phone No.: USA: +1 (972) 362-8199 | IND: +91 895 659 5155
More here:
Posted in Quantum Computing
Comments Off on Global Quantum Information Processing Market Expanding Rapidly with Forecast 2025 and Top Players : 1QB Information Technologies, Airbus, Anyon…
Headliners, part I: 32 upcoming events & program deadlines in the Triangle – WRAL Tech Wire
Posted: at 2:45 pm
WRAL TechWire keeps tabs on the latest and greatest meetups, panels, workshops, conferences, application deadlines and all things happening in the North Carolina startup/tech world. The Headliners is a multi-part weekly roundup of upcoming events to add to your calendar.
Following is a list of June events in Raleigh, Durham, Chapel Hill and the greater Triangle area.
However, links to events are not available due to technical problems with our interactive calendar.
If you'd like your event to be included, feel free to send me an email.
Also, check outour comprehensive resource guide for startups in the Triangle.
Note: The following list is our lineup of Triangle events through the end of June; most have been switched to a virtual format due to COVID-19 social distancing requirements.
This monthly webinar series hosts a global health professional who will share insights and advice for advancing your career.
Led by The Storytelling Companion podcast host Chris Thiede, this workshop will teach participants how storytelling can help them connect with their audience.
Raleigh Chamber's next virtual Networking 101 event features a talk from Barfield Revenue Consulting CEO and President Will Barfield, who will discuss how to maximize online networking tools. (Tickets are free for employees of Raleigh Chamber member firms.)
The North Carolina Sustainable Energy Association is hosting a monthly webinar serieswith local and national experts covering clean energy trends.
Join Cary Chamber of Commerce for an evening social event with Wake County elected officials.
Code for Durham brings together technologists, designers, developers, data scientists, map makers and activists to collaborate on civic technology projects. Meetings are held every two weeks on Tuesdays. Pizza will be provided.
The Small Business and Technology Development Center is hosting a four-week business development program for entrepreneurs who are ready to take their ideas to the next level by getting their business up and running. Classes are held every Tuesday from June 2-23.
NC TECH's Diversity + Inclusion Summit will bring together executives, professionals and organizational leaders to discuss the benefits of a diverse workplace. The virtual event will take place over two days.
In this virtual session, IT leaders at NC TECH member companies will discuss relevant topics and best practices in their field.
This online meeting will convene CISOs, VPs and director-level security leaders from NC TECH member companies.
1 Million Cups, presented by Kauffman, is a weekly informal pitch event for the startup community. Join for free coffee and entrepreneurial support as local startups deliver their presentations.
NC4ME (North Carolina for Military Employment) is hosting a virtual hiring event where job seekers can meet face-to-face with hiring managers and recruiters.
In this four-week virtual summit series, Raleigh Chamber will cover a range of topics relevant to the local business community, from COVID-19's impact on the economy to economic mobility, talent and other topics.
The Economic Development Partnership of North Carolina is hosting a summer-long webinar series covering topics of interest to businesses seeking the latest information on exports and the global trade landscape. This webinar will cover trade opportunities for North Carolina companies in Canada.
In this lunch and learn, Agustín Peláez and Cristina Botero of Ubidots will discuss how solopreneurs around the world are using IoT technologies for their business.
Wake Technical Community College is hosting a free virtual career fair where students and alumni can chat individually with local organizations.
Grow With Google is hosting regularly scheduled webinars for North Carolina-based small businesses. Sessions are led by Latesha Byrd, founder and CEO of Byrd Career Consulting. Every webinar covers a different topic relevant to local businesses.
The Council for Entrepreneurial Development is hosting a virtual event presenting the organization's 2019 Innovators Report. Learn about last year's successes within the local startup community.
Join this webinar to learn about how quantum computing is used in economic models, including quantum money and quantum speed-ups.
ProductCampRTP is hosting an online conference spread across four weeks in June and July. The series is aimed towards product professionals worldwide. Join to hear talks from global experts. (More TechWire coverage here.)
Hosted by Forward Cities, this webinar will feature an economic development case study on how Kensington, PA is partnering with small businesses to promote growth.
RIoT is hosting a virtual town hall to discuss topics around racism and strategies to create change. The discussion will feature Denitresse Ferrell of DF Consulting and Brandon Johnson of NetApp.
Capitol Broadcasting Company and WRAL are hosting a webinar series featuring local industry experts and business owners. Each week brings a new topic relevant to businesses in the Triangle. More coverage here.
Wake Technical Community College is hosting a virtual event series featuring experts sharing their insights on the role of technology in solving problems during the COVID-19 pandemic.
DHIT is hosting a panel with healthcare technology leaders discussing how biomedical sensors can be applied in the COVID-19 pandemic.
The Research Triangle Cleantech Cluster, the Triangle Clean Cities Coalition and greater Charlotte's Centralina Clean Fuels Coalition are co-hosting a webinar with public and private sector leaders discussing the current transportation landscape and the opportunities to futurize it.
This weekly meetup brings together developers, IT professionals and tech enthusiasts who are interested in the Google Cloud Platform.
The 12-week RIoT Accelerator Program connects early-stage IoT startups with an industry consortium of more than 80 companies to learn, partner and bring your product to market. The fall 2020 cohort will run from August 24 to November 13 in Raleigh.
Raleigh Chamber's first virtual Courageous Conversation will feature Opal Tometi, co-founder of #BlackLivesMatter, along with other panelists discussing how companies can have honest conversations about race that lead to positive change.
Join the Code for Chapel Hill meetup to network with like-minded individuals and work on civic hacking projects. Meetings are held every two weeks on Tuesdays.
Bring your ideas and opinions to the next Midtown Techies meetup. Events are held on the last Tuesday of every month.
See the original post:
Headliners, part I: 32 upcoming events & program deadlines in the Triangle - WRAL Tech Wire
Posted in Quantum Computing
Comments Off on Headliners, part I: 32 upcoming events & program deadlines in the Triangle – WRAL Tech Wire
These are the technologies that will revolutionise businesses in the Post-COVID era – Moneycontrol
Posted: at 2:45 pm
Tulika Saxena
COVID-19 has highlighted the importance of technology-led transformation across industries. The coming decade is expected to witness a transformation catalysed by the emergence and confluence of a variety of technologies.
How has COVID-19 changed the status quo?
Biologists Stephen Jay Gould and Niles Eldredge's punctuated equilibrium theory states that evolutionary change happens in short, stressful bursts. We can consider these brief moments akin to a revolution that drives a much-needed transformation. The entities that can undertake such a transformation succeed in the new normal.
The ongoing COVID-19 pandemic is one such brief moment. Although technologies such as AI, IoT and big data analytics (BDA) made their presence felt in the last decade, their adoption had not reached its full potential. COVID-19 transformed the scenario: some enterprises underwent in a matter of weeks a transformation that might otherwise have taken years.
In the post-COVID-19 business landscape, technologies will play a critical role in enabling enterprises to create scalable business impact and design highly personalised offerings, thereby creating a strong foundation for success. This in turn will enable them to navigate challenges ranging from geopolitical tensions to trade wars and natural disasters.
Which technologies and use cases will shape the next decade?
Relatively well-established technologies will be sought after, given their proven capabilities in paving the way for transformation. However, the real difference will lie in the ability of organisations to leverage the advancements in these technologies. For instance, in AI and machine learning, the open-source movement will accelerate innovation, facilitating the development of highly customised solutions. Enhancements in analytical, predictive, and prescriptive capabilities will enable them to assist humans in complex use cases such as improving product design and reducing customer churn. Decentralisation of AI will transform IoT into IoT 2.0, enabling remote assets to leverage edge computing for generating and acting on actionable insights, almost independent of cloud connectivity. This will be immensely useful in scenarios such as remote health monitoring and autonomous vehicles, where compromised network connectivity could have life-threatening implications.
Advancements in AI techniques such as NLP and machine vision will drive advancements in BDA, enabling enterprises to leverage troves of structured and unstructured data to generate actionable insights. Sectors such as healthcare, BFSI and retail will be able to leverage historic and real-time customer insights to tailor experiences and offerings based on their preferences. Digital twins, leveraging AI, IoT and BDA, will benefit from enhanced functionality, thus driving their adoption. Advancements in mechatronics will lead to more capable and safer cobots. Blockchain will continue to be adopted in diverse areas. Blockchain-based smart contracts will gain traction as enterprises increasingly adopt code-based and automated legal solutions to optimise their operations. Blockchain's decentralised architecture and its use of encryption algorithms will also find applications in protecting remote assets using IoT solutions.
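The tamper-evidence underlying these blockchain applications comes from chaining cryptographic hashes: each block commits to the hash of its predecessor, so altering any earlier record invalidates everything after it. A minimal, platform-agnostic sketch (a toy illustration, not any production system):

```python
import hashlib
import json

def block_hash(contents):
    # Hash the block's contents, including the previous block's hash,
    # so rewriting any earlier block breaks every later link.
    return hashlib.sha256(
        json.dumps(contents, sort_keys=True).encode()
    ).hexdigest()

def add_block(chain, data):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev": prev}
    block["hash"] = block_hash({"data": data, "prev": prev})
    chain.append(block)

def is_valid(chain):
    for i, b in enumerate(chain):
        expect_prev = chain[i - 1]["hash"] if i else "0" * 64
        if b["prev"] != expect_prev:
            return False
        if b["hash"] != block_hash({"data": b["data"], "prev": b["prev"]}):
            return False
    return True

chain = []
add_block(chain, "shipment received")
add_block(chain, "customs cleared")
print(is_valid(chain))          # True
chain[0]["data"] = "tampered"   # rewrite history...
print(is_valid(chain))          # False: the stored hash no longer matches
```

Real blockchains add consensus, signatures and smart-contract execution on top, but this hash-chaining is what makes recorded IoT asset histories hard to falsify.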
Some of last decade's technologies that were ahead of their time will witness a resurgence as their use cases become more established. For instance, 3D printing will gain traction as enterprises from automobile OEMs to medical device manufacturers develop contingency plans for supply chain disruptions. Amid advancements in raw materials and product design, 3D-printed parts will be used in medical prosthetics and modular construction. Adoption of AR and VR solutions will increase as they become portable, affordable, and powerful. Besides use cases such as training and equipment maintenance, they will also be sought for creating virtual experiences in sectors such as entertainment, sports, and retail. Quantum computers will also progress beyond the R&D stage. Given their high speed, they are ideal for complex use cases, ranging from asset degradation modelling to drug development.
The coming decade will also witness transformation in the technologies that enable and accelerate the adoption of those covered above. For instance, in cloud computing, a mature technology, the hybrid cloud model will gain traction. It enables enterprises to strike the perfect balance between private and third-party clouds, and is flexible, cost-effective, and secure. In telecom, 5G will pave the way for dynamic and self-regulating networks capable of supporting use cases from autonomous vehicles to remote surgery. 6G, featuring 100 Gbps speeds and sub-1 ms end-to-end latency, could also appear on the horizon, rendering applications such as high-fidelity holograms and pervasive AI possible. In cybersecurity, the Secure Access Service Edge (SASE) architecture model will gain traction. It will allow enterprises to replace multiple security solutions with a unified SaaS-based network and security platform, thereby offering more flexibility and cost benefits.
What must be done to derive the maximum benefit from technologies?
The decade 2020-30 will see enterprises leverage past learnings, including those of the COVID-19 pandemic. Optimising technology usage for maximum benefit will depend on multiple factors: a flexible organisational culture, a customer-centric approach, drawing and applying insights from the most relevant and feasible technology use cases, effective implementation of the roadmap, and monitoring the right key performance indicators. This knowledge and experience will enable enterprises to be cognisant of the evolving business landscape, take appropriate tactical and strategic decisions, improve resilience, and succeed in the coming decade.
View post:
These are the technologies that will revolutionise businesses in the Post-COVID era - Moneycontrol
Posted in Quantum Computing
Comments Off on These are the technologies that will revolutionise businesses in the Post-COVID era – Moneycontrol