Quantum and classical computers handle time differently. What does that mean for AI? – The Next Web

Posted: September 18, 2020 at 1:02 am

As humans, we take time for granted. We're born into an innate understanding of the passage of events because it's essential to our survival. But AI suffers from no such congenital condition. Robots do not understand the concept of time.

State-of-the-art AI systems only understand time as an implicit construct (we program them to output time relative to a clock) or as an explicit mathematical representation (we use the time it takes to perform certain calculations to inform their understanding of the passage of events). But an AI has no way of understanding the concept of time itself as we do.

Time doesn't exist in our classical reality in a physical, tangible form. We can check our watch or look at the sun or try to remember how long it's been since we last ate, but those are all just measurements. The actual passage of time, in the physics sense, is less proven.

In fact, scientists have shown that time's arrow, a bedrock concept in the classical view of time, doesn't really apply on quantum computers. Classical systems exhibit a property called causal asymmetry. Basically, if you throw a bunch of confetti in the air and take a picture when each piece is at its apex, it'll be easier for a classical computer to determine what happens next (where the confetti is going) than what happened before (what direction the confetti would travel in going backwards through time).
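The confetti intuition can be sketched with a toy process (a hypothetical example, not one from the research described here): a rule whose forward step is deterministic, but whose reverse step is one-to-many, so retrodicting the past requires tracking extra information that forward prediction never needs.

```python
def forward(x):
    # Deterministic forward rule: given the present state,
    # the next state is unique and trivial to compute.
    return x // 2

def backward_candidates(x):
    # The reverse is ambiguous: both 2x and 2x+1 map forward to x,
    # so inferring the past means carrying every consistent history.
    return [2 * x, 2 * x + 1]

state = 13
print(forward(state))              # one unique future: 6
print(backward_candidates(state))  # two possible pasts: [26, 27]
```

Both candidate pasts really do map forward to the same present, which is why a classical model must spend more memory running this process in reverse than running it forward.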

Quantum computers can perform both calculations with equal ease, indicating that they do not suffer from causal asymmetry. Time's arrow is only relevant to classical systems, of which the human mind appears to be one, though our brains are almost certainly quantum constructs.

Where things get most interesting is when you consider the addition of artificial intelligence into the mix. As mentioned previously, AI doesn't have a classical or quantum understanding of time: time is irrelevant to a machine.

But experts such as Gary Marcus and Ernest Davis believe an understanding of time is essential to the future of AI, especially as it relates to human-level artificial general intelligence (AGI). The duo penned an op-ed for the New York Times where they stated:

In particular, we need to stop building computer systems that merely get better and better at detecting statistical patterns in data sets (often using an approach known as deep learning) and start building computer systems that, from the moment of their assembly, innately grasp three basic concepts: time, space and causality.

While the statement is intended as a sweeping indictment of relying on bare-bones deep learning systems and brute force to achieve AGI, it also serves as a litmus test for where the computer science community stands on AI.

Currently, we're building classical AI systems with the hope that they'll one day be robust enough to mimic the human mind. This is a technology endeavor, meaning computer experts are continuously pushing the limits of what modern hardware and software can do.

The problem with this approach is that it's creating a copy of a copy. Quantum physics tells us that, at the very least, our understanding of time is likely different from what might be the ultimate universal reality.

How close can robots ever come to imitating humans if they, like us, only think in classical terms? Perhaps a better question is: what happens when AI learns to think in quantum terms while we humans are still stuck with our classical interpretation of reality?


Published September 17, 2020 18:52 UTC

