Is Neuromorphic Computing The Answer For Autonomous Driving And Personal Robotics? – Forbes

Intel's Loihi 2 is the company's second-generation neuromorphic computing chip, a technology designed to function like a digital representation of a biological brain, complete with neurons and synapses.

If you follow the latest trends in the tech industry, you probably know that there's been a fair amount of debate about what the next big thing is going to be. The odds-on favorite for many has been augmented reality (AR) glasses, while others point to fully autonomous cars, and a few are clinging to the potential of 5G. With the surprise debut of Amazon's Astro a few weeks back, personal robotic devices and digital companions have also thrown their hat into the ring.

However, while there has been little agreement on exactly what the next thing is, there seems to be little disagreement that whatever it turns out to be, it will be somehow powered, enabled, or enhanced by artificial intelligence (AI). Indeed, the fact that AI and machine learning (ML) are our future seems to be a foregone conclusion.

Yet, if we do an honest assessment of where some of these technologies actually stand on a functionality basis versus initial expectations, it's fair to argue that the results have been disappointing on many levels. In fact, if we extend that thought process out to what AI/ML were supposed to do for us overall, then we start to come to a similarly disappointing conclusion.

To be clear, we've seen some incredible advancements in many areas that AI has powered. Advanced analytics, neural network training, and other related fields (where large chunks of data are used to find patterns, learn rules, and then apply them) have been huge beneficiaries of existing AI approaches.

At the same time, if we look at an application like autonomous driving, it seems increasingly clear that just pushing more and more data into algorithms that crank out ever more refined, yet still flawed, ML models isn't really working. We're still years away from true Level 5 autonomy, and, given the number of accidents and even deaths that efforts like Tesla's Autopilot have led to, it's probably time to consider another approach.

Similarly, though we are still at the dawn of the personal robotics age, it's easy to imagine how the conceptual similarities between autonomous cars and robots will lead to conceptually similar problems in this new field. The problem, ultimately, is that there is simply no way to feed every potential scenario into an AI training model and create a predetermined answer for how to react in any given situation. Randomness and unexpected surprises are simply too strong an influence.

What's needed is a type of computing that can really think and learn on its own and then adapt that learning to unexpected scenarios. As crazy and potentially controversial as that may sound, that's essentially what researchers in the field of neuromorphic computing are attempting to do. The basic idea is to replicate, in digital form, the structure and function of the most adaptable computing and thinking device we know of: the human brain. Following the principles of basic biology, neuromorphic chips attempt to re-create a series of connected neurons using digital synapses that send electrical pulses between them, much as biological brains do.
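To make that idea concrete, here is a minimal, purely illustrative sketch in Python (the names and parameters are hypothetical; this is not Intel's actual design or the Lava API) of a "digital neuron" that integrates incoming pulses across weighted synapses and emits a spike of its own once a threshold is crossed:

```python
import numpy as np

# Illustrative only: a leaky integrate-and-fire style neuron, one common way
# researchers model the neuron/synapse structure described above.
class DigitalNeuron:
    def __init__(self, n_inputs: int, threshold: float = 1.0, leak: float = 0.9):
        self.weights = np.random.uniform(0.0, 0.5, n_inputs)  # synaptic strengths
        self.threshold = threshold   # potential needed to fire
        self.leak = leak             # potential decays each step, like a leaky membrane
        self.potential = 0.0

    def step(self, input_spikes: np.ndarray) -> bool:
        """Integrate incoming spike events (0/1 values); fire if threshold is reached."""
        self.potential = self.leak * self.potential + self.weights @ input_spikes
        if self.potential >= self.threshold:
            self.potential = 0.0     # reset after firing
            return True              # emit an outgoing spike
        return False

# Example: feed a few sparse, event-like spike patterns into one neuron.
neuron = DigitalNeuron(n_inputs=4)
for t in range(5):
    spikes = (np.random.rand(4) < 0.3).astype(float)
    print(t, spikes, neuron.step(spikes))
```

Real neuromorphic hardware implements this kind of behavior directly in silicon rather than in software, but the neuron-and-synapse structure is the same in spirit.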

It's an area of academic research that's been around for a few decades now, but only recently has it started to make real progress and gain more attention. In fact, buried in the wave of tech industry announcements made over the last few weeks was news that Intel had released the second generation of its neuromorphic chip, named Loihi 2, along with a new open-source software framework for it that they've dubbed Lava.

To put realistic expectations around all of this, Loihi 2 is not going to be made commercially available (it's termed a "research chip"), and the latest version offers 1 million neurons, a far cry from the approximately 100 billion found in a human brain. Still, it's an extremely impressive, ambitious project that offers 10x the performance and 15x the density of its 2018-era predecessor (it's built on the company's new Intel 4 chip manufacturing process technology), along with improved energy efficiency. In addition, it also provides better (and easier) means of interconnecting its unique architecture with other, more traditional chips.

Intel clearly learned a great deal from the first Loihi, and one of the biggest realizations was that software development for this radically new architecture is extremely hard. As a result, another essential part of the company's news was the debut of Lava, an open-source software framework and set of tools that can be used to write applications for Loihi. The company is also offering tools that can simulate its operation on traditional CPUs and GPUs so that developers can create code without having access to the chips.

What's particularly fascinating about how neuromorphic chips operate is that, despite functioning in a dramatically different fashion from both traditional CPU computing and parallel GPU-like computing models, they can be used to achieve some of the same goals. In other words, neuromorphic chips like Loihi 2 can provide the outcomes that traditional AI is shooting for, but in a significantly faster, more energy-efficient, and less data-intensive way. Through a series of event-based spikes that occur asynchronously and trigger digital neurons to respond in various ways, much as a human brain operates (vs. the synchronous, structured processing in CPUs and GPUs), a neuromorphic chip can essentially learn things on the fly. As a result, it's ideally suited for devices that must react to new stimuli in real time.
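The following toy sketch illustrates that event-driven style (again, this is a hypothetical illustration, not the Loihi 2 hardware or the Lava programming model): computation only happens when spikes actually arrive, and a simple Hebbian-style rule nudges synapse weights as events occur, which is one way to picture "learning on the fly" rather than offline batch training.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 8, 4
weights = rng.uniform(0.0, 0.3, size=(n_post, n_pre))   # synapses
potential = np.zeros(n_post)                             # membrane potentials
threshold, leak, lr = 1.0, 0.9, 0.05

for t in range(20):
    pre_spikes = np.flatnonzero(rng.random(n_pre) < 0.2)  # sparse input events
    if pre_spikes.size == 0:
        potential *= leak          # no events: nothing to integrate this step
        continue
    # Integrate only the synapses that actually carried a spike (event-driven).
    potential = leak * potential + weights[:, pre_spikes].sum(axis=1)
    fired = np.flatnonzero(potential >= threshold)
    potential[fired] = 0.0
    # Hebbian-style update: strengthen synapses between co-active neurons,
    # so the network adapts continuously as new events arrive.
    for j in fired:
        weights[j, pre_spikes] += lr * (1.0 - weights[j, pre_spikes])
    print(f"t={t:2d} input events={pre_spikes.tolist()} fired={fired.tolist()}")
```

Because work is only done when events occur, sparse real-world input translates directly into lower power draw, which is a big part of the efficiency argument for this style of computing.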

These capabilities are why such chips are so appealing to those designing and building robots and robot-like systems, which autonomous cars essentially are. The bottom line is that it could take commercially available neuromorphic chips to power the kind of autonomous cars and personal robots of our science fiction-inspired dreams.

Of course, neuromorphic computing isn't the only new approach to advancing the world of technology. There's also a great deal of work being done in the more widely discussed world of quantum computing. Like quantum computing, the inner workings of neuromorphic computing are extraordinarily complex and, for now, primarily the province of corporate R&D labs and academic research. Unlike quantum, however, neuromorphic computing doesn't face the extreme physical challenges (temperatures near absolute zero) and power requirements that quantum currently does. In fact, one of the many appealing aspects of neuromorphic architectures is that they're designed to be extremely low power, making them suitable for a variety of mobile or other battery-powered applications (like autonomous cars and robots).

Despite recent advancements, it's important to remember that commercial application of neuromorphic chips is still several years away. However, it's hard not to get excited and intrigued by a technology that has the potential to make AI-powered devices truly intelligent, instead of simply very well-trained. The distinction may seem subtle, but ultimately, it's that kind of new smarts that we'll likely need in order to make some of the next big things really happen in a way that we can all appreciate and imagine.

Disclosure: TECHnalysis Research is a tech industry market research and consulting firm and, like all companies in that field, works with many technology vendors as clients, some of whom may be listed in this article.
