Human brain structure inspires artificial intelligence – CBC.ca

The human brain is the most powerful supercomputer on Earth, and now researchers from the University of Southern California are taking inspiration from the structure of the human brain to make better artificial intelligence systems.

Artificial intelligence (or AI) is a system of computing that aims to mimic the power of the human brain. We have roughly 86 billion neurons, or electrically conducting cells, in our brains, linked by something like 100 trillion connections, and together they give us the incredible computing power for which we are known. Computers can do things like multiply 134,341 by 989,999 really well, but they can't do things like recognize human faces, or learn and change their understanding of the world. At least not yet, and that's the goal of AI: to devise a computer system that can learn, process images and otherwise be human-like.

Very good question! Part of this answer is: why not? AI is the holy grail for computer scientists who want to make a computer as powerful as the human brain. Basically, they want to create a computer that doesn't need to be programmed with all the variables because it can learn them just like our brain does.

A six-foot-tall, 300-pound Valkyrie robot is seen at University of Massachusetts-Lowell's robotics center in Lowell, Mass. "Val," one of four sister robots built by NASA, could be the vanguard for the colonization of Mars by helping to set up a habitat for future human explorers. (Elise Amendola/Associated Press)

Another reason scientists are interested in AI is that it could be used for things like surveillance and face recognition. Computer systems that can learn new terrain or solve a new problem somewhat autonomously could, in certain situations, be very beneficial.

In order to fully mimic the power of our own cognitive capacity, we have to first understand how the brain works, which is a feat in and of itself. We also have to re-engineer and re-envision the computer to be completely different, from hardware to software and everything in between, and the reason comes down to how our brains are powered.

"If we compare, for example, our brain to the super computers we have today,they run on megawatts, [which is] a huge amount of power that's equivalent to a few hundred households, while our brain only relies on water and sandwiches to function," saidartificial intelligence and computing expert Han Wang from the University of Southern California said."It consumes power that's equivalent to a light bulb."

So you can see the effect of millions of years of evolution: the brain has learned to work with limited resources and become so power-efficient that it can beat a supercomputer at complex processing without breaking the energy bank.
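To put that comparison in very rough numbers, here is a back-of-the-envelope sketch. The figures are assumptions for illustration only (a supercomputer drawing a few megawatts, the brain drawing roughly 20 watts, about a light bulb), not measurements from Wang's lab.

```python
# Rough back-of-the-envelope power comparison.
# Both figures below are illustrative assumptions, not measured values.
supercomputer_watts = 3_000_000   # assumed: ~3 MW, "a few hundred households"
brain_watts = 20                  # assumed: ~20 W, roughly a light bulb

ratio = supercomputer_watts / brain_watts
print(f"The supercomputer draws roughly {ratio:,.0f} times more power than the brain.")
```

On those assumptions, the gap is a factor of more than a hundred thousand.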

This is where the main difference between the brain and the computer lies.

"Our current computers, there's a very powerful corebut then you have a long queue of tasks [which]come in sequentially and are processed sequentially," Wang said. "While our brain, the computation of units, which are the neurons, are connected in highly parallel manner. It's this high level parallelism that has advantages in learning and recognition."

So it's the parallelism in the brain that allows us to use only what we need, when we need it, and not waste energy on running background processes, which, as we all know, slow down our computers.
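To make that contrast concrete, here's a toy sketch in Python: one worker clearing a queue of tasks one at a time, versus several workers handling them at the same time. It's only an analogy for the architecture Wang describes, not a simulation of neurons or of his hardware, and the "task" is just a short pause standing in for a unit of work.

```python
# Toy illustration: a sequential queue versus parallel workers.
# The task is a stand-in (a short sleep), not a model of a neuron.
import time
from concurrent.futures import ThreadPoolExecutor

def task(i):
    time.sleep(0.1)   # pretend this is a small unit of work
    return i

N = 8

# Sequential: tasks queue up and are processed one after another.
start = time.time()
for i in range(N):
    task(i)
print(f"sequential: {time.time() - start:.2f} s")

# Parallel: many workers handle tasks at the same time,
# like neurons firing in parallel rather than waiting in line.
start = time.time()
with ThreadPoolExecutor(max_workers=N) as pool:
    list(pool.map(task, range(N)))
print(f"parallel:   {time.time() - start:.2f} s")
```

The sequential run takes roughly eight times as long as the parallel one, which is the point Wang is making about queues versus parallel units.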

It's this concept of running parallel circuits at low energy. The key is to make computer circuits more complex in the messages they can send.

In a typical computer, each node sends a one or a zero, and a program is built up from long series of those ones and zeros.

In the brain, a very small circuit can send a one, which means go; a zero, which means no signal; a two, which means stop; or both a one and a two at the same time.

Artificial intelligence could be beneficial in situations where robots need to make quick decisions, like how to manoeuvre over unknown terrain. (Boston Dynamics)

In other words, our brains can send double the information in any given exchange compared to a computer, and that, coupled with smaller networks working in parallel, reduces the power strain.
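That "double the information" claim follows from a simple counting argument: a signal that can take four distinct states carries two bits per exchange, while a binary signal carries one. A quick sketch:

```python
# Information per signal depends on how many distinct states it can take.
from math import log2

binary_states = 2      # a conventional line: 0 or 1
brainlike_states = 4   # no signal, "go", "stop", or "go" and "stop" together

print(log2(binary_states))     # 1.0 bit per exchange
print(log2(brainlike_states))  # 2.0 bits per exchange, double the information
```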

What Wang and his colleagues did was create a system of wires, connected using tin selenide and black phosphorus, that can send a go, a stop, nothing, or both signals at once, depending on the voltage applied.
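As a rough illustration of the idea only (the voltage thresholds below are invented for this sketch and are not the actual operating values of the tin selenide and black phosphorus device), the behaviour amounts to mapping an input voltage to one of four signals:

```python
# Hypothetical sketch: mapping an input voltage to one of four signals.
# The threshold values are invented for illustration, not device specs.
def signal_for_voltage(volts: float) -> str:
    if volts < 0.2:
        return "no signal"    # like sending a zero
    elif volts < 0.5:
        return "go"           # like sending a one
    elif volts < 0.8:
        return "stop"         # like sending a two
    else:
        return "go and stop"  # both signals at once

for v in (0.1, 0.3, 0.6, 0.9):
    print(v, "->", signal_for_voltage(v))
```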

Now the plan is to re-engineer the computer from the ground up and build one that has the capacity for these low-voltage decisions, not wired through the few powerful cores we see today, but with each circuit of messages working in parallel, the way the brain does.

Until recently, this was only a theoretical concept, because there was no way to send this much information in a single transmission.

So artificial intelligence may be only a few incredibly brilliant research careers away from becoming a reality.
