These AI bots created their own language to talk to each other – Recode

Posted: March 23, 2017 at 1:58 pm

It is now table stakes for artificial intelligence algorithms to learn about the world around them. The next level: for AI bots to learn how to talk to each other and develop their own shared language.

New research released last week by OpenAI, the artificial intelligence nonprofit lab founded by Elon Musk and Y Combinator president Sam Altman, details how they're training AI bots to create their own language, based on trial and error, as the bots move around a set environment.

This is different from how artificial intelligence algorithms typically learn, which is by processing large sets of data: an algorithm learns to recognize a dog, for example, by taking in thousands of pictures of dogs.

The world the researchers created for the AI bots to learn in is a computer simulation of a simple, two-dimensional white square. There, the AIs, which took the shape of green, red and blue circles, were tasked with achieving certain goals, like moving to other colored dots within the white square.

But to get the task done, the AIs were encouraged to communicate in their own language. The bots created terms that were grounded, meaning they corresponded directly to objects in their environment, to other bots and to actions, like "Go to" or "Look at." But the language the bots created wasn't made of words in the way humans think of them; rather, the bots generated sets of numbers, which the researchers labeled with English words.
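To see what a "grounded" vocabulary of numbers might look like, here is a minimal sketch in Python. It is not OpenAI's code; the landmarks, token values and English glosses are all illustrative assumptions, meant only to show how an utterance can be a short list of integers that researchers label with words after watching how the bots use them.

```python
# Illustrative sketch only: numeric "words" grounded in a tiny 2-D world.
# All names, coordinates and token assignments are assumptions for this example.

import random

# Hypothetical landmarks in the white square, identified by color.
LANDMARKS = {"red": (0.8, 0.2), "green": (0.1, 0.9), "blue": (0.5, 0.5)}

# The bots' "words" are just integers; the English glosses are labels a
# researcher might attach afterward by observing how each token gets used.
TOKEN_GLOSSES = {0: "GO-TO", 1: "LOOK-AT", 2: "red", 3: "green", 4: "blue"}
COLOR_TOKENS = {"red": 2, "green": 3, "blue": 4}

def speak(goal_color):
    """Emit a two-token utterance such as [0, 2], glossed as 'GO-TO red'."""
    return [0, COLOR_TOKENS[goal_color]]

def listen(utterance):
    """Decode a numeric utterance back into an action gloss and a target point."""
    action = TOKEN_GLOSSES[utterance[0]]
    color = TOKEN_GLOSSES[utterance[1]]
    return action, LANDMARKS[color]

goal = random.choice(list(LANDMARKS))
message = speak(goal)
print(message, "->", listen(message))
```

The point of the sketch is that nothing about the tokens is English; the meaning lives entirely in how speaker and listener use them inside the shared environment.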

You can get a sense of how this works in the researchers' demonstration video.

The researchers taught the AIs how to communicate using reinforcement learning: Through trial and error, the bots remembered what worked and what didn't for the next time they were asked to complete a task. Igor Mordatch, one of the authors of the paper, will join the faculty at Carnegie Mellon in September. Pieter Abbeel, the other author, is a research scientist at OpenAI and a professor at the University of California, Berkeley.
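The trial-and-error idea can be made concrete with a toy example. The sketch below is an assumption-laden illustration, not the paper's algorithm (the actual research uses far richer environments and learning methods): a speaker is rewarded only when the token it sends leads a fixed listener to the right landmark, and it gradually reinforces whichever token worked.

```python
# Toy reinforcement-learning loop (illustrative assumptions throughout):
# a speaker learns, from reward alone, which numeric token to send for each goal.

import random
from collections import defaultdict

GOALS = ["red", "green", "blue"]
TOKENS = [0, 1, 2]

# Q[goal][token]: running estimate of how well sending `token` for `goal` works.
Q = defaultdict(lambda: defaultdict(float))

def listener(token):
    """A fixed listener that simply walks to landmark number `token`."""
    return GOALS[token]

for episode in range(2000):
    goal = random.choice(GOALS)
    # Explore occasionally; otherwise send the best-known token for this goal.
    if random.random() < 0.1:
        token = random.choice(TOKENS)
    else:
        token = max(TOKENS, key=lambda t: Q[goal][t])
    reward = 1.0 if listener(token) == goal else 0.0
    # Remember what worked: nudge the estimate toward the observed reward.
    Q[goal][token] += 0.1 * (reward - Q[goal][token])

print({g: max(TOKENS, key=lambda t: Q[g][t]) for g in GOALS})
```

After a couple of thousand episodes the speaker settles on a distinct token for each goal, a toy version of a shared, grounded vocabulary emerging from reward alone.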

There are already AI assistants that can understand language, like Siri or Alexa, or help with translation, but that understanding mostly comes from feeding language data to the AI, rather than from the AI learning language through experience.

"We think that if we slowly increase the complexity of their environment, and the range of actions the agents themselves are allowed to take, it's possible they'll create an expressive language which contains concepts beyond the basic verbs and nouns that evolved here," the researchers wrote in a blog post.

Why does this matter?

"Language understanding is super important to make progress on before AI reaches its full potential," said Miles Brundage, an AI policy fellow at Oxford University, who also notes that OpenAI's work represents a potentially important direction for the field of AI research to move toward.

"It's not clear how good we can get at AI language understanding without grounding words in experience," Brundage said, "and most work still looks at words in isolation."
