The age of artificial humans is here

Posted: January 18, 2020 at 10:25 am

Cora's human-like features were developed by a company called Soul Machines, co-founded by Mark Sagar. What is happening to bots like Cora points to radical changes currently under way: an ongoing push to create machines with souls. And, in all likelihood, 2020 may turn out to be the breakthrough year in the evolution of a slew of human-like avatars, which could potentially upturn everything from television news anchoring and advertising to movie-making.

Incidentally, Sagar had won an Academy Award for scientific and engineering achievement in 2011 for his work on the movie Avatar. Today, Soul Machines has built a business out of creating what the company calls "Digital Heroes" (referred to as artificial humans in this article), which are avatars that look, feel and interact just like humans.

"At Soul Machines, we have a strong focus on biologically-inspired artificial intelligence in order to create Digital Heroes that can respond in real-time to your emotions and what you are saying," said Greg Cross, co-founder and chief business officer at Soul Machines. "The autonomous animation of our Digital Heroes is driven by the world's first digital brain, which is modelled on the way the human brain and nervous system work."

While research into artificial humans isn't particularly new any more, the topic recently made news as Samsung's advanced research division, STAR Labs (Samsung Technology and Advanced Research), announced a project called NEON at the Consumer Electronics Show (CES) in Las Vegas, US, this past week. What Soul Machines calls Digital Heroes, STAR Labs calls NEONs.

STAR Labs is the arm of Samsung responsible for technologies like Gear VR, a virtual reality headset. With the CES demo, what was being tinkered with inside labs is now finally out in the world. And it raises the possibility of "artificial human products", which consumers and companies can buy in the near future. Whereas the early use of artificial humans still showed an avatar that was noticeably artificial, things have advanced dramatically. Samsung's NEONs look and behave exactly like real humans, to the point where it is nearly impossible to tell whether the person you're interacting with on-screen is a human or a machine.

Path to cyborgs

Artificial humans use a specialized branch of artificial intelligence (AI) called generative adversarial networks (GANs). Interestingly, this is the same technology that goes into creating deepfakes, which are artificially created videos where a person's face is replaced by someone else's (say, a prominent celebrity or politician) while the underlying speech remains the same.

According to an expert in machine learning, who has worked with InMobi and McKinsey, unlike most AI algorithms, GANs contain two neural networks: a discriminator and a creator. The discriminator, which is trained to recognize real images, objects and so on by being fed lots of data, has the job of catching fakes, while the creator's job is to produce those fakes. When the creator makes something that the discriminator cannot catch, that becomes the output of the algorithm.
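That two-network description maps directly onto how a GAN is trained in code. The sketch below is a minimal, illustrative training loop in PyTorch, not the Core R3 platform or Soul Machines' actual system; the network sizes, the random stand-in for "real" data, and all variable names are assumptions made purely for illustration.

```python
# Minimal GAN sketch: a "creator" (generator) learns to produce fakes that a
# discriminator can no longer tell apart from real samples. Illustrative only;
# sizes and the "real" data below are toy assumptions.
import torch
import torch.nn as nn

LATENT_DIM, DATA_DIM = 16, 64  # assumed sizes for this toy example

# Creator: turns random noise into a fake sample.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 128), nn.ReLU(),
    nn.Linear(128, DATA_DIM), nn.Tanh(),
)

# Discriminator: outputs the probability that a sample is real.
discriminator = nn.Sequential(
    nn.Linear(DATA_DIM, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

for step in range(1000):
    # Stand-in for a batch of real data (e.g. face images flattened to vectors).
    real = torch.randn(32, DATA_DIM)

    # 1) Train the discriminator to label real samples 1 and fakes 0.
    noise = torch.randn(32, LATENT_DIM)
    fake = generator(noise).detach()  # don't update the generator in this step
    d_loss = (bce(discriminator(real), torch.ones(32, 1))
              + bce(discriminator(fake), torch.zeros(32, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator into saying "real".
    noise = torch.randn(32, LATENT_DIM)
    g_loss = bce(discriminator(generator(noise)), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Once the creator's fakes reliably fool the discriminator, samples drawn from it become the algorithm's output, the "fake the discriminator cannot catch" that the expert describes; systems of the kind discussed in this article build on far larger, specialized architectures.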

GANs specialize in creating content that didn't exist before. There have been examples of GANs being used to create music, poems, and even paintings. STAR Labs' Core R3 platform uses such algorithms, along with other advanced technologies, to create its NEONs.

Unlike earlier versions of digitally created humans, artificial humans today look, feel and behave almost exactly like humans. "The experience of talking to an artificial human is similar to talking to someone over a video call," according to Soul Machines' Cross.

While GANs are something like the backbone of the process of generating artificial humans, there is obviously a lot more that goes into it. "What we are doing here is not based only on AI," said Pranav Mistry, president and chief executive of STAR Labs.

NEON's patented Core R3 platform uses behavioural neural networks, evolutionary generative intelligence, and computational reality. "Voice and other services don't have to come from us. Core R3 or our techniques can connect to any third-party value-added service for domain-specific knowledge, languages, etc.," he added.

Mistry said he believes that putting NEONs to use in practical daily scenarios will help them learn and understand how humans speak and behave. "They will learn from their interactions and become better as time progresses," he said.

The current version of artificial humans is quite new, said Mistry, adding that the demos shown at CES can't even be called betas at the moment.

While it may, at first glance, seem like a bunch of new-age companies are merely putting a face on top of the voice assistants that anyone with a smartphone is familiar with, the important technological effort here is in understanding human emotions and expressions. Traditionally, that's an area where AI research has hit a roadblock.

"Emotion is essential to human interaction, and much of the way we connect as humans is done face-to-face, in the reading and understanding of each other's facial expressions and voice," said Soul Machines' Cross.

Digital transformation expert and author of The Tech Whisperer, Jaspreet Bindra, explained that all the AI we see today falls under artificial narrow intelligence, or ANI. "Narrow intelligence has existed even before Alexa," he said. According to Bindra, the real advancement made in creating AI assistants was in voice technologies.

In much the same way, artificial humans bring about a leap in understanding emotions and expressions, adding one more realm after voice on the path towards building truly independent robots. In a lot of ways, an artificial human is the same as a regular video game character, but it is going to be hard to tell it is an animation unless you are aware of it beforehand.

"Technologies have existed for a lot of things for a long time. But bringing them into a version where you can actually demo it like this is definitely new," said Bindra. "It manifests the fact that some of these uber-futuristic technologies (like cyborgs, etc.) can actually happen in real life."

Practical uses

According to STAR Labs' Mistry, the field is so new right now that a business model is hard to predict. However, he plans to license Samsung's NEONs to companies, and eventually consumers, for various purposes.

NEON has been able to give different personalities to each of its artificial humans. So, customers will be able to choose the one that fits their needs the best. The company doesn't plan to sell a NEON to a customer yet, so it will be a subscription-driven model. An avatar could be rented for a specific purpose.

That said, there have already been instances of artificial humans being used in advertising. For instance, American consumer goods firm Procter & Gamble deployed a "digital influencer" called Yumi last year to act as the brand ambassador for its SK-II skincare products.

"A lot of pictures (used in advertising) have now become stock photography. I would imagine artificial humans can be useful for such applications," said Sreekanth Khandekar, co-founder and director of afaqs!, a publication aimed at advertising, media and marketing professionals. "It will come down to whether this is realistic enough, and how much money it saves a company."

Other than advertising, films are billed as another possible area where artificial humans can be used. Chaitanya Chinchlikar, vice president of film-making institute Whistling Woods, believes that such technologies are a "huge asset for content creation".

"If I'm able to scan an actor and use an artificial version of him for a film, that's great," he said, but added that cost will be a major factor as such technologies are usually not cheap and accessible, at least at the moment.

Essentially, an artificial human can be used for nearly anything that involves interactions. They can be used as receptionists at a hotel, models who sell consumer goods, and so on. In fact, with some added tech, they can also perform certain tasks, making them a mix of AI assistants and digital avatars.

"Digital Heroes can perform literally any customer experience role: tasks like booking hotel rooms, showing prospective real estate clients available apartments, and answering essential questions," said Cross of Soul Machines.

While Samsung's artificial humans will initially be meant for organizations and enterprises, Mistry says he eventually plans to allow consumers to use them too. "You find a NEON you think can be a good friend of yours and you can select that one," he said. "That NEON won't know anything about you in the beginning, but it will learn over time, which is where the personalization will come in."

For example, rapper and entrepreneur Will.I.Am had Soul Machines make an artificial version of himself, which was shown in YouTube's recent documentary called The Age of AI. "You can't be at two places at once... that's the promise of the avatar," says Will.I.Am on the show.

The fact that an artificial human is, at the end of the day, a piece of computer software also means it can be accessed over the internet. "The smartphone is the way most people would do this today, but, in future, it could be using augmented reality or virtual reality glasses," said Cross.

Privacy worries

But almost any new digital technology that promises transformation comes with a set of serious privacy worries. Artificial humans pose a particularly big problem, as deepfakes have already been used to create fake versions of powerful executives and world leaders.

For example, BuzzFeed used Adobe After Effects software and FakeApp to show former US president Barack Obama calling current President Donald Trump a "dipshit". While BuzzFeed used this only to demonstrate the nefarious uses of deepfakes, many have also found their faces being put on pornographic videos without their permission.

In fact, Facebook, the world's biggest social media platform, recently banned deepfakes and manipulated content.

While it is unclear whether artificial humans will face such a backlash, the fact that they do not replicate existing content is important. According to Soul Machines' Cross, the company doesn't allow its Digital Heroes to be used for pornography and the like, and retains the right to not provision the technology to certain industries.

When a likeness to an existing person is created, Soul Machines requires its clients to demonstrate they "actually have a license agreement" to use such a likeness.

In The Age of AI, Will.I.Am asked Soul Machines to keep his avatar slightly robotic, so that people can tell it's not the real person. "We are at a place we've never been in as a society, where people have to determine what's real and what's not," he said.

STAR Labs' Mistry also thinks privacy is an important aspect of the technology. He said the interaction between a person and the NEON never goes beyond the two parties involved. That is, even Samsung cannot access these conversations and interactions. This is probably why NEONs do not act as an interface between a user and the internet.

The singularity

In essence, artificial humans seem to be the missing piece of the puzzle. In every movie involving AI, the AI seems to have a voice, a face and emotions. So, whereas Google and Amazon brought voice to AI, companies like Soul Machines and Samsung are bringing the face and emotions.

Futurist Ray Kurzweil had famously predicted that humanity will achieve the "singularity" by the year 2045, which is the point when machines become smarter than humans. A real artificial being that is as intelligent as, or more intelligent than, human beings falls under the banner of artificial general intelligence (AGI). In fact, Cross referred to Soul Machines as an "AGI research" company instead of an AI research firm.

AGI is pretty much every world-dominating robot or AI you have seen in movies, like, say, The Terminator or Avengers: Age of Ultron. NEONs and Digital Heroes seem to be missing many pieces of that puzzle.

In October last year, a Russian startup called Promobot claimed to have created the world's first android that looks like a real person. Its robot, Robo-C, is capable of more than 600 facial expressions and can be made to look like anyone you want.

Professor Krithi Ramamritham of the Indian Institute of Technology Bombay, who was at Robo-C's launch, said that the current advances are critical because the learnings from them can be used to produce general-purpose solutions in the long run. While self-thinking, autonomous robots may or may not materialize by 2045, the era of artificial humans which can speak to, if not out-think, regular humans is well and truly here.
