Several years ago, I packed up my life in Cairo, Egypt, and moved to the UK to pursue my PhD thousands of miles away from everyone I knew and loved. As I settled into my new life, I found myself spending more hours with my laptop than with any other human being.
I felt isolated and incredibly homesick. Chatting online with my family back home, I was often in tears, but they had no idea how I was feeling behind my screen (with the exception of a sad face emoticon that I would send).
I realised then that the technology and devices we consider to be smart and helpful in so many aspects of our lives are emotion blind. Technology has a high IQ but no EQ, no emotional intelligence, and that comes with a cost.
Face-to-face, people share so much emotion and meaning beyond the words they say, through facial expressions, gestures, vocal intonations.
But online, we're relegated to texts, emojis and one-dimensional expressions of how we feel. All of the richness of our communication disappears in cyberspace, making it more difficult to meaningfully and empathetically connect with one another.
That issue is even more pressing now that we've all been separated from each other by social distancing. We're leaning on technology to stay in touch with loved ones, to work and learn remotely, and everything in between. But we all know it doesn't feel the same. This raises the question: what can we do to preserve our humanity and our emotions in a digital world?
I believe that we can harness the power of artificial intelligence (AI) to redesign technology not to mask these elements of what makes us human, but to emphasise them. I've staked my career on this idea, and believe that AI with emotional intelligence, or Emotion AI, will become ingrained in the fabric of the devices we use every day.
There's a rapidly evolving market around Emotion AI: software that can detect nuanced human emotions and complex cognitive states from people's facial and vocal expressions.
As with any new technology, we need to be thoughtful about how Emotion AI is developed, and where it is deployed, given the highly personal and private nature of people's emotions. But I fundamentally believe that this technology has the power to amplify our empathy, online and offline, if applied mindfully and deliberately.
So, what would an emotionally intelligent digital world look like?
One area I've been thinking a lot about recently is online conferences and virtual meetings. Like everyone else, I have spent more than my fair share of hours on Zoom over the past few months, while my company, Affectiva, has been working remotely and many of the conferences where I was scheduled to speak have shifted to virtual.
When I'm leading a team meeting or presenting a keynote in person, I'm able to take the pulse of the audience based on the energy in the room.
I can riff off signs of interest or excitement if I see people's faces light up at an idea; or I can change course if I sense that people are becoming bored or zoning out. But presenting online is like being in a vacuum. I have no idea how people are reacting to what I'm saying, or if they're even paying attention.
If online video platforms were built with Emotion AI, we could restore some of the energy and emotion that's lost, and make meetings and conferences much more engaging as a result. If participants were willing to turn on their device's camera, and opted in to using Emotion AI, the technology could decipher people's emotional expressions and aggregate that data in real time.
Picture an emotion newsfeed or graph that skyrocketed when attendees were excited or smiling, and tapered off when they became bored or disengaged. Not only would that insight help me as a presenter, it could also give attendees a sense of the energy in the (virtual) meeting room, helping to restore the camaraderie we feel at in-person events or meetings.
The applications of Emotion AI aren't limited to our work lives, though. Equally if not more exciting is its potential to bolster our interpersonal communication, and help us interact with each other more meaningfully. That drove one of the very first applications of Emotion AI that I explored: building an Emotion AI application for autism.
People on the autism spectrum often struggle with recognising and responding to non-verbal communication and emotions. But Emotion AI could be used as a tool to help people learn to navigate challenging social and emotional situations. Picture smart glasses with Emotion AI built in, which could give the wearer insight into the emotions of the people they're interacting with.
Companies are already turning this idea into reality. For example, a company called Brain Power is developing the world's first augmented smart glass system (powered by Emotion AI) for kids and students on the autism spectrum.
The results are powerful: parents recount stories of being able to connect with their kids on an emotional level that was previously unimaginable, and the impact that has is incredibly moving.
Another exciting application is in the automotive industry, where Emotion AI can improve road safety. Each year, dangerous driving behaviour (such as drowsy and distracted driving) causes thousands of accidents and fatalities in the UK alone.
Automakers are turning to Emotion AI to address these issues, by developing in-vehicle systems that detect signs of distracted or drowsy driving. The idea is that if a vehicle recognised its driver starting to doze off or becoming distracted, the system could send an alert reminding them to keep their eyes on the road, or to pull over if the behaviour reaches a dangerous point.
Beyond making our roads safer, Emotion AI could also improve the transportation experience. Automakers are exploring in-cabin sensing systems that can understand the state of the driver, the cabin, and the occupants in it, in order to optimise and personalise the ride.
For example, after a long day at work, your car might be able to sense if you're tired or stressed. Perhaps it could play soothing music, or recommend stopping for take-out from your favourite restaurant on the way home.
When I think about the future with Emotion AI, I envision a world where the de facto human-machine interface more closely mirrors the way humans interact: through conversation, perception and empathy.
I believe this technology will be built into the fabric of the devices we use each day, from our phones to our cars, our smart fridges to our smart speakers, in turn making our lives safer, healthier and more productive, while making our interactions with others more meaningful and empathetic.
Still, we cannot ignore the potential risks of AI. We need to be deliberate in recognising and mitigating them. Human-centric AI systems, including Emotion AI, deal with data that is highly personal, and we may not (and should not) always be okay with sharing it.
For example, I feel strongly that AI should not be deployed for use cases like security or surveillance, where there is no opportunity for people to opt in and consent. Beyond considering how AI is deployed, it's critical for the tech industry to examine how AI is developed.
The risk of bias in AI is a significant concern: if AI systems are not built with diverse data and by diverse teams, they will fail to recognise all people, and stand to disenfranchise those they interact with.
It's up to tech leaders to come together and establish standards for AI ethics, and to advocate for thoughtful regulation. The industry still has a way to go, but this movement is critical in order to realise the positive potential of this technology.
At the end of the day, we need to put the human before the artificial, both in arming our technology with emotional intelligence and in determining the role it will play in our world.
Girl Decoded by Rana el Kaliouby is out now (£20, Penguin Business).