On being human
In this issue of Think:Act magazine we examine in detail what it means to be human in our complex and fast-changing world, now and in the days to come.
by Nicola Davison
As AI and robotics make huge leaps in progress, the question arises as to whether empathy and emotions can be engineered and replicated, or whether they are irreplaceable, valuable qualities unique to us.
One afternoon at IBM's San Francisco offices this June, audience members took their seats to witness a tradition that has been a hallmark of civilized society since the time of Socrates – a live debate. The first topic up for discussion: "We should subsidize space exploration." On the human side were Dan Zafrir and Noa Ovadia, the 2016 Israeli national debate champion. Their opponent, a 182-cm-tall box housing an artificial intelligence (AI) system named Project Debater, stood on stage. Both sides were tasked with delivering a four-minute opening statement on the topic – which neither side knew in advance – followed by a four-minute rebuttal and a two-minute summary.
To form a cogent rebuttal, the AI would have to "listen" to its opponent's argument. Its responses would have to follow unscripted reasoning, proposing lines of argument with which people would agree; the winner would be judged not by an objective score but by a subjective poll, so the AI would need to cajole. After closing statements, the organizers canvassed the audience. Project Debater was judged to have held its own, citing sources and even cracking jokes. It also performed well in a second debate about telemedicine – well enough to demonstrate that computers may one day be capable of something like human decision-making. Yet as we marvel at cutting-edge advances, less ostentatious forms of AI are creeping into our everyday lives. The International Federation of Robotics forecasts that 1.7 million new robots will be installed in factories worldwide by 2020. In April, the US Food and Drug Administration permitted IDx to market an AI-powered ophthalmic diagnostic device. But will patients be content with a machine in something as personal as medical care?
One line of thinking is that as robots and AI become more common in the workplace, uniquely human skills and qualities will be appreciated afresh. It is a view held by Lauren Elmore, president of Firmatek, a mapping and measurement company specializing in drones and data. "I do think that [ ... ] the 'human element' will become a differentiator," she says. "I rarely think about how great a chatbot served my needs."
Over recent years, Firmatek has expanded its person-to-person engagement – what Elmore calls the company's "human element" – as a distinct part of its business strategy. Minor administrative questions are dealt with over email. Otherwise, if a client has what Elmore calls a "why" question, a Firmatek specialist will schedule a webinar. She also encourages communication by phone. "We've found that it is often more productive to ease a client's fears or answer their questions by picking up the phone and calling them," she says. "They also feel valued when we talk to them, as opposed to just getting an email through our system."
It is true that automated responses often feel shallow, but what if it were possible for a bot to faithfully mimic a human, if only in set circumstances? A growing number of tech developers and companies believe that AI systems will be able to achieve their full potential only if they become more humanlike. They are working to imbue machines with "artificial emotional intelligence." A leader in the field of emotion AI is Affectiva, founded by Rosalind Picard and Rana el Kaliouby at MIT's Media Lab. The pair's first project, aimed at autistic people, was a wearable device that scans faces and interprets social cues for the person wearing it. Among Affectiva's first clients were advertisers who wanted to use the technology to measure campaign impact. "We're now surrounded by hyper-connected smart devices that are autonomous, conversational and relational, but they're completely devoid of any ability to tell how annoyed or happy or depressed we are," el Kaliouby wrote in MIT Technology Review. "And that's a problem."
Among the first "emotionally intelligent" AIs to enter the market is Ava, a chatbot that uses facial and voice recognition software to detect emotions such as joy, sadness and frustration and "react" accordingly on screen. Developed by Auckland-based Soul Machines for Autodesk, Ava (Autodesk Virtual Assistant) is designed to handle customer service enquiries and carries out about 100,000 conversations a month. Ava looks human, her physiognomy created from the facial scans of an actor. But her purple irises clearly mark her out as different – a detail added to avoid what academics call the "uncanny valley," the creepy feeling that arises when a robot looks almost human, but not quite. To set the scene before an interaction, Ava announces that she (it?) is not like you. "I am a virtual assistant," she declares. According to Soul Machines, bots need to communicate a human touch, but not one that is too human. But is that approach right?
In human-AI interaction, there is a threshold at which a person will stop talking to a computer and start responding as if to a fellow being. Dylan Glas, a senior robotics software architect at Futurewei Technologies in Silicon Valley, observed this when he was working on one of the most advanced bots on the planet: Erica. Created in 2014 by Japanese researchers, the Erato Intelligent Conversational Android (Erica) is the world's first autonomous conversational android – that is, she can carry on an unscripted conversation, much like a human. The trigger that makes people interact with Erica as if she were conscious, says Glas, is the moment she does something that makes them feel she perceives or understands them. "Sometimes this happens if they ask something obscure and Erica responds appropriately, or if they interrupt her while she is talking and she immediately stops talking and responds to them," he says.
Working to make a robot as much like a person as possible has led Glas to consider what is unique to humans. Though companies such as Affectiva have made advances in emotion recognition, humans are capable of an array of complex emotional states. For instance, there is a gulf of feeling between anger and jealousy, but also some overlap, and AI has a long way to go before it can distinguish between such nuanced states.
Perhaps the largest barrier to creating a truly humanlike bot is the problem of imparting empathy. "Detecting emotion is only the first step," Glas says. "In order to really achieve 'empathy' the robot needs to understand why those feelings are there and this can vary according to cultural norms." True empathy, he adds, requires the robot to put itself in the human's shoes, to understand social context and to have "commonsense" background knowledge. "On this level it becomes less of a computer vision problem and more a problem of dialogue understanding and theory of mind."
Even if bots one day become able to seamlessly replicate human emotion and capture nuance, that does not mean we will mistake them for human. Our ability to spot artifice is strong. What might be possible is for us to suspend disbelief. Glas likens it to talking with somebody from a different culture: If someone speaks a different language and has different customs, it can be difficult to communicate. "But eventually, as you get to understand and accept each other's cultures, it becomes possible to forget sometimes that you have a different background," he says. "I believe the same thing will happen with robots in the future. Maybe you won't actually forget that it's a robot, but it will become a non-issue."