AI and you

Think:Act Magazine "It’s time to rethink AI"

May 15, 2024

Your personal life and development are set to evolve


by Steffan Heuer
Artworks by Carsten Gueth

Chatbots are a substitute for real conversations – but for how long? In this piece, experts explore how AI will redefine the meaning of human relationships.

Complex grammar is what sets human language apart from animal communication, yet advances in AI systems keep blurring the line between what it means to be a real person interacting with another human and communicating with a piece of software that passes itself off as a human interlocutor, listening and querying us, nudging us and, at times, even taunting or comforting us.

No one is more aware of the potential and problems that this machine capability creates than Eugenia Kuyda. A journalist by training, she started dabbling in chatbots around 2012 in Silicon Valley, initially to improve restaurant recommendations. What put her on the map long before ChatGPT was the launch of Replika in 2017, a chatbot she had initially created to remember and recreate conversations with a friend who had died in a car accident.

Eugenia Kuyda, founder and CEO of Replika.

To this day, conversing with somebody who's not really there is at the core of the app. Personalized virtual friends engage with users to fill the various voids they experience in their lives. "As someone who has spent most of my life with words, I kept coming back to the question: why do we talk all day long?" Kuyda recounts. "We need conversations to emotionally feel better, to feel connected. It's a fundamental need to survive, thrive, grow." For her, the potential of AI tools in the social realm hinges on whether developers can create, or re-create, such life-sustaining conversations with software – and to what extent these powers should be unleashed upon hundreds of millions of people who feel alone, isolated or simply misunderstood by their "fleshie" contemporaries, as the tech world sometimes derides two-legged carbon life-forms.

"Most companies focus on the practical conversations that are just an interface to solving a problem. That has monetary, but not human, value."

Eugenia Kuyda

Founder and CEO
Replika

Probing the nuanced depths of human conversations that involve emotions, anxieties and sometimes subterfuge is a long way from building chatbots that make life's little mundane tasks easier, such as booking a table or getting customer support for a finicky gadget. In Kuyda's view, that's the low-hanging fruit of her industry. "There's no value in conversations to get things done, the valuable ones usually don't have a practical goal. We just start chatting about life," she explains. "Unfortunately, most companies focus on the practical conversations that are just an interface to solving a problem. That has monetary, but not human, value."

Chatbots that mimic human verbal faculties are nothing new. The most famous one is ELIZA, created in 1966 by computer scientist Joseph Weizenbaum at MIT. Named after the character in the play Pygmalion, the program surprised users with its warm banter, with one script called DOCTOR spitting out the type of open-ended questions psychotherapists use. As Weizenbaum discovered, it was not only his secretary who was fooled by ELIZA. He soon regretted his foray into this uncharted territory, writing in his 1976 book Computer Power and Human Reason: "I had not realized that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people."

Eugenia Kuyda

Eugenia Kuyda is a Moscow native who studied journalism in Milan and then graduated from the Moscow State Institute of International Relations and the London Business School before venturing into technology. Her company Luka currently runs three chatbot services: Replika, Blush and Tomo.

Weizenbaum's key insight was that there should be inherent limits to this charade, which back then was comparatively low-tech: "However intelligent machines may be made to be, there are some acts of thought that ought to be attempted only by humans." The demarcation point for Weizenbaum lay with the inner development and introspective capabilities of humans, something that programs lack. "What could it mean," he mused almost half a century ago, "to speak of risk, courage, trust, endurance, and overcoming when one speaks of machines?"

Many things have changed since Weizenbaum's warnings, of course. Computer scientists are trying to agree on the qualities to measure a machine's consciousness, while linguists have pointed out that today's large language models are nothing but "stochastic parrots," stringing words together in a probabilistic fashion without understanding them or the larger context. But the world is without a doubt beholden to portable supercomputers and apps designed for maximum addiction, as well as rattled by a mental health crisis that existing care models, i.e. humans alone, cannot address.

To AI pioneers like Kuyda, that mismatch between the demand for and supply of human connection presents a twofold opportunity. If 86% of the world's population owns a smartphone and chatbots can be employed to hear us out, encourage or flirt with us, she argues, why shouldn't they? "We are in a loneliness pandemic. Talking to a therapist is sort of a contract and it's not that much different from talking to a machine. Once that machine becomes smart and good enough, and maybe immersive enough of an experience, that's enough to re-create that conversation," she says.

Beyond Replika, which counts more than two million users a month, she has recently branched out into dating with the Blush dating simulator app, and in January 2024 launched Tomo, a wellness and meditation app with an AI-generated avatar guide. Yet she sees a limit to what those simulated conversations ought to do: They should augment, not substitute, real human interaction and companionship. "It has a lot to do with how you design the product or the business. It could go in both directions," she admits. "Right now, we are just helping people open up and build a little bit of self-esteem to eventually make those connections with real humans."

But she is aware that the end game might be something altogether different: the emergence of chatbots that are deeply embedded into our lives, from wearables to experiences in virtual reality. Programs that can access our calendars, chats, emails and other datasets to strike up a conversation out of the blue while you're driving: "You haven't contacted your three closest friends in weeks and didn't respond to your spouse's email today. Let's talk about your life." Having such ambient assistants would erect a mirage of empathy and convenience available 24/7 that many companies would love to sell – and probably many users would love to have, possibly removing them even further from human interaction. That's why companies from Snapchat and Alphabet to Meta, the owner of Facebook, are working on programs that can play life coach or romantic foil.

That raises the question of whether we want minors to grow up with synthetic companions. Pointing to her own two young children, Kuyda says no, for now. "There's potentially tremendous value in this tech for kids and teens to have a confidant, but we first need to figure it out with adults," she explains. "We shouldn't be experimenting and need to do this in steps, having nuanced conversations about how to build it in a safe way." Current technology has only scratched the surface. It's entirely possible to arm a chatbot with the capacity to decode people's brain signals, making the abuse of deeply personal data a chilling prospect. That's why UNESCO took the unusual step of warning that rapid advances in brain implants and scans combined with AI pose a threat to "mental privacy."

The way social media has already established a deep hold on human interaction, however, makes it look likely that personalized AI might be another, additional way of locking us into a condition of what renowned MIT sociologist Sherry Turkle summarized as being "alone together." It is a profound change that will force everyone to ponder what it means to build relations with systems to which some experts in the field already ascribe signs of consciousness.

This article is part of a cover story on the AI impact on the boardroom, the workplace, society and you.
About the author
Steffan Heuer has been covering the intersection of technology, commerce and culture in Silicon Valley for more than two decades. His work has appeared in The Economist, the MIT Technology Review and the German business monthly brand eins. He currently divides his time and reporting between Berlin and California.
Think:Act Edition

It's time to rethink AI


As our lives and work are increasingly being reshaped by AI, the new edition of Think:Act Magazine explores what this means for C-level managers.

Published May 2024.