Thinking in decades
The latest edition of Think:Act Magazine explores enduring business themes from past decades through a new lens of pertinence for the next 20 years.
by Grace Browne
Photos by Roderick Aichinger
Overflowing with apps and prompts to interact, smartphones promised us the world in exchange for a place in our pockets. Now there's a call to reconsider whether this symbiotic relationship is truly beneficial, or parasitic instead.
The earliest cellular phone, first released in 1983, was Motorola's DynaTAC. Better known as "The Brick," it weighed in at over a kilogram and cost around $10,000 in today's money. As the technology grew more sophisticated – and shrank in size – these phones exploded in popularity. IBM is credited with releasing the first-ever "smartphone" in 1994, called the Simon; it featured a touchscreen, fax-sending capabilities and email and calendar functions, but its success suffered due to a short battery life and it lasted just six months on the shelves.
The first truly successful smartphone was the BlackBerry, launched in 1999. It became synonymous with high-status white-collar workers who could stay connected to the office whenever they liked; a status symbol for the uber-busy, and ergo, the uber-successful. But there were rumblings that all was not well with this increasingly blurry line between home and office. A term arose to describe the growing obsession its owners had with their nifty device: the CrackBerry, which became Webster's New World College Dictionary's 2006 word of the year.
In 2007, The Guardian columnist Jonathan Freedland wrote about the frustration he had with his obsessive relationship with his BlackBerry. "I became transfixed by the palm-sized device, my eye returning constantly to its top right corner, to see if the red light was winking. If it was, the curiosity was unbearable. Sure, you knew it was bound to be spam or an office round-robin, but what if it was something else, something exciting," he wrote.
If the BlackBerry was considered addictive, it seems nobody was ready for what was to come next. In 2007, the first iPhone was revealed, marketed by Steve Jobs as a combination of the iPod, mobile phone and a breakthrough internet communicator. By the end of 2013, one in five people across the globe had their own smartphone.
Kevin Roose, the technology columnist for The New York Times, described our relationship with phones as less an addiction than a "species-level environmental shock." "We might someday evolve the correct biological hardware to live in harmony with portable supercomputers that satisfy our every need and connect us to infinite amounts of stimulation," he wrote in 2019. "But for most of us, it hasn't happened yet."
As concerns about the compulsive nature of our relationship with these devices solidified, other worries arose. How were they affecting our work, our relationships, our health – or our brains? It was long thought that once our brains reached adulthood, they remained fixed that way for the rest of our lives. However, research in the 20th century turned that belief on its head with evidence that our brains continue to change and adapt as we grow older. That also means our brains remain amenable to outside forces that may tweak and influence our neural processes.
The term "addiction" is often bandied about when it comes to these devices in our hands, although whether it falls under the strictly clinical definition of the word typically reserved for addictive substances such as alcohol and opioids is up for debate. Nonetheless, a 2022 meta-analysis that looked at 82 studies across 150,000 participants concluded that up to a quarter of the general population could be suffering from "smartphone addiction." Indeed, Anna Lembke, a world-leading expert on addiction at Stanford University, has called the smartphone the "modern-day hypodermic needle." Every like, message or TikTok video gives us a boost of dopamine, the neurotransmitter at the heart of the brain's reward system, which in turn reinforces the need in our brains to spend more time on these devices, to scratch that itch.
This endless cycle, experts have argued, also has neurological implications. For one, our collective attention spans may be splintering into pieces. We're less able to deeply immerse ourselves in a piece of work or a good book, because the urge to check our phones is too great. This has been dubbed the "brain drain effect": the theory maintains that even the mere presence of a smartphone in your vicinity reduces your cognitive capacity, because your brain may be subconsciously working harder to ignore the impulse to check it.
It may also be wreaking havoc with our memory. The idea of "digital dementia," coined by German neuroscientist Manfred Spitzer, proposes that our constant reliance on and overuse of digital technology is causing issues with our short-term memory. And while smartphones were once a technology reserved for wealthy adults, access has become much more pervasive – and, crucially, is reaching a much younger audience. Today's kids are exposed to screens from a young age: A recent study by Common Sense Media found that around 40% of kids in America have a smartphone by age 10.
Due to the much more malleable nature of an adolescent brain, experts are growing increasingly concerned about how this exposure to phones is reshaping the brains of young people. The "great rewiring of childhood" is what it's called by Jonathan Haidt, a social psychologist and professor at NYU Stern School of Business. Haidt is the author of the new book The Anxious Generation: How the Great Rewiring of Childhood is Causing an Epidemic of Mental Illness, which instantly skyrocketed to the top of The New York Times bestseller list.
This early exposure to smartphones – along with restricted playtime – is fundamentally altering children's brains at a time when they're slap bang in the middle of developing, he argues. Because a kid's prefrontal cortex has not yet had time to mature, Haidt says, smartphone use disrupts the natural development of impulse control, fundamentally rejigging the brains of kids today.
All of this leads to Haidt's ultimate argument and the crux of the book: that this screen-filled adolescence is what's behind the youth mental health crisis that is currently raging. Haidt traces its inception to around the year 2012, when rates of depression, anxiety and self-harm in young people shot up. Between 2010 and 2015, suicide rates rose by 167% among 10- to 14-year-old girls, and by 92% among boys. These stark figures, he argues, can be attributed to the decline of a play-based childhood – and its replacement with a phone-based one.
The arguments in his book haven't escaped criticism, however. Haidt has been accused of cherry-picking small, underpowered studies to prove his points, and for mistaking correlation for causation. Critics argue that the youth mental health crisis is too complicated to attribute to a single cause, with an origin far more multifaceted in nature. "Mental health challenges are complex and there's no one cause," warns Heather Kirkorian, a developmental psychologist and professor at the University of Wisconsin, Madison.
Moral panics around technology are certainly not a new phenomenon. Strikingly similar waves of panic arose with the advent of radio in the 1940s, television in the mid-20th century, even the rise of the novel in the 18th century. The Greek philosopher Socrates worried that writing itself would destroy people's ability to commit things to memory.
What does set smartphones apart, according to Jenny Radesky, co-director of the American Academy of Pediatrics Center of Excellence on Social Media and Youth Mental Health, is just how quickly the technologies adapt and grow more sophisticated. "Part of the business model is to grow big, get to scale, be adopted as fast as possible, get a ton of users," she says. "And so I think we have less time to reach a homeostasis with technology where we feel like we've adapted to it, or the technology has adapted to us."
Kirsten Drotner, a historian of media panics around technology, forecasts that this moral panic cycle will continue ad infinitum with the invention of every new gadget or gizmo. It's what Amy Orben, a psychologist at the University of Cambridge, has dubbed the "Sisyphean cycle of technology panics." She writes: "With every new technology treated as completely separate from any technology that came before, psychological researchers routinely address the same questions; they roll their boulder up the hill, investing effort, time and money to understand their technology's implications, only for it to roll down again when a novel technology is introduced. Psychology is trapped in this cycle because the fabric of moral panics has become inherently interwoven with the needs of politics, society and our own scientific discipline."
The research is still in its infancy, yet the concerns about how smartphones are affecting our brains – as well as our children's – are unsettling enough to prompt a rethink of our relationships with these devices. So what can we do? In the 2010s, Tristan Harris was a product manager at Google when he became concerned about how the products he himself was working on were hijacking their users' attention. In 2013, he wrote a manifesto for his colleagues entitled A Call to Minimize Distraction & Respect Users' Attention. His screed, which argued that technology companies bear a moral responsibility for how they shape society – and that stealing its attention was not the way to do it – earned him a new job title: design ethicist.
He left Google in 2015 to forge his own mission: to reform the attention economy. He founded what he called the Time Well Spent movement, with the goal of stopping "tech companies from hijacking our minds." He became one of Big Tech's biggest critics; The Wall Street Journal called him "the conscience of Silicon Valley." Today, Time Well Spent has been rebranded as the Center for Humane Technology, whose mission is to reverse "human downgrading" – which he has described as the "social climate change of culture" – and realign technology with humanity.
After Harris first began calling out the industry, the companies started to take heed. Apple and Google began to roll out features that promoted "digital wellness." You could now track your screen time, set access limits, shush the incessant notifications and switch your phone to grayscale. These are all tweaks that Harris himself advocated for, but he also called them just baby steps in what will be a much longer journey.
Indeed, changes may have to come at a policy level to fight smartphone addiction. In the United Kingdom and the EU, governments have targeted the companies behind social media platforms and smartphones, passing regulations that penalize tech firms for failing to clamp down on illegal or harmful content such as disinformation and bullying, and for targeting children with advertising based on their personal data or cookies. Companies are also required to carry out risk assessments of their platforms' negative effects on children's mental health and present them to regulators.
When it comes to children, movements in the US tend to place the liability on parents to safeguard their own kids. "Unfortunately, it puts a lot of the onus of responsibility on individual parents to battle against multibillion-dollar companies," says Kirkorian. "There's a place for policy around things like privacy, things like manipulative design, where you're agreeing to things you don't know you're agreeing to." But she adds that "individual parents and kids should be armed with information about what might make them feel better, and what may make them feel worse."
The reality is that the genie can't be put back in the bottle: Ever since Apple first unveiled the iPhone in 2007, smartphones have shown no sign of going away. These devices in our pockets are now part of the fabric of modern life. But we can – and should – wrestle back control from the Big Tech companies feeding on our attention spans. For now, the jury is out on just how much damage they are wreaking on our brains. But it wouldn't hurt any of us to unplug from time to time.