The new fight against digital obsession

July 2, 2018

Technology companies are waking up to the social impact of their omnipresent devices

by Steffan Heuer
illustrations by Filippo Fontana

Tech companies are being taken to task for "digital addiction" by a growing coalition of technologists, shareholders and academics. Are the pioneers of Silicon Valley at last waking up to the downsides of their device-driven dreams?

Early March 2018: It was an unusual sight outside Apple's shiny new headquarters in Cupertino, California. Four Stanford University computer science students, each with an internship at a renowned tech company on their resume, waved picket signs and handed out flyers. The group's name says it all: Stanford Students Against Addictive Devices (SSAAD). "We want to call attention to an increasingly critical problem," explains the group's co-founder Sanjay Kannan. "Device makers, and in particular app developers, have a responsibility to address the issue of addictive technology. We chose Apple because they are a trendsetter that everyone's watching. And they are in the best position to do something about it."

Illustration: A man holding a smartphone plays a slot machine bearing Twitter, Facebook and Snapchat logos.

The student protest in the heart of Silicon Valley was one more example of the growing and, more importantly, homegrown backlash against the tyranny of always-on devices, apps, services and platforms such as Facebook. SSAAD may seem inconsequential compared to the raging scandal over unauthorized access to the data of the platform's two billion users and the controversy surrounding Russian meddling, yet it speaks to a larger point that goes far beyond Facebook. What exactly does technology do to humans? And what can – and should – the people who design those technologies do about it?

"We have all those wonderful technical tools and somehow used them to create this large weapon."

Aza Raskin
Co-founder, Center for Humane Technology

Silicon Valley is finally having second thoughts about its greatest and most successful inventions as evidence mounts that they wreak havoc on individuals, families and society. As a result, the tech industry is being forced to grow a conscience and address issues of safety, self-regulation and even potential government intervention. While the term "digital addiction" may not yet have entered the Diagnostic and Statistical Manual of Mental Disorders, the psychiatrist's handbook, it looks increasingly as if spending your waking hours on apps, games and platforms designed for maximum engagement takes a measurable toll. "There is emerging research that kids and adults feel addicted to their devices. We still avoid the phrase addiction from a diagnostic point of view, but we can definitely say many people have feelings of addiction," says Colby Zintl, vice president of external affairs at Common Sense Media, an advocacy group based in San Francisco that researches and promotes healthy media use for families as well as educators. Her organization has now partnered with the Center for Humane Technology (CHT), a group of rebellious techies launched in early 2018 that encapsulates the growing remorse and guilt among Silicon Valley elites about the (often unintended) consequences of their urge to "make the world a better place" and score a lucrative exit.

Tech giants should provide more options to consumers

When it comes to calling out the dangerous downsides of technology, CHT is at the forefront. The group was launched by former Google and Facebook programmers and supported by a Who's Who of the industry, among them early Facebook investors and one of the programmers responsible for developing the blue "Like" button back in 2007. "Technology is hijacking our minds and society," declares the group's website, listing studies and anecdotal evidence about how smartphones and apps hurt humans psychologically and physiologically. "We have all those wonderful technical tools and somehow used them to create this large weapon. We let whoever bids the most money point this gun at our heads," says Aza Raskin, one of the group's co-founders.

While some might have concerns about the application and use of the data, others fear what tech is doing to the way we think. Tony Fadell, one of the creators of the iPod and iPhone, ripped into his peers with a tirade of tweets: "Apple Watches, Google Phones, Facebook, Twitter – they've gotten so good at getting us to go for another click, another dopamine hit," a reference to the neurotransmitter commonly associated with addiction and pleasure. Roger McNamee, an early investor in Amazon, Facebook and Google, demanded: "For the sake of restoring balance to our lives and hope to our politics, it is time to disrupt the disrupters." Former Facebook executive Chamath Palihapitiya has expressed "tremendous guilt" for his part in creating "tools that are ripping apart the social fabric of how society works." And even financier George Soros weighed in at this year's World Economic Forum, saying that tech companies "deliberately engineer addiction to the services they provide."

In a 2018 Pew Research Center survey, 40% of Americans who use social media said it would be "hard to give up." Among 18- to 24-year-olds, the number climbed to 51%.

It's a feeling of responsibility that tech insiders kept bottled up for far too long, admits CHT co-founder Raskin. "If you live in Silicon Valley, you get up every day and go to work, thinking you'll make the world a better place. My entire career was about drinking that Kool-Aid." A key turning point, by many accounts, was a presentation that then Google design ethicist Tristan Harris gave inside the search giant back in 2014. He lamented the vast influence Google wields over two billion humans, steering them like an ant colony. His presentation became one of the most internally requested topics at Google, Raskin recalls, adding: "Until the election of 2016 it was easy to paper over this problem." It took the revelations about Russian meddling to give those second thoughts a unifying theme and urgency.

At Google's most recent developer conference, CEO Sundar Pichai admitted something's amiss. "We want to help you understand your habits and focus on what matters," he declared, all the while introducing plenty of new features that intrude even more deeply into the private lives of the tech giant's users. Artificial intelligence features such as a voice assistant called Duplex that masquerades as a human created particular concern among many technologists, even if Pichai proclaimed Google had "a deep responsibility to get this right."

Big tech companies like Google are aware that Wall Street has begun to take notice, too, drawing a connection between digital dependence and future revenues, profits and valuations. In January, New York-based hedge fund Jana Partners joined the California State Teachers' Retirement System in sending an open letter to Apple calling for changes to its technology. "We believe there is a clear need for Apple to offer parents more choices and tools to help them ensure that young consumers are using your products in an optimal manner," the letter said, citing research findings and media reports.

Long-term well-being is not about instant gratification
Technologist John C. Havens is the executive director of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems.

He answered a few questions for Think:Act on the importance of machine ethics. The statements he makes here do not necessarily reflect the opinions of the IEEE.

Mr. Havens, is it necessary in your opinion that we start building ethical principles into hardware and software?

Being more "connected" to devices or screens doesn't inherently increase short- or long-term happiness. That's why we need applied ethics, or values-based design, to help the people creating artificial intelligence (AI) and machines delve deeper into both end users' values and the new issues arising from these technologies. AI directly affects human agency, identity and emotion in ways many other technologies don't. That doesn't mean AI is "bad" or "evil," but that engineers, programmers and practitioners need new levels of due diligence to ensure they can avoid negative, unintended consequences.

Do algorithms need to adhere to human values? Can they align with both our well-being and need for gratification?

The baseline is human values, but put into the context of your regional and cultural values. We also need an analysis of what can increase people's long-term well-being or flourishing. And that's not about instant gratification.

You head the Global Initiative on Ethics of Autonomous and Intelligent Systems of the IEEE, the world's largest association of engineers. What has the project accomplished so far?

The second version of "Ethically Aligned Design" (EADv2) is currently available on the IEEE website. We address these topics in 13 different sections created by over 250 global experts, and developers can already use EADv2 today. EAD has inspired the creation of 14 approved standards working groups which are currently open for anyone to join, for free. Just like the IEEE created the Wi-Fi standard, these standards will help guide companies and developers to implement values-driven design.

Protecting democracy requires giving up some commercial revenue

What exactly does research have to say about the impact of digital devices and services on our well-being? To begin with, devices bring strife and tension to kids, teens and families. Almost half of all children polled by Common Sense Media say they feel addicted to their phones, and three in four families say that devices have caused discord at home. Smartphones and tablets have a significant, negative impact on sleep duration and quality among school-age children and adolescents, according to a 2015 summary review of 67 academic studies on the topic. While the jury is still out as to whether being on Facebook or Snapchat can really be compared to cocaine or battling a gambling disorder, studies have documented brain activity that looks suspiciously similar to other addictions when it comes to how subjects are able – or rather, unable – to control their impulses and inhibitions. Psychologist Jean Twenge at San Diego State University has drawn an even darker conclusion, making a connection between new media screen time and adolescent depression and suicide rates, especially among girls. US teens spend an average of seven hours a day in front of some kind of screen, which comes at the expense of interacting with other humans.

7 hours is the average amount of time that American teenagers spend every day in front of some kind of screen.

Critics argue that the current wave of remorse needs to be transformed into action on three fronts. For starters, consumers should take some easy steps to blunt the worst effects of digital addiction. It will be more difficult to pull the second lever, which can be called "coding with a conscience." Companies have a moral obligation and also a business incentive, the argument goes, to self-regulate and think from the outset about how to make their products safer and less addictive. "There are some people who are still in denial [about] how their wealth is created and how their products tear apart society and democracy, but we get more and more thank you notes every day," says CHT co-founder Raskin, whose late father Jef initiated the Macintosh project at Apple in 1979 and who therefore has a specific frame of reference for what good design should focus on. For him, human-scale design is a big opportunity that should be taught in schools and colleges so it can permeate the developer community. He sums up the question as: "How do you make great products that don't cater to our impulses but appeal to the higher stack of human qualities?"

"Silicon Valley's business model is surveillance capitalism: Give products away for free and turn us into the products."

Andrew Keen
Author, "How to Fix the Future: Staying Human in the Digital Age"

The third point is regulation. "The fact is tech companies have so far gotten a free pass and are completely unregulated. It's more heavy lifting than the other steps, but we're suggesting to add a seatbelt or an airbag to Facebook," says Common Sense's Zintl, drawing a parallel to mandated automobile safety features that have long since become accepted standards. The systemic changes critics propose, however, are much deeper and wider than adding the equivalent of an airbag. If the root of the problem lies in the fact that the business model of Google or Facebook is built around tracking and manipulating user behavior and turning it into advertising dollars, regulation would have to force them to change how they operate – and how they make money. "When the British Empire gave up slavery, it gave up 2% of its GDP for 60 years, but it was the morally right thing to do," argues Raskin. "If companies want to do something that protects democracy, they need to give up some revenue. Or why not hold them responsible for the content they promote? That would hugely change the incentives."

Andrew Keen, a razor-tongued tech observer and author of four books on the tech industry's failed promises – the latest one being "How to Fix the Future: Staying Human in the Digital Age" – says contrition is not enough. "Silicon Valley's business model is surveillance capitalism: Give products away for free and turn us into the products."

Illustration: A man obsessed with technology and the internet holds a bottle of pills bearing the Facebook logo.

People must claim their rights within the digital revolution

Citizens need to fight back and impose human values on technology with common sense tools like civic engagement, education and, yes, regulation, argues Keen. "If it's true that we live in a system of surveillance capitalism, changes cannot come from the five big platforms because we have a winner-takes-all economy that stifles innovation and competition," Keen warns.

Taken together, the long overdue public soul-searching among Silicon Valley's best and brightest may well bring change on three fronts. First, a blend of heightened awareness and wariness among users about what they're getting into by tapping that "I agree" button. Second, governmental and self-imposed regulation of the dominant players, nudged along by frameworks such as the EU's General Data Protection Regulation that took effect in May. And third, the dark horse of emerging competition – and eventual economic success – from entirely new companies driven by a different set of values, among them design that doesn't merely look at the interaction between a machine and an individual, but at society as a whole.
