July 29, 2019

“We must stop tech platforms from distracting us by design”

James Williams, the former Google employee and author of Stand Out of Our Light, says we need to win back control of our attention from big tech

Have you ever stayed up late at night mindlessly scrolling through your Facebook feed? Or aimlessly watching one YouTube video after another? Or binge-watching Netflix? Or sending messages to friends on WhatsApp? Or using Twitter? Or Snapchat? Or Reddit? Or Instagram?

If you’re reading this article, I’m willing to bet that your answer to (at least) one of these questions is “yes”. And that you’ve probably done it more than once.

Here’s another question: How big a problem do you think this kind of behaviour poses to you, and – more broadly – human society?


James Williams, for one, thinks it poses a big problem – perhaps even the biggest.

As he writes in his new book, Stand Out of Our Light: Freedom and Resistance in the Attention Economy, “liberating human attention” from the iron grasp of modern tech platforms “may be the defining moral and political struggle of our time”.

A former Google employee, James quit the company for academia several years ago after realising that the goals of the vast majority of tech platforms didn’t align with, and sometimes were even directly antagonistic to, his own.

“I realised that ‘success’ for nearly all tech platforms meant things like maximising the amount of time you spend with their product, keeping you scrolling as much as possible, or showing you as many ads as they can,” James explains. “These are not my goals – and, I’m willing to bet, they’re not yours either. I mean, nobody wakes up in the morning and asks, ‘How much time can I possibly spend using social media today?’”

Shortly after leaving Google, James and his fellow former Google employee and friend Tristan Harris founded the Time Well Spent campaign, an initiative that served as a stepping stone to more recent work that Tristan and others have been doing under the aegis of The Center for Humane Technology (CHT), a nonprofit organisation whose aim is to “reverse the human downgrading” caused by much of modern technology “by inspiring a new race to the top, and realigning technology with humanity”.

“Today’s tech platforms are caught in a race to the bottom of the brain stem to extract human attention,” proclaims the CHT website. “It’s a race we’re all losing.”

Many of the consequences of this race will be familiar: shortening attention spans, political polarisation, outrage, social isolation, and – perhaps most harmfully – addiction.

“We need to move towards a more humane technology,” the CHT website continues. “One that embraces rather than exploits human behaviour.”

The CHT’s impact since its founding has been substantial: Tristan has made several appearances on popular American TV shows including 60 Minutes, Real Time with Bill Maher, and Tucker Carlson Tonight. Furthermore, the CHT’s work has been discussed by several major print media outlets including The Wall Street Journal, The Atlantic, and The New York Times, while the organisation has also professionally advised numerous political leaders, academic institutions, and tech executives.

James, on the other hand, has opted for a more academic route over the last few years – he recently completed a PhD in technology ethics at Oxford, upon which much of his recent book is based – although he has also made several appearances on major media, including BBC’s Radio 4, CNBC, and Al Jazeera’s Arabic channel.

Sifted caught up with James a few days ago in Oxford to chat some more about his new book, the importance of human attention, and the present and future impact of modern technology on human society.

The title of your new book mentions the “attention economy”. Can you say a little bit more about what that is?

In the 1970s, Herbert Simon pointed out that when information becomes abundant, attention becomes a scarce resource. Thanks to the stupendous recent advances in modern technology, which have made a wealth of information accessible to almost anyone around the world at the mere click of a button, attention is now a scarce resource. And because of that, it is the object of fierce competition among most of the technologies we use every day. This total environment of competition for, and attempts to manipulate, our attention is what’s known as the “attention economy”.

Why do you say in your book that “liberating human attention” from these forces “may be the defining moral and political struggle of our time”?

Attention is a first-order problem. If we can’t give the right kind of attention to the many enormous challenges facing our world, then we can’t meaningfully address them. More broadly, attention is the most precious resource that we have: it’s a finite one, and essential for human flourishing. There’s also a long tradition across human cultures of the idea that what we give attention to is, in a very real sense, what we are. If that’s true, then what’s being manipulated here is the very substance of our lives.

Who is ultimately to blame for this mass manipulation of our attention?

I think that’s the wrong question. The problem is structural: it is one of misaligned incentives between those who design and ultimately control our technologies and these technologies’ users – us. Tech platforms make their money by selling ads: the more time you’re glued to your screen, the more ads they can sell. But such misaligned incentives can also occur in states where the government controls access to the information and media landscape, for instance in China. In other words, governments, as well as private companies, may have their incentives misaligned with us, the people.

How do we fix the problem? Should I just get rid of my phone, tablet, or computer?

Ultimately, my hope – and Tristan’s too – is that you don’t need to do that. As Aristotle said, “It is disgraceful to be unable to use our good things”. We believe in changing the incentive structure surrounding these technologies, as opposed to just completely getting rid of them. Ultimately, what we need to do is hold tech platforms to a higher standard. They need to be equipped with better design principles, better metrics, better business models – in short, better incentives, more aligned with our own.

More information about the Center for Humane Technology can be found on its website. James’ book – which Cambridge University Press has made available via open access – can be found here. For more, listen to Sifted cofounder John Thornhill interview James for the Financial Times.