Compendium 38 — Stolen Focus: Why You Can't Pay Attention
Technological distractions cause twice the drop in IQ caused by smoking marijuana and produce a level of cognitive impairment comparable to that of a drunk driver.
📖 Brief Overview
The increasing inability to focus is not due to a lack of self-discipline but is the result of a deliberate effort to manipulate human behavior and self-esteem.
In 2019, Tristan Harris, a technology ethicist, testified before the United States Senate, explaining that while each of us can aspire to self-control, a thousand engineers sit on the other side of the screen working against us.
To predict what would keep us hooked, the platforms crawled deeper down our brain stems, exploiting our need for social validation. They discovered that getting us addicted to seeking attention from others was more lucrative than merely capturing our attention.
As a result, 50% of teens now prefer a broken bone to a broken phone, while depressive symptoms among 10- to 14-year-old girls have risen 170%. Adults fare no better: studies show they remain on task for just three minutes on average and never work for more than 30 minutes straight, with each interruption costing 23 minutes and 15 seconds of refocusing time.
Further studies reveal that technological distractions cause twice the drop in IQ caused by smoking marijuana and produce a level of cognitive impairment comparable to that of a drunk driver.
The trillion-dollar social media empire has been shown to stoke anxiety and outrage in its users while exploiting their neurochemical reward pathways, altering 4.5 billion brains in the process.
We must remember the adage, 'If you don't pay for the product, you are the product,' while asking whether the benefits of these platforms are worth the cost of embracing unregulated technology designed for addiction.
💡 Six Significant Ideas
1. The Attention Epidemic
The rising inability to concentrate is not a problem of discipline. Each of us is responsible for our behavior, but we must contend with the force of a trillion-dollar empire.
Hari states this premise in the introduction to Stolen Focus:
"This is a systemic problem. The truth is that you are living in a system that is pouring acid on your attention every day, and then you are being told to blame yourself and to fiddle with your own habits while the world's attention burns."
An excerpt from Tristan Harris' testimony to the United States Senate best illustrates the issue:
"You can try having self-control, but there are a thousand engineers on the other side of the screen working against you."
Tristan Harris features as one of the experts in Stolen Focus, so here's a brief introduction. Harris is an American technology ethicist, called "the closest thing Silicon Valley has to a conscience". He didn't start out as a critic, though. While working at Google, Harris produced a 141-slide deck titled "A Call to Minimize Distraction & Respect Users' Attention", seen by tens of thousands of Google employees. After the presentation circulated, Google created the position of Design Ethicist for him. Harris left Google in 2015 and went on to co-found a nonprofit organization, the Center for Humane Technology, through which he now advocates for greater regulation of social media companies. More recently, Harris was featured in the Netflix documentary The Social Dilemma, seen by more than 38 million people in its first month.
The following is an excerpt from Harris' testimony to the United States Senate, titled 'Persuasive Technology and Optimizing for Engagement'. The testimony focused on the role of unregulated platforms in developing persuasive technology and the societal implications if left unchecked:
Harris: Because there is only so much attention, companies have to race to get more and more of it. I call it the race to the bottom of the brain stem. It starts with techniques like pull-to-refresh. Pulling to refresh your feed acts like a slot machine, having the same kind of addictive qualities that keep gamblers hooked in Vegas. Another example is removing stopping cues. If I take the bottom out of a glass and keep refilling the wine, you won't know when to stop drinking. This happens with infinitely scrolling feeds: we've removed the stopping cues, so people keep scrolling.
The race to get attention has to become more and more aggressive. We have to predict how to keep you hooked, so we crawl deeper down the brain stem into your social validation, which triggered the introduction of likes and followers. Instead of getting your attention, it was much cheaper to get you addicted to getting attention from other people. This created a mass narcissism movement and many of the cultural effects we're seeing today, primarily among young people. And after two decades in decline, depressive symptoms among ten- to fourteen-year-old girls have shot up 170% in the last eight years.
The problem with social media platforms lies within their infrastructure. To illustrate this, Harris uses the following analogy:
Let's say that private companies built nuclear power plants across the United States, and they began melting down one by one. These companies then told you it was your responsibility to have hazmat suits and build your own radiation kits. That's essentially what we're experiencing now with social media. The responsibility is being put on consumers when, if the problem lies in the infrastructure, it should be placed on the people building that infrastructure.
Harris' influence on Hari is evident throughout Stolen Focus. A common defense of social media is to blame users' willpower rather than the companies' practices. Hari flips this framing in Stolen Focus, as Harris has done for years in his campaigning.
What is Attention?
Before proceeding, let's clarify what attention is. Former Google strategist James Williams describes the three layers of attention to Hari:
"The first layer of your attention, he said, is your spotlight. This is when you focus on 'immediate actions,' like, 'I'm going to walk into the kitchen and make a coffee.' You want to find your glasses. You want to see what's in the fridge. You want to finish reading this chapter of my book. It's called the spotlight because—as I explained earlier—it involves narrowing down your focus. If your spotlight gets distracted or disrupted, you are prevented from carrying out near-term actions like these.
"The second layer of your attention is your starlight. This is, he says, the focus you can apply to your 'longer-term goals—projects over time.' You want to write a book. You want to set up a business. You want to be a good parent. It's called the starlight because when you feel lost, you look up to the stars, and you remember the direction you are traveling in. If you become distracted from your starlight, he said, you 'lose sight of the longer-term goals.' You start to forget where you are headed.
"The third layer of your attention is your daylight. This is the form of focus that makes it possible for you to know what your longer-term goals are in the first place. How do you know you want to write a book? How do you know you want to set up a business? How do you know what it means to be a good parent? Without being able to reflect and think clearly, you won't be able to figure these things out. He gave it this name because it's only when a scene is flooded with daylight that you can see the things around you most clearly. If you get so distracted that you lose your sense of the daylight, James says, 'In many ways you may not even be able to figure out who you are, what you wanted to do, or where you want to go.'
"[Williams] believes that losing your daylight is 'the deepest form of distraction,' and you may even begin 'decohering.' This is when you stop making sense to yourself, because you don't have the mental space to create a story about who you are. You become obsessed with petty goals, or dependent on simplistic signals from the outside world like retweets. You lose yourself in a cascade of distractions. You can only find your starlight and your daylight if you have sustained periods of reflection, mind-wandering, and deep thought. James has come to believe that our attention crisis is depriving us of all three of these forms of focus. We are losing our light."
This theme addressed the power of unregulated social media companies, a concern at the core of Stolen Focus. From here, we will examine the manipulative effects of the platforms.
2. How Technology Manipulates Us
This theme discusses the hidden mechanisms of social media and how they manipulate our attention.
We start with Tristan Harris explaining to Hari how social media works:
"When Facebook (and all the others) decide what you see in your news feed, there are many thousands of things they could show you. So they have written a piece of code to automatically decide what you will see. There are all sorts of algorithms they could use—ways they could decide what you should see, and the order in which you should see them. They could have an algorithm designed to show you things that make you feel happy. They could have an algorithm designed to show you things that make you feel sad. They could have an algorithm to show you things that your friends are talking about most. The list of potential algorithms is long. The algorithm they actually use varies all the time, but it has one key driving principle that is consistent. It shows you things that will keep you looking at your screen. That's it. Remember: the more time you look, the more money they make. So the algorithm is always weighted toward figuring out what will keep you looking, and pumping more and more of that onto your screen to keep you from putting down your phone. It is designed to distract. But, Tristan was learning, that leads—quite unexpectedly, and without anyone intending it—to some other changes, which have turned out to be incredibly consequential.
"Imagine two Facebook feeds. One is full of updates, news, and videos that make you feel calm and happy. The other is full of updates, news, and videos that make you feel angry and outraged. Which one does the algorithm select? The algorithm is neutral about the question of whether it wants you to be calm or angry. That's not its concern. It only cares about one thing: Will you keep scrolling? Unfortunately, there's a quirk of human behavior. On average, we will stare at something negative and outrageous for a lot longer than we will stare at something positive and calm. You will stare at a car crash longer than you will stare at a person handing out flowers by the side of the road, even though the flowers will give you a lot more pleasure than the mangled bodies in a crash. Scientists have been proving this effect in different contexts for a long time—if they showed you a photo of a crowd, and some of the people in it were happy, and some angry, you would instinctively pick out the angry faces first. Even ten-week-old babies respond differently to angry faces. This has been known about in psychology for years and is based on a broad body of evidence. It's called 'negativity bias.'
"There is growing evidence that this natural human quirk has a huge effect online. On YouTube, what are the words that you should put into the title of your video, if you want to get picked up by the algorithm? They are—according to the best site monitoring YouTube trends—words such as 'hates,' 'obliterates,' 'slams,' 'destroys.' A major study at New York University found that for every word of moral outrage you add to a tweet, your retweet rate will go up by 20 percent on average, and the words that will increase your retweet rate most are' attack,' 'bad,' and 'blame.' A study by the Pew Research Center found that if you fill your Facebook posts with 'indignant disagreement,' you'll double your likes and shares. So an algorithm that prioritizes keeping you glued to the screen will—unintentionally but inevitably—prioritize outraging and angering you. If it's more enraging, it's more engaging.
"If enough people are spending enough of their time being angered, that starts to change the culture. As Tristan told me, it 'turns hate into a habit.'
"At the moment false claims spread on social media far faster than the truth, because of the algorithms that spread outraging material faster and farther. A study by the Massachusetts Institute of Technology found that fake news travels six times faster on Twitter than real news, and during the 2016 U.S. presidential election, flat-out falsehoods on Facebook outperformed all the top stories at nineteen mainstream news sites put together. As a result, we are being pushed all the time to pay attention to nonsense—things that just aren't so."
Hari then provides a real-world example of how damaging this effect can be, drawing on Guillaume Chaslot, a former YouTube engineer who worked on its recommendation system:
"YouTube makes more money the longer you watch. That's why they designed it so that when you stop watching one video, it automatically recommends and plays another one for you. How are those videos selected? YouTube also has an algorithm—and it too has figured out that you'll keep watching longer if you see things that are outrageous, shocking, and extreme. Guillaume had seen how it works, with all the data YouTube keeps secret—and he saw what it means in practice. If you watched a factual video about the Holocaust, it would recommend several more videos, each one getting more extreme, and within a chain of five or so videos, it would usually end up automatically playing a video denying the Holocaust happened. If you watched a normal video about 9/11, it would often recommend a '9/11 truther' video in a similar way. This isn't because the algorithm (or anyone at YouTube) is a Holocaust denier or 9/11 truther. It was simply selecting whatever would most shock and compel people to watch longer. Tristan started to look into this, and concluded: 'No matter where you start, you end up more crazy.'
"It turned out, as Guillaume leaked to Tristan, that YouTube had recommended videos by Alex Jones and his website Infowars 15 billion times. Jones is a vicious conspiracy theorist who has claimed that the 2012 Sandy Hook massacre was faked, and that the grieving parents are liars whose children never even existed. As a result, some of those parents have been inundated with death threats and have had to flee their homes. This is just one of many insane claims he has made. Tristan has said: 'Let's compare that—what is the aggregate traffic of the New York Times, the Washington Post, the Guardian? All that together is not close to fifteen billion views.'
"The average young person is soaking up filth like this day after day. Do those feelings of anger go away when they put down their phone? The evidence suggests that for lots of people, they don't. A major study asked white nationalists how they became radicalized, and a majority named the internet—with YouTube as the site that most influenced them. A separate study of far-right people on Twitter found that YouTube was by far the website they turned to the most. 'Just watching YouTube radicalizes people,' Tristan explained. Companies like YouTube want us to think 'we have a few bad apples,' he explained to the journalist Decca Aitkenhead, but they don't want us to ask: 'Do we have a system that is systematically, as you turn the crank every day, pumping out more radicalization? We're growing bad apples. We're a bad-apple factory. We're a bad-apple farm.'"
The previous theme cited Harris' testimony, in which he documents some of the negative implications you just read. Harris discusses the following six points in that testimony (provided here so you can compare them against Hari's interpretation):
- "Extremism exploits our brains: With over a billion hours on YouTube watched daily, 70% of those billion hours are from the recommendation system. The most recommended keywords in recommended videos were: get schooled, shreds, debunks, dismantles, debates, rips confronts, destroys, hates, demolishes, obliterates."
- "Outrage exploits our brains: For each moral-emotional word added to a tweet it raised its retweet rate by 17%."
- "Insecurity exploits our brains: In 2018, if you were a teen girl starting on a dieting video, YouTube's algorithm recommended anorexia videos next because those were better at keeping attention."
- "Conspiracies exploit our brains: And if you are watching a NASA moon landing, YouTube would recommend Flat Earth conspiracies millions of times. YouTube recommended Alex Jones (InfoWars) conspiracies 15 billion times."
- "Sexuality exploits our brains: Adults watching sexual content were recommended videos that increasingly feature young women, then girls to then children playing in bathing suits."
- "Confirmation bias exploits our brains: Fake news spreads six times faster than real news, because it's unconstrained while real news is constrained by the limits of what is true."
The best line of defense is exposing these tactics: people need to realize they're being manipulated.
This theme addressed a handful of societal and individual impacts of social media. Next, we will examine the direct effect on attention.