The increasing inability to focus is not due to a lack of self-discipline but to a deliberate effort to manipulate human behavior and self-esteem.
In 2019, Tristan Harris, a technology ethicist, testified before the United States Senate, explaining that while each of us can aspire to self-control, thousands of engineers sit on the other side of the screen working against us.
By anticipating how to keep us hooked, the platforms could crawl deeper down our brain stems, exploiting our social validation triggers. Rather than gaining our attention, they discovered that getting us addicted to seeking attention from others was more lucrative.
As a result, 50% of teens now say they would prefer a broken bone to a broken phone, while 10 to 14-year-old girls have experienced a 170% increase in depressive symptoms. Adults fare no better: studies show they stay on task for just three minutes at a time, rarely work for a full hour uninterrupted, and lose 23 minutes and 15 seconds to each interruption.
Further studies reveal that technological distractions cause twice the drop in IQ caused by smoking marijuana and produce a level of cognitive impairment comparable to that of a drunk driver.
The trillion-dollar social empire has been shown to stoke anxiety and outrage in its users while exploiting their neurochemical reward pathways, resulting in the alteration of 4.5 billion brains.
We must remember the adage, 'If you don't pay for the product, you are the product,' while considering if the effects of these platforms are worth the benefits of embracing unregulated technology designed for addiction.
Note: Passages from Stolen Focus are in italics and quotation marks. All other text is my own.
Learn more about Stolen Focus on Amazon.
The rising inability to concentrate is not a problem of discipline. Each of us is responsible for our behavior, but we must contend with the force of a trillion-dollar empire.
Hari states this premise in the introduction to Stolen Focus:
"This is a systemic problem. The truth is that you are living in a system that is pouring acid on your attention every day, and then you are being told to blame yourself and to fiddle with your own habits while the world's attention burns."
An excerpt from Tristan Harris' testimony to the United States Senate best illustrates the issue:
"You can try having self-control, but there are a thousand engineers on the other side of the screen working against you."
Tristan Harris features as one of the experts in Stolen Focus; here's a brief introduction. Harris is an American technology ethicist, called "the closest thing Silicon Valley has to a conscience". He wasn't always this way, though. While working at Google, Harris produced a 141-slide deck titled "A Call to Minimize Distraction & Respect Users' Attention", seen by tens of thousands of Google employees. After his presentation, Google created the position of Design Ethicist for him. Harris left Google in 2015 to co-found a nonprofit organization, the Center for Humane Technology, through which he now advocates for more regulation of social media companies. More recently, Harris was featured in the Netflix documentary The Social Dilemma, seen by more than 38 million people in its first month.
The following is an excerpt from Harris' testimony to the United States Senate, titled 'Persuasive Technology and Optimizing for Engagement'. The testimony focused on the role of unregulated platforms in developing persuasive technology and the societal implications if left unchecked:
Harris: Because there is only so much attention, companies have to race to get more and more of it. I call it the race to the bottom of the brain stem. It starts with techniques like pull to refresh. Pulling to refresh your feed acts like a slot machine, having the same kind of addictive qualities that keep gamblers hooked in Vegas. Another example is removing stopping cues. If I take the bottom out of a glass and keep refilling the wine, you won't know when to stop drinking. This happens with infinitely scrolling feeds; we've removed the stopping cues that keep people scrolling.
The race to get attention has to become more and more aggressive. We have to predict how to keep you hooked. So we crawl deeper down the brain stem into your social validation, which triggered the introduction of likes and followers. Instead of getting your attention, it was much cheaper to get you addicted to getting attention from other people. This created a mass narcissism movement and many of the cultural effects we're seeing today, primarily among young people. And after two decades in decline, depression among ten to fourteen-year-old girls has shot up 170% in the last eight years.
The problem with social media platforms lies within their infrastructure. To illustrate this, Harris uses the following analogy:
Let's say that private companies built nuclear power plants across the United States, which began melting down one by one. These companies then told you that it's your responsibility to have hazmat suits and build your own radiation kits. That's essentially what we're experiencing now with social media. The responsibility is being put on the consumers when, if it's the infrastructure, it should be placed on the people building that infrastructure.
Harris' influence on Hari is evident throughout Stolen Focus. A common defense of social media companies is to blame the users' willpower rather than the companies' practices. Hari flips this in Stolen Focus, as Harris has done for years in his campaigning.
Before proceeding, let's clarify what attention is. Former Google strategist James Williams describes the three layers of attention to Hari:
"The first layer of your attention, he said, is your spotlight. This is when you focus on 'immediate actions,' like, 'I'm going to walk into the kitchen and make a coffee.' You want to find your glasses. You want to see what's in the fridge. You want to finish reading this chapter of my book. It's called the spotlight because—as I explained earlier—it involves narrowing down your focus. If your spotlight gets distracted or disrupted, you are prevented from carrying out near-term actions like these.
"The second layer of your attention is your starlight. This is, he says, the focus you can apply to your 'longer-term goals—projects over time.' You want to write a book. You want to set up a business. You want to be a good parent. It's called the starlight because when you feel lost, you look up to the stars, and you remember the direction you are traveling in. If you become distracted from your starlight, he said, you 'lose sight of the longer-term goals.' You start to forget where you are headed.
"The third layer of your attention is your daylight. This is the form of focus that makes it possible for you to know what your longer-term goals are in the first place. How do you know you want to write a book? How do you know you want to set up a business? How do you know what it means to be a good parent? Without being able to reflect and think clearly, you won't be able to figure these things out. He gave it this name because it's only when a scene is flooded with daylight that you can see the things around you most clearly. If you get so distracted that you lose your sense of the daylight, James says, 'In many ways you may not even be able to figure out who you are, what you wanted to do, or where you want to go.'
"[Williams] believes that losing your daylight is 'the deepest form of distraction,' and you may even begin 'decohering.' This is when you stop making sense to yourself, because you don't have the mental space to create a story about who you are. You become obsessed with petty goals, or dependent on simplistic signals from the outside world like retweets. You lose yourself in a cascade of distractions. You can only find your starlight and your daylight if you have sustained periods of reflection, mind-wandering, and deep thought. James has come to believe that our attention crisis is depriving us of all three of these forms of focus. We are losing our light."
This theme addressed concerns about the power of unregulated social media companies, which is at the core of Stolen Focus. From here, we will examine the manipulative effects of the platforms.
This theme discusses the hidden mechanisms of social media and how they manipulate our attention.
We start with Tristan Harris explaining to Hari how social media works:
"When Facebook (and all the others) decide what you see in your news feed, there are many thousands of things they could show you. So they have written a piece of code to automatically decide what you will see. There are all sorts of algorithms they could use—ways they could decide what you should see, and the order in which you should see them. They could have an algorithm designed to show you things that make you feel happy. They could have an algorithm designed to show you things that make you feel sad. They could have an algorithm to show you things that your friends are talking about most. The list of potential algorithms is long. The algorithm they actually use varies all the time, but it has one key driving principle that is consistent. It shows you things that will keep you looking at your screen. That's it. Remember: the more time you look, the more money they make. So the algorithm is always weighted toward figuring out what will keep you looking, and pumping more and more of that onto your screen to keep you from putting down your phone. It is designed to distract. But, Tristan was learning, that leads—quite unexpectedly, and without anyone intending it—to some other changes, which have turned out to be incredibly consequential.
"Imagine two Facebook feeds. One is full of updates, news, and videos that make you feel calm and happy. The other is full of updates, news, and videos that make you feel angry and outraged. Which one does the algorithm select? The algorithm is neutral about the question of whether it wants you to be calm or angry. That's not its concern. It only cares about one thing: Will you keep scrolling? Unfortunately, there's a quirk of human behavior. On average, we will stare at something negative and outrageous for a lot longer than we will stare at something positive and calm. You will stare at a car crash longer than you will stare at a person handing out flowers by the side of the road, even though the flowers will give you a lot more pleasure than the mangled bodies in a crash. Scientists have been proving this effect in different contexts for a long time—if they showed you a photo of a crowd, and some of the people in it were happy, and some angry, you would instinctively pick out the angry faces first. Even ten-week-old babies respond differently to angry faces. This has been known about in psychology for years and is based on a broad body of evidence. It's called 'negativity bias.'
"There is growing evidence that this natural human quirk has a huge effect online. On YouTube, what are the words that you should put into the title of your video, if you want to get picked up by the algorithm? They are—according to the best site monitoring YouTube trends—words such as 'hates,' 'obliterates,' 'slams,' 'destroys.' A major study at New York University found that for every word of moral outrage you add to a tweet, your retweet rate will go up by 20 percent on average, and the words that will increase your retweet rate most are 'attack,' 'bad,' and 'blame.' A study by the Pew Research Center found that if you fill your Facebook posts with 'indignant disagreement,' you'll double your likes and shares. So an algorithm that prioritizes keeping you glued to the screen will—unintentionally but inevitably—prioritize outraging and angering you. If it's more enraging, it's more engaging.
"If enough people are spending enough of their time being angered, that starts to change the culture. As Tristan told me, it 'turns hate into a habit.'
"At the moment false claims spread on social media far faster than the truth, because of the algorithms that spread outraging material faster and farther. A study by the Massachusetts Institute of Technology found that fake news travels six times faster on Twitter than real news, and during the 2016 U.S. presidential election, flat-out falsehoods on Facebook outperformed all the top stories at nineteen mainstream news sites put together. As a result, we are being pushed all the time to pay attention to nonsense—things that just aren't so."
Hari then provides a real-world example of how damaging this effect can be:
"YouTube makes more money the longer you watch. That's why they designed it so that when you stop watching one video, it automatically recommends and plays another one for you. How are those videos selected? YouTube also has an algorithm—and it too has figured out that you'll keep watching longer if you see things that are outrageous, shocking, and extreme. Guillaume [Chaslot, a former YouTube engineer] had seen how it works, with all the data YouTube keeps secret—and he saw what it means in practice. If you watched a factual video about the Holocaust, it would recommend several more videos, each one getting more extreme, and within a chain of five or so videos, it would usually end up automatically playing a video denying the Holocaust happened. If you watched a normal video about 9/11, it would often recommend a '9/11 truther' video in a similar way. This isn't because the algorithm (or anyone at YouTube) is a Holocaust denier or 9/11 truther. It was simply selecting whatever would most shock and compel people to watch longer. Tristan started to look into this, and concluded: 'No matter where you start, you end up more crazy.'
"It turned out, as Guillaume leaked to Tristan, that YouTube had recommended videos by Alex Jones and his website Infowars 15 billion times. Jones is a vicious conspiracy theorist who has claimed that the 2012 Sandy Hook massacre was faked, and that the grieving parents are liars whose children never even existed. As a result, some of those parents have been inundated with death threats and have had to flee their homes. This is just one of many insane claims he has made. Tristan has said: 'Let's compare that—what is the aggregate traffic of the New York Times, the Washington Post, the Guardian? All that together is not close to fifteen billion views.'
"The average young person is soaking up filth like this day after day. Do those feelings of anger go away when they put down their phone? The evidence suggests that for lots of people, they don't. A major study asked white nationalists how they became radicalized, and a majority named the internet—with YouTube as the site that most influenced them. A separate study of far-right people on Twitter found that YouTube was by far the website they turned to the most. 'Just watching YouTube radicalizes people,' Tristan explained. Companies like YouTube want us to think 'we have a few bad apples,' he explained to the journalist Decca Aitkenhead, but they don't want us to ask: 'Do we have a system that is systematically, as you turn the crank every day, pumping out more radicalization? We're growing bad apples. We're a bad-apple factory. We're a bad-apple farm.'"
The previous theme cited Harris' testimony, in which he documents some of the negative implications you just read. Harris discusses the following six points in his testimony (provided here for accuracy alongside Hari's interpretation of them):
- "Extremism exploits our brains: With over a billion hours on YouTube watched daily, 70% of those billion hours are from the recommendation system. The most recommended keywords in recommended videos were: get schooled, shreds, debunks, dismantles, debates, rips, confronts, destroys, hates, demolishes, obliterates."
- "Outrage exploits our brains: For each moral-emotional word added to a tweet it raised its retweet rate by 17%." 
- "Insecurity exploits our brains: In 2018, if you were a teen girl starting on a dieting video, YouTube's algorithm recommended anorexia videos next because those were better at keeping attention."
- "Conspiracies exploit our brains: And if you are watching a NASA moon landing, YouTube would recommend Flat Earth conspiracies millions of times. YouTube recommended Alex Jones (InfoWars) conspiracies 15 billion times." 
- "Sexuality exploits our brains: Adults watching sexual content were recommended videos that increasingly featured young women, then girls, and then children playing in bathing suits."
- "Confirmation bias exploits our brains: Fake news spreads six times faster than real news, because it's unconstrained while real news is constrained by the limits of what is true." 
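The outrage statistic compounds quickly. If, as the NYU study Harris cites suggests, each moral-emotional word lifts a tweet's retweet rate by about 17%, and we assume (my assumption, not the study's claim) that the lifts multiply independently, a few words more than double a post's reach. A rough back-of-the-envelope sketch, with a hypothetical helper name:

```python
def retweet_multiplier(moral_words: int, lift: float = 0.17) -> float:
    """Estimated retweet-rate multiplier for a tweet containing
    `moral_words` moral-emotional words, assuming each word adds a
    ~17% lift and the lifts compound multiplicatively."""
    return (1 + lift) ** moral_words

# Three outrage words give roughly 1.6x the baseline retweet rate;
# five words roughly 2.2x.
print(round(retweet_multiplier(3), 2))
print(round(retweet_multiplier(5), 2))
```

This is only an illustration of why outrage-laden language wins the algorithmic race, not a model from the testimony itself.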
The best line of defense is exposure; people need to realize they're being manipulated.
This theme addressed a handful of societal and individual impacts of social media. Next, we will examine the direct effect on attention.
In this theme, we examine the scientific evidence documenting social media's damage to our attention.
Hari begins by noting how social media is just the tip of the iceberg. The informational load of the world has been increasing since the rise of the Internet.
"In 1986, if you added up all the information being blasted at the average human being—TV, radio, reading—it amounted to 40 newspapers' worth of information every day. By 2007, they found it had risen to the equivalent of 174 newspapers per day. (I'd be amazed if it hasn't gone up even more since then.) The increase in the volume of information is what creates the sensation of the world speeding up."
As information accelerates, the question becomes, can we make sense of it? Can we handle and process such vast amounts of data? Studies suggest not. Let's look at speed reading first:
"Several teams of scientists have spent years figuring out: Can you make humans read things really, really fast? They found that you can—but it always comes at a cost. These teams took ordinary people and got them to read much faster than they ordinarily would; with training, and with practice, it sort of works. They can run their eyes over the words quickly and retain something of what they are seeing. But if you then test them on what they read, you'll discover that the faster you make them go, the less they will understand. More speed means less comprehension. Scientists then studied professional speed-readers—and they discovered that even though they are obviously better at it than the rest of us, the same thing happens. This showed there's just a maximum limit for how quickly humans can absorb information, and trying to bust through that barrier simply busts your brain's ability to understand it instead. The scientists investigating this also discovered that if you make people read quickly, they are much less likely to grapple with complex or challenging material. They start to prefer simplistic statements."
Why hasn't there been more resistance if the quantity of information exceeds our ability to comprehend it? The answer begins with one of the great attentional lies: multitasking. The concept of multitasking has convinced us that we can handle life on fast-forward:
"I went to interview Professor Earl Miller. He has won some of the top awards in neuroscience in the world, and he was working at the cutting edge of brain research when I went to see him in his office at the Massachusetts Institute of Technology (MIT). He told me bluntly that instead of acknowledging our limitations and trying to live within them, we have—en masse—fallen for an enormous delusion. There's one key fact, he said, that every human being needs to understand—and everything else he was going to explain flows from that. 'Your brain can only produce one or two thoughts' in your conscious mind at once. That's it. 'We're very, very single-minded.' We have 'very limited cognitive capacity.' This is because of the 'fundamental structure of the brain,' and it's not going to change. But rather than acknowledge this, Earl told me, we invented a myth. The myth is that we can actually think about three, five, ten things at the same time. To pretend this was the case, we took a term that was never meant to be applied to human beings at all. In the 1960s, computer scientists invented machines with more than one processor, so they really could do two things (or more) simultaneously. They called this machine-power 'multitasking.' Then we took the concept and applied it to ourselves.
"Some scientists used to side with my initial gut instinct—they believed it was possible for people to do several complex tasks at once. So they started to get people into labs, and they told them to do lots of things at the same time, and they monitored how well it went. What the scientists discovered is that, in fact, when people think they're doing several things at once, they're actually—as Earl explained—'juggling.' They're switching back and forth. They don't notice the switching because their brain sort of papers it over, to give a seamless experience of consciousness, but what they're actually doing is switching and reconfiguring their brain moment to moment, task to task—and that comes with a cost.
"There are three ways, he explained, in which this constant switching degrades your ability to focus. The first is called the 'switch cost effect.' There is broad scientific evidence for this. Imagine you are doing your tax return and you receive a text, and you look at it—it's only a glance, taking five seconds—and then you go back to your tax return. In that moment, 'your brain has to reconfigure, when it goes from one task to another,' he said. You have to remember what you were doing before, and you have to remember what you thought about it, 'and that takes a little bit of time.' When this happens, the evidence shows that 'your performance drops. You're slower. All as a result of the switching.' So if you check your texts often while trying to work, you aren't only losing the little bursts of time you spend looking at the texts—you are also losing the time it takes to refocus afterward, which can be much longer. He said: 'If you're spending a lot of your time not really thinking, but wasting it on switching, that's just wasted brain-processing time.' This means that if your Screen Time shows you are using your phone four hours a day, you are losing much more time than that in lost focus."
Professor Sophie Leroy of the University of Minnesota found that switching tasks creates attention residue. Here's how it works:
As you switch from Task A to Task B, your attention does not drop Task A in favor of Task B; part of it remains anchored to Task A.
Further research by Gerald Weinberg suggests that each additional task we switch between consumes roughly 20% of our attention. Our attention bandwidth shrinks the more activities we juggle because residue remains anchored in each. Here's how that looks in practice:
Focusing on a single task allows us to commit 100% of our energy to it.
When switching between two tasks, we give each 40% of our attention, and 20% goes to attention residue.
Jumping between three tasks means only 20% of attention is available for each task, having now lost 40% to context switching.
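The arithmetic above can be sketched in a few lines. This is a minimal illustration of Weinberg's rule of thumb (20% of total attention lost per additional task), with a function name of my own invention, not code from any of the research cited:

```python
def attention_split(num_tasks: int) -> tuple[float, float]:
    """Split attention across tasks using Weinberg's rule of thumb:
    each task beyond the first loses roughly 20% of total attention
    to context switching (attention residue).

    Returns (attention available per task, fraction lost to switching).
    """
    if num_tasks < 1:
        raise ValueError("need at least one task")
    lost = 0.20 * (num_tasks - 1)        # overhead from switching
    per_task = (1.0 - lost) / num_tasks  # what's left, shared equally
    return per_task, lost

# One task: 100% focus, nothing lost.
# Two tasks: 40% each, 20% lost.
# Three tasks: 20% each, 40% lost.
for n in (1, 2, 3):
    per_task, lost = attention_split(n)
    print(n, round(per_task, 2), round(lost, 2))
```

The rule obviously breaks down past a handful of tasks (six tasks would leave nothing), which is itself a fair summary of the point Weinberg was making.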
Implicit in this finding is the question: what about multitasking?
As we have already seen, multitasking in humans is a fallacy.
Multitasking is a computing term first used in 1966. Before multiprocessors, a computer could handle only a single task at a time. With multiprocessors, each processor still performs one task at a time; individually, they remain serial processors, much like our minds.
Cognitively demanding tasks activate the prefrontal cortex, the region of the brain that orchestrates our thoughts and actions. Neuroscientists divide the prefrontal cortex into subregions, though there is no consensus on where the boundaries lie or what each subregion does. The prefrontal cortex is broadly accepted to handle goal-directed behavior, including self-control, planning, decision-making, and problem-solving. Because of the complexity of these operations, it is unlikely that a single region handles them alone; the brain passes information through its network to achieve the desired result.
Studies suggest that multitasking causes the sides of the prefrontal cortex to split. When focusing on a single task, the two sides work in tandem. But by adding a second task, each side will work independently. The ease of juggling tasks depends on how engaged the prefrontal cortex is. Memorized actions like walking and talking place little strain on the prefrontal cortex compared to, say, reading and driving.
When people refer to multitasking, they aren't concentrating on two tasks simultaneously but rather switching between them.
Further research by Gloria Mark of the University of California, Irvine, found that after every disruption, it takes 23 minutes and 15 seconds to fully return to the task. Hari continues the exploration of Mark's work:
"Professor Gloria Mark, at the Department of Informatics at the University of California, Irvine, who I interviewed, has discovered that the average American worker is distracted roughly once every three minutes. Several other studies have shown a large chunk of Americans are almost constantly being interrupted and switching between tasks. The average office worker now spends 40 percent of their work time wrongly believing they are 'multitasking'—which means they are incurring all these costs for their attention and focus. In fact, uninterrupted time is becoming rare. One study found that most of us working in offices never get a whole hour uninterrupted in a normal day. I had to look again at that figure several times before I really absorbed it: most office workers never get an hour to themselves without being interrupted. This is happening at every level of businesses—the average CEO of a Fortune 500 company, for example, gets just twenty-eight uninterrupted minutes a day."
Undoubtedly, distractions affect our work, as we all experience the lack of focus that follows them. Our temporary IQ is one aspect we cannot easily measure, but one study attempted to do so:
"A small study commissioned by Hewlett-Packard looked at the IQ of some of their workers in two situations. At first they tested their IQ when they were not being distracted or interrupted. Then they tested their IQ when they were receiving emails and phone calls. The study found that 'technological distraction'—just getting emails and calls—caused a drop in the workers' IQ by an average of ten points. To give you a sense of how big that is: in the short term, that's twice the knock to your IQ that you get when you smoke cannabis. So this suggests, in terms of being able to get your work done, you'd be better off getting stoned at your desk than checking your texts and Facebook messages a lot." 
This study is from 2005, before the rise of social media. With today's technology, it is hard to imagine how large the drop might be now.
However, it is worth noting that I cannot find other scientific papers documenting the relationship between distractions and IQ. A single small-scale study from 2005 is not conclusive; larger studies are needed to determine the effect of distractions on IQ. I should also note that the decline in IQ is temporary. I found instances of this study being misinterpreted online as evidence that 'technology makes you dumber.' The more accurate reading is that distraction in general, whatever its source, temporarily makes you dumber.
The final point Hari touches on is distraction's impact on driving:
"The cognitive neuroscientist Dr. David Strayer at the University of Utah conducted detailed research where he got people to use driving simulators and tracked how safe their driving was when they were distracted by technology—something as simple as their phone receiving texts. It turned out their level of impairment was 'very similar' to if they were drunk. It's worth dwelling on that. Persistent distractions have as bad an effect on your attention on the road as consuming so much alcohol that you got drunk. The distraction all around us isn't just annoying, it's deadly: around one in five car accidents is now due to a distracted driver."
In his book Deep Work, Cal Newport addresses a point that Hari fails to mention. When we remove social media from a user's life, their ability to focus is not restored. There is a lasting reduction in concentration, but the exact extent is unclear.
The late Stanford University professor Clifford Nass gave a shocking summary of his research in a 2010 interview with NPR's Ira Flatow: constantly switching attention online has a lasting negative effect on your brain. As Newport puts it in Deep Work:
"Once your brain has become accustomed to on-demand distraction, it's hard to shake the addiction even when you want to concentrate. To put this more concretely: If every moment of potential boredom in your life—say, having to wait five minutes in line or sit alone in a restaurant until a friend arrives—is relieved with a quick glance at your smartphone, then your brain has likely been rewired to a point where, it's not ready for deep work—even if you regularly schedule time to practice this concentration."
Given all these attentional defects, we must ask why the pull of social media is so strong. Let's now look at some of the psychological building blocks studied by the founders of Instagram.
This theme unravels the psychological principles underlying social media.
Both co-founders of Instagram, Kevin Systrom and Mike Krieger, attended classes at Stanford University's Persuasive Technology Lab. The course taught in the lab was about "figuring out how to design technology that could change your behavior—without you even knowing you were being changed."
The course had a scientific basis in the work of B. F. Skinner, as Hari explains:
"A Harvard professor named B. F. Skinner had become an intellectual celebrity by discovering something strange. You can take an animal that seems to be freely making up its own mind about what to pay attention to—like a pigeon, or a rat, or a pig—and you can get it to pay attention to whatever you choose for it. You can control its focus, as surely as if it was a robot and you had created it to obey your whims. Here's an example of how Skinner did it that you can try for yourself. Take a pigeon. Put it in a cage. Keep it until it is hungry. Then introduce a bird feeder that releases seed into the cage when you push a button. Pigeons move around a lot—so wait until the pigeon makes a random movement that you have chosen in advance (like, say, jerking its head up high, or sticking out its left wing), and at that precise moment, release some pellets. Then wait for it to make the same random movement again, and give it more pellets. If you do this a few times, the pigeon will quickly learn that if it wants pellets, it should carry out the random gesture you have chosen—and it will start to do it a lot. If you manipulate it correctly, its focus will come to be dominated by the twitch that you chose to reward. It will come to jerk up its head or stick out its left wing obsessively. When Skinner discovered this, he wanted to figure out how far you could take this. How elaborately can you program an animal using these reinforcements? He discovered you can take it really far. You can teach a pigeon to play ping-pong. You can teach a rabbit to pick up coins and put them into piggy banks. You can teach a pig to vacuum. Many animals will focus on very complex—and, to them, meaningless—things, if you reward them right.
"Skinner became convinced that this principle explained human behavior almost in its entirety. You believe that you are free, and that you make choices, and you have a complex human mind that is selecting what to pay attention to—but it's all a myth. You and your sense of focus are simply the sum total of all the reinforcements you have experienced in your life. Human beings, he believed, have no minds—not in the sense that you are a person with free will making your own choices. You can be reprogrammed in any way that a clever designer wants.
"Years later, the designers of Instagram asked: If we reinforce our users for taking selfies—if we give them hearts and likes—will they start to do it obsessively, just like the pigeon will obsessively hold out its left wing to get extra seed? They took Skinner's core techniques, and applied them to a billion people."
Tristan Harris attended the Stanford course alongside Kevin Systrom and Mike Krieger; below is Hari's account of Harris's experience:
"The course was taught by a warm, upbeat Mormon behavioral scientist in his forties named Professor B. J. Fogg. At the start of each day, he would take out a stuffed frog and a cuddly monkey and introduce them to the class, and then he would play on his ukulele. Whenever he wanted the group to break or wrap up, he would tap on a toy xylophone. B.J. explained to students that computers had the potential to be far more persuasive than people. They could, he believed, 'be more persistent than human beings, offer greater anonymity,' and 'go where humans cannot go or may not be welcome.' Soon, he was sure, they would be changing the lives of everyone—persuading us persistently, throughout the day. He had previously worked on a course dedicated to 'the psychology of mind control.' He assigned to Tristan and his other students a small mound of books that explained hundreds of psychological insights and tricks that had been discovered about how to manipulate human beings and to get them to do what you want. It was a treasure trove. Many of them were based on the philosophy of B. F. Skinner, the man who, as I had learned earlier, had found a way to get pigeons and rats and pigs to do whatever he wanted by offering the right 'reinforcements' for their behavior. After falling out of fashion for years, his ideas were back with full force.
"As part of the class, he was paired with a young man named Mike Krieger, and they were tasked with designing an app. Tristan had been thinking for a while about something named 'seasonal affective disorder'—a condition where, if you are stuck in gloomy weather for a long time, you are more likely to become depressed. How, they asked, could technology help with that? They came up with an app called Send the Sunshine. Two friends would choose to be connected through it, and it would track where they both were and the online weather reports for their locations. If the app realized that your friend was starved of sunshine, and you had some, it would prompt you to take a photo of the sun and send it to him. It showed that somebody cared; and it sent some sunshine your way. It was sweet, and simple, and it helped to spur Mike and another person in the class, Kevin Systrom, to think about the power of sharing photographs online. They were already thinking about another of the key lessons of the class, taken from B. F. Skinner: build in immediate reinforcements. If you want to shape the user's behavior, make sure he gets hearts and likes right away. Using these principles, they launched a new app of their own. They named it Instagram.
"In the final class Tristan attended, all the students discussed ways in which these persuasive technologies could be used in the future. One of the other groups had come up with an eye-catching plan. They asked: 'What if in the future you had a profile of every single person on earth?' As a designer, you would track all the information they offer up on social media and build a detailed profile of them. It's not just the simple stuff—their gender, or age, or interests. It would be something deeper. This would be a psychological profile—figuring out how their personality works, and the best ways to persuade them. It would know if the user was an optimist or pessimist, if they were open to new experiences or they were prone to nostalgia—it would figure out dozens of characteristics they have. Think, the class wondered out loud, about how you could target people if you knew this much about them. Think about how you could change them. When a politician or a company wants to persuade you, they could pay a social-media company to perfectly target their message just for you. It was the birth of an idea. Years later, when it was revealed that the campaign for Donald Trump had paid a company named Cambridge Analytica to do exactly that, Tristan would think of that final class in Stanford. 'This was the class that freaked me out,' he told me. 'I remember saying—this is horribly concerning.'"
Thus far, we have discussed the attention epidemic, the mechanics of social media, the effects on our attention, and the psychological principles underlying these platforms.
We will now conclude this five-part exploration by answering every child's favorite question: Why? What is the aim of stealing our attention?
This theme addresses how social media profits from your attention. We will look at a handful of their data-collection and privacy-invading practices.
Aza Raskin features throughout this final theme. You may never have heard of Raskin, but you use his invention daily: infinite scrolling. Technological innovation runs in Raskin's blood. Aza's father, Jef Raskin, started the original Macintosh project at Apple and named the computer after his favorite apple, the McIntosh.
Aza Raskin and Tristan Harris explain to Hari how social media profits from our attention:
"Facebook makes more money for every extra second you are staring through a screen at their site, and they lose money every time you put the screen down. They make this money in two ways. Until I started to spend time in Silicon Valley, I had only naively thought about the first and the most obvious. Clearly—as I wrote in the last chapter—the more time you look at their sites, the more advertisements you see. Advertisers pay Facebook to get to you and your eyeballs. But there's a second, more subtle reason why Facebook wants you to keep scrolling and desperately doesn't want you to log off.
"Every time you send a message or status update on Facebook, or Snapchat, or Twitter, and every time you search for something on Google, everything you say is being scanned and sorted and stored. These companies are building up a profile of you, to sell to advertisers who want to target you. For example, starting in 2014, if you used Gmail, Google's automated systems would scan through all your private correspondence to generate an 'advertising profile' exactly for you. If (say) you email your mother telling her you need to buy diapers, Gmail knows you have a baby, and it knows to target ads for baby products straight to you. If you use the word "arthritis," it'll try to sell you arthritis treatments. The process that had been predicted in Tristan's final class back in Stanford was beginning.
Advertisers use this information to build profiles of users. These profiles let them market products that match a person's interests, preferences, and needs. Raskin describes the process with the analogy of a voodoo doll:
"Aza explained it to me by saying that I should imagine that 'inside of Facebook's servers, inside of Google's servers, there is a little voodoo doll, [and it is] a model of you. It starts by not looking much like you. It's sort of a generic model of a human. But then they're collecting your click trails [i.e., everything you click on], and your toenail clippings, and your hair droppings [i.e., everything you search for, every little detail of your life online]. They're reassembling all that metadata you don't really think is meaningful, so that doll looks more and more like you. [Then] when you show up on [for example] YouTube, they're waking up that doll, and they're testing out hundreds of thousands of videos against this doll, seeing what makes its arm twitch and move, so they know it's effective, and then they serve that to you.' It seemed like such a ghoulish image that I paused. He went on: 'By the way—they have a doll like that for one in four human beings on earth.'
"At the moment these voodoo dolls are sometimes crude and sometimes startlingly specific. We've all had one kind of experience of searching online for something. I recently tried to buy an exercise bike, and still, a month later, I am endlessly being served advertisements for exercise bikes by Google and Facebook, until I want to scream, 'I bought one already!' But the systems are getting more sophisticated every year. Aza told me: 'It's getting to be so good that whenever I give a presentation, I'll ask the audience how many think Facebook is listening to their conversations, because there's some ad that's been served that's just too accurate. It's about a specific thing they never mentioned before [but they happen to have talked about offline] to a friend the day before. Now, it's generally one-half to two-thirds of the audience that raises their hands. The truth is creepier. It's not that they are listening and then they can do targeted ad serving. It's that their model of you is so accurate that it's making predictions about you that you think are magic.'
"This is the business model that built and sustains the sites on which we spend so much of our lives. The technical term for this system—coined by the brilliant Harvard professor Shoshana Zuboff—is 'surveillance capitalism.' Her work has made it possible for us to understand a lot of what is happening now. Of course, there have been increasingly sophisticated forms of advertising and marketing for over a hundred years—but this is a quantum leap forward. A billboard didn't know what you googled at three in the morning last Thursday. A magazine ad didn't have a detailed profile of everything you've ever said to your friends on Facebook and email. Trying to give me a sense of this system, Aza said to me: 'Imagine if I could predict all your actions in chess before you made them. It would be trivial for me to dominate you. That's what is happening on a human scale now.'"
Now that we understand how social media works from the inside, let's explore the other side of the argument. How have these social empires responded to the accusations made against them? What would need to change for social media to coexist with humans without incurring the harmful costs?
This theme addresses the response (or lack thereof) to the growing number of concerns with social media, specifically Facebook. Finally, we examine what would need to change to mitigate the adverse effects of social media.
In 2020, leaked documents revealed the findings of an internal Facebook team tasked with examining the platform's effects on its users. The findings were conclusive: Facebook is much worse than it appears. The team, Common Ground, took its results to Mark Zuckerberg, co-founder and CEO of Facebook. Zuckerberg asked that they "not bring him something like that again."
Here's the story from Stolen Focus:
"One day, in the spring of 2020, it was revealed what Facebook actually thinks about these questions, in private, when they think we will never be able to hear them. A large number of internal Facebook documents and communications were leaked to the Wall Street Journal. It turns out that behind closed doors, the company had responded to the claims that their algorithms had damaged our collective attention and helped the rise of Trump and Brexit by convening a team of some of their best scientists and tasking them with figuring out if this was really true, and if it was, to figure out what they could do about it.
"The unit was called Common Ground. After studying all the hidden data—the stuff that Facebook doesn't release to the public—the company's scientists reached a definite conclusion. They wrote: 'Our algorithms exploit the human brain's attraction to divisiveness,' and 'if left unchecked' the site would continue to pump its users with 'more and more divisive content in an effort to gain user attention and increase time on the platform.'
"A separate internal Facebook team, whose work also leaked to the Journal, had independently reached the same conclusions. They found that 64 percent of all the people joining extremist groups were finding their way to them because Facebook's algorithms were directly recommending them. This meant that across the world, people were seeing in their Facebook feeds racist, fascist, and even Nazi groups next to the words: 'Groups You Should Join.' They warned that in Germany, one-third of all the political groups on the site were extremist. Facebook's own team was blunt, concluding: 'Our recommendation systems grow the problem.'
"After carefully analyzing all the options, Facebook's scientists concluded there was one solution: they said Facebook would have to abandon its current business model. Because their growth was so tied up with toxic outcomes, the company should abandon attempts at growth. The only way out was for the company to adopt a strategy that was ‘anti-growth’—deliberately shrink, and choose to be a less wealthy company that wasn't wrecking the world. Once Facebook was shown—in plain language, by their own people—what they were doing, how did the company's executives respond? According to the Journal's in-depth reporting, they mocked the research, calling it an ‘Eat Your Veggies’ approach. They introduced some minor tweaks, but dismissed most of the recommendations. The Common Ground team was disbanded and has ceased to exist. The Journal reported dryly: ‘Zuckerberg also signaled he was losing interest in the effort to recalibrate the platform in the name of the social good…asking that they not bring him something like that again.’"
I couldn't find the study Hari references in Stolen Focus, but the reporting is in The Wall Street Journal, behind a paywall. The Independent has a similar article available in full.
In response to Harris's and Raskin's public condemnation of social media, other engineers behind the platforms began joining their cause:
"When Tristan and Aza started to speak out, they were ridiculed as wildly over-the-top Cassandras. But then, one by one, all over Silicon Valley, people who had built the world we now live in were beginning to declare in public that they had similar feelings. For example, Sean Parker, one of the earliest investors in Facebook, told a public audience that the creators of the site had asked themselves from the start: 'How do we consume as much of your time and conscious attention as possible?' The techniques they used were 'exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology…. The inventors, creators—it's me, it's Mark [Zuckerberg], it's Kevin Systrom on Instagram, it's all of these people—understood this consciously. And we did it anyway.' He added: 'God only knows what it's doing to our children's brains.' Chamath Palihapitiya, who had been Facebook's vice president of growth, explained in a speech that the effects are so negative that his own kids 'aren't allowed to use that shit.' Tony Fadell, who co-invented the iPhone, said: 'I wake up in cold sweats every so often thinking, what did we bring to the world?' He worried that he had helped create 'a nuclear bomb' that can 'blow up people's brains and reprogram them.' Many Silicon Valley insiders predicted that it would only get worse. One of its most famous investors, Paul Graham, wrote: 'Unless the forms of technological progress that produced these things are subject to different laws than technological progress in general, the world will get more addictive in the next forty years than it did in the last forty.'"
We are unlikely to see many changes, given how heavily social media companies rely on our attention. A significant overhaul cannot take place without considerable financial loss. When Apple's iOS 14 privacy features went live, Facebook lost $10 billion. The update would not have mattered if Facebook hadn't been overstepping the boundaries of privacy. More telling is the money Facebook poured into fighting it, accusing Apple of killing the 'free internet.'
The Facebook study above found that mitigating the platform's effects would require abandoning the current business model. Ending surveillance capitalism means no longer monetizing users' data through advertisers. Optimizing for human well-being means no longer optimizing for time on site and the spread of outrage-inducing content. Facebook would have to switch to a subscription model in which users pay a monthly fee to access the platform. This may sound audacious, but Twitter has added a subscription service that provides users with exclusive features, a step away from the extractive business model of the past. No doubt Facebook would see mass abandonment if it became a paid service. Although, is that not telling? If people wouldn't pay to use Facebook, whatever utility they draw from it isn't worth a few dollars a month.
Hari, Raskin, and Harris suggest various features Facebook could install overnight to mitigate some of the adverse effects. Among them were batched notifications, removing the infinite scroll, and a proposal from Hari to sell the networks to the government. Hari's last suggestion touches the core issue: the financial infrastructure needs to change. Facebook pursues profit at all costs; if we removed that pursuit by funding the company through user subscriptions, Facebook's goals would align with those of its users.
One point on which Raskin, Harris, and Hari all agree is that surveillance capitalism is just the beginning:
"There are technological innovations coming that will make the current forms of surveillance capitalism look as crude as Space Invaders looks to a kid raised on Fortnite. Facebook, in 2015, filed a patent for technology that will be able to detect your emotions from the cameras on your laptop and phone. If we don't regulate, Aza warns, 'our supercomputers are going to test their way to finding all our vulnerabilities, without anyone ever stopping to ask—is that right? It'll feel to us a little bit like we're still making our own decisions,' but it will be 'a direct attack against agency and free will.'"
The book ends with a list of six changes Hari made to his life to regain focus:
“One: I used pre-commitment to stop switching tasks so much. Pre-commitment is when you realize that if you want to change your behavior, you have to take steps now that will lock in that desire and make it harder for you to crack later. One key step for me was buying a kSafe, which—as I mentioned briefly before—is a large plastic safe with a removable lid. You put your phone in it, put the lid back on, and turn the dial at the top for however long you want—from fifteen minutes to two weeks—and then it locks your phone away for as long as you selected. Before I went on this journey, my use of it was patchy. Now I use it every day without exception, and that buys me long stretches of focus. I also use on my laptop a program called Freedom, which cuts it off from the internet for as long as I select. (As I write this sentence, it’s counting down from three hours.)”
“Two: I have changed the way I respond to my own sense of distraction. I used to reproach myself, and say: You’re lazy. You’re not good enough. What’s wrong with you? I tried to shame myself into focusing harder. Now, based on what Mihaly Csikszentmihalyi taught me, instead I have a very different conversation with myself. I ask: What could you do now to get into a flow state, and access your mind’s own ability to focus deeply? I remember what Mihaly taught me are the main components of flow, and I say to myself: What would be something meaningful to me that I could do now? What is at the edge of my abilities? How can I do something that matches these criteria now? Seeking out flow, I learned, is far more effective than self-punishing shame.”
“Three: based on what I learned about the way social media is designed to hack our attention spans, I now take six months of the year totally off it. (This time is divided into chunks, usually of a few months.) To make sure I stick to it, I always announce publicly when I am going off—I’ll tweet that I am leaving the site for a certain amount of time, so that I will feel like a fool if I suddenly crack and go back a week later. I also get my friend Lizzie to change my passwords.”
“Four: I acted on what I learned about the importance of mind-wandering. I realized that letting your mind wander is not a crumbling of attention, but in fact a crucial form of attention in its own right. It is when you let your mind drift away from your immediate surroundings that it starts to think over the past, and starts to game out the future, and makes connections between different things you have learned. Now I make it a point to go for a walk for an hour every day without my phone or anything else that could distract me. I let my thoughts float and find unexpected connections. I found that, precisely because I give my attention space to roam, my thinking is sharper, and I have better ideas.”
“Five: I used to see sleep as a luxury, or—worse—as an enemy. Now I am strict with myself about getting eight hours every night. I have a little ritual where I make myself unwind: I don’t look at screens for two hours before I go to bed, and I light a scented candle and try to set aside the stresses of the day. I bought a FitBit device to measure my sleep, and if I get less than eight hours, I make myself go back to bed. This has made a really big difference.”
“Six: I’m not a parent, but I am very involved in the lives of my godchildren and my young relatives. I used to spend a lot of my time with them deliberately doing things—busy, educational activities I would plan out in advance. Now I spend most of my time with them just playing freely, or letting them play on their own without being managed or oversupervised or imprisoned. I learned that the more free play they get, the more sound a foundation they will have for their focus and attention. I try to give them as much of that as I can.”
We cannot do much as individuals to alter the infrastructure of a trillion-dollar empire. What we can change is how we approach these platforms. Most of the studies discussed in this book and summary share a common thread: we allow smart devices to interrupt our lives. Although notifications are the primary culprit, Gloria Mark suggests that we interrupt ourselves as often as others do. Under constant distraction, our brains rewire themselves to seek stimuli that release bursts of dopamine, the neurotransmitter that makes us feel good. Reversing this conditioning begins with disabling notifications.
Taking a hiatus from social media or deleting accounts is unthinkable for many, and Hari himself avoided stating this outright in Stolen Focus. The purpose of this summary is to provide you with a thorough explanation of the problem so you can make your own decisions. As a result, I suggest turning off notifications as a more realistic solution. Eliminating notifications would be applying the 80/20 rule to social media.
The 80/20 rule or Pareto principle states that 20% of inputs account for 80% of outputs.
20% of a business's products generate 80% of its revenue. 20% of football players score 80% of all goals. 20% of a country's citizens hold 80% of the nation's wealth. (In the United States, the top 10% control 76% of the wealth.)
I am using the Pareto principle to suggest that you can avoid 80% of the attentional problems with only 20% of the effort (three taps on almost all smart devices).
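The Pareto intuition is easy to sanity-check in a few lines of Python. This is my own sketch, and the product revenues below are hypothetical, chosen purely to illustrate the calculation:

```python
def top_share(values, top_fraction=0.2):
    """Return the share of the total contributed by the
    top `top_fraction` of items (largest first)."""
    ranked = sorted(values, reverse=True)
    k = max(1, round(len(ranked) * top_fraction))
    return sum(ranked[:k]) / sum(ranked)

# Hypothetical revenues for ten products, skewed Pareto-style:
revenues = [400, 250, 120, 60, 50, 40, 30, 20, 20, 10]
share = top_share(revenues)  # top 2 of 10 products: 650 / 1000 = 0.65
```

The more skewed the distribution, the closer the result creeps toward the canonical 80%; the point is the concentration, not the exact ratio.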
Putting aside the polarizing effects of social media, what are the results of continual distractions on the brain?
The following is a two-minute summary of the studies, from Stolen Focus and elsewhere, documenting social media's effect on our attention:
- One study suggests that technological distractions cause an IQ drop of 10 points, twice as much as smoking cannabis. 
- Distracted drivers have a level of attentional impairment comparable to drunken drivers. 
- Professor Sophie Leroy of the University of Minnesota found that switching tasks leads to attention residue. Here's how it works:
As you switch from Task A to Task B, your attention does not drop Task A in favor of Task B; part of it remains anchored to Task A. 
- Further research by Gerald Weinberg suggests that each additional task we juggle consumes 20% of our attention. The more we switch, the more of our bandwidth is lost to the residue each task leaves behind. Here's how that looks in practice:
Focusing on a single task allows us to commit 100% of our energy towards it.
When switching between two tasks, we give each 40% of our attention, and 20% goes to attention residue.
Jumping between three tasks means only 20% of attention is available for each task, having now lost 40% to context switching. 
- Gloria Mark of the University of California, Irvine, found that after every disruption, it takes 23 minutes and 15 seconds to fully return to the task. Mark also concluded that the average American worker experiences an interruption every three minutes.
- True multitasking (a term borrowed from computing) is impossible for humans. When people refer to multitasking, they aren't concentrating on two tasks simultaneously but rapidly switching between them.
- If left unchecked, multitasking changes the structure of our brain. MRI scans show that multitaskers have less brain density in the regions responsible for empathy and emotional control. Distractions are shriveling your brain and eroding your attention. 
- American college students switch tasks every sixty-five seconds. 
- US adult office workers stay on task for an average of three minutes. 
- 40% of knowledge workers in the UK never work more than 30 minutes straight. These same workers are productive for only 2 hours and 48 minutes of their 8-hour workday. 
- Knowledge workers who believe they work 50-hour weeks spend only 14 hours on task.
- In 2019, Asana released its Anatomy of Work Index, an in-depth analysis of 10,223 global knowledge workers. Emails, notifications, and unexpected meetings consume 60% of a worker's day; the remaining 40% splits into 13% planning and only 27% skilled, productive work.
- The late Stanford University professor Clifford Nass (1958-2013) gave a shocking summary of his research in a 2010 interview with NPR's Ira Flatow: constantly switching attention online has a lasting negative effect on your brain. Cal Newport draws out the consequence in Deep Work:
"Once your brain has become accustomed to on-demand distraction, it's hard to shake the addiction even when you want to concentrate. To put this more concretely: If every moment of potential boredom in your life—say, having to wait five minutes in line or sit alone in a restaurant until a friend arrives—is relieved with a quick glance at your smartphone, then your brain has likely been rewired to a point where, it's not ready for deep work—even if you regularly schedule time to practice this concentration." 
Source: Deep Work by Cal Newport, 2016, pp. 158-159
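Weinberg's 20% figure above reduces to a simple formula. The sketch below is my own illustration of that arithmetic, not code from any of the studies: each task beyond the first loses a fifth of total attention to residue, and whatever remains is split evenly.

```python
def attention_per_task(n_tasks: int, overhead: float = 0.2) -> float:
    """Toy model of Weinberg's context-switching overhead.

    Each task beyond the first loses `overhead` (20%) of total
    attention to residue; the remainder is split evenly.
    """
    if n_tasks < 1:
        raise ValueError("need at least one task")
    lost = overhead * (n_tasks - 1)
    available = max(0.0, 1.0 - lost)
    return available / n_tasks

# Reproduces the figures in the list above:
#   1 task  -> 100% each,  0% lost to residue
#   2 tasks ->  40% each, 20% lost
#   3 tasks ->  20% each, 40% lost
```

By five simultaneous tasks the model gives each one just 4% of your attention, which is why the research above keeps circling back to single-tasking.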
Here are some suggestions for learning more about social media’s impact.
Cal Newport — Deep Work
Cal Newport — Digital Minimalism
Nicholas Carr — The Shallows: What the Internet Is Doing to Our Brains
Jaron Lanier — Ten Arguments For Deleting Your Social Media Accounts Right Now
Although I haven't read them, I've heard good things about these titles:
Nir Eyal — Indistractable: How to Control Your Attention and Choose Your Life
Nicholas Carr — The Glass Cage: How Our Computers Are Changing Us
Jean M. Twenge — iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy--and Completely Unprepared for Adulthood--and What That Means for the Rest of Us
Neil Postman — Amusing Ourselves to Death: Public Discourse in the Age of Show Business
Learn more about Stolen Focus on Amazon.