Author: Matthew Syed
Full Title: Rebel Ideas: The Power of Diverse Thinking
This book is a staggeringly good read, summarized in two quotes:
1. Group wisdom emerges whenever information is dispersed among different minds.
2. Diversity, in a real sense, is the hidden engine of humanity.
One argument against diversity holds that people should be recruited on skill, not because they are different. It goes something like this: 'If I'm running a company, I want to hire the best people for the job, regardless of their race, ethnicity, religion, or sexual orientation. Aiming to appear diverse in order to seem woke hinders the performance of the company.'
Matthew addresses this argument at the beginning of the book.
This is a book about diversity, about the power of bringing people together who think differently from one another. At one level, this might seem like a curious objective. Surely, we should aim to think correctly or accurately, not differently. One should only wish to think differently from other people when they are in the wrong. When other people are right, thinking differently will only lead you into error. This seems like common sense. Another seemingly commonsensical statement was that made by Justice Scalia. He argued that recruiting people because they are different, in one way or another, is to jeopardize performance. You should hire people because they are smart, or knowledgeable or fast. Why would you hire people who are less knowledgeable, fast or talented, just because they are different?
The reason, by the way, is that people who are different think differently.
When I say 'different', I, like the author, am speaking about cognitive and demographic diversity: people who hold varying views, beliefs and life experiences.
Demographic diversity (differences in race, gender, age, class, sexual orientation, religion and so on) can, in certain circumstances, increase group wisdom. Teams that are diverse in personal experiences tend to have a richer, more nuanced understanding of their fellow human beings. They have a wider array of perspectives – fewer blind spots. They bridge between frames of reference.
If we only surround ourselves with people who think like us, our life becomes an echo chamber; we only hear the viewpoint we already believe. Our life becomes a living lesson in the power of confirmation bias. This fact is one social media companies are optimizing their algorithms around.
Algorithm manipulation was touched on in the book, but it is too important not to explore further.
Social media companies, by design, keep us on their services for as long as possible. Their algorithms optimize for one thing: getting people onto the platform and keeping them there.
In Stuart J. Russell's book, Human Compatible, he explains how they accomplish this (paraphrased for simplicity).
There are two ways an algorithm can achieve its goal.
1. Become better at predicting what the user wants.
2. Manipulate the user's preferences to make them more predictable.
As it turns out, these algorithms have been focusing primarily on the second option.
The algorithms selecting content to feed you will display information that either matches your current beliefs or, over time, drip-feeds you enough news to push you one way or another. This may provide one answer as to why people are adopting more extremist views. Being out on the edges of a political spectrum makes you easier to predict, whereas if you're in the middle, you can swing either way - it's not as clear what your beliefs are or how you will act.
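This drip-feed dynamic can be made concrete with a toy simulation. Everything below is hypothetical: the `click_prob` function, the `nudge` and `drift` parameters, and the opinion scale are my own illustrative inventions, not anything from the book or a real recommender system. The sketch shows how an algorithm that always serves content slightly more extreme than the user's current view can gradually shift that view toward the edge, where clicks become far easier to predict:

```python
import random

random.seed(0)

def click_prob(opinion, item_stance):
    # Toy model: users are likelier to click content close to their current view.
    return max(0.0, 1.0 - abs(opinion - item_stance))

def drip_feed(opinion, steps=200, nudge=0.05, drift=0.3):
    """Serve items slightly more extreme than the user's current view.
    Each click pulls the view a little toward the item served."""
    for _ in range(steps):
        item = min(1.0, opinion + nudge)   # a bit further toward the edge
        if random.random() < click_prob(opinion, item):
            opinion += drift * (item - opinion)
    return opinion

start = 0.1               # a mildly partisan user on a -1..1 opinion scale
end = drip_feed(start)

# "Predictability" here: how likely is this user to click fully partisan content?
print(f"opinion drifted from {start:.2f} to {end:.2f}")
print(f"p(click extreme content): {click_prob(start, 1.0):.2f} -> {click_prob(end, 1.0):.2f}")
```

In this toy world the algorithm never needs to get better at modelling the user; it simply makes the user easier to model, which is exactly Russell's second option.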
Social media is one arena, but what about Google? Google is neutral, right? It just answers my questions, doesn't it? Unfortunately, that's not the case. Everything you type into Google, and everything you click on, is logged. Google then uses your search and click history to show the results it deems most relevant to you. Google is not a neutral arbiter; your experience on the platform is based on your previous interactions with it.
This all makes for a rather one-dimensional experience online. It would be like living with your family your whole life, never venturing outside your hometown. Next time you see a triggering article in your feed, ask yourself: why am I being shown this? What conclusions does this information lead me towards? And, perhaps most importantly, are they true?
An echo chamber refers to beliefs being amplified and confirmed inside a closed system, away from any rebuttal. Theories are reinforced inside them without any exposure to opposing viewpoints. They are dangerously easy to fall into, as we have explored. The result of residing inside an echo chamber is confirmation bias, whereby a person interprets information in a way that supports their current beliefs.
Here are a couple of my favourite quotes to help stop yourself from falling victim to echo-chamber thinking.
Does a community’s belief system actively undermine the trustworthiness of any outsiders who don’t subscribe to its central dogmas? Then it’s probably an echo chamber. - C. Thi Nguyen
I never allow myself to hold an opinion on anything that I don't know the other side's argument better than they do. - Charlie Munger
Here is an example from the book of an echo chamber in reality.
The mathematician Emma Pierson analysed how the troubles of Ferguson, Missouri were covered on social media in 2014, after a white police officer called Darren Wilson shot and killed a black man, Michael Brown. She found two distinct clusters. ‘Blue tweets’ expressed horror at Brown’s death and criticised the oppressive police response, while the ‘red tweets’ argued that the policeman was being scapegoated and the protesters were looters. As Pierson puts it:
The red group says they would feel safer meeting Darren Wilson than Michael Brown, and says that Brown was armed when he was shot; the blue group sarcastically contrasts Darren Wilson with the unarmed Michael Brown. The red group talks about mob justice and race baiting; the blue group talks about breaking the system. The red group blames Obama for exacerbating tensions and forcing the Missouri governor into declaring a state of emergency; the blue group says the state of emergency must not be used to violate human rights.
Perhaps most tellingly of all, these two groups had virtually no interaction with each other. They were only seeing tweets from people who agreed with them, a demonstration of how the segmentary dynamics of the Internet can filter information. ‘When it comes to Ferguson, two groups with very different political and racial backgrounds ignore each other,’ Pierson writes. ‘This seems likely to cause problems, and in fact it does. For one thing, the two groups think drastically different things.’
Social media manipulation and echo chambers each lead to perspective blindness.
'Perspective blindness' refers to the fact that we are oblivious to our own blind spots. We perceive and interpret the world through frames of reference but we do not see the frames of reference themselves. This, in turn, means that we tend to underestimate the extent to which we can learn from people with different points of view.
Our modes of thought are so habitual that we scarcely notice how they filter our perception of reality. The danger arises when we overlook the fact that in most areas of life there are other people, with different ways of looking at things, who might deepen our own understanding, just as we might deepen theirs.
We prefer people who share our views and values, at least on serious topics.
Indeed, evidence from brain scanners indicates that when others reflect our own thoughts back to us, it stimulates the pleasure centres of our brains.
Think how comforting it is to be surrounded by people who think in the same way, who mirror our perspectives, who confirm our prejudices. It makes us feel smarter. It validates our world view. Homophily is somewhat like a hidden gravitational force, dragging human groups towards one corner of the problem space.
But this cuts to the essence of the problem: when smart people from a singular background are placed into a decision-making group, they are liable to become collectively blind. As King and Crewe put it: 'Everyone projects onto others his or her lifestyles, preferences and attitudes. Some do it all the time; most of us do it some of the time.'
I quoted this in the first key idea but it needs repeating for context.
Another seemingly commonsensical statement was that made by Justice Scalia. He argued that recruiting people because they are different, in one way or another, is to jeopardize performance. You should hire people because they are smart, or knowledgeable or fast. Why would you hire people who are less knowledgeable, fast or talented, just because they are different?
Now comes time to address the answer in detail from a study led by Jack Soll, a psychologist at Duke University.
[Jack Soll] and his colleagues analysed 28,000 forecasts by professional economists. Their first finding was not at all surprising. Some economists are better performers than others. Indeed, the top forecaster was around 5 per cent more accurate than an average forecaster.
But then Soll added a twist. Instead of looking at individual predictions, he took the average prediction of the top six economists. To stretch language a little, these forecasters were being placed into a team. The average of their predictions is what you might call their collective judgement. Soll then checked if this prediction was more accurate than the top-ranked economist.
Now, in a simple task, the answer must be 'no'. In a running race, the average time of six runners has to be slower than the time of the fastest runner. This is what Justice Scalia (quoted above) had in mind when he argued for a trade-off between diversity and excellence. But this analysis flips when we move from simple to complex problems. Indeed, when Soll compared the collective judgement of six economists with the judgement of the top economist, it was not less accurate, it was more accurate. And not just a little more accurate but 15 per cent more accurate. This is a staggering difference – so large, in fact, that it shocked the researchers.
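The arithmetic behind Soll's result can be sketched with a small Monte Carlo simulation. The numbers below are invented for illustration (the book reports only the accuracy gap, not any model parameters): each simulated 'economist' has a different systematic bias, standing in for a different incomplete model, plus individual noise. Averaging cancels much of the bias and shrinks the noise, so the team beats even its best member:

```python
import random
import statistics

random.seed(42)

TRUTH = 2.0      # the "true" value being forecast, e.g. GDP growth in per cent
N_ROUNDS = 2000  # number of simulated forecasting rounds

# Six hypothetical forecasters. Each has a different systematic bias
# (a different, incomplete model of the economy) plus individual noise.
BIASES = [0.5, -0.4, 0.3, -0.6, 0.2, -0.3]
NOISE = 0.3

sq_err = [0.0] * len(BIASES)   # per-forecaster accumulated squared error
sq_err_avg = 0.0               # squared error of the group-average forecast

for _ in range(N_ROUNDS):
    preds = [TRUTH + b + random.gauss(0, NOISE) for b in BIASES]
    for i, p in enumerate(preds):
        sq_err[i] += (p - TRUTH) ** 2
    sq_err_avg += (statistics.mean(preds) - TRUTH) ** 2

individual_rmse = [(e / N_ROUNDS) ** 0.5 for e in sq_err]
ensemble_rmse = (sq_err_avg / N_ROUNDS) ** 0.5
best_rmse = min(individual_rmse)

print(f"best individual RMSE: {best_rmse:.3f}")
print(f"team-average RMSE:    {ensemble_rmse:.3f}")
```

Note the two ingredients: the biases must differ from one another, and they must roughly bracket the truth. Six clones of the best forecaster would share one bias, and averaging them would gain almost nothing.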
This is because individuals have different frames of reference, which when combined "create a more comprehensive picture", as Matthew goes on to explain.
It turns out that economic forecasters have frames of reference, too. These are sometimes called models. A model is a way of making sense of the world: a perspective, a point of view, often expressed as a set of equations. No economic model is complete, however. Each model contains blind spots. The economy is complex (unlike, say, the orbit of Jupiter, which can be precisely predicted). The rate of industrial production, for example, hinges on the decisions of thousands of businesspeople, operating tens of thousands of factories and firms, and influenced by millions of variables. No model can account for all this complexity. No economist is omniscient.
In short: Group wisdom emerges whenever information is dispersed among different minds.
Suppose you identified the fastest runner in the world. Let us call him Usain Bolt. Suppose, too, you could clone this runner. If you were putting together a relay team of, say, six runners, your team of Usain Bolts would smash the opposition (assuming they pass the baton effectively). Every single one would be faster than anybody in any other team.
When it comes to simple tasks, diversity is a distraction; you just want to hire people who are smart, fast, knowledgeable, whatever. When it comes to complex problems, however, things are not just different but the polar opposite.
If the individuals in the group don’t know much then combining their judgements won’t achieve much. If you ask a group of laypeople to estimate how much ocean levels will rise over the next decade you won’t get very far. To achieve group wisdom, you need wise individuals. But you also need diverse individuals, otherwise they will share the same blind spots.
Let us return to economic forecasting. Suppose you could identify and clone the most accurate forecaster in the world. If you were putting together a team of six forecasters, would it make sense to put six of these clones together? On the surface, the team sounds unbeatable. Each member is more accurate than any forecaster in any other team. Isn't this the perfect team?
We can now see that the answer is an emphatic 'no'! They all think in the same way. They use the same model, and make the same mistakes. Their frames of reference overlap. Indeed, the Soll experiment implies that a diverse group of six forecasters, while individually less impressive, would be 15 per cent more accurate.
Ask yourself this question: suppose that you put together a team of ten people to come up with ideas to solve, say, the obesity crisis. Suppose, too, that each of these ten people comes up with ten useful ideas. How many useful ideas do you have in total?
In fact, this is a trick question. You can't infer the number of ideas in a group from the number of ideas of its members. If these people are clone-like and come up with the same ten ideas, you have only ten ideas overall. But if the ten people are diverse, and come up with different ideas, you could have one hundred useful ideas. That is not 50 per cent more ideas, or 100 per cent more ideas, but almost 1,000 per cent more ideas. This is another huge effect solely attributable to diversity.
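The counting argument above is just a set-union calculation. In the sketch below the numbered 'ideas' are placeholders for illustration: duplicates vanish when you pool them, so clone-like teams cap out at one member's worth of ideas while a fully diverse team pools everything:

```python
# Ten clone-like members: everyone generates the same ten ideas.
clone_team = [set(range(10)) for _ in range(10)]

# Ten diverse members: each generates ten ideas nobody else has.
diverse_team = [set(range(10 * i, 10 * i + 10)) for i in range(10)]

def useful_ideas(team):
    """Count the distinct ideas a team holds collectively."""
    pooled = set()
    for member_ideas in team:
        pooled |= member_ideas          # set union discards duplicates
    return len(pooled)

print(useful_ideas(clone_team))    # 10  - ten people, still only ten ideas
print(useful_ideas(diverse_team))  # 100 - ten people, a hundred ideas
```

Real teams sit somewhere between these extremes, but the point stands: the group's idea count depends on overlap, not just on headcount.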
But this again reveals why diversity matters. With homogenous groups people tend to get stuck in the same place. Diverse teams, on the other hand, come up with fresh insights, helping them to become unstuck. Rebel ideas are effectively firing the collective imagination. As the leading psychologist Charlan Nemeth puts it: 'Minority viewpoints are important, not because they tend to prevail but because they stimulate divergent attention and thought. As a result, even when they are wrong they contribute to the detection of novel solutions that, on balance, are qualitatively better.'
Clone fallacy: collective intelligence emerges not only from individuals' knowledge but also from the differences between them.
When Justice Scalia argued that there was a trade-off between performance and diversity, he was making a seductive conceptual error. It is the same error that leads most people to express surprise when told that the average of six forecasters is more accurate than the top forecaster, and that deludes people into thinking that a group of wise individuals must constitute a wise group. Scalia was, in effect, looking at the problem from the individualistic perspective, not the holistic perspective. He didn't take account of the fact that collective intelligence emerges not just from the knowledge of individuals, but also from the differences between them. Let us call this the 'clone fallacy'.
The tragedy is that this fallacy is pervasive. Indeed, perhaps the most striking conversation I had while researching this book was with a renowned economic forecaster. I asked if he preferred to work with people who think in the same way, or who think differently. He replied: 'If I truly think my model is the best one out there, then I should work with people who think like me.' This logic is highly compelling. It is also spectacularly wrong.
Due to 'knowledge clustering', hiring the top students from a single university results in clone-like thinking.
Suppose that some universities have a strong reputation for, say, software development. These universities are likely to attract the smartest software students. These students, in turn, will graduate with the most impressive credentials. Now, suppose you are running a top software company. Wouldn't you want these students? Wouldn't you want to pack your organisation with the brightest and the best?
The enlightened answer is 'no'. These graduates will have studied under the same professors and absorbed similar insights, ideas, heuristics and models, and perhaps world views, too. This is sometimes called 'knowledge clustering'. By selecting graduates in a meritocratic way, organisations can find themselves gravitating towards clone-like teams. This is not to dismiss meritocracy. It is merely to point out that collective intelligence requires both ability and diversity.
Successful teams are diverse, but not arbitrarily diverse.
Have you ever had an argument, online or in person, where the opposing party begins attacking you rather than your argument?
It turns out there's a term for this: 'ad hominem'.
Philosophers have a particular term for serial assaults on personal integrity. The 'ad hominem' is defined by one reference source as 'a fallacious argumentative strategy whereby genuine discussion of the topic at hand is avoided by instead attacking the character, motive, or other attribute of the person making the argument, or persons associated with the argument, rather than attacking the substance of the argument itself.'
The dubious power of the ad hominem has been revealed through extensive psychological research. One recent paper published in the Public Library of Science surveyed 39 college students and 199 adults. The researchers found that attacking someone's character undermines faith in their conclusions as powerfully as identifying actual evidence questioning the basis of those claims. Playing the person rather than the ball works.
Ad hominem offers an insight into the polarization we see online daily: after all, why form a coherent argument when you can simply hurl insults?
Brainwriting is similar to brainstorming; only the ideas are not spoken aloud. Instead, they are written on cards and stuck on a wall for group members to vote on.
Brainwriting is effective because it takes unconscious bias off the table. The sources of the ideas are hidden, so ideas are credited on merit alone.
Brainwriting, like brainstorming, is a way of generating creative ideas, but instead of stating the ideas out loud, team members are asked to write them down on cards, which are then posted on a wall for the rest of the group to vote on. 'This means that everyone gets a chance to contribute,' Leigh Thompson of the Kellogg School of Management told me. 'It means that you gain access to the output of every brain, rather than just one or two.'
Thompson suggests that brainwriting should have just one rule: nobody is allowed to identify themselves with their written contributions. The marketing director should not offer a 'tell' by referring on the card to a client associated with him. 'This is crucial,' Thompson says. 'By anonymising the contributions, you separate the idea from the status of the person who came up with it. This creates a meritocracy of ideas. People vote on the quality of the proposal, rather than the seniority of the person who suggested it, or to curry favour. It changes the dynamic.'
When brainwriting is put head to head with brainstorming, it generates twice the volume of ideas, and also produces higher quality ideas when rated by independent assessors. The reason is simple. Brainwriting liberates diversity from the constraints of dominance dynamics.
[Unconscious bias] refers to the way that people are denied opportunities not because of a lack of talent or potential but because of arbitrary factors such as race or gender.
Perhaps the most intuitive example of unconscious bias emerged in the 1970s. At that point, orchestras in the United States (and elsewhere) were dominated by men. The reason is simple: those who conducted the auditions thought that men were, typically, better musicians. This was a meritocracy, they insisted. Men were said to be more accomplished pianists, violinists, etc.
But Claudia Goldin of Harvard and Cecilia Rouse of Princeton had an idea: why not conduct auditions behind screens? This meant that the selection panels could hear the music, and assess its quality, but could not see the gender of the musicians playing it. When these screens were introduced, women's chances of making it through the first round increased by 50 per cent, and in the final rounds by 300 per cent. Female players in major orchestras have since increased from 5 per cent to nearly 40 per cent.
What is fascinating is that the recruiters didn't realise they were discriminating against women until the screen was introduced. Only then could they see that they had been assessing candidates not merely on skill, but through the filter of stereotypes about what a good musician ought to look like. Eliminating bias was not just good for female musicians, but also for orchestras. They were recruiting talent regardless of what it looked like.
Unconscious bias tends not to manifest itself when the differences between candidates are obvious. After all, why would an employer deliberately choose an inferior performer? This would harm the organisation itself. It is only when candidates are similar in ability, when the recruiter has what psychologists call 'discretionary space', that unconscious bias takes on greater significance.
And this hints at what has become known as structural bias: the way that the legacy of historical injustice, unconscious discrimination and skewed incentives can harden into concrete barriers for certain sections of the population.
'Homophily’: people tend to hire people who look and think like themselves.
It is validating to be surrounded by people who share one’s perspectives, assumptions and beliefs. As the old saying goes, birds of a feather flock together. In their meticulous study of the CIA, Milo Jones and Philippe Silberzahn write: ‘The first consistent attribute of the CIA’s identity and culture from 1947 to 2001 is homogeneity of its personnel in terms of race, sex, ethnicity and class background (relative both to the rest of America and to the world as a whole).’
Here is the finding of an inspector general’s study on recruitment:
In 1964, the Office of National Estimates [a part of the CIA] had no black, Jewish, or women professionals, and only a few Catholics . . . In 1967, it was revealed that there were fewer than 20 African Americans among the approximately 12,000 non-clerical CIA employees. According to a former CIA case officer and recruiter, the agency was not hiring African Americans, Latinos, or other minorities in the 1960s, a habit that continued through the 1980s . . . Until 1975, the IC [the United States Intelligence Community] openly barred the employment of homosexuals.
This idea that there is a trade-off between excellence and diversity has a long tradition. In the United States, it formed the basis of a seminal argument by Justice Antonin Scalia for the Supreme Court. You can either choose diversity, he contended, or you can choose to be ‘super-duper’. If a diverse workforce, student population, or whatever, emerges organically through the pursuit of excellence, that is one thing. But to privilege diversity above excellence is different. And it is likely to undermine the very objectives that inspired it.
But by focusing on individuals, there has been a tendency to overlook what we might call the ‘holistic perspective’. A good way to understand the difference is to consider a colony of ants. A naive entomologist might seek to understand the colony by examining the ants within the colony. Individual ants, after all, deploy a vast range of behaviours, such as collecting leaves, marching, etc. They are busy and fascinating creatures. And yet you could spend a year, indeed a lifetime, examining individuals and learn virtually nothing of the colony. Why? Because the interesting thing about ants is not the parts but the whole. Instead of zooming in on individual ants, the only way to understand the colony is to zoom out. One step removed, you can comprehend the colony as a coherent organism, capable of solving complex problems such as building sophisticated homes and finding sources of food. An ant colony is an emergent system. The whole is greater than the sum of its parts.
Pretty much all the most challenging work today is undertaken in groups for a simple reason: problems are too complex for any one person to tackle alone. The number of papers written by individual authors has declined year by year in almost all areas of academia. In science and engineering, 90 per cent of papers are written by teams. In medical research, collaborations outnumber individual papers by three to one.
Cognitive diversity was not so important a few hundred years ago, because the problems we faced tended to be linear, or simple, or separable, or all three. A physicist who can accurately predict the position of the moon doesn’t need a different opinion to help her do her job. She is already bang on the money. Any other opinion is false. This goes back to our common-sense intuition. Thinking differently is a distraction. With complex problems, however, this logic flips. Groups that contain diverse views have a huge, often decisive, advantage.
The critical point is that solutions to complex problems typically rely on multiple layers of insight and therefore require multiple points of view. The great American academic Philip Tetlock puts it this way: ‘The more diverse the perspectives, the wider the range of potentially viable solutions a collection of problem solvers can find.’ The trick is to find people with different perspectives that usefully impinge on the problem at hand.
Perspective blindness was the subject of David Foster Wallace’s address to Kenyon College in 2005, rated by Time magazine as one of the greatest commencement speeches ever recorded. The speech starts in a fish tank. ‘There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says, “Morning, boys. How’s the water?” And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes, “What the hell is water?” ’
Wallace’s point is our modes of thought are so habitual that we scarcely notice how they filter our perception of reality. The danger arises when we overlook the fact that in most areas of life there are other people, with different ways of looking at things, who might deepen our own understanding, just as we might deepen theirs. John Cleese, the British comedian, put it this way: ‘Everybody has theories. The dangerous people are those who are not aware of their own theories. That is, the theories on which they operate are largely unconscious.’
A study by Professor Chad Sparber, an American economist, found that an increase in racial diversity of one standard deviation increased productivity by more than 25 per cent in legal services, health services and finance. A McKinsey analysis of companies in Germany and the United Kingdom found that return on equity was 66 per cent higher for firms with executive teams in the top quartile for gender and ethnic diversity than for those in the bottom quartile. For the United States, the return on equity was 100 per cent higher.
When you are surrounded by similar people, you are not just likely to share each other’s blind spots, but to reinforce them. This is sometimes called ‘mirroring’. Encircled by people who reflect your picture of reality, and whose picture you reflect back to them, it is easy to become ever more confident of judgements that are incomplete, or downright wrong. Certainty becomes inversely correlated with accuracy.
In a study led by Katherine Phillips, Professor at Columbia Business School, for example, teams were given the task of solving a murder mystery. They were given plenty of complex material, comprising alibis, witness statements, lists of suspects and the like. In half the cases, the groups tasked with solving the problem were composed of four friends. The other half were composed of three friends and a stranger – an outsider, someone from beyond their social milieu, with a different perspective. Given what we have learned so far, it should come as no surprise that the teams with an outsider performed better. Much better. They got the right answer 75 per cent of the time, compared with 54 per cent for a homogenous group, and 44 per cent for individuals working alone.
Diversity contributes to collective intelligence, then, but only when it is relevant. The key is to find people with perspectives that are both germane and synergistic.
Perhaps the most important point is the generalised significance of diversity. Diversity isn’t some optional add-on. It isn’t the icing on the cake. Rather, it is the basic ingredient of collective intelligence. You can see the power of diversity from a broader perspective, too. Diversity explains why price systems work so effectively, and why open-source innovation platforms and wikis have become pervasive. These all share the same underlying signature: they aggregate the disparate information contained in different minds.
Innovation is not just about creativity, it is also about connections.
The science of recombination presents us with a compelling vision. Innovation is about breaking down walls. Some walls are good, of course. Most of us value privacy. Most companies need to protect intellectual property. Most institutions need specialists who, in turn, need space to do their job. But we often get the balance wrong, leaning towards insularity, not because we don’t value the insights of people who think differently from ourselves, but because we underestimate their significance. This is another aspect of homophily. We are comfortable in our own silos, our own categories, our own conceptual milieus.
Other studies led by Seth Flaxman of Oxford and the Pew Research Center offer a different lens on the digital world. These find that when you look at overall Internet use, digital users have higher average exposure to the views of their own side, but nevertheless get to see the views of opponents, too. Perhaps that is not surprising. Even in the clan systems that emerged after the agricultural revolution, the various in-groups were not completely shut off from each other.
But what is fascinating – and broadly acknowledged by almost all scholars – is what happens when exposure does take place. Now, you might have thought that by hearing the views of opponents, and seeing the evidence from the other side, opinions would become less extreme. Views would become more nuanced. In fact, the opposite happens. People become more polarised. In Pierson’s study, for example, the limited interaction between red and blue tweeters was explosive. She writes:
When the red and blue group did talk, it often wasn’t pretty. Consider the things said by members of the red group to one of the most influential members of the blue group – DeRay Mckesson, a school administrator who has played a central role in organizing protests. They described him as a ‘commie boy’ who spread hate . . . saw ‘value in racist drivel’, was armed with ‘guns and Molotov cocktails’, and should get his ‘meds adjusted’.
A study led by Christopher Bail of Duke University found a similar pattern. He recruited 800 Twitter users to follow a bot that re-tweeted the views of high-profile people from across the political spectrum. What happened? Far from becoming more balanced, the Twitter users became more polarised. This was particularly true for Republicans, who became more conservative. It was as if exposure to different views confirmed their prior convictions.
But when one is seeking to become informed on complex subjects such as politics, echo chambers are inherently distorting. By getting their news from Facebook and other platforms, where friends tend to share cultural and political leanings, people are more exposed to those who agree with them, and to evidence that supports their views. They are less exposed to opposing perspectives. The dynamics of fine-sorting can be magnified by a subtler phenomenon: the so-called filter bubble. This is where various algorithms, such as those inside Google, invisibly personalise our searches, giving us more of what we already believe, and further limiting our access to diverse viewpoints.
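The feedback loop behind the filter bubble can be sketched in a few lines of code. This is a toy simulation, not any real platform's algorithm: a hypothetical recommender serves more of whatever a user has previously clicked, and the user clicks what matches their existing lean, so exposure narrows over time.

```python
import random

def recommend(click_history, n_items=10):
    """Toy personalisation: the more 'red' items a user has clicked,
    the more 'red' items the feed serves (and vice versa)."""
    reds = sum(1 for c in click_history if c == "red")
    p_red = (reds + 1) / (len(click_history) + 2)  # Laplace-smoothed preference
    return ["red" if random.random() < p_red else "blue" for _ in range(n_items)]

def simulate(days=50, seed=0):
    random.seed(seed)
    history = ["red"]  # a single initial click on a 'red' item
    for _ in range(days):
        feed = recommend(history)
        # the user clicks only items matching their existing lean
        clicks = [item for item in feed if item == "red"]
        history.extend(clicks or feed[:1])
    return sum(1 for c in history if c == "red") / len(history)

share_red = simulate()
print(f"Share of 'red' items seen after 50 days: {share_red:.2f}")
```

One stray click is enough: because clicks feed the preference estimate, and the feed shapes future clicks, the loop converges on near-total one-sidedness without anyone choosing it.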
To understand what is going on, and to fully glimpse the internal logic of echo chambers, we need to draw a subtle distinction between echo chambers and information bubbles. As the philosopher C. Thi Nguyen notes, information bubbles are the most extreme form of isolation, where people on the inside see only their side of the argument and nothing else. These kinds of social groups have rarely existed in modern history except in cults and other ‘walled institutions’. Echo chambers, Nguyen argues, are different. They may cut some people off from alternative views through informational filtering (research by digital scholars Elizabeth Dubois and Grant Blank found that 8 per cent of people in the UK have such biased media exposure that they experience a distorted version of reality) but their distinctive feature is that they have not one filter, but two.
What is the second filter? Call it an epistemic wall: outside sources are not merely screened out, but actively discredited.
Now we can begin to glimpse the subtly different properties of information bubbles and echo chambers. With the former, informational borders are hermetically sealed. People on the inside only hear people who are co-inhabitants of the bubble. This creates distortions, but it also confers fragility. The moment a member of the in-group is confronted with outsider opinions, they are likely to question their beliefs. The way to burst an information bubble, then, is through exposure. This is why cults go to such lengths to deny insiders access to different voices.
Echo chambers, with their additional filter, have fundamentally different properties. People on the inside hear more opinions from the in-group, but these views tend to become stronger when exposed to opposing opinions. Why? Consider Nguyen's central example, the conservative talk-radio host Rush Limbaugh. The more opponents attack Limbaugh, the more they point to the errors in his opinions, the more it confirms, for his listeners, the conspiracy against him. Opponents are not offering new insights, but fake news. Each piece of evidence against Limbaugh is a new brick in the wall separating the in-group from outsiders. As Nguyen puts it:
What’s happening is a kind of intellectual judo, in which the power and enthusiasm of contrary voices are turned against those contrary voices through a carefully rigged internal structure of belief. Limbaugh’s followers read – but do not accept – mainstream and liberal news sources. They are isolated, not by selective exposure, but by changes in who they accept as authorities, experts and trusted sources. They hear, but dismiss, outside voices.
It is this epistemic vulnerability that echo chambers exploit. By systematically undermining trust in alternative views, by defaming those who offer different insights and perspectives, they introduce a filter that distorts the belief-formation process itself. Alternative views are dismissed not after consideration, but upon contact. Facts are rejected even as they are offered. Perspectives and evidence are repelled, like the matching poles of two magnets. Nguyen writes: ‘Echo chambers operate as a kind of social parasite on our vulnerability . . . An information bubble is when you don’t hear people from the other side. An echo chamber is what happens when you don’t trust people from the other side.’
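Nguyen's distinction can be made concrete with a toy belief-update model (an illustrative assumption of mine, not a formal model from the book). In a bubble, out-group messages simply never arrive; when outsiders are trusted, their counter-evidence moderates belief; in an echo chamber the same messages arrive but carry negative trust, so each contrary message pushes belief further in.

```python
def update(belief, message, trust):
    """Move belief toward (trust > 0) or away from (trust < 0) a message.
    belief and message lie in [-1, 1]; trust lies in [-1, 1]."""
    return max(-1.0, min(1.0, belief + 0.1 * trust * (message - belief)))

def run(messages, trust_in, trust_out, belief=0.5):
    for source, msg in messages:
        t = trust_in if source == "in" else trust_out
        belief = update(belief, msg, t)
    return belief

# Mixed feed: in-group agreement (+1) alternating with out-group counter-evidence (-1).
feed = [("in", 1.0), ("out", -1.0)] * 20

bubble  = run([m for m in feed if m[0] == "in"], trust_in=1.0, trust_out=0.0)
exposed = run(feed, trust_in=1.0, trust_out=1.0)   # outsiders heard and trusted
echo    = run(feed, trust_in=1.0, trust_out=-1.0)  # outsiders heard but distrusted

print(f"bubble:  {bubble:.2f}")   # drifts toward the in-group view
print(f"exposed: {exposed:.2f}")  # moderated by trusted contrary evidence
print(f"echo:    {echo:.2f}")     # contrary evidence backfires
```

Under these assumptions the echo-chamber member ends up more extreme than the bubble member, despite hearing twice as much contrary evidence: distrust converts exposure into reinforcement, which is exactly the ‘intellectual judo’ Nguyen describes.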
Humans are smart because we have evolved to connect with other brains. This is why, in comparative experiments, a human child of nine or ten can beat any other primate of any age on pretty much any cognitive task involving social learning. The body of knowledge absorbed from adults equips the brain with outsized power.
Our species is the most formidable on the planet not because we are individually formidable, but because we are collectively diverse. By bringing different insights together, by connecting within and across generations, by recombining rebel ideas, we have created innovations of a quite breathtaking kind. It is our sociality that drove our smartness, not the other way around.
Diversity is not merely the ingredient that drives the collective intelligence of human groups, it is also the hidden ingredient that has driven the unique evolutionary pathway of our species. It is, to quote Henrich, the secret of our success.
People who give are able to construct more diverse networks. They have a wider variety of dormant ties. They have access to a greater number of rebel ideas. By giving in the past, givers enjoy greater scope to reach out for ideas when it matters. As one executive put it: ‘Before contacting them, I thought that they would not have too much to provide beyond what I had already thought, but I was proved wrong. I was very surprised by the fresh ideas.’
As Grant writes: ‘According to conventional wisdom, highly successful people have three things in common: motivation, ability and opportunity . . . but there is a fourth ingredient: success depends heavily on how we approach our interactions with other people. Do we try to claim as much value for ourselves as we can, or do we contribute value . . .? It turns out that these choices have staggering consequences for success.’
Cultures that encourage new ideas, foster dissent and have strong networks through which rebel ideas can flow, innovate faster than those held back by cultures of intellectual conformity. As Henrich has put it:
Once we understand the importance of collective brains, we begin to see why modern societies vary in their innovativeness. It’s not the smartness of the individuals . . . It’s the willingness and ability of large numbers of individuals at the knowledge frontier to freely interact, exchange views, disagree, learn from each other, build collaborations, trust strangers, and be wrong. Innovation does not take a genius or a village; it takes a big network of freely interacting minds.