In Reality is Up for Grabs, I wrote about how people are opting into their own versions of reality, and the result is increasing fragmentation.
Many people long for a world pre-Twitter, where we all had a shared sense of reality, but that world isn’t coming back. We’ll discuss why in this piece.
How we got here
The invention of the printing press around 1440 opened a new era in information distribution. Thoughts and ideas could spread faster than word of mouth in the first form of broadcast media — a one-to-many dynamic.
Over time, the invention of newspapers, radio, and TV solidified this dynamic; they acted as highly centralized media aggregators — technologies that held society together. Everyone read the same newspapers, watched the same TV channels, and listened to basically the same radio stations, and this single voice came to be thought of as the source of truth.
Big newspapers or media properties didn’t exist in the early days of the American press. It was all decentralized and all local — no Walter Cronkite, no Don Lemon, no Rachel Maddow, nothing like that. Instead we heard many voices, mostly pseudonymous, since people were worried about getting arrested or killed for saying the wrong thing. Ben Franklin himself had 15 different pseudonyms, each with different points of view, eviscerating each other to sell newspapers. It sounds like a higher-stakes version of Twitter today.
We’re not used to this because we grew up in an era of peak centralization.
The book Infamous Scribblers chronicles the history of the news business and makes the case that the fragmentation we see in media today has been the norm throughout history, and the consolidation of the last 50 years was the aberration.
Everything used to be fractured and fragmented by default. Then came the telegraph, the telephone, mass manufacturing, public education, and more. We’re now returning to that earlier way of living, before peak centralization. Structurally, we have more in common with the 1800s than with the 1950s.
Explosion of Information
The internet, and more specifically social media, caused this explosion of available information by giving a voice to many people who weren't otherwise heard. In turn, this increased the number of voices in the public sphere, which also increased the number of truths and versions of reality someone could believe.
As a result, we see more distrust in society because we can’t seem to reconcile which version of reality is more “true” than another.
To put this explosion of information in perspective: In 2002, the world produced double the amount of information that had been produced throughout all history before that. In 2003, it doubled again. And instead of increased knowledge leading to greater understanding, we saw social and political chaos. Why?
As Martin Gurri mentioned on our podcast: “Suddenly this flood of information comes and you realize you knew…nothing!”
Media institutions had a monopoly on truth — they had legitimacy because they had authority, and they had authority because they had a monopoly on information.
This had been changing over time. The advent of talk radio, cable news, and CNN, along with the repeal of the fairness doctrine, put more voices in the media.
But it was social media that blew the whole thing open.
Even a couple decades ago, you didn't hear from that many people. It was a pretty big deal for someone to be on TV, and there weren’t that many voices contending for your attention. Even when Facebook started, it was only people on your campus. But once Facebook opened registration to everyone in 2006, suddenly all these people could speak who could never speak to each other before. Now we knew what everyone was thinking all the time, and it was *problematic*.
It’s the multiplicity of voices that causes distrust. If there's only one voice and it's broadcast, then no one can argue with it. But if there’s a cacophony of voices all arguing with each other, each telling their side of the story, it’s hard to know who to believe.
Before social media, a wide swath of the population was simply ignored by traditional print, audio, and visual media — those people never heard what each other thought. This was the original filter bubble.
Social media flattened the relationship between the elites and everyone else — it gave the public a voice. When JFK would give a speech, he was uncontested. Now, with everyone transitioning from passive reader to veritable journalist, every statement is challenged. A journalist, instead of being the sole arbiter of truth, is now just another person on social media. The public, not a group of elites, now determines who counts as an *actual* journalist.
Indeed, as Noah Smith put it: "We all wanted to give a voice to the voiceless and the marginalized. And they did! But don't be surprised when the first thing the voiceless and marginalized say is 'Fuck you.'"
Twitter is the real world
Martin Gurri has cataloged the different revolutions in information, detailing each one's social impacts. From writing, to the alphabet, to the printing press, to mass media, to social media — just as people were originally concerned that the printing press would enable larger scale propaganda, we still see similar concerns today.
Some would call this the "post-truth" era, but that would be misleading — there's more truth than ever. So much so that we don't really know how to process it, since many truths contradict each other. Instead, it feels more like a "post-monopoly-on-truth" era.
Think about it this way: today, burning books is considered abhorrent. Why? One reason is because we ultimately figured out how to process and utilize all the information they contained — so there’s hope we’ll do the same with social media, and we’ll look back at those who want to get rid of it in a similar way.
Put simply, the explosion of truth exposed the superficiality of existing media institutions, created a "public" that could challenge the elites directly, and created platforms where individuals could effectively become legible institutions and develop one-to-many relationships of their own.
Twitter is an interesting case study, because it’s basically eaten all of media.
People sometimes say "Twitter isn't the real world", meaning that activity on Twitter comes from just a small percentage of the population and doesn’t manifest in or influence how the broader population thinks. This might be technically true, but it's substantially false. Whoever you think influences the "real world" — journalists, media companies, educators — they're all strongly influenced by Twitter. Everything begins on Twitter and then branches out.
To understand this, a friend gave me the framework of the "OODA Loop": Observe, Orient, Decide, Act.
Observe - what's happening around me?
Orient - where am I relative to what's happening around me?
Decide - how should I respond given where I am and what's happening around me?
Act - respond.
Generally speaking, whoever runs the OODA loop the fastest wins — the idea being the faster you move through the loop, the harder it is for your opponent to understand reality, thus hurting their decision making ability. If you're running your loop faster than they are, you can sometimes act before they've been able to complete the loop — you'll be acting before they can orient, or deciding before they even finish observing.
So what does this have to do with media institutions?
Well, one of my theories is that the internet offers a hyper-accelerated OODA loop, and Twitter is the fastest loop on the entire internet. As a result, Twitter has the ability to disrupt all other forms of reality perception, which means whatever takes off on Twitter will also take off elsewhere (or at least influence what you see there).
TV & newspapers are too slow. By the time they've aired, their stories are already old news. Twitter is always more up to date. Legacy institutions can’t keep up with the speed of the internet.
This means the most important news will always spread faster on Twitter, but the opposite is also true — the most toxic behavior on Twitter can (and probably will) be mimicked elsewhere.
This is exactly why the stakes on Twitter are so high — what gets resolved on Twitter will influence what you see in the media, the education system, and elsewhere in society.
A Golden Age?
So that raises the question: was pre-internet journalism better than what we see today?
One argument is that pre-internet, journalists had a more reliable source of revenue, enabling them to do more investigative work, and the business model shift to online advertising meant they'd now have to produce popular pieces more frequently.
To that I'd say the same social media platforms that changed the business of journalism also changed our perceptions of it. Was past journalism actually better? Or was it the same quality, but we called it great out of ignorance?
I think about how coverage of JFK, FDR, and others would have held up under social media. Walter Duranty, for example, won a Pulitzer in 1932 for his reporting on the USSR, but we later learned he protected Stalin by denying the regime’s atrocities, including the famine in Ukraine. In the 90s there were calls to revoke his prize, but it was never rescinded.
How many other false stories pre-social media were never overturned?
So maybe we shouldn’t lament the end of peak centralization, particularly in media. The idea of one telephone company, two superpowers, three television stations, and four internet companies is too homogeneous. Fragmentation will lead to new cultures, new ways of thinking, new ideas, and ultimately more cultural innovation.
And for those who still miss peak centralization, the truth is that the internet ate media, and there’s no going back.
Writing of the week: ICYMI, Noah Smith interviewing Patrick Collison.
Watch of the week: Balaji on Tim Ferriss. As they say, this mf don’t miss.
Listen of the week: Geoff Shullenberger and Justin Murphy
Until next week,