The Open Mind explores the world of ideas across politics, media, science, technology, and the arts. The American Prospect is republishing this edited excerpt.
Heffner: What are you most concerned about as it relates to the election, in terms of combating disinformation right now?
Aral: This is probably the most consequential election of our generation, if not of the last hundred years, in the United States. I was speaking to Maria Ressa, Time’s Person of the Year in 2018, and she said it’s actually probably the most consequential election for the world in a long time.
We have foreign governments interfering as we speak, primarily Russia, but also, for the first time, China and Iran. We haven’t done much since the 2016 election to deal with that. We also have a tremendous rise in affective political polarization in the United States. We see it in terms of animosity between the parties. We also see it on the streets of the United States these days.
The book [The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health--and How We Must Adapt] really covers the hype machine or social media's role, both in terms of the spread of false news, which we've done a lot of the research on, as well as foreign interference, as well as its contribution to the polarization of society, all three of which are tremendously important in this election.
Heffner: I recall in the 2016 cycle, when there were disinformation websites that were being indexed and you would do a Google query, and, in that first page, you might find something that's dis- or misinformation. Google has improved since ‘16, but how much has it improved?
Aral: It's not sufficient. It's a really good first step. It's much better than sharing without thinking, which is what a lot of people do today. Now the list of fake news websites is long, and there are many of them that are essentially written to look like real news websites, especially the URLs and so on.
But just being a little bit reflective and doing some Google searching is an important first step. I also think the point your PSAs make, that false news is emotionally charged, that it typically involves all caps and sometimes typos and so on, is also important. It inspires anger and it inspires surprise. It’s shocking, it’s salacious, and if it’s hyping up your emotions, that’s a reason to check yourself and to think about whether this is true or false.
Heffner: Of course, the social media landscape (from which we will separate Google as a search engine) is totally different. Today we can expect to see the top 10 shared things on Facebook, if not Twitter as well, and possibly YouTube, which we should note is owned by Google. So YouTube and Google are part of this equation.
But when it comes to Facebook and Twitter, what is being reported by those who cover disinformation on these platforms every day is that of the top 10 shared posts, a majority are not just hyperpartisan news; they are actually debunked myths. A majority of the top shared items on Facebook, if not Twitter as well, are big fabrications.
Aral: The point of the book is to go into the science behind all of this and to kind of reveal how it works, but also what's true and what's false about it. What we realize when we dig into the science is that the business models of the platforms are designed to create short-term engagement.
So salacious, shocking news travels farther, faster, deeper, and more broadly than the truth, primarily because the algorithms’ objective functions, which are designed to maximize short-term engagement, and the human brain’s susceptibility to that which is salacious and shocking combine to create the spread of falsity online.
It's important to note that the debunking is never as fast or as broad or as deep as the falsity itself—even if it's debunked, the debunking never catches up to the falsity.
Heffner: What is the thesis for how we can adapt? I mean, that is part of the subtitle of your book, and adapting to an online culture that favors falsehood is not what we need to do. Isn’t it really these companies that have been challenged to adapt? They were challenged before 2016 and in the four years since, and they’ve failed.
Aral: Yes. So the book goes into what's under the hood of the hype machine, the social media industrial complex and how it works. The last chapter, the longest chapter in the book, describes in detail what we have to do to achieve the promise of social media and avoid the peril.
It really hinges on four levers: money, code, norms, and laws. Money is the business models of the platforms, which set up the incentives for how they behave and thus how the algorithms, and therefore the users of social media, behave. Code is the design of the algorithms themselves.
I go into exactly how they’re designed, why they’re designed that way, what the outcomes are, and how they should be designed in order to solve some of the problems that we see with social media. Norms: we can’t abdicate our own responsibility to be responsible users of technology. How do we establish those norms, and how do we make them pervade our use of social media?
Finally, we know that there are a lot of market failures with social media, so there need to be laws. There needs to be regulation. In the book, I go through all of the major regulatory questions, including antitrust, federal privacy legislation, election integrity, free speech versus hate speech, and misinformation and what we can do about it.
Heffner: That fourth item that you mentioned, legislation or regulation: is that a necessity to prompt these companies to adapt in the three other realms? You know, it’s a chicken-and-egg question. But in all honesty, these past years have proved these companies’ failures to be decisive. So you end with regulation, but don’t we need to begin with a regulatory framework in the United States?
Aral: In fact, regulation is the first part of the “how we adapt” chapter. It begins with all of the regulatory questions that exist. The first question we need to ask is: how do we establish competition in the social media-industrial complex? The reason that’s important is that if the social media monopolies, if they are monopolies in a true legalistic sense, face no competition, they have no incentive to reduce the pollution of our information ecosystem or the other negative externalities that they create for society.
You might be surprised to read how I think we get to competition in this marketplace. It’s about structural reform of the economy. It has more to do with data portability, social network portability, and interoperability, which is more akin to how we regulate the cell phone market and achieve competition there, than it does with trust-busting.