"We are in great haste to construct a magnetic telegraph from Maine to Texas; but Maine and Texas, it may be, have nothing important to communicate.… We are eager to tunnel under the Atlantic and bring the Old World some weeks nearer to the New; but perchance the first news that will leak through into the broad, flapping American ear will be that the Princess Adelaide has the whooping cough." -Henry David Thoreau, Walden (1854)
If there's one complaint about politics we've heard more than any other in the past few years, it is the pernicious development known as "polarization." Not only has politics been taken over by partisans, but Americans have begun choosing where to live based on where they'll find a community of the like-minded. Our news media are the cause and result of this sorting, encouraging us to narrow our view and delivering us the news they're sure we want to hear.
It's a familiar tale, told by both popular commentators and academics. In a major study of the way information sources affect people's views, a group of researchers found that voters gravitated to media they knew wouldn't challenge their opinions. "But—and this is important—the more strongly partisan the person, the more likely he is to insulate himself from contrary points of view," the study's authors wrote. Voters didn't just prefer news outlets with an ideology similar to theirs; they found them more credible. "The partisans ascribed 'impartiality' and 'veracity' to the media which presented views similar to their own. … A transfer was effected from partisan value to truth value."
A statement of the obvious, you might say. But this study, by Paul Lazarsfeld and his colleagues at Columbia University, was conducted nearly three-quarters of a century ago. The scholars interviewed voters in advance of the 1940 presidential contest between Franklin Roosevelt and Wendell Willkie, then described the results in their 1944 book The People's Choice: How the Voter Makes Up His Mind in a Presidential Campaign. Long before Fox decided to tweak liberals by branding itself "fair and balanced," Americans were convinced that objectivity lay wherever they found agreement with what they believed.
We think of our current era as a time of constant upheaval. The new is displacing the old; the Internet is swallowing "legacy media."
Every year or two, a next-generation Web platform comes along, and sages of the information age tell us This Changes Everything. But most of the media challenges we confront today aren't so new, and they didn't begin with the World Wide Web; the difference is the speed with which each subsequent development moves us along the path we were already on.
At their respective inceptions, both radio and television were touted as wonders of societal advancement, in no small part because they could bring us closer together. Television's early proponents believed it could become not only a provider of universal education, where working stiffs would watch lectures on philosophy and take in Hamlet after dinner, but a force for cultural coalescence. In some ways it was: a schoolteacher in Boise and a stevedore in Boston would not only watch the same entertainment; they'd see the same news and learn the same (few) things about what had happened that day. By the 1950s, television news had become a kind of common national text.
But in the middle of the 20th century, critics argued that as we became the "mass" in "mass media," television's compelling pictures would render us ever more passive and erode our ability to think for ourselves. They devised theories with names like "narcotizing dysfunction" and "mean world syndrome," positing that media were turning us into easily manipulated, semiconscious blobs, mainlining soporific entertainment into brains losing their ability to distinguish between fantasy and reality. We were watching life, not living it.
"The making of the illusions which flood our experience has become the business of America," wrote Daniel Boorstin in The Image: Or What Happened to the American Dream (1961). The problem was in both form and content-trivial yet enticing, simultaneously enthralling and mind-numbing. We were, as the title of Neil Postman's 1985 book had it, amusing ourselves to death.
As the 20th century approached its end, the mass media of the previous period took on a favorable glow. When media choices multiplied, the citizenry began to balkanize, and "fragmentation" became the media scholar's new lament. What started it? Cable television. Not cable news, which was a later development, and not ideological cable news, which didn't begin until Fox News went on the air in 1996.
Princeton political scientist Markus Prior has shown that the spread of cable television in an area heightens political polarization, an effect visible as far back as the 1970s. After cable arrived in a region, split-ticket voting declined. This happened not because of news but because other offerings, like sports and entertainment, lured away viewers who had only a marginal interest in the news. Once those viewers had choices, they tuned out and voted at lower rates, leaving a news audience and an electorate more dominated by partisans.
This was not just an American phenomenon. Between the 1970s and 1990s, one country after another went from a small number of state-sponsored television channels (or in many places just one) to a multitude of choices. As media scholar Elihu Katz wrote in 1996, "From the point of view of participatory democracy, television is dead, almost everywhere. It no longer serves as the central civic space; one can no longer be certain that one is viewing together with everybody else or even anybody else, and the here-and-now of current affairs is being minimized and ghettoized and overwhelmed by entertainment."
The decline of current-affairs media has only gotten worse. A few decades ago, half the households in America with televisions tuned in to the network nightly news. Last year, the combined rating for these shows averaged less than one in six households. An endless string of Lipitor and Viagra ads attests to the advancing age of those shows' viewers. In fact, every one of the legacy media (newspapers, television news, and radio) has an audience more weighted to the old than the young.
The medium whose death is most often foretold is the newspaper. Again, the story we've heard isn't quite complete. That story has it that these journalistic dinosaurs were laid low when a guy named Craig Newmark created a bare-bones website where people could find a roommate or sell that treadmill they never got around to using, and in short order robbed newspapers of the classified-ad revenue that had made up a significant portion of their income. As more people migrated to the Web, ad revenues plummeted and the medium entered its death throes. Those things did happen. But the decline of newspapers goes back further: relative to population, circulation has been on a downward arc since at least the 1940s.
The cost-cutting that led to the evisceration of local news coverage also predates the Internet. In the 1980s, large, publicly traded corporations like Gannett bought dozens of small and midsize papers, often slashing local coverage and time-consuming investigations in favor of cheaper wire stories. Not long ago, the idea that a city the size of New Orleans wouldn't have a daily newspaper would have been unthinkable. But in 2012, the city's sole remaining daily, The Times-Picayune, cut its print schedule back to three days a week.
When Amazon.com CEO Jeff Bezos bought The Washington Post last year, he got it for a song: only $250 million (compare that to the $315 million AOL paid for The Huffington Post three years earlier). Bezos has neither a background in journalism nor much of an apparent political agenda, which may make him a prototypical media owner for our age. One of the most critical features of "Web 2.0" giants like Facebook, Twitter, and YouTube is that the owners are essentially agnostic about content. Most of it is created by users; the owners manage the platform and count their money. While you may not think of those companies as "the media," they most certainly are. According to Google, which owns YouTube, 100 hours of video are uploaded to the site every minute, and more than 6 billion hours are watched every month.
Bezos won't be firing all the Post's reporters and replacing them with Amazon commenters; if anything, he'll put more money into content created by professionals, though perhaps a more diverse group than ink-stained, shoe-leather reporters. (This obviously has limits: when star Post blogger Ezra Klein proposed an expanded news and policy blog reportedly costing eight figures, the paper declined, and Klein moved his venture to Vox Media, a company that runs a series of successful, high-traffic sites covering topics like sports and technology.) Nor is Bezos the only tech billionaire trying his hand at journalism. EBay founder Pierre Omidyar recently hired Glenn Greenwald, famous for his role in reporting the Edward Snowden revelations in The Guardian, to lead a new project called First Look Media, which will "publish a family of digital magazines." The first of these, The Intercept, launched in February with a report on how the NSA uses electronic surveillance to locate targets for assassination.
Though First Look Media is a for-profit company, its initial focus on lengthy investigative pieces over breaking news makes it resemble not so much a traditional newspaper or magazine as ProPublica, the nonprofit investigative-journalism enterprise. Last year, ProPublica released a massive report on accidental deaths from overdoses of acetaminophen, the active ingredient in Tylenol; the investigation took two years and cost more than $750,000. Neil Barsky, a journalist turned hedge-fund manager, recently recruited former New York Times executive editor Bill Keller to lead the Marshall Project, a kind of single-issue ProPublica focused on the criminal-justice system. We could use many more undertakings like it, but there are only so many billionaires with an interest in journalism willing to fund projects that nourish our democracy but might not make money.
The relative scarcity of costly, civic-minded investigations is all the more worrisome when the major news outlets are spending so much of their time scrambling to keep up with the rise of social media and their ever more individualized audiences.
Desperate to stay abreast of what the kids are into, Brian Williams tells his superannuated viewers about this week's hottest viral video, and Wolf Blitzer reads tweets on the air, just like Edward R. Murrow would have done. Television news operations want to plug in to the new social web lest they get left behind, even as they still rely on gross demographic categories like viewers between 18 and 49 (known in the industry as "the demo"), which give them only the vaguest picture of their audience as a whole and tell them nothing in particular about you. Political campaigns, on the other hand, have grown so sophisticated at gathering and parsing multiple data sources that they practically know what every voter will eat for breakfast on Election Day.
As tools old and new have enabled us each to become our own target audience, we've gotten a clearer view of ourselves and one another. Media used to tame our politics and calm our intemperate fevers; the rhetoric we heard from our betters manning the airwaves was measured and formal. There were exceptions, like Father Coughlin, whose anti-Semitic diatribes reached millions of radio listeners in the 1930s. But the prevailing norms required a more polite politics so that as many of us as possible could partake of the same news meal.
In recent years, media moguls raking in millions (healthy profits can still be had in a declining industry) realized there was gold in channeling, and heightening, base emotions. When you aren't worried about attracting everyone, you can offend large swaths of the potential audience without fear. Offending can even become your business model, so long as you're aiming at the people your target audience hates. Rage can be a ratings winner, and no one nurtures it with more glee, or more profitably, than Fox News and Roger Ailes, who has run the network since its inception in 1996. The enemies are clearly defined: liberal politicians, pointy-headed professors, and above all the allegedly liberal media. Fox never stops telling its viewers: these are the people you should be angry with. As David Folkenflik writes in Murdoch's World: The Last of the Old Media Empires, "Ailes knew that Fox's defining feature would require a highly cultivated resentment toward other news organizations." Sustaining a steady stream of ire is no easy task, which is why talents like Bill O'Reilly and Sean Hannity, who can bristle with anger for hours on end, deserve their lofty perches in the media pantheon.
There are limits to what that anger can accomplish, however. Though Ailes's genius always lay in creating compelling, profitable television while serving the political interests of the Republican Party, the changing demographics of the Obama era threatened the second part of that equation. When Democrats characterize the GOP as a bunch of angry white guys who don't respect women and minorities, their point is made clear by the fact that the network at the center of the conservative media universe features a bunch of angry white guys and a bevy of beautiful blondes. By 2012, more and more people were asking whether Fox was helping or hurting the Republican cause. As Gabriel Sherman notes in The Loudest Voice in the Room, his recent biography of Ailes, a broad-based political appeal for Republicans isn't served by "the vivid political comedy Fox often programmed. … In pursuit of ratings, Fox had sharpened national divisions—and the division had favored the Democrats." According to Nielsen data, the median Fox viewer is 65-plus, and only 1 percent of the network's audience is black. It wouldn't be surprising if, after being told for months that the liberal media's polls were wrong and Mitt Romney was headed for a smashing victory, those viewers were as shocked by the election's actual results as some of Fox's on-air personalities were.
That isn't to say that Fox and the rest of the conservative media don't still pull in healthy audiences and make plenty of money. In their new book, The Outrage Industry, Tufts University professors Jeffrey M. Berry and Sarah Sobieraj estimate the combined audience for "outrage media" on radio, television, and the Internet at 47 million per day. Combining media analysis with fan interviews, they found that angry presentations serve an important psychological purpose. As one interview subject said admiringly about O'Reilly, "There's some appeal in watching somebody who's obnoxious in a way generally you can't be." When interacting with actual humans, the fans of these programs are constrained by the fear that relations will grow awkward, but they find their true selves in watching a guy on TV luxuriate in his contempt.
So while political campaigns craft their granular voter-by-voter appeals, they also enact a kabuki of outrage pitched for the national news media. Some "gaffe" gets uttered and then taken out of context, the deep offense is proclaimed, pundits mull over what profound character defect has been revealed, and a few days later the cycle begins again. Outrage keeps the campaign momentum moving and certain audiences coming back. You too can be Bill O'Reilly, albeit with a smaller audience; all it takes is a visit to a website and a click of the "add comment" button to tell those jerks what you think of them. Hit that button, and you're not lost in a mass audience anymore. You're an individual again.
Whether you're spewing out your anger or bestowing a smiley-faced blessing on an article or video that brightened your day, the media industry wants and needs to know. Every editor tracks how many likes and tweets each piece of journalism produces, hoping all those atomized individuals will signal their approval or their displeasure and pass it along. As the price for our re-individualization, we've laid ourselves bare. The National Security Agency knows whom you've called, and maybe what websites you've visited. Google knows what you've searched for and tailors the ads you see to products it knows you're interested in. Facebook holds on to every photo you've posted and every thought you've shared; the company can now track where your cursor hovers when you lazily peruse that ex-girlfriend's page. You can express your consternation about the latest revelation of domestic spying, right after you show the world a picture of your children. We've built our own personal panopticons from the inside out, clicking "I accept" again and again, and we didn't need a tyrannical government's help to do it.
That isn't to minimize all the wondrous ways the Internet has enriched the lives of hundreds of millions.
Today, we have reached what is undoubtedly the golden age of information, and the Internet is full of writers and analysts who rose to some measure of prominence not because they put in time on the metro desk but because they displayed talent and creativity.
A thousand Web magazines have been launched with a tiny fraction of what it would cost to put the same content into print. Whether your interest is a big topic like technology or a small one like hamster breeding, you can more easily find thoughtful writing about it today than you ever could before. Many newspapers have gone out of business, but The New York Times and The Guardian produce some of the most extraordinarily well-crafted visual data presentations to be found anywhere. The websites of groups like the Sunlight Foundation and the Center for Responsive Politics have made information about politicians, funders, and government easier to obtain and understand. A project like Retro Report, which revisits stories from recent history like the "crack baby" epidemic of the 1980s (sometimes with surprising results), would never have come about before the Internet. Between its print and online versions, even this modest liberal magazine publishes far more journalism and analysis than it did 10 or 15 years ago. An accounting of the Internet's wonders could go on almost forever.
An open system without barriers to entry also produces a nearly endless supply of ugliness, as people are liberated to pour their ids into their keyboards. Ask any woman who has blogged about a controversial topic, and she'll tell you about the torrent of hatred and rape threats that come her way. Behind every vicious tweet is one human being gaining pleasure from spewing bile at another. These are not psychopaths posting between murders; they're regular folk who would say they're good people. Some media outlets have chosen to turn off the spigot; this past September, Popular Science announced that it would no longer allow comments on its website. "Because comments sections tend to be a grotesque reflection of the media culture surrounding them," the online content director wrote, "the cynical work of undermining bedrock scientific doctrine is now being done beneath our own stories, within a website devoted to championing science."
It might be easier to say, as some doomsayers have at the dawn of every new media era, that our current age is a disaster and everything we value has been lost or, at least, is about to be lost. But that's no truer today than it was in 1960 or 1980 or 2000 (to say nothing of the time a couple of millennia ago when Socrates fretted that the dangerous habit of writing would destroy people's ability to trust their own memories). Or, like the techno-boosters who find the full flowering of human potential in every banal tweet, one could proclaim the reporters and newspapers that have been put out to pasture nothing more than this century's buggy-whip producers, the inevitable and vaguely pathetic casualties of progress's march. The truth is, as ever, more complex.
The move from mass to niche, from media meant to appeal to everyone to media tailored to your particular idiosyncrasies, has brought with it the glorious and the ghastly. Eventually, another communication revolution will make Web 2.0 look like old news. That revolution may reorient society's power relationships in ways that expand human freedom and dignity, or it may bring frightening new societal consequences. It may even do both. We were a single mass, then a fragmented collection of groups. The best bet is that we are now on our way toward a new radical individuality, and the day when each of us inhabits an information snowflake like no one else's. Even if that day never quite arrives, we will likely watch, and be watched, with a specificity that even today we can scarcely imagine.