This article appears in the December 2023 issue of The American Prospect magazine.
A decade ago, sitting in a world history class, I hollered at my friends, “GTA over GPA!” I was 15 years old, still a hardcore gamer back then. My teacher shot a bewildered look toward me as I carefully explained how today was the release date of the long-awaited Grand Theft Auto V, the vast and anarchic open-world crime game, which would showcase the fullest capabilities of Sony’s PlayStation 3 and Microsoft’s Xbox 360. It only made sense to prioritize my gaming ahead of my schoolwork. My high school teacher wasn’t convinced, but she didn’t write me a detention slip either. Whew.
In my friend group, some of us had PlayStations, others had Xboxes. We’d trash-talk each other about who had the best hardware, as if we actually could measure the resolution and frame rate at that age. (What we actually cared about was the quality of exclusive games only available on one or the other platform.)
School had started two weeks before and I’d just wrapped up my first summer job where I bused tables at a dingy diner near Lake Michigan. I had a simple goal that summer: save enough money to preorder a PS4, buy a TV, and then buy GTA V and The Last of Us, a highly anticipated zombie apocalypse game that has recently been adapted into a TV show.
At this point, I’d been playing video games for more than ten years, and I owned every one of Sony’s consoles—a certified fanboy. But 2013 was different. I wasn’t old enough in 2006 to appreciate the leap from the PlayStation 2 and Xbox to PlayStation 3 and Xbox 360. This time, though, I was attentive, ready, and had a front-row seat.
I began thinking about 2013 and my years as a hardcore gamer while reading news of major structural changes in the game industry. Unity, the popular multi-platform game engine, announced new pricing for the developers who build their games with its tools. Departing from its traditional model of charging developers a one-time fee, Unity would begin demanding an installation fee next year, along with several other price hikes, 404 Media reported. Under the new terms, each time a customer installed a game built with the Unity engine, the developer would owe Unity a fee, a potentially enormous new expense. The move drew immediate blowback, and several developers swore off the platform. Unity said that it was rolling back the fee on reinstalls, but the other price increases stayed intact. By October 9, The New York Times reported that Unity’s CEO, John Riccitiello, was departing after nine years there.
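To see why a per-install charge spooked developers, consider a back-of-the-envelope sketch. Every number below is hypothetical, not Unity’s actual rates or any real game’s figures; the point is simply that installs can outrun paid sales.

```python
# Back-of-the-envelope sketch of a per-install fee. All numbers are
# hypothetical; they are not Unity's published rates or any real game's data.

PER_INSTALL_FEE = 0.20  # hypothetical fee per installation, in dollars
GAME_PRICE = 5.00       # hypothetical sale price of an indie game, in dollars

def fee_share_of_revenue(installs: int, paid_copies: int) -> float:
    """Per-install fees as a fraction of gross sales revenue."""
    fees = installs * PER_INSTALL_FEE
    revenue = paid_copies * GAME_PRICE
    return fees / revenue

# A heavily discounted or bundled title can rack up far more installs than
# full-price sales, so the fee can eat a large share of the margin.
print(fee_share_of_revenue(installs=2_000_000, paid_copies=500_000))  # 0.16, i.e., 16% of revenue
```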
Shortly after the Unity fiasco, antitrust authorities failed to stop the largest tech acquisition of all time. After more than 20 months of legal sparring, Microsoft completed its $69 billion purchase of Activision Blizzard, one of the largest third-party video game publishers, owner of some of the most valuable intellectual properties like World of Warcraft and Call of Duty, and parent company of several video game studios. For a reference point, the industry brings in an estimated $180 billion in revenue annually, which is more than movies, books, and music combined. As the industry consolidates into vertically integrated conglomerates running on a subscription model, the artistic quality of games may suffer, and the viability of the industry may even be under threat.
IN THE HEYDAY OF THE CONSOLE WARS, from the 1990s through the mid-2010s, the video game business model was straightforward. Companies like Nintendo and Sony released new hardware every five or six years. Developers would make games for those consoles (or for PCs), publishers would sell them as individual copies, and the revenue would be shared among all three parties.
Throughout most of the 1990s, it was Nintendo versus Sega, as the two battled to bring the traditionally youth-oriented arcade experience into the home with their Mario and Sonic series, respectively. But things changed when Sony entered the American market in 1995, with a more mature selection of games on its PlayStation.
Sega, bleeding market share, attempted to fight both of its competitors at once with the Dreamcast in 1999. Veteran gamers often point to the console’s impressive specs and array of titles available at launch. But it couldn’t define itself against the PlayStation 2, which was released the following year, or match Nintendo’s premier video game console for families. Sega discontinued the Dreamcast in 2001, and then gave up the console business for good, becoming just a third-party developer.
Microsoft restored competition with its first Xbox in 2001, which was aimed mainly at Sony. Nintendo, meanwhile, secured dominance over the family-oriented segment of the market and stopped trying to compete on performance. This has worked quite well up to the present day; when households own more than one console, as a rule one of them is a Nintendo.
For gamers, this had some advantages. Console manufacturers had an incentive to pay top dollar for great exclusive games—sometimes by producing them themselves, as Nintendo typically does. Meanwhile, developers had an incentive to make widely appealing games to maximize sales, and those who opted out of exclusivity could typically port their game to several consoles. No matter how one played, there was generally a good selection of fun titles to choose from.
But it wasn’t all great. Few are as attuned to how rapidly the video game industry was changing during this period as Adam Sessler, the legendary video game journalist.
In his view, the console wars pushed each company to outdo the other in bringing “sophisticated” PC gaming experiences to a console audience. But at the same time, he believes the rivalry made gamers obsess over hardware rather than artistic quality. “All of these numbers under the hood are somehow more important than what’s on the screen,” Sessler said.
Sessler took me back to 2000, when Sony got its PlayStation 2 on the market a year before the original Xbox. Ultimately, the PlayStation 2 sold more than 150 million units, while the Xbox sold only about 24 million. But Microsoft gambled by buying Bungie, a developer then best known for its Mac games, which had been working on an action/adventure title set deep in space. Halo: Combat Evolved became a big success, and its sequel, Halo 2, sold eight million copies, making it the best-selling game on the first-generation Xbox. The franchise elevated what gamers could expect, an influence that persists to this day.
Then Microsoft turned the tables on Sony, announcing it would release the Xbox 360 a year before Sony’s PlayStation 3.
In response, Sony released a trailer for its next-generation shooter Killzone 2, its answer to Halo. GameSpot, a leading game publication, said at the time, “Killzone 2 is definitely one of the most impressive visual demos ever to appear at E3.” The implication was that Sony’s hardware was excellent and the demo indicative of what gamers should expect. But alas, the footage was pre-rendered, meaning it didn’t actually run on the future PlayStation 3’s hardware.
All this touched off a firestorm of criticism. But “the whole controversy really was immaterial to how Killzone 2 was a good game … It was a distraction from everything but the video games themselves,” Sessler said. Similar fights kept breaking out, as companies bent the truth about graphics or technical specifications to build marketing hype.
Killzone never became the Halo-killer Sony had hoped for. Sony later tried pairing the PlayStation 4 launch with the new Killzone: Shadow Fall. The game sold only 2.1 million copies.
Ultimately, the war between Sony and Microsoft partly turned gamers into “unpaid cheerleaders for these large corporate entities that at the end of the day don’t care about them that much,” Sessler told me. This dynamic infected how developers themselves made games. “They were forced to do things with their games that justify the marketing of the console, even though that might not actually make for a better game itself,” he said.
When Call of Duty 4: Modern Warfare, a flashy military shooter with cutting-edge graphics, was a surprise smash hit, selling seven million copies in its first year, the landscape changed. Developers chased blockbuster-style games with ever-fancier graphics, which took more time and cost more money. That raised expectations among gamers, which drove even more expensive development and even longer development cycles, which in turn raised expectations higher still. All this squeezed out the innovative mid-budget title, the equivalent of the middle market for films. “Strange chances were taken by developers prior to Call of Duty,” Sessler told me. “It’s very similar to what’s happening in movies as well, the curse of the superhero movie.”
BEHIND THE SCENES, THE VIDEO GAME STUDIOS developing megahits for hungry gamers pushed their employees to the brink, as Jason Schreier, the longtime video game journalist, documented across his two books, Blood, Sweat, and Pixels and Press Reset.
“Crunch,” the industry’s term for the months of 14-hour workdays before a game ships, is often rewarded with a short period of celebration, followed by mass layoffs. This vicious boom-and-bust cycle naturally doesn’t apply to the executives making hiring decisions. One review of Schreier’s Press Reset highlighted the experience of an Electronic Arts employee who, along with his colleagues, worked 80-hour weeks while executives routinely left at 5 p.m.
One of those EA executives was John Riccitiello, the former CEO of Unity mentioned earlier. While at EA, Riccitiello once suggested to shareholders that the company could charge gamers each time they reloaded their guns in the first-person shooter Battlefield. Riccitiello eventually resigned from EA under pressure, reportedly over lagging financial performance.
Still, while the pay-per-reload suggestion was absurd in its time, that model would become the protean blueprint for the industry’s future. As consoles became connected to the internet, publishers and developers started designing games to be played indefinitely, like the cartoony shooter Fortnite, and loading them up with small fees (called “microtransactions”) for in-game items and services. Rather than selling a single experience in a one-off transaction, companies tried to hook people with impulse control problems who would end up paying astronomical sums. In industry lingo, this tiny population of gamers came to be called “whales” because of their outsized impact on a company’s bottom line.
Now, with the rise of the Microsoft-Activision-Blizzard colossus, the console wars are adapting once more. In the immediate term, Xbox gamers will benefit from the merger because all of Activision’s game library will be included in Game Pass, Microsoft’s $10-per-month subscription service, which offers consumers a library of more than 450 games to play at any time, either by downloading them or via streaming. PC Game Pass subscribers will have access to more than 400 video games.
In money terms, this is an undeniable bargain. Compare $10 a month for access to almost 500 games versus the standard model of paying $70 for a single game. Gamers seem to like it too: Browse Reddit posts on /r/Xbox about Game Pass, and the service receives positive reviews almost across the board.
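The arithmetic behind that bargain is simple, using the prices cited above; how many full-price games a player would otherwise buy in a year is, of course, a hypothetical input.

```python
# Rough yearly cost comparison using the prices cited above. The number of
# full-price games a player buys per year is a hypothetical input.

GAME_PASS_MONTHLY = 10   # dollars per month
FULL_PRICE_GAME = 70     # dollars per new release

def yearly_cost_buying(games_per_year: int) -> int:
    """Cost of buying new games individually for a year."""
    return games_per_year * FULL_PRICE_GAME

def yearly_cost_subscribing() -> int:
    """Cost of a year of the subscription."""
    return 12 * GAME_PASS_MONTHLY

# Buying just two full-price games a year already costs more than a year
# of the subscription, assuming the games you want are in the library.
print(yearly_cost_buying(2), yearly_cost_subscribing())  # 140 120
```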
But there are trade-offs too. First, subscribers don’t own those games; if they stop paying, access is cut off. They trade autonomy and ownership for access and convenience. Second, access to one’s game library must come through the Microsoft system, with all manner of controls and surveillance. One can play games offline, for instance, but only for 30 days.
It might not be a good deal for Microsoft either. The flip side of Game Pass is the company forgoing a ton of revenue from the individual-sale model. On Reddit, for example, gamers commonly ask each other if they should just wait for a game to eventually land on the service rather than buying it. Financial details about Game Pass have been slow to arrive. Phil Spencer, the CEO of Microsoft’s gaming division, insists Game Pass is profitable and accounts for 15 percent of the gaming division’s revenue. However, document leaks from the Federal Trade Commission’s challenge to the Microsoft-Activision merger earlier this year complicate this narrative.
We do know that Game Pass has somewhere north of 25 million subscribers, producing $235 million in monthly revenue. But we also know that this kind of subscription model has been a flop in many other entertainment sectors. Spotify has consistently lost money. Netflix is profitable, but all the other movie studio streaming platforms are bleeding cash. It turns out that producing exclusive content and operating a massive online library are expensive (doubly so for games, whose files are commonly immense), while for developers, making content exclusive means losing sales on other platforms.
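Taking those two figures together, a quick division gives the implied average revenue per subscriber, a bit under the $10 sticker price. The gap would be consistent with promotional and tiered pricing, though that reading is an inference, not something the leaks spell out.

```python
# Implied average revenue per subscriber, using the two figures above.
MONTHLY_REVENUE = 235_000_000  # dollars per month, per the leaked documents
SUBSCRIBERS = 25_000_000       # "somewhere north of 25 million"

print(MONTHLY_REVENUE / SUBSCRIBERS)  # 9.4 dollars per subscriber per month
```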
Sony, naturally, is trying to compete with its own subscription service. Last year, the company shuttered its stand-alone video game streaming service, PlayStation Now, and rolled its streaming capabilities into PlayStation Plus, Sony’s main subscription service for online multiplayer. The result is that Sony now offers gamers tiered plans with varying perks; the most expensive tier includes game streaming.
THE CHALLENGE IN THE COMING YEARS will be how developers adapt to this changing environment. Sessler noted that blockbuster titles take longer to make than ever before. Grand Theft Auto VI, for example, is only now being announced, ten years after the release of GTA V, and has a reported budget of more than $1 billion. At the same time, the tools to make a video game are more readily available than ever. On a positive note, Sessler said, “Unity’s eye-popping behavior is a good cautionary tale,” referring to the blowback that forced the company to roll back parts of its pricing scheme.
Given his years covering the industry, I asked Sessler where he thought we were heading. “I don’t think the console war archetype is useful any longer for the consumer. I think it’s useful in terms of a framing device to understand the health of the industry.”
Maybe the end of the console wars shifts the focus to building better games rather than better hardware. But the true test won’t come until Microsoft’s ten-year commitment to keep Call of Duty on Sony’s devices, which was part of the merger agreement, expires. In the meantime, both Sessler and the industry analyst Nick McKay told me that Sony, as a much smaller company, will likely try to pick up studios of its own to regain ground the PlayStation could lose in a doomsday scenario where Microsoft unilaterally blocks Activision titles from Sony’s platform. Until then, as McKay said, “the console wars still definitely matter.”