We all love video games. They give us a chance to escape into fantastical worlds, challenge ourselves and connect with friends. But lately, there’s a feeling that the industry we love is on a downward spiral. Games feel more “samey”, studios are churning out sequels more often (if not more quickly) than ever and nickel-and-diming tactics like season passes and microtransactions have become the norm. What’s to blame? As Phil Spencer, head of Xbox, recently pointed out, it’s capitalism.
Publicly traded companies are beholden to their shareholders. They need to show constant growth, quarter after quarter. This relentless pursuit of profit puts immense pressure on game studios. They’re forced to churn out sequels to proven franchises, even when there’s no creative well left to tap. Innovation takes a backseat to safety, and the games we wind up with are predictable and uninspired.
Chances are many of your favorite childhood games have seen a sharp drop in quality, often after an acquisition by a larger publisher. The pressure to hit sales targets stifles the creativity that made the franchise special in the first place, and the mandate to “do the same, but bigger and with fancier graphics” leads to a lack of innovation in gameplay.
Lost Luster: Franchises Tarnished by Profit-Driven Decisions
We’ve all been there. A new game in a beloved series is released and you eagerly fire it up, ready to recapture the magic of its earlier entries, only to be met with a hollow experience that prioritizes profit over player enjoyment. Though many games fit this unfortunate pattern, here are a few well-known examples:
1. SimCity (2013): The once-pioneering city-building sim became notorious for its “always online” requirement, leading to frustrating server issues and hindering core gameplay. Monetization through limited city sizes and expensive DLC felt like a betrayal of the series’ open-ended spirit. The SimCity franchise had been the quintessential city-builder, but the 2013 entry brought that legacy to a close with a soulless cash-grab attempt by EA.
2. Star Wars Battlefront II (2017): This game was met with significant backlash due to its aggressive microtransaction model. While the base game offered a solid multiplayer experience, progression felt heavily tied to purchasable loot boxes containing random Star Cards that enhanced character abilities. This created a situation where players who spent more money had a significant advantage, turning a potentially balanced multiplayer experience into a “pay-to-win” scenario.
3. Assassin’s Creed: Unity and Syndicate: Ubisoft’s historical open-world series took a quality dip with Unity and Syndicate. These entries were rushed out to meet annual release schedules, with Unity in particular arriving as a buggy mess at launch. The introduction of microtransactions for time-saving boosts and cosmetic items further alienated players who felt the core experience was compromised.
4. Need for Speed (post-2015): The once-innovative racing franchise has lost its way. Games like Need for Speed Payback leaned heavily on loot boxes for car upgrades, making progression feel more like a slot machine than a rewarding racing experience. The always-online requirement in some entries further hampered the enjoyment for solo players. This had been a hot racing franchise for many years, but the sharp turn toward profit incentives led the series to crash and burn.
5. Fallout 76: Fallout 4 is a beloved modern title, so a new Fallout game was in very high demand. What fans got, though, was a mess. Fallout 76 is a strange always-online game that tries to turn a decidedly single-player franchise into an MMO. Why? Probably because it’s harder to sell cosmetics when there are no other players to impress. When it’s just you sitting on a couch at home, there’s little incentive to “show off” by buying whatever new seasonal items are released, but an online play element ups those stakes. Fallout 76 offers a variety of cosmetic purchases through its “Atomic Shop”, and the game largely abandons the rich, story-driven worlds of previous entries in favor of running around a bland wasteland with your friends (or enemies) wearing the latest pants you paid real money for.
6. Left 4 Dead 3 (that never was): This case is a cautionary tale of what could have been. Rumors suggest Valve scrapped the beloved co-op zombie shooter sequel due to difficulty implementing a sustainable monetization model. While the lack of microtransactions is arguably a positive, it highlights the pressure these features put on developers to compromise core gameplay for profit. It’s not enough that a game is good. If there’s no way to make it highly profitable, it could get put on ice.
While these are just a few examples, there are countless more disappointments throughout modern gaming. Fortunately, there are still developers prioritizing player experience. Independent developers have found it easier than ever to reach audiences thanks to online communities and digital marketplaces. Still, it’s important that we gamers speak up with our wallets and our voices. The more we invest in quality games, the more likely they are to be made in the future.
The EA Effect
Electronic Arts, or EA, isn’t the only large gaming studio out there, but it’s one of the most notorious for highlighting this problem of capitalism destroying good video games. Other successful studios have followed suit too, buying up whatever properties they can and brainstorming the best corporate methods for milking profit out of their newly acquired assets.
But they didn’t start out this way. Going back to the early days of video gaming, a lot of game studios were small startups run by a handful of passionate game designers who just wanted to make fun games. Profit was an afterthought. For most, the only monetary goals would have been to simply support themselves and their families.
One great example of this was Maxis. The game studio was started by two guys, Jeff Braun and Will Wright, in order to develop and publish SimCity. With SimCity becoming a massive hit, Maxis was able to move forward with a lot more games, many of which went on to become classics. Maxis also put out a lot of games that didn’t really go anywhere, but these still allowed the team to try out new ideas and innovate.
EA acquired Maxis in 1997, and for a while Maxis was still able to operate somewhat independently, resulting in some more great games, including The Sims. However, after acquisition, EA’s focus on profitability clashed with the more experimental DNA of Maxis. Sequels became the norm, with creativity taking a backseat to proven formulas. SimCity sequels struggled to recapture the magic, and Spore, despite its grand vision, felt shallow. Spore had been a hugely anticipated game promising a lot of highly complex gameplay, but what was delivered seemed to be a cartoonish and extremely simplified take on Will Wright’s original concept.
Will Wright would go on to leave Maxis, and EA eventually shut the studio down after the abysmal SimCity reboot. The Sims franchise, however, continues to go strong, probably thanks to its incredibly profitable DLC. EA seems to have ignored everything else Maxis once brought to players, content to focus on the money-making juggernaut of The Sims rather than risk trying something new.
The Maxis story serves as a reminder of the tension between artistic freedom and commercial success. While The Sims continues under EA, the studio’s former diverse portfolio of cherished games remains a testament to its lost creative spark.
The Double-Edged Sword of AAA: Bigger Budgets, Bigger Risks
While we might look back fondly on the days when a phenomenal game could be built by a passionate team in a garage, today’s gaming landscape is dominated by colossal AAA titles boasting budgets that would make Hollywood blush. These games deliver breathtaking visuals and sprawling worlds, but their ballooning costs create a precarious situation for both developers and players.
Let’s take a look at some recent examples: Red Dead Redemption 2, a visually stunning open-world cowboy epic, reportedly cost hundreds of millions of dollars to develop and market. Similarly, Cyberpunk 2077, a futuristic RPG brimming with ambition, carried a colossal development budget. While both games garnered critical acclaim, their sheer scale meant immense pressure to recoup the investment.
So, how do developers make back these astronomical sums? Here’s where the issue gets thorny. Often, the answer lies in monetization strategies that can feel exploitative. Microtransactions, loot boxes and season passes start to look like necessities, sacrificing player experience for profit. Imagine sinking $60 into a game, only to find the best gear locked behind an additional paywall.
And if you can’t milk more money out of players, you turn to things like in-game advertising or cut corners in development. Cyberpunk 2077 is one prominent example, among many, of a game rushed out the door before it was finished. Eventually a studio has to pull the trigger and launch a game just so it can start bringing in revenue, even if more development time is needed. In a way this has already become the norm: studios take our money for an unfinished product and then use that money to finish it. The downside is that if not enough people buy the unfinished product, the studio might decide to pocket whatever it can and pull the plug on further development, leaving players with an unfinished, buggy mess they paid $60 or more for.
This reliance on manipulative monetization tactics harms the industry in two ways. Firstly, it erodes player trust. Gamers feel increasingly like wallets to be tapped, not valued customers. Secondly, it stifles innovation. Studios become risk-averse, focusing on proven franchises and established formulas to guarantee a return on investment.
Here’s the thing: games don’t have to be that way. Indie developers, less beholden to shareholders and quarterly targets, are often the ones creating the most innovative and interesting games. They take risks, experiment with new ideas, and focus on making games they’re passionate about. The best games are often labors of love, not products designed to maximize profit.
So what can we do? Gamers need to speak with their wallets. Don’t preorder games, don’t support exploitative monetization practices, and make your voice heard. Let developers know you value creativity and innovation over endless sequels and loot boxes. And if you’re looking for a breath of fresh air, seek out games from smaller studios. You might just discover your next favorite game.
The future of video games is at stake. Capitalism isn’t going away any time soon, but we can use its own tools to push back against the market forces that are sapping the life out of our video games. Before you spend your hard-earned money on a game or some DLC, ask yourself whether it will bring joy, creativity and fun into the industry, or whether you’re just rewarding a slot machine for continually ripping you off.