Gaming has come a long way in the last few years. What was once a niche media platform strictly enjoyed by the sort of geeks you’d only find in a bad ’80s high school movie has now become a form of blockbuster entertainment (and, in many cases, an art form) revered the world over by millions upon millions of players. And as the industry has progressed and advanced, so too has the technology and the culture behind it.
Despite this, there’s plenty of room for improvement. Gaming today is fraught with issues, and money-grabbing schemes by publishers are just the tip of the digital iceberg. These are the failures, the tricks, and the schemes that the industry — and all those who are a part of it — are going to really, really regret in just a few years.
Failing to diversify
Say it loud, say it clear: gaming has a diversity problem. The industry is far behind its cousins, such as film, television, and literature (though they’re hardly perfect either) when it comes to representation in its stories. Ethnic minorities, the LGBT community, the disabled, and women — who we’ll come to in a moment — are almost nonexistent as entities in far too many releases, even now in 2017.
The consequences of this have been quite clear. Studios pandering strictly to a white, male client base — the same one gaming catered to in the early days — has led to a toxic culture in which change, in the form of diversification, is fought against tooth and nail by many gaming fans as part of a larger culture war. Here, we end up with things like Gamergate, in which journalists, minorities, and women are targeted and harassed by other gamers.
It’s not a stretch to imagine that this environment came out of a failure of the industry to diversify earlier, and in taking until the internet age (with all its shiny new ways to harass and discriminate against people) to do so, the issue has been worsened spectacularly. Change is possible, and it is likely to come, but if developers and writers hadn’t taken so long to bring about that change, things might not have gotten to this point. It should have happened sooner.
Sexism
And then there’s the sexism. Whereas a failure to diversify has been a passive failure on the part of the industry, in some cases akin to embarrassing negligence, this is an active failure on the part of creators who have no tact or sensitivity in portraying women in their games. Sexism in video games, of course, is no new thing. From skimpy outfits (RPGs, we’re looking at you) to the exclusion of female playable characters to a weird insistence on following the male gaze, it’s hard to argue that video games have been a bastion for feminist thought or progress.
But as times change (and they will — the gradual and relative improvement in how film and television portray women shows what will soon enough happen with games), there can be little doubt that we’ll all look back on the old days of the ’80s, ’90s, ’00s, and even the ’10s as a Jurassic era in which out-of-touch makers couldn’t stop producing games in which women were treated as jokes. The gap between what we have now and what we used to have is likely to be seen as nothing more than a shameful embarrassment.
Cringy E3 conferences
E3 conferences are a very strange phenomenon. Often, the excitement of being shown sneak peeks of new releases or watching announcements for future games is quickly supplanted by the secondhand embarrassment of watching some middle-aged dude in a suit prancing around the stage acting like a bona fide moron. In the past, we’ve seen Ubisoft trying (and failing) to make a meme, Sony going down in gaming history with their awful PS3 demonstration, and Aaron Priceman committing a war crime by doing whatever the hell he was doing at Ubisoft’s conference in 2011.
The issue here is that the marketing and high-level production for many publishers are still handled by people who were born a long, long time ago. Many of these people didn’t grow up with video games, and too many of them still don’t really care about them — they’re marketers and business people, first and foremost. As the years go on, however, the reins of these companies will be passed to the generations who grew up with games and, crucially, the internet. They’ll be savvy, they’ll be clever, and they won’t desperately try to act cool on stage. We can only hope the E3 conferences of the future will be better, but we can be sure the ones of today will live on only in infamy.
Day 1 DLC
Day 1 downloadable content: the words alone are enough to send shivers down the spine of even the most hardcore gamer. The concept is simple — you purchase a new release, race home, and pop it in, only to find that the developers have created extra content for you to enjoy and deliberately left it out of the game itself. And it’s yours for only $3.99.
It’s a deeply irritating practice, and one that stems purely from a publisher’s desire to make more money from a game. DLC began as a way for developers to keep creating after their game had been released, adding new levels, modes, and maps once you already had the game in your hands. With Day 1 DLC, however, it’s become (for some studios) little more than daylight robbery.

The good news is that Day 1 DLC is actually bad for business. A franchise or publisher that gains a reputation for offering Day 1 DLC risks damaging its brand and future sales, to the point where, in the long term, it starts losing more money than it makes. Once the industry realizes this, you can expect it to distance itself from the practice quicker than you can say "ripoff." The few publishers who stick with it will (hopefully) see their reputations damaged almost irreparably — and it’s them who’ll regret it most of all.
Bullshots
"Bullshot" is a term for the fake, pre-rendered screenshots used in video game marketing campaigns that fail to accurately portray in-game footage and graphics. Characters might be posed, resolutions increased, and the whole look of the game generally polished up to appear better than it is. Recent offenders include The Witcher 3, No Man’s Sky (a particularly egregious example), Far Cry 4, and the Call of Duty franchise.
There’s already been a fair bit of backlash against the practice since it began in the mid-2000s, and it’s entirely possible that, as in-game graphics improve, faking screenshots will become less necessary. When that happens, bullshots will probably become the great, sourly remembered marketing shame of the gaming industry.
Motion control
Motion control is the gaming equivalent of 3-D movies. Like 3-D, it came onto the scene with a bang, was embraced across mainstream consoles (via the PS3’s Sixaxis controller and the Xbox Kinect), and even led to one or two successes, like the Nintendo Wii.
Motion control is also comparable to 3-D in the sense that people are at last figuring out it’s really nothing more than a gimmick — a product of its time that inspired too few good things and too many cringe-worthy embarrassments. It made you look stupid, it was frequently broken or unusable and, outside of Nintendo, the only games we got out of it were monstrosities like Kinect Star Wars. The times they are a-changin’, and the fad is coming to an unhappy end. In a decade’s time, the only studio glad it ever went with motion control will be Nintendo, and the catastrophic failure of the Wii U means even they will regret sticking with this particular piece of technology rather than folding.
Failing to protect customers’ personal details
Personal security is as vital as ever in the tech age, when the wrong hacker at the right time can lift the details of millions of people from even the largest companies. Sony is the main culprit here. According to The Guardian, in 2011, Sony fell victim to a cataclysmic security breach that exposed the data of 77 million users, including personal information such as passwords, birthdates, names, and addresses. In the wake of the attack, Sony insisted it would tighten up and take steps to protect its customers.
In 2017, however, as Business Insider reported, customer information was successfully hacked by a security company attempting to prove to Sony that they could get in. The saying is supposed to go "once bitten, twice shy," not "once bitten, shrugs, and carries on." Hopefully — but not certainly — future companies will be inspired to implement proper security measures to prevent stuff like this. If that’s the case, then the major data losses and brand damage that resulted from these attacks will be seen by these companies as lessons to learn from. Hopefully.
Neglecting the PC
As consoles have pulled ahead in the race for mainstream appreciation, the PC has been left behind a little, both by casual customers and by creators. As a result, major releases are too often excluded from the PC. Other times, the port is a buggy, unplayable mess — if it runs at all. Rockstar is a particularly famous example of a studio that neglected the PC, most notably with Red Dead Redemption, which never saw a PC release despite a major backlash from fans.
The problem is, the PC isn’t exactly a niche platform. Rockstar itself learned this when its PC release of Grand Theft Auto V turned out to be a gargantuan hit, selling two million copies in its first month. Once the console-focused industry realizes how much money a little more effort on PC ports can bring in, it’s going to regret not having invested sooner.
Making MMOs
MMOs are not a safe bet for a studio to release. With one exception (if you’ve read this far into the article, you know the one), every single MMO ever released has either been absolutely woeful or, despite being decent, has fizzled out and failed shortly after launch. This is the case for a variety of reasons — partly because of World of Warcraft’s monopoly on the market, partly because developers try too hard to emulate WoW, and partly because MMOs are just too damn difficult to get right in the first place.
Fewer are being released these days than in, say, the late ’00s, and the genre is already starting to decay. Considering the crazy budgets publishers have pumped into MMO releases over the last decade or so, it’d be a wonder if any company (except Blizzard) still thinks making an MMO was a good idea. God knows what they’ll think of the genre in a decade, when even WoW could be gone.
Gigantic budgets
The Grand Budapest Hotel cost $25 million to make. Eternal Sunshine of the Spotless Mind cost $20 million. Baby Driver cost $35 million. Do you know how much it cost to produce Destiny, the so-so Activision game you probably haven’t heard about since it launched? Try a reported $500 million, including marketing. Too many games these days cost too much to make and earn too little in return to justify their existence. Releases like Destiny or The Old Republic (reportedly $200 million) burn through huge budgets and often fail spectacularly at launch.
It’s entirely possible, however, that the growing popularity of low-budget indie games will lead most (but not all) studios to scale back their budgets. Smaller will become better, and games like Destiny and The Old Republic will be remembered as the gaming equivalents of the Ryugyong Hotel, that giant building in North Korea that nobody ever wanted. Or, you know, it might get much worse. But here’s hoping.