You turn on your system and boot up your game of choice, anticipating a session that will be a great way to pass the next hour or so. Yet the game never makes it to the main menu; in fact, depending on the platform, it may not have started up at all. What you are treated to instead is an update prompt, which means you will have to wait anywhere from a few minutes to more than an hour for the game to be patched before you can actually play. Is this something you’ve gone through?
It seems like the more “advanced” our technology becomes, the more complicated and flawed it is. Today, patches and updates to games are so commonplace that the moment you purchase a game, you expect a patch that must be downloaded immediately before you can play it. Then, over time, there is patch after patch after patch that supposedly fixes all of the game’s issues, yet sometimes those same patches create new and bigger ones. All the while, this eats into precious time you could spend playing the very game you paid for. And let’s not forget the insane amount of hard drive space these updates consume. Call of Duty: Modern Warfare, anyone?
How did we arrive at a state where the 1.0 version of a game can’t be playable enough for further patches to be optional? The earlier eras of console gaming gave consumers complete releases out of the gate that they could take home, fire up and play immediately. It was a full game with no need for updates or patches. And many of those games were solid, playable and memorable. The PC was where you would see patches, thanks to an online component that existed on the platform before consoles made the jump online. Sure, the games of yesteryear contained some level of bugs and glitches. But how is it that today’s games, which receive constant updates, can still have issues surpassing those of past games that had no patching capability at all? Is the constant post-release tinkering, rather than a focus on shipping the best possible 1.0 version, causing more trouble than it’s worth? That is one possibility, along with the fact that developers know patches allow them to release a flawed game, collect the customer’s money, and then choose at their leisure what to fix or not fix.
Updates today seem to be a mixed bag for gamers. I can’t tell you how many times I’ve read the phrase, “The update broke the game.” The most recent updates Rockstar Games made to Grand Theft Auto V and Red Dead Redemption 2 were met with a plethora of complaints and dissatisfaction from the owners of those two games. A YouTube search for “Red Dead Redemption 2 graphical downgrade” brings up many videos focused on how the game’s graphical quality seemed to deteriorate with each passing update. With online multiplayer games, I’ve read countless accounts of gamers who simply stopped playing because updates ruined or removed the very components that made them buy and enjoy those games in the first place. Aren’t all of these updates supposed to fix the games and make them better? With the amount of hard drive space being sacrificed for them, they definitely should.
Let’s be fair. First, I appreciate all the hard work that goes into developing these games. It takes a ridiculous amount of effort for the games we play to come to fruition. Let’s never forget that. And there are many cases where patches genuinely help, for instance, fixing crashes that weren’t caught before release. Not every single flaw will be caught before a game hits the shelves, and I understand that. Patches serve a purpose and can bring much-needed improvements. We should be appreciative when developers take the time and care to patch the parts of a game that need it. Kudos to them for that. Yet when is it enough? How can issues be minimized ahead of time so that companies aren’t constantly trying to fix their games after release? The majority of issues, especially major ones, should be worked out before the game ever goes on sale. No gamer should ever have to call a game they bought and paid for “unplayable.”
What I call for companies to do is spend more time on the quality control aspect of their games. I realize they are businesses with deadlines, that utilizing resources incurs costs, and that they are looking to make as much money as they can. We as gamers must always understand that gaming companies are first and foremost about their bottom line. And let’s be honest: making money is one of the very purposes of being a business, so I can never fault a business’ pursuit of the highest revenue possible. But when a company draws negative press and a whirlwind of bad publicity because its game doesn’t play as it should, the backlash is well deserved. Pushing unready products to market just to meet financial projections for a certain quarter tells consumers they aren’t valued. And the catch for businesses focused on that bottom line is that they need consumers to reach it. If developers properly do the work that needs to be done before release, the rest will follow. And gamers, we need to hold them accountable, since it is our money being spent on their products.