
Tuesday, 23 May 2023

The gaming tech race-ism

 

Technology is a wonderful thing, isn't it? Ever improving, ever developing, ever scaring the absolute daylights out of everyone who just wants the world to move a little slower so they can get used to all of the AI-written scripts and AI-produced images before the AI-designed videos and AI mind-reading rat experiments start up. (That last one is a trip, I suggest looking it up.) Sometimes it can feel like tech is caught in a massive arms race against itself, forever improving for fear of being left in some sort of terrifying 'dust' that will render it utterly useless in the face of far more capable, forever-arriving alternatives. It's a perception that leaks out of tech development rooms and into the minds of the public, because why use old, slow technology when you can use new, fast stuff? It seems like a very linear and unambiguous mindset, but everything changes when we look at art, doesn't it?

Because art, in this case video games, uses quite a lot of technology in its construction, and yet the definition of a 'good game' doesn't begin and end with whichever title uses the most teraflops to render the most impressive ray-traced hair follicle bouncing off a raindrop, because there's more to game development and art than who wields the bigger stick. But bizarrely this is a thorny issue among game enjoyers, as the argument of 'better graphics means better game' has persisted for decades and flared up time and time again. In the era of the original PlayStation's disc drive, the console's increased capacity and processing power meant far more impressive renders could be constructed than what the cartridge-based Nintendo console of the time could manage, leading some to believe that Nintendo would soon be left in the dust. (FYI: the Switch uses cartridges to this day. Albeit vastly improved derivatives.) Saints Row 2 versus GTA 4 was simplified in the minds of some by the fact that GTA 4 took several steps towards looking realistic whilst Saints Row remained stylistically cartoonish and exaggerated; and yet the Saints franchise would go on to last several more years and many sequels before finally killing itself off last year. You see where I'm going with this.

These days the graphics debate is not quite as loud as it was back in the day, because we've got a new representative of graphical authority: frame rate. The rate at which frames are projected onto our screens, determining the smoothness of the moving image, and a badge of superiority lauded by the PC master race. Better hardware components allow for more frames, resulting in smoother gameplay, making for, in an apparently objective fashion according to these people, a better game. And to be utterly fair, there's a cohesive logic there. A game that runs more smoothly will generally be a better experience for the player to enjoy, and as 60 FPS becomes more prevalent, the lack of any gameplay option for reaching that smoothness can be a telltale sign of bad optimisation leading up to launch. But is that the be-all and end-all?
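For the number-crunchers among us, the smoothness (and latency) argument really boils down to simple arithmetic: the frame rate sets the time budget each frame gets, which is also roughly how stale the image you're reacting to can be. A quick illustrative sketch in Python, with nothing engine- or game-specific about it:

```python
# Per-frame time budget for a given target frame rate.
# Purely illustrative arithmetic -- not tied to any particular game or engine.

def frame_time_ms(fps: float) -> float:
    """Return the time each frame gets, in milliseconds, at a target FPS."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} FPS -> {frame_time_ms(fps):.1f} ms per frame")
# 30 FPS -> 33.3 ms per frame
# 60 FPS -> 16.7 ms per frame
# 120 FPS -> 8.3 ms per frame
```

So the jump from 30 to 60 FPS halves the per-frame budget from roughly 33 ms to roughly 17 ms, which is where that 'smoothness' people swear by actually comes from.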

No, of course it isn't; there's more to gaming than that, as we've covered. And yet to this day you'll find people chuckling into their sleeves whilst posting that one bearded Wojak meme that at this point is becoming more pathetically oversaturated than the Rickroll, slamming those loving 'The Legend of Zelda: Tears of the Kingdom', which runs at the paltry 30 FPS the Switch is capable of, instead of a real game that can hit 60 at 2K! Like... Fortnite? Wait, that's really the comparison we're going with here? Fortnite? The Internet's whipping boy? First off, Fortnite is a competitive multiplayer shooter, so 60 FPS is practically a requirement for fair competition; and secondly, Fortnite is entirely stylised in an increasingly bland bubblegum-cartoon animation style. What can that really gain from being 2K? Tears of the Kingdom looks as gorgeous as Breath of the Wild did; how would upscaling evolve that experience? Would it?

Now I love a good graphical masterpiece of a product, don't get me wrong, which is partially why I've come to really love this new generation of consoles. Being able to choose between a 'commonly accepted' decent frame rate and the beauty of 4K visuals is a feature built into almost every current-generation game, and when it comes down to it I'm going to choose to buff the graphics every single time. (Assuming, of course, that the game still hits at least 30 even in 'quality mode'.) Because at the end of the day I want to be impressed and drawn in by the visuals; but then, I am someone who uses my current-gen console pretty much exclusively for those titles that lean towards realistic visual depictions and actually benefit from higher resolutions, whereas my trusty PC can handle pretty much anything else.

Take, for example, Resident Evil 4 Remake and its utterly resplendent visual mastery, rendering everything from the muddy pools of water forming in boot treads in the ground to the squirming mass of living carnal tentacles frothing out of the stump of a freshly removed limb. It's a spectacular, action-packed roller coaster of gore, but it's also a single-player-only survival game. Whilst not exactly slow-paced by any stretch of the imagination, Resident Evil 4 isn't ever throwing its players into split-second reaction-time situations against enemies who will slay you the very second you fail to do the same to them; it's not that sort of game. In such an instance, how important really is 60 frames per second when 30 is totally workable?

On the flipside you have a game like Redfall, which has a little more nuance to it. Again, Redfall is a non-competitive PvE shooter that will never quite throw its users into totally unwinnable 'split second or death' moments of gun duelling, but it is a first-person shooter. Games like that, where you're thrust into the head of the player character, have a different set of operating rules than most others. When your actions are meant to line up one-to-one with the avatar's, even the slightest latency delay can throw you off just that little bit; handling that is one of the responsibilities of creating a first-person shooter in the modern age. It's a difference that is really hard to 'just get used to' after experiencing the alternative in any other game that hits 60 FPS. Which is probably why the choice to launch without it was another of Redfall's many serious blunders.

There's a theory that the hypothetical 'final generation of games consoles' will turn consoles into specialised gaming computers that can be upgraded with parts like any desktop can be, and in that age the tech race will explode into the stratosphere. But I wonder what might get lost in the actual development process along the way. Some of the greatest games of any generation are created at the tail end of the generational life span, when tech is reaching its breaking point and artists have to be clever about how they make their dreams come true. Adversity breeds perseverance, which breeds quality genre-defining games, and that's the kind of willpower that can't be replicated in a machine by a trained AI running on a thousand processors.
