Oh boy, this is going to be a long one.
First, some background to explain where I am coming from:
When I was a kid I used to play a wide variety of titles, from both large studios and more independent developers.
I remember thinking that the likes of Wolfenstein 3D were going to be the future. Imagine: first person titles giving you the perspective of your character and actually letting you inhabit them. Wolfenstein 3D wasn’t quite there yet, but then a series of things happened in rapid succession that changed gaming (at least for me) forever.
In 1993 Doom came out. A few years later, in 1996, we had Duke Nukem 3D. Quake came shortly after, and then Quake II in 1997.
These were fun action packed first person titles, and while I loved them, they still felt a little shallow. That was about to change too.
Half-Life came out in 1998 and changed gaming forever. Deus Ex came out in 2000 and built on what Half-Life had achieved, but went even deeper, with more immersive first person storytelling, while also adding some RPG elements.
I didn’t discover System Shock (1994) until WAY after it launched, but when I did, it had me mesmerized. It was a very influential title.
While I had missed the first beta mod releases in 1999, in 2000 I discovered something unlike anything we had seen before: Counter-Strike.
Semi-realistic, first person, team based multiplayer with “real” guns.
I was absolutely hooked.
Over the next five or so years I must have put 10,000 or more hours into Counter-Strike and Counter-Strike: Source. I wound up running two of the biggest public Counter-Strike servers on the U.S. East Coast during and immediately after college. (I graduated in 2003, miraculously in 4 years with high Latin honors in a difficult engineering discipline, despite spending what amounted to nearly full-time-job hours in game, working part time, and keeping a busy social/party calendar.)
I still don’t know how that happened. Given the amount of time I spent in game, I should have flunked out like so many others who discovered Counter-Strike at that time and became obsessed. Some of my fellow students hated me after I got an A in Differential Equations when most failed. The professor actually carded me during my midterm, as I had only shown up once (the first class of the semester), never done any of the homework, and he had no idea who I was. I wound up attending Differential Equations only 3 times: first class, midterm, and final, and somehow pulled off an A.
Anyway, I digress.
The moral of this story was that I was irresponsible with my time in college and was obsessed with Counter-Strike and should have flunked out as a result, but somehow didn’t.
Post-college, in October 2003, I discovered the UT2003 mod Red Orchestra, and once again I was hooked. Counter-Strike felt almost silly by comparison: shallow and unrealistic next to the gritty eastern front environment of the Red Orchestra UT2003 mod, and later the games Red Orchestra: Combined Arms and Red Orchestra: Ostfront 41-45.
I was completely sucked into the grittiest, harshest, most realistic game I had ever experienced to date.
I alternated back and forth between those and good AAA first person single player titles:
Half Life 2, Crysis, Bioshock, etc.
Then a new generation of titles started coming out. They were built on the storytelling and RPG elements of first person titles like Deus Ex, but took it even further, adding large dynamic open worlds.
We are talking the likes of the Far Cry Series, Fallout series (at least 3, NV and 4) etc.
Titles I have been into over the last 15-20 years or so have been:
- Red Orchestra
- Red Orchestra 2
- Half Life 2
- Far Cry Series (at least 2, 3, 4, Primal, 5, and to a lesser extent 6. The first one was IMHO a boring run and gun action game, and New Dawn was a big letdown, but I still finished them both)
- Fallout Series (at least 3, NV and 4)
- Deus Ex Series (Original, Human Revolution, Mankind Divided)
- Metro Series (2033, Last Light, Exodus)
- S.T.A.L.K.E.R Series (SoC, CoP; these are some of my all time favorite single player titles, maybe excluding Clear Sky)
- BioShock Series (1,2,Infinite)
- Dishonored Series
- The Outer Worlds
- Prey (2017)
- Singularity (a bit short, but very well made)
- Dying Light Series (1, The Following expansion, 2. The first one was the best)
- Wolfenstein reboots (New Order, Old Blood, New Colossus. A little “run and gun” for my tastes, but they passed the time)
- Crysis Remastered (again, not really my style, but it passed the time)
- Borderlands Series (again, not really my style, but it passed the time)
- Chernobylite (This one was a quite surprising gem)
- Cyberpunk 2077 (It got a lot of hate due to the rough launch, but I only played it once patched up to v1.31 and I found it absolutely mesmerizing and enveloping)
- Starfield
There are probably some I am forgetting, but between this list, Counter-Strike, and Red Orchestra, that pretty much sums it up…
Except for one. Back in 1991 I really got into Sid Meier’s Civilization. As I moved on, went to college, discovered Counter-Strike, etc., I mostly forgot about it, until rediscovering it at the tail end of the Civ4 era, right after the final Civ4 expansion “Beyond the Sword” launched. I have bought and put many hours into every Civilization game and all of its expansions since. Well, at least until 7. I couldn’t bring myself to play that one. Maybe I’ll pick it up on sale at some point.
The one big game (once I am done with my big upgrade project) I have on my “To Play List” is one I have been waiting 15 years for, ever since I finished S.T.A.L.K.E.R: Call of Pripyat, and that is S.T.A.L.K.E.R 2. The game had a bit of a rough launch (much like Cyberpunk 2077) though, and it is still obscenely CPU intensive, so I needed an upgrade before tackling it, which resulted in an unwise overkill upgrade that is taking me forever to finish. (see my build log)
S.T.A.L.K.E.R has been my all time favorite single player series (only just ahead of Deus Ex) and I wanted to wait until I could make my first play-through of the new sequel as great as possible.
Anyway, so that is the really long version of my gaming background.
TLDR Version:
I started by playing games on the Atari and Commodore 64 at friends’ houses, got an 8-bit NES in 1986, and loved it until I discovered the PC in 1990, after which I never touched another console. In the early 90’s I played all sorts of varied titles on the PC (anything I could get my hands on, really), but then first person action titles came along and changed everything, followed by first person story based titles, first person story based titles with RPG elements, and finally first person open world story based titles. And a lot of team based multiplayer games, the more realistic the better (though I could never get into Arma or Squad).
This has been my world in games. Nothing else has really resonated with me, at least since the 90’s. Not 3rd person titles, not Japanese or other eastern games. Not real time strategy. Not small indie platformers.
I would probably enjoy something like EVE Online or Elite Dangerous, or maybe even Star Citizen, but judging by how much I liked the original Wing Commander: Privateer in the 90’s, I fear I’d get too absorbed by it.
My take on the current state of games:
I think both extremes are exaggerating. There are certainly some issues with modern games, but it is nowhere near the worst it has been. Then again, I also don’t think it is the best it has been.
First off: I have almost completely stopped playing multiplayer games. This is almost difficult to believe considering how absorbed I was by Counter-Strike and Red Orchestra at one point.
Part of it is that I have gotten older, and can’t make time for multiplayer gaming anymore. The gaming I do today is sporadic, and mostly after everyone else in the house is in bed, and then I don’t want to be talking loudly on a microphone. Instead my gaming sessions become a way to de-stress late at night before bed.
But I also have some real issues with how modern multiplayer gaming has turned out:
It used to be that if you were going to play a multiplayer game you had to find a community server. There simply weren’t official servers, match-finders, or ways to automatically join a game with people on your friends list. While this was comparatively inconvenient, it also had some real benefits.
Firstly, moderation was better. A good community server had a server operator in game most of the time, and if you were an asshat you’d get yourself kicked or banned, and if you liked a server you didn’t want that. You also got to know the regulars and built relationships with them, which meant people would not be total jerks in the way that is common today.
Sure there are still community servers for some games, but it is not the same. People are lazy and just auto-join a game, resulting in most community servers sitting empty. There was a real benefit to people being forced to use community servers.
Now instead there is a vast wasteland of official servers that are either poorly moderated by the developer or use some form of algorithmic or AI moderation that produces false positives while missing real abuse, and in general results in a worse experience.
- Skins / Lootboxes / Trading:
This really ruined the experience for me. You can’t join a casual Counter-Strike server anymore without some moron harassing others about trades, or seeing their knife, etc. etc.
You don’t always feel like playing a serious match, but on the flip side, casual non-match servers went to shit in the “skins” era.
That, and many of the skins are completely fourth wall breaking. No one in their right mind would charge into battle wearing that!
As a result, I pretty much permanently stopped playing Counter-Strike after skins and lootboxes were introduced in an update to CS:GO in 2013.
- Streamers:
Oh god, do I hate streamers. Fishing for likes and subscribes and doing everything but playing game objectives.
The existence of streaming has - IMHO - further ruined multiplayer gaming to the point where I don’t even try anymore.
I haven’t joined a multiplayer game in probably over 5 years.
Which means these days it is all single player FPS or Civilization for me.
But all is not well in the single player world either.
- Games launching in a broken state:
Cyberpunk and S.T.A.L.K.E.R 2 stand out here, but launching games in a broken and in many cases unplayable state has become the norm rather than the exception these days. I mostly blame greedy investors/publishers wanting to rush titles out so they get a return on investment sooner.
I miss the good old days when game studios were run by geeks like John Carmack who cared about doing it right. I’ll never forget his answer to the question of when Doom 3 would launch: “When it’s done.” I would love to see that philosophy back in games.
- Many games are forced to be always online:
Yes, even single player games. It’s obnoxious. Stop it. Unless a game is multiplayer, it should be completely playable without any kind of internet connection, without ever even attempting to connect to anything on the network. If a game has both multiplayer and campaign modes, the campaign mode should be entirely playable without ever attempting to connect to the internet.
- Every damn publisher and/or developer wants to force their launcher on you:
This is obnoxious. I already have my preferred store and launcher. Don’t force software on me that I don’t need or want. It’s my computer. I should be the one in control.
- Skins and lootboxes are infecting single player games too
You’d think you would be able to just ignore them in a first person title, but the publisher couldn’t have that; then no one would spend money on them. So you get fourth wall breaking nonsense like being forced into third person mode whenever you enter an encampment in Far Cry 6 so you can admire your skins.
They don’t even offer a way to disable it. It ruins the immersiveness, and by doing so kills a huge aspect of the game.
- Dumbing down game UI and mechanics
I first started noticing this in the bad old days of shitty console ports in the late 2000’s. You know, back when we saw an article predicting the death of the PC every other week.
It was clear to me that consoles were being developed for people who liked less complexity, and this was spilling over into PC titles as well due to ports and co-development.
The worst was probably the release of “Civilization Revolution” which felt like it was developed for 6 year olds.
Then for a while there it started getting better. PC games were getting lovely and complex again. That was until Far Cry 6 came out, and parts of the UI felt like a damn mobile game. It was quite disappointing.
The simplified, childish UIs looked terrible. And they gave away all of the different equipment you could get before you got it, so there was no longer any mystique, that feeling of never knowing what you’d find.
That, and the improvised weapons system felt childish and dumbed the game down a lot compared to its predecessors.
I’m not entirely sure if this is a huge trend yet, but I fear it is. I hope not though.
- Bribed exclusives trying to force you to use other stores or launchers that you don’t want:
Epic Games Store, I am looking at you. This removes consumer choice, forces people to have tons of bloated launchers on their machine and is just plain obnoxious. It ought to be illegal.
- Development is too expensive:
Most AAA development teams are huge, bloated and cumbersome, and waste money on things that aren’t needed, like celebrity voices and likenesses and marketing budgets.
Most of the time celebrity voices and likenesses just distract from, and break the fourth wall of a good immersive game, and add cost for no reason.
(I have only once seen this work, and that was with Johnny Silverhand in Cyberpunk 2077. I can’t explain why that was, but it just did. It shouldn’t have.)
As a result an average AAA game costs between $60M and $200M to develop these days, which is absolute insanity. Titles like Red Dead Redemption 2 reportedly cost half a billion dollars to develop.
The result is that despite the player base being enormous today compared to back in the day, meaning there should be far more gamers to spread the development costs over, games just keep getting more expensive.
Launch prices of $70 to $120 are commonplace now. I have yet to play a game I thought was worth more than $29.99.
The industry should learn from Chernobylite, a fantastic game developed on a shoestring budget. (initial $200k Kickstarter, and while there were probably additional investments after that, still nowhere near the typical AAA game.) It was a better game than most of the $60M to $200M titles I have played.
- The combination of the advent of Ray-tracing and ballooning GPU pricing has made good, high quality, high resolution gaming out of reach for most:
Let’s first be real about raytracing: it had nothing to do with increasing the graphical fidelity of games. Late pre-raytracing raster titles looked almost as good as the best raytracing titles.
The problem? They were very labor intensive to develop. Artists had to spend lots of time on shadow maps, fake lighting effects, reflections, etc. to make them work. The end result looked good, and needed only a fraction of the GPU rendering power.
Nvidia pushed raytracing because they knew it would be attractive to game developers looking to save on artist headcount. After all, it is much easier to design a 3D model, texture it, and point light sources and cameras at it than it is to do all the raster graphics trickery needed to make it look good.
Once developers saw what Nvidia had to offer to save them artist costs, they were almost guaranteed to embrace it, and once they did, Nvidia’s competition (only AMD at the time) would have no good competing product.
It was all about Nvidia defending its market share (a market share it would later abandon in favor of crypto and AI) and about developers saving money by hiring fewer artists, pushing the cost of high end raytracing GPUs and huge power usage onto consumers instead. And games didn’t even get any cheaper, despite these huge artist cost savings. Devs just pocketed the difference.
And this all happened at the same time as we saw a rise in alternative demand for GPUs. First it was crypto-mining, and then it was AI training. This meant that gamers were in direct competition with huge moneymaking interests to get their hands on GPUs.
And yes, you can disable ray-tracing, but if you do you get a subpar experience, much worse than how raster graphics looked before ray-tracing was around.
So this has resulted in upscaling (fake pixels) and frame generation (fake frames) being pushed as an alternative. Most can’t afford the GPUs it takes to actually render a game natively, so instead we get these shitty compromises.
Surprisingly, upscaling has gotten to the point where mild upscaling actually looks decent. Do too much of it and it still looks terrible, but mild upscaling is quite good now. I’m thinking of the likes of DLSS in Quality mode. Quality mode is great! Balanced and Performance, not so much.
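To put “mild” versus aggressive upscaling in raw pixel terms, here is a quick back-of-envelope sketch. The per-axis scale factors below are the commonly cited DLSS preset defaults (an assumption on my part, not something read from any official API):

```python
from fractions import Fraction

# Commonly cited per-axis render-scale factors for DLSS presets
# (assumed defaults; Balanced sits in between at roughly 0.58).
SCALE = {
    "quality": Fraction(2, 3),      # ~66.7% per axis
    "performance": Fraction(1, 2),  # 50% per axis
}

def internal_resolution(out_w, out_h, preset):
    """Resolution the GPU actually renders before the upscaler runs."""
    s = SCALE[preset]
    return int(out_w * s), int(out_h * s)

# Upscaling to 4K: Quality renders 2560x1440, Performance only 1920x1080.
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

Note that the scale applies per axis, so Performance mode rasterizes only a quarter of the output pixels, which is why the heavier presets look so much worse than Quality.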
Frame generation, however, is just objectively terrible because of what it does to input lag. It only works well when you already have a high enough framerate to keep input lag low (in other words, when you don’t need it).
If your framerate is inadequate without it and you turn it on to improve it, you still get shitty input lag ruining the experience.
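The input lag argument can be put in rough numbers. Assuming an interpolation-style frame generator must hold back one fully rendered frame before it can synthesize an in-between frame (a simplified model that ignores generation compute time and pipelining), the added latency is about one base frame time:

```python
def added_latency_ms(base_fps):
    """Rough extra display latency from interpolation-style frame generation:
    the generator holds back one real frame, so latency grows by about one
    base frame time. (Sketch under stated assumptions, not a measurement.)"""
    return 1000.0 / base_fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps base -> ~{added_latency_ms(fps):.1f} ms extra latency")
```

At a 30 fps base that is roughly an extra 33 ms on top of the existing render and display latency, while at 120 fps it is only about 8 ms, which is why the feature feels worst precisely when the base framerate is too low to begin with.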
- Historical inaccuracies, over-representation of minorities and less physically attractive characters:
With great risk of being accused of being either sexist, racist, body shaming, or any other form of unpopular thing to be called these days, I have to bring this up.
I enjoy my games the most when they are as “real” as they can be. Within reason, of course. I wouldn’t want to play a game where I get up at 6:30 am to work a boring cubicle job for 45 years straight, retire, and then die of cancer (these are escapes and fantasies, after all), but there needs to be just enough realism to make the scenario believable, so it can be immersive and suck you in.
To a certain extent they need to depict the world as it is, or as it believably would be, not as we want it to be.
In some games this is fine and even works really well to make a game feel more alive and real. Cyberpunk was like this. It is set in a future where many of these things have become fully normal, and because of that, the abundance of homosexual and trans characters kind of works. The way-out-there Cyberpunk fashion sensibilities also mean the 300lb woman NPC with pink hair doesn’t stand out that much. It just works. The LGBT element in the game is maybe a little over-abundant as a percentage of the characters, but it works. I don’t have a problem with it.
But when you are dropping black Nazi German soldiers into a WWII game, that is just too much. It isn’t believable at all, and it ruins immersion.
Don’t get me wrong. I am all for inclusion. If it works with the story and setting, I want games where every single player can play a character they identify with. But it absolutely has to work with the story and setting, otherwise it is just wrong.
As a white guy I was totally OK with playing Far Cry 4 as Ajay Ghale, a presumably south Asian young man, because that is where the game was set. (the choice of villain was a little weird though)
It would have been really weird and less of a good game, if - for instance - in a game set in Japan my character was dropped in like Tom Cruise in the Last Samurai. It just wouldn’t have worked, and would have broken immersion.
I would like to see this kind of adherence to historical or setting accuracy, rather than bending over backwards to be inclusive.
This was one of the reasons I rolled my eyes at, and didn’t wind up buying, Civilization VII, despite having bought every previous release after Civ4 at launch. The make-believe “leaders” who were never real world leaders, included in the name of inclusiveness, were a bit much. (And the civilizations suddenly evolving into a different civilization in a different era just didn’t feel right to me.)
This next one applies less to me, as I generally want my games to be more realistic, so when characters and NPCs in a game look more like normal people, I like it. But game devs shouldn’t forget that gaming is an escape and a fantasy for many. Interacting with characters that are attractive and interested in the protagonist sells for a reason.
I tend to like a balance. But some titles go too far, for instance giving every single female character in the game the body of a stereotypical “nurse Helga”: big and overweight, with a nose ring and a face that looks like it took a direct impact from a truck.
I’m not some far right conspiracy theorist, and I generally tend to roll my eyes when people start complaining about “wokeness” in games (or really anywhere else in society) but even I can see that they often have at least a small point, and this is a trend that is impacting games, often in a negative way (but sometimes also in a positive way!).
I’d like to see more of the positive inclusiveness, and less of the negative. That’s all.
So, in summary:
I have many gripes about modern games, as I have expressed above. But gaming is also in a huge renaissance, which is great, as it results in many titles and a lot of variety being available on the market.
Unfortunately “trends” are still a thing that exist, and while there are many games on the market, most of them exhibit the same trends, and if you don’t like those trends, you are not going to like the state of gaming today.
Still, I’d say that games today are generally better than in the bad old days of the late 2000’s to early 2010’s, when the death of PC gaming (if not the death of the PC altogether) was being predicted in article after article. All we got back then were piss poor console ports that were buggy, had keyboard mappings hard coded to console controllers, still had controller prompts (like Press “X” to doubt) left in the PC port, lacked the graphical fidelity to push PC hardware, and were often locked to 30fps.
Some of my favorite titles of all time were PC exclusives (or at the very least, developed first for PC, with ports to other platforms being an afterthought, and thus not influencing the development of the game). I can’t help but wonder if we had more PC Exclusives today, if I would like many of these titles better.
I still think the 2019-2021 (depending on whether you count the Early Access or final version) game “Chernobylite” is a great example of what can still be done on a shoestring budget when you have a good team. I thoroughly enjoyed that game. (though I do have to admit that the post-apocalyptic eastern European vibe is right up my alley and makes me somewhat biased)
