Laptop And Computer Repair
Tips and Tricks
Ever since I purchased my first PC over a decade ago, one of the things that has consistently irritated me more than anything else as a PC gamer is the continual need to upgrade my hardware to play many of the latest games. When I bought my first PC, it could run pretty much everything being released, but within a couple of years, apart from the odd strategy or puzzle game, it was relatively useless for gaming and - as it was a laptop - I couldn't do much to upgrade it.
The desktop PC side of things is almost as frustrating. While the capacity to upgrade is there, over time it would have meant upgrading the memory, graphics card, hard drive and eventually the motherboard (once the PC needed a more powerful processor than the motherboard supported) - effectively buying a new PC. While I appreciate that this will happen as we demand bigger and better games, what is worrying is the rate at which we are expected to keep upgrading.
For me, the main concern is that PC developers constantly seem to be writing games that need higher-spec machines, with a surge in system requirements every six months or so. Is it because the software being written just can't work on older systems, or is it that many games developers lack one key skill that is commonplace amongst console programmers?
What skill do I mean? Well, one thing I have seen over the years with the PC is a growing level of complacency amongst programmers. Very few developers optimise their code to make the best possible use of the hardware available to them. If a programmer is trying to get a game to do something and the system won't quite allow it, instead of finding a way to do it, they push up the game's minimum system requirements so they can rely on more processing power or a better graphics card to do the work for them. What do I mean?
Well, if you look at early games on the PS3 and Xbox 360 and compare them to the current games released for both consoles, you'll see a massive step up in graphics and performance as programmers have pushed the consoles further and got more out of them. Over the same period, PC gaming has seen the same growth in gaming technology, but the hardware requirements needed to play the games have increased accordingly. Take a typical gaming PC from five years ago (to make a direct comparison with the PS3 and Xbox 360): apart from casual games, you won't find many modern titles that will run on it.
While this step up in gaming technology often results in some great looking games that are usually worth playing, it leaves a lot of people with older desktop PCs with limited options - upgrade your hardware, don't use your PC for games, or get a dedicated games console instead. It has to be said that this choice is part of the reason for the continued success of consoles in the market as gamers generally don't have to worry about upgrading them every few months just to be able to play the latest titles!
Granted, there are games on the market today that need the extra processing power and more powerful graphics cards to offer the levels of realism that gamers demand, but are some of these specifications really needed, or could programmers work more efficiently - or even look to the past in the way they approach game development?
One of the main things that has bugged me for a long time about PC gaming is the sloppy use of hard disk space. Every developer seems to think that we all have unlimited storage space on our hard drives, and that as long as we have room for THEIR game, nothing else matters. I find it astonishing that games need as much space as they do, and that most seem unable to use the DVD-ROM drive to access game data during play. Surely developers could store just the necessary data on the hard drive and load everything else from disc as it is needed during the game? With the speed of today's DVD drives, this is certainly a practical option compared with loading the hard drive with several GB of data for every game you want to play.
If publishers and developers really do believe that their titles will run better that way, then why not offer gamers the choice of a full or partial installation instead? We get that option with applications, so why not with games as well? Certainly for those of us who use our PCs as a home entertainment hub or a media creation system, where storage space is at a premium, the amount of space allocated to games really is something you think about.
Graphics are another sore point for gamers. Every new version of Windows places a greater strain on system resources, meaning many users immediately need to upgrade their systems to get the most out of it; when games then push up their minimum requirements just to run (let alone perform at their best), it makes me wonder where this will end. A lot of gamers are running 1GB graphics cards right now, but what's to say that in a year or so we won't see games needing that as a minimum and recommending even more? I don't see an end to this until programmers learn to program games better.
Fortunately, sound in games is one of those rarities that hasn't really changed much over the years. Granted, there are options for stereo, surround, and 5.1, but I can't think of any games that won't work, at least in the sound department, on most PCs. That said, there's still the issue of how sound is handled: too many games store the in-game music as files on the hard drive instead of streaming CD audio or other audio data from the DVD as the game plays.
Another issue is processor speed. Every few months, new games are released that need much more powerful processors to run. This is where most programmers seem to be at their laziest! Instead of optimising their games to run faster and more efficiently (and, as such, to run on a wider range of hardware), they just push up the minimum specs, leaving players with the problem of having to upgrade if they want to play.
The same applies to memory. If you have anything less than 4GB of RAM these days, you can forget running most games. Yes, some will run with less if you are lucky, but in all honesty you're probably going to be limited to casual games or older titles. If you are looking for something with a little more action, or with an impressive 3D engine, you'll have to look elsewhere.
Finally, we come to the operating system itself, and I firmly believe it is only a matter of time before we start seeing games and other software that will ONLY run on Windows 7 or 8. Even though the PCs themselves are certainly up to it in terms of performance, a lot of older games simply won't work under Windows 7, and it's becoming more common for the reverse to be true as well. So no matter how much hardware you add to your PC, you may also be looking at installing a new version of Windows - and be careful, because your old software collection may no longer work.
It's a sad state of affairs really, and one that never used to be the case. Back in the 16-bit era, Amiga and Atari ST owners never had to worry about this. You bought a game and it just worked; although some games eventually needed expanded systems, that only happened once the systems had naturally progressed for all users. It was the same in the 8-bit days. Developers knew the hardware on offer and worked to fit games into fixed hardware specifications, enhancing them for more powerful machines, not writing them to suit whatever hardware was coming out next.
This is easily the greatest strength of consoles: with a fixed hardware platform, developers are forced to push the hardware and their programming skills to their limits. When the original PlayStation was released, no-one expected to see racing games of the standard of Gran Turismo on it, and the PS3 and Xbox 360 have seen the same kind of technical progress as developers strive to get the consoles to do what they want them to. Developing games within fixed hardware limitations forces programmers to be innovative, to write faster and more efficient code, and to squeeze every drop of performance out of the host machine. You never hear of console games being released that won't work without more memory or better graphics hardware, so why can't PC developers take the same view?
From a size point of view, there simply isn't any justification for games - whether full retail titles or casual games distributed digitally - to be as large or as demanding as they are. Games that could fit into under 100MB with proper programming take up the best part of a GB as downloads, and it beggars belief that we have accepted this state of affairs for so long.
What surprises me most about this attitude is that not supporting people with less powerful PCs is commercial suicide. The more machines a game will run on, the more people will be able to buy it. You are not going to stop a lot of people from copying games, but with a larger potential audience, the extra sales should more than cover the extra work involved in making titles run on more modest hardware.
Unfortunately, we're a captive audience. If we want to keep playing all these new PC titles, sooner or later we all end up upgrading our systems - and then our brand new state-of-the-art hardware becomes the minimum requirement for games a few months down the line!