I’ve recently stumbled upon a couple of new companies like OnLive or Gaikai (demo) whose primary business model is to stream video games, hosted in huge server farms (the “clouds”), over broadband networks to everyone’s low-powered home computers. And this business model makes me think, not least when I remind myself that today’s video platforms like YouTube already take a huge share of global bandwidth (somebody calculated this for YouTube last year, before they started high-quality video streaming, and estimated that they stream about 126 petabytes a month).
No, it also makes me think about ecological issues. Let us compare the possible energy consumption of a “traditional” gamer with that of a (possible) future online gamer who uses one of these services. I won’t and can’t give you detailed numbers here, but you can probably see where I am heading if you have read Saul Griffith’s game plan – it’s all about getting the full picture of things.
Let’s start out with the traditional gamer, who has a stationary PC or laptop with a built-in 3D graphics card, processor and sound system. While he plays video games, all of his components are very busy: the CPU is calculating the AI and game logic, and the graphics card is processing the pixel and vertex shaders, rendering billions of pixels, vertices and polygons every second into a digital image stream, which is then sent to the user’s monitor at the highest possible frame rate. A sound system outputs the game’s voices, music and sound effects with the help of the computer’s built-in sound card. As I said, I can’t give you a final number here, since every setup differs a little from the next, but you can probably get an idea of how much power even an average gamer setup draws – several hundred watts.
How does the online gamer compare to that? Well, at first glance it looks good. The only things this gamer’s computer has to process are video and sound, and the video only has to be decoded from a regularly encoded digital format. Most PCs, even those with lower clock rates, will be able to accomplish this task. The sound will, by today’s standards, probably be simple stereo, so there is no need for a custom sound processor or a big sound setup either. I’d guess the usual consumption for this setup would be less than one hundred watts. Sounds great? Maybe, but maybe not.
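To make the comparison concrete, here is a minimal back-of-the-envelope sketch in Python. Every wattage in it is an assumption of mine for illustration, not a measurement:

```python
# Assumed component wattages under full gaming load (illustrative only).
traditional_gamer = {
    "cpu": 90,             # mid-range CPU under full load
    "gpu": 150,            # dedicated 3D graphics card under full load
    "mainboard_and_ram": 50,
    "monitor": 50,
    "sound_system": 30,
}

# The thin client only decodes a video stream and plays stereo sound.
online_gamer_client = {
    "cpu": 20,             # light load: video decoding only
    "mainboard_and_ram": 25,
    "monitor": 50,
}

print("traditional gamer:", sum(traditional_gamer.values()), "W")        # 370 W
print("online gamer (client):", sum(online_gamer_client.values()), "W")  # 95 W
```

So far the online gamer looks like the clear winner – but only because we haven’t counted the server side yet.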
The thing is that the video signal itself has to be generated first – on a high-end machine or “cloud” of computers. This means that the required graphics and CPU power consumption has merely moved from the “client” – the gamer’s PC – to a “server” component – it did not simply vanish. There is no longer a single computer consuming energy to let the user play, but possibly a huge server farm. And the parts of that farm which process the game’s contents need extra power. I don’t know how much, but I bet it won’t be little.
OK, server farms might be better suited for this kind of task, you might say, because virtualizing these computation-intensive tasks would let you run several server instances in parallel and therefore use their power consumption more efficiently… But wait, what gets virtualized here is not a simple web server that idles most of the time – we’re speaking of game virtualization. Remember how the single user’s PC was under full load while computing the game’s contents? And how much can the program code of a game that was written to run on a single PC really be virtualized and parallelized? Does each of these online gaming clients need dedicated hardware in the end…?
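Why consolidation works so well for web servers but maybe not for games comes down to one line of arithmetic. A minimal sketch, with utilization figures that are pure assumptions:

```python
import math

def machines_needed(users: int, load_per_user: float) -> int:
    """Physical machines needed if each machine can carry a total load
    of 1.0 (100 %) and the load could be packed perfectly."""
    return math.ceil(users * load_per_user)

# Assumed: a web session keeps a machine ~5 % busy, while a running game
# instance needs ~90 % of a comparable machine for its whole duration.
print(machines_needed(100, 0.05))  # web workload: 5 machines for 100 users
print(machines_needed(100, 0.90))  # game workload: 90 machines for 100 users
```

If a game really saturates its hardware, virtualization saves almost nothing – you end up close to one machine per player.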
Now, let’s assume the services somehow managed to work around these problems smartly – the online gamer’s power consumption footprint has of course already risen, because we learned that his video signal has to be created somewhere else first, which might cost a lot of power. But we’re still not there – the signal is still in the “cloud” – and it’s huge! Uncompressed video in true color, even at the – by today’s standards – lower resolution of 1024 by 768 pixels, takes 75 megabytes per second for a smooth experience! Hell, if I get a 1 MB/s download rate today I’m already happy…
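You can verify that figure yourself; a quick sketch, assuming 32 bits per pixel and 25 frames per second:

```python
width, height = 1024, 768
bytes_per_pixel = 4        # 32-bit "true color" (RGB plus alpha/padding)
frames_per_second = 25

frame_size = width * height * bytes_per_pixel   # 3,145,728 bytes = 3 MiB
raw_rate = frame_size * frames_per_second       # bytes per second

print(raw_rate / 2**20, "MiB/s")  # 75.0 MiB/s
```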
So, of course, the video signal needs to be compressed. While the later decompression is not as costly, the compression is – especially for real-time video – and it takes lots of processing power and a very good codec like H.264. Special, dedicated hardware might do this task faster than the average Joe’s PC components, but this hardware still needs extra power, which we need to consider.
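How hard does the codec have to work? Continuing the sketch, and assuming the raw 75 MiB/s from above has to fit through a 1 MB/s consumer downlink:

```python
raw_rate = 75 * 2**20    # 75 MiB/s of uncompressed video
link_rate = 1 * 10**6    # assumed 1 MB/s broadband downlink

print(f"required compression ratio: {raw_rate / link_rate:.0f}:1")  # ~79:1
```

Ratios like that are within reach for H.264, but achieving them in real time, with the low latency a game demands, is exactly the expensive part.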
Are we done with the online gamer? Well, almost: the video signal is created, compressed and ready for transport, but it hasn’t been transported yet. We need the good old internet for this, sending a constant, huge stream of packets over dozens of hops and myriads of cables to him. Every extra piece of hardware needed for this extra network load again draws hundreds, if not thousands, of watts. Not exclusively for gaming, of course, but the share of bandwidth and power these components consume is surely different when you stream a full-screen video than when you browse a website or listen to online radio.
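Putting the pieces together gives the full-picture sum I keep circling around – a sketch in the spirit of Griffith’s game plan, where every single number is an assumption of mine:

```python
# Assumed per-player wattages while a cloud game is running (illustrative only).
client_pc     = 95    # thin client from the first sketch
server_share  = 250   # assumed share of one server-farm machine per player
encoder_share = 50    # assumed share of dedicated real-time H.264 hardware
network_share = 30    # assumed share of the routers and switches on the path

online_total = client_pc + server_share + encoder_share + network_share
print("online gamer, full picture:", online_total, "W")  # 425 W
print("traditional gamer:", 370, "W")                    # from the first sketch
```

With these (admittedly made-up) numbers, the online gamer ends up above the traditional one, not below.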
As I said multiple times, I can’t give you any detailed numbers, but I really, really have the bad feeling that the whole idea of game virtualization is just a big, fat waste of resources – especially energy.