Using those ‘wasted’ cycles


I’m no stranger to grid applications, and on the whole I’m all for making better use of my computer equipment if I can, but I think there’s a slight misconception about how these applications work. In general, most people – especially in offices – leave their computers on all day and all night, which means they are using up electricity. If each worker spends 8 hours a day on their computer, that’s 16 hours in which the computer is not being used to its full extent; ergo, ‘wasted cycles’. The computer sits there, largely idle, perhaps occasionally checking your inbox or refreshing your web pages.

Some individuals will use this as an excuse to run distributed computing projects, like SETI@Home or Distributed.net, to ‘use’ those ‘wasted’ cycles. SETI@Home even gives this as one of the reasons to install its software. Note the quotes; they’re important. The problem is that these cycles aren’t entirely wasted: your machine is doing something. More importantly, if you run a distributed computing project you’ll be doing more than making use of those ‘wasted’ cycles – you’ll be generating more of your own.

You see, these programs rely on maxing out your CPU in order to get the most out of the time your machine is not being utilized. Unfortunately, ask a modern CPU to do more work and its power consumption increases. Take a lowly 500MHz G4 – hardly a modern CPU – and power usage climbs from 5.3W at idle to 11.9W under load; that’s more than double for the CPU alone. Increase the CPU’s workload and you increase its temperature, which makes it more likely the fans will switch on, pushing overall power consumption higher still. We’re no longer using wasted cycles; we’re generating more, and using more power in the process. At a time when parts of the USA, China and many other countries have energy problems, is using ‘wasted cycles’ really the best use of what little power capacity remains?
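
As a rough illustration, here’s a back-of-the-envelope sketch of the extra energy a single machine might burn by running a client through its idle hours. The 5.3W and 11.9W figures are the G4 numbers above; the 16 idle hours and the per-kWh price are assumptions for the sake of the example, and it ignores fans and power-supply overhead entirely.

    # Back-of-the-envelope: extra energy from maxing out an otherwise idle CPU.
    # Wattage figures are the 500MHz G4 numbers quoted above; the idle hours
    # and electricity price are assumptions for illustration only.
    idle_watts = 5.3          # CPU at idle
    loaded_watts = 11.9       # CPU at full load
    idle_hours_per_day = 16   # machine on but unused (from the 8-hour workday)
    price_per_kwh = 0.10      # hypothetical electricity price, in dollars

    extra_watts = loaded_watts - idle_watts
    extra_kwh_per_year = extra_watts * idle_hours_per_day * 365 / 1000
    extra_cost = extra_kwh_per_year * price_per_kwh

    print(f"Extra energy per machine: {extra_kwh_per_year:.1f} kWh/year")
    print(f"Extra cost per machine:   ${extra_cost:.2f}/year")

On those assumptions it works out to roughly 38 kWh a year per machine for the CPU alone – small for one computer, but multiply it across every machine in a large project and the ‘wasted’ cycles start to carry a real cost.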