Yes, I realize that it uses more electricity, but I'm willing to let the BOINC client use my CPU to help further BOINC projects at the cost of a little extra electricity. If I get my electricity from wind power, it matters even less in the end. That's not to say it doesn't matter at all, because the goal is not just to use green power and waste it, but to use all electricity efficiently, thereby reducing the demand for power.
Here is a link to information regarding BOINC and electricity consumption:
The difference in power consumption between idle and full load on current processors varies from chipset to chipset and manufacturer to manufacturer. On some of them the wattage increase is over 100 watts, while on others it is much less.
Idle CPU versus full load CPU:
To explain my actions that day: I have BOINC installed, and I let it run while I'm doing tasks with light CPU demands such as email, Internet browsing, and word processing. Yes, it uses more electricity, but since I switched all my lights to CFL technology, the electricity I use to perform BOINC calculations is minimal by comparison.
What I was saying is that I unintentionally left my computer on when I left my apartment for many hours, but at least that electricity consumption was not wasted. Had the machine sat idle, it would have drawn over 100 watts the whole time, amounting to about half a kWh over the 5 hours I was gone. Because I had BOINC running in the background, it probably drew more, around 150 to 200 watts, for a total of about 1 kWh over the 5 hours, but at least that power went toward something.
So to look at my mistake: instead of losing 0.5 kWh to nothing but idle computer time, I used an additional 0.5 kWh, bringing my total to 1 kWh for science. Not a huge trade-off, but definitely not a total loss either.
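The back-of-the-envelope math above can be sketched in a few lines. The wattages here are the rough estimates from this post, not measurements of any particular machine:

```python
# Compare the energy used over 5 hours of idle vs. 5 hours of BOINC crunching.
# 100 W idle and 200 W under load are assumed figures from the discussion above.
IDLE_WATTS = 100    # assumed idle draw
BOINC_WATTS = 200   # assumed full-load draw (upper end of the 150-200 W range)
HOURS = 5           # time the machine was accidentally left on

idle_kwh = IDLE_WATTS * HOURS / 1000    # 0.5 kWh wasted doing nothing
boinc_kwh = BOINC_WATTS * HOURS / 1000  # 1.0 kWh, with the work going to science

print(f"Idle:  {idle_kwh} kWh wasted")
print(f"BOINC: {boinc_kwh} kWh total, {boinc_kwh - idle_kwh} kWh extra for science")
```

Plug in your own machine's idle and load wattages (a Kill A Watt meter will tell you) to see what your trade-off actually is.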
As a computer science major, I am aware of electricity consumption at the architecture level, as well as how software can affect consumption rates. However minimal those effects might be, they add up to more than minimal amounts over time and across millions of computers.
The idle process shows up as using 99 to 100 percent of your processing power, but it doesn't use much memory or anything else that draws significant electricity. It is basically a process that executes simple machine instructions to keep the clock cycles ticking and maintain the status quo of the computer. It doesn't eliminate power consumption, but it does lower it compared to a process that is memory-intensive or that exercises heavier instructions. On a CISC (Complex Instruction Set Computing) architecture like x86, there is even a dedicated instruction (HLT) that halts the processor in a lower-consumption state until the next interrupt.
Here is a good analysis of some other PCs running at idle and under load:
Another idle versus load electricity consumption:
Here is a BOINC article on idle time for those who don't know what we are discussing:
Here is a link about idle CPU time and electricity consumption across the US:
A good link comparing the electricity consumption of software, operating systems, monitors, and CPUs:
Energy Star's energy-efficient office:
Awesome analysis of a typical house and its electricity consumption:
After reading that, it would be useful to do the following:
1. Unplug your wireless router when not in use.
2. Unplug your network switches when not in use.
3. Unplug your TV when not in use.
4. Turn off your monitors when you leave a room.
5. Turn off your monitors whenever you aren't looking at them. (Only if you have an LCD, because LCDs can be switched on and off many times without a problem.)
6. Turn off the computer speaker system when not in use because the amplifiers draw electricity whether it is being used or not.
7. Turn off the printer when not in use.
8. Unplug your fully charged devices such as battery chargers and cell phone chargers once they are done charging.
If you don't do those steps, you are consuming on average about:
1 watt (per cell phone charger)
3 watts (per battery charger)
3 watts to 5 watts (per TV)
4 watts (per printer)
11 watts (per wireless router)
3 watts (per ethernet switch)
3 watts (per computer speaker system)
2 watts to 4 watts (per LCD computer monitor)
30 watts to 34 watts (assuming only one device of each kind)
That is the "phantom" electricity effect. It would be better to just turn that stuff off.
Each year, if you leave those devices plugged in and on standby, you are consuming 262.8 kWh to 297.84 kWh.
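Here is that annual figure worked out from the standby wattages listed above. The per-device wattages are this post's estimates (using the low end of each range), and the $0.08/kWh electricity rate is an assumed average, so treat the dollar figure as a ballpark:

```python
# Annual "phantom" load from leaving standby devices plugged in,
# assuming one device of each kind at the low-end wattages above.
standby_watts = {
    "cell phone charger": 1,
    "battery charger": 3,
    "TV": 3,               # low end of the 3-5 W range
    "printer": 4,
    "wireless router": 11,
    "ethernet switch": 3,
    "computer speakers": 3,
    "LCD monitor": 2,      # low end of the 2-4 W range
}
HOURS_PER_YEAR = 24 * 365  # 8760 hours

total_watts = sum(standby_watts.values())         # 30 W at the low end
annual_kwh = total_watts * HOURS_PER_YEAR / 1000  # 262.8 kWh/year

RATE_PER_KWH = 0.08  # assumed average electricity price, $/kWh
print(f"{total_watts} W of standby load -> {annual_kwh} kWh/year, "
      f"about ${annual_kwh * RATE_PER_KWH:.2f}/year")
```

Running the same numbers at the high end (34 W) gives 297.84 kWh, which is where the range above comes from.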
That's roughly $20 and a lot of carbon dioxide output wasted.
Well, that's not very large for just one person, but scale it to the whole population and you see why it's important to conserve every bit: across the entire US, it comes to roughly $20,000,000,000 each year.
The effect of killing a watt here and there over the entire US electrical grid snowballs into serious numbers when everyone does it:
Here is some awesome software that reduces power consumption for offices running multiple machines just for email and word processing. It's a good idea because it's basically the old mainframe-and-terminals concept revived. I like this approach for low-CPU-demand users, such as office-only and internet-only users.
Awesome low power consumption software:
Google and Intel among others team up to make computers much more efficient: