HalloweenWeed
Let's take a computer that actually uses 525 watts.
Pulling 525W at idle would be very high. I have a Kill A Watt, and my UPS has an LED display showing how much power is being pulled from the wall. I have an overclocked i7 920, 2x 285s, 3 hard drives, 7 fans, a DVD-RW, etc., and when my computer is sitting idle with the monitor off I am pulling around 220W from the wall. Doing stuff like web browsing I pull around 330W. When I am doing intense gaming it varies widely, but can get a bit over 800W.
The way I calculate cost is with the approximation $1 per watt per year. This works out to 11.4 cents/kWh, so the exact cost can be off a bit depending on your rate; but this approximation is easy to do and gets you in the right ballpark.
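To show where that rule of thumb comes from, here's a quick sanity check: one watt running continuously for a year is about 8.77 kWh, so $1/W/yr corresponds to roughly 11.4 cents/kWh.

```python
# Sanity check of the "$1 per watt per year" rule of thumb.
HOURS_PER_YEAR = 365.25 * 24               # ~8766 hours in a year
kwh_per_watt_year = HOURS_PER_YEAR / 1000  # 1 W all year ~= 8.766 kWh
implied_rate = 1.00 / kwh_per_watt_year    # the $/kWh that makes the rule exact

print(f"1 W for a year = {kwh_per_watt_year:.2f} kWh")
print(f"implied rate   = {implied_rate * 100:.1f} cents/kWh")
```

If your electricity is much cheaper or pricier than ~11.4 cents/kWh, just scale the rule accordingly.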
I leave my computer on all the time, and use it for gaming maybe 20 hours a week. 20 hours a week is about 12% of the time, so my cost is something like 700W * $1/W/yr * 0.12 + 220W * $1/W/yr * 0.88 = ~$280/yr. If I turned my computer off when I was not using it, say 75% of the time, then I could save 220W * $1/W/yr * 0.75 = $165/yr, which is a non-trivial amount of money.
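The duty-cycle arithmetic above can be sketched in a few lines (using my numbers: 700W while gaming 12% of the time, 220W idle the rest):

```python
# Yearly cost using the $1-per-watt-per-year approximation.
DOLLARS_PER_WATT_YEAR = 1.00

def yearly_cost(watts, fraction_of_time):
    """Cost of drawing `watts` for `fraction_of_time` of the year."""
    return watts * DOLLARS_PER_WATT_YEAR * fraction_of_time

gaming = yearly_cost(700, 0.12)  # ~20 h/week of gaming
idle = yearly_cost(220, 0.88)    # idle the rest of the time
savings = yearly_cost(220, 0.75) # shut down for 75% of the time instead

print(f"always on: ${gaming + idle:.0f}/yr")
print(f"savings from powering off 75% of the time: ${savings:.0f}/yr")
```

Swap in your own wattages and duty cycles to estimate your own bill.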
HVAC cost is an interesting angle. In the winter, when I am running the heat, this power is really a wash: if my computer did not heat up the house, my furnace would have to make up that heat anyway. In the summer, when I run the AC, I get billed double (a little more than double, actually): once for the computer's power and again for the AC to remove that heat. So it is like a 0x cost multiplier in the winter, a 2.3x multiplier in the summer, and 1x in the spring and fall. What I need is a way to vent the computer heat outside during the summer to get that multiplier back down to 1x :)
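Those seasonal multipliers can be folded into one effective number. Here's a rough sketch, assuming (hypothetically) equal-length winter and summer at a quarter of the year each, with spring/fall making up the rest; the $280/yr base and the 0x/2.3x/1x multipliers are from the estimate above:

```python
# Effective yearly multiplier from seasonal HVAC interaction.
# Season lengths are an assumption for illustration, not measured.
base_cost = 280.0  # $/yr, always-on estimate from above

seasons = {
    "winter": (0.25, 0.0),       # heating: computer heat is a wash
    "summer": (0.25, 2.3),       # cooling: pay again to pump the heat out
    "spring/fall": (0.50, 1.0),  # no HVAC: pay only for the electricity
}

effective = sum(frac * mult for frac, mult in seasons.values())
print(f"effective multiplier: {effective:.3f}")
print(f"effective cost: ${base_cost * effective:.0f}/yr")
```

With these assumed season lengths the multipliers nearly cancel, so the simple estimate isn't far off; the longer your cooling season, the more venting the heat outside would save.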