With your electricity costs, I see your point. In USD (at today's exchange rate), 60W costs me roughly $95/year. The decision most certainly becomes less a matter of the electricity bill and more one of the initial cost of components.
EDIT: By the way, my 60W bulb burning 24/7 costs me $36.76US annually -- and that's actually running at 60W. At 10%, that goes down to $3.68.
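The annual-cost arithmetic above is simple enough to sketch; the rate here is a hypothetical $0.07/kWh chosen to land near the figure quoted (your local rate will differ):

```python
# Sketch of the annual electricity cost calculation above.
# The $0.07/kWh rate is an assumption for illustration only.
HOURS_PER_YEAR = 24 * 365  # 8760 hours

def annual_cost(watts, rate_per_kwh):
    """Annual cost for a device drawing `watts` continuously."""
    kwh_per_year = watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * rate_per_kwh

print(round(annual_cost(60, 0.07), 2))   # 60 W, full year
print(round(annual_cost(6, 0.07), 2))    # same device at 10% draw
```

Scaling linearly with wattage is why the 10% figure is just the full-power figure divided by ten.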
Besides, the AMD FX-8320 costs €137.90 in Europe while the A10-6700 only costs €119, so power saving is even cheaper up front.
Thanks for the info on the board!
I don't "need" a graphics core on a server - it's a matter of calculation. The A10-6700 has a 65W TDP including the graphics core (which I can actually use for rendering as well, via OpenCL). If I took a CPU without integrated graphics, the display chipset would sit on the motherboard instead, so it would be "extra" power consumption in the overall calculation.
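To make the "initial cost vs. electricity bill" trade-off concrete, here is a hedged sketch of total cost of ownership over a few years. It treats TDP as a rough proxy for average draw (a simplification; real draw varies with load), and the €0.25/kWh rate and 3-year horizon are assumptions for illustration:

```python
# Hedged total-cost-of-ownership sketch: purchase price plus energy.
# TDP is used as a stand-in for average draw, which overstates real
# consumption for both chips; the electricity rate is hypothetical.
HOURS_PER_YEAR = 24 * 365

def total_cost(price_eur, tdp_watts, years, rate_per_kwh):
    energy_eur = tdp_watts / 1000 * HOURS_PER_YEAR * years * rate_per_kwh
    return price_eur + energy_eur

# FX-8320: EUR 137.90, 125 W TDP; A10-6700: EUR 119, 65 W TDP.
fx  = total_cost(137.90, 125, years=3, rate_per_kwh=0.25)
a10 = total_cost(119.00,  65, years=3, rate_per_kwh=0.25)
print(round(fx - a10, 2))  # EUR saved by the A10 under these assumptions
```

Under these (assumed) numbers the lower-TDP chip wins on both purchase price and running cost, which is the point being made above.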
Anyways, I'm not nailed down to AMD... While researching on the web, I found some Intel chips which are supposed to have only a 35W TDP? Core i3...? What's the deal with them?