A good article on an often underestimated factor in server-hosted desktop virtualization business cases: the electrical costs of power consumption and cooling.
When putting together a business case, costs and revenues are an essential part of it. If you want to calculate the direct resource costs associated with hosting a server in your data center, you need to know both the server's direct power consumption in electrical costs and the costs associated with cooling the environment where the server is located. To do so you will need a few parameters for the device(s) used: the Watts, the BTU/h, and the electricity cost per kWh.
The British thermal unit (BTU) is used as a unit for the cooling capacity of an air conditioning system and refers to the amount of thermal energy removed from an area. One BTU is approximately 0.293 watt-hours, so 1000 BTU/h is approximately 293 W. The kilowatt hour (kWh) is most commonly known as the billing unit for energy delivered to consumers by electric utilities.
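The BTU/h-to-watts relationship above can be sketched as a pair of helper functions. The constant 0.29307107 W per BTU/h is the standard conversion factor; the function names are my own:

```python
# 1 BTU/h ≈ 0.29307107 W (standard conversion factor).
BTU_PER_HOUR_TO_WATTS = 0.29307107

def btu_per_hour_to_watts(btu_h):
    """Convert a heat/cooling load in BTU/h to watts."""
    return btu_h * BTU_PER_HOUR_TO_WATTS

def watts_to_btu_per_hour(watts):
    """Convert a power figure in watts to BTU/h."""
    return watts / BTU_PER_HOUR_TO_WATTS

print(round(btu_per_hour_to_watts(1000)))  # → 293, matching the text
```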
Let’s take as an example an HP DL380 Generation 6 with two quad-core CPUs, 24 GB of memory, eight network ports, two 72 GB 15K SAS hard disks, and two 460-watt power supplies. This server draws about 307 watts and generates 1047 BTU/h.
These values are taken from the HP Power Advisor tool, and most technical spec documents also list the Watt and BTU/h values. Sometimes the specs list Amps instead of Watts. You can then use the following calculation to get the Watts: Watts = Amps × Volts. Here in the Netherlands, for a 2.4 A device on the 230 V grid, that gives 2.4 A × 230 V = 552 watts.
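That Amps-to-Watts calculation, as a minimal sketch (the function name is my own):

```python
def amps_to_watts(amps, volts):
    """Power draw from current and voltage: Watts = Amps x Volts."""
    return amps * volts

# The example from the text: a 2.4 A device on a 230 V grid.
print(amps_to_watts(2.4, 230))  # → 552.0
```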
Now, how do we calculate the cost of running this server 24×7 for one year? The cost can be split into two parts: the energy consumed by running the server itself, and the power needed to remove the heat the server generates.
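The two-part yearly calculation can be sketched as follows. The electricity price of €0.22/kWh is a placeholder, and the cooling model is a simplification I've assumed: the cooling system's electrical draw is taken as the heat load divided by a coefficient of performance (COP), here 3.0, which is not stated in the original text:

```python
HOURS_PER_YEAR = 24 * 365  # 8760 hours for 24x7 operation

def yearly_energy_cost(server_watts, btu_per_hour, price_per_kwh, cooling_cop=3.0):
    """Estimate the yearly electricity cost of running and cooling one server.

    cooling_cop is an assumed coefficient of performance: the cooling
    system is modeled as drawing (heat load / COP) watts of electricity.
    """
    heat_watts = btu_per_hour * 0.29307107        # heat to remove, in W
    cooling_watts = heat_watts / cooling_cop      # assumed electrical draw of cooling
    total_kwh = (server_watts + cooling_watts) * HOURS_PER_YEAR / 1000
    return total_kwh * price_per_kwh

# DL380 G6 figures from the text: 307 W draw, 1047 BTU/h heat output.
print(round(yearly_energy_cost(307, 1047, 0.22), 2))
```

With these placeholder inputs the server's own consumption (307 W × 8760 h ≈ 2689 kWh) dominates, with cooling adding roughly a third on top under the assumed COP.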