Power consumption of server farms
Description:
A server farm or server cluster, also called a data center, is a collection of computer servers usually maintained by an enterprise to accomplish server needs far beyond the capability of one machine. Server farms often have backup servers, which can take over the function of primary servers in the event of a primary server failure. One of the main concerns is business continuity; companies rely on their information systems to run their operations.
Due to improvements in fiber-optic technology, increased access to high-quality networks, and the fact that server farms can be run with few personnel (approximately 100 technicians can serve thousands of servers), server farms can be placed virtually anywhere.
Servers often do not operate at full capacity; some run as low as 4% of their full potential. Virtualization allows a physical server to be divided into multiple virtual machines, thus increasing operating efficiency.
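The efficiency gain from virtualization can be illustrated with a back-of-the-envelope consolidation calculation. The sketch below assumes a hypothetical fleet of 100 physical servers running at the 4% utilization cited above and an assumed 70% target utilization per virtualized host; these figures are illustrative and not drawn from the source.

```python
import math

# Hypothetical consolidation example: many lightly loaded physical servers
# are replaced by fewer, better-utilized virtualized hosts.
servers = 100          # physical servers (assumed example figure)
utilization = 0.04     # 4% average utilization, as cited above
target = 0.70          # assumed safe utilization target per virtualized host

aggregate_load = servers * utilization        # total work in server-equivalents
hosts_needed = math.ceil(aggregate_load / target)

print(f"Aggregate load: {aggregate_load:.1f} server-equivalents")
print(f"Hosts needed at {target:.0%} target utilization: {hosts_needed}")
print(f"Physical machines that could be retired: {servers - hosts_needed}")
```

Under these assumptions, the same workload fits on a handful of hosts, which is the mechanism by which virtualization reduces the energy drawn per unit of useful work.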
Server farms can be massive, housed in buildings the size of football fields and drawing up to 100 megawatts of power, equivalent to roughly 1,400 barrels of oil a day. However, there is considerable debate about the energy consumption of servers. Research by Jon Koomey in 2001 showed that servers accounted for approximately 3% of total US electricity use, much lower than the 13% figure widely quoted in the media.
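The "roughly 1,400 barrels of oil a day" figure can be checked with a short conversion. The sketch below assumes one barrel of oil equivalent holds about 1.7 MWh of energy, a commonly used approximation that is not stated in the source.

```python
# Back-of-the-envelope check of "100 MW ~= 1,400 barrels of oil a day".
power_mw = 100            # assumed continuous draw of the server farm
hours_per_day = 24
mwh_per_barrel = 1.7      # assumed energy content of one barrel of oil equivalent

daily_energy_mwh = power_mw * hours_per_day           # 2,400 MWh per day
barrels_per_day = daily_energy_mwh / mwh_per_barrel   # ~1,400 barrels

print(f"Daily energy use: {daily_energy_mwh:,.0f} MWh")
print(f"Equivalent barrels of oil: {barrels_per_day:,.0f}")
```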
In recent years, legislators in the Western world have shown growing concern about the energy consumption of server farms.
Carbon emissions from server farms worldwide are expected to increase from 80 megatonnes of CO2 in 2007 to 340 megatonnes of CO2 in 2020. In comparison, the Netherlands emitted 146 megatonnes of CO2 in 2007.
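Taken at face value, the 80-to-340 megatonne projection implies roughly 12% compound annual growth, as the short calculation below shows; the growth rate itself is derived here and is not stated in the source.

```python
# Implied compound annual growth rate behind the 80 -> 340 megatonne projection.
emissions_2007 = 80.0    # Mt CO2, from the figures above
emissions_2020 = 340.0   # Mt CO2, projected
years = 2020 - 2007

cagr = (emissions_2020 / emissions_2007) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")   # roughly 11.8% per year
```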
Enablers:
- Availability of internet globally
- Availability of energy
- Increase in (digital) literacy
- Demand for data 24/7
- Increase in web-based services
Inhibitors:
- Low-energy digital technology
- (Local) legislation
- Increased server usage efficiency through virtualization