Power consumption of server farms
Description:
A server farm or server cluster, also called a data center, is a collection of computer servers usually maintained by an enterprise to accomplish server needs far beyond the capability of one machine. Server farms often have backup servers, which can take over the function of primary servers in the event of a primary server failure. One of the main concerns is business continuity; companies rely on their information systems to run their operations.
Due to improvements in fiber-optic technology, increased access to high-quality networks, and the fact that server farms can be run with few personnel (approximately 100 technicians can serve thousands of servers), server farms can be located virtually anywhere.
Servers often do not operate at full capacity; some run at as little as 4% of their full potential. Virtualization allows a physical server to be divided into multiple virtual machines, thus increasing operating efficiency.
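As a rough illustration of the consolidation argument (a minimal back-of-the-envelope sketch: the 4% utilization figure is taken from the text above, while the 70% target utilization for a consolidated host is an assumption chosen purely for illustration):

    # Back-of-the-envelope estimate of how many lightly loaded servers
    # could be consolidated onto a single virtualized host.
    average_utilization = 0.04      # 4% of capacity, per the figure quoted above
    target_host_utilization = 0.70  # assumed safe ceiling for the consolidated host

    vms_per_host = int(target_host_utilization / average_utilization)
    print(f"Roughly {vms_per_host} such workloads could share one physical host")
    # Prints: Roughly 17 such workloads could share one physical host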
Server farms can be massive, housed in buildings the size of football fields and drawing up to 100 megawatts of power, roughly the equivalent of 1,400 barrels of oil a day. However, there is considerable debate about the energy consumption of servers. Research by Jonathan Koomey in 2001 showed that servers account for approximately 3% of total US electricity use, far lower than the 13% figure widely quoted in the media.
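As a rough check on the oil-equivalent figure (assuming a continuous 100 MW draw and the commonly used conversion of roughly 1.7 MWh per barrel of oil equivalent): 100 MW × 24 h ≈ 2,400 MWh per day, and 2,400 MWh ÷ 1.7 MWh per barrel ≈ 1,400 barrels of oil per day.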
In recent years, legislators in the Western world have shown growing concern about the energy consumption of server farms.
Carbon emissions from server farms worldwide are expected to increase from 80 megatonnes of CO2 in 2007 to 340 megatonnes of CO2 in 2020. By comparison, the Netherlands emitted 146 megatonnes of CO2 in 2007.
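To put this projection in perspective (a simple compound-growth calculation on the figures above): growing from 80 to 340 megatonnes between 2007 and 2020 implies an average increase of (340/80)^(1/13) − 1 ≈ 12% per year.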
Enablers:
- Global availability of the Internet
- Availability of energy
- Increase in (digital) literacy
- Demand for data 24/7
- Increase in web-based services
Inhibitors:
- Low-energy digital technology
- (Local) legislation
- Increased server usage efficiency through virtualization
Paradigms:
Growing Internet usage will lead to an increased need for server farms, which in turn will lead to increased energy consumption. Low-energy technology will dampen the growth rate of energy demand, but it will not reduce demand itself.
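A minimal sketch of this paradigm, using purely illustrative numbers (the 15% annual growth in demand and the 5% annual efficiency gain below are assumptions, not figures from this entry):

    # Toy model: total energy use grows with demand for server capacity,
    # partly offset by efficiency gains. When demand grows faster than
    # efficiency improves, consumption still rises, only more slowly.
    demand_growth = 0.15     # assumed annual growth in demand for server capacity
    efficiency_gain = 0.05   # assumed annual improvement in energy efficiency

    energy = 100.0           # energy-use index, year 0 = 100
    for year in range(1, 11):
        energy *= (1 + demand_growth) * (1 - efficiency_gain)
        print(year, round(energy, 1))
    # Net factor per year: 1.15 * 0.95 = 1.0925, i.e. roughly 9% annual growth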
Timeline:
- In the early days of the computing industry (the 1960s and 1970s), dedicated computer rooms came into use.
- During the boom of the microcomputer industry in the 1980s, computers started to be deployed everywhere, no longer confined to special computer rooms.
- In the early 1990s, microcomputers (now called “servers”) started to find their place in the old computer rooms. The availability of inexpensive networking equipment, coupled with new standards for network cabling, made it possible to use a hierarchical design that put the servers in a dedicated room inside the company. The terms “data centre” and “server farm” started to gain popular recognition around this time.
- A few years later, as demand for fast Internet connectivity grew and it became too costly for smaller companies to install the equipment to provide it themselves, very large facilities called Internet data centres (IDCs) started appearing. These are what are now commonly known as data centres or server farms.
- In the new millennium, server farms have started to move from traditional locations (such as Silicon Valley) to cheaper regions, although most are still situated in Western countries.
- Explosive growth in Internet usage has fuelled an equally explosive growth in the number of server farms.
Web resources:
- http://www.govtech.com/pcio/102970
- http://www.lbl.gov/Science-Articles/Archive/data-center-energy-myth.html
- http://en.wikipedia.org/wiki/Server_farm
- http://www.redorbit.com/news/technology/1374175/server_farms_becoming_a_cash_crop_in_the_midwest/
- http://www.computerweekly.com/Articles/2008/12/05/233748/how-to-cut-data-centre-carbon-emissions.htm