The power usage effectiveness ratio (PUE) is a measure of a data center's energy efficiency. PUE is calculated by dividing a facility's total energy use by the energy consumed specifically for information technology activities. The theoretical ideal PUE is 1, meaning 100% of electricity consumption goes toward useful computation. Power transformers, uninterruptible power supplies, lighting, and especially cooling all consume power and raise a data center's PUE.
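The calculation can be sketched in a few lines of Python (the function name and the sample energy figures are illustrative, not from any study cited here):

```python
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT energy."""
    if it_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_kwh / it_kwh

# Hypothetical facility: 1,500 kWh total draw, of which IT gear consumes 1,000 kWh.
print(pue(1500, 1000))  # 1.5
```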
Along with the total server count, the power demand of each server has also changed. A 2016 Lawrence Berkeley National Laboratory study shows PUE for facilities at various scales, from a server sitting in a room or a closet up to a hyper-scale data center.
The smaller the server space, the higher its PUE and the lower its efficiency. For the smallest server spaces, the PUE is above 2, meaning that more than half of their energy use goes to things other than computing. For hyper-scale facilities, the PUE is 1.2, meaning that most of the energy goes to computation.
A graphic of power consumption by use shows that the smallest server spaces used more power for cooling than for computation. At hyper-scale data centers, by contrast, more than 80% of power consumption went to IT (servers, networking, and storage), and only 13% went to cooling.
With cloud computation expanding largely in hyper-scale data centers, today's average PUE is thought to have improved. A recent Uptime Institute survey of 1,600 data center owners and operators put 2019's average PUE at 1.67. A PUE of 1.67 means that 60% of data center electricity consumption goes to IT, with the rest going to cooling, lighting, and so on.
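The 60% figure follows directly from the definition: the share of electricity reaching IT equipment is simply the reciprocal of the PUE. A minimal sketch (the function name is illustrative):

```python
def it_share(pue: float) -> float:
    """Fraction of total facility energy that reaches IT equipment (1 / PUE)."""
    return 1.0 / pue

print(round(it_share(1.67) * 100))  # 60 (percent), matching the survey average
print(round(it_share(1.2) * 100))   # 83, consistent with hyper-scale facilities
```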
Google, on the other hand, says its data centers have a PUE of 1.1, with some centers going as low as 1.06. One way to lower a data center's cooling demand is to cool only to the temperature at which the machines are comfortable, not to where humans are most comfortable; for Google, that's 80 degrees Fahrenheit. The company set its DeepMind machine learning platform on the problem of data center energy efficiency three years ago, and last year it effectively turned control over to the artificial intelligence.
Expect more of that sort of approach from Amazon Web Services, Microsoft, IBM, and other major cloud computing firms. While data center electricity demand is growing, many of these major consumers of electricity are also contracting for wind and solar power to meet it. Another result of the growing demand: with many data centers clustering in locations such as Northern Virginia, data center loads are becoming a meaningful share of utility peak demand in some service territories.