Ten years ago, power usage at data centers was growing at an unsustainable rate, soaring 24% from 2005 to 2010. But a shift to virtualization, cloud computing, and improved data center management is now keeping energy demand in check.
According to a new study, data center energy use is expected to increase
just 4% from 2014 to 2020, despite growing demand for computing
resources.
Total data center electricity usage in the U.S., which includes powering servers, storage, networking and the supporting infrastructure, came to 70 billion kWh (kilowatt-hours) in 2014, representing 1.8% of total U.S. electricity consumption.
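Taken together, those two figures imply a total U.S. electricity consumption on the order of 3,900 billion kWh in 2014. A back-of-envelope check in Python (the variable names are ours, not the report's):

data_center_usage_bkwh = 70.0   # 2014 U.S. data center electricity use, billion kWh
share_of_us_total = 0.018       # 1.8% of total U.S. consumption

# Dividing usage by its share gives the implied national total.
implied_us_total_bkwh = data_center_usage_bkwh / share_of_us_total
print(f"implied total U.S. consumption: {implied_us_total_bkwh:,.0f} billion kWh")  # ~3,889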
Based on current trends, data centers are expected to consume approximately 73 billion kWh in 2020, meaning consumption will stay nearly flat over the next four years. "Growth in data center energy consumption has slowed drastically since the previous decade," according to a study
drastically since the previous decade," according to a study
by the U.S. Department of Energy's Lawrence Berkeley National
Laboratory. "However, demand for computations and the amount of
productivity performed by data centers continues to rise at substantial
rates."
Improved efficiency is most evident in the growth rate of physical servers. From 2000 to 2005, server shipments increased 15% each year, resulting in a near doubling of the number of servers in data centers. From 2005 to 2010, annual shipment growth fell to 5%, though some of that decline was due to the recession. The annual server growth rate is now about 3%, a pace that is expected to continue through 2020.
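The "near doubling" follows from simple compounding: 15% annual growth sustained for five years roughly doubles the installed base, while 3% annual growth over a similar span adds only about 16%. A quick illustrative check in Python (the function and labels are ours):

# Multiply out an annual growth rate over a number of years.
def compound(rate, years=5):
    return (1 + rate) ** years

print(f"15% per year over 5 years: x{compound(0.15):.2f}")  # ~2.01, a near doubling
print(f" 5% per year over 5 years: x{compound(0.05):.2f}")  # ~1.28
print(f" 3% per year over 5 years: x{compound(0.03):.2f}")  # ~1.16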
The reduced server growth rate is the result of more efficient servers, better utilization thanks to virtualization, and a shift to cloud computing, including the concentration of workloads in so-called "hyperscale" data centers, defined as facilities of 400,000 square feet or more.