Hyperscale Data Centers
Having spent a large part of my career in the semiconductor business, I clearly remember the 2005 era, when the ever-increasing push for more computing performance, measured in clock speed (GHz), took a "right-hand turn" and the industry began focusing on performance per watt instead. This shift was driven by new, less power-hungry architectures, the emergence of the mobile ecosystem and the limitations imposed by battery life. But even then, 11 years ago, there was concern that global computing would become an immense power hog and could accelerate global warming.
The doomsday scenario of the time was predictable: Data Centers in the US consumed about 61 billion kilowatt-hours of energy in 2006, about 1.5% of total US consumption. The government itself was a major culprit, accounting for about 10% of that electricity. The EPA (Environmental Protection Agency) projected that by 2011, even accounting for the efficiencies technology could bring, the US Data Center industry would consume more than 100 billion kWh.
Fast forward to today, and the four horsemen not only have not arrived, they have probably stopped somewhere for a cup of coffee. A new report by the Department of Energy concludes that the US Data Center industry consumed about 70 billion kWh in 2014, up only 4% from its last measurement in 2010.
So what exactly happened? We have a boom in white-floor construction today (see Ashburn, VA), and there is more computing power available than the most optimistic report dreamed possible in 2007. How is it possible that energy consumption grew a meager 14.7% over the past eight years? According to the DOE, there is one major cause: Hyperscale. That's right: Amazon, Google, Facebook and Microsoft, along with their suppliers such as Intel, Dell and VMware, have done more for the environment than meets the eye.
Hyperscale Data Centers are more efficient in every respect from an energy-consumption standpoint (and from a computing standpoint as well, but that we have all grown accustomed to by now). First, servers at Hyperscale facilities have an average active utilization more than 4x higher than those sitting in server rooms across corporate America. Servers sitting idle consume energy without producing useful work, and the energy-utilization curve improves markedly the higher the average load. This is a major gain. Shared cloud infrastructure, which uses installed capacity more efficiently, also contributes to a smaller utility bill.

Another interesting point is that servers in 2007 and in 2014 have a similar power envelope, yet their computing power has increased many fold. This is Moore's Law at its best. In storage, the adoption of SSDs (Solid State Drives, based on non-volatile memory) over HDDs (Hard Disk Drives) has also significantly reduced the energy draw of Data Centers, and the DOE study shows there is still plenty of headroom: in 2014, only 8% of total storage used SSDs, a share projected to grow to 22% by 2017. Across every segment of the DC ecosystem, including network equipment, storage and computing, there was a significant improvement in performance per watt.
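The utilization effect can be made concrete with a little arithmetic. The sketch below is a toy model; the wattage figures and utilization levels are illustrative assumptions, not numbers from the DOE report. It assumes a server draws a large fraction of its peak power even when idle, so the energy spent per unit of useful work drops sharply as average utilization rises:

```python
def energy_per_unit_work(utilization, idle_watts=100.0, peak_watts=200.0):
    """Average watts consumed per unit of delivered compute.

    Assumes power scales linearly between idle and peak draw,
    while useful work is proportional to utilization.
    (Illustrative assumption, not a measured server profile.)
    """
    power = idle_watts + (peak_watts - idle_watts) * utilization
    return power / utilization

corporate = energy_per_unit_work(0.10)   # hypothetical enterprise server room
hyperscale = energy_per_unit_work(0.40)  # ~4x higher average utilization
print(f"enterprise server room: {corporate:.0f} W per unit of work")
print(f"hyperscale facility:    {hyperscale:.0f} W per unit of work")
print(f"same work for roughly {corporate / hyperscale:.1f}x less energy")
```

Under these assumed numbers, quadrupling utilization cuts the energy cost per unit of work by about a factor of three, which is why consolidation into shared cloud infrastructure pays off even before any hardware improvement.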
But that does not account for all the savings. Hyperscale DC designers, and the DC industry as a whole, are constantly trying to improve their PUE (Power Usage Effectiveness, the ratio of total facility energy to IT equipment energy). According to the same study, the average PUE of a closet or server-room facility is between 2 and 2.5. In layman's terms, this means that for every watt of server power used, an additional watt to watt and a half is needed to cool the space. Hyperscale DCs have, on average, a PUE of 1.2. Google just announced that it used AI from DeepMind, the company behind the Go-playing AlphaGo, to lower its Data Center cooling bill by 15%.
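Because PUE is a simple ratio of total facility energy to IT energy, the overhead difference between those two figures is easy to see. The sketch below uses the server-room and hyperscale PUE values quoted above; the 1 MW IT load is a hypothetical number chosen for illustration:

```python
def facility_energy(it_kw, pue):
    """Total facility draw in kW for a given IT load and PUE.

    PUE = total facility energy / IT equipment energy,
    so total = IT load * PUE and overhead = IT load * (PUE - 1).
    """
    return it_kw * pue

it_load = 1000.0  # hypothetical 1 MW of server load

server_room = facility_energy(it_load, 2.5)  # average closet/server room
hyperscale = facility_energy(it_load, 1.2)   # average hyperscale facility

print(f"server room total: {server_room:.0f} kW "
      f"({server_room - it_load:.0f} kW of cooling/overhead)")
print(f"hyperscale total:  {hyperscale:.0f} kW "
      f"({hyperscale - it_load:.0f} kW of cooling/overhead)")
```

For the same 1 MW of servers, the overhead drops from 1,500 kW to 200 kW, so a hyperscale facility spends less than a seventh of the non-IT energy that a typical server room does.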
At ODATA, we are highly motivated to serve this industry, whether by providing the best possible design for those building Hyperscale capacity, or by offering corporations that today do all their computing in-house facilities similar to those of the large cloud providers, so they can benefit from the same gains in efficiency and power savings. Our new building in Santana de Parnaiba has a design PUE of 1.47, very low for the tropical conditions of Brazil. To reach this number, we will use newer technology, including more efficient air-cooling equipment that draws on outside air when temperatures are low, saving energy and reducing operational costs for our clients, among other solutions.
In conclusion, what we saw over the past 9 years was technology doing what it does best: innovating at breakneck pace and solving real-world problems. One striking comparison: if over the past thirty years or so the automobile industry had improved combustion-engine efficiency as dramatically as computing efficiency has improved in some respects, you would now be able to drive your car coast to coast (approximately 3,000 miles) on about four milliliters of gasoline.