Within the next seven years, it is predicted that the data centre industry will overtake the airline industry in terms of total carbon footprint. Now more than ever, users of data centres need to ask themselves: "is our data centre green enough?"

Aside from the most energy-efficient data centres (those with a power usage effectiveness, or PUE, of less than 1.2), the answer is 'no' – there is always room for improvement, and nowhere is this more apparent than in data centre cooling. Current estimates suggest that about 1% of the energy consumed by the entire world, and 5% of Europe's annual energy bill, is spent just on cooling computers.

PUE is calculated by dividing the total amount of power a data centre consumes by the power used by the IT equipment alone. A PUE of around 2 is considered 'average', while 1.5 is deemed 'efficient', and anything below that is difficult to achieve. Only a few facilities currently operate at 1.1–1.4, by using very efficient cooling methods.
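The calculation above can be sketched in a few lines (the example figures are illustrative, not taken from any particular facility):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.

    A perfect facility (all power reaching IT equipment) would score 1.0;
    every watt spent on cooling, lighting and power conversion pushes it up.
    """
    return total_facility_kw / it_equipment_kw

# A hypothetical facility drawing 1,500 kW overall, 1,000 kW of it at the IT load:
print(pue(1500, 1000))  # 1.5 -- 'efficient' by the benchmarks above
```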

In legacy data centres (usually those older than five years), cooling can often account for nearly 40% of the power consumed. As such, until hardware and central processing unit (CPU) manufacturers catch up with the growing demand for increasingly efficient servers and storage area networks (SANs), cooling is the best thing for data centres to focus on if they want to improve their green credentials.

A Brief History of Data Centre Air Conditioning

Back in the late 90s, when electricity was (relatively) cheap, racks and equipment were installed on an ad-hoc basis and computer room air conditioning units (CRACs) were run at maximum power 24/7. Since then there has been a steady progression of 'best practice' design models. In broad terms there have been three main areas for improvement:

  • Power delivery (i.e. uninterruptible power supply (UPS) efficiency, power factor correction, etc.)
  • IT resource efficiency (i.e. virtualisation)
  • Heating, ventilation and air conditioning (HVAC).

Of these three, HVAC has had the greatest potential for reform, and it is driving the PUE of the most energy-efficient data centres down to impressive levels of 1.1–1.2.

Blowing Hot and Cold

One of the earliest design changes aimed at improving airflow efficiency was the introduction of hot and cold aisles. Rather than trying to bring the overall 'ambient' air temperature of the whole data centre down to 19°C, racks and equipment are arranged so that cold air from the sub-floor plenum is funnelled towards the front of the servers and SANs (the 'cold' aisle), while the hot exhaust air is concentrated into a specific area for the CRACs to handle (the 'hot' aisle).

In more recent years there has been a trend towards further segregation of hot and cold air through the introduction of cold aisle corridors. These corridors physically separate the hot and cold air and can make a significant difference to the overall efficiency of the cooling system.

Evaporative Cooling

Today, however, there is a better option, both financially and in terms of carbon footprint. It's called adiabatic cooling, delivered by units sometimes referred to as computer room evaporative coolers (CRECs). It works on the principle that evaporating water draws heat away from its surroundings.

So, just as people sweat when hot to cool down, this technology draws warm ambient air through a wetted filter, which in turn causes some of the water to evaporate and cools the ambient air. Having proven successful in the US where Facebook and Google have launched similar systems, CREC units are now available in various formats in the UK.

What Else?

Other eco-friendly technologies that data centres can implement include:

  • Installation of a power factor correction (or PFC) unit on the mains power supply
  • High intensity LED tube lighting
  • Blanking plates and floor grommets in racks
  • PIR-activated lighting in halls and walkways
  • Grey water recycling
  • Usage of power-efficient servers and virtualisation
  • Good old-fashioned recycling of cardboard packaging, plastic and paper.

So, is your data centre green enough?