WHILST THE LATEST generation of concurrent free-cooling chillers is capable of delivering significant energy savings, Airedale’s internal analysis of data centre cooling techniques indicates that air handling units (AHUs) with adiabatic cooling are among the most efficient methods. However, where they are destined for environments requiring very precise control of air quality, temperature, humidity and pressure, AHUs have in the past presented data centre owners and operators with a challenge. A further issue is the physical size required to generate sufficient air volume for cooling, which can present logistical complexity, especially in inner-city locations where external space is limited.
In response to increased demand for an ultra-high efficiency solution which addresses the challenges of AHU cooling in data centres, British cooling specialist Airedale International has developed the AireFlow™ indirect adiabatic AHU.
In low ambient temperatures the AireFlow™ can deliver 100% free-cooling under ASHRAE conditions (London, UK; 26°C supply/35°C return) by using the outside ambient air and modulating the exhaust fans. Using wetted media instead of an evaporative spray, the cooling system suppresses the temperature of the ambient air directed onto the heat exchanger. The lower temperature drives heat exchange, allowing the exhaust fans to run more slowly and reducing exhaust airflow by up to 32% for the same level of free-cooling. Under higher ambient temperature conditions, the adiabatic system is activated to maintain free-cooling.
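The principle behind the wetted media stage can be sketched with a simple psychrometric approximation: evaporation can only cool the incoming air part of the way towards its wet-bulb temperature, the fraction achieved being described by a saturation effectiveness. The function, effectiveness figure and example temperatures below are illustrative assumptions, not AireFlow™ specifications.

```python
# Illustrative sketch of how wetted media suppresses the air temperature
# reaching the heat exchanger. The effectiveness figure and temperatures
# below are assumptions for illustration, not AireFlow(TM) data.

def adiabatic_outlet_temp(dry_bulb_c, wet_bulb_c, saturation_effectiveness=0.9):
    """Approximate air temperature leaving an evaporative (wetted media) stage.

    The media can only cool the air part of the way towards its wet-bulb
    temperature; saturation_effectiveness expresses that fraction.
    """
    return dry_bulb_c - saturation_effectiveness * (dry_bulb_c - wet_bulb_c)

# Example: an assumed warm UK day at 30°C dry bulb / 20°C wet bulb.
suppressed = adiabatic_outlet_temp(30.0, 20.0)
print(f"Air onto heat exchanger: {suppressed:.1f}°C")  # approximately 21.0°C
```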
In locations and/or conditions where free-cooling cannot be achieved, optional inverter direct expansion (DX) or chilled water (CW) cooling provides top-up concurrent cooling in addition to redundancy. DX may be required under higher ambient temperatures, where the suppressed ambient temperature is unable to deliver full adiabatic free-cooling, or where a low supply air set point such as 22°C is required. The inverter DX system can also remove the need for separate water storage, saving footprint. High-efficiency EC (electronically commutated) fans, which are up to 70% more efficient at part load than AC fans, provide full N+1 redundancy. The fans, which are offset from one another to maximise efficiency, deliver airflow across the adiabatic system and heat exchangers. Because the ambient air path modulates, power input can be significantly reduced thanks to the large portion of the year spent at low fan speeds. A key design feature is the epoxy-coated aluminium air-to-air heat exchanger, which significantly increases thermal conductivity over plastic or composite alternatives.
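The benefit of spending much of the year at low fan speeds follows from the fan affinity laws, under which fan power falls roughly with the cube of speed (and therefore of airflow, at constant system resistance). The short sketch below is illustrative only; the airflow fractions are taken from the figures quoted above rather than from measured AireFlow™ data.

```python
# Illustrative sketch of why slower fan speeds cut power so sharply.
# By the fan affinity laws, fan power scales roughly with the cube of
# speed (and hence of airflow, at constant system resistance).

def relative_fan_power(airflow_fraction):
    """Approximate fan power, relative to full speed, for a given airflow fraction."""
    return airflow_fraction ** 3

# The article cites up to a 32% reduction in exhaust airflow for the same
# level of free-cooling, i.e. roughly 68% of full airflow.
for flow in (1.00, 0.68, 0.50):
    print(f"{flow:.0%} airflow -> roughly {relative_fan_power(flow):.0%} of full fan power")
# At 68% airflow the fans need only around a third of full fan power.
```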
One of the main barriers to the use of AHUs in data centres is the risk of pollutants contaminating the data centre. The AireFlow’s air-to-air heat exchanger allows heat transfer to occur without any mixing of indoor and outdoor air, preventing ingress of contaminants and eliminating the need for 100% mechanical or direct expansion (DX) back-up cooling, which increases ownership costs. G3/G4/F7 air filtration and optional NO2, SOx and H2S contaminant filtration preserve critical data centre environments, helping to prevent server corrosion. An optional integral fresh air inlet (patent pending) maintains data centre air pressure and air quality, removing the need for a separate AHU.
Comparative savings
AHUs provide one of the most efficient cooling methods and offer significant savings in annualised running costs against other high-performance cooling technologies: up to 70% against typical DX precision air conditioning (PAC) systems and up to 22% against typical free-cooling chillers. Converted into a five-year Total Cost of Ownership (TCO), these figures could mean savings of between 7% and 43% compared with typical free-cooling chillers and DX PAC systems respectively (see footnote 1 below).
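To see why a large running-cost saving translates into a smaller TCO saving, it helps to include capital cost in the comparison. All monetary figures in the sketch below are hypothetical, chosen only to illustrate the arithmetic; they are not Airedale pricing, and the resulting percentage depends entirely on the assumptions made.

```python
# Illustrative sketch of how an annualised running-cost saving becomes a
# smaller five-year TCO saving once capital cost is included. All figures
# here are hypothetical assumptions, not Airedale pricing.

def five_year_tco(capex, annual_running_cost, years=5):
    """Simple TCO model: capital cost plus running costs over the period."""
    return capex + years * annual_running_cost

# Hypothetical baseline: a DX PAC system.
baseline_tco = five_year_tco(capex=150_000, annual_running_cost=60_000)

# Hypothetical AHU: assume a 70% lower annual running cost (the article's
# upper figure against typical DX PAC) and a higher capital cost.
ahu_tco = five_year_tco(capex=200_000, annual_running_cost=60_000 * 0.30)

saving = 1 - ahu_tco / baseline_tco
print(f"Five-year TCO saving: {saving:.0%}")  # result depends entirely on the assumptions
```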
Data centre owners could expect to achieve a partial power usage effectiveness (pPUE; see footnote 2) of 1.035 from the 100kW AireFlow™ unit.
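Using the definition in footnote 2 (pPUE is the total energy inside a boundary divided by the IT equipment energy inside it), a pPUE of 1.035 for a 100kW IT load corresponds to roughly 3.5kW of cooling power draw. A minimal sketch of that arithmetic:

```python
# Minimal sketch of what a pPUE of 1.035 implies for a 100kW unit, using the
# definition in footnote 2: pPUE = total energy inside the boundary / IT energy.

def ppue(it_power_kw, cooling_power_kw):
    """Partial PUE for a boundary containing the IT load and its cooling."""
    return (it_power_kw + cooling_power_kw) / it_power_kw

# For a 100kW IT load, a pPUE of 1.035 corresponds to roughly 3.5kW of
# average cooling power draw.
print(ppue(100.0, 3.5))  # -> 1.035
```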
As part of the cost/benefit analysis, other factors that need to be taken into account include water consumption. The wetted media design used in the AireFlow™ ensures that less water is used without compromising saturation levels; consumption can be as low as 250kg/hr, compared with 400kg/hr for a spray system. Through the controls strategy, data centre operators can adjust the activation temperature to prioritise either energy or water consumption.
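The quoted consumption rates translate into annual water use only once the hours of adiabatic operation are known, and these vary with climate and set point. The sketch below uses an assumed figure for those hours purely to illustrate the comparison.

```python
# Illustrative comparison of annual water use for wetted media vs spray,
# based on the consumption rates quoted above. The number of hours of
# adiabatic operation per year is an assumption for illustration only.

WETTED_MEDIA_KG_PER_HR = 250
SPRAY_KG_PER_HR = 400
ASSUMED_ADIABATIC_HOURS_PER_YEAR = 500  # hypothetical, climate-dependent

for name, rate in (("wetted media", WETTED_MEDIA_KG_PER_HR),
                   ("spray", SPRAY_KG_PER_HR)):
    tonnes = rate * ASSUMED_ADIABATIC_HOURS_PER_YEAR / 1000
    print(f"{name}: ~{tonnes:.0f} tonnes of water per year")
# With these assumptions: wetted media ~125 t/yr vs spray ~200 t/yr.
```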
Evaporative spray systems also require large water storage vessels, adding to footprint (typically 9.6m³ vs 6m³). The need for water treatment also brings high capital and operational costs, as salt is required for water softening. A further benefit of a wetted media system is that the heat exchanger remains dry and does not need regular descaling.
Because water is contained within the media and drip tray, the potential for corrosion is also reduced; with a spray system, water coats the internal surfaces of the adiabatic section, increasing the opportunity for corrosion.
The UV sterilisation system of the wetted media also reduces maintenance effort and downtime, in contrast with spray systems, where manual decommissioning is required for regular maintenance, cleaning, descaling and nozzle replacement, and where any water treatment or softener has to be periodically recharged with salts. The AireFlow™, by contrast, automatically drains and self-cleans.
A video of the AireFlow™ is available here:
www.airedale.com/aireflow
References
1. For simplicity, these comparisons are made against Airedale’s own systems, which typically rank amongst the highest performers in the marketplace. They are based on a fixed load of 500kW and London, UK ambient conditions.
2. Developed by The Green Grid, partial PUE (pPUE™) allows a data centre manager to focus on the energy efficiency of a particular portion of the data centre or mixed-use facility. This may be needed because some measurements of the total data centre energy are unobtainable (because of a leasing arrangement, for example). Partial PUE is the total energy inside a boundary divided by the IT equipment energy inside the boundary. Source: White Paper #49, PUE™: A Comprehensive Examination of the Metric, The Green Grid, 2012.