Free cooling chillers save £50K per month

The challenge for today’s data centres is to reduce operating costs while satisfying environmental concerns, so it is not surprising that maximising power reliability and minimising energy costs and emissions are core drivers. In the financial services sector, avoiding downtime is paramount, as the Financial Conduct Authority (FCA) takes a tough stance on IT failures. Building on its successes in delivering cooling solutions for the financial services sector, Aermec has recently completed a chiller replacement using robust and proven technologies, delivering improved cooling reliability, energy savings and reduced emissions for the global data centre of one of the leading multinational financial powerhouses in the UK.

Background

Data centres generate significant amounts of heat. Cooling is a critical part of the infrastructure and when it comes to replacing equipment in a live mission critical site, there are a number of regulatory requirements and logistical challenges to overcome. Customer expectations also had to be managed.
When the data centre for one of the country’s top banking and financial companies decided to replace all its chillers, the key requirements included resilience, energy efficiency, reduced emissions and a lower PUE (Power Usage Effectiveness, the ratio of total facility energy to IT equipment energy). In line with its own environmental and sustainability policies, the company sought minimal environmental impact. Free cooling chillers offered a smarter approach, as using outside air reduces the need for mechanical cooling throughout the year, a strategy deployed by many data centres.
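As a rough illustration of what a lower PUE means in practice, consider this short sketch. The function simply expresses the metric’s definition; the energy figures are hypothetical and are not taken from the site:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy.
    A value of 1.0 would mean every kWh goes to IT equipment; lower is better."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical monthly figures, for illustration only
it_load_kwh = 1_000_000      # energy consumed by the IT equipment
facility_kwh = 1_700_000     # energy consumed by the whole facility
print(f"PUE = {pue(facility_kwh, it_load_kwh):.1f}")  # PUE = 1.7
```

Because the IT load is largely fixed by the computing demand, almost all of a PUE improvement comes from the supporting infrastructure, with cooling typically the largest contributor.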
Behind the security barriers of a virtually windowless, 30,000 sq metre building on an unobtrusive industrial estate in west London, a phenomenal one billion financial transactions are carried out every second. Redesigning the cooling and replacing the chillers required careful planning.

“This is our global hub. Cooling is critical to our infrastructure. Any disruption with the cooling could impact the UK’s financial services but the ramifications could potentially be felt globally,” says the company’s facilities manager.

Following extensive evaluation and analysis, and working in close collaboration with RED Engineering Design, the MEP design consultancy, M&E contractors Gratte Brothers and Aermec distributor AEUK Ltd, free cooling chillers designed and manufactured by Aermec were selected.
As a global business, the financial corporation’s ethos is to minimise its environmental impact and reduce carbon emissions across all its business platforms. By investing in the latest technologies, the company expected to satisfy its stringent criteria and boost its green credentials. It also wanted to be a good neighbour: noise breakout from the site was minimised by designing extra-low-noise chillers, satisfying the requirements of both the customer and the local council.
Having worked on other data centre projects for the financial giant, RED Engineering was familiar with the mission critical nature of the site.

    The major considerations centred on:

  • Build quality;
  • Ability to test;
  • Use of appropriate technology to take advantage of advances in energy efficiency and reduced carbon emissions;
  • In-built resilience;
  • Fast re-start.

Testing

Aermec’s advanced manufacturing facilities and test labs near Verona, Italy, showcased its build capabilities and the quality of its products. The on-site test facilities are the largest and most extensive in Europe and confirmed Aermec’s ability to perform rigorous testing.
The test labs have Eurovent, AHRI, MCS and cUL certifications and can simulate operating conditions ranging from -20°C to +55°C. In addition to a purpose-built data hall simulator, facilities include testing for heat pumps, chillers, air handling units, indirect evaporative coolers and dry coolers. The labs extend to 2MW cooling capacity per single unit, and specific labs within the complex cater for extreme temperature testing, ventilation and heat exchange measurements, noise level verification and vibration testing.

“Aermec’s testing capabilities were a big plus. They offered one of the most comprehensive testing facilities in Europe and enabled the chiller performance to be robustly tested to satisfy the site requirements,” comments Alex Nock, Associate at RED Engineering.

Six highly efficient 1.366MW Aermec NSM free cooling chillers with screw compressors, using R134a refrigerant, were specified to provide a robust mechanical cooling solution. The free cooling route offered a greener, more eco-responsible solution that would help boost the data centre’s green credentials.
The criticality of the site and its exacting requirements called for some modifications to the build specifications: electronic expansion valves replaced mechanical expansion valves, and the microchannel coils were replaced with copper/aluminium coils. Aermec’s standard chillers include twin-headed pumps, but these were changed to single run and standby pumps for resilience, and independent refrigeration circuits were also added to the design specifications for each chiller.

“As a manufacturer, we have the flexibility to make modifications and provide a more customised solution for our customers, who can carry out witness tests to ensure their required performance levels are met,” says Paul Lawrence, Director at Aermec UK.

Fast re-start

Power outages are every data centre’s nightmare. In the event of a power loss, bringing a chiller back online rapidly is essential for operations in mission critical data centre environments.

“Each minute of downtime could cost millions of pounds and the data can cook in 20 minutes,” explains the data centre’s facilities manager.

“Chillers can take as long as 15 or 20 minutes to re-start. Aermec’s chillers have been designed to achieve re-start in two-and-a-half minutes and can achieve a 5°C temperature differential in less than five minutes,” adds Paul Lawrence.

Noise rating

The data centre is located adjacent to a residential area. Mindful of the location, the chillers were designed to a very low noise rating at full load.

“Because of the location, acoustics were just as important as maximising energy efficiencies and satisfying sustainability criteria. Aermec’s free cooling chillers addressed the customer’s environmental concerns, reliability, performance and resilience, but they also offered the best acoustic performance,” comments Alex Nock, Associate at RED Engineering.

Each chiller took 12 weeks to design, build and factory test. Each unit measures 13 metres in length and weighs 11 tonnes. The chillers were delivered and installed in a carefully phased approach, using temporary chillers for support to ensure no disruption to the live site. Despite their size, each chiller was craned into position as a complete unit.
Three chillers are required to run the data centre, but running four is more economical and maximises the benefits of free cooling. The CHW (chilled water) flow/return temperatures are 13°C/19°C in normal mode and 17°C/23°C in high temperature mode to maximise free cooling.
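The logic behind raising the CHW temperatures can be sketched in a few lines: the warmer the return water, the more of the year the outside air is cold enough to cool it without running compressors. The staging thresholds below are illustrative assumptions, not Aermec’s actual control strategy:

```python
def cooling_mode(ambient_c: float, chw_return_c: float, approach_c: float = 3.0) -> str:
    """Illustrative staging logic for a free cooling chiller.

    If ambient air (allowing for a heat-exchanger approach margin) is cold
    enough to handle the return water on its own, compressors stay off.
    The 6°C full-free-cooling margin is an assumption for this sketch.
    """
    if ambient_c <= chw_return_c - approach_c - 6.0:
        return "full free cooling"     # compressors off entirely
    elif ambient_c < chw_return_c - approach_c:
        return "partial free cooling"  # outside air pre-cools, compressors trim
    else:
        return "mechanical cooling"    # compressors carry the full load

# With a 23°C return (high temperature mode), a 5°C winter day needs no compressors
print(cooling_mode(5.0, 23.0))   # full free cooling
```

Raising the return setpoint from 19°C to 23°C widens the ambient range over which the first two branches apply, which is why the high temperature mode maximises free cooling hours.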

Savings

The chillers have saved £50,000 per month (1,124,028.90 kWh of energy saved, data collected from September 2016 to January 2017). During the three months from November 2016 to January 2017, 100% free cooling was achieved, with no compressors running. The chillers have contributed to a reduction in the data centre’s PUE, which is down from 1.7 to 1.4.
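The scale of the saving follows directly from the PUE improvement: for a fixed IT load, total facility energy scales with PUE, so dropping from 1.7 to 1.4 removes roughly a sixth of the total. A quick back-of-the-envelope check, using a hypothetical IT load rather than the site’s actual figures:

```python
# For a fixed IT load, facility energy scales with PUE, so the saving from
# a drop of 1.7 -> 1.4 is (1.7 - 1.4) / 1.7 of the total facility energy.
# The IT load below is hypothetical; it only makes the arithmetic concrete.
it_load_kwh = 1_000_000
before_kwh = it_load_kwh * 1.7
after_kwh = it_load_kwh * 1.4
saving_pct = (before_kwh - after_kwh) / before_kwh * 100
print(f"Facility energy reduced by {saving_pct:.1f}%")  # ~17.6%
```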
It is anticipated that the full-year running figures will meet the data centre’s expectations for energy saving and carbon reduction.

“Companies claim their kit can do more than it can. In our experience, the chillers more than live up to Aermec’s claims and provide the added benefit of being able to do 10% more,” says the facilities manager.

Collaborative working played a key role in achieving the successful outcome.

