How to curb data centers’ hunger for energy and resources
The mantra ‘keep the lights on, keep it cold’ is often quoted by data center engineers when defining their roles. But is this the right approach to energy efficiency and environmental impact for what are, essentially, utility-hungry sheds that consume as much power as small towns every day?
In January 2016 it was estimated that the global data center industry was consuming about 416.2 terawatt hours (TWh) per year. To put that into perspective, the entire UK, with a population of over 65m people and more than 5.7m businesses, consumes about 300 TWh a year.
There have been many opinions on how best to curb the use of such huge amounts of energy, however with billions of people watching cat videos, streaming films, gaming and ‘liking’ photos on social media, an ‘end of pipe’ solution to data center usage would be foolhardy at best, and probably cause global uproar.
The accessibility of data and ease of use is a great example of the Jevons paradox – “technological progress increases the efficiency with which a resource is used, but the rate of consumption of that resource rises because of increasing demand” - in other words, the easier you make it to consume the product the greater the consumption will be!
Is green energy the answer?
The most appropriate place to look at energy efficiency is not at the domestic end-user, but rather at the data center, colo or hyperscale, where energy is brought in and consumed. A potential solution to lessen the environmental impact of these energy hungry giants could be renewable energy. But is this practical?
Some companies will pay a higher ‘green rate’ for renewable energy when, in reality, the energy actually consumed has come from a conventional (coal, oil, gas or nuclear) power plant. Is this promotion of renewable energy, or just a tax under another name?
Forcing organisations to use solely renewable energy supplies hinges on renewables being able to meet constant demand as well as service the peaks and troughs of supply and generation.
However, until renewable generation technology catches up, there are plenty of alternative strategies data center operators can pursue to reduce their energy consumption – and improve their bottom line in the process.
So, what can data center operators do?
Margins in colocation are lower than they used to be, and clients increasingly want to pay via a PUE (Power Usage Effectiveness) mechanism. More efficient operations are therefore a must.
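As a concrete illustration, PUE is simply total facility energy divided by IT equipment energy over the same period. The sketch below uses made-up monthly figures, not real site data:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy.

    A PUE of 1.0 would mean every kWh reaches the IT load; real facilities
    typically run somewhere between about 1.2 and 2.0.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative monthly figures (hypothetical):
print(pue(total_facility_kwh=1_500_000, it_equipment_kwh=1_000_000))  # 1.5
```

Under a PUE-based billing mechanism, the lower this ratio, the less overhead energy the client is paying for beyond their own IT load.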
In Romonet’s experience, significant efficiencies can be gained from CAPEX-neutral changes to the data center, or from minimal capital spend combined with best practice.
Through our modeling activities we invariably see fixed-speed fans and set points that are too low. Air flow management is often poor, with a lack of blanking panels and containment. Long rows of racks may seem efficient, but often are not: if an operator only has CRAHs (Computer Room Air Handlers) at one end of the row, the engineers have to over-pressurise the floor to make sure cool air reaches the furthest point.
Increasing free cooling hours – using cool outside air rather than mechanical refrigeration – is vital to reducing cost and improving efficiency.
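A simple way to see the link between set points and free cooling is to count the hours in which outdoor air is cold enough to cool the supply air. The sketch below is a minimal illustration; the set point, approach temperature and sample readings are all assumptions, not measured values:

```python
def free_cooling_hours(hourly_temps_c, supply_setpoint_c=24.0, approach_c=4.0):
    """Count hours when outdoor air can cool the facility without chillers.

    An hour qualifies when the outdoor temperature is at least `approach_c`
    degrees below the supply air set point, leaving margin for heat exchange.
    """
    threshold = supply_setpoint_c - approach_c
    return sum(1 for t in hourly_temps_c if t <= threshold)

# Raising the supply set point widens the free-cooling window:
temps = [12.0, 18.5, 21.0, 25.0, 9.5, 16.0]  # sample hourly outdoor readings
print(free_cooling_hours(temps, supply_setpoint_c=22.0))  # 3 qualifying hours
print(free_cooling_hours(temps, supply_setpoint_c=24.0))  # 4 qualifying hours
```

Run against a full year of hourly weather data, the same comparison shows how a modest set-point change can add hundreds of free-cooling hours.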
All these small adjustments can reduce a facility’s energy consumption – but how do you understand what the effect will be of changing the set points within a cooling system, or of swapping the UPS for a more efficient model?
Predictive analytics for increased efficiency
Many data centers are still managed with outdated, time-consuming tools such as spreadsheets. Spreadsheets are prone to errors, however, and can become so complex that tracking down a mistake is almost impossible.
This is where predictive modeling and ‘what if’ analysis come into play to support data center managers in making data-led decisions.
By analysing trends and data from hundreds of different data centers, together with meteorological information spanning various climate regions, Romonet can simulate, calibrate and validate its models to within 98% accuracy.
This data is also used to model proposed changes and to predict the future savings and PUE values such a change would bring about.
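The shape of such a ‘what if’ comparison can be sketched in a few lines. The example below estimates the annual saving from replacing a legacy UPS with a more efficient unit; the load, efficiencies and tariff are purely illustrative assumptions, not Romonet model outputs:

```python
def annual_cost(it_load_kw, ups_efficiency, tariff_per_kwh, hours=8760):
    """Annual utility cost for a constant IT load behind a UPS.

    Energy drawn from the grid grows as UPS efficiency falls, because the
    UPS's conversion losses sit on top of the IT load it feeds.
    """
    grid_kwh = it_load_kw * hours / ups_efficiency
    return grid_kwh * tariff_per_kwh

# Hypothetical 500 kW IT load at £0.12/kWh:
before = annual_cost(it_load_kw=500, ups_efficiency=0.90, tariff_per_kwh=0.12)
after = annual_cost(it_load_kw=500, ups_efficiency=0.97, tariff_per_kwh=0.12)
print(round(before - after))  # projected annual saving from the swap
```

A real model would also capture part-load efficiency curves, cooling interactions and tariff structures, but even this toy version shows why the comparison has to be made before the CAPEX is committed.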
Machine learning can process the data produced by BMS (Building Management System) and EPMS (Electrical Power Monitoring System) platforms far faster and more consistently than humans ever could.
Producing consistent results relies not only on good software, but on software designed by people who have actually operated data centers at the sharp end.
Between January 2016 and December 2017, using our predictive analytics engine and database of information, Romonet identified utility cost savings of £3,068,500, 47,928 MWh of energy and 11,555 tonnes of CO2 across 21 facilities.
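For readers wondering how energy savings translate into avoided CO2, the conversion is a grid emission factor applied to the kWh saved. The factor below is an assumed, roughly UK-grid figure for the period; real reporting would use the factor for each site’s actual utility mix:

```python
# Converting an energy saving into avoided CO2 with a grid emission factor.
# 0.241 kgCO2e/kWh is an assumption, not a quoted Romonet figure.
saved_mwh = 47_928
emission_factor_kg_per_kwh = 0.241

avoided_kg = saved_mwh * 1_000 * emission_factor_kg_per_kwh
avoided_tonnes = avoided_kg / 1_000
print(round(avoided_tonnes))  # ~11,551 tonnes under this assumed factor
```

The result lands close to the tonnage quoted above, which is what you would expect for savings drawn largely from UK-connected facilities in that period.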
The best part is that the majority of these identified savings required only minimal CAPEX.
The answers to making these savings already lie within the data centers – the information is produced daily. However, the systems employed are often unable to drill into this data, or instead make broad, false assumptions that everything is fine.
Very often we find that although a site may have extensive metering, much of it is broken, flatlining or simply uncalibrated – and the resulting data is, quite frankly, worthless.
We spend countless hours cleaning data so it is usable, only to find that meters have not recorded load for some time, or that the BMS cannot produce trended or historic information.
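One of the simplest checks in that cleaning work is spotting a ‘flatlined’ meter: a channel whose readings never change over a sustained window is probably broken or disconnected. The stdlib-only sketch below is a minimal illustration of the idea; the window size and sample readings are assumptions:

```python
def find_flatlines(readings, window=4):
    """Return start indices of runs of `window` or more identical values.

    A run of identical consecutive readings that long suggests a stuck or
    disconnected meter whose data should be excluded before analysis.
    """
    flags, run_start = [], 0
    for i in range(1, len(readings) + 1):
        if i == len(readings) or readings[i] != readings[run_start]:
            if i - run_start >= window:
                flags.append(run_start)
            run_start = i
    return flags

kwh = [10.2, 10.4, 10.4, 10.4, 10.4, 10.4, 11.0, 10.8]
print(find_flatlines(kwh))  # [1] -> meter stuck from the second reading
```

Production pipelines add similar checks for gaps, spikes and out-of-range values, but even this basic filter catches the ‘meter stopped recording load’ cases described above before they poison a model.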
Perhaps the BMS or EPMS has been replaced and all historic data lost. Frustrating as this is – it may well send your data scientist off on a rant – work can be done to bring a facility back into line and start driving efficiency that can be documented as savings, reduced PUE and better resilience.
In conclusion: if you want to maintain the uptime of a facility while running it more cheaply and with less environmental impact, adopt new technologies and make predictive, data-driven decisions.