How do you prioritize data center cooling?
What are the most crucial factors when it comes to data center cooling?
What should I budget for data center cooling?
What are modern methods for cooling data centers?
These are common questions we hear from our customers in tech. When it comes to data center operations, cooling is one of the most important considerations. If servers run too hot, expensive hardware damage can lead to data loss, and that’s just the tip of the iceberg. That's why data center operators are always looking for ways to keep their facilities cool and improve airflow.
In this post, we'll cover the cost of data center cooling, the benefits of effective cooling, and five ways to cool your data center.
The Cost of Data Center Cooling
What do you think of when you hear “modern data center cooling technology”? The words “expensive” or “budget” may quickly come to mind. While cooling does account for a significant share of your power-related expenses, it is important to look at the value of effective cooling from a long-term cost-savings and cost-avoidance perspective, including a significant improvement in your data center's PUE (power usage effectiveness). In addition to improving PUE, effective cooling protects your hardware investments and keeps your operations running.
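PUE is simply total facility power divided by the power delivered to IT equipment, so reducing cooling power moves PUE toward its ideal of 1.0. Here is a minimal sketch of that arithmetic; the kilowatt figures are illustrative assumptions, not measurements from any specific facility.

```python
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# All numbers below are illustrative assumptions.

def pue(it_power_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    """Total facility power divided by power delivered to IT equipment."""
    total = it_power_kw + cooling_kw + other_overhead_kw
    return total / it_power_kw

# A hypothetical facility: 1,000 kW of IT load, 500 kW of cooling,
# 100 kW of other overhead (lighting, UPS losses, etc.):
print(round(pue(1000, 500, 100), 2))   # 1.6

# Halving cooling power through better airflow lowers PUE toward 1.0:
print(round(pue(1000, 250, 100), 2))   # 1.35
```

The takeaway: every kilowatt of cooling you avoid shows up directly in the numerator, which is why airflow improvements translate into measurable PUE gains.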
How Data Centers Benefit from Effective Cooling
Two benefits hit home for every data center when it comes to cooling. The first is that effective cooling prolongs equipment life; the Uptime Institute has noted that “heat is the number one cause of equipment failure.” The second is that heat can lead to performance issues and latency: when servers begin to overheat, they “protect” themselves by reducing processing speed to avoid damage. The bottom line is that effective cooling is essential to data center health. Here are five ways companies improve data center cooling.
5 Ways to Cool Your Data Center
1. Employ Regularly Scheduled Maintenance
Regularly scheduled maintenance plays a significant role in sustaining data center uptime. A regular preventative maintenance schedule, covering everything from server racks to your facility's cooling system, ensures everything runs as it should. Without it, by the time temperatures rise it may already be too late to prevent a data center disaster.
2. Optimize Server Racks for Cooling Efficiency
In a compute- or storage-heavy data center, effective cable management is critical to improving cooling at the rack level. With so much equipment in a modern rack, cables can occupy too much space, and if they are not dressed, tied, and managed properly, they will start to block airflow. We recommend streamlining cables so that hot air within the rack can flow easily to the back of the server rack. Today’s server racks come with a variety of built-in cable management options.
Another item you can optimize within a server rack for cooling efficiency is the positioning of the power distribution units (PDUs). If not positioned correctly, PDUs can also block airflow within a cabinet. Our engineers strongly recommend a recessed PDU cavity to alleviate this pain point: it moves the PDU outward from the cabinet, away from the area where it would typically block airflow. Discuss these options with your server rack supplier to ensure they offer a model with this design or, if they do not, the capability to manufacture a custom server rack to your needs. So, what do you do with the space within a server rack that does not hold equipment?
Utilizing blanking panels inside server racks can be a cost-effective way to improve data center cooling. Blanking panels cover any open rack spaces without equipment, sealing the face of the frame so that cool air from the cold aisle passes through the equipment rather than bypassing it, and hot air exhausts into your hot aisle instead of recirculating to the front.
Another piece to look at is your server rack doors. Perforated doors help to improve airflow and keep servers cool by allowing hot air to escape, while still providing adequate protection for the equipment inside. The color of a server rack is also an important detail in improving cooling efficiency at a rack level.
Many data centers are adopting white hardware as their standard instead of black because of its reflective properties. Consider this: white server racks reflect about 80 percent of light, while black server racks reflect only about 5 percent. What does that mean from a cost-savings perspective? If you were to change your all-black data center to an all-white one, you could reduce your lighting requirements by up to 30 percent. That is a tangible saving on your monthly electric bill from improved energy efficiency. The best part of this suggestion? Manufacturers typically charge the same price for a white cabinet as for a black cabinet with the same specs, so there would be no additional investment.
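To make the lighting savings concrete, here is a back-of-the-envelope sketch. The 30 percent reduction figure comes from the article; the installed lighting load and utility rate are assumptions chosen only for illustration.

```python
# Illustrative lighting-savings estimate for switching from black to white racks.
# The 30% reduction comes from the article; lighting load and rate are assumptions.

lighting_kw = 40          # assumed installed lighting load for the data hall
hours_per_month = 730     # lights on around the clock
rate_per_kwh = 0.12       # assumed utility rate, $/kWh

baseline_cost = lighting_kw * hours_per_month * rate_per_kwh
savings = baseline_cost * 0.30   # up to 30% less lighting output needed

print(f"${baseline_cost:,.0f}/month baseline, up to ${savings:,.0f}/month saved")
```

Your actual numbers will depend on your hall's lighting load and local rates, but the structure of the estimate is the same.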
3. Rethink Your Data Center Architecture
Hot and cold aisle designs alternate rows of hot and cold aisles to improve cooling efficiency. The concept is simple – cold air enters the front of racks in the cold aisle, and hot exhaust air exits the back of the server racks into the hot aisle. Many companies are taking this a step further and implementing hot and cold aisle containment systems to reduce air mixing and operational costs associated with cooling.
The goal for cold aisle containment is to create a smaller area to cool. The cold row is capped at the top of the cabinets and across the aisle, and doors are installed at the ends of the rows to contain cold air.
With hot aisle containment systems, hot air is isolated with vertical panels to reduce energy use and costs. This barrier prevents hot and cold air mixing and directs exhaust airflow into an AC return, which will increase the capacity of computer room air conditioning (CRAC) units.
Deciding between the two systems depends on your data center's specific needs. Still, our engineers encourage hot aisle containment solutions if your data center has recently increased density or plans to.
4. Increase Data Center Temperature
Raising data center temperature to reduce cooling costs may seem counterintuitive to maintaining uptime, but running your data center above the traditional 68°F – 71°F range has been shown to increase efficiency. Google has shared that it keeps its data centers as warm as 80°F to reduce energy usage. Keep in mind that to employ this method, you need to invest in high-efficiency equipment and a robust monitoring system to prevent overheating.
5. Utilize Liquid Cooling
The last item we will review is liquid cooling. Liquid cooling is a budding data center trend, and the concept is much like that of a radiator. Although it is gaining popularity, many data center operators hesitate to bring liquids into their data centers, and many companies object that liquid-based cooling is more expensive than air-based cooling. Still, Data Center Dynamics shared that “with total immersion cooling technology, expenses such as chillers, computer room air handling (CRAH) equipment, raised floors or ducting are no longer necessary.” In the same article, they also shared that without CRAH equipment and fans inside servers, “liquid cooling technology can reduce energy bills by up to 80%.” We are excited to see how liquid cooling trends with the rise of high-density data centers over the next few years.
Data Center Cooling for Modern Day Data Centers
We predict that as data center cooling concerns continue to rise, there will also be a rise in data center cooling best practices and data center cooling technology. We’d love to hear from you on how you maintain your cool while keeping your costs down in your data center in the comments below.
Are you looking for a partner to help you keep your data center cooling costs down? Learn more about DAMAC’s premium data center solutions: https://www.maysteel.com/data-center-solutions/ or contact our team of engineers by clicking here.