Cooling A Data Center
When server rooms and small to mid-size data centers are first established, cooling is often a low priority. As more computing equipment is added, rising heat can cause costly shutdowns, and rapid, uncontrolled expansion only makes the problem worse. Fortunately, administrators rarely need to expand facility-level cooling capacity: most heat-related problems can be solved with low-cost, rack-level measures. The best practices below improve airflow, efficiency, and cost.
Here are some tips to improve data center cooling:
1. Measure Intake Temperatures
Hot air recirculation raises equipment intake temperatures above the ambient room temperature. Base cooling decisions on the intake temperatures measured at the IT equipment itself, not on the room temperature.
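As a minimal sketch of this practice, the snippet below checks per-rack intake readings against the ASHRAE-recommended intake envelope of 18 to 27 degrees C. The rack names and temperatures are hypothetical; in a real deployment the readings would come from rack-mounted probes or server BMC telemetry.

```python
# Flag racks whose intake temperature falls outside the ASHRAE
# recommended envelope (18-27 degrees C measured at the equipment intake).
ASHRAE_LOW_C = 18.0
ASHRAE_HIGH_C = 27.0

def check_intakes(readings: dict[str, float]) -> list[str]:
    """Return the racks whose intake temperature is out of range."""
    return [
        rack for rack, temp_c in readings.items()
        if not (ASHRAE_LOW_C <= temp_c <= ASHRAE_HIGH_C)
    ]

# Hypothetical sensor readings: rack-02 is ingesting recirculated hot air.
intake_temps_c = {"rack-01": 22.5, "rack-02": 29.1, "rack-03": 24.0}
print(check_intakes(intake_temps_c))  # ['rack-02']
```

A rack that reads high here while the room thermostat looks fine is exactly the recirculation problem this tip describes.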
2. Understand HVAC’s Job
HVAC systems are intended for occupant comfort. Adding your data center to your facility’s HVAC system may sound straightforward, but it’s not. The return air stream is a convenient way to move hot data center air, but building HVAC systems have constraints that limit IT cooling.
3. Remove Excess Heat
Anything that adds heat to the data center reduces cooling efficiency. Replace aging equipment with newer, more energy-efficient models, swap obsolete light fixtures for LEDs, and upgrade older UPS systems to efficient online models with energy-saving modes.
4. Discard Unneeded Gear
End-of-life servers often stay plugged in, drawing electricity and producing heat. Worse, older servers are usually less efficient than their replacements. It can seem easier to leave equipment in place than to risk disconnecting a device someone might still be using, but decommissioning unneeded gear removes both its power draw and its heat.
5. Reduce Hotspots By Spreading Loads
When put close together, blade servers and other high-wattage loads can cause hot spots. A small to mid-size data center with ten 2 kW racks and two 14 kW racks is more complicated and costlier to cool than one with twelve 4 kW racks, even though each contains 48 kW of equipment.
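The comparison in this tip is easy to verify with a little arithmetic. The sketch below uses the article's own example layouts; the key point is that cooling difficulty is driven by the hottest rack, not just the total load.

```python
# Two layouts with the same 48 kW total IT load (per the article's example):
# concentrated loads create hot spots; spreading them keeps density uniform.
concentrated = [2] * 10 + [14] * 2   # ten 2 kW racks + two 14 kW racks
spread_out = [4] * 12                # twelve 4 kW racks

for name, racks in (("concentrated", concentrated), ("spread", spread_out)):
    print(f"{name}: total = {sum(racks)} kW, hottest rack = {max(racks)} kW")
```

Both layouts total 48 kW, but the concentrated layout has racks dissipating 14 kW each, well beyond what typical room-level cooling handles per rack, while the spread layout peaks at 4 kW.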
6. Arrange Racks Hot-Aisle/Cold-Aisle
Separating cold supply air from hot equipment exhaust improves cooling efficiency. Arrange racks so that fronts face fronts across cold aisles and backs face backs across hot aisles. This reduces energy usage by preventing equipment from drawing in hot exhaust from adjacent rows.
7. Blanking Panels Manage Passive Airflow
Airflow control costs little or nothing. Installing simple accessories such as blanking panels can boost cooling efficiency: the panels prevent cold air from bypassing equipment and hot air from recirculating through empty rack spaces. Snap-in 1U blanking panels are the most convenient choice, installing faster than screw-in variants.
8. Replace Open-Frame Racks With Enclosures
Open-frame racks are great for some applications, but they offer no airflow control. Use enclosures with fully vented front and rear doors to manage front-to-back airflow. If rack-level security isn't a priority, remove the doors or buy enclosures without them.
9. Solid Side Panels
Ventilated side panels may appear to help cooling, but they let hot exhaust recirculate to equipment intakes. Use solid side panels, including between bayed enclosures, to keep hot air from circulating back to the front of the rack.
10. Manage Cables
Unmanaged cabling blocks cold air distribution beneath elevated floors and causes heat buildup in enclosures. In raised-floor environments, move underfloor cabling to overhead ladders or troughs. Within enclosures, organize patch cables and power cords using horizontal and vertical cable managers.
11. Use Thermal Ducts For Passive Heat Removal
Passive heat removal lowers rack and data center temperatures without adding energy costs, and a simple arrangement can cool a small space. Add two vents connecting the space to an adjacent climate-controlled area: one high on a wall, where hot exhaust escapes, and one near the floor, where cooler air enters.
12. Actively Remove Heat
Active heat removal builds on passive ventilation. Add a fan to the upper vent of the high/low pair to improve airflow: hot air is less dense than cool air and rises on its own, and the fan accelerates the process.
13. Close-Coupled Cooling
Close-coupled cooling places cooling units in or beside the rack row, close to the heat load. It is meant to supplement, not replace, the other best practices here; combining whichever options are feasible and suitable for your site optimizes data center cooling.
14. Consider Outages
If you have backup power for your data center equipment, provide it for your cooling systems as well. During a blackout, UPS-backed servers and other equipment keep running and keep emitting heat while the cooling systems may be offline.
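To see why this matters, the sketch below estimates how quickly a sealed room heats up when cooling fails but UPS-backed IT gear keeps running. Essentially all electrical power drawn by IT equipment ends up as heat. The room size and load are hypothetical, and real rooms leak some heat through walls, so this is a worst-case bound.

```python
# Estimate the temperature rise rate of a sealed room during a cooling
# outage: dT/dt = Q / (m * cp), where Q is the heat load and m is the
# air mass. Figures below are illustrative assumptions.
IT_LOAD_W = 10_000          # 10 kW of IT load ~= 10 kW of heat
ROOM_VOLUME_M3 = 100.0      # hypothetical room volume
AIR_DENSITY = 1.2           # kg/m^3, roughly room conditions
AIR_SPECIFIC_HEAT = 1005.0  # J/(kg*K) for air

air_mass_kg = ROOM_VOLUME_M3 * AIR_DENSITY
rate_k_per_min = IT_LOAD_W / (air_mass_kg * AIR_SPECIFIC_HEAT) * 60
print(f"Temperature rise: {rate_k_per_min:.1f} K per minute")
```

Even this rough bound shows the room air alone absorbs a 10 kW load for only a few minutes before intake temperatures leave the safe range, which is why cooling (or at least ventilation fans) deserves backup power too.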
15. Schedule A Data Center Assessment
The best solution depends on the site, the equipment, and the application. A professional data center assessment can help you identify and fix heat-related problems and establish optimal provisioning.