Although there is no single trick admins can use to achieve an energy-efficient data center, several small changes can collectively decrease energy use in a significant way.
Infrastructure power requirements can drive up operating costs. Data center managers can cut their utility bills if they address power needs for CPUs, storage and cooling systems.
Utility bills are no small data center expense. As part of the push to address IT spending, data center managers and organizations continue to look for ways to drive down data center energy costs and increase overall energy efficiency.
Because so many infrastructure components use electricity, managers have several options for increasing data center energy efficiency. These include adjusting fan speeds, rethinking storage hardware, shifting workloads to cloud infrastructure and even raising the operating temperature. Individually small, these changes can collectively reduce data center power consumption and yield significant energy savings.
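To put a number on how these small changes add up, the standard metric is Power Usage Effectiveness (PUE): total facility power divided by IT equipment power. The sketch below is illustrative only; the facility figures (500 kW IT load, PUE improving from 1.8 to 1.5) are hypothetical, not from the article.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.
    A PUE of 1.0 would mean every watt goes to computing; typical facilities
    fall somewhere between roughly 1.2 (efficient) and 2.0 or worse."""
    return total_facility_kw / it_equipment_kw

def annual_savings_kwh(it_kw: float, pue_before: float, pue_after: float) -> float:
    """Energy saved per year when overhead (cooling, power delivery) improves
    while the IT load stays constant."""
    hours_per_year = 8760
    return it_kw * (pue_before - pue_after) * hours_per_year

# Hypothetical facility: 500 kW IT load, overhead trimmed from PUE 1.8 to 1.5
print(pue(900, 500))                      # 1.8
print(annual_savings_kwh(500, 1.8, 1.5))  # roughly 1.3 million kWh/year
```

Even a modest PUE improvement compounds across every hour of the year, which is why incremental fixes like fan-speed tuning are worth measuring.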
Enterprises worldwide are deeply engaged in their digital transformation journey, as they digitize and automate antiquated processes. To get there, they are increasingly investing in data analytics and business intelligence tools to analyze extensive datasets and make the right business decisions.
Consequently, the data analytics market is surging, and now tops $200 billion in annual spending, according to IDC analysts.
A similar trend is visible in the data analytics job market. The U.S. Bureau of Labor Statistics predicts growth of more than 30% in data science positions by 2030. Moreover, Gartner estimates that up to 90% of businesses treat information as a critical asset and data analytics as an essential competitive edge.
Several factors are fueling this growth in the data management arena. Here we look at the top seven trends shaping the data management market in 2022 and beyond, as enterprises strive to meet data-centric demands and maintain a competitive edge.
Extreme heat and cold can keep equipment from operating at peak efficiency. Explore cost-effective cooling technologies and smart options for your facility.
One of the most vital tasks for any data center is environmental monitoring and management. High temperatures and humidity levels can damage IT equipment and cause it to fail, and they also create uncomfortable conditions for anyone working inside the data center.
Fortunately, there are many systems and technologies that can help monitor and manage data center cooling to keep temperatures and humidity levels in the optimal range.
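As a rough illustration of what such monitoring involves, the sketch below checks sensor readings against fixed thresholds. The ranges are based loosely on the ASHRAE-recommended envelope for data centers (roughly 18-27 degrees C inlet temperature, relative humidity under about 60%), but the exact values, sensor names and alerting logic here are assumptions for the example; verify limits against your equipment's specifications.

```python
from dataclasses import dataclass

# Illustrative thresholds, loosely following ASHRAE recommendations;
# confirm against vendor specifications before relying on them.
TEMP_RANGE_C = (18.0, 27.0)
HUMIDITY_RANGE_PCT = (20.0, 60.0)

@dataclass
class SensorReading:
    location: str
    temp_c: float
    humidity_pct: float

def check_reading(r: SensorReading) -> list:
    """Return an alert message for each value outside its allowed range."""
    alerts = []
    if not TEMP_RANGE_C[0] <= r.temp_c <= TEMP_RANGE_C[1]:
        alerts.append(f"{r.location}: temperature {r.temp_c} C out of range")
    if not HUMIDITY_RANGE_PCT[0] <= r.humidity_pct <= HUMIDITY_RANGE_PCT[1]:
        alerts.append(f"{r.location}: humidity {r.humidity_pct}% out of range")
    return alerts

# Hypothetical reading from a rack inlet sensor
print(check_reading(SensorReading("rack-12 inlet", 29.5, 45.0)))
```

Real monitoring systems add trending, alert escalation and integration with cooling controls, but the core loop is this same compare-against-envelope check.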
Discover the different classifications of liquid cooling -- such as direct-to-chip, liquid immersion or rear-door heat exchangers -- before adopting it in your data center.
As the performance levels and rack power densities of modern computing equipment climb, more companies are transitioning from air to liquid cooling, since liquid offers a more efficient method of transferring heat. Despite lingering concern about mixing liquid and electronics, liquid cooling technology has evolved to make such concerns largely obsolete.
Water at standard conditions carries far more heat per unit volume than air, which means liquid cooling increases both cooling effectiveness and energy efficiency for data centers that employ it. Plus, it's easier to manage than high volumes of air. ASHRAE Technical Committee 9.9 has even added another liquid cooling classification to standardize the breadth of liquid cooling applications.
The quest to maintain operating temperatures for increasing computing densities has firms transitioning from air cooling to liquid cooling. We assess both methods.
Data centers continue to pack more computing power into smaller spaces to consolidate workloads and accommodate processing-intensive applications, such as AI and advanced analytics. As a result, each rack consumes more energy and generates more heat, putting greater pressure on cooling systems to ensure safe and efficient operations.
In the past, when rack power requirements remained well below 20 kilowatts (kW), data centers could rely on air cooling to maintain safe operating temperatures. But today's high-performing racks can easily exceed 20 kW or even 30 kW. This is largely because the computing systems within these racks are configured with CPUs and GPUs that have much higher thermal power densities than previous generations. Although some air cooling systems can support racks that draw more than 20 kW, they are inefficient and complicated to maintain, leading organizations to look into liquid cooling.
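A quick calculation shows why air cooling struggles at these densities. Nearly all electrical power drawn by a rack becomes heat, and the airflow needed to remove it follows Q = P / (rho * cp * dT). The rack size and temperature rise below are illustrative assumptions, not figures from the article.

```python
def airflow_m3_per_s(rack_kw: float, delta_t_c: float,
                     air_density: float = 1.18,    # kg/m^3, approx. near 25 C
                     air_cp: float = 1005.0) -> float:
    """Airflow needed to remove a rack's heat load: Q = P / (rho * cp * dT).
    Assumes essentially all electrical power is dissipated as heat."""
    return rack_kw * 1000 / (air_density * air_cp * delta_t_c)

# Illustrative: a 30 kW rack with a 12 C rise from inlet to exhaust
flow = airflow_m3_per_s(30, 12)
print(f"{flow:.2f} m^3/s (~{flow * 2118.88:.0f} CFM)")  # thousands of CFM
```

Moving several thousand cubic feet of air per minute through a single rack demands large fans, careful containment and significant fan energy, which is exactly the inefficiency that pushes high-density racks toward liquid.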