Channel: Data Center Frontier

Data Center IQ Survey – A Look at Cost, Deployment, and Risk


Today’s business is tightly coupled with the capabilities of its data center. We’re seeing more users, more virtual workloads, and many more use cases demanding greater density and compute. There’s no slowdown in data growth in sight, and data center requirements will only continue to grow. Through it all, organizations have been working hard to improve data center economics with better underlying data center ecosystem technologies.

Data Center IQ

Download the study to learn about the key issues data center professionals face today.

When working with data center technologies, there are numerous considerations around the business, the workloads, and the users. Data center administrators must use the right tools to deliver a truly optimal data center experience. Professionals in the field constantly struggle with core management and control challenges like:

  • Which DCIM tools are the most critical?
  • How do you control and optimize your PUE?
  • What should my average rack density be?
  • What happens during an outage and what does it cost?
  • What are some ways that I can speed deployment?

With all of this in mind, Ponemon Institute and Emerson Network Power released the results of the first Emerson Data Center IQ Quiz, part of the Data Center Performance Benchmark Series, which provides an industry-wide perspective on availability, security, productivity, cost, and speed of deployment. The purpose of the study is to gauge the domain knowledge of data center personnel while also collecting data on the application of best practices and current operating conditions within participants’ data centers.

Power Considerations

As density increases, one of the biggest challenges for data center administrators is controlling power requirements. According to The Green Grid, which established the PUE metric, “PUE measures the relationship between the total facility energy consumed and the IT equipment energy consumed,” and is expressed as a ratio of those two numbers. A data center with total energy consumption of 1 MW, with 625 kW of that energy used by IT equipment, would have a PUE of 1.6 (1000/625). Energy consumed by data center lighting is included in the total facility energy. As for the study, 45 percent of respondents stated that their PUE was 1.59 or below; however, only 14 percent reported a PUE between 1.0 and 1.19.

The responses to this question demonstrate just how broad the range of PUEs has become in the current environment, where legacy facilities coexist with new high-efficiency data centers: 10 percent of respondents had PUEs above 2.0, while 14 percent had PUEs below 1.2.

Consider the difference that means for a 1 MW facility: with a PUE over 2.0, less than 500 kW of available power is used by IT equipment, while the same size facility operating at a PUE of 1.2 has 830 kW available to power IT equipment. Eighty-seven percent of respondents know their PUE, but only 50 percent know what goes into its calculation, and just 32 percent know what the acronym stands for.
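The arithmetic above can be sketched as a quick calculation, using only the figures quoted in this article (function names are illustrative, not from any standard library):

```python
# PUE = total facility energy / IT equipment energy (The Green Grid metric).
def pue(total_kw: float, it_kw: float) -> float:
    return total_kw / it_kw

def it_power_available(total_kw: float, pue_value: float) -> float:
    """IT power a facility can deliver at a given PUE."""
    return total_kw / pue_value

# The article's example: a 1 MW facility with 625 kW consumed by IT equipment.
print(round(pue(1000, 625), 2))              # 1.6

# The same 1 MW facility at PUE 2.0 vs. PUE 1.2:
print(round(it_power_available(1000, 2.0)))  # 500 kW
print(round(it_power_available(1000, 1.2)))  # 833 kW (~830 as cited above)
```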

The Power of DCIM

Data center managers agree: you can’t manage what you can’t see. This is where DCIM comes into play. Global Industry Analytics, Inc. projects the global market for DCIM will reach $1.9 billion by 2020, based on DCIM’s ability to optimize data center infrastructure performance in the face of growing complexity. The participants in this study demonstrated a high degree of understanding of the core capabilities of DCIM and significant use of these capabilities in their own data centers. Many have already deployed the following DCIM capabilities:

  • Resource Management: 73%
  • Predictive Analysis: 71%
  • IT Physical Asset Management: 70%
  • Reporting and Visualization: 70%
  • Workflow Integration Management: 68%
  • Power Monitoring: 67%
  • Environmental Monitoring: 64%

Cost and Causes of an Outage

The Ponemon Institute has conducted three separate analyses of the cost of data center outages, published in 2010, 2013, and 2016. At the time this survey was deployed, the 2013 numbers were the most current, showing the average cost per minute of a full outage to be $7,900, up 33 percent from 2010.

In addition to analyzing the cost of full and partial outages in three separate studies dating back to 2010, Ponemon Institute has documented the average duration of data center outages. In the 2010 study, the average duration of a complete outage was 134 minutes (2 hours and 14 minutes).
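Putting those two figures together gives a rough sense of the exposure a single outage represents. A minimal sketch, purely for illustration (note that it mixes study years: the per-minute cost is the 2013 figure, while the duration is the 2010 average):

```python
# Rough outage-cost estimate from the Ponemon figures quoted above.
COST_PER_MINUTE = 7_900   # 2013 average cost of a full outage, USD per minute
AVG_DURATION_MIN = 134    # 2010 average duration of a complete outage, minutes

estimated_cost = COST_PER_MINUTE * AVG_DURATION_MIN
print(f"${estimated_cost:,}")  # $1,058,600
```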

According to the study, “UPS failure,” in the broad sense, has been the number one cause of outages in each of the three Ponemon Cost of Data Center Outage reports. The 2013 National Survey of Data Center Outages, also conducted by Ponemon Institute, sheds more light on the root causes. It found battery failure to be the most frequent cause of outages, followed by UPS capacity exceeded. While both of these fall under the category of “UPS failure,” neither represents a failure of the UPS module itself, and both are correctable.

Download the study to learn about the key issues data center professionals face today. This includes a look at:

  • Maintaining availability
  • Controlling cost
  • Mitigating risk
  • Improving productivity
  • Increasing speed of deployment

As the data center continues to evolve, managers will need to understand the most critical components that power their infrastructure. This means looking at all of the variables around creating greater levels of resiliency, management, uptime, and, very importantly, business integration.
