Growing Trends in the Data Center World, by Guest Author Yannis Gatsiounis of AngelouEconomics

Yannis Gatsiounis

Associate Project Manager, AngelouEconomics

As companies grow, their data centers often follow – turning into enormous, expensive operations that consume massive amounts of energy. But advances in technology and business modeling are altering the data center landscape.

A data center is a centralized computer repository for a business’s IT operations and equipment. Data centers prevent disruption to a system’s IT infrastructure by providing backup communications connections, power supplies, data storage and security devices.

Cloud technology is the most talked-about change to data centers in recent years. Moving a data center to the cloud allows companies to scrap on-premise hardware by storing data on the Internet instead. The process also usually involves outsourcing operations, including updates and maintenance, to a third party, reducing the burden on in-house IT staff and lowering overall operational costs. Cloud data centers involve fewer applications – sometimes as few as one – a single hardware environment and software architecture, and less application patching and updating than traditional data centers. Lower costs and fewer infrastructural demands mean clouds are easier to initiate.

Clouds, however, cannot store servers and equipment, and cannot handle workloads as complex as traditional data centers can. And because a physical data center is linked to a local network usually monitored by company staff, it’s easier to secure than those plugged into a cloud network. Cloud security customarily depends on third-party providers.

Traditional data centers are less scalable than cloud systems. The research company Gartner estimates that data centers are obsolete after seven years, while the Uptime Institute found that more than a third of the large companies it surveyed anticipated outgrowing the IT capacity of their data centers.

Another trend, says Peter Gross, vice president of mission critical systems at Bloom Energy, is that software increasingly is defining data centers and their services – “Management of service controlled by intelligent software as opposed to hardware,” said Gross, “making for a far smarter environment.”

A more software-centric data center will, for example, optimize power generation and enable workloads to be shifted intuitively across data centers. In software-defined data centers (SDDCs), network, server and storage functions are virtualized. This allows applications to better control their resources, and makes for greater flexibility and responsiveness overall.

Another trend, one facilitated by cloud technology and advances in software, is colocation, whereby data center infrastructure, including space, cooling equipment, bandwidth and security, is rented out to companies at a single location. Cost is the driving force behind the decision of many businesses to co-locate; power, network, connectivity, physical security, engineers and specialized staff, 24/7 monitoring, and a predictable operational model are often provided. Another advantage is the ability to scale up operations; additional capacity can be provided quickly.

Mario Hernandez, CEO of the San Antonio Economic Development Foundation, has seen a drop in the number of data centers being built around San Antonio as companies opt to colocate their data centers.

Margaret Rouse, who writes for and manages TechTarget’s IT encyclopedia, notes that there are potential downsides to colocation centers. They may involve long-term contracts, which prevent renegotiating rates when prices fall, and the fine print of contracts may include hidden charges.

TechTarget lists the following among costs that are commonly overlooked: ancillary services, such as janitorial services; backup power systems; environmental services; security guards; contractors hired to assist with technical problems and other services; and costs associated with network bandwidth.

Colocation can be extremely beneficial to Internet-related businesses with midsize IT needs, because it allows staff to focus on the company’s core business rather than the supporting logistical infrastructure. But colocation’s benefits are not limited to any one type of business.

There are generally two types of colocation centers, and the key is determining which best suits your business needs. Wholesale providers offer large spaces that, aside from power and cooling infrastructure, are essentially empty; the tenant brings in servers, cables and other equipment and ensures the system is operable. Retail providers tend to supply these services and infrastructure themselves, and their spaces are usually smaller.

“Standardization,” said Gross, “will play a bigger role moving forward.” Colocation allows providers to more cost-effectively address a client’s needs. Standardization is likely to leave fewer players controlling the architecture of data centers, making integration and optimization easier. The risk, say some, is a one-size-fits-all approach that makes for a less innovative environment. However, standardization of the main building blocks need not stifle flexibility and customization; in fact, it can make it easier for suppliers to deliver fully integrated, value-added and scalable solutions.

HP is one company moving toward standardization. In 2013, it formed its Converged Systems business unit, which focuses on establishing core platforms of converged infrastructure to optimize workloads and built-in use cases. The approach prevents siloing through use of a localized, foundational application that is built out across the data center.

Data centers are likely to get more energy efficient, due to both environmental pressure and cost savings. McKinsey and Company found that data centers on average were using only 6-12 percent of their electricity to perform computations on their servers, with much of the remainder used to keep servers ready for an uptick in activity that could otherwise slow down or crash the system. The New York Times notes that “the inefficient use of power is largely driven by a symbiotic relationship between users who demand an instantaneous response to the click of a mouse and companies that put their business at risk if they fail to meet the expectation.”

According to the Environmental Protection Agency, data centers consumed 1.7-2.2 percent of all electricity used in the United States in 2010, and the agency predicts consumption will grow about nine percent per year through 2020 as the world grows more digitalized and record amounts of data are created and consumed each year. IBM estimates that users have created 2.7 zettabytes of data – and that 90 percent of it was created in the last two years.

Some companies are using re-engineered software and cooling systems to waste less power. Another method is “right-sizing” the data center platform, which basically involves working with server manufacturers to streamline their designs and remove components extraneous to a particular operation. A third way is virtualization, which makes server utilization more efficient by running multiple virtual servers on shared physical machines instead of dedicating hardware to each workload. Microsoft’s data centers are using virtualization to better utilize server resources such as central processing units, disk inputs and outputs, and memory.
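The consolidation gain from virtualization can be sketched with back-of-the-envelope arithmetic. The figures below – server count, utilization, power draw – are assumed purely for illustration; they are not taken from the McKinsey study or from Microsoft’s operations:

```python
import math

# Hypothetical fleet: 100 lightly loaded physical servers, each
# drawing roughly 300 W regardless of load (idle-heavy hardware).
physical_servers = 100
avg_utilization = 0.10      # ~10% average CPU utilization
watts_per_server = 300

# Useful work, measured in "fully utilized server" equivalents.
useful_capacity = physical_servers * avg_utilization  # 10.0

# Consolidate those workloads as virtual servers onto hosts run
# at a safer 70% target utilization, leaving headroom for spikes.
target_utilization = 0.70
hosts_needed = math.ceil(useful_capacity / target_utilization)  # 15

power_before_kw = physical_servers * watts_per_server / 1000  # 30.0 kW
power_after_kw = hosts_needed * watts_per_server / 1000       # 4.5 kW
print(f"{hosts_needed} hosts: {power_before_kw:.1f} kW -> {power_after_kw:.1f} kW")
```

Even with generous headroom, the same workload runs on 15 machines instead of 100 – an 85 percent reduction in server power under these assumptions, which is why consolidation figures so prominently in efficiency efforts.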

The United States remains the world’s safest and lowest-risk place to locate a data center, according to a report from Cushman & Wakefield, hurleypalmerflatt and Source8. Robust Internet bandwidth capacity, reliable power costs, and low risks affecting data center operations contributed to the top ranking; the UK ranked second. Hong Kong, in sixth place, ranked highest in Asia.

But within the U.S., risk varies significantly. Poor power infrastructure can lead to blackouts, which can cause extended downtime of services. Susceptibility to hurricanes, tornadoes and other natural disasters further raises risk, while cooler climates can reduce costs because less energy is required to cool the data center. Generally, it is safer to put data centers in less populated and less trafficked areas: more people and more commerce mean more risk.

A 2012 study by location consultants the Boyd Company ranked Sioux Falls, S.D., as the best place to locate a data center from a cost and security standpoint. Runners-up were Tulsa, Okla.; Ames, Iowa; Council Bluffs, Iowa; Bloomington, Ind.; Albuquerque; San Antonio; Omaha; and Colorado Springs. All are smaller to mid-size cities with a lower cost of living than major metropolitan centers.

About AngelouEconomics
AngelouEconomics is a leading Corporate Site Location and Economic Development Consultancy. The firm has provided site location services to clients with over $5 billion worth of data center projects and has developed over 400 economic development strategic plans for U.S. and international clients. To learn more, visit www.angeloueconomics.com or call 512-658-8400 (cell).
