Data Center Definition

A data center (or datacenter) is a facility composed of networked computers and storage that businesses or other organizations use to organize, process, store and disseminate large amounts of data. A business typically relies heavily upon the applications, services and data contained within a data center, making it a focal point and critical asset for everyday operations.

Data centers are not a single thing, but rather, a conglomeration of elements. At a minimum, data centers serve as the principal repositories for all manner of IT equipment, including servers, storage subsystems, networking switches, routers and firewalls, as well as the cabling and physical racks used to organize and interconnect the IT equipment. A data center must also contain adequate infrastructure: power distribution and supplemental power subsystems, including electrical switching, uninterruptible power supplies and backup generators; ventilation and cooling systems, such as computer room air conditioners; and adequate provisioning for network carrier (telco) connectivity. All of this demands a physical facility with physical security and sufficient physical space to house the entire collection of infrastructure and equipment.

Data center consolidation and colocation

There is no requirement for a single data center, and modern businesses may use two or more data center installations across multiple locations for greater resilience and better application performance, since placing workloads closer to users lowers latency.

Conversely, a business with multiple data centers may opt to consolidate them, reducing the number of locations in order to minimize the costs of IT operations. Consolidation typically occurs during mergers and acquisitions, when the acquiring business doesn't need the data centers owned by the business it acquires.

Alternatively, businesses can pay a fee to rent space for their servers and other hardware in a colocation facility. Colocation is an appealing option for organizations that want to avoid the large capital expenditures associated with building and maintaining their own data centers. Today, colocation providers are expanding their offerings to include managed services, such as interconnectivity, allowing customers to connect to the public cloud.

Data center tiers

Data centers are not defined by their physical size or style. Small businesses may operate successfully with several servers and storage arrays networked within a convenient closet or small room, while major computing organizations, such as Facebook, Amazon or Google, may fill an enormous warehouse space with data center equipment and infrastructure. In other cases, data centers can be assembled in mobile installations, such as shipping containers, also known as data centers in a box, which can be moved and deployed as required.

However, data centers can be defined by various levels of reliability or resilience, sometimes referred to as data center tiers. In 2005, the American National Standards Institute (ANSI) and the Telecommunications Industry Association (TIA) published standard ANSI/TIA-942, “Telecommunications Infrastructure Standard for Data Centers,” which defined four tiers of data center design and implementation guidelines. Each subsequent tier is intended to provide more resilience, security and reliability than the previous tier. For example, a tier 1 data center is little more than a server room, while a tier 4 data center offers redundant subsystems and high security.

Data center architecture and design

Although almost any suitable space could conceivably serve as a “data center,” the deliberate design and implementation of a data center requires careful consideration. Beyond the basic issues of cost and taxes, sites are selected based on a multitude of criteria, such as geographic location, seismic and meteorological stability, access to roads and airports, availability of energy and telecommunications and even the prevailing political environment.

Once a site is secured, the data center architecture can be designed with attention to the mechanical and electrical infrastructure, as well as the composition and layout of the IT equipment. All of these issues are guided by the availability and efficiency goals of the desired data center tier.

Energy consumption and efficiency

Data center designs also recognize the importance of energy efficiency. A simple data center may need only a few kilowatts of energy, but an enterprise-scale data center installation can demand tens of megawatts or more. Today, the green data center, which is designed for minimum environmental impact through the use of low-emission building materials, catalytic converters and alternative energy technologies, is growing in popularity.

Organizations often measure data center energy efficiency through a metric called power usage effectiveness (PUE), the ratio of the total power entering the data center to the power used by the IT equipment alone. However, the rise of virtualization has since allowed for much more productive use of IT equipment, resulting in much higher efficiency, lower energy use and lower energy costs. Metrics such as PUE are no longer central to energy efficiency goals, but organizations may still gauge PUE and employ comprehensive power and cooling analyses to better understand and manage energy efficiency.
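For illustration, PUE can be computed directly from two measurements: the total power drawn by the facility and the power consumed by the IT equipment alone. The Python sketch below shows that calculation; the wattage figures are hypothetical sample readings, not data from any real facility.

```python
def power_usage_effectiveness(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power.

    A value of 1.0 would mean every watt entering the facility reaches the
    IT equipment; real-world values are higher because of cooling, lighting
    and power-conversion losses.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw


# Hypothetical readings: 1,500 kW entering the facility, 1,000 kW used by IT gear.
print(power_usage_effectiveness(1500, 1000))  # 1.5
```

A lower PUE indicates that a larger share of incoming power is doing useful computing work rather than running supporting infrastructure.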

Data center security and safety

Data center designs must also implement sound safety and security practices. For example, safety is often reflected in the layout of doorways and access corridors, which must accommodate the movement of large, unwieldy IT equipment, as well as permit employees to access and repair the infrastructure. Fire suppression is another key safety area, and the extensive use of sensitive, high-energy electrical and electronic equipment precludes common sprinklers. Instead, data centers often use environmentally friendly chemical fire suppression systems, which effectively starve a fire of oxygen while mitigating collateral damage to the equipment. Since the data center is also a core business asset, comprehensive security measures, like badge access and video surveillance, help to detect and prevent malfeasance by employees, contractors and intruders.

Data center infrastructure management and monitoring

Modern data centers make extensive use of monitoring and management software. Data center infrastructure management (DCIM) tools allow IT administrators to remotely oversee the facility and equipment, measure performance, detect failures and implement a wide array of corrective actions without ever physically entering the data center room.
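As a rough illustration of the kind of automated check such monitoring performs, the following sketch polls hypothetical sensor readings and flags anything outside an acceptable range. The sensor names and thresholds are assumptions chosen for the example, not part of any particular DCIM product.

```python
# Minimal sketch of a threshold-based monitoring check, assuming sensor
# readings arrive as a dictionary of sensor name -> measured value.
THRESHOLDS = {
    "inlet_temp_c": (18.0, 27.0),   # acceptable server inlet temperature range
    "humidity_pct": (20.0, 80.0),   # acceptable relative humidity range
}

def check_readings(readings: dict[str, float]) -> list[str]:
    """Return human-readable alerts for any out-of-range readings."""
    alerts = []
    for name, value in readings.items():
        low, high = THRESHOLDS.get(name, (float("-inf"), float("inf")))
        if not low <= value <= high:
            alerts.append(f"{name}={value} outside range [{low}, {high}]")
    return alerts

# Hypothetical poll of one rack's sensors: the temperature reading triggers an alert.
print(check_readings({"inlet_temp_c": 31.2, "humidity_pct": 45.0}))
```

Real DCIM suites layer dashboards, trend analysis and automated remediation on top of checks like this one.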

The growth of virtualization has added another important dimension to data center infrastructure management. Virtualization now supports the abstraction of servers, networks and storage, allowing computing resources to be organized into pools without regard to their physical location. Administrators can then provision workloads, storage instances and even network configurations from those common resource pools. When administrators no longer need the resources, they can return them to the pool for reuse. All of these actions can be implemented through software, giving traction to the term software-defined data center.
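Conceptually, pooled resources behave like a shared inventory that workloads draw from and later return to. The short sketch below models that allocate-and-release cycle for a pool of virtual CPUs; it is a simplified illustration of the idea, not how any specific virtualization platform implements it.

```python
class ResourcePool:
    """Toy model of a pooled resource (e.g., virtual CPUs) that workloads
    can be provisioned from and later returned to for reuse."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.allocated = 0

    def provision(self, amount: int) -> bool:
        """Reserve capacity for a workload; fail if the pool is exhausted."""
        if self.allocated + amount > self.capacity:
            return False
        self.allocated += amount
        return True

    def release(self, amount: int) -> None:
        """Return capacity to the pool when the workload no longer needs it."""
        self.allocated = max(0, self.allocated - amount)


pool = ResourcePool(capacity=64)   # e.g., 64 vCPUs spread across several hosts
print(pool.provision(16))          # True: a workload receives 16 vCPUs
pool.release(16)                   # the capacity goes back to the pool for reuse
```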

Data center vs. cloud

Data centers are increasingly implementing private cloud software, which builds on virtualization to add a level of automation, user self-service and billing/chargeback to data center administration. The goal is to allow individual users to provision workloads and other computing resources on demand, without IT administrative intervention.
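Chargeback generally means metering what each user or department consumes and billing it back at an agreed rate. The snippet below is a minimal, hypothetical example of that calculation; the resource types and per-unit rates are invented for illustration, and real private cloud platforms track far more detailed usage and pricing rules.

```python
# Hypothetical per-unit rates for metered resources.
RATES = {"vcpu_hours": 0.03, "gb_ram_hours": 0.004, "gb_storage_days": 0.002}

def chargeback(usage: dict[str, float]) -> float:
    """Bill a department for its metered resource consumption."""
    return sum(RATES.get(resource, 0.0) * amount for resource, amount in usage.items())

# One department's usage for the month.
print(round(chargeback({"vcpu_hours": 2000, "gb_ram_hours": 8000, "gb_storage_days": 15000}), 2))  # 122.0
```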

It is also increasingly possible for data centers to interface with public cloud providers. Platforms such as Microsoft Azure emphasize the hybrid use of local data centers with Azure or other public cloud resources. The result is not an elimination of data centers, but rather, the creation of a dynamic environment that allows organizations to run workloads locally or in the cloud, or to move those instances to or from the cloud as desired.