
Does data center size matter in virtualization, DCIM decisions?
Big enough for DCIM
Now that data centers are a critical business asset, building management and IT systems management teams must pay close attention to managing the facility with fast response times to problems and changes.
Data center infrastructure management (DCIM) software tools combine facilities management -- such as temperature and humidity monitoring -- with server and storage monitoring and other IT tasks on a single pane of glass. Administrators can provision servers in the optimal load-balanced spot to maintain safe room temperatures, guard against unauthorized data center access, monitor energy efficiency and so on.
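The "single pane of glass" idea can be illustrated with a minimal sketch that checks facility and IT readings against shared limits. The metric names and thresholds here are illustrative assumptions, not any particular DCIM product's API (the temperature bounds follow the commonly cited ASHRAE-recommended inlet range):

```python
# Illustrative limits: (low, high) per metric. Real DCIM tools let
# operators configure these per room, row or rack.
THRESHOLDS = {
    "temperature_c": (18.0, 27.0),   # ASHRAE-recommended inlet range
    "humidity_pct": (20.0, 80.0),
    "rack_power_kw": (0.0, 8.0),
}

def check_readings(readings):
    """Return (metric, value) pairs that fall outside their limits."""
    alerts = []
    for metric, value in readings.items():
        low, high = THRESHOLDS[metric]
        if not low <= value <= high:
            alerts.append((metric, value))
    return alerts
```

A real deployment would feed this from physical sensors and server agents; the point is that one rule set spans both the facility and the IT sides.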
There is no single inflection point that defines when or where DCIM should be deployed.
One meaningful measure of DCIM is work-versus-reward analysis. DCIM adds physical sensors and a layer of software to the environment, which adds to the workload for IT staff. Each business that evaluates DCIM must determine if the benefits of superior capacity planning, faster response to problems, lower energy use and so on outweigh the additional work needed to deploy DCIM sensors and software. Your organization might not be large enough for the additional effort and expenditure to pay off in tangible savings, or you might lack the talent and time in-house to make use of the tool set.
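The work-versus-reward calculation above can be reduced to a simple payback estimate. This is a back-of-the-envelope sketch with hypothetical figures, not a full total-cost-of-ownership model:

```python
def payback_years(annual_savings, deployment_cost, annual_overhead):
    """Years until cumulative savings cover the up-front DCIM cost.

    annual_savings: estimated yearly gains from energy efficiency,
        capacity planning and faster problem response
    annual_overhead: extra staff effort to run the sensors and software
    Returns None when net savings never cover the cost.
    """
    net = annual_savings - annual_overhead
    if net <= 0:
        return None  # the added workload outweighs the benefits
    return deployment_cost / net

# Hypothetical example: $40k deployment, $25k/yr savings,
# $10k/yr extra staff effort -> roughly a 2.7-year payback.
```

If the function returns None, or a payback period longer than the organization's planning horizon, that is a concrete signal the shop may be too small for DCIM to pay off.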
Also consider future cloud or outsourcing initiatives. While DCIM is appropriate for large cloud or hosting providers, organizations that plan to move workloads to cloud or to outsourcing providers generally reduce their dependence on owned data centers. Look for a provider with a detailed DCIM portal.
Virtualization versus data center size constraints
Generally, virtualization lowers space, power and cooling requirements, delaying new data center builds.
Hosting multiple workloads on one server vastly improves hardware use and simplifies tasks like workload migration and data protection. Since virtualization allows one server to do the work of 10, 15 or more physical systems, the number of systems needed to operate an enterprise, even a small business, drops significantly.
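The consolidation arithmetic is straightforward to sketch. The headroom figure below is an assumption; in practice it depends on failover design and growth plans:

```python
import math

def hosts_needed(physical_servers, consolidation_ratio, headroom=0.2):
    """Estimate virtualization hosts to replace a physical fleet.

    headroom reserves spare capacity (default 20%) so a host failure
    or workload spike does not overload the remaining hosts.
    """
    effective = consolidation_ratio * (1 - headroom)
    return math.ceil(physical_servers / effective)

# 150 physical servers at a 15:1 ratio with 20% headroom -> 13 hosts
```

Even with conservative ratios, the host count drops by roughly an order of magnitude, which is what shrinks the space, power and cooling footprint.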
The abstraction layer of virtualization also allows organizations to evaluate outsourcing alternatives like third-party infrastructure as a service (IaaS) or cloud providers. Outsourcing workloads curbs data center growth and can stretch hardware refresh cycles.
Once workloads are consolidated onto virtual servers, the data center facility may be too big. Use containment or partition off areas to reduce the air volume that must be cooled. For long-term results, consider renovating the unused space, or opt for more efficient on-demand builds like containerized data centers.
Source: searchdatacenter.techtarget.com