You don’t need a Nobel prize for Economics to realize that the world’s economies are facing a slowdown or recession head-on. And it doesn’t take a genius, or a large leap of logic, to work out that your data center’s budget is likely to face a cut.
Whether you have an inkling a cut is coming or no warning has reached you yet, establishing a course of action to cut costs now would be a wise move, according to Ken McGee, a vice president and Fellow at Gartner.
As far back as last year Gartner was warning about the need to prepare for a recession. Since then, things have obviously changed for the worse. “Since that time, the factors we based the research on — such as GDP growth projections and expert predictions for the likelihood of a recession — have worsened to a degree that convinces us it is now time for clients to prepare for cutting IT costs,” McGee said in January.
McGee recommends dedicating top staff exclusively to investigating IT cost-cutting measures, and appointing a senior auditor or accountant to the team to provide an official record of the team’s performance. He also recommends reporting progress to senior managers on a weekly basis and identifying a liaison with a legal representative to make it easier to work through legal issues that may crop up in connection with maintenance and other contracts or penalty clauses. This is to ensure cost-cutting measures don’t result in increased legal liabilities for your company.
So, having established that now is the time to take measures to help the data center weather a recession, the question is where should you look to cut costs?
One of the most significant data center costs is electricity — for powering both the computing equipment and the systems used to provide cooling. Virtualization can play a key role in reducing overall electricity consumption, as it reduces the number of physical boxes needed to power and cool.
A single physical server hosting a number of virtual machines can replace two, three or sometimes many more underutilized physical servers. Although a physical server working at 80 percent utilization uses more electricity than one working at 20 percent, it is still far more energy-efficient than running four servers at 20 percent along with the accompanying four disk drives, four inefficient power supplies, and so on.
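The arithmetic behind that claim is easy to sketch. The snippet below uses a simple linear power model with illustrative wattage figures (they are assumptions, not measurements from any particular hardware) to compare one consolidated host at 80 percent utilization against four hosts at 20 percent:

```python
# Rough comparison of power draw: one consolidated server vs. four
# underutilized ones. IDLE_WATTS and PEAK_WATTS are illustrative
# assumptions, not figures for any specific hardware.

IDLE_WATTS = 120   # assumed draw of a server sitting idle
PEAK_WATTS = 300   # assumed draw at 100% utilization

def server_watts(utilization):
    """Linear power model: idle draw plus a utilization share of the active range."""
    return IDLE_WATTS + (PEAK_WATTS - IDLE_WATTS) * utilization

consolidated = server_watts(0.80)        # one host at 80%
spread_out = 4 * server_watts(0.20)      # four hosts at 20% each

print(f"One server at 80%:   {consolidated:.0f} W")   # 264 W
print(f"Four servers at 20%: {spread_out:.0f} W")     # 624 W
print(f"Savings:             {spread_out - consolidated:.0f} W")  # 360 W
```

Even under this toy model, the consolidated host draws less than half the power of the four it replaces, before counting the extra disks, power supplies, and cooling the four boxes would need.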
Virtualization also shrinks costs by reducing the amount of hardware that must be replaced. If you operate fewer servers, you have fewer to replace when they reach the end of their lives. Thanks to advanced virtual machine management software from the likes of Microsoft and VMware, the time spent setting up and configuring virtual machines (and thus the associated cost) can be much less than that spent managing comparable physical servers.
And virtualization need not be restricted to servers. What’s true of servers is true of storage systems, too: Storage virtualization can cut costs by reducing over-provisioning and reducing the number of disks and other storage media that must be powered (and cooled), bought and replaced.
This leads to the concept of automation. Data center automation can take a vast amount of investment, but it also promises significant cost savings. In a time of recession it’s prudent to look at initiatives that carry a modest price point and offer a relatively fast payback period. These may include patch management and security alerting (which in turn may enable lower-cost remote working practices) and labor-intensive tasks, such as password resets. Voice authentication systems, for example, can dramatically reduce password reset costs in organizations that have large numbers of employees calling the IT help desk with password problems. Such systems automatically authenticate the user and reset relevant passwords.
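The core logic such a system automates is straightforward. The sketch below is a minimal illustration, not any vendor’s product: the directory call is a hypothetical stand-in (shown as a comment), and the voice-verification step is represented by a flag that a real system would set after checking the caller’s voiceprint.

```python
# Minimal sketch of automated password-reset logic of the kind a
# voice-authentication system performs. The directory update is a
# hypothetical placeholder, not a real API call.
import secrets
import string

def generate_temp_password(length=12):
    """Build a random temporary password from letters and digits."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def reset_password(user_id, voice_verified):
    """Reset only after the caller's voiceprint has been verified."""
    if not voice_verified:
        raise PermissionError("caller failed voice authentication")
    temp = generate_temp_password()
    # directory.set_password(user_id, temp, must_change=True)  # hypothetical call
    return temp
```

The point of the example is the shape of the saving: once authentication and the reset itself are automated, no help desk operator needs to be in the loop for the most common category of call.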
Any automation software worth its salt also has the added benefit that when it reduces the number of man-hours spent dealing with a task, managers have the flexibility to choose between reducing data center human resource costs and reassigning employees to other tasks, including implementing further cost-cutting systems — thereby creating a virtuous circle.
A more straightforward, but contentious, strategy is application consolidation. Clearly the more applications your data center runs, the more complex and expensive it will be to manage them. Thus, consolidating on as few applications as possible makes good financial sense, assuming, of course, the apps are up to the required task. If these are open source applications, which in practice probably means Linux-based ones, then there’s a potential for significant savings in terms of operating system and application license fees, and client access licenses (CALs).
Bear in mind that significant support costs will remain, and Microsoft and other large vendors make the case that the total cost of ownership of open source software is no lower than closed source, but at the very least, you may be able to use open-source alternatives as bargaining chips to get a better deal from your existing closed source vendors.
As well as looking at changes that can be made at the micro level, it’s also useful to look at the macro level at the way your whole data center operations are structured. For example, you may have set yourself a target of “the five nines” for system availability, but it’s worth evaluating if this is really necessary. How much would it reduce your costs to ease this target to 99.9 percent? And what impact would it have on the profitability of the business as a whole?
If you can identify only a few applications that require 99.999 percent uptime, it’s important to consider if your data center is the best place from which to provide them. A specialized application service provider may be able to provide this sort of reliability at a lower cost for a fixed, per user fee, with compensation if they fall below this service level. It certainly doesn’t make sense to provide more redundancy than you need: That’s simply pouring money down the drain.
Also consider whether your data center is operating longer hours than necessary. Thanks to the power of remote management tools, you may find it makes more sense financially to leave it unmanned at certain times, while having a number of staff “on call” to sort out problems remotely, should the need arise.
Finally, it’s worth mentioning best practice IT management frameworks like the IT Infrastructure Library (ITIL) and Microsoft Operations Framework (MOF). Aligning operations to these frameworks is a medium- to long-term project, but they are intended to ensure that all IT services, including those associated with the data center, are delivered as efficiently as possible.
If you can achieve that, you are a long way down the path to ensuring your data center can endure any slowdown the economy can throw at it — not just this time, but the next time, and the time after that.
This article was first published on ServerWatch.com.