It’s curious how we incessantly need more. We always seem to need more processors, more storage, more memory, more people, more everything … Why?
As the abundance of any given resource increases, so too, it seems, does the rate at which it is used inefficiently – by the user, the customer and IT alike. There is a very real risk that this inefficiency will overtake and negate any gains made, or even cause losses in excess of the perceived benefits, once utilization problems and management overhead are taken into account.
Way back in the stone age of PC computing (early to mid-1990s) we marveled at Russian programmers who hand-coded and optimized in assembler to make the most efficient use of their relatively limited computing power. For them necessity was the mother of invention – they had to make do with the computing they had.
Today, we pick the lowest-cost route: we use high-level languages that generate relatively unoptimized code, then mask the inefficiency by compensating with inordinately high-performance hardware for processing and storage. We buy ridiculously elaborate software suites with massive hardware and maintenance requirements, and then make productive use of only a fraction of their capacity.
Unrecognized constraints
The drivers all too often are the vendors and their relentless upgrade paths. Think of it in this way – if processor power and data storage have followed exponential growth curves, then why hasn’t our productivity followed a similar curve? We’ve definitely improved, don’t get me wrong, but at nowhere near the same level. Very real and often unrecognized constraints exist between us and our goals.
The point of business isn’t to run the latest and greatest technology – the point is to make money, or, to be more precise – to maximize the return on investment in a sustainable manner.
Whatever we do with computing should be subordinate to achieving our goals – not a goal in and of itself. Are we really moving toward our goals when we add more storage, more processors and version 1001 of some software package? We typically upgrade to avoid the risks of falling off the upgrade path or the support path, or because the hardware is no longer available or has reached a point in its lifecycle where it is likely to fail.
Does risk avoidance by itself move us toward our goal, or are we optimizing in one area at the expense of the overall system? This is fun to ponder (with lots of caffeine) as we realize that an over-optimization of risk mitigation can itself create a risk and must always be tempered by the needs of the business.
Left unchecked, storage and processing requirements could easily expand to consume an unacceptably large portion of an organization’s earnings. It is easy to fill the capacity of a new resource and far harder to manage and optimize the utilization of scarce resources.
In other words, the path of least resistance is to buy additional capacity rather than negotiate the appropriate use of currently near-capacity systems.
Focus on the business
It isn’t a question of “can we expand?” but one of “should we expand – and can we sustain it?” It would certainly make sense to have policies and procedures in place to manage the utilization of capacity before it gets out of control – a simple policy gate like the sketch below is one starting point.
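As a rough illustration, here is a minimal sketch of such a gate in Python. The thresholds, the resource names and the get_utilization() helper are all hypothetical stand-ins for whatever your monitoring stack actually reports; the point is only that an expansion request gets checked against real utilization before money is spent.

    # Minimal sketch of a capacity-expansion policy gate.
    # Thresholds, resource names and get_utilization() are
    # hypothetical illustrations, not a real monitoring API.

    REVIEW_THRESHOLD = 0.70   # below this, manage existing capacity instead
    EXPAND_THRESHOLD = 0.90   # above this, expansion may be justified

    def get_utilization(resource: str) -> float:
        """Hypothetical stand-in for a monitoring query (used/total)."""
        samples = {"san-01": 0.62, "san-02": 0.93}
        return samples[resource]

    def expansion_decision(resource: str) -> str:
        """Apply the policy: reclaim and optimize before buying more."""
        used = get_utilization(resource)
        if used < REVIEW_THRESHOLD:
            return f"{resource}: {used:.0%} used – deny; negotiate appropriate use of existing capacity"
        if used < EXPAND_THRESHOLD:
            return f"{resource}: {used:.0%} used – review; reclaim or archive before expanding"
        return f"{resource}: {used:.0%} used – approve, with full business cost justification"

    for r in ("san-01", "san-02"):
        print(expansion_decision(r))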
As a thought, when you get ready to install the inevitable next multi-petabyte SAN, take time to negotiate service levels that set expectations around business requirements and what IT can deliver. Moreover, make the business customer assist in the cost justification – if it’s truly a business need, they are in a far better position to explain to senior management why the service is needed. That justification must take the total cost into consideration: the hardware, the software and the IT personnel required to care for the system.
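To see why the total cost matters, consider a back-of-the-envelope calculation. All figures below are illustrative assumptions, not vendor quotes; the point is simply that recurring software and personnel costs can dwarf the initial hardware spend.

    # Hypothetical total-cost-of-ownership sketch for a storage expansion.
    # Every figure here is an illustrative assumption.

    hardware = 250_000           # one-time: shelves, controllers, switches
    software = 60_000            # per year: licenses and support
    personnel = 0.5 * 120_000    # per year: half an FTE storage admin
    years = 5

    tco = hardware + years * (software + personnel)
    print(f"{years}-year TCO: ${tco:,.0f}")   # 5-year TCO: $850,000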
This brings our topic full circle. If the people who make up organizations tend to over-consume and use resources inefficiently, then IT needs to work with the business units to understand their requirements, including legal and regulatory requirements, and then act as stewards of the organization, shepherding decisions that maximize productivity while managing risk.
On the one hand, it is very easy to add storage, add processors, get the latest version and add staff – but should we? Where is the point of inflection at which the model is no longer sustainable and productivity or growth begins to decline? At what point does the house of cards collapse?
Inefficiency impacts not just the user’s ability to gain value, but also IT’s ability to sustain effective and efficient management of ever-increasing volumes of complexly integrated resources.