This is why Microsoft Windows is threatened by thousands of forms of malicious software while specialized supercomputers are threatened by relatively few. It is also why, as adoption of the Internet grew and the number of nodes increased, the Internet as a whole became a threat vector. Then, as multiple popular systems began to coalesce, they too became threat vectors.
From a simple cost/benefit standpoint, why would a hacker work hard on an attack that can compromise only a small handful of obscure machines when one attack can compromise thousands, or even tens of thousands, of systems globally, systems that can then be used to enable additional exploits or to quietly collect data?
The fact that the Internet, the largest network on the planet, has an extremely large number of active threat vectors is best evidenced by the constant news of security problems and by the awareness that some nodes add only negative value to the Net. When these predatory hosts are used to compromise hosts that do add value, the total value of those hosts is lost to their sponsoring organizations, and to society, for the duration of the initial breach (assuming it is detected). On top of that, tremendous costs, both accounting and economic, are incurred in restoring the systems and in purchasing, implementing, and maintaining countermeasures. These costs play havoc with potential value models because they create equations with multiple unknown variables that cannot be readily solved.
For many organizations, connecting to the Internet and having email and Web capabilities are simply viewed as a cost of doing business, one that can be readily tallied each month from vendor bills. The value proposition, by comparison, is nebulous at best for many homeowners and businesses that are not engaged in commerce on the Internet. As a result, many spend as little as possible on the connection and put in as few controls as possible: they can’t measure the value of the Internet to them, but they can track the costs.
In other words, they know they are spending money, but really don’t know if the benefits merit the costs.
Looking at past events to establish rudimentary risk estimates, individuals and businesses, even large ones, can fall into an “it hasn’t happened to me before” mentality and spend very little on controls such as Internet security, firewalls, antivirus, and antispam. The final nail in the coffin is a fixation on self-interest and an unwillingness to spend personal or organizational funds to protect the Internet, which is a digital commons.
Safeguarding Our Resource
Perhaps the core issue surrounding the Internet is the fact that it is a global public commons much like the environment, albeit a virtual one. As such, the Internet is a resource that needs safeguarding to prevent its misuse and ultimate destruction.
In fact, one can apply the Tragedy of the Commons to the Internet in a number of ways.
First, since people are not held accountable for responsible use, an “anything goes” mentality exists and is perpetuated by a lack of coordinated action by lawmakers worldwide. Second, there are diminishing returns, much as Garrett Hardin pointed out in his classic article on pollution.
With the Internet, each additional node that lacks adequate security or is not operated responsibly yields diminishing, or even negative, returns, and a portion of the network’s total value is lost. How many tens of thousands of zombie hosts are on the Internet right now because of clueless small businesses and homeowners who have no idea what is going on, yet are unknowingly enabling coordinated attacks on high-value targets all over the world? How many viruses are running wild, causing havoc? How much time is wasted, and how much opportunity cost incurred, due to spam?
These example risks, and many more, threaten the real value of the Internet to society.
Because the Internet is a commons and is being exploited, it needs regulation to safeguard both it and society. Adam Smith’s “Invisible Hand” of the market appears to be a relatively effective control for coercing corrective action after the fact, but it fails abysmally until consumer pressure, real or perceived, exists. What Internet disaster will it take for the invisible hand of the market to wake up? Take privacy, for example. It is certainly on everyone’s minds, and it involves issues beyond just the Internet, extending to backup tapes, documentation, multimedia, email, instant messaging, and more. It took losses of hundreds of thousands of records over the course of a few months before anybody woke up and took notice, despite the fact that the danger was well known and privacy “loss” issues had been going on for years before the debacles of 2005.
Governments worldwide must act now, in a coordinated manner, to protect their national economies and security by putting in place a sensible set of baseline security requirements and the enforcement to ensure compliance. Far too much is at stake to keep allowing the invisible hand of the market to whip up fervor and cause another set of inconsistent, vague, fear-driven, and ultimately useless regulations to be enacted.
The Internet has the potential to continue adding value to the global economy, but it and its nodes must be secured, and that requires coordinated global regulation and enforcement. If not, the system will become so saturated with threats and uncoordinated, ad hoc countermeasures that a great deal of the network’s value will be lost and future value suppressed.
For citizens, this means the potential loss of another Library of Alexandria. For corporations, it means accounting costs, opportunity costs, and lost revenue from dealing with security breaches, unreliable service levels, and constantly escalating security and compliance requirements. For nations, there will be constant pressure to protect the economy and national security, pressure that cannot be dealt with effectively given the global nature of the Internet.
In closing, the Internet is amazing and its potential value may well be beyond reckoning, but to pursue that value effectively we must proactively implement globally consistent regulations, with enforcement provisions, to safeguard our future.