Where should Internet and consumer technology companies draw the line when it comes to censoring or banning content?
The issue was thrust into the foreground this week when Wikileaks moved to Amazon’s Elastic Compute Cloud (EC2) service to protect itself from a series of distributed denial of service (DDoS) attacks. After widespread criticism, including from U.S. Senate Homeland Security Committee Chairman Joe Lieberman, Amazon announced that it would refuse to host Wikileaks.
The Wikileaks organization exists to make secret, confidential or sensitive information public. It’s in the news this week after beginning the process of exposing more than a quarter of a million U.S. State Department cables.
Opinion on Wikileaks is all over the map; some are calling for the arrest of its founder and others say the organization is good for democracy.
The nature of the content exposed by Wikileaks is not universally deplored. It’s a point of controversy, and Amazon has now taken a side. Is that what a public hosting company should do?
On what basis did Amazon refuse to host Wikileaks? Regardless of where you stand on Wikileaks itself, is there some objective standard by which a theoretically neutral hosting company like Amazon can decide what to host and what not to host?
As far as I can tell, hosting Wikileaks isn’t illegal. The government didn’t force Amazon to turn them away; it only requested it.
Should companies ban or censor content whenever someone in the government requests it? Should they have committees to determine for themselves what threatens national security or public safety — or even decency?
Although the idea of a private company censoring content appears controversial, it really isn’t. There is an overwhelming consensus in the public sphere that all content companies should censor. For example, you almost never hear support for the idea that companies should allow child pornography. Even 4chan doesn’t support that.
Disagreement exists over where the line is, not over whether or not there should be a line.
So if we expect companies to censor content, on what basis should they do it? Safety? National security? Morality? Protect children but not adults?
Consider other high-profile questions.
Terrorism experts say that web sites now serve as the top source of recruitment and “education” for violent extremists. The majority of these sites are hosted in the United States by American companies.
One recent example in the news: A Canadian extremist, now hiding abroad from Canadian and international authorities, broadcasts calls for violence and even genocide on a Web site called 1st-amendment.net, which is based in the United States.
Should this be shut down? It’s easy to say “yes,” but on what basis? Public safety? Hate speech? If so, there’s widespread disagreement over what constitutes both.
Mothers Against Drunk Driving, for example, might consider alcoholic beverage ads a far greater threat to public safety than terrorism. Statistically, they’ve got a point. And by what yardstick can a company determine “hate speech”?
Apple is often criticized for banning iOS apps that serve up content. Apple censors based on a very wide variety of unpublished criteria, which may include sexuality, hate speech, support for competing platforms and even “taste.” (Apple even recently banned a magazine app for Android enthusiasts.)
It’s not at all clear how Apple determines what’s OK and what isn’t. But how should they determine it? Should they allow just about anything, even if that means what they would consider a degraded “experience” for users or an advantage for competitors, and even if a majority of users want them to draw the line?
Facebook allows all kinds of photos that many would consider degrading to women, or even soft-core pornography, but has an ironclad ban on pictures of mothers breastfeeding their babies.
These facts raise the question of whether Facebook’s censorship policy is a direct reflection of its skewed employee demographic, which heavily favors the young and male. If Facebook’s CEO were a 45-year-old mother instead of a 26-year-old frat boy, for example, the company might choose to ban degrading photos of women and allow pictures of infant breastfeeding.
Should companies establish censorship policies that reflect their own internal values rather than the values of society at large? Whose values should determine these policies?
In the blockbuster Xbox game Call of Duty: Black Ops, players can add patches and images to various objects in the game. Microsoft specifically bans the use of the swastika, the official symbol of the Nazi Party in Germany. The symbol has become associated with the Holocaust and, by extension, with anti-Jewish, white supremacist hate speech.
How should Microsoft censor symbols? By committee? Policy? As Microsoft Director of Xbox LIVE Policy and Enforcement Stephen Toulouse wrote on his blog, the anti-swastika policy is “not political correctness, it’s fundamental respect.”
That’s easy to say, but how is “respect” determined in less clear-cut cases? Should companies like Microsoft ban everything anyone complains about?
It’s clear that banning swastikas is good policy. What’s less clear is: Where is the line, and how is that line determined? Should Confederate flags be allowed? Soviet hammer-and-sickle flags? Pirate flags?
It’s also clear that censorship is something most people believe private companies should do. But right now there are no processes or standards by which they can do it. As a result, each company is left on its own to make important censorship decisions with wildly varying degrees of arbitrariness.
Maybe we need censorship policies to be issued by something equivalent to standards bodies for technology. Maybe there should be industry-wide working groups that get together and haggle over what kind of content should be allowed, and what should not be.
That way, at least, we’d have some objective metric against which to judge the performance of companies on what they ban and what they allow.
Or maybe they should just allow everything.
Let’s hear your thoughts in the comments area!