Blunders. We all make them. With the exception of technology journalists – who are wise, mistake-free individuals – all people and companies commit serious snafus. They simply mess up, big time.
Fortunes are lost, wrong paths are taken. Minor concerns are treated as major problems, while looming disasters are disregarded. Visionary CEOs turn out to be blithering idiots (the most common source of blunders).
The field of technology is particularly susceptible because change is so constant. When the world turns upside down every 18 months, true foresight is required to look smart. No one can see ahead all the time.
The following list, then, is only a partial account. But of the many missteps, mess-ups, miscalculations and outright step-in-the-doodoo blunders, these are some of the choicest.
1) The Apple OS Decision
The Blunder:
Apple refused to sell its OS separately from its hardware, forever consigning itself to a tiny market share.
What Happened:
In the late 1970s the personal computer business was wide open, a veritable desktop Wild West. Nobody knew who would emerge as the dominant player.
For a period, Tandy’s fearsome hot rod of a PC, the TRS-80, looked like a winner. It had a deluxe cassette back-up system and its own word processing software, Scripsit, which enabled you to set your own text margins.
The Commodore PET, with its stylish black-and-green monitor and rugged metal case, included a built-in tape back-up system, the Datassette. Its mini-sized “chiclet keyboard” was hard to use, but at least it came with a keyboard, unlike some systems.
An early front-runner, Apple, launched the Apple II and soon thereafter, the improved Apple II+. It was easy to use – not just for techies – and boasted attractive color graphics and a hot spreadsheet program, VisiCalc. Apple used an open architecture; its many slots allowed you to attach third party gear like memory extenders or graphic cards.
Its architecture was so open that by 1980 various manufacturers were selling Apple clones – a move Apple hired lawyers to quash. The only hardware that could run the Apple OS would be made by Apple, thank you very much.
The tech world’s 5,000-pound gorilla, IBM, recognized a lucrative market when it saw one, and entered the PC market with its full weight. Seeing the competition from Apple, IBM opted for a similarly open architecture. In fact, it was even more open – the company actually published its BIOS specifications. The IBM PC, released in 1981 with an awe-inspiring 640KB of memory (if fully loaded), was a huge success.
Over the next several years, IBM’s decision to opt for open architecture defined the PC industry. But to IBM’s chagrin, the open hardware specs allowed companies like Compaq and Dell to sell clones, boxes that got cheaper and cheaper. A generation of PC buyers realized something: buy an inexpensive machine (pre-installed with Windows) and you were ready to roll. Who needed IBM?
For a brief moment as the clone market was zooming upward, Apple had a chance to license its OS and continue to be a top player. (In 1984, Apple’s annual sales of $1.5 billion dwarfed tiny Microsoft’s $98 million.) But, having sealed its fate with its anti-clone stance, Apple was left behind. Later, it realized the mistake and briefly allowed clones; but the time had passed.
The company’s original anti-clone decision was an expensive one. In April 2007, by one count, Apple had a whopping five percent share of the personal computing business.
Moral of the Story:
If you have a choice between the software and the hardware business, it’s usually more lucrative to choose the software business.
2) That “Linux is a Cancer” Remark
The Blunder:
In a horrible case of foot-in-mouth that revealed deep fear behind a thin mask of disdain, Microsoft CEO Steve Ballmer compared Linux to cancer. Ouch!
What Happened:
In 2001, Microsoft was feeling embattled. The year before, in the case United States v. Microsoft, a federal court handed down a judgment declaring the company to be an “abusive monopolist.” There was talk of breaking the corporation into smaller, component parts.
Perhaps just as bad, the Linux OS was gaining server market share more quickly than expected. An IDC study in early 2000 found that Linux had already grabbed the No. 2 operating system spot, with a 25 percent share. (Windows NT was on top with 38 percent.) Yet IDC had earlier forecast that Linux wouldn’t earn the No. 2 berth until 2002 or 2003.
In 2000, IBM very publicly announced it would spend an impressive $1 billion on Linux in 2001. (In early 2002 the company crowed it had nearly recouped its investment, a claim that analysts questioned.)
Apparently all these developments left CEO Steve Ballmer hot under the collar. In the spring of 2001, he told the Chicago Sun-Times that “Linux is a cancer that attaches itself in an intellectual property sense to everything it touches.”
Digging himself further into a hole, he claimed, “The way the license is written, if you use any open-source software, you have to make the rest of your software open source.” However, the GNU General Public License says no such thing: its copyleft terms apply only to derivative works, not to every piece of software that happens to run alongside open source code – a distinction Ballmer was presumably familiar with.
Then came the kicker. Making his comments seem truly absurd was an agreement Microsoft signed with Novell in 2006. The headline: “Microsoft and Novell Announce Broad Collaboration on Windows and Linux Interoperability and Support.”
Which raises the question: if it’s a disease, why are you working to make your software interoperable with it?
Moral of the Story:
In the constantly changing tech world, today’s competitor is tomorrow’s partner. In general, it’s not a good idea to equate competing technologies with deadly diseases.
3) AOL Group Madness
The Blunder:
In an act of societal mass delusion, more than 30 million people signed up for America Online. Historians will always wonder why.
What Happened:
America Online took an ingenious approach: in an age before the general public understood the Internet, the ISP moved hyper-aggressively to build a customer base. Morphing from a popular BBS to a major Net onramp in 1989, AOL was there when couch potatoes realized they could stare at the Internet instead of the television.
The strange thing was, for years AOL was an ISP that didn’t allow access to the Internet. Until 1995, AOL users paid substantial dial-up fees (higher than those charged by many other ISPs) to stay corralled in AOL’s closed universe. Instead of surfing the Net, users had to be content browsing through AOL’s homogenized, sanitized offerings.
Yet the public flocked to AOL (full disclosure: I was one of the lemmings). Nothing could keep them away. Not all those busy signals when the system was overloaded, not the dubious customer service, not AOL’s clunky, slow software.
Fully explaining this mass hypnosis is hard to do. The most likely reason: that warmly robotic voice chiming “You’ve Got Mail!,” which made you feel as if someone, somewhere, wanted to communicate with you.
The lemmings, uh, I mean users, kept signing up. At one point, AOL claimed a remarkable 34 million subscribers. It strode the Net like a colossus, attracting the attention of Time Warner in a merger that was itself one of the great business blunders.
Over time, the hallucination wore off. Users realized that they didn’t need AOL to hold their hand as they explored the Web. The subscriber base started falling, but AOL tried desperately to hang on. In 2005, the company paid a $1.25 million fine to the state of New York after getting copious complaints about the difficulty of canceling.
Then came the humiliating incident with a user named Vincent Ferrari, who recorded his painful attempt to cancel AOL – a recording that became an Internet hit.
Now, with a report showing its user base shriveled to 13.2 million, AOL is a free service, attempting to make a buck from ad revenue. Oh, how the mighty do fall.
Moral of the Story:
All that glitters isn’t gold. And an ISP that has to be pressured into allowing its customers onto the Internet probably won’t thrive long term.
4) Google Spends a Silly Amount of Money
The Blunder:
Google buys the Napster of 2006 and gets embarrassed by an Australian teenager.
What Happened:
I know what you’re thinking: Google buying YouTube for $1.65 billion was an absolute steal. When Google bought the 20-month-old Web site, YouTube boasted 35-40 million users (in the U.S. alone). So the site draws more eyeballs than American Idol, without needing to dole out paychecks to Paula, Randy and Simon.
And YouTube’s price tag pales in comparison to some prior media acquisitions. Yahoo paid more than $5 billion for Broadcast.com, @Home picked up Excite for $7.2 billion, and Terra swallowed Lycos for a mere $12.5 billion (gulp).
In this context, GooTube looks brilliant. Especially when you consider that those sites weren’t all revenue powerhouses, in contrast to YouTube, which has a major source of revenue–
Oh wait, you say YouTube doesn’t have a clear revenue source? Well, that’s still okay, because acquiring YouTube kept the video site out of the hands of rivals Yahoo and Microsoft. And with all the great content that YouTube has, monetizing the traffic will be easy–
Oh, what’s that? You say a huge chunk of the content is…copyrighted? Well certainly Google knew that. Even a fast search of YouTube reveals postings of Beatles music, the Harry Potter movies, scenes from Seinfeld, and video from the Super Bowl – which appears to pose a legal liability bigger than the North American continent. So clearly Google had a strategy in mind, probably some revenue sharing deals, maybe a–
Oh, Google has been sued? Don’t worry about it. I mean, c’mon, the $1 billion lawsuit filed by Viacom against Google for copyright violations was described by Google CEO Eric Schmidt as a negotiating tactic. These days, a $1 billion lawsuit is just a way to start a conversation.
And pay no attention to the fact that, were Viacom to win the case (or even settle out of court), a long list of other concerns would step forward to demand their share of the bounty. But at any rate, Google claims it will add a filter to YouTube to eliminate copyrighted material. (But shouldn’t that have been in place months ago?)
Some observers claim that only a small percentage of YouTube’s traffic derives from copyrighted material; other experts dispute that. Whatever the exact case, spending big on a sprawling legal liability, and scrambling to put filtering technology in place after the fact – while fending off legal action – certainly seems to be a huge fumble.
Especially when you realize that an Australian teen sent Google a fake cease and desist order, and the search giant was so nervous it quickly complied with the bogus request. Yikes!
Moral of the Story:
Even big, smart, progressive companies do incredibly dumb things sometimes.
5) SCO Sues Instead of Sells
The Blunder:
In 2003, the SCO Group filed a lawsuit against IBM – for the usual $1 billion – alleging IBM had misappropriated trade secrets by incorporating SCO’s intellectual property into the Linux OS.
What Happened:
At first glance, SCO’s strategy seemed to make sense, at least as a business tactic. The small Utah-based company, which claims to own copyright to a version of Unix, would generate revenue by adopting a feisty, contentious stance with regard to its intellectual property.
Step one: sue a deep-pocketed corporation for a headline-inducing amount of money. The possible benefits were numerous. Deep-pocketed corporations sometimes settle because it’s cheaper just to make it go away. Deep-pocketed corporations sometimes buy your company. And who knows? In a court of law, as in football, any team can win on any given day. Talented lawyers can accomplish amazing things.
But for SCO, the strategy has been the opposite of a winner. Its tactics quickly began to work against it. In May 2003 the company sent letters to Fortune 1000 companies warning them of possible legal action if they used Linux. The problem: SCO is an enterprise IT vendor, and Fortune 1000 companies are the biggest consumers of enterprise IT. In effect, SCO was harassing its potential customers.
The checkered tale of SCO vs. IBM has twisted and turned every which way since it began, spawning the closely related SCO vs. Novell, among other sagas. Over four years it’s had more plot twists than Gone With the Wind.
But the result for SCO, instead of a windfall from litigation, has been a black hole they’ve poured money into. Or, as the company noted in a press release about 2006 Q4 revenue, “Because of the unique and unpredictable nature of the Company’s litigation, the occurrence and timing of litigation-related expenses is difficult to predict, and will be difficult to predict in the future.”
Far worse, in a January 2007 conference call, SCO CEO Darl McBride was forced to address rumors that the company is going bankrupt. He claimed that it isn’t, but as he was quoted in Internetnews, “Let’s face it, it’s not a real pretty picture.”
Moral of the Story:
Always sit down and take several deep breaths before hiring lawyers. Especially when you sue someone with pockets as deep as the Grand Canyon.
6) Microsoft and Security. Oh Goodness.
The Blunder:
Over many years and many releases, Microsoft software has proven vulnerable to a plague of viruses, worms, Trojans, malware and other security snafus, the extent of which would boggle any reasonable mind.
What Happened:
You know what happened – or rather, what happens. Microsoft puts out a new release and, like clockwork, fresh security problems are announced. Go to eSecurityPlanet and you’ll see a constantly refreshed list of Windows viruses – it’s about five to ten per day.
Pity the poor Windows user whose system is not enclosed in a fortress of the latest, greatest, extra-strength security software. Remember the “I Love You” worm? A single worm crafted by a lowly computer student brought down email systems from the CIA to the British Parliament.
In fairness, some security experts note that any OS with a user base as big as Windows’ would necessarily have problems. Its large market share makes it a fat target for legions of script kiddies worldwide (and worse, the fraudsters who make money selling knowledge of vulnerabilities).
But regardless of whether it’s Microsoft’s fault, is there not some way to find virtually all of a program’s holes before it’s released to the public?
Here’s an idea. Prior to release, Microsoft could hire fifty of the world’s top hackers, give them $10,000 a week and all the pizza and Red Bull they can consume. Set them up in a big warehouse in Redmond and turn them loose on the beta version. For every hack they find, give them a $50,000 bonus. At the end of three months, the hacker who’s found the most vulnerabilities gets a $1 million grand prize and is allowed to throw a cream pie at Bill Gates. (Okay, maybe you leave out the cream pie bit – it might not fly with senior management.)
Sure, the scheme would cost Microsoft a few million (the back-of-the-envelope math is below), but when the software was done being punished, it’d be reasonably close to bulletproof. If, for example, Vista had been put through this trial by fire, this hack likely wouldn’t have been discovered post-release. Why is it that an obscure Russian hacker can find something that all the talent in Redmond can’t find?
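For the curious, here is one back-of-the-envelope version of that math – purely hypothetical, built from the figures floated above, with the bug count (NUM_BUGS_FOUND) an outright guess since nobody knows how many holes fifty hackers would actually find:

```python
# Hypothetical cost of the hacker-warehouse scheme described above.
# Every figure except NUM_BUGS_FOUND comes from the proposal; the
# bug count is pure guesswork.

NUM_HACKERS = 50
WEEKLY_PAY = 10_000        # dollars per hacker, per week
WEEKS = 13                 # roughly three months
BONUS_PER_BUG = 50_000     # per vulnerability found
GRAND_PRIZE = 1_000_000    # for the most prolific hacker
NUM_BUGS_FOUND = 40        # assumption only

salaries = NUM_HACKERS * WEEKLY_PAY * WEEKS   # $6,500,000
bonuses = NUM_BUGS_FOUND * BONUS_PER_BUG      # $2,000,000
total = salaries + bonuses + GRAND_PRIZE      # $9,500,000

print(f"Salaries: ${salaries:,}")
print(f"Bonuses:  ${bonuses:,}")
print(f"Total:    ${total:,}")
```

Call it ten million, tops – still pocket change next to the cost of cleaning up after a worm in the wild.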
Moral of the Story:
It’s not enough to ask company programmers to test your software. It’s not even enough to release a beta version to well-regarded professionals. If you want really tough software, you have to allow some real-world poking and prodding prior to release.
7) The Glacial Pace of the Linux Desktop: Sloooooooow
The Blunder:
Even years after the OS earned acceptance in the tech world, a mainstream Linux desktop refuses to take hold.
What Happened:
There are many things in life we know we have to wait for. Vacation. That fat bonus. A low-priced hybrid car. We know these things are coming – or, we think they’re coming – but we just need to be, well, patient. However, in the case of the Linux desktop, the patience required is epic.
Oh, I know the Linux desktop already exists. You can choose from the likes of Feisty Fawn or GNOME. You can download Xandros or any one of these nifty Linux desktop apps. If you have the technical know-how and real gumption, you can certainly use Linux as your desktop OS.
But what about a Linux desktop in the larger world? Leave the rarefied sphere of card-carrying techies, and the Linux desktop is one very rare bird. A report in April 2007 found that Linux accounted for 0.8% of users. That’s probably too conservative; the real number may be closer to 3%.
Still, think about it: Linus Torvalds birthed the Linux kernel in 1991. Now, sixteen years later, we’re still waiting for a mass produced, commercially accepted desktop. It didn’t take Steve Jobs 16 years, nor Bill Gates.
At moments, the wait for Linux to break out of the tech ghetto and enter the mainstream gets discouraging. A recent survey by OpenSUSE.org found – surprise – that 98% of Linux desktop users are male. Call it a rough test of an OS’s reach: if only one gender is using it, it’s still waiting to be discovered.
And here’s a fact that would bruise anyone’s pro-Linux sensibilities. This reporter from DesktopLinux.com, covering the 2006 LinuxWorld show in San Francisco, found that only half the laptop users (based on casual observation) were running Linux. Incredible! At an actual LinuxWorld show – in San Francisco, no less – only half the users were driving Linux. (Does that mean that half were actually running…Windows? At a LinuxWorld show?)
So the question looms larger: after all these years, why is there no mainstream desktop Linux?
There are probably a lot of complicated reasons, but the simple one is this: the big PC makers are highly risk-averse; unless they see a big built-in audience, they won’t pre-install an OS at the retail level. And for 95% of users, if an OS doesn’t come preinstalled, it’ll never be installed at all.
It’s circular logic: A major PC maker will only get behind a Linux desktop when the public clamors for it. But the public isn’t clamoring for it because no big PC maker is promoting it.
Then comes Dell. Responding to thousands and thousands of users who used Dell’s IdeaStorm suggestion box, the company made a momentous announcement: it would launch a desktop running the Ubuntu flavor of Linux. The release is slated for late May.
Just days later, however, came the news: Dell joined the Microsoft-Novell pact, with its very proprietary intellectual property restrictions. In the eyes of many Linux adherents, this was heresy. The snarky messages poured into Dell like flaming spears cast by an army of enraged barbarians. Or, as Techworld delicately put it, the deal “appears to have drained much of the goodwill Dell had fostered among Linux enthusiasts.” Hopes for a popularly accepted Linux desktop seemed dim.
But wait – the mood has changed. Check out this message board full of Linux adherents, eagerly discussing the joys of the forthcoming Dell release. They’re ready to give it their full support – unequivocally. “I’ll order a Dell for my mom when they’re available,” enthused one fellow, who clearly has that old time religion.
So the Linux desktop is really on its way – really. Any day now, it’s going to poke its head shyly into the bright lights of the mainstream PC market. No fooling this time. Just a little bit more patience…
Moral of the Story:
A true believer will wait as long as it takes, even if that’s a long, long time.
8) The Entire Music Player Business Prior to the iPod
The Blunder:
An entire industry let itself be swept under the carpet by a single product – one that was high-priced, introduced late, and prone to breaking.
What Happened:
As Apple’s advertising machinery is happy to tell us, as of April 2007 the iPod had sold over 100 million units worldwide. Yes, this overpriced little piece of plastic with a hard drive is carried proudly by hipsters, tweens and middle-aged wannabes all across the globe.
Now that the iPod has assumed world domination, it’s hard to remember that it was late to the party. This music player wasn’t introduced until 2001. (In fact, it debuted the month after 9/11 – definitely not an auspicious time for a product rollout.)
For years in the late ‘90s, the other music players had the market all to themselves. The Diamond Rio, for example, enjoyed such a high profile that the RIAA targeted it with a 1998 lawsuit over illegal downloading – a suit that failed.
And don’t forget Creative Labs’ Nomad, the RaveMP2300, the I-Jam-100, and the awkwardly-named Eiger Labs MPMan F10. Most impressive was Compaq’s Personal Jukebox 100, which held more than 1,000 songs. All of these players piggybacked on the rabid success of Napster, with its free-for-all approach to the musical smorgasbord.
These device makers vied for an astoundingly lucrative market: young people with significant disposable income, eager to spend it on a plugged-in lifestyle. Capture this audience, and an untold fortune was to be had.
Yet for some reason, these companies couldn’t figure it out. Many of their gadgets were unwieldy, had poorly laid-out user interfaces, and offered little storage. Strictly teenage – or actually, not even hip enough to be teenage.
At first the iPod seemed like just another also-ran – and an expensive one, at an eyebrow-raising $399 at launch.
But then, in a stroke of brilliance, Apple created units that were interoperable between Windows and Mac. Going further, Apple negotiated with labels to create a legal online music store that was easier to use than the first movers’ sites – quickly snagging market share.
And then there’s the coolness factor. Apple turned what had been an ugly duckling into a designer fashion accessory (that also plays music). Buyers seemed not to care about the comparatively hefty price tag as they frantically snapped up the must-have devices. In the ensuing melee, everyone forgot that there were other companies that made music players. Diamond who?
Moral of the Story:
It’s never too late to get in the market, as long as your product is vastly superior to the competition’s.
9) All the Missing Data (Where Does It All Go?)
The Blunder:
The concept of data security, in which sensitive data is held fully protected, has gone out of style.
What Happened:
An epidemic rages across the land. Businesses, it seems, are incapable of securing their information. Staffers bring laptops home, where the machines are promptly stolen. Senior executives leave notebooks in the back seats of cabs. Huge corporations are hacked on a weekly (daily?) basis.
For some reason, companies simply cannot hang on to their data. Like a fifth-grader with a hole in his pocket, they get their lunch money, and…poof! It’s gone.
The phenomenon calls to mind former Sun CEO Scott McNealy’s comment to reporters, regarding privacy issues: “You have zero privacy anyway – get over it.” The quote could be tweaked: “There is no data security anyway – get over it.”
There are stories like this one, in which large companies sell lists of names to telemarketing criminals, who then bilk the elderly. Then there are breaches like retailer TJX’s, in which 45 million credit and debit card numbers were stolen from its IT system over 18 months.
But the list of “data on the move” is endless. For instance:
• Last year, a Cal State, Los Angeles employee’s USB drive was inside a purse stolen from a trunk. It held personal data on more than 2,500 students and program applicants.
• Hertz Global Holdings said that it dropped a prominent financial services company from its underwriting team after several e-mails discussing its $1.5 billion IPO were inadvertently sent to about 175 institutional clients.
• Veterans groups filed a class-action lawsuit against the U.S. Dept. of Veterans Affairs after a laptop was stolen from an employee’s home. It contained Social Security numbers and birth dates for 26.5 million veterans and their spouses. None of the data was encrypted – a lapse that, as the sketch below shows, is trivially avoidable – and the employee had been routinely taking home confidential data for at least three years.
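That last detail is the kicker, because encrypting data at rest is neither exotic nor expensive. Here’s a minimal sketch of the idea in Python, using the third-party cryptography package – purely illustrative, with a made-up record; any comparable library would do:

```python
from cryptography.fernet import Fernet

# Generate the key once and store it separately -- never on the
# laptop that carries the data.
key = Fernet.generate_key()
cipher = Fernet(key)

# What should travel on a laptop: ciphertext, not raw records.
record = b"name=J. Veteran, ssn=000-00-0000, dob=1950-01-01"
encrypted = cipher.encrypt(record)

# A thief who grabs the machine gets gibberish without the key...
print(encrypted[:40])

# ...while the rightful owner recovers the data intact.
assert cipher.decrypt(encrypted) == record
```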
At this late date there’s been so much data lost that it’s no longer clear: is there more data behind secure walls, or out in the wild, bought and sold by scamsters?
Moral of the Story:
Information wants to be free. But not that free.
10) The Costs of Y2K
The Blunder:
The mother of all cost overruns, with the most inflated invoices in the history of technology, occurred at the end of the 1990s.
What Happened:
There will always be a debate as to whether Y2K was a hoax.
In the opinion of some, the anxiety about what would happen when the date changed over to 2000 was overblown – wildly so. The build-up, the saturated media coverage, the companies so focused on it: it was a colossal misunderstanding.
This camp points to the fact that very little happened on January 1, 2000. Not only did large U.S. companies experience no major meltdowns, but schools and small businesses – many of which didn’t prepare – were unaffected. Moreover, many countries that did next to nothing, like China and Russia, saw no rash of major problems. So the Y2K madness was just a hoax, some believe.
On the other hand, many experts note that it is precisely because of the preparation that big U.S. companies had no major problems. If not for the Herculean effort spent overhauling IT systems, all hell would have broken loose by 12:05 AM on January 1, 2000. These experts point to those calling Y2K a hoax and start fuming: “What do you mean it’s a hoax – we solved the problem; that’s why nothing happened!”
Though the argument will rage on, it’s safe to assume that the preparation really did make a big, big difference. Y2K was not a hoax.
Read enough descriptions from honest, knowledgeable programmers about the work they did and you’ll realize the task was real. Many diagnostic tests showed conclusively that systems would have failed if not for remedial action.
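For anyone who never saw the bug up close, its mechanics were almost embarrassingly simple: to save precious memory, decades of software stored years as two digits, so date arithmetic broke the instant “99” rolled over to “00.” Here’s a minimal, purely illustrative sketch in Python – no real system is being quoted:

```python
def years_elapsed(start_yy: int, end_yy: int) -> int:
    """The classic pre-remediation pattern: years stored as two
    digits, intervals computed by naive subtraction."""
    return end_yy - start_yy

# All through the 1990s, this worked fine:
print(years_elapsed(85, 99))   # 14 -- correct

# At the rollover, the year 2000 is stored as 0, and every age
# lookup, interest calculation, or expiry check built on this
# logic goes haywire:
print(years_elapsed(99, 0))    # -99 -- should be 1
```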
However, there is an area in which Y2K really was a folly: the costs.
According to a 1999 estimate by the U.S. Department of Commerce – a government agency, and therefore not prone to inflating predicted costs – Y2K costs would reach $100 billion, and possibly as high as $114 billion. Probably more accurate was the Gartner Group’s calculation of U.S.-based Y2K costs: $150 billion to $225 billion.
Think about it: there are 300 million people in the U.S.; even at the lowest figure of $100 billion, we spent…$333 for every single human across this huge country of ours. The mind staggers.
If that’s not enough, look at worldwide Y2K expenditures. Cap Gemini America estimated $858 billion; the Gartner Group guesstimated $600 billion; and IDC calculated a mere $300 billion. Take the middle figure, $600 billion, divide it by the world population of 6 billion, and you realize – you’d better be sitting down – we spent $100 for every man, woman and child on the entire planet. Wow. A great silence is heard as the audience sits stunned.
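If you want to check that arithmetic yourself, it’s two lines of division, using the article’s round numbers:

```python
# Per-capita Y2K spending, using the round figures cited above.
print(f"U.S.: ${100e9 / 300e6:,.0f} per person")       # $333
print(f"Worldwide: ${600e9 / 6e9:,.0f} per person")    # $100
```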
While the work was definitely needed, there were some people who made a whole lot of money from Y2K.
Moral of the Story:
Cost overruns are commonplace in high end tech projects. But add a dose of fear/anxiety/worry to the project, and the invoice can assume truly oceanic proportions.