IBM had earlier announced they were pivoting Summit, the most powerful supercomputer in the world, to fight Covid-19. The result: working with Oak Ridge National Laboratory and the University of Tennessee, they have been able to screen 8,000 compounds to find out which ones were likely to mitigate Covid-19.
They were looking for compounds that could best bind to the coronavirus’s central “spike” protein, making it unable to infect host cells. In effect, they were looking for compounds that would render the virus ineffective against anyone medicated with one of them. The effort surfaced 77 promising small-molecule drug compounds that could potentially make Covid-19 wholly or partially ineffective. These compounds are now undergoing testing.
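To make the screening idea concrete, here is a minimal Python sketch of that kind of virtual screen: score every candidate compound against the target protein and keep the strongest predicted binders. The estimate_binding_affinity function is a hypothetical stand-in for the docking simulation a machine like Summit actually runs; here it returns a dummy score so the sketch executes end to end.

# Hypothetical sketch of a virtual screen, not IBM's actual pipeline.
def estimate_binding_affinity(compound: str, target: str) -> float:
    """Stand-in for the real docking simulation, which models how the
    compound binds to the spike protein (lower score = tighter binding)."""
    return float(hash((compound, target)) % 1000)  # dummy score

def screen(compounds: list[str], target: str, keep: int = 77) -> list[str]:
    # Rank all candidates by predicted binding strength and keep the
    # best ones for laboratory follow-up testing.
    ranked = sorted(compounds, key=lambda c: estimate_binding_affinity(c, target))
    return ranked[:keep]

candidates = ["compound-%d" % i for i in range(8000)]
shortlist = screen(candidates, target="SARS-CoV-2 spike")
print(len(shortlist))  # 77 candidates sent on for testing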
This week IBM announced they were adding 16 more systems, for a total of 330 petaflops, 775,000 CPU cores, and 34,000 GPUs, in collaboration with the White House Office of Science and Technology Policy and the US Department of Energy, through a massive technology pool. This pool includes Lawrence Livermore National Laboratory (LLNL), Argonne National Laboratory (ANL), Oak Ridge National Laboratory (ORNL), Sandia National Laboratories (SNL), Los Alamos National Laboratory (LANL), the National Science Foundation (NSF), NASA, the Massachusetts Institute of Technology (MIT), Rensselaer Polytechnic Institute (RPI), and multiple other leading technology companies.
This move is the kind of effort you typically only see during a massive-scale war, and it is certainly indicative of how seriously some of the major technology players are taking this crisis.
This is the first time this level of computational power has been applied to a single problem.
Now, a few years back, even if we had had this kind of computational power available, it would have been challenging to rapidly apply it to something it wasn’t designed to do.
These systems were generally designed to tackle massive problems: predicting weather patterns globally, modeling the interactions between galaxies, and attempting to discover dark matter. You typically can’t take something designed uniquely for one class of problem and pivot it quickly to another.
However, over the last decade, IBM has largely re-engineered how these huge systems function, driven by the need for them to handle a wide variety of problems so that their massive cost can be spread across more entities.
Even the way we connect systems like this into a solution has undergone a rather revolutionary change. It used to be that when you had various systems working with unstructured data, all of which needed to be collated to find an answer, you’d have to re-analyze the data from scratch with whatever computational tool you had.
But now the process is to let the various systems do the analysis they were designed for and then have a master system analyze the results, which cuts a massive amount of time out of finding an answer and reduces the computational power needed to arrive at a solution.
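In programming terms this is a scatter/gather pattern: each system reduces its own shard of raw data to a small summary, and only those summaries travel to the master for the final analysis. A minimal Python sketch of the idea, with analyze_shard standing in hypothetically for the analysis each system was designed for:

# Scatter/gather sketch of the process described above.
from concurrent.futures import ProcessPoolExecutor

def analyze_shard(shard: list[float]) -> dict:
    # Each system reduces its raw shard to a tiny summary locally,
    # rather than shipping the raw data to the master.
    return {"n": len(shard), "total": sum(shard)}

def master_analysis(summaries: list[dict]) -> float:
    # The master only combines the small summaries, which is where
    # the time and compute savings come from.
    n = sum(s["n"] for s in summaries)
    return sum(s["total"] for s in summaries) / n

if __name__ == "__main__":
    shards = [[float(i * 1000 + j) for j in range(1000)] for i in range(8)]
    with ProcessPoolExecutor() as pool:
        summaries = list(pool.map(analyze_shard, shards))
    print(master_analysis(summaries))  # global mean, no raw data re-read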
Now, if you combine the massive additional computational power with this change in process, you get a multiplicative impact: the combination should produce far more reliable results considerably faster. One of the things I expect they’ll be looking at is easier-to-determine early markers of illness that can be tested at scale.
For instance, we know that one of the first symptoms is a loss of taste. Assuming everyone could be tested while well to set a baseline, you should be able to rapidly test a lot of people for a loss of taste with a test focused on just that.
Anyone who couldn’t taste what they tasted during the baseline would immediately be flagged as a potential carrier and removed until more detailed testing could be done (and that testing time is dropping rapidly as well). The combination of a rapid, cheap test and a quick, effective quarantine process could get nations back to work again, avoiding much of the massive financial crisis we currently anticipate.
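The flagging logic itself would be simple once baselines exist; the hard parts are the test and the logistics. A minimal Python sketch, where the 0.7 tolerance is an arbitrary illustrative threshold rather than a clinical value:

# Hypothetical sketch of the baseline-comparison screen described above.
def flag_for_testing(baseline: dict[str, float],
                     current: dict[str, float],
                     tolerance: float = 0.7) -> list[str]:
    # Flag anyone whose taste score drops well below their own healthy
    # baseline as a potential carrier needing detailed follow-up testing.
    return [person for person, score in current.items()
            if score < baseline[person] * tolerance]

baseline = {"alice": 0.90, "bob": 0.80, "carol": 0.95}
today = {"alice": 0.88, "bob": 0.30, "carol": 0.92}
print(flag_for_testing(baseline, today))  # ['bob'] -> quarantine and retest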
IBM is one of several companies stepping up sharply to address this crisis. IBM’s approach begins with what they do best, the analysis of unstructured data at scale, and their engagement increases the likelihood that we can all get back to something approaching a normal life this year, before the economic damage becomes the bigger problem.
This use will also begin to establish best practices for the next pandemic, and there will be a next one, hopefully ensuring that this pandemic is the last to cause this level of global damage. It is also a showcase of how technology can be rapidly and effectively repositioned to address tactical problems of massive scale, making these applications critical for dealing with the many anticipated threats we will face as a race going forward.