Data modeling and simulation create the foundation for tactics and strategies tied to difficult problems ranging from military action to health care. Stone Ridge recently announced the first billion-cell engineering simulation completed entirely on GPUs. While this breakthrough focused on the petrochemical market and oil fields, the potential it represents for virtually every major industry is nothing short of earth-shattering.
The previous world record for the problem that Stone Ridge and its partners solved was 20 hours, and it required a massive supercomputer. They accomplished the same feat with 30 IBM servers in just over an hour and a half. That is a massive jump in performance and an even bigger reduction in hardware cost, power consumption and complexity.
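For a rough sense of the jump, here is a back-of-envelope comparison of the two runs. The exact timings were not published in detail, so treat the figures as illustrative rather than official:

```python
# Rough, illustrative comparison of the two runs described above.
# ~20 hours on a massive supercomputer vs. "just over an hour and a half"
# (about 92 minutes) on 30 IBM servers.
prior_runtime_hours = 20.0
new_runtime_hours = 92.0 / 60.0

speedup = prior_runtime_hours / new_runtime_hours
print(f"Wall-clock speedup: ~{speedup:.0f}x")  # roughly 13x faster
```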
I spent some time talking to Stone Ridge last week, and I think these results represent what could be a massive disruption in the status quo for modeling and simulation.
The Stone Ridge effort is one of the first big showcases for IBM’s Minsky high-performance computing (HPC) server, which pairs two Power8 CPUs with four Nvidia Tesla P100 GPUs. What makes this server unique is that it is the joint brainchild of IBM and Nvidia. It represents one of the most powerful and targeted efforts to change the way we compute, and it focuses on cognitive computing, a concept both companies have embraced wholeheartedly.
At the heart of these systems is NVLink, which connects the CPU and GPU. IBM worked with Nvidia to optimize performance across this link, resulting in far fewer bottlenecks and conflicts. The solution appears to have come from one company rather than two, and it is here that you find the strongest system advantage.
This unique system has a near clean-slate design focused tightly on the kinds of problems standing in the way of thinking computers. To deal with problems with an increasingly human-like competency, you first have to convert those problems, which are often defined by massive amounts of visual data, into some kind of data model. And these new servers are outstanding at modeling and analyzing data.
Stone Ridge’s Echelon is a high-performance petroleum reservoir simulator. Large oil companies use it to analyze an oil field and plan how to extract the oil most cost-effectively. With oil prices falling, the need to contain costs while increasing yields from these fields has never been greater. Data input for the test came from 1,000 wells in a Middle East carbonate oil field, and the simulation used 1.01 billion cells.
Before Stone Ridge could get this phenomenal performance, it did have to modify Echelon to favor GPU-based analysis and to fully utilize NVLink. What I find particularly interesting is that much of the performance advantage actually came from memory bandwidth, which shouldn’t be surprising given the massive amount of data being analyzed. Nvidia’s P100 reportedly has nine times the memory bandwidth of Intel’s current Xeon. The solution architecture also contributes to the Minsky systems’ advantage through a massive reduction in complexity: solving this kind of problem on a comparable Intel system requires dividing it into 360 domains, while the Minsky, using Power and GPUs, needs only four. According to the companies, one Minsky node running Echelon performs in line with 18 Xeon server nodes.
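To put those numbers side by side, here is a small illustrative sketch. The per-device bandwidth figures are approximate published peaks (roughly 720 GB/s for a Tesla P100’s HBM2 versus roughly 77 GB/s for a four-channel DDR4 Xeon socket), not figures from the announcement, and the decomposition arithmetic is simplified:

```python
# Illustrative arithmetic only -- not Stone Ridge's actual decomposition scheme.
total_cells = 1_010_000_000  # 1.01 billion cells in the test model

for label, domains in [("Intel Xeon cluster", 360),
                       ("IBM Minsky (Power8 + P100)", 4)]:
    cells_per_domain = total_cells / domains
    print(f"{label}: {domains} domains, ~{cells_per_domain:,.0f} cells per domain")

# Approximate peak memory bandwidths (assumed figures, see note above).
p100_bw_gbs = 720   # Tesla P100 HBM2
xeon_bw_gbs = 77    # four-channel DDR4 Xeon socket
print(f"Memory-bandwidth ratio: ~{p100_bw_gbs / xeon_bw_gbs:.1f}x")  # ~9x
```

In a typical domain-decomposed simulation, fewer domains means fewer inter-domain boundaries to exchange and synchronize each time step, which is where much of that reduction in system complexity comes from.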
In the end, doing this kind of simulation on a supercomputer is simply unaffordable for most firms managing oil fields, particularly now with revenues and profits falling sharply. However, using the IBM- and Nvidia-based Minsky server approach, it was both viable and affordable, particularly when paired with Stone Ridge’s Echelon product.
One of the big components of the coming wave of smart cognitive computers is simulation, increasingly at massive scale. It is this scale of simulation that will help future top executives, political leaders, generals, lawyers, doctors and scientists find the answers they need to advance their organizations’ agendas. It will also form the basis for future cognitive computers to provide the focused assistance that allows the most successful of these folks to scale to levels and scope we can’t now imagine.
This announcement showcases how far we have advanced in our ability to model and analyze massive amounts of data. We can only hope these tools, as in this case, will be used for us and not against us.