
For decades, compute resources used for weather forecasting have tracked with advances in state-of-the-art supercomputing. That is to say, the weather segment demands systems that combine the greatest data ingest and storage capacity with the most powerful processing capabilities. Because the accuracy of daily forecasts and severe-weather warnings depends on high-performance computing combined, increasingly, with artificial intelligence, it is perhaps not surprising that weather-segment IT spending has not been affected by the COVID-19 pandemic. Hyperion Research predicts that it will in fact grow by an astonishing 33 percent between 2021 and 2024, significantly outpacing expected growth for the overall HPC industry.

Spurring this growth is not only the need to predict increasingly volatile, climate change-driven weather events but also the emergence of advanced technologies and technology strategies enabling more effective weather modeling.

As we approach the exascale supercomputing era, characterized by extreme compute power and extreme heterogeneity, it seemed a good time to sit down with a leading weather technologist, Ilene Carpenter, earth sciences segment manager at Hewlett Packard Enterprise, to discuss HPC trends in weather forecasting – particularly the rapid adoption of AI techniques by major weather centers.

“AI has been used in weather forecasting for a long time, but now there’s a resurgence because of advances in machine learning, driven by the availability of massive amounts of data and the power of GPUs,” Carpenter told us. “Weather forecasting centers were among the first supercomputer users. Now, they are combining physical modeling on supercomputers with AI and data-driven approaches to enable better predictions.”

Carpenter explained that HPE’s acquisitions of SGI in 2016 and of Cray two years ago, and the subsequent melding of the three companies’ HPC technologies, have resulted in supercomputing capabilities uniquely suited to combining AI with traditional physics-based simulations – a paradigm that opens a new world of forecasting possibilities.

For example, the National Center for Atmospheric Research (NCAR) next year will deploy a new HPE Cray EX supercomputer, named Derecho. It will be nearly 3.5x faster and 6x more energy efficient than the current HPE SGI 8600 Cheyenne system.

“NCAR needs a supercomputer powerful enough to support both complex physical modeling and machine learning algorithms,” she said, “to improve the understanding of weather and climate, as well as things like seasonal water supply and drought risk. NCAR’s research is also empowering emergency responders with better predictive models of wildfire behavior, so they can save lives and millions of dollars in property damage.”

The system’s capacity to combine HPC and AI, in part via a tool called SmartSim, enables it to support NCAR’s highly demanding needs.

“Each technology, HPC and AI, is dependent on the other because alone, they simply couldn’t get the job done that well,” she said. “We wanted to make the use of this new paradigm easier so Cray, now HPE, developed SmartSim, which allows researchers to add AI to their traditional simulations by adding just a few lines of code.”
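To make the hybrid paradigm concrete, here is a minimal, self-contained sketch of the general idea – a physics-based simulation loop with a data-driven correction applied at each step. This is an illustration of the concept only, not SmartSim’s actual API; the function names, the toy diffusion model, and the fixed bias correction are all hypothetical stand-ins.

```python
# Toy "physics + ML" hybrid loop. The physics step is a 1-D explicit
# diffusion update; the ML correction is a stand-in for a trained model
# that nudges the simulated state toward observations.

def physics_step(temps, alpha=0.1):
    """One explicit diffusion step on a 1-D temperature field (toy physics).

    Boundary values are held fixed; interior points relax toward
    their neighbors at rate alpha.
    """
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + alpha * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
    return new

def ml_correction(temps, bias=0.01):
    """Stand-in for an ML model: remove a systematic warm bias."""
    return [t - bias for t in temps]

def hybrid_forecast(temps, steps=10):
    """Interleave the physical model with the data-driven correction."""
    for _ in range(steps):
        temps = ml_correction(physics_step(temps))
    return temps

# A spike of heat in the middle of a cold field.
field = [0.0, 0.0, 1.0, 0.0, 0.0]
print(hybrid_forecast(field))
```

In a real SmartSim-style workflow, `ml_correction` would be a trained network invoked from inside (or alongside) a running simulation, but the structure – simulate, correct, repeat – is the same.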

