Google researchers have developed an artificial intelligence that they say can predict weather and climate patterns as accurately as current physical models, but with less computing power.
Existing forecasts are based on mathematical models, run on extremely powerful supercomputers, that deterministically predict what will happen in the future. Since they were first used in the 1950s, these models have become increasingly detailed and demand ever more computing power.
Several projects aim to replace these computationally intensive tasks with much less demanding AI, including a DeepMind tool that forecasts localized rainfall over short periods. But like most AI models, these tools are “black boxes” whose inner workings are opaque and whose methods can’t be explained or replicated. Meteorologists also warn that models trained on historical data will struggle to predict the unprecedented events now being driven by climate change.
Now Dmitrii Kochkov and his colleagues at Google Research in California have created a model called NeuralGCM that they believe strikes a balance between the two approaches.
Typical climate models divide the Earth’s surface into a grid of cells up to 100 kilometers across. Because of limits on computing power, simulating the atmosphere at higher resolution is impractical, so phenomena such as clouds, turbulence, and convection within these cells are only approximated, by computer code that is continually tuned to better match observations. This approach, called parameterization, aims to at least partially capture small-scale processes that the coarser physical model cannot resolve.
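As a rough illustration of the idea (this is not NeuralGCM’s actual code; the function, variables, and constants below are entirely hypothetical), a parameterization replaces unresolved sub-grid physics with a tunable closed-form rule applied to every grid cell:

```python
import numpy as np

def subgrid_convection_tendency(temperature, humidity, alpha=0.3, beta=1.5):
    """Hypothetical parameterization: approximate the heating from
    unresolved convection inside one ~100 km grid cell as a simple
    function of the cell-mean state. The constants alpha and beta
    stand in for the knobs modelers adjust against observations."""
    instability = np.maximum(humidity - alpha * temperature, 0.0)
    return beta * instability  # crude stand-in for convective heating

# The same closed-form rule is applied to every cell of a coarse global grid.
temperature = np.random.rand(180, 360)  # cell-mean temperature (arbitrary units)
humidity = np.random.rand(180, 360)     # cell-mean humidity (arbitrary units)
heating = subgrid_convection_tendency(temperature, humidity)
```

Tuning constants like these against observed data is what the article means by code that is “continually adjusted”; it is this hand-crafted approximation step that NeuralGCM hands over to a learned component.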
NeuralGCM is trained to take over this small-scale approximation, cutting computational overhead while improving accuracy. In their paper, the researchers say the model can run 70,000 days’ worth of simulation in 24 hours on a single chip called a Tensor Processing Unit (TPU). By comparison, a competing model called X-SHiELD needs a supercomputer with thousands of processing units to run just 19 days’ worth of simulation in the same time.
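In schematic terms (a minimal sketch under assumptions, not the authors’ implementation; the toy dynamics, network shape, and names here are invented), the hybrid approach keeps a conventional, equation-based dynamical core and lets a learned component supply the sub-grid tendencies that parameterizations would otherwise provide:

```python
import jax.numpy as jnp
from jax import random

def dynamical_core_step(state, dt=0.01):
    """Stand-in for the physics: advance the resolved, equation-based
    part of the model. A real core integrates the equations of motion."""
    return state + dt * jnp.roll(state, 1, axis=-1)  # toy advection

def learned_tendency(params, state):
    """Stand-in for the neural component: a tiny MLP predicting the
    sub-grid correction that parameterizations would otherwise supply."""
    hidden = jnp.tanh(state @ params["w1"] + params["b1"])
    return hidden @ params["w2"] + params["b2"]

def hybrid_step(params, state, dt=0.01):
    # Physics handles the large-scale dynamics; the network handles the
    # small-scale residual, and the two are summed each time step.
    return dynamical_core_step(state, dt) + dt * learned_tendency(params, state)

# Toy setup: a 1-D "atmosphere" of 64 grid cells and random MLP weights.
key = random.PRNGKey(0)
k1, k2 = random.split(key)
params = {
    "w1": random.normal(k1, (64, 32)) * 0.1, "b1": jnp.zeros(32),
    "w2": random.normal(k2, (32, 64)) * 0.1, "b2": jnp.zeros(64),
}
state = jnp.zeros(64).at[32].set(1.0)
for _ in range(10):
    state = hybrid_step(params, state)
```

Because the network only supplies a small-scale correction, the large-scale dynamics remain governed by the equations of motion, which is the middle ground between pure physics and opaque AI that the researchers are aiming for.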
The paper also claims that NeuralGCM’s predictions are comparable to or better than those of best-in-class models. Google did not respond to New Scientist’s request for an interview.
Tim Palmer of the University of Oxford says the work is an interesting attempt to find a third way between pure physics and opaque AI approximations: “I’m uncomfortable with the idea of completely abandoning the equations of motion and moving to AI systems that even experts say they don’t fully understand,” he says.
This hybrid approach is likely to spur further discussion and research in the modeling community, but only time will tell whether modelers around the world will adopt it, he says. “It’s a good step in the right direction and the type of research we should be doing. It’s great to see all these alternative methods being explored.”