
# Google Researchers Explore A New Paradigm For Partial Differential Equations With Machine Learning

What do predicting climate change and simulating nuclear fusion reactors have in common? Both are modelled with the same famous class of mathematical equations: partial differential equations (PDEs).

PDEs describe everything smooth and continuous in the physical world, and they constitute the most common class of simulation problems in science and engineering.


Solving computation-hungry PDEs takes a toll even on supercomputers. And with Moore's law slowing down, we can no longer count on hardware alone (shrinking transistors) to reduce the time these simulations consume.

There is still a glimmer of hope, however. Recent breakthroughs in machine learning, combined with the development of hardware suited to these algorithms, have inspired a team of researchers at Google to take up the mammoth task of engineering a new paradigm for scientific computing.

In the paper titled Learning Data-Driven Discretizations for Partial Differential Equations, the researchers at Google explore a potential path for how machine learning can offer continued improvements in high-performance computing for solving PDEs.

### Data-Driven Discretizations For PDEs

The biggest challenge in these simulations is that the smallest physically meaningful features of a system are often far smaller than the scales one actually wants to resolve. For example, the Reynolds number is one of the key parameters used in modelling fluid flow.

Even with petascale computational resources, the largest direct numerical simulation of turbulent fluid flow ever performed has a Reynolds number of order 1,000.

Simulations at higher Reynolds numbers require replacing the physical equations with effective equations that model the unresolved physics. In practice, a differential equation is discretised into components such as coefficients and variables, forming a set of discrete equations; the physics the grid cannot resolve then has to be modelled by additional terms in those discrete equations. This alone hints at how computationally heavy these simulations are.
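To make "discretisation" concrete, here is a deliberately simple sketch of my own (not code from the paper): the 1D heat equation u_t = ν·u_xx is turned into one algebraic update rule per grid point, using central differences in space and an explicit Euler step in time.

```python
import numpy as np

# Discretising the 1D heat equation u_t = nu * u_xx.
# The continuous PDE becomes a set of discrete update equations,
# one per grid point.
nu = 0.1            # diffusion coefficient
nx, nt = 50, 200    # number of grid points, number of time steps
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2 / nu  # below the explicit stability limit dt <= 0.5*dx^2/nu

x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)  # initial condition; u = 0 at both boundaries

for _ in range(nt):
    # Discrete equation for each interior point i:
    # u_i <- u_i + nu*dt/dx^2 * (u_{i+1} - 2*u_i + u_{i-1})
    u[1:-1] += nu * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
```

Each grid point gets its own algebraic update; resolving finer physical features means more points and smaller time steps, which is exactly where the computational cost explodes.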

The picture above shows a hurricane both at full resolution and at the resolution simulated in a state-of-the-art weather model. Cumulus clouds (e.g., in the red circle) are responsible for heavy rainfall, but in the weather model their details are entirely blurred out. Instead, models rely on crude approximations for sub-grid physics, a key source of uncertainty in climate models.

The researchers also simulated Burgers' equation to test how a neural network performs. Burgers' equation is a model for shock waves in fluids and is usually solved using the finite volume method. The comparison plot below illustrates how smoothly neural networks perform.

The orange squares represent simulations with each method on low-resolution grids. These points are fed back into the model at each time step, which then predicts how they should change. Blue lines show the exact simulations used for training. The neural network solution is much better, even on a 4x coarser grid, as indicated by the orange squares smoothly tracing the blue line.
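For readers curious what the classical baseline looks like, below is a minimal first-order finite-volume solver for the inviscid Burgers' equation; this is my own hedged sketch, not the paper's code. The paper's learned method replaces exactly this kind of hand-derived numerical flux with coefficients predicted by a neural network.

```python
import numpy as np

# First-order finite-volume scheme for u_t + (u^2/2)_x = 0
# on a periodic domain. With everywhere-positive data the wave speed
# is positive, so the upwind flux F_{i+1/2} = u_i^2 / 2 is valid.
nx = 200
dx = 2.0 * np.pi / nx
x = dx * (np.arange(nx) + 0.5)   # cell centres
u = 1.5 + 0.5 * np.sin(x)        # positive initial condition

t, t_end = 0.0, 1.0
while t < t_end:
    # CFL condition: the fastest wave moves at most 0.9 cells per step
    dt = min(0.9 * dx / np.max(np.abs(u)), t_end - t)
    flux = 0.5 * u**2            # upwind flux at each cell's right face
    # Conservative update: u_i -= dt/dx * (F_{i+1/2} - F_{i-1/2})
    u -= dt / dx * (flux - np.roll(flux, 1))
    t += dt
```

Because the update is conservative, the mean of u over the domain is preserved exactly; the data-driven discretisations in the paper keep this finite-volume structure and learn better flux approximations for coarse grids.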

### Conclusion

The rules that ML models recover are complex, and we don’t entirely understand them, but they incorporate sophisticated physical principles.

Mathematics doesn't end when a theorem is followed by a proof; its creative potential reaches well beyond theorems, and much of it is still unrealised. The mathematics behind today's machine learning algorithms, for instance, is at least two centuries old. Newton could hardly have imagined that the calculus used to find a slope would one day be key to making machines that predict.

The time is ripe for applied ML to reinforce the foundation on which it is built, mathematics, and to open up new, untouched avenues. As the famous mathematician Michael Atiyah once said, "There are no rules laid down. You have to do it your own way. A proof by itself doesn't give you understanding. You can have a long proof and no idea at the end of why it works."
