It is estimated that every year more than 200 million people are either displaced or lose their lives due to floods. Flood forecasting is a tricky job since many factors, such as rainfall, the topography of the location, soil strength and fluid dynamics, come into play. Current forecasting techniques do help with timely evacuations: warnings from the meteorological department usually lead to the deployment of disaster relief personnel in hazardous locations. However, this alone isn’t enough.
In India, the state of Kerala alone has experienced two devastating floods in two consecutive years. The aftermath of any flood usually sees people blaming either global warming or the lawmakers. Though there is no one-stop solution, the best alternative is to resort to robust precautionary measures. In order to ring the alarm bells early enough, the Central Water Commission of India has joined hands with Google to forecast floods.
Google has been applying its machine learning prowess to various social good initiatives across the world.

In India especially, Google has been doing tremendous work by leveraging the AI capabilities at its disposal. Last year, Google rolled out its early flood warning services, starting with the Patna region.
How Does Google Do It?

Google’s approach incorporates multidisciplinary techniques, ranging from gathering data about the topography of a location to applying the equations of fluid dynamics.
A reliable forecasting model can be established using the following data:
Real-time Water Level Measurements
Google partnered with the Indian Central Water Commission (CWC), which measures water levels hourly in over a thousand stream gauges across all of India, aggregates this data, and produces forecasts based on upstream measurements. The CWC provides these real-time river measurements and forecasts, which are then used as inputs for Google’s models.
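To make the idea concrete, here is a minimal Python sketch of how hourly gauge readings might be turned into model inputs, with upstream levels used as lagged features for a downstream forecast. The gauge IDs, column names, synthetic values and lead times are hypothetical illustrations, not details of Google’s or the CWC’s actual pipeline.

```python
import numpy as np
import pandas as pd

# Hypothetical hourly water-level readings (metres) for two gauges:
# an upstream gauge G001 and a downstream gauge G002, over one day.
hours = pd.date_range("2019-08-01", periods=24, freq="h")
levels = pd.DataFrame({
    "G001": 52.0 + 0.05 * np.arange(24),   # slowly rising upstream level
    "G002": 48.0 + 0.04 * np.arange(24),
}, index=hours)

# The previous hour's readings become features for predicting
# the downstream level three hours ahead.
features = levels.shift(1).add_suffix("_lag1h")
target = levels["G002"].shift(-3).rename("G002_in_3h")

training_frame = pd.concat([features, target], axis=1).dropna()
print(training_frame.head())
```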
Elevation Map Creation
It is critical that the models have a good map of the terrain, and this requires high-resolution digital elevation models (DEMs), which are difficult to acquire. This is where Google Maps comes into the picture. Elevation map creation proceeds as follows:
- The large and varied collection of satellite images used in Google Maps is correlated and aligned in a large batch, which corrects the camera models.
- Then the corrected camera models are used to create a depth map for each image.
- The depth maps are then optimally fused together at each location to produce the elevation map (a rough fusion sketch follows this list).
- Convolutional neural networks are trained to identify where the terrain elevations need to be interpolated.
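As a rough illustration of the fusion step, the sketch below combines a stack of aligned per-image depth maps into a single elevation map using a per-pixel median. The grid size, synthetic values and the use of NaN to mark pixels a given image could not see are assumptions for the example, not details of Google’s pipeline.

```python
import numpy as np

# Hypothetical stack of per-image depth maps, already aligned to a common
# grid; np.nan marks pixels a given image could not see (clouds, occlusion).
depth_maps = np.stack([
    np.random.default_rng(i).normal(loc=100.0, scale=2.0, size=(256, 256))
    for i in range(5)
])
depth_maps[0, :50, :50] = np.nan  # e.g. a cloud-covered corner in one image

# Fuse by taking the per-pixel median across images, which is robust to
# outliers in any single depth map; nanmedian skips the missing pixels.
elevation_map = np.nanmedian(depth_maps, axis=0)

# Pixels missing in every image would still be NaN here; in the pipeline
# described above, a CNN decides where such gaps should be interpolated.
print(elevation_map.shape, np.isnan(elevation_map).sum())
```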
Hydraulic Modeling
Hydraulic models track the location and velocity of water through time. Google models this in 2D using the Saint-Venant equations, a set of partial differential equations that is computationally expensive to solve.
So, Google optimised the hydraulic model for Tensor Processing Units (TPUs). Thanks to their parallelised nature, TPUs run the model roughly 85 times faster than a traditional CPU.
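For intuition, the sketch below advances a simplified 1D shallow-water (Saint-Venant) system with an explicit Lax-Friedrichs finite-difference step. The grid, time step and initial hump of water are invented, and Google’s production model is 2D and far more sophisticated, but the per-cell, vectorised updates are the kind of work that parallel hardware such as TPUs accelerates well.

```python
import numpy as np

# Minimal 1D shallow-water (Saint-Venant) solver, Lax-Friedrichs scheme.
# State: water depth h (m) and discharge per unit width q = h * u (m^2/s).
g, dx, dt = 9.81, 10.0, 0.1          # gravity, grid spacing (m), time step (s)
x = np.arange(0.0, 1000.0, dx)
h = 1.0 + 0.5 * np.exp(-((x - 500.0) ** 2) / 2000.0)   # a hump of water
q = np.zeros_like(h)                                   # initially at rest

def step(h, q):
    """Advance (h, q) by one time step with periodic boundaries."""
    f_h = q                                   # mass flux
    f_q = q ** 2 / h + 0.5 * g * h ** 2       # momentum flux
    def lax_friedrichs(u, f):
        return 0.5 * (np.roll(u, -1) + np.roll(u, 1)) \
             - dt / (2.0 * dx) * (np.roll(f, -1) - np.roll(f, 1))
    return lax_friedrichs(h, f_h), lax_friedrichs(q, f_q)

for _ in range(100):                  # simulate 10 seconds of flow
    h, q = step(h, q)
print(f"max depth after 10 s: {h.max():.3f} m")
```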
The researchers are also exploring machine-learning-based discretisation techniques for solving the partial differential equations, which could further bring down the time required to produce a forecast.
Forecasting A Safer Future
Modelling nature is a feature engineering nightmare; there are a number of factors that come into the picture. Hydrologic models, for example, take precipitation, solar radiation and soil moisture as inputs to produce a forecast. So, challenges like scaling and accuracy still persist.
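As a toy illustration of how such inputs feed a forecast, the snippet below fits a simple regression from made-up precipitation, solar radiation and soil moisture series to streamflow. It is not a real hydrologic model, and the relationships are invented; it only shows the shape of the feature-to-forecast problem.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up daily inputs: precipitation (mm), solar radiation (MJ/m^2)
# and soil moisture (fraction) over one year.
rng = np.random.default_rng(0)
n = 365
precipitation = rng.gamma(shape=2.0, scale=5.0, size=n)
solar_radiation = rng.uniform(10.0, 30.0, size=n)
soil_moisture = np.clip(0.2 + 0.01 * precipitation + rng.normal(0, 0.02, n), 0, 1)

# Synthetic streamflow target driven mostly by rain and soil moisture.
streamflow = 5.0 + 0.8 * precipitation + 20.0 * soil_moisture + rng.normal(0, 1.0, n)

X = np.column_stack([precipitation, solar_radiation, soil_moisture])
model = LinearRegression().fit(X[:-30], streamflow[:-30])   # hold out the last month
print("held-out R^2:", round(model.score(X[-30:], streamflow[-30:]), 3))
```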
Nevertheless, we can safely say that Google is doing it the right way by collaborating with government entities like the Central Water Commission and with non-profit organisations to gather data at a granular level. Since a machine learning model is only as good as its data, localised models are necessary.