The AI industry's leading figures have spoken up on climate change, releasing a paper that lays out concrete actions for tackling its effects. Pioneers such as Yoshua Bengio (winner of the 2019 Turing Award) and Andrew Ng (co-founder of Google Brain), who have spearheaded many of the developments in machine learning, are now collaborating with a group of researchers to demonstrate the role of ML in helping society adapt to climate change.
To begin with, the authors of the paper offer the following recommendations for anyone seeking a roadmap to applying ML to the climate emergency:
- Learn new skills and identify their use.
- Find collaborators, who may be researchers, entrepreneurs, established companies, or policy-makers. Remember that for every domain we have discussed here, there are experts in that area who understand its opportunities and pitfalls, even if they do not necessarily understand ML.
- Listen to what your collaborators say is needed, and gather input more broadly as well to make sure your work will have the desired impact. Groundbreaking technologies have an impact, but so do well-constructed solutions to mundane problems.
- Ensure that your work is deployed where its impact can be realized.
AI As A Tool For Society And Individuals
Problems caused by a changing climate attract global attention, but the solutions need to be hyper-localised, feasible, and encouraging, and that is only possible when the tools are made available at the local level. Here are a few excerpts from the original paper on some of the domain-specific solutions:
Climate Proofing Infrastructure
Heat and wind damage roads, buildings, and power lines. Rising water tables near the coast will lead to faults in pipelines. Urban heat islands will be exacerbated and flooding caused by heavy rain or coastal inundations will become more routine, along with resulting property damage and traffic blockages.
Two strategies for efficiently managing limited maintenance resources are predictive maintenance and anomaly detection; both can be applied to electrical, water, and transportation infrastructure.
A plausible solution is to incorporate flood hazard and traffic information in order to uncover vulnerable stretches of road.
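As a sketch of the anomaly-detection strategy described above, a simple statistical detector can flag sensor readings that deviate sharply from normal operation. The pipeline-pressure readings and threshold below are hypothetical; a deployed system would use far richer models:

```python
# Minimal anomaly-detection sketch for infrastructure sensor data.
# The readings and threshold are illustrative, not from the paper.
from statistics import mean, stdev

def flag_anomalies(readings, z_threshold=3.0):
    """Return indices of readings more than z_threshold std-devs from the mean."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, r in enumerate(readings)
            if sigma > 0 and abs(r - mu) / sigma > z_threshold]

# Hypothetical pipeline-pressure readings with one fault spike.
pressures = [101.2, 100.8, 101.0, 100.9, 180.5, 101.1, 100.7, 101.0, 100.9, 101.2]
print(flag_anomalies(pressures, z_threshold=2.0))  # → [4]
```

Predictive maintenance builds on the same idea, but trains a model to anticipate such deviations before they cause failures.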
Farms And Forests
- Localised crop yield prediction from aerial imagery or meteorological data, linked to historical records.
- Crop disease identification from plant photos.
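A toy version of the yield-prediction idea is to regress historical yields on meteorological features. The figures below are entirely made up for illustration, and a real system would use far more features and data:

```python
# Toy yield-prediction sketch: regress historical yields on rainfall and
# mean temperature via ordinary least squares. All values are synthetic.
import numpy as np

# Columns: [rainfall_mm, mean_temp_C]; rows are past seasons.
X = np.array([[520, 24.1], [610, 23.5], [480, 25.0], [700, 22.8], [560, 24.4]], float)
y = np.array([3.1, 3.6, 2.8, 4.0, 3.3])  # yields in tonnes per hectare

# Add an intercept column and solve the least-squares problem.
A = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(rainfall_mm, mean_temp_c):
    """Forecast yield for a coming season from its weather features."""
    return coef[0] + coef[1] * rainfall_mm + coef[2] * mean_temp_c

print(round(predict(600, 23.6), 2))
```

Localised models like this become useful precisely when they are trained on, and served to, the region they describe.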
Epidemics And Disaster Relief
- Malaria diagnosis based on photos of prepared pathology slides taken with a mobile phone.
- Accurate and well-annotated maps can inform evacuation planning, retrofitting campaigns, and delivery of relief. Further, such imagery can assist damage assessment by comparing scenes immediately pre- and post-disaster.
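The pre/post comparison can be illustrated with a bare-bones change detector that flags pixels whose intensity shifted beyond a threshold. The arrays below stand in for co-registered satellite tiles; real damage assessment uses learned features rather than raw pixel differences:

```python
# Sketch of change detection between pre- and post-disaster imagery:
# flag pixels whose intensity changed by more than a threshold.
# The synthetic arrays stand in for co-registered satellite tiles.
import numpy as np

def changed_fraction(pre, post, threshold=30):
    """Fraction of pixels whose absolute intensity change exceeds threshold."""
    diff = np.abs(post.astype(int) - pre.astype(int))
    return float((diff > threshold).mean())

rng = np.random.default_rng(0)
pre = rng.integers(90, 110, size=(64, 64)).astype(np.uint8)
post = pre.copy()
post[:16, :16] = 200  # simulate a damaged block of the scene

print(changed_fraction(pre, post))  # → 0.0625 (16*16 of 64*64 pixels)
```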
Reducing Carbon Footprint
Natural language processing (NLP) can be used to extract the flights a person takes from their email, or determine specific grocery items purchased from a bill, making it possible to predict the associated emissions.
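A crude approximation of that pipeline is pattern-matching flight legs out of a booking email and attaching an emission factor. The route distances and per-kilometre factor below are assumed placeholders, and real systems would use proper NLP models and verified emission data:

```python
# Rough sketch of extracting flight legs from an email with a regex and
# attaching a per-km emission factor. The route distances and the factor
# are illustrative assumptions, not authoritative figures.
import re

EMAIL = """Your booking is confirmed.
Flight LH 454: FRA-SFO, departing 09:40.
Flight LH 455: SFO-FRA, departing 14:55."""

# Hypothetical great-circle distances (km) for the routes above.
ROUTE_KM = {("FRA", "SFO"): 9140, ("SFO", "FRA"): 9140}
KG_CO2_PER_KM = 0.115  # assumed average economy-seat factor

legs = re.findall(r"([A-Z]{3})-([A-Z]{3})", EMAIL)
total_kg = sum(ROUTE_KM[leg] * KG_CO2_PER_KM for leg in legs)
print(legs, round(total_kg))
```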
ML techniques such as spectral clustering, hidden Markov models, and neural networks have been used to effectively disaggregate household energy consumption.
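As a toy illustration of disaggregation, the two-level clustering below separates an appliance's on/off cycles from the base load in a synthetic meter trace. The cited methods (spectral clustering, HMMs, neural networks) are far more capable; every value here is made up:

```python
# Toy disaggregation sketch: cluster a 1-D household power signal into
# "base load" and "appliance on" levels with a tiny two-means loop,
# a stand-in for the clustering/HMM methods the paper cites.
import numpy as np

def two_means(signal, iters=10):
    """Assign each reading to the low or high power level."""
    lo, hi = float(signal.min()), float(signal.max())
    for _ in range(iters):
        assign = np.abs(signal - lo) > np.abs(signal - hi)  # True -> high level
        lo, hi = signal[~assign].mean(), signal[assign].mean()
    return assign, lo, hi

# Synthetic meter readings: ~200 W base load, a ~1700 W appliance cycling on.
watts = np.array([210, 205, 1710, 1695, 1702, 198, 202, 1698, 207, 200], float)
on, base, peak = two_means(watts)
print(on.astype(int), round(peak - base))  # appliance draw ≈ peak minus base
```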
Tools from game theory and incentive/mechanism design have been applied to develop climate policy, but there are many opportunities for machine learning in this area, including exploration of incentive design, application of multi-agent reinforcement learning (RL) to planning and coordination in climate change policy or mitigation, decision support systems, and data visualization tools.
Domains where machine learning can make things better:
- Prevent electricity loss during transmission.
- Consolidate freight and reduce food waste.
- Enable remote sensing and automatic monitoring (e.g. pinpoint deforestation, gather data on buildings, and track personal energy use).
- Provide fast approximations to time-intensive simulations (e.g. climate models and energy scheduling models).
- Lead to interpretable or causal models (e.g. for understanding weather patterns, informing policy makers, and planning for disasters).
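The "fast approximations" idea is the surrogate-model pattern: fit a cheap learned model to a handful of runs of an expensive simulator, then query the surrogate instead. The quadratic "simulator" below is a placeholder for a real climate or energy model:

```python
# Sketch of a learned surrogate: fit a cheap polynomial to a few runs of
# an "expensive" simulator, then query the surrogate instead. The simple
# quadratic here stands in for a real climate/energy simulation.
import numpy as np

def expensive_simulation(x):
    return 0.5 * x**2 + 2.0 * x + 1.0  # placeholder for hours of compute

xs = np.linspace(-3, 3, 7)                    # a handful of simulator runs
ys = np.array([expensive_simulation(x) for x in xs])

surrogate = np.poly1d(np.polyfit(xs, ys, deg=2))  # fast approximation

x_query = 1.7
print(round(float(surrogate(x_query)), 3), expensive_simulation(x_query))
```

In practice the surrogate is a neural network or Gaussian process, but the workflow (sample the simulator, fit, then query cheaply) is the same.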
Challenges And Future Direction
The obvious challenge for any data-driven project begins with the data itself. Data availability and privacy are among the many issues that hinder large-scale deployment of machine learning models. The collected data is either scarce or sensitive, or sometimes both, and because datasets combine information from multiple sources, making sense of them requires collaboration among experts across domains.
Even if the data has been cleaned and made available, it need not represent the global use case, because data availability and tooling vary from nation to nation. A developed country might have readily available data, whereas a developing nation might lack the resources to attend to problems that pose no immediate danger.
To address this, the experts recommend tools from transfer learning and domain adaptation, which will likely prove essential in low-data settings. For some tasks, it may also be feasible to augment learning with carefully simulated data. Most importantly, they call for active participation from public and private entities to release datasets and to solicit involvement from the ML community.
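A minimal sketch of the transfer-learning idea in this low-data spirit: fit a model on a data-rich "source" region, then adapt only a small part of it to a data-poor "target" region. All data below is synthetic, and the single-parameter adaptation is a deliberate simplification:

```python
# Minimal transfer-learning sketch: learn a linear model on a data-rich
# source domain, then adapt only the bias term to a 5-sample target
# domain. All data is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(1)

# Source domain: many labelled samples following y = 2x + 1 + noise.
xs = rng.uniform(0, 10, 200)
ys = 2.0 * xs + 1.0 + rng.normal(0, 0.1, 200)
slope, bias = np.polyfit(xs, ys, 1)

# Target domain: only 5 samples; same slope, shifted offset (y = 2x + 4).
xt = np.array([1.0, 3.0, 5.0, 7.0, 9.0])
yt = 2.0 * xt + 4.0

# Adaptation step: keep the learned slope, refit only the bias.
bias_t = float(np.mean(yt - slope * xt))

print(round(slope, 2), round(bias_t, 2))
```

Freezing most parameters and refitting a few is the same pattern used when fine-tuning large pretrained networks on small regional datasets.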
Read the full paper here.
A technology journalist with a master's degree in Robotics. Likes to write about machine learning advancements.