The AI industry's leading figures have spoken up on climate change, releasing a paper that lays out the concrete actions needed to tackle its effects. Pioneers such as Yoshua Bengio (winner of the 2019 Turing Award) and Andrew Ng (co-founder of Google Brain), who have spearheaded many of the developments in machine learning, are now collaborating with a broader group of researchers to demonstrate the role of ML in helping society adapt to climate change.

To begin with, the authors of the paper offer the following recommendations to those seeking a roadmap for applying ML to the climate emergency:

 Learn new skills and identify their use.
 Find collaborators, who may be researchers, entrepreneurs, established companies, or policy-makers. Remember that for every domain discussed here, there are experts in that area who understand its opportunities and pitfalls, even if they do not necessarily understand ML.
 Listen to what your collaborators say is needed, and gather input more broadly as well to make sure your work will have the desired impact. Groundbreaking technologies have an impact, but so do well-constructed solutions to mundane problems.
 Ensure that your work is deployed where its impact can be realized.

AI As A Tool For Society And Individuals

Problems caused by a changing climate draw global attention, but the solutions need to be hyper-localised. Solutions should be feasible and encouraging, and that is possible only when the tools are made available at the local level. Here are a few excerpts from the original paper describing some of the domain-specific solutions:

Climate-Proofing Infrastructure

Heat and wind damage roads, buildings, and power lines. Rising water tables near the coast will lead to faults in pipelines.
Urban heat islands will be exacerbated, and flooding caused by heavy rain or coastal inundation will become more routine, along with the resulting property damage and traffic blockages.

Two strategies for efficiently managing limited maintenance resources are predictive maintenance and anomaly detection; both can be applied to electrical, water, and transportation infrastructure.

A plausible solution is to incorporate flood-hazard and traffic information in order to uncover vulnerable stretches of road.

Food Security

 Localised crop-yield prediction from aerial imagery or meteorological data, linked to historical records.
 Crop-disease identification from photos of plants.

Epidemics And Disaster Relief

 Malaria diagnosis based on photos of prepared pathology slides taken with a mobile phone.
 Accurate and well-annotated maps can inform evacuation planning, retrofitting campaigns, and delivery of relief. Such imagery can also assist damage assessment by comparing scenes immediately pre- and post-disaster.

Reducing Carbon Footprint

Natural language processing (NLP) can be used to extract the flights a person takes from their email, or to determine the specific grocery items on a bill, making it possible to predict the associated emissions.

ML techniques such as spectral clustering, hidden Markov models, and neural networks have been used to effectively disaggregate household energy consumption.

Tools from game theory and incentive/mechanism design have been applied to develop climate policy, but there are many opportunities for machine learning in this area, including exploration of incentive design, application of multi-agent reinforcement learning (RL) to planning and coordination in climate change policy or mitigation, decision support systems, and data visualisation tools.

Domains where machine learning can make things better:

 Prevent electricity loss during transmission.
 Consolidate freight and reduce food waste.
 Enable remote sensing and automatic monitoring (e.g. pinpointing deforestation, gathering data on buildings, and tracking personal energy use).
 Provide fast approximations to time-intensive simulations (e.g. climate models and energy scheduling models).
 Lead to interpretable or causal models (e.g. for understanding weather patterns, informing policy makers, and planning for disasters).

Challenges And Future Direction

The obvious challenge for any data-driven project begins with the data itself. Data availability and privacy are among the many issues that hinder large-scale deployment of machine learning models. The collected data is either scarce or sensitive, and sometimes both. Datasets combine information from multiple sources, and making sense of them requires collaboration among experts across domains.

Even if the data has been cleaned and made available, it need not represent the global use case, because data availability and tooling differ from nation to nation. A developed country might have readily available data, whereas a third-world nation might be too poor to attend to problems that don't pose any immediate danger.

For this, the experts recommend tools from transfer learning and domain adaptation, which will likely prove essential in low-data settings. For some tasks, it may also be feasible to augment learning with carefully simulated data. Most importantly, they call for active participation by public and private entities to release datasets and to solicit involvement from the ML community.

Read the full paper here.
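The transfer-learning recommendation above can be sketched with a toy example of parameter transfer: a linear yield model is fitted on a hypothetical data-rich "source" region, then adapted to a data-poor "target" region by refitting with a penalty that keeps the weights close to the source solution. All the regions, features, and yields below are synthetic stand-ins invented for illustration, and this is only one simple form of transfer learning, not the specific method the paper prescribes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Source region: plenty of (weather features -> crop yield) samples.
X_src = rng.normal(size=(500, 3))
w_true_src = np.array([2.0, -1.0, 0.5])
y_src = X_src @ w_true_src + rng.normal(scale=0.1, size=500)

# Ordinary least squares on the data-rich source region.
w_src, *_ = np.linalg.lstsq(X_src, y_src, rcond=None)

# Target region: only 10 samples, and a slightly shifted relationship.
X_tgt = rng.normal(size=(10, 3))
w_true_tgt = w_true_src + np.array([0.3, 0.0, -0.2])
y_tgt = X_tgt @ w_true_tgt + rng.normal(scale=0.1, size=10)

# Fine-tune on the scarce target data, regularised toward w_src:
#   argmin_w  ||X_tgt w - y_tgt||^2 + lam * ||w - w_src||^2
# which has the closed-form ridge-style solution below.
lam = 5.0
A = X_tgt.T @ X_tgt + lam * np.eye(3)
b = X_tgt.T @ y_tgt + lam * w_src
w_tgt = np.linalg.solve(A, b)

# Compare against fitting the target region entirely from scratch.
w_scratch, *_ = np.linalg.lstsq(X_tgt, y_tgt, rcond=None)
err_transfer = np.linalg.norm(w_tgt - w_true_tgt)
err_scratch = np.linalg.norm(w_scratch - w_true_tgt)
print(err_transfer, err_scratch)
```

The penalty strength `lam` controls how much the target model is allowed to drift from the source weights; with very little target data, a larger `lam` leans harder on the source region's knowledge.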