The outbreak of COVID-19 triggered an unprecedented shift towards academic research in a relatively short span of time. Academia geared up to tackle the pandemic head-on, deploying all the tools at its disposal. Data analytics and algorithmic solutions, in particular, have been the go-to tools for many researchers.
The vast amount of data surrounding COVID-19, including data on protein structures, fatality rates, and risk prediction, has been put to use. To handle this surge in data, many organizations have turned to Google Cloud.
Let us take a look at how these organizations are implementing Google Cloud services:
Freeing Up $20M Worth Of Cloud Credits
Google has made $20 million worth of Google Cloud credits available to support researchers in their fight against COVID-19. To ensure these credits are not misused, Google has partnered with the Harvard Global Health Institute. Research requests will be reviewed by Harvard GHI, and proposals will be considered on a rolling basis.
Applauding this initiative, Dr. Ashish K. Jha of Harvard GHI said the institute would consider clinical research, drug delivery and therapeutics research, health services and policy research, and epidemiological research to address the urgency of the pandemic.
Apart from this, Google is also supporting researchers at the University of Virginia Biocomplexity Institute, who are running daily epidemic simulations on Google Cloud.
Launch Of Dataset Program
Google Cloud launched the COVID-19 Public Dataset Program to make data more widely available and accessible for researchers, while enabling free querying of COVID-19 related datasets in BigQuery. The data includes the popular Johns Hopkins University cases data, as well as datasets that may prove relevant in COVID-19 research, such as the American Community Survey and Open Street Maps.
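To give a flavour of what "free querying in BigQuery" looks like, here is a minimal sketch of building and running a query against a COVID-19 public dataset. The table and column names below are illustrative assumptions, not the exact schema of the Johns Hopkins dataset, and actually executing the query requires the google-cloud-bigquery client library plus GCP credentials.

```python
def jhu_top_countries_sql(limit=10):
    """Build an illustrative query over a COVID-19 public dataset.

    The dataset/table path and columns are assumptions for the sketch.
    """
    return f"""
        SELECT country_region, SUM(confirmed) AS total_cases
        FROM `bigquery-public-data.covid19_jhu_csse.summary`
        GROUP BY country_region
        ORDER BY total_cases DESC
        LIMIT {limit}
    """


def run(sql):
    """Execute the query on BigQuery.

    Requires the google-cloud-bigquery package and a GCP project
    with credentials configured; not runnable offline.
    """
    from google.cloud import bigquery  # third-party client library
    return list(bigquery.Client().query(sql).result())
```

Because the queried dataset lives under the `bigquery-public-data` project, researchers only pay for (or, under the program, are credited for) the queries themselves, not for hosting the data.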
Google also enlisted the Kaggle community, encouraging data scientists to take part in challenges to forecast the spread of COVID-19.
To Find The Impact Of Lockdowns
To estimate how strategies such as travel restrictions and social distancing policies impact the spread of infection, Northeastern University started running large-scale, data-driven model simulations on Google Cloud.
“Developing data-driven models for predicting COVID-19 infection spread and potential impact is monumental as we race to slow the virus.” – Dr. Matteo Chinazzi, Northeastern University
So far, Northeastern University researchers have been able to generate over 9 million different models and analyze more than 5,500 TB of resulting data.
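The kind of question these simulations answer can be illustrated with a toy compartmental (SIR) model, where a contact-reduction factor stands in for interventions such as travel restrictions or social distancing. This is a deliberately simplified sketch with illustrative parameters, not the Northeastern team's actual model.

```python
def sir_peak_infected(beta=0.3, gamma=0.1, contact_reduction=0.0,
                      population=1_000_000, initial_infected=100, days=365):
    """Return the peak number of simultaneous infections in a toy SIR model.

    contact_reduction (0..1) scales down the transmission rate beta,
    crudely modelling distancing policies. Parameters are illustrative.
    """
    s = population - initial_infected  # susceptible
    i = float(initial_infected)        # infected
    r = 0.0                            # recovered
    peak = i
    effective_beta = beta * (1.0 - contact_reduction)
    for _ in range(days):
        new_infections = effective_beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return peak
```

Even in this crude sketch, raising the contact-reduction factor flattens the epidemic peak, which is the qualitative effect large-scale simulations quantify with real mobility and policy data.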
Using Google Cloud's high-performance computing offerings, the researchers ran thousands of preemptible virtual machines (VMs) simultaneously to power their work. Preemptible VMs are well suited to this type of easily distributed, fault-tolerant research application, letting researchers accelerate the computational portion of their work at a fraction of the cost of standard VMs.
Once the simulations were complete, they used BigQuery to analyze the results and quickly share the insights with public health agencies around the world.
For Accelerating Low-Cost Drug Discovery
The main impediment in the face of a pandemic is lack of time: cutting diagnosis time by even a small fraction is potentially life-saving. Speeding up model training, in turn, requires substantial computational resources. By distributing their work across thousands of virtual machines on Google Cloud, researchers can accelerate their models and analysis.
This is where Google's VirtualFlow has come in handy: it is now being used by researchers at Harvard Medical School to screen billions of drug compounds against SARS-CoV-2 proteins in a matter of days. VirtualFlow is an open-source, scalable virtual drug discovery platform running on Google Cloud that uses preemptible VMs to quickly and accurately narrow down promising drug candidates.
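Virtual screening scales out so well because a compound library can be split into independent shards, one per VM. The sketch below shows that scatter step in its simplest form; it mirrors the general pattern only, not VirtualFlow's actual file formats or scheduler.

```python
def shard(num_compounds, shard_size):
    """Yield (start, end) index ranges over a compound library, one per VM."""
    for start in range(0, num_compounds, shard_size):
        yield (start, min(start + shard_size, num_compounds))


# Splitting a billion-compound library into 10M-compound shards
# yields 100 independent screening jobs that can run in parallel.
shards = list(shard(1_000_000_000, 10_000_000))
```

Each shard can then be docked against the target protein on its own VM, and the top-scoring compounds from every shard are merged afterwards, which is why adding machines shortens wall-clock time almost linearly.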
(Image: SARS-CoV-2 main protease)
According to a report, drug development in the United States typically costs between $2 billion and $3 billion and takes about ten years. Services like these from Google have proven critical in addressing the recurring challenges of public health.
“The use of hundreds of thousands of computational cores at Google Cloud allows us to finish this task of screening a billion compounds in a couple of weeks. To accomplish this on a standard laptop would take 1,500 years.” – Haribabu Arthanari, Harvard Medical School
While Google Cloud powers many researchers across the globe, it is also crucial that the data being used stays secure. Forecasting the pandemic draws on data that is at times sensitive; for instance, modelling an optimal social distancing strategy can require location information, and such data must not be mishandled once the crisis has passed.
Google Cloud claims that data on its platform is secure and handled in accordance with widely recognized patient privacy and data security practices.