Recently, researchers at Waterloo Engineering developed an AI solution to assess the severity of COVID-19 cases. The technology could give doctors a critical tool to manage cases, said Alexander Wong, a systems design engineering professor and co-founder of DarwinAI.
“Assessing the severity of a patient with COVID-19 is a critical step in the clinical workflow for determining the best course of action for treatment and care, be it admitting the patient to ICU, giving a patient oxygen therapy, or putting a patient on a mechanical ventilator,” he added.
Supporting healthcare workers
The research is part of the COVID-Net open-source initiative launched more than a year ago. The study was conducted by researchers from Waterloo and DarwinAI, a spin-off startup company, together with radiologists at the Stony Brook School of Medicine and the Montefiore Medical Center in New York.
A deep-learning model analysed the extent and opacity of infection in the lungs of COVID-19 patients using chest x-rays, and the scores were compared to assessments of the same x-rays by expert radiologists.
For both key severity indicators, the extent and the opacity of infection, the scores produced by the AI software aligned with the human experts’ scores.
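As an illustration of how such alignment between AI and radiologist scores could be quantified, here is a short sketch computing a Pearson correlation coefficient. The article does not state which agreement metric was used, and the sample scores below are invented for illustration only:

```python
import numpy as np

def pearson_r(a, b):
    """Pearson correlation coefficient between two severity-score arrays."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    a_c, b_c = a - a.mean(), b - b.mean()
    return float((a_c * b_c).sum() / np.sqrt((a_c ** 2).sum() * (b_c ** 2).sum()))

# Hypothetical severity scores for five CXRs (values are illustrative)
ai_scores     = [1.0, 3.5, 5.0, 6.5, 8.0]
expert_scores = [1.5, 3.0, 5.5, 6.0, 7.5]

print(pearson_r(ai_scores, expert_scores))
```

A value near 1.0 would indicate strong agreement between the model’s severity scores and the radiologists’ assessments.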
“The promising results in this study show that artificial intelligence has a strong potential to be an effective tool for supporting frontline healthcare workers in their decisions and improving clinical efficiency,” said Wong. He said this is especially important given how much stress the ongoing pandemic has placed on healthcare systems worldwide.
The technology behind it
For the research, 396 chest X-rays (CXRs) from 267 patients aged 12 to 88 were used. They were obtained using a range of X-ray imaging equipment types and acquisition protocols, such as supine and upright positioning and posterior-anterior and anterior-posterior views, said the research paper.
A data augmentation technique was used to improve the performance of the deep neural networks. This meant synthesising new training samples by applying randomly generated translations, intensity shifts, rotations, horizontal flips, zooms, cutout, and Gaussian noise to the CXR data in the training set. The idea was to increase data diversity and improve the robustness of the deep neural networks.
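A minimal sketch of this kind of augmentation on a dummy image, using NumPy. The parameter ranges are illustrative assumptions rather than the study’s actual values, and rotations, translations, and zooms are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img):
    """Apply a random flip, intensity shift, cutout, and Gaussian noise to one image."""
    out = img.astype(float).copy()
    if rng.random() < 0.5:                          # random horizontal flip
        out = out[:, ::-1]
    out += rng.uniform(-0.1, 0.1)                   # random intensity shift
    h, w = out.shape
    y, x = rng.integers(0, h - 8), rng.integers(0, w - 8)
    out[y:y + 8, x:x + 8] = 0.0                     # cutout: blank an 8x8 patch
    out += rng.normal(0.0, 0.01, out.shape)         # additive Gaussian noise
    return np.clip(out, 0.0, 1.0)

cxr = rng.random((480, 480))                        # dummy 480x480 "CXR"
batch = np.stack([augment(cxr) for _ in range(4)])  # 4 synthetic variants
print(batch.shape)  # (4, 480, 480)
```

Each call produces a slightly different variant of the same image, which is how augmentation increases the effective diversity of a small training set.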
The deep neural networks were trained using the Adam optimiser with an image size of 480×480, a batch size of 32, a learning rate of 4e-4, 30 epochs, and mean squared error as the loss function. All model development was conducted in Python using OpenCV and the Keras deep learning library with a TensorFlow backend, said the report.
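To illustrate the optimiser and loss function named above, here is a generic NumPy sketch of the Adam update rule minimising a mean-squared-error objective on a toy one-parameter problem. This is not the study’s Keras code; everything except the 4e-4 learning rate is an assumption made for illustration:

```python
import numpy as np

def adam_mse(x, y, lr=4e-4, beta1=0.9, beta2=0.999, eps=1e-8, steps=20000):
    """Fit y ~ w*x by minimising mean squared error with Adam updates."""
    w, m, v = 0.0, 0.0, 0.0
    for t in range(1, steps + 1):
        grad = np.mean(2 * (w * x - y) * x)       # d(MSE)/dw
        m = beta1 * m + (1 - beta1) * grad        # first-moment (momentum) estimate
        v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
        m_hat = m / (1 - beta1 ** t)              # bias-corrected moments
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)  # Adam parameter update
    return w

x = np.array([1.0, 2.0, 3.0, 4.0])
w = adam_mse(x, 3.0 * x)   # true slope is 3.0
print(round(w, 2))
```

The same update rule, applied to millions of weights instead of one, is what Keras performs under the hood when a model is compiled with Adam and an MSE loss.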
During the research, stratified Monte Carlo cross-validation was conducted to gauge the efficacy of the COVID-Net S deep neural networks developed for computer-aided severity scoring of SARS-CoV-2 lung disease. One hundred deep neural networks were trained, 50 for geographic extent and 50 for opacity extent, using 100 different random subsets of the CXR data from the study.
For each trial, a random subset consisting of 80% of the CXR data was used to train a deep neural network, and the remaining 20% was held out for testing, so each network was evaluated independently on its own held-out subset.
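The repeated random 80/20 splitting described above is the essence of Monte Carlo cross-validation. A minimal sketch of the split generation, omitting the stratification step (which would additionally balance severity scores across splits):

```python
import random

def monte_carlo_splits(n_samples, n_trials, train_frac=0.8, seed=42):
    """Yield (train, test) index lists for repeated random hold-out trials."""
    rng = random.Random(seed)
    indices = list(range(n_samples))
    for _ in range(n_trials):
        rng.shuffle(indices)
        cut = int(train_frac * n_samples)
        yield indices[:cut], indices[cut:]   # 80% train / 20% test

# 50 trials over the study's 396 CXRs, as in the per-indicator setup
splits = list(monte_carlo_splits(n_samples=396, n_trials=50))
train, test = splits[0]
print(len(train), len(test))  # 316 80
```

Unlike k-fold cross-validation, the random splits here may overlap between trials; averaging results over many such trials gives a robust estimate of generalisation performance.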
The researchers noted that in SARS-CoV-2 patient cases with low lung severity, signs such as ground-glass opacity or lung consolidation can be very subtle, making them difficult for radiologists to identify visually.
The study evaluates the potential of computer-aided decision-making for SARS-CoV-2, demonstrating the ability of deep learning systems to learn and identify such subtle imaging features in CXRs for SARS-CoV-2 severity assessment.