Artificial intelligence and machine learning are seeing rapid adoption across the healthcare industry. Since the beginning of the pandemic, the medical imaging space has been leveraging these technologies for the rapid detection of COVID-19 among patients.
Even before the pandemic set in, algorithms had been developed to detect chest-related conditions, including tuberculosis and lung cancer. However, these systems were typically trained to flag one specific condition, limiting their usefulness in general clinical settings where multiple abnormalities may be present. For instance, a system meant to detect pneumothorax cannot be expected to highlight nodules suggestive of cancer. An algorithm that flags X-rays containing any sort of abnormality could significantly speed up the identification of disease. However, developing such a classifier is extremely challenging owing to the sheer variety of possible abnormal findings.
Earlier, Google researchers presented a model to distinguish between normal and abnormal chest X-rays across datasets and settings. Recently, Google released a set of radiologists’ labels for the test set used in that study, drawn from the publicly available ChestX-ray14 dataset.
How does the model work?
Google’s deep learning system is based on the EfficientNet-B7 architecture, pre-trained on ImageNet. The researchers trained the model on over two lakh de-identified chest X-rays from India’s Apollo Hospitals. Each X-ray was assigned one of two labels, ‘normal’ or ‘abnormal’, derived by applying natural language processing to the corresponding radiology reports.
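The report-based labelling step can be pictured with a toy rule-based sketch. Google’s actual NLP labelling system is not public; the keyword list and negation rule below are purely illustrative assumptions:

```python
import re

# Illustrative keyword rules standing in for the real NLP pipeline
# (the actual report-labelling system used in the study is not public).
ABNORMAL_TERMS = re.compile(
    r"opacity|consolidation|effusion|nodule|pneumothorax|cardiomegaly",
    re.IGNORECASE,
)
NEGATION = re.compile(r"\bno\b|\bwithout\b|\bnegative for\b", re.IGNORECASE)

def label_report(report: str) -> str:
    """Map a free-text radiology report to 'normal' or 'abnormal'."""
    for sentence in report.split("."):
        # Flag the study if any sentence mentions an abnormal finding
        # that is not negated within the same sentence.
        if ABNORMAL_TERMS.search(sentence) and not NEGATION.search(sentence):
            return "abnormal"
    return "normal"

print(label_report("Large right pleural effusion. Heart size normal."))  # abnormal
print(label_report("No focal consolidation. Lungs are clear."))          # normal
```

Real systems handle negation, uncertainty and anatomy far more robustly than this sentence-level keyword match, but the input–output contract — free text in, a binary training label out — is the same.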
To check whether the system generalises to new patient populations, the researchers evaluated its performance on two datasets containing varied abnormalities — the test split of the Apollo Hospitals dataset (DS-1) and the publicly available ChestX-ray14 dataset (CXR-14). US board-certified radiologists provided the reference labels. The system achieved areas under the receiver operating characteristic curve (AUCs) of 0.87 on DS-1 and 0.94 on CXR-14 (higher is better).
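The AUC figures quoted here can be read via the rank-statistic form of the metric: the AUC equals the probability that a randomly chosen abnormal case receives a higher model score than a randomly chosen normal one. A minimal sketch, using made-up labels and scores:

```python
def auc(labels, scores):
    """ROC AUC via the rank (Mann-Whitney) form: the probability that a
    random positive (abnormal) case outscores a random negative (normal) one,
    counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical outputs: 1 = abnormal, 0 = normal; abnormal cases mostly score higher.
labels = [0, 0, 1, 1, 0, 1]
scores = [0.1, 0.4, 0.35, 0.8, 0.2, 0.9]
print(round(auc(labels, scores), 2))  # → 0.89
```

An AUC of 0.5 corresponds to random guessing and 1.0 to perfect separation, which is why 0.87 and 0.94 indicate strong discrimination between normal and abnormal studies.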
Although these evaluation sets contained varied abnormalities, a further use case is applying the model to novel and unforeseen diseases. To evaluate how well the system generalises to new patient populations and to diseases absent from the training set, the researchers used de-identified datasets from three different countries: two publicly available tuberculosis datasets and two COVID-19 datasets from Northwestern Medicine. The system reported AUCs of 0.86 for the COVID-19 data and 0.91 and 0.93 for the two tuberculosis datasets.
Source: Google Blog
According to the Google researchers, the lower performance on the COVID-19 data stems from the system flagging as ‘abnormal’ cases that were negative for COVID-19 but contained other abnormalities. On review, many of these ‘positive’ calls did correspond to abnormal chest X-ray findings, highlighting the usefulness of such general abnormality detectors.
Source: Google Blog
Potential Use Cases
To understand how the deep learning system could improve clinical workflows, the researchers simulated its use for case prioritisation, where cases the model flagged as abnormal were pushed ahead of normal cases in the reading queue. In these simulations, the system reduced the turnaround time for abnormal cases by up to 28 per cent. Such reprioritisation could divert complex abnormal cases that may require urgent decision-making to the front of the queue, while negative chest X-rays are batched for streamlined review.
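The reprioritisation idea can be sketched as a simple two-level priority queue: model-flagged abnormal cases jump ahead, while arrival order is preserved within each group. The case IDs and flags below are hypothetical:

```python
import heapq
from itertools import count

def reprioritise(cases, flags):
    """Order a radiology worklist so model-flagged abnormal cases are read
    first, preserving first-come-first-served order within each group."""
    order = count()  # tie-breaker: arrival order within a priority level
    heap = [(0 if abnormal else 1, next(order), case)
            for case, abnormal in zip(cases, flags)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

# Hypothetical worklist: cases arrive in order A..E; B and D are flagged abnormal.
print(reprioritise(["A", "B", "C", "D", "E"],
                   [False, True, False, True, False]))
# → ['B', 'D', 'A', 'C', 'E']
```

Flagged cases reach a radiologist sooner without being dropped from the queue — the mechanism behind the simulated reduction in turnaround time for abnormal cases.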
Source: Google Blog
While much work remains to realise the full capabilities of machine learning in medical image interpretation, Google’s researchers have so far only scratched the surface of what AI could do in healthcare. They also observed that the system could be used to pre-train models and thereby improve other machine learning classifiers for chest X-rays, particularly where labelled data is limited.
India’s AI and Robotics Technology Park, in collaboration with health tech startup NIRAMAI Health Analytics and the Indian Institute of Science, has also developed Xray Setu, an AI-driven chest X-ray interpretation solution.