Computer-aided detection is now a significant part of radiology: every few months, we hear about a new algorithm trained to detect pathologies in medical images, such as tumours, lesions and fractures, much like practising radiologists do. The hype has grown so great that it has led to speculation about radiologists losing their jobs to artificial intelligence.
In India, AI has already waded into commercial diagnostics, with Aravind Eye Care hospitals deploying Google's deep learning application to identify diabetic retinopathy in patients. It is the same deep learning technique that Google's image search applications leverage to detect dogs and cats in pictures. Atom360, a Bengaluru-based startup that we spoke to, focuses on AI-powered diagnostic tools to counter tobacco-related deaths in India. The six-member startup is working on deep learning applications for tumour segmentation, targeted at radiologists who can use them to segment tumours faster and more efficiently.
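To make "segmenting tumours more efficiently" concrete: segmentation models output a binary mask over the image, and their quality is conventionally scored against a radiologist's hand-drawn mask with the Dice coefficient. The sketch below is a generic illustration of that metric, not Atom360's actual code; the function names and toy masks are ours.

```python
# Illustrative sketch: the Dice coefficient, a standard metric for
# comparing a predicted tumour segmentation mask against a
# radiologist's ground-truth mask (1.0 = perfect overlap, 0.0 = none).
# Generic example for exposition; not any vendor's actual code.

def dice_coefficient(pred, truth):
    """Dice score between two binary masks given as flat lists of 0/1."""
    assert len(pred) == len(truth), "masks must be the same size"
    intersection = sum(p * t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    if total == 0:  # both masks empty: treat as perfect agreement
        return 1.0
    return 2.0 * intersection / total

# Toy 3x3 masks flattened to lists: the prediction overlaps the
# ground truth on 2 of the 3 marked tumour pixels.
predicted = [0, 1, 1, 0, 1, 0, 0, 0, 0]
ground    = [0, 1, 1, 0, 0, 0, 0, 1, 0]
print(round(dice_coefficient(predicted, ground), 3))  # 0.667
```

A model that speeds up radiologists is typically tuned to maximise this overlap score on a held-out set of expert-annotated scans.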
What Role Does AI Play In Radiology?
With AI playing a major role in medicine and deep learning algorithms expediting the analysis of medical images, radiologists, especially in the West, have gone into panic mode. Part of the blame lies with leading AI researchers Andrew Ng and Geoffrey Hinton, who believe the profession is dying. According to experts from the medical community, radiology as a field lends itself well to technological development. Even before the advent of deep learning, radiology saw rapid technological advances, from X-ray and CT scanners to MRI scanners.
Today, these are routine examinations in medical centres across the globe. In a similar vein, radiologists would certainly benefit from AI-powered systems that can read and interpret multiple images quickly, because the number of images has grown much faster over the last decade than the number of radiologists. Especially in countries like India, where the doctor-patient ratio is reported to be 1:921, deep learning algorithms can help radiologists assess cases faster.
Lily Peng, product manager at Google Brain AI research, once said, "India is one of the many places around the world where a lack of ophthalmologists means many diabetics don't get the recommended annual screening for diabetic retinopathy." Automated DR screening methods with high accuracy have strong potential to help doctors evaluate more patients and quickly route those who need help to a specialist. For Aravind Eye Care hospitals, Google researchers worked closely with doctors in India and the US to create a development dataset of 128,000 images, each evaluated by three to seven ophthalmologists from a panel of 54. This dataset was used to train a deep neural network to detect referable diabetic retinopathy.
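Since each image was graded by several ophthalmologists, the grades must be combined into a single training label. The article does not describe Google's exact aggregation scheme, so the sketch below assumes a simple majority vote with ties broken toward the more severe grade, and uses the conventional definition of "referable" DR as moderate or worse on a 0-4 severity scale; both are our assumptions for illustration.

```python
# Hedged sketch: turning 3-7 ophthalmologist grades per fundus image
# into one training label. Majority vote with severity tie-breaking is
# an illustrative assumption, not Google's documented protocol.
from collections import Counter

def aggregate_grades(grades):
    """Return the most common grade; ties go to the more severe grade."""
    counts = Counter(grades)
    grade, _ = max(counts.items(), key=lambda kv: (kv[1], kv[0]))
    return grade

def is_referable(grade):
    """Referable DR is conventionally grade >= 2 (moderate or worse)
    on the 0-4 international severity scale -- an assumption here."""
    return grade >= 2

# Three graders disagree on one image; the majority grade is 2,
# so the image is labelled referable for training.
label = aggregate_grades([2, 2, 1])
print(label)                 # 2
print(is_referable(label))   # True
```

The binary referable/non-referable label is what the deep neural network is then trained to predict from the raw image.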
Here’s Why AI Won’t Replace Radiologists
Deep Learning Can Only Solve A Common Set Of Problems: Medical experts emphasise that deep learning algorithms can solve only a narrow set of problems, and only after the models have been trained on millions of images to arrive at the right results. In the medical field, there is a long tail of problems that can only be detected by radiologists. Currently, the few algorithms available are highly specific, each targeting a single problem.
Challenges Abound In Building Discovery And Analytics Platforms: A senior executive from the American tech consulting firm Booz Allen Hamilton revealed that there are many challenges in developing discovery and analytics platforms, from acquiring and ingesting data to annotation, storage, governance and policy, and the types of analysis the platform enables. Data annotation and facilitating discovery across the datasets on a platform have been cited as among the biggest challenges.
Deep Learning Will Add To Radiologists' Skill And Effort: One of the biggest hurdles researchers face in the medical imaging field is building highly accurate algorithms that can have significant clinical impact. The adoption of AI and medical image interpretation algorithms would speed up results and augment, rather than replace, the skill and effort of radiologists.
Corporate Players Jump On AI Bandwagon
The first company to make a splash in the medical AI field was IBM, with its much-touted Watson for Oncology application, which provides evidence-backed cancer care. The solution has already been deployed at India's Manipal Hospitals to much success. According to IBM, in a double-blinded study presented at the San Antonio Breast Cancer Symposium, doctors at Manipal Hospitals found that Watson's recommendations were consistent with those of the tumour board in 90 per cent of breast cancer cases.
It's not just IBM: IT leaders with dedicated healthcare arms, such as GE, Philips and Siemens, have also started building AI into their medical imaging software systems. Reports indicate that GE is developing predictive analytics software with AI technology, and a slew of startups, such as the India- and US-based Qure.ai, are developing deep learning algorithms specifically to interpret radiology images. Qure.ai's algorithms have also been presented for critical review at scientific conferences.