AI has made radiology more efficient and faster, and improved the quality of care: Prashant Warier, qure.ai – ET HealthWorld

Shahid Akhter, editor, ETHealthworld, spoke to Prashant Warier, Co-founder & CEO, qure.ai, about the new wave of AI algorithms that are interpreting radiology scans.

AI in Healthcare (Radiology)
If you look at the data being collected today, there is an enormous amount of it about each of us. We are conducting many diagnostic tests, there is a vast amount of information on the Internet, and our phones, Fitbits, and other devices are collecting health information. It is far more data about individuals than any human being can process. That is where AI comes in: it can process huge amounts of data in a very short time. So there is a clear need for AI in healthcare, and specifically in radiology, which is the domain we operate in.

If you look at, for example, the US, they have about 35,000 to 40,000 radiologists for a population of about 300 million people. India has about 1.4 billion people, and we have roughly one-third the number of radiologists that the US has. What that means is that radiologists are simply not available in rural parts of the country to read scans, so most of those places are underserved. Reports do not come back on time; patients wait for their radiology reports and are not treated on time. That is where AI can come in and improve the quality of care for many underserved parts of the country.
Building AI algorithms to interpret radiology scans
When it comes to building and teaching AI algorithms to interpret radiology scans, the first thing I'd like to point out is that it is a very difficult task. Identifying an everyday object, like a cup in an image, is something even a three-year-old child can do. But reading a scan, an X-ray or a CT scan, requires about 30 years of training: you have to go through high school, obtain a medical degree, and specialize in radiology before you are trained to read that scan. Now we are teaching algorithms to do that automatically, to interpret an X-ray or CT scan within seconds, and that is a very challenging task. We embarked on that difficult task about seven years ago, and not only us, but multiple organizations embarked on that journey to use AI to make radiology more efficient, make radiology reporting faster, and improve the quality of care.

Building AI algorithms to interpret radiology scans: Challenges
Regarding the challenges in building something like this: when you are building AI for everyday images, there are millions, probably billions, of such images on the Internet. There are many places you can download them from and train these algorithms. Getting access to radiology data, however, is very hard. That is something we worked on significantly in the first couple of years at Qure.ai, gaining access to a large amount of data so we could train high-quality algorithms. The second challenge is that, as I mentioned, detecting something abnormal on a scan is an extremely difficult task that requires a significant amount of training. When you look at a scan, maybe only four or five pixels are not right, such as a small nodule, and that is what makes the scan abnormal. In everyday images you have a large dog or cat, and you can identify it easily; in medical imaging, a few pixels being different is what makes something abnormal. The problem is much harder because you are trying to find differences in a small part of the image, and that increases the difficulty of training these algorithms.
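
To make the task concrete, below is a minimal, hypothetical sketch (in PyTorch) of the kind of classifier being described: a standard convolutional network adapted for single-channel X-ray images with a single "abnormal" output, fed relatively high-resolution inputs so that findings only a few pixels wide are not lost to downsampling. The architecture (ResNet-18 from torchvision), the input size, and the training step are illustrative assumptions, not Qure.ai's actual models.

# Hypothetical sketch (PyTorch), not Qure.ai's model: a binary "normal vs. abnormal"
# chest X-ray classifier, illustrating the task described above.
import torch
import torch.nn as nn
from torchvision import models

# Start from a standard CNN and adapt it to radiology inputs.
model = models.resnet18(weights=None)
model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)  # 1-channel (grayscale) X-ray input
model.fc = nn.Linear(model.fc.in_features, 1)                                    # single logit: "is this scan abnormal?"

# Keep the input resolution high: an abnormality such as a small nodule may span only a few pixels.
images = torch.randn(4, 1, 1024, 1024)        # dummy batch of 4 grayscale X-rays
labels = torch.tensor([0.0, 1.0, 0.0, 1.0])   # 1 = abnormal, 0 = normal

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

logits = model(images).squeeze(1)             # one training step on the dummy batch
loss = criterion(logits, labels)
loss.backward()
optimizer.step()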

AI in radiology: Opportunities
When I think about opportunities for AI in radiology, I think about tuberculosis. Tuberculosis is still a massive problem in the low- and middle-income countries of the world. There are about 11 million patients every year in these geographies, typically in some of the poorest parts of the world with the least access to care. What is missing in all these places is access to radiologists. Specifically, I'll talk about one of our customers and how they operated a few years back.

This is a program in the Philippines that runs mobile vans conducting community screenings in multiple parts of the country. The vans would start from Manila, go out into communities in different parts of the country to screen people, travel around for a few weeks, and come back to Manila. Each van has an X-ray system inside, so people can come in, take an X-ray, and get screened. Unfortunately, there was no radiologist in the van. So patients would take an X-ray and only get a report once the van went back to Manila. It took about two to three weeks, sometimes even four or five weeks, to get an X-ray interpreted. You're talking about a TB patient who is spreading TB to others while they're waiting for their diagnosis to come through. That's where AI can play a role: AI can interpret that scan faster and enable quicker intervention and treatment for that patient. That's one opportunity.

A second area of opportunity is stroke. Patients suspected of having a stroke typically have to wait for a CT scan or a CT angiogram, and then wait hours more for that scan to be read by a radiologist. In that time, the patient is losing brain cells. So, can we get that scan interpreted faster so that treatment for the stroke patient can start sooner? The third opportunity is around lung cancer: can you diagnose lung cancer early using AI? There are multiple opportunities for AI in radiology.

Building qure.ai
I have been a practitioner of data science and AI for the last 23 years, even before there was a field called data science. I've been building algorithms for optimization and prediction for a long time, across fields from trucking to retail, fashion, and advertising. With Qure.ai in 2016, there was an opportunity to apply AI to radiology imaging. Unlike my past work, which involved making people click on more ads or buy more products, this was a much more altruistic use case. I was quite excited about building Qure back in 2016.

When we started, I spoke about some of the problems that we identified. We asked, “Can we interpret chest X-rays? Can we interpret head CT scans to diagnose tuberculosis, stroke, or lung cancer?” So we began with these use cases. Typically, when you build AI, the first thing you have to do is gather a lot of data because without data, any AI you develop won’t be very accurate. So our first order of business was to compile a large database of images. In fact, you only need anonymized data; you don’t need any patient identifiers to train these algorithms. So, we collected a significant amount of data.
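
As an aside on the anonymization step mentioned above, the sketch below shows one hypothetical way to strip patient identifiers from a DICOM image using the open-source pydicom library before adding it to a training dataset. The file names and the specific attributes cleared are illustrative assumptions; a real de-identification pipeline would follow the DICOM standard's confidentiality profiles rather than this short list.

# Hypothetical sketch: de-identifying a DICOM file with pydicom so that only
# anonymized pixel data and technical metadata are kept for model training.
import pydicom

ds = pydicom.dcmread("chest_xray.dcm")   # assumed input file name

# Clear common identifying attributes (an illustrative subset, not a complete profile).
ds.PatientName = "ANONYMIZED"
ds.PatientID = ""
ds.PatientBirthDate = ""
ds.ReferringPhysicianName = ""
ds.InstitutionName = ""

ds.remove_private_tags()                 # drop vendor-specific private tags
ds.save_as("chest_xray_anon.dcm")        # anonymized copy goes into the training database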

We built these algorithms and collaborated with some of the top academic institutions in the world, from Harvard Medical School to Erasmus University to the Mayo Clinic. We validated the efficacy of our algorithms and published the results. Then we went about obtaining the necessary regulatory clearances to deploy these solutions in the field.

Even though what we built is only software, AI software that takes a radiology image as input and produces a report as output, it still classifies as a medical device. It is software classified as a medical device, and you have to obtain regulatory clearance in multiple geographies: FDA clearance in the US, a CE marking in Europe, or, in India, CDSCO clearance. We pursued those clearances before deploying these solutions in various regions.

Returning to the problems I mentioned, where the opportunity existed for AI in healthcare: if you look at tuberculosis now, we are deployed across 30 to 35 countries, at approximately 700 to 800 sites, screening for tuberculosis. We provide an automated report to a screening camp within a minute of the X-ray being taken. Compare that to the two to three weeks, sometimes more, I spoke about earlier; this now happens within a minute, thanks to AI. Patients can come in, receive an X-ray, and get a report from AI on the same day, leading to faster treatment. All of this happened because we built the AI and the WHO endorsed its use, stating that it can be a full replacement for human readers in these constrained settings. That endorsement helped us scale and truly make an impact with our AI products.

Published On Oct 9, 2023 at 06:25 AM IST
