
Breakthroughs, Briefly

Hacking Pneumonia: A New Lead on Diagnosing It

When computer scientists and radiologists team up, you get faster, better results.

Radiologist Matthew Lungren, left, meets with graduate students Jeremy Irvin and Pranav Rajpurkar to discuss the results of detections made by their CheXNet algorithm. A tool the researchers developed along with the algorithm produced these images, which are similar to heat maps and show the areas of the X-ray most indicative of pneumonia. (Photo: Linda A. Cicero/Stanford News Service)

By Diana Aguilera

Whose medical diagnosis would you trust more: an experienced doctor’s or an algorithm’s? It turns out that — at least when it comes to pneumonia — artificial intelligence can decode chest X-rays more accurately than radiologists can.

We can use the help. According to the CDC, pneumonia sends 1 million people to U.S. hospitals each year. The lung infection can be tough to spot, and roughly 50,000 people in the United States die from it annually.

The project to create a machine-learning diagnostic tool began with a large data set released by the National Institutes of Health: more than 100,000 frontal-view chest X-rays, each labeled with any of 14 possible diagnoses. The NIH included some preliminary algorithms for detecting the illnesses and asked for help advancing them.

A group of Stanford computer scientists teamed up with assistant professor of radiology Matthew Lungren to set about the task. They enlisted four Stanford radiologists to analyze 420 of the images for indications of pneumonia, which served as a baseline for diagnostic performance. The computer scientists, meanwhile, designed CheXNet, an algorithm that learned in about a week’s time to identify 10 of the 14 diagnoses in the original data set more accurately than previous algorithms had been able to. After a month, CheXNet was ahead in all 14 categories. Moreover, it consistently outperformed the four radiologists in diagnosing pneumonia. The group published its findings on the open-access scientific preprint website arXiv in November. Graduate students Pranav Rajpurkar, ’16, and Jeremy Irvin were its co-lead authors.
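For readers curious about the mechanics, the task the team tackled is what machine-learning researchers call multi-label image classification: a deep convolutional network takes a chest X-ray as input and produces an independent probability for each of the 14 possible diagnoses. The sketch below is purely illustrative, not the researchers' actual code; the choice of a DenseNet-121 follows the arXiv preprint's description of CheXNet, while the hyperparameters and function names here are placeholders.

import torch
import torch.nn as nn
from torchvision import models

# Illustrative sketch: a DenseNet-121 adapted for 14-way multi-label
# chest X-ray classification (each image can carry several findings).
NUM_FINDINGS = 14  # the 14 diagnoses labeled in the NIH data set

model = models.densenet121(weights="IMAGENET1K_V1")
model.classifier = nn.Linear(model.classifier.in_features, NUM_FINDINGS)

# Multi-label setup: an independent sigmoid per finding, trained with
# binary cross-entropy against the 0/1 labels for each diagnosis.
criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def training_step(images, labels):
    """One gradient step on a batch of X-rays and their 14 binary labels."""
    optimizer.zero_grad()
    logits = model(images)            # shape: (batch, 14)
    loss = criterion(logits, labels)  # labels: float tensor of 0s and 1s
    loss.backward()
    optimizer.step()
    return loss.item()

def predict(images):
    """Per-finding probabilities; pneumonia is one of the 14 outputs."""
    with torch.no_grad():
        return torch.sigmoid(model(images))

The heat-map-like images in the photo above come from a related visualization step: after the network scores an X-ray, the regions that most influenced the pneumonia score are highlighted and overlaid on the original image.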

“Just from a pure imaging standpoint, it’s an outstanding feat that a machine with only a few weeks of work can do as well as a radiologist with 30 years of experience,” Lungren says.

The study has some limitations: It did not involve lateral-view chest X-rays or patients’ medical histories, both of which can improve the accuracy of diagnoses. Still, Lungren says, the algorithm can speed diagnosis, reduce human error and aid patients in places that lack medical experts.

“This is potentially a game changer for health globally,” he says. 


Diana Aguilera is a STANFORD staff writer.

 
