Imaging Informatics: AI Speeds Brain Hemorrhage Detection
By Dave Yeager
Vol. 19 No. 2 P. 6
As computer scientists and radiologists work to convert the promise of artificial intelligence (AI) into actual problem solving, the pool of algorithms keeps growing. A new entry from ParallelDots, Inc, a company working in the AI space, looks for signs of brain hemorrhage on CT scans. Its developers claim that its precision is similar to that of a radiologist and its recall is as good or better. The algorithm, called Recurrent Attention DenseNet (RADNet), is being tested in two hospitals in India, where it analyzes every head CT scan performed; if it detects a brain hemorrhage, it alerts a radiologist.
"We have made a chatbot that sends a message to the radiologist whenever hemorrhage is detected in a CT scan, along with a small video of the scan," says Angam Parashar, CEO and cofounder of ParallelDots. "This message is received by the radiologists within two minutes after the CT scan is completed."
Parashar says initial feedback has been positive, and radiology departments have found the tool particularly useful during night shifts, when senior radiologists are typically not available. The hospitals' radiologists have also found it helpful for producing quicker reports on emergency cases. Although the algorithm has not yet been submitted to the FDA, Parashar expects to submit it in the first half of 2018.
What's the Score?
ParallelDots chose to focus on head trauma because of the need to identify brain hemorrhages quickly, Parashar says. RADNet analyzes CT scans much as radiologists do: by analyzing a series of 2D cross-sectional slices with emphasis on potential hemorrhagic regions and using 3D context from neighboring slices to make predictions about the likelihood of hemorrhage. The algorithm is able to show which areas of a slice it thinks are important. More than 300,000 CT slices were tagged and annotated to train it.
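The slice-plus-context idea described above can be sketched in a highly simplified form. This is not ParallelDots' implementation: the 2D scorer is stubbed with a toy intensity cue, and the recurrent pass over neighboring slices is stood in for by a simple moving average. All function names and thresholds here are illustrative assumptions.

```python
import numpy as np

def score_slice_2d(slice_2d):
    # Placeholder for a 2D model with attention over hemorrhagic
    # regions; here, just a toy mean-intensity cue.
    return float(slice_2d.mean())

def add_3d_context(scores, window=1):
    # Stand-in for the recurrent layer: blend each slice's score with
    # its neighbors so isolated spurious responses are damped and
    # findings spanning adjacent slices are reinforced.
    padded = np.pad(scores, window, mode="edge")
    return np.array([padded[i:i + 2 * window + 1].mean()
                     for i in range(len(scores))])

def predict_scan(slices, threshold=0.5):
    # Score each 2D slice, add 3D context, then call the whole scan
    # positive if any contextual slice score clears the threshold.
    per_slice = np.array([score_slice_2d(s) for s in slices])
    return bool(add_3d_context(per_slice).max() > threshold)

# Toy "scan": 10 slices, with elevated intensity on slices 4-6.
rng = np.random.default_rng(0)
scan = [rng.random((8, 8)) * 0.4 for _ in range(10)]
for i in (4, 5, 6):
    scan[i] += 0.5
print(predict_scan(scan))  # prints True
```

In the real system, the per-slice scorer would be a trained DenseNet-style network, and the contextual pass a learned recurrent layer rather than an average, but the division of labor between 2D analysis and 3D context is the same.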
The developers compared RADNet's accuracy with that of three senior radiologists on a data set of 77 brain CTs. The algorithm's accuracy at predicting hemorrhage was 81.82%, comparable to the radiologists'. Its precision score (the number of correct positive predictions divided by all of the samples that were marked as positive) was 81.25%, not far behind the radiologists. Its recall score (the number of correct positive predictions divided by all of the samples that should have been marked positive) was 88.64%, better than two of the radiologists and equal to the third. Overall, its F1 score (the harmonic mean of the precision and recall scores) was 84.78%, better than two of the radiologists. ParallelDots' study has been accepted as a conference paper at the IEEE International Symposium on Biomedical Imaging (ISBI) 2018.
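The reported figures fit together arithmetically. The confusion-matrix counts below (39 true positives, 9 false positives, 5 false negatives, 24 true negatives on the 77 scans) are an assumption chosen to reproduce the published percentages, not data quoted from the study:

```python
def precision(tp, fp):
    # correct positive predictions / all samples marked positive
    return tp / (tp + fp)

def recall(tp, fn):
    # correct positive predictions / all samples that should be positive
    return tp / (tp + fn)

def f1(p, r):
    # harmonic mean of precision and recall
    return 2 * p * r / (p + r)

# Hypothetical counts consistent with the reported scores.
tp, fp, fn, tn = 39, 9, 5, 24
p = precision(tp, fp)               # 39/48  = 81.25%
r = recall(tp, fn)                  # 39/44  ≈ 88.64%
acc = (tp + tn) / (tp + fp + fn + tn)  # 63/77 ≈ 81.82%
print(f"precision={p:.2%} recall={r:.2%} "
      f"F1={f1(p, r):.2%} accuracy={acc:.2%}")
```

Note that the harmonic mean penalizes imbalance between precision and recall: here it yields 84.78%, slightly below the simple average of the two scores.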
Parashar says he doesn't view RADNet as a replacement for radiologists; rather, he envisions it as a type of decision support. Going forward, he believes the algorithm can be a useful tool for radiologists once it's incorporated into their workflows.
"As the technology matures, I see the radiologists relying on our tool for second opinions, carefully considering the recommendation of the machine before making the diagnostic decision," Parashar says. "It will help them provide faster and better patient care to their patients.
"Another very important area where I see the AI technology being used is in rural/semiurban areas where there is an acute shortage of radiologists," Parashar continues. "This could have a huge impact at such places and can potentially save many human lives."
Although he believes RADNet holds great potential as an emergency diagnosis tool, Parashar says it requires more real-world experimentation. Further down the road, he believes there will be many AI-based diagnostic tools that can detect other conditions on brain scans.
— Dave Yeager is the editor of Radiology Today.