By Beth W. Orenstein
Vol. 19 No. 6 P. 18
AI has yet to make its mark on PET imaging, but experts say it will save time, lower dose, and improve results.
PET is used in various clinical applications, including detecting and localizing cancer and evaluating and staging recurrent malignancies. It is also widely used to detect cardiac disease and brain disorders, among other conditions. Since Time magazine named the PET-CT scanner the invention of the year in 2000, the number of PET scans performed annually has risen continuously, says Michael Graham, PhD, MD, cochair of SNMMI's Clinical Trials Network and director of nuclear medicine and the PET Center at the University of Iowa.
During that time, FDG has become the standard radiotracer for cancer patient management and neuroimaging, but new tracers are in various stages of development. The FDA recently approved a tracer for imaging prostate cancer, the second most common cancer in men worldwide, and "an even better one is in clinical trial," Graham says. With new tracers for various applications nearing market, "we're on the verge of another surge of PET studies," he adds. However, because reading PET scans is time consuming and physicians are under pressure to spend their time productively, it is difficult for nuclear radiologists to keep pace with these developments. This scenario makes the buzz about AI playing a role in PET scans "most welcome," Graham says.
The use of AI in medical imaging isn't new. Vincent Gaudet, PhD, PEng, a professor in and chair of the department of electrical and computer engineering at the University of Waterloo in Waterloo, Ontario, Canada, remembers AI being a hot topic 25 years ago when he was an undergraduate. "It's had its ups and downs over the years," Gaudet says. "But it's back in the forefront lately." Gaudet attributes AI's resurgence, particularly for applications in medicine, including PET imaging, to improvements in computing hardware.
"I think a lot of the advances can be traced back to advances on the electronic side, the chip development side," Gaudet says. "The computer architectures of today have really evolved. Today, we're able to pack many more transistors onto a chip so you can do more processing close together and you can run more complex algorithms effectively with the same cost."
Mammography was the first area where AI made an impact; AI applications for PET are not nearly as advanced, Graham says. "We don't have any software yet that can look through the PET images and put a bunch of little arrows on them that, in essence, say, 'these are probable areas of abnormalities that you might consider,' like we can with mammograms. We're nowhere close to that in nuclear medicine." Experts say, however, that current research could lead someday, perhaps sooner rather than later, to using AI in PET applications.
Many predict that AI could play a role in acquiring better PET images, reducing dose when acquiring those images, and assisting readers in making diagnoses and tracking treatment success from the images. Katherine Zukotynski, BASc, MD, FRCPC, president of SNMMI's PET Center of Excellence and an associate professor in the departments of medicine and radiology at McMaster University in Hamilton, Ontario, Canada, sees AI assisting nuclear medicine physicians in the reading room. Ultimately, she says, "The computer could go through the imaging data and provide useful information to help physicians come to their impressions more rapidly."
Eliot Siegel, MD, FACR, FSIIM, a professor in and vice chair of the department of diagnostic radiology at the University of Maryland School of Medicine, as well as chief of radiology and nuclear medicine for the VA Maryland Healthcare System in Baltimore, agrees that advances in machine learning technology have made it easier to create algorithms directly from data. "It doesn't take as long as it had in the past to create new algorithms for PET image optimization, detection of abnormalities, or diagnosis," Siegel says. As a result, "an increasing number and percentage of PET-related algorithms are being developed using deep learning, which is based on convolutional neural networks," a highly effective means of recognizing and classifying images.
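The building block of the convolutional networks Siegel mentions is a learned filter slid across an image. A minimal sketch of that operation (a toy example with a hand-picked edge filter, not any vendor's actual algorithm):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D sliding-window filter (cross-correlation, as deep
    learning libraries implement it) -- the core operation a CNN stacks
    and learns in order to recognize image features."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel responds strongly where intensity changes
image = np.zeros((5, 6))
image[:, 3:] = 1.0                      # dark-to-bright edge at column 3
edge_kernel = np.array([[-1.0, 1.0]])   # hypothetical hand-set filter
response = conv2d(image, edge_kernel)   # peaks along the edge
```

In a real CNN, thousands of such kernels are learned from labeled scans rather than set by hand, which is why large training data sets matter.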
One area where AI could play an important role is in helping to acquire better quality images, Siegel says. For example, PET studies are often used to track a lymphoma patient's progress; currently, Siegel says, "the physician might select and measure uptake in a small number of index lymph nodes and follow up therapy with a repeat PET scan of those nodes. In a patient with extensive disease, these index nodes may or may not be representative of the total amount of uptake on the PET scan, but we do not have an easy way to determine total body uptake."
What if AI could help nuclear radiologists report on the patient's total tumor burden rather than just the sentinel nodes? "If we could take all the areas of abnormal FDG uptake that the machine finds and add them together, we would be able to much more directly compare what the patient's total body burden is," Siegel says. He notes that total tumor burden would likely give a much better, or at least an important additional, indication over time of whether the chosen treatments were successful.
When physicians look at the standard uptake value (SUV) of a particular tumor, it could be an outlier and not representative of the metabolic activity, Siegel adds. "When you look at SUVs, there are a lot of challenges when trying to quantify the amount of disease a patient has or the amount of change a patient has over time," he says. "Being able to identify and quantify all the abnormal areas in the body would be far superior."
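The quantification Siegel describes can be sketched in a few lines. This is a toy illustration only: the SUV formula is standard, but the fixed threshold stands in for the lesion segmentation an AI model would actually provide, and all numbers are hypothetical:

```python
import numpy as np

def suv(activity_bq_per_ml, injected_dose_bq, body_weight_g):
    """Standardized uptake value: tissue activity concentration
    normalized by injected dose per gram of body weight
    (dimensionless, assuming tissue density of 1 g/mL)."""
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

def total_lesion_burden(suv_map, voxel_volume_ml, threshold=2.5):
    """Sum uptake over all voxels above a fixed SUV threshold -- a crude
    stand-in for learned segmentation. Returns metabolic tumor volume
    (mL) and total lesion glycolysis (SUV x mL)."""
    lesion = suv_map > threshold
    mtv = lesion.sum() * voxel_volume_ml            # metabolic tumor volume
    tlg = suv_map[lesion].sum() * voxel_volume_ml   # total lesion glycolysis
    return mtv, tlg

# Toy example: a 4x4x4 SUV volume with a small "hot" region
vol = np.full((4, 4, 4), 1.0)
vol[1:3, 1:3, 1:3] = 6.0            # 8 voxels of elevated uptake
mtv, tlg = total_lesion_burden(vol, voxel_volume_ml=0.064)
```

Summing over every segmented lesion, rather than measuring a handful of index nodes, is exactly the whole-body comparison Siegel argues would be far superior.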
Austin Pantel, MD, a nuclear radiologist at the University of Pennsylvania in Philadelphia, also believes that PET images contain far more useful information than is currently mined from them. "The amount of information on the image that we actually use in our reports that go to the physicians is very little," Pantel says. While Pantel's research focuses on the kinetic analysis of dynamic PET imaging, he is certain "many features on the PET images could be exploited and used as biomarkers to guide cancer treatment."
One possibility is using AI to mine tumor heterogeneity in PET—the differences in radiotracer uptake within a tumor. PET is often used for imaging solid tumors, not only lymphoma but also head and neck tumors as well as lung tumors. These solid tumors often have a great deal of heterogeneity, Pantel says.
A tumor's heterogeneity may have important consequences for how it is diagnosed and treated. As shown in a study published in the Journal of Nuclear Medicine in December 2008, spatial heterogeneity in sarcoma FDG uptake can predict patient outcomes. Pantel says combining AI with PET to evaluate tumor heterogeneity would allow more powerful tools to be developed. "We could extract and build models for clinical predictions," he says. He expects that it could happen within the next decade. "That's what I would hope," he says.
AI may also help with registration of PET and CT scans. "Even though patients are examined on the same table, patients frequently move or shift a little in the scanner," Siegel says. As a result, "the PET image is slightly offset from the CT image." Machine learning applications have shown promise for improved registration between PET and CT scans.
Currently, humans register the separate images; however, computers could be trained on a large number of cases to learn how misaligned images are brought back into full registration, "and we would be able to automate that task," Siegel says. Automation would save a great deal of time. "That automation can come with a relatively small number of steps, whereas, when you do it by hand, it takes many steps that are very complex," Siegel says. "It turns out that a machine learning algorithm can perform those tasks directly from the images in a much simpler way that makes the development of those algorithms significantly more rapid."
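The registration problem can be illustrated with a deliberately simple sketch: a brute-force search for the integer translation that best realigns two images. Real PET/CT registration is deformable and far more complex; this toy merely shows the objective being optimized, whether by search or by a learned model:

```python
import numpy as np

def best_shift(fixed, moving, max_shift=3):
    """Find the integer (dy, dx) translation that best aligns `moving`
    to `fixed` by minimizing the sum of squared differences.
    A toy stand-in for the learned registration described above."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(moving, (dy, dx), axis=(0, 1))
            err = np.sum((fixed - shifted) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# A bright spot offset by (1, 2) between the "CT" and "PET" images
ct = np.zeros((16, 16))
ct[8, 8] = 1.0
pet = np.roll(ct, (-1, -2), axis=(0, 1))   # patient shifted slightly
offset = best_shift(ct, pet)               # recovers the misalignment
```

A machine learning approach replaces the exhaustive search with a model that predicts the transformation directly from the image pair, which is what makes the automation Siegel describes fast.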
However, it's the possibility of using AI to reduce PET dose that has Siegel the most excited. In December, researchers at Stanford University reported in arXiv, Cornell University's e-print repository, that they used deep learning to reduce PET scan dose by a factor of 200, a more than 99% dose reduction. In PET, image quality depends on the amount of radiotracer administered.
"The typical dose for a PET/CT scan, according to Image Wisely [an awareness campaign of the ACR], is approximately 8 to 30 mSv of radiation," Siegel says. "The potential to be able to reduce PET dose in half would be exciting, but the potential to reduce the dose by a factor of 200, as the researchers at Stanford did, is truly astounding."
The researchers scanned nine patients with glioblastoma using a standard dose of FDG. They waited 45 minutes and then acquired images for about 40 minutes. They stored the raw list-mode data sets (files that record the geometric values for each detected event) for each scan and then synthesized low-dose reconstructed images at a dose reduction factor of 200 by randomly selecting 0.5% of the count events, spread evenly over the acquisition period. They also reconstructed images from the full data set (a dose reduction factor of 1) and used them as ground truth to train the network to reproduce full-dose image quality from the low-dose data. The researchers showed that they could dramatically reduce not only dose but also the time it takes to scan patients, while still producing quality images.
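The subsampling step is simple to sketch. This toy uses simulated event timestamps, not the study's actual data or code; a uniformly random draw keeps the retained counts spread evenly over the acquisition, as the researchers describe:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated list-mode data: one timestamp (seconds) per detected
# coincidence event over a 40-minute acquisition (toy event count)
event_times = rng.uniform(0, 40 * 60, size=1_000_000)

def subsample_listmode(times, fraction=0.005):
    """Randomly keep `fraction` of count events -- 0.5% corresponds to
    the dose reduction factor of 200 in the Stanford experiment."""
    keep = rng.random(times.size) < fraction
    return times[keep]

low_dose_events = subsample_listmode(event_times)
# Roughly 0.5% of events survive; reconstructing from only these
# events simulates a scan acquired at 1/200th of the dose.
```

The deep learning model is then trained to map images reconstructed from the sparse events back to the full-count ground truth.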
"The implications of that are huge," Siegel says. The capability to image patients with less dose and in less time is important not only to patients but also to the technologists who perform the scans, he says. A study published in the February 2018 issue of Radiology found that technologists who perform nuclear medicine procedures have a higher rate of cataracts now than they did in the past. "That may be because there is greater radiation associated with PET scans than other nuclear medicine studies," Siegel says. "I think the potential to cut those doses to patients and technologists by a factor of as much as 200 is really intriguing."
Graham, however, considers this study to be very poorly done and believes the results are "probably incorrect."
Zukotynski sees AI eventually playing two important roles in PET: saving time and decreasing variability among readers. "It could allow us to manage the volume of scans that we have and read them more efficiently and, possibly, even better," she says.
Graham believes the reason AI has been slower coming to PET than X-ray or other imaging modalities is not specific to PET. The problem, he says, is communication among engineers, programmers, and imaging physicians. "They speak quite different languages, it turns out," says Graham, who also has degrees in electrical engineering from the Massachusetts Institute of Technology and the University of California, Berkeley. "Gradually, communication is improving, but that's a barrier that is going to continue to be a bit of a problem."
Still, Graham is optimistic that AI and PET are on the verge of many breakthroughs. "I suspect that over the next five years there are going to be a lot more applications, as some of these things emerge." Graham says he wouldn't be surprised if the number of publications on AI in PET goes up exponentially during that time.
Like Graham, Siegel believes it's still a number of years until some of the promising research involving PET and AI ends up in clinics. "But I think we'll really start seeing applications of machine learning in PET within the next five years," he says. "We can really look forward to seeing this with the major scanner and workstation vendors but, importantly, also with innovative startups. Some of the largest vendors are doing research on PET and AI, but they may not be able to move as quickly as some of the smaller vendors who are significantly more agile by virtue of their size."
Gaudet says that, as machine learning techniques continue to evolve, the amount of data needed to train algorithms in all imaging modalities, including PET, will decrease. "With advances in the theories underlying AI, we can expect to see even more applications of AI in imaging in the years to come."
— Beth W. Orenstein of Northampton, Pennsylvania, is a freelance medical writer. She is a frequent contributor to Radiology Today.