A Meeting of the Minds
By Sandra Nunn, MA, RHIA, CHP
Vol. 18 No. 3 P. 24
Artificial intelligence may one day be indispensable to radiologists.
The first report from Stanford's One Hundred Year Study on Artificial Intelligence, also known as AI100, was produced in 2016. The study is a "100-year effort to study and anticipate how the effects of artificial intelligence will ripple through every aspect of how people work, live, and play." After years of slow progress, advances in computing power have enabled remarkable gains in perception tasks such as object and speech recognition, tasks for which deep learning algorithms are achieving significant improvements in accuracy.
The Future of Work
Artificial intelligence (AI) has moved to center stage in American consciousness, in part because some of its applications may accelerate the replacement of American workers through automation. However, per the article "Hybrid Workplaces of the Future" in the Winter 2016 edition of XRDS, a publication of the Association for Computing Machinery, in both health care and education AI is likely to become the partner of the human professional, ie, an enhancer of the caregiver's or educator's execution of their professional duties. The medical professionals interviewed for this article concur with the AI100 report, which "predicts a future in which machines and humans work together, side by side, in the workplace."
Oren Etzioni, PhD, CEO of the Seattle-based Allen Institute for Artificial Intelligence, who coauthored the report, stated in the November/December 2016 issue of the MIT Technology Review that "the extreme positive hype [of AI] is wrong, and the fear-mongering is not based on any data."
In his RSNA 2016 presentation, "When Machines Think: Radiology's Next Frontier," Keith Dreyer, DO, PhD, vice chair of radiology at Massachusetts General Hospital (MGH) and an associate professor of radiology at Harvard Medical School, described the human/machine partnership envisioned above as a "centaur," the creature from Greek mythology with the head, arms, and torso of a man and the body and legs of a horse, an image of the close relationship that will need to develop in medicine between man and machine. Dreyer recently helped found the MGH Clinical Data Science Center, citing future learning in radiology as occurring at the "crossroads of radiology and the emerging field of clinical data science." He noted that "there [are] a tremendous amount of applications for AI in radiology" and that "the radiology field itself will be the foundation of precision health care." The following examples of AI applications in radiology, both in development and already implemented, support his supposition.
The Longitudinal View
Tanveer Syeda-Mahmood, PhD, an IBM fellow and chief scientist for the Medical Sieve Radiology Grand Challenge, has a visionary view of the potential of AI in radiology and other medical applications. She and her IBM colleagues are exploring the potential of AI across multiple specialties, including cardiology, oncology, ophthalmology, and radiology. Recognizing that AI is in its nascent stage, IBM is partnering with other health care entities to develop the enormous potential of these new tools.
Syeda-Mahmood enthusiastically notes the success of the Eyes of Watson, a joint venture of IBM Research and RSNA. Employing use cases to demonstrate the powerful potential of AI tools, the Eyes of Watson exhibit attracted 2,000 radiologist viewers at RSNA 2016 and 7,000 questions from practitioners about the use cases. According to Syeda-Mahmood, radiologists interacting with the AI tools uniformly reported that they would support excellent hands-on medical education. She stresses that these are "assist you" technologies and are in no way intended to replace the radiologist.
Syeda-Mahmood explains that "cognitive assistants" are trained on huge data repositories. Health care systems collaborating with IBM upload their deidentified data into a cloud-based repository. Repositories from different health care systems remain separate, but the AI tool can work across repositories to reach conclusions.
The term "longitudinal patient record" in IBM parlance refers to the cognitive assistant's view across consecutive patient visits, enabling comprehensive analysis. IBM is reaching for "temporal analysis," tools the radiologist can use in real time to interact immediately with the AI assistants. Syeda-Mahmood says workflow remains a key component; for the system to help radiologists, it must be "well trained" and include not just images but also text such as reports, associated data from the EHR, and information from expert systems. Problems she has encountered include "volume overload, which can mean missing coincidental diagnoses" and the temptation to make decisions "based on images alone."
Agfa HealthCare is another company collaborating with IBM in the AI domain. James Jay, vice president and general manager of imaging IT and integrated care solutions at Agfa, says that rather than bringing specific AI products into the health care sector, Agfa's strategy focuses on "the role of aggregator to those organizations who are using machine learning and leveraging our workflow engine to bring AI tools that are useful to physicians." Agfa is using a "semantic platform to make things searchable," including unstructured information. The idea is to "process, ingest, and codify" what is useful for referential search.
Not working exclusively with IBM, Agfa's strategy is to "screen the marketplace for innovators out there who may be building something useful, ie, something worthy of going to market," Jay says. One promising liaison is with CureMetrix, a company that will soon bring to market a product that derives a calcium score from imaging data; this product is already integrated with Agfa's imaging product. Other promising work from CureMetrix includes an AI product that will analyze imaged breast tissue with the goal of predicting, from the images alone, whether cancer will develop, possibly years before a tumor appears on imaging. This disease prevention potential is far in advance of current methodologies.
Jay says Agfa is also working with Mindshare Medical, an analytics company engaged in image analysis that leverages advanced machine learning algorithms. Agfa is looking at the software's ability to partially diagnose medical conditions by considering a patient's demographics, risk factors, and, eventually, genomics combined with population-based information. Integrated with Agfa's enterprise imaging platform, Mindshare's software produces a computer analysis of the image data that can replace the current size-based analysis (the Fleischner model) used in lung nodule diagnosis to determine the correct action. This AI technology will give the radiologist the means to suggest to the treating physician whether a biopsy is immediately necessary or can be delayed.
Combined with a workflow triggered by population information, guidance on treatment can also be suggested. Jay says, "This technology is not to be confused with computer-assisted diagnostics and not a situation of 'find the nodule' but is intended to provide help arriving at the actual diagnosis and treatment." The technology was initially developed with clinical data available from the National Institutes of Health's study on lung cancer and will be expanded by leveraging early clinical study partners.
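The contrast between size-based triage and a model-derived risk score can be illustrated with a toy sketch. Everything here is an assumption for illustration: the thresholds are invented, are not the actual Fleischner guideline cutoffs, and are certainly not clinical guidance.

```python
# Illustrative sketch only: a simplified size-only rule (in the spirit
# of size-based guidelines) versus triage on a hypothetical AI risk
# score. All thresholds are invented for illustration.

def size_based_action(nodule_mm: float) -> str:
    """Toy size-only triage rule (invented thresholds)."""
    if nodule_mm < 6:
        return "no routine follow-up"
    if nodule_mm < 8:
        return "follow-up CT in 6-12 months"
    return "consider biopsy"

def risk_based_action(risk_score: float) -> str:
    """Toy triage on a hypothetical AI risk score in [0, 1]."""
    if risk_score < 0.2:
        return "delay biopsy; surveillance"
    if risk_score < 0.7:
        return "short-interval follow-up"
    return "biopsy immediately necessary"

# The interesting case: a small nodule that a size rule would dismiss
# but to which a model assigns high risk, changing the suggested action.
print(size_based_action(5.0))
print(risk_based_action(0.85))
```

The point of the sketch is the shape of the decision, not the numbers: the AI score can pull forward a biopsy that a size rule alone would defer.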
Digital Reasoning, a company with a history in support of governmental intelligence, has turned its resources in the direction of health care. Hal Andrews, JD, president of the health care sector of Digital Reasoning, believes in the power of AI tools to eliminate waste and improve efficiency in health care. Digital Reasoning has worked with sentiment analysis, an AI tool used to analyze a key metric in health care: patient satisfaction feedback.
Moving into the clinical domain, Digital Reasoning is on its fourth release of Synthesys, a tool that is able to machine-evaluate hundreds of thousands of pathology reports while simultaneously associating them with data from a health care entity's data warehouse. With algorithms, the system routes pathology reports that show or suggest cancer to clinicians as well as the cancer registrars in an organization's cancer registry, while simultaneously screening out the reports with normal results. In addition, the tool can auto-populate portions of the required abstracts that are usually filled out by cancer registrars. Andrews says that a 30-hour work process can be reduced to 45 minutes.
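The routing idea can be sketched in a few lines. This is a deliberately naive keyword screen, not Digital Reasoning's method; the keyword list, report strings, and routing labels are all assumptions, and a production system like Synthesys would use trained language models rather than substring matching.

```python
# Minimal sketch of report routing: flag pathology reports whose text
# suggests malignancy and screen out the rest. Keyword matching is used
# only to make the flow concrete; real systems must also handle
# negation ("no evidence of malignancy") and context.

CANCER_TERMS = ("carcinoma", "malignant", "lymphoma", "sarcoma")

def route_report(report_text: str) -> str:
    text = report_text.lower()
    if any(term in text for term in CANCER_TERMS):
        return "route to clinician and cancer registry"
    return "screen out (normal result)"

print(route_report("Findings consistent with invasive ductal carcinoma."))
print(route_report("Benign tissue; unremarkable findings."))
```

In the described workflow, the positive branch would also pre-fill portions of the registry abstract, which is where the reported time savings come from.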
According to Andrews, future endeavors include investigation into the "pharma space" to improve the time it takes to monitor drugs for adverse events. With the ingestion of adverse event data, the potential to do this new drug monitoring in multiple countries and multiple languages exists with the use of AI tools. This will speed drugs to market while maintaining safety standards. Paralleling Jay's comments, Andrews states that his company is looking to find "leading providers of AI in each sector and to work with those AI leaders to solve problems for customers."
In "Radiology Should Embrace—Not Fear—the Future," her presentation at RSNA 2016, Hedvig Hricak, MD, PhD, asserted that "fear of new technologies like machine learning are misguided." Hricak, chair of the department of radiology at Memorial Sloan Kettering Cancer Center in New York City, discussed the important role radiology has played in the treatment of cancer. Hricak believes radiologists should embrace AI tools as radiology moves into the area of "understanding tumor ecology."
Hricak discussed radiomics, the extraction of quantitative data from medical images, as "a key area of development in cancer care that can serve as a predictive biomarker and a tool for the assessment of chemoresistance. Radiomics research on a high-grade type of ovarian cancer has shown that the further the metastases are from a tumor, the greater the heterogeneity. This heterogeneity is predictive of a worse outcome." Rather than seeing a reduced role for radiology professionals, she spoke of radiologists having their own clinics and holding admitting privileges.
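One flavor of the quantitative features radiomics extracts can be illustrated with a toy heterogeneity measure. The arrays below are synthetic and the choice of Shannon entropy is an assumption for illustration; real radiomics pipelines compute many texture features from segmented lesions on CT or MR, of which histogram entropy is only one simple example.

```python
# Toy illustration of quantifying intensity heterogeneity of an image
# region with Shannon entropy over a discretized intensity histogram.
import math
from collections import Counter

def intensity_entropy(pixels):
    """Shannon entropy (bits) of a list of discretized intensities."""
    counts = Counter(pixels)
    n = len(pixels)
    # Sum of -p*log2(p) over the histogram bins that occur.
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

homogeneous = [5, 5, 5, 5, 5, 5, 5, 5]    # uniform region: 0.0 bits
heterogeneous = [1, 9, 3, 7, 2, 8, 4, 6]  # 8 distinct values: 3.0 bits

print(intensity_entropy(homogeneous))
print(intensity_entropy(heterogeneous))
```

A higher value means a more heterogeneous region, which is the kind of quantity that, per the research Hricak cites, can correlate with outcome.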
The Cleveland Clinic is also embracing cognitive computing. In 2011, after Watson's successful debut on Jeopardy!, IBM and the Cleveland Clinic joined forces to train the technology to "think" like a doctor. Work now active at the Cleveland Clinic includes the use of Agfa's lung nodule analysis tool. A second active effort may be more important: IBM and Agfa are studying the radiology function, specifically, the ways radiologists practice their craft today and the workflow changes that will need to be made if radiologists are to interact with AI tools to treat patients.
Jay says future steps anticipated for the use of AI at the Cleveland Clinic include the following:
• development of treatment guidelines to guide physicians in the hospital setting and other settings;
• development of referential image data for research and genetic information;
• eventual development of a link to genomics with the ability to consider genetic information;
• integration with radiomics to go even deeper into analysis of the images;
• further work in the analysis of the pixels in the images themselves; and
• engagement of molecular imaging.
The future work landscape of hybrid intelligence is one of teamwork, according to "Hybrid Workplaces of the Future." The article states that "The central idea of research on hybrid intelligence is that instead of striving to design AI systems that function alone, our focus should be on hybrid systems that benefit from human input." In other words, the real value of AI systems may come from effective partnering with humans.
— Sandra Nunn, MA, RHIA, CHP, is a freelance writer and principal of KAMC Consulting in Albuquerque, New Mexico.
Dargan R. Evolution of machine learning will strengthen radiology. RSNA 2016 website. https://rsna2016.rsna.org/dailybulletin/index.cfm?pg=16mon02. Published November 28, 2016.
Dargan R. Radiology should embrace — not fear — the future. RSNA 2016 website. https://rsna2016.rsna.org/dailybulletin/pdf/Wed_DailyBulletin.pdf. Published November 30, 2016.
Kamar E. Hybrid workplaces of the future. XRDS. 2016;23(2):22-25.