Three vibrantly colored views of a brain MRI showing a large tumor
Image: burhan oral gudu/Getty Images

At a glance:

  • Correctly distinguishing between look-alike tumors found in the brain during surgery can guide critical decisions in real time while the patient is still in the operating room.

  • A new AI tool outperformed humans and other models in distinguishing glioblastoma from another type of cancer that appears similar under a microscope.

  • The new AI tool has a built-in uncertainty feature that flags tumors the model has not encountered before and marks them for human review.

Work described in this story was made possible in part by federal funding supported by taxpayers. At Harvard Medical School, the future of efforts like this — done in service to humanity — now hangs in the balance due to the government’s decision to terminate large numbers of federally funded grants and contracts across Harvard University.

A Harvard Medical School–led research team has developed an AI tool that can reliably tell apart two look-alike cancers found in the brain but with different origins, behaviors, and treatments.

The tool, called PICTURE (Pathology Image Characterization Tool with Uncertainty-aware Rapid Evaluations), distinguished with near-perfect accuracy between glioblastoma — the most common and aggressive brain tumor — and primary central nervous system lymphoma (PCNSL), a rarer cancer often mistaken for glioblastoma. While both can appear in the brain, glioblastoma arises from brain cells, whereas PCNSL develops from immune cells. Their similarities under the microscope often lead to misdiagnosis, with serious consequences for treatment.


The work, supported in part by the National Institutes of Health, is described Sept. 29 in Nature Communications. The AI model is publicly available for other scientists to use and build upon, the team said.

Correctly identifying look-alike tumors in the brain during surgery is one of the toughest diagnostic challenges in neuro-oncology, the researchers said. An accurate diagnosis while the patient is still in the operating room can help expedite critical treatment choices, such as whether to operate and remove the cancerous tissue — as should be done with glioblastoma — or leave it behind and opt for radiation and chemotherapy instead, the preferred therapy for PCNSL. Inaccurate or delayed diagnosis of cancers in the brain can lead to unnecessary surgery and delays in proper treatment.

What makes the tool especially valuable is its ability to be deployed during surgery, providing critical insights in real time to surgeons and pathologists.

“Our model can minimize errors in diagnosis by distinguishing between tumors with overlapping features and help clinicians determine the best course of treatment based on a tumor’s true identity,” said study senior author Kun-Hsing Yu, associate professor of biomedical informatics in the Blavatnik Institute at HMS and HMS assistant professor of pathology at Brigham and Women’s Hospital.

During brain tumor surgery, surgeons typically remove tumor tissue for rapid evaluation under a microscope. The evaluation is done by freezing the sample in liquid nitrogen, which can distort the cellular features somewhat but provides a quick, real-time assessment. The process takes 15 minutes or so. Based on the results of this first-glance evaluation, surgeons determine whether to remove the tumor or leave it behind and opt for radiation and chemotherapy. Then, over the next few days, pathologists conduct a more detailed and more reliable evaluation of the tumor sample. In about 1 in 20 cases, the initial diagnosis of a tumor changes on second read, Yu said. This is precisely where the new AI system could play a valuable role — reducing uncertainty and the risk of error during the operation, when critical decisions are made.
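The triage logic described here — classify when the model is confident, defer to a pathologist when it is not — can be illustrated with a minimal sketch. This is a hypothetical example under simple assumptions (a two-class probability output and an entropy threshold), not the authors' PICTURE implementation; the function names and threshold value are invented for illustration.

```python
# Hypothetical sketch of uncertainty-aware triage; NOT the PICTURE model.
import math

def predictive_entropy(probs):
    """Shannon entropy of a probability vector; higher means more uncertain."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def triage(probs, classes=("glioblastoma", "PCNSL"), entropy_threshold=0.4):
    """Return a class label if the model is confident, else flag for review.

    probs: class probabilities from some upstream classifier (assumed input).
    entropy_threshold: an illustrative cutoff, not a published value.
    """
    if predictive_entropy(probs) > entropy_threshold:
        return "flag for human review"
    # Pick the most probable class when uncertainty is acceptably low.
    best = max(range(len(probs)), key=lambda i: probs[i])
    return classes[best]

# A confident prediction is labeled; an ambiguous one is deferred.
print(triage([0.98, 0.02]))  # confident -> "glioblastoma"
print(triage([0.55, 0.45]))  # ambiguous -> "flag for human review"
```

The key design point mirrored from the article is that unfamiliar or ambiguous cases are routed to a human rather than force-labeled, which is what makes such a tool safer to deploy during surgery.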

Authorship, funding, disclosures

Additional authors include Junhan Zhao, Shih-Yen Lin, Raphaël Attias, Liza Mathews, Christian Engel, Guillaume Larghero, Dmytro Vremenko, Ting-Wan Kao, Tsung-Hua Lee, Yu-Hsuan Wang, Cheng Che Tsai, Eliana Marostica, Ying-Chun Lo, David Meredith, Keith L. Ligon, Omar Arnaout, Thomas Roetzer-Pejrimovsky, Shih-Chieh Lin, Natalie NC Shih, Nipon Chaisuriya, David J. Cook, Jung-Hsien Chiang, Chia-Jen Liu, Adelheid Woehrer, Jeffrey A. Golden, and MacLean P. Nasrallah.

The work was supported in part by the National Institute of General Medical Sciences at the National Institutes of Health (grant R35GM142879), a Department of Defense Peer Reviewed Cancer Research Program Career Development Award (HT9425-23-1-0523), a Research Scholar Grant from the American Cancer Society (RSG-24-1253761-01-ESED), a Google Research Scholar Award, an HMS Dean’s Innovation Award, and a Blavatnik Center for Computational Biomedicine Award.

Yu is an inventor of U.S. Patent 10,832,406, which is assigned to Harvard University and is not directly related to this manuscript. Yu was a consultant for Curatio DL (not related to this work).