TY - CHAP
T1 - Machine and deep learning in hyperspectral fluorescence-guided brain tumor surgery
AU - Suero Molina, Eric
AU - Black, David
AU - Xie, Andrew
AU - Gill, Jaidev
AU - Di Ieva, Antonio
AU - Stummer, Walter
PY - 2024
Y1 - 2024
N2 - Malignant glioma resection is often the first-line treatment in neuro-oncology. During glioma surgery, discriminating the tumor's edges at the infiltration zone can be challenging, even when using surgical adjuncts such as fluorescence guidance (e.g., with 5-aminolevulinic acid). Challenging cases in which there is no visible fluorescence include lower-grade gliomas, tumor cells infiltrating beyond the margin visualized on pre- and/or intraoperative MRI, and even some high-grade tumors. One field of research aiming to address this problem involves inspecting in detail the light emission spectra of different tissues (e.g., tumor vs. normal brain vs. brain parenchyma infiltrated by tumor cells). Hyperspectral imaging measures the emission spectrum at every image pixel, thus combining spatial and spectral information. Assuming that different tissue types have different "spectral footprints," potentially related to higher or lower abundances of fluorescent dyes or auto-fluorescing molecules, the tissue can then be segmented according to type, providing surgeons with a detailed spatial map of what they see. However, processing raw hyperspectral data cubes into maps or overlays of tissue labels, and potentially further molecular information, is complex. This chapter explores some of the classical methods for the various steps of this process and examines how they can be improved with machine learning approaches. While preliminary work on machine learning in hyperspectral imaging has had relatively limited success in brain tumor surgery, more recent research combines hyperspectral imaging with fluorescence to obtain promising results. In particular, this chapter describes a pipeline that isolates biopsies in ex vivo hyperspectral fluorescence images for efficient labeling, extracts all the relevant emission spectra, preprocesses them to correct for various optical properties, and determines the abundance of fluorophores in each pixel, which corresponds directly to the presence of cancerous tissue. Each step combines classical and deep learning-based methods. The fluorophore abundances are then used in four machine learning models to classify tumor type, WHO grade, margin tissue type, and isocitrate dehydrogenase (IDH) mutation status in brain tumors. The classifiers achieved average test accuracies of 87%, 96.1%, 86%, and 93%, respectively, greatly outperforming prior work both with and without fluorescence. The field is young, but these early results show great promise for data-driven hyperspectral imaging as a tool for intraoperative classification of brain tumors during fluorescence-guided surgery.
KW - Artificial intelligence
KW - Machine learning
KW - Hyperspectral imaging
KW - Fluorescence-guided surgery
UR - http://www.scopus.com/inward/record.url?scp=85208991615&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-64892-2_15
DO - 10.1007/978-3-031-64892-2_15
M3 - Chapter
C2 - 39523270
SN - 9783031648915
T3 - Advances in Experimental Medicine and Biology
SP - 245
EP - 264
BT - Computational Neurosurgery
A2 - Di Ieva, Antonio
A2 - Suero Molina, Eric
A2 - Liu, Sidong
A2 - Russo, Carlo
PB - Springer Nature
CY - Switzerland
ER -