The tumor, node, metastasis (TNM) staging system enables clinicians to describe the spread of head-and-neck squamous cell carcinoma (HNSCC) in a standardized manner, assisting with the assessment of disease status, prognosis, and management. This study aims to predict TNM staging for HNSCC via hybrid machine learning systems (HMLSs) and radiomics features. In our study, 408 patients were included from The Cancer Imaging Archive (TCIA) database in a multi-center setting. PET images were registered to CT, enhanced, and cropped. We created 9 datasets: CT-only, PET-only, and 7 PET-CT fusion sets. Next, 215 radiomics features were extracted from physician-segmented HNSCC tumors via our standardized SERA radiomics package. We employed multiple HMLSs, including 16 feature extraction algorithms (FEAs) and 9 feature selection algorithms (FSAs) linked with 8 classifiers optimized by a grid-search approach, with model training, fine-tuning, and selection (5-fold cross-validation; 319 patients), followed by external testing of the selected model (89 patients). Datasets were normalized by the z-score technique, and accuracy was reported to compare models. We first applied datasets with all features to classifiers only; an accuracy of 0.69 ± 0.06 was achieved via PET applied to the Random Forest classifier (RFC), and the external-testing performance (~0.62) confirmed this finding. Subsequently, we employed FSAs/FEAs prior to the application of classifiers. We achieved accuracies of 0.70 ± 0.03 for Curvelet transform (fusion) + Correlation-based Feature Selection (FSA) + K-Nearest Neighbor (classifier), and 0.70 ± 0.05 for PET + LASSO (FSA) + RFC (classifier). External-testing accuracies (0.65 and 0.64) also confirmed these findings. Other HMLSs applied to some fused datasets resulted in similar performances. We demonstrate that classifiers or HMLSs linked with PET only and PET-CT fusion techniques yielded relatively limited improvements in accuracy when predicting TNM stage. Meanwhile, the combination of PET and RFC enabled good prediction of TNM stage in HNSCC.
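A minimal sketch of one HMLS branch of the kind described above: z-score normalization, LASSO-based feature selection, and a Random Forest classifier tuned by grid search with 5-fold cross-validation, plus a held-out external test set. The feature matrix, labels, and hyperparameter values are placeholders for illustration; this scikit-learn code is an assumption-laden sketch, not the authors' implementation.

```python
# Sketch of an HMLS branch: z-score normalization + LASSO feature selection + RFC,
# tuned by grid search with 5-fold CV, then scored on an external test split.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Lasso
from sklearn.feature_selection import SelectFromModel
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X = np.random.rand(408, 215)           # placeholder for SERA radiomics features
y = np.random.randint(0, 4, size=408)  # placeholder for TNM stage labels

# Hold out an external test set (~89 patients, mirroring the study design).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=89, stratify=y, random_state=0)

pipeline = Pipeline([
    ("zscore", StandardScaler()),  # z-score normalization
    ("fsa", SelectFromModel(       # LASSO-based feature selection (keep top 50)
        Lasso(alpha=0.01, max_iter=10000),
        threshold=-np.inf, max_features=50)),
    ("clf", RandomForestClassifier(random_state=0)),  # RFC
])

# Grid search over a few illustrative classifier hyperparameters with 5-fold CV.
param_grid = {"clf__n_estimators": [100, 300], "clf__max_depth": [None, 10]}
search = GridSearchCV(pipeline, param_grid, cv=5, scoring="accuracy")
search.fit(X_train, y_train)

print("CV accuracy:", search.best_score_)
print("External-test accuracy:", search.score(X_test, y_test))
```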
Multi-level multi-modality fusion radiomics is a promising technique with the potential to improve cancer prognostication. We aim to use advanced fusion techniques on PET and CT images coupled with deep learning (DL) to improve outcome prediction in head and neck squamous cell carcinoma (HNSCC). In our study, 408 HNSCC patients were included from The Cancer Imaging Archive (TCIA) in a multi-center setting. Prognostic outcomes (binary classification) included overall survival (OS), distant metastasis (DM), locoregional recurrence (LR), and progression-free survival (PFS). We utilized a DL algorithm with a 17-layer 3D convolutional neural network (CNN) architecture. Prior to training, each image underwent min-max normalization and image augmentation using random rotations (0-20°) to improve the performance and generalizability of our model, followed by 5-fold cross-validation. We employed 12 datasets, including CT, PET, and 10 image-level fused datasets. The best OS performance was achieved via the discrete wavelet transform (DWT), resulting in a mean accuracy of 0.93 ± 0.06. The best DM score was achieved via the ratio of low-pass pyramid (RLPP), resulting in an accuracy of 0.95 ± 0.02. Optimal LR and PFS scores were achieved using DWT and RLPP for LR, and the Laplacian pyramid for PFS, resulting in accuracies of 0.90-0.92. Comparatively, when using a machine learning framework instead of deep learning, we obtained scores of 0.83, 0.90, and 0.87 for the prediction of OS, DM, and LR. Our study demonstrates that our multi-modality fusion techniques outperformed standalone PET or CT in the prognostication of HNSCC patients, and that a high level of accuracy can be achieved when combining multi-modality fusion techniques with DL.
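The following is a minimal sketch of the preprocessing and a small 3D CNN for one binary outcome (e.g., OS) on a fused PET-CT volume: min-max normalization and random-rotation augmentation (0-20°) feeding a convolutional classifier with a sigmoid head. The layer count, input size, and hyperparameters are assumptions for illustration and do not reproduce the authors' 17-layer architecture.

```python
# Sketch: min-max normalization, random-rotation augmentation, and a compact
# 3D CNN for binary outcome prediction on a (placeholder) fused PET-CT volume.
import numpy as np
from scipy.ndimage import rotate
from tensorflow.keras import layers, models

def min_max_normalize(volume):
    """Scale a 3D volume to [0, 1]."""
    vmin, vmax = volume.min(), volume.max()
    return (volume - vmin) / (vmax - vmin + 1e-8)

def augment(volume, max_angle=20):
    """Rotate the volume in-plane by a random angle in [0, max_angle] degrees."""
    angle = np.random.uniform(0, max_angle)
    return rotate(volume, angle, axes=(0, 1), reshape=False, order=1)

def build_cnn(input_shape=(64, 64, 64, 1)):
    """Small 3D CNN with a sigmoid head for binary classification."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv3D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling3D(2),
        layers.Conv3D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling3D(2),
        layers.Conv3D(64, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling3D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Example usage with a placeholder fused volume and a single binary label.
volume = min_max_normalize(np.random.rand(64, 64, 64))
volume = augment(volume)[..., np.newaxis]  # add channel axis
model = build_cnn()
model.fit(volume[np.newaxis], np.array([1]), epochs=1, verbose=0)
```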
Accurate prognostic stratification of head-and-neck squamous cell carcinoma (HNSCC) patients can be an important clinical reference when designing therapeutic strategies. We set out to predict 4 outcomes: overall survival (OS), distant metastasis (DM), locoregional recurrence (LR), and progression-free survival (SP). We studied hybrid machine learning systems (HMLSs) applied to datasets of radiomics features. In this multi-center study, 408 HNSCC patients were extracted from The Cancer Imaging Archive (TCIA) database. PET images were registered to CT, enhanced, and cropped. 215 radiomics features were extracted from each region of interest via our standardized SERA radiomics package. We employed multiple HMLSs: 12 feature extraction algorithms (FEAs) or 9 feature selection algorithms (FSAs) linked with 9 survival prediction algorithms (SPAs) optimized by 5-fold cross-validation, applied to PET-only, CT-only, and 4 PET-CT datasets generated by image-level fusion strategies. Datasets were normalized by the z-score technique, and c-indices were reported to compare the models. For OS prediction, the highest c-index of 0.73 ± 0.10 was obtained for the HMLS with the ratio of low-pass pyramid (RP) fusion technique + Gaussian process latent variable model (GPLVM) + causal structure learning-based feature modification method (CSFM). For DM prediction, we achieved 0.80 ± 0.06 via dual-tree complex wavelet transform (DTCWT) fusion + Laplacian score (LAP) + logistic regression hazards (LH). For LR prediction, we arrived at a c-index of 0.73 ± 0.13 using PET + Sammon mapping algorithm (SM) + a deep neural network for the distribution of first hitting times (DHS). For SP prediction, a performance of 0.68 ± 0.02 was obtained via PET + SM + a time-dependent relative risk model (CoxTime). When no dimensionality reduction (FEA/FSA) was employed, the above 4 performances decreased to 0.69 ± 0.10, 0.74 ± 0.13, 0.66 ± 0.15, and 0.68 ± 0.04 for OS, DM, LR, and SP prediction, respectively. We demonstrated that using fusion techniques followed by appropriate HMLSs, including FEAs/FSAs and SPAs, improved prediction performance.
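As a minimal sketch of one survival-prediction branch evaluated by c-index, the code below feeds z-score-normalized radiomics features to a penalized Cox proportional hazards model under 5-fold cross-validation. The Cox model is a stand-in for the SPAs named above, and the feature matrix, follow-up times, and column names are placeholders; this is not the authors' exact pipeline.

```python
# Sketch: z-score normalization + Cox proportional hazards (stand-in SPA),
# evaluated by the concordance index with 5-fold cross-validation.
import numpy as np
import pandas as pd
from sklearn.model_selection import KFold
from sklearn.preprocessing import StandardScaler
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

# Placeholder radiomics features plus follow-up time and event indicator.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(408, 20)),
                  columns=[f"f{i}" for i in range(20)])
df["time"] = rng.exponential(scale=36, size=408)  # months to event/censoring
df["event"] = rng.integers(0, 2, size=408)        # 1 = event observed

feature_cols = [c for c in df.columns if c.startswith("f")]
scores = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(df):
    train, test = df.iloc[train_idx].copy(), df.iloc[test_idx].copy()

    # z-score normalization fitted on the training fold only
    scaler = StandardScaler().fit(train[feature_cols])
    train[feature_cols] = scaler.transform(train[feature_cols])
    test[feature_cols] = scaler.transform(test[feature_cols])

    cph = CoxPHFitter(penalizer=0.1)
    cph.fit(train, duration_col="time", event_col="event")

    # Higher partial hazard means higher risk, so negate it for the c-index.
    risk = cph.predict_partial_hazard(test[feature_cols])
    scores.append(concordance_index(test["time"], -risk, test["event"]))

print(f"c-index: {np.mean(scores):.2f} ± {np.std(scores):.2f}")
```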