Review

Application of Machine Learning for Differentiating Bone Malignancy on Imaging: A Systematic Review

1 Department of Diagnostic Imaging, National University Hospital, 5 Lower Kent Ridge Rd, Singapore 119074, Singapore
2 Department of Computer Science, School of Computing, National University of Singapore, 13 Computing Drive, Singapore 117417, Singapore
3 University Spine Centre, Department of Orthopaedic Surgery, National University Health System, 1E, Lower Kent Ridge Road, Singapore 119228, Singapore
4 Department of Radiation Oncology, National University Cancer Institute Singapore, National University Hospital, 5 Lower Kent Ridge Road, Singapore 119074, Singapore
5 Department of Diagnostic Radiology, Yong Loo Lin School of Medicine, National University of Singapore, 10 Medical Drive, Singapore 117597, Singapore
* Author to whom correspondence should be addressed.
Submission received: 9 February 2023 / Revised: 7 March 2023 / Accepted: 16 March 2023 / Published: 18 March 2023
(This article belongs to the Special Issue Artificial Intelligence and MRI Characterization of Tumors)

Simple Summary

Distinguishing benign from malignant bone lesions is often difficult on imaging. Many bone lesions are uncommon, and often only specialist radiologists have sufficient expertise to provide an accurate diagnosis. In addition, some benign bone tumours may exhibit potentially aggressive features that mimic malignant bone tumours, making the diagnosis even more difficult. The rapid development of artificial intelligence (AI) techniques has led to remarkable progress in image-recognition tasks, including the classification and characterization of various tumours. This study reviews the most recent evidence for AI techniques in differentiating bone lesions on various imaging modalities using a systematic approach. Potential applications of AI techniques for the clinical diagnosis and management of bone lesions are also discussed.

Abstract

An accurate diagnosis of bone tumours on imaging is crucial for appropriate and successful treatment. The advent of artificial intelligence (AI) and machine learning methods to characterize and assess bone tumours on various imaging modalities may assist in the diagnostic workflow. The purpose of this review article is to summarise the most recent evidence for AI techniques using imaging to differentiate benign from malignant lesions, to characterize various malignant bone lesions, and to outline their potential clinical applications. A systematic search through electronic databases (PubMed, MEDLINE, Web of Science, and clinicaltrials.gov) was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. A total of 34 articles reporting the use of AI techniques to distinguish between benign vs. malignant bone lesions were retrieved, and their key findings were compiled and summarised: 12 (35.3%) focused on radiographs, 12 (35.3%) on MRI, 5 (14.7%) on CT and 5 (14.7%) on PET/CT. The overall reported accuracy, sensitivity, and specificity of AI in distinguishing between benign vs. malignant bone lesions range from 0.44–0.99, 0.63–1.00, and 0.73–0.96, respectively, with AUCs of 0.73–0.96. In conclusion, the use of AI to discriminate bone lesions on imaging has achieved relatively good performance across various imaging modalities, with high sensitivity, specificity, and accuracy for distinguishing between benign vs. malignant lesions in several cohort studies. However, further research is necessary to test the clinical performance of these algorithms before they can be integrated into routine clinical practice.

1. Introduction

Differentiating between benign vs. malignant bone tumours is critical for clinical decision-making and treatment planning [1,2]. Routine imaging investigations for evaluating bone lesions include radiographs (X-ray), computed tomography (CT), positron-emission tomography combined with computed tomography (PET/CT), and magnetic resonance imaging (MRI) [3,4]. Conventional radiography remains a key initial imaging modality and is still an optimal technique for evaluating primary bone tumours [5,6]. It is relatively inexpensive and allows for a clear visual assessment of lesion location, margins, internal lesion matrix, and any associated periosteal reaction [7]. Along with the patient’s age, these radiographic details are often sufficient to provide a reasonable list of differential diagnoses [8]. However, radiographs are often limited by superimpositions, incomplete visualizations of bone cortex destruction, and inadequate assessments of adjacent soft tissue involvement [9,10,11]. Furthermore, diagnosis of bone tumours using imaging can be complicated by other factors such as the presence of pathological fractures, which may result in benign bone lesions having potentially aggressive features that mimic malignant bone tumours [12,13].
Multi-detector CT can be a useful adjunct to radiographs as it allows precise delineation of complex anatomical locations including articulations [3] and irregular bones, such as the sacrum or vertebral bodies [14]. These regions are often difficult to visualize in 2D (2-Dimensional) planes and CT can provide additional 3D (3-Dimensional) reconstructions [15], which are useful for surgical planning. The evaluation of minor bone changes, such as tumour mineralization, bone cortex changes, and periosteal reaction, are also better depicted on CT scans [16]. In addition, CT provides simultaneous evaluation for both bone and soft tissue lesions in cases of suspected malignancy (systemic staging), which reduces the burden of imaging for patients [17,18]. However, CT is deficient in evaluating the soft tissue extent of bone lesions, as well as the degree of medullary involvement [19,20].
MRI is a highly sensitive modality for the delineation of bony abnormalities and provides the ability to assess bone marrow involvement, the extent of soft tissue invasion, and the internal content of the bone lesions [21]. Occasionally, the excellent tissue or lesion characterization provided by MRI can yield sufficient information to clinch an accurate diagnosis, even without histological correlation [20,22]. With the advent of functional MRI sequences, which include perfusion or dynamic contrast-enhanced MR imaging, Diffusion Weighted Imaging (DWI), and MRI Spectroscopy, there is even more potential for the accurate differentiation between benign vs. malignant bone lesions [23,24,25,26,27].
For radiographically indeterminate bone lesions, 18F-Fluorodeoxyglucose Positron Emission Tomography combined with Computed Tomography (18F-FDG PET/CT) has been reported to provide an improved differentiation between benign vs. malignant bone lesions in comparison to CT or MRI alone [2,28,29]. FDG avidity of the bone lesion helps to predict aggressiveness, with a malignant tumour showing increased avidity relative to a benign bone lesion of the same histological subtype [30,31]. However, FDG uptake within a bone lesion does not usually help to determine the morphologic features and specific subtype of the bone tumour. Moreover, FDG uptake in bone lesions can lead to false positives for malignancy: superimposed trauma or fractures, background bony hyperplasia, and underlying metabolic bone disease have also been reported to increase the FDG uptake within bone lesions, thus confounding the assessment [32,33,34].
Even when various imaging modalities are combined, radiologists are often neither accurate nor specific in classifying bone lesions [35]. Considering the potential limitations of current advanced imaging and the wide spectrum of bone tumours encountered in clinical practice, there is a clear role for emerging technologies to aid clinicians in the detection of bone tumours and the differentiation of benign from malignant lesions.
Emerging Artificial Intelligence (AI) tools continue to demonstrate remarkable progress in medical imaging applications, especially in the field of oncology [36]. These applications include cancer screening and diagnosis [37,38,39,40], diagnosis and classification [41,42,43,44], predicting prognosis and treatment response [45,46,47,48,49], automated segmentation [50,51,52,53,54], and radiology-pathology correlation (radiogenomics) [55,56,57,58]. In particular, within the field of diagnosis and classification, the ability of AI models to classify benign vs. malignant tumours has been shown to achieve high accuracy, sensitivity, and specificity in various organs, such as in the case of breast [59,60,61], prostate [62,63], lung [38,64,65,66], and brain lesions [67,68]. This review article aims to provide an overview of the current evidence on the effectiveness of machine learning in differentiating bone lesions on various imaging modalities.

2. Materials and Methods

2.1. The Literature Search Strategy

A systematic search of the major electronic databases (PubMed, MEDLINE, Web of Science, and clinicaltrials.gov, accessed on 31 December 2022) was conducted in concordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [69,70] using keywords, Medical Subject Headings (MeSH), or both, for the following key terms: (“bone” OR “vertebral” OR “spinal”) AND (“tumour” OR “lesion” OR “malignancy”) AND (“radiomics” OR “machine learning” OR “artificial intelligence” OR “deep learning”). Two authors (W.O. and Y.L.T.) performed an independent review of the collected references and selected the appropriate studies for detailed full-text screening.
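For readers who wish to reproduce or update the PubMed portion of this search, the boolean string above can also be submitted programmatically. The following is a minimal, illustrative sketch using Biopython's Entrez module; it is not part of the review methodology, and the e-mail address and retrieval limit are placeholders.

```python
from Bio import Entrez  # Biopython

# NCBI asks users to identify themselves; placeholder address.
Entrez.email = "your.name@example.org"

# Boolean search string mirroring the strategy described above.
query = (
    '("bone" OR "vertebral" OR "spinal") AND '
    '("tumour" OR "lesion" OR "malignancy") AND '
    '("radiomics" OR "machine learning" OR "artificial intelligence" OR "deep learning")'
)

# Query PubMed and retrieve matching PubMed IDs (up to 500 here).
handle = Entrez.esearch(db="pubmed", term=query, retmax=500)
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records found")
print(record["IdList"][:10])  # first ten PMIDs
```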

2.2. The Study Screening and Selection Criteria

No limitations were specified for the reference and literature search. The main inclusion criteria were scientific studies harnessing radiomic techniques, artificial intelligence (AI), or deep learning to distinguish bony lesions. Inclusion criteria also involved the following: (a) imaging analysis involving radiographs (X-rays), nuclear medicine (radioisotope) imaging, Computed Tomography (CT, all types, including Positron Emission Tomography combined with Computed Tomography (PET/CT)), and Magnetic Resonance Imaging (MRI) scans; (b) studies that addressed the ability to differentiate between benign vs. malignant bone (primary malignant or metastatic) lesions; (c) studies involving human subjects only; and (d) publications primarily in the English language. Articles excluded from further analysis included case reports, editorial correspondence (for example, opinion pieces, letters, and commentaries), and review articles. Duplicate publications, publications focusing on non-imaging (for example, genomic) radiomic AI techniques, and articles on applications of AI technology not related to distinguishing bony lesions (for example, segmentation and detection, among others) were also excluded from this analysis.

2.3. Data Extraction and Reporting

All selected research articles were retrieved and compiled into a spreadsheet using Microsoft Excel (Microsoft Corporation, Redmond, WA, USA). Information gathered from the individual research articles included the following (a minimal tabulation sketch is shown after the list):
  • Research article details: Complete authorship, date of journal or publication, Journal name;
  • Main clinical use: Differentiating benign vs. malignant bone lesions, characterization and classification of various bone tumours;
  • Patient population: Patients with known bone lesions, who have undergone various imaging investigations (X-ray, CT, PET/CT or MRI) and have subsequently undergone histopathological confirmation;
  • Research study details: The type of study, patient or imaging modality sample sizes (for example, internal or external data sets), imaging modalities used (CT, MRI, bone scans or PET/CT), treatment or management information and outcome/prognostic measures;
  • Machine Learning techniques used: Radiomics and convolutional neural networks, among others.
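As a minimal sketch of how such an extraction record could be tabulated, the snippet below builds a one-row table with pandas; the field names and values are illustrative stand-ins, not the authors' actual spreadsheet schema.

```python
import pandas as pd

# One illustrative row per included study; all values below are placeholders.
extraction_records = pd.DataFrame(
    [
        {
            "authors": "Example et al.",
            "year": 2022,
            "journal": "Example Journal",
            "clinical_use": "benign vs. malignant differentiation",
            "modality": "MRI",
            "sample_size": 120,
            "external_validation": False,
            "ml_technique": "radiomics + random forest",
            "reported_auc": 0.90,
        }
    ]
)

print(extraction_records.dtypes)
```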

3. Results

3.1. Search Results

The primary search through the relevant electronic medical databases (please see Figure 1) identified a total of 54 relevant research articles, which were initially screened using the previously detailed criteria. This initial screening led to the exclusion of eight publications, and the remaining 46 articles then underwent a detailed full-text analysis to determine their inclusion or exclusion. Following the detailed full-text analysis, a further 25 publications were excluded, as they either focused on cancer or lesion sites other than bone or addressed other artificial intelligence (AI) applications for bone lesions not related to differentiating or characterizing them (for example, detection, segmentation or prognosis). An additional 13 articles were included after the study team manually reviewed the bibliographies of the selected research manuscripts. Overall, this screening yielded a total of 34 publications (please see Table 1) for detailed analysis. Key findings from each article were then compiled, categorized and summarized for this review. Most studies lacked the detailed data required to create 2 × 2 contingency tables, hence a formal meta-analysis could not be performed.
Our search found that 12/34 (35.3%) studies were X-ray-based, 5/34 (14.7%) were CT (Computed Tomography)-based, 5/34 (14.7%) were related to nuclear medicine studies (for example, Positron Emission Tomography with CT (PET-CT)), and 12/34 (35.3%) were MRI (Magnetic Resonance Imaging)-based. All studies were retrospective in nature, using radiological images fed into various AI systems, with the majority of patients having either a confirmed pathological diagnosis or an agreed consensus clinical diagnosis.
The overall reported accuracy, sensitivity, and specificity of AI in distinguishing between benign vs. malignant bone lesions (Table 2) range from 0.44–0.99, 0.63–1.00, and 0.73–0.96, respectively. The studies on X-rays reported an accuracy of 0.44–0.99, a sensitivity of 0.75–1.00, and a specificity of 0.78–0.91, with an AUC of 0.79–0.95. Of note, a high sensitivity of 1.00 was achieved in a study by Consalvo et al. [97] using radiographs to differentiate between acute osteomyelitis vs. malignant Ewing sarcoma. This high sensitivity may be related to data imbalance due to the small sample size and the use of relatively “normal” radiographs as a healthy control group. The CT studies reported an accuracy of 0.74–0.92, a sensitivity of 0.80, and a specificity of 0.96, with an AUC of 0.78–0.96. PET/CT studies reported an accuracy of 0.74–0.88, a sensitivity of 0.84–0.90, and a specificity of 0.74–0.85, with an AUC of 0.76–0.95. Finally, the included MRI studies reported an overall accuracy of 0.74–0.98, a sensitivity of 0.78–0.88, and a specificity of 0.61–0.79, with an AUC of 0.73–0.94.
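For reference, the accuracy, sensitivity, and specificity figures reported above are all derived from 2 × 2 contingency table counts of the kind most included studies did not report in full; the sketch below shows the standard definitions, with purely illustrative counts.

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard metrics from a 2 x 2 contingency table (malignant = positive class)."""
    return {
        "sensitivity": tp / (tp + fn),               # true positive rate
        "specificity": tn / (tn + fp),               # true negative rate
        "accuracy": (tp + tn) / (tp + fp + fn + tn),  # overall agreement
    }

# Illustrative counts only.
print(diagnostic_metrics(tp=45, fp=5, fn=8, tn=42))
```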
Of note, studies with a two-label classification (benign vs. malignant) achieved relatively higher performance when compared to studies with three or more label classifications. For example, Pan D. et al. [84] showed that their AI model for binary classification (non-malignant vs. malignant) on radiographs achieved a higher AUC of 0.97 and an accuracy of 0.95 when compared to three-class classification (benign vs. intermediate vs. malignant), which yielded an AUC of 0.94 and an accuracy of 0.83. Similarly, the study performed by Chianca et al. [95] showed that their machine learning model achieved a higher accuracy when classifying vertebral lesions into a dichotomous (benign or malignant) rather than a trichotomous classification (benign, primary malignant or metastases), at 0.86 vs. 0.68, respectively.

3.2. Machine Learning Techniques

AI refers to the computational ability of a machine to execute tasks at a level comparable to that of humans. This is achieved by utilizing unique data inputs to generate outputs of high added value [104]. Recent advances in the field of medical imaging, along with the availability of large volumes of imaging and report data, have fueled worldwide interest in the use of AI techniques for medical imaging [105]. The initial rationale for developing and deploying AI tools was to assist clinicians (mainly radiologists) in the detection and characterization of lesions, with the benefits of increasing efficiency and detection accuracy and reducing diagnostic errors [106]. With recent advances in computing and AI, many other applications, including the characterization of lesions, automated segmentation, and decision support planning (for example, predicting phenotypes and outcomes), have been studied [107].
Machine learning represents a subset of AI in which models are trained to predict outcomes using datasets with a known ground truth, from which the machine “learns”. The developed AI model can then apply its new knowledge to predict outcomes in previously unseen datasets [108]. Radiomics is a newer division of machine learning which involves converting the information stored within medical images into measurable and quantifiable data [109]. The information obtained can then be used to aid radiologists and clinicians in the assessment of benign and malignant lesions by analyzing and utilizing details beyond those inferable by visual interpretation with the human eye [110,111]. When radiomics is combined with additional clinical data, qualitative imaging data, or both, it has the potential to guide and optimize clinical decision-making, including improved lesion detection, classification, prognostication, and enhanced treatment response assessments [112]. In general, the typical workflow for developing a radiomics model can be divided into the following steps (Figure 2): image acquisition, data selection (image input), segmentation, image feature extraction within the curated regions of interest (ROIs), exploratory analysis with feature selection, and modelling [113]. The models should then be validated using test sets (preferably both internal and external data) to evaluate their performance [114].
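As a generic illustration of the feature selection, modelling, and internal validation steps described above, the sketch below assembles a radiomics-style classifier in scikit-learn. The feature matrix X (one row per lesion, one column per extracted radiomic feature) and the binary labels y are random placeholders standing in for the output of an upstream feature-extraction step; this is not a reimplementation of any of the reviewed models.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))     # placeholder radiomic feature matrix (120 lesions x 300 features)
y = rng.integers(0, 2, size=120)    # placeholder labels: 0 = benign, 1 = malignant

# Typical radiomics modelling steps: scaling, feature selection, classification.
model = Pipeline(
    [
        ("scale", StandardScaler()),
        ("select", SelectKBest(f_classif, k=20)),
        ("clf", LogisticRegression(max_iter=1000)),
    ]
)

# Internal validation via stratified cross-validation, reporting AUC.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
print(f"Cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```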
The two most common machine learning methods (Figure 3) are radiomics-based imaging feature analysis and convolutional neural networks (CNNs). Feature-based radiomics techniques involve the extraction of various handcrafted features, which can then be included in a training set for AI-based imaging classification [115]. On the other hand, CNNs utilize deep learning to extract useful imaging features by learning their patterns, classifying data directly from input images [116] and transforming them into useful outputs [117,118]. This results in the ability to detect and process distinct diagnostic patterns and imaging features beyond those accessible to human readers, which can then be used for various applications including lesion detection, characterization, and prognostication [119].
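In contrast to handcrafted radiomics features, a CNN learns its imaging features directly from pixel data. Below is a minimal, illustrative PyTorch classifier for single-channel image patches; the architecture, patch size, and class labels are arbitrary choices for demonstration and are not drawn from any of the reviewed studies.

```python
import torch
import torch.nn as nn

class TinyLesionCNN(nn.Module):
    """Toy CNN for binary (benign vs. malignant) classification of 64 x 64 patches."""

    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                      # learned imaging features
        return self.classifier(torch.flatten(x, 1))

logits = TinyLesionCNN()(torch.randn(4, 1, 64, 64))  # batch of 4 dummy patches
print(logits.shape)  # torch.Size([4, 2])
```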

4. Discussion

4.1. Machine Learning on Conventional Radiographs

Correctly classifying bone tumours on conventional radiographs is vital for clinical decision-making and guiding subsequent management [120]. However, this is often difficult, especially in places where there is a shortage of subspecialty radiology expertise. Moreover, many bone lesions are uncommon entities, and often only a few specialist radiologists have sufficient experience to diagnose them accurately [10,79]. In clinical practice, most radiologists rely on image pattern recognition to distinguish between benign and malignant lesions on radiographs, which is subject to bias and can sometimes lead to an erroneous interpretation [121,122]. Some of these common radiological features include location, cortical destruction, periostitis, lesion orientation or alignment, and the zone of transition between the lesion and the surrounding bone [123,124,125,126]. However, some benign bone lesions may demonstrate one or more aggressive features which may confound the distinction [127,128].
Machine learning techniques can identify the radiographic features that are most important for distinguishing between benign vs. malignant bone lesions. Pan D. et al. [84] performed a study using random forest (RF) models and identified that the most important imaging features for distinguishing between benign vs. malignant bone lesions, in descending order of importance, were: margins, cortical bone involvement, the pattern of bone destruction, and an internal high-density matrix, with an accuracy of up to 94.7% and an area under the curve (AUC) of 0.97. A deep learning algorithm developed by He et al. [79] achieved high performance using a multi-institutional dataset comprising up to 2899 images with pathologically proven bone tumours (benign 52.5%, intermediate 21.9%, malignant 25.6%). The developed model demonstrated an AUC of up to 0.92 and an accuracy of 72.1% for trichotomous classification, as per the World Health Organization (WHO) classification of bone tumours [129]. The model outperformed two junior radiologists (accuracies 63.4%–67.9%, p < 0.05) and was similar in accuracy to two subspecialist radiologists (accuracies 69.3%–73.4%, p > 0.1). Similarly, a deep learning model created by Liu et al. [83] achieved diagnostic performance comparable to that of senior radiologists for benign, intermediate, and malignant bone lesions. The developed fusion model combined clinical and imaging features and achieved an AUC of 0.87 vs. 0.82 for the radiologists, a difference which did not reach statistical significance (p = 0.86).
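The kind of feature ranking described for the random forest model above can be illustrated generically with scikit-learn's impurity-based importances. The feature names and coded values below are hypothetical stand-ins for categorical radiographic findings, not the variables or data of the cited study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

feature_names = ["margin", "cortical_involvement", "destruction_pattern", "dense_matrix"]

rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(200, len(feature_names)))  # coded radiographic findings (placeholder)
y = rng.integers(0, 2, size=200)                        # 0 = benign, 1 = malignant (placeholder)

rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X, y)

# Rank features by impurity-based importance (descending).
for name, score in sorted(zip(feature_names, rf.feature_importances_),
                          key=lambda pair: pair[1], reverse=True):
    print(f"{name}: {score:.3f}")
```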
Malignant bone lesions are often difficult to differentiate from other aggressive disease processes, including inflammation and infection. Ewing's sarcoma, an aggressive malignant tumour occurring in children, is a typical example [130]. Differentiating this entity from acute osteomyelitis is often difficult, even for trained musculoskeletal radiologists, due to their similar clinical and radiological features [131,132,133]. Consalvo et al. [97] developed an artificial intelligence algorithm which was able to leverage radiographic features to distinguish between Ewing sarcoma and acute osteomyelitis, achieving an accuracy of up to 94.4% on their validation set and 90.6% on a held-out test set. Although this study is limited by a small sample size, requiring the use of cross-validation and loss weighting to achieve statistical significance, it demonstrates the potential feasibility of AI techniques for differentiating infective bony lesions from malignant bone lesions on routine radiographs.

4.2. Machine Learning on Computed Tomography (CT) Imaging

Incidentally detected dense (sclerotic) lesions are a common occurrence on CT examinations in clinical practice [134]. The ability to distinguish a benign sclerotic lesion, such as an enostosis (bone island), from a malignant lesion, such as an osteoblastic metastasis, is crucial as it affects the treatment strategy and patient prognosis [135]. For this task, radiomics-based random forest models created by Hong et al. [85] achieved an AUC of up to 0.96 and an accuracy of up to 86.0% in the test sets, which was comparable to two experienced radiologists (AUCs of 0.95–0.96) and even higher than a radiologist in training (AUC 0.88, p = 0.03). Of the 1218 radiomics features extracted, the authors showed that a model utilizing attenuation and shape-related features achieved the highest AUC, as had been postulated in several prior studies [136,137,138,139,140]. In another study, Sun et al. [74] developed a CT-derived nomogram using radiomics to characterize benign vs. malignant bone tumours. This utilized radiomics features from texture analysis, including a ground-glass matrix, peripheral lesion sclerosis, residual bony ridge, and the presence of a soft tissue mass [141,142], and demonstrated an AUC of up to 0.92. In addition, the team also showed that by including clinical features in the nomogram, the model achieved higher net clinical benefits for decision-making when compared to radiomics alone, with an NRI (net reclassification index) of 0.24 (95% CI 0.07–0.41) and an IDI (integrated discrimination index) of 0.16 (95% CI 0.11–0.22), albeit not reaching statistical significance.
Besides differentiating between benign vs. malignant bony lesions, radiomics models can also be used to differentiate between a variety of tumour matrix types with high performance. A deep convolutional neural network (CNN) created by Y. Li et al. [81] was able to further classify benign and malignant bone lesions into cartilaginous or osteogenic tumours using a multi-channel enhancement strategy for image processing to improve accuracy, achieving a top-1 error rate of only 0.25. In a similar study that focused on radiographs rather than CT, Reicher et al. [80] created a recurrent CNN model which was able to learn and classify the bone tumour matrix with a high accuracy of 93% compared to the average radiologist’s accuracy of 70%. This shows the potential use of artificial intelligence to further discriminate various subtypes of bone tumours (for example, chondroid or osteoid tumours) which is important for biopsy and treatment planning.

4.3. Machine Learning on Magnetic Resonance Imaging (MRI)

MRI plays a key role in aiding clinicians in discriminating between benign vs. primary malignant or metastatic bone tumours [143]. The conventional pulse sequences [20,144], diffusion-weighted imaging (DWI) [145,146] with matched apparent diffusion coefficient (ADC) maps [147,148], as well as dynamic contrast-enhanced sequences (DCE) [149,150] can predict potential malignancy with good reliability. However, some imaging features of benign vs. malignant bone lesions can overlap, which makes formulating a differential diagnosis challenging [151]. Machine learning techniques using radiomic features on MRI have been shown to have high performance for predicting benign vs. malignant bone lesions. Using pre-trained ResNet50 image classifiers, Georgeanu et al. [93] were able to predict the malignant potential of bone tumours in 93.7% of cases using T1-weighted sequences and 86.7% using T2-weighted sequences. A model developed by Chianca et al. [95] for classifying vertebral lesions into benign vs. malignant (primary malignant and metastatic lesions) demonstrated 94.0% accuracy in the internal test dataset and 86% accuracy in an external validation dataset. This showed no significant difference relative to an expert musculoskeletal radiologist with more than 5 years' experience, who achieved 94.0% in the internal test cohort (p = 0.99). However, there was no consistent MRI protocol or set of sequences among the different studies, and a wide variety of software was used for image segmentation and feature extraction. In addition, there was inconsistency in the number and type of features explored and the type of deep learning models used to classify the final features. Overall, these factors may have reduced the reproducibility of the results. Future multicenter validation studies could be performed with more standardized protocols to assess the generalizability of the deep learning applications and facilitate their integration into routine clinical practice.
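Transfer learning of the kind mentioned above can be sketched generically with torchvision: load an ImageNet-pretrained ResNet50 and replace its final fully connected layer with a two-class head. Details such as input preprocessing, slice selection, and the fine-tuning schedule are omitted here and would differ from the published work; this is an illustration of the general approach only.

```python
import torch
import torch.nn as nn
from torchvision.models import ResNet50_Weights, resnet50

# Start from ImageNet weights (downloaded on first use) and swap the head for benign/malignant.
model = resnet50(weights=ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

# Optionally freeze the backbone and train only the new classification head.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith("fc")

logits = model(torch.randn(2, 3, 224, 224))  # two dummy RGB-converted MRI slices
print(logits.shape)  # torch.Size([2, 2])
```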
Specific to cartilaginous bone lesions, machine learning has been studied for the differentiation of various grades of chondrosarcoma. Conventional chondrosarcoma is usually divided into three categories based on pathology, where grade 1, also known as atypical cartilaginous tumours (ACTs), usually have an indolent biologic nature, whereas grades 2–3 (high-grade chondrosarcoma) are malignant bone tumours [152] with metastatic potential and a high recurrence rate following surgical resection [153]. Discrepancies in correct tumour grading are widespread even among experienced radiologists and pathologists, secondary to overlapping imaging and histological features, and it is for this reason that more accurate diagnostic aids are required [154,155]. Gitto et al. [91] used MRI-derived radiomics to characterize ACT vs. high-grade chondrosarcoma, based on conventional T1- and T2-weighted images, and achieved 85.7% and 75.0% accuracy (AUCs of 0.85 and 0.78) in the training and test groups, respectively, with no difference in diagnostic performance between the radiologist and machine learning classifier (p = 0.45). Further to that, Gitto et al. [90] utilized a similar radiomics-based MRI method to discriminate between ACT and Grade II chondrosarcoma, achieving 92% accuracy (AUC of 0.94) with no significant difference compared to an expert radiologist (p = 0.13). With the help of AI, the discrimination between various grades of bone tumours could be improved, although further external multicenter validation is necessary to assess the generalisability of the proposed models before they can be applied in a prospective clinical setting.
The use of intravenous (gadolinium-based) contrast media for the evaluation of bone tumours has been shown to add some specificity in tissue characterization [156,157], although its main advantages lie in the accurate evaluation of tumour extent for biopsy and treatment planning and in the assessment for recurrence [158,159,160]. Interestingly, a study by Eweje et al. [72] demonstrated that a deep learning model was able to achieve a performance similar to that of expert radiologists for classifying bone lesions at various skeletal locations without using post-contrast T1-weighted sequences, which were made available to the radiologists. The model had an accuracy of 76% vs. 73% for radiologists (p = 0.7) for classifying benign (which includes intermediate as per WHO classification criteria) vs. malignant bone lesions. This preliminary study shows the potential utility of machine learning for the accurate diagnosis of bone tumours without requiring the administration of gadolinium-based MRI contrast media. This could be advantageous, especially in patients who have contraindications to gadolinium-based contrast (due to renal impairment or allergy) or in children, given the pain-related anxiety associated with intravenous cannula placement and the uncertain long-term implications of gadolinium deposition in children [161,162]. Larger studies are required to show whether machine learning using a combination of non-contrast and contrast-enhanced imaging has an advantage over non-contrast imaging alone.

4.4. Machine Learning on Positron Emission Tomography with CT (PET/CT) Imaging

PET/CT imaging is a widely used modality for differentiating malignant vs. benign tumours in a host of organ systems [163,164,165]. The most common radiotracer used in PET/CT imaging is 2-deoxy-2-18F-fluoro-β-D-glucose (18F-FDG), an analogue of glucose, with the concentration of radiotracer accumulation on PET/CT images proportional to the metabolic activity of the tissue with respect to its underlying glucose accumulation and metabolism [166]. Fluorine-18 sodium fluoride (18F-NaF) is another radiotracer used more specifically for bone imaging in PET/CT, with uptake proportional to blood flow in the bone and osseous remodeling [167,168,169]. Increased 18F-NaF uptake can be seen in abnormal bone undergoing increased remodeling, such as in osteoblastic or osteolytic processes, and is used to differentiate various pathologies [170]. For the musculoskeletal system, the utility of PET/CT in distinguishing between malignant and benign bone tumours has also been widely studied and proven effective [171,172,173]. In fact, the metabolic information derived from PET/CT has been reported to provide a better characterization of bone lesions compared to conventional CT or MRI alone [29,174,175]. The standardized uptake value (SUV) technique, with an optimal cut-off value of the maximum SUV (SUVmax), is often the main feature used for the differential diagnosis of osseous tumours on PET/CT [176]. However, the use of SUVmax alone as a distinguishing factor is often limited due to the significant overlap of SUVmax values between malignant vs. benign lesions [177,178,179,180]. Benign conditions, such as osteoarthritis, stress- or trauma-related vertebral fractures, osseous hyperplasia, and underlying metabolic bone disease [181,182,183], have been reported to show high SUVmax values, which may confound the diagnosis of malignant bone lesions.
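For context, the body-weight-normalized SUV referred to above divides the measured tissue activity concentration by the injected activity per unit body weight. A minimal sketch with illustrative numbers follows; decay correction and other practical adjustments are deliberately omitted, and the values are not taken from any cited study.

```python
def standardized_uptake_value(activity_kbq_per_ml: float,
                              injected_dose_mbq: float,
                              body_weight_kg: float) -> float:
    """Body-weight SUV: tissue activity / (injected dose / body weight)."""
    injected_kbq = injected_dose_mbq * 1000.0    # MBq -> kBq
    body_weight_g = body_weight_kg * 1000.0      # kg -> g
    # Assumes a tissue density of ~1 g/mL so that kBq/mL approximates kBq/g.
    return activity_kbq_per_ml / (injected_kbq / body_weight_g)

# Illustrative values only (no decay correction shown).
print(round(standardized_uptake_value(12.0, 300.0, 70.0), 2))  # ~2.8
```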
To improve the current diagnostic efficacy of PET/CT interpretation, Fan et al. [184] utilized texture analysis together with SUVmax to construct radiomics models to distinguish between benign vs. malignant bone lesions. Texture analysis extracts information regarding the relationships between adjacent voxels or pixels and assesses inhomogeneity, which can then be used to predict the likelihood of benign vs. malignant bone lesions [185,186]. By incorporating partial texture features along with SUVmax, the developed classification model (using logistic regression) achieved an accuracy of 87.5%, compared to 84.3% for nuclear medicine physicians (p = 0.03), in differentiating spinal osseous metastases from benign osseous lesions with high SUVmax values.
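Inhomogeneity measures of the kind referred to above are commonly derived from a grey-level co-occurrence matrix (GLCM). The sketch below computes a few GLCM properties with scikit-image on a synthetic, quantized region of interest; it does not reproduce the features or software used in the cited study.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(2)
roi = rng.integers(0, 16, size=(32, 32), dtype=np.uint8)  # synthetic quantized PET ROI

# Co-occurrence of grey levels between horizontally adjacent pixels.
glcm = graycomatrix(roi, distances=[1], angles=[0], levels=16,
                    symmetric=True, normed=True)

texture_features = {
    "contrast": graycoprops(glcm, "contrast")[0, 0],
    "homogeneity": graycoprops(glcm, "homogeneity")[0, 0],
    "energy": graycoprops(glcm, "energy")[0, 0],
}
print(texture_features)
```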
In another study using 18F-NaF PET/CT imaging, Perk T. et al. [87] created a machine learning tool for the automated classification of benign vs. malignant osseous lesions in patients with metastatic prostate cancer who underwent whole-body 18F-NaF PET/CT scans. The group analyzed up to 1751 bone lesions from a total of 37 subjects. The model, which included an analysis of 172 imaging and spatial probability features, showed superior classification performance compared to using SUVmax alone, with an AUC of up to 0.95. In addition, the model was also able to replicate the nuclear medicine physicians' classification of bone lesions, with an AUC range of 0.91–0.93. This machine learning tool may potentially assist physicians in swiftly and accurately detecting and classifying bone lesions on 18F-NaF PET/CT scans.

4.5. Potential Clinical Impact and Applications

There is significant clinical value in the ability of machine learning to differentiate between benign vs. malignant bone lesions. A retrospective study by Stacy et al. [187] found that at least a third of patients with bone lesions referred to orthopedic oncology in a year had images that were diagnosed by radiologists as characteristic of benign tumours or non-malignant entities that did not require follow-up or referrals. Accurate AI models for the characterization of osseous lesions could therefore potentially reduce the rate of unnecessary specialist referrals and follow-up, reducing the associated healthcare costs and patient anxiety regarding a possible cancer diagnosis.
Moreover, more accurate characterization of bone lesions would help radiologists identify high-risk bone lesions that would benefit from biopsy to rule in malignancy with greater certainty. Unnecessary biopsies of benign bone lesions leave patients at risk of post-procedural complications, and hasty biopsy planning can increase the risk of misdiagnosis, creating unwarranted patient stress [188,189]. Biopsies can also be non-diagnostic in up to 30% of bone lesions, requiring repeat biopsies with an associated higher risk of complications [190]. A robust AI model that can identify benign bone lesions with a high specificity could help reduce the rate of unnecessary biopsies. This would be especially helpful for bone tumour subtypes in which the benign and malignant counterparts demonstrate similar imaging and histology features, for example, cartilaginous tumours such as enchondroma (benign) vs. chondrosarcoma [135,165]. Recently, a CT-based radiomics model derived by Gitto et al. [78] was able to distinguish ACT from high-grade chondrosarcoma with an accuracy higher than that of pre-operative biopsy (81%, AUC 0.89 vs. 69%, AUC 0.66), albeit not reaching statistical significance (p = 0.29). Cartilaginous tumour characterisation remains a clinical challenge, as wide resection is often required for definitive diagnosis, and biopsy may inadvertently lead to tumour down-grading in lesions with marked heterogeneity, as only small areas can be sampled [166]. In addition, biopsy of high-grade chondrosarcoma may inadvertently result in spillage or biopsy tract contamination [167,168].
Beyond differentiating bone lesions into benign vs. malignant, machine learning methods can also help differentiate lesions representing post-treatment change from residual or recurrent malignant disease. This is a crucial diagnostic challenge, as the treatments for the two entities are vastly different, and an accurate diagnosis is also important to prevent unnecessary invasive biopsy and/or chemoradiotherapy. However, radiological evaluation of bone tumours after treatment can be quite difficult [191,192]. Zhong et al. [73] created an MRI-derived radiomics nomogram that was found to be clinically useful in discriminating between cervical spine osteoradionecrosis following radiotherapy and metastases, with an AUC of 0.72 in the validation set. Another study by Acar E. et al. [78] utilized machine learning techniques via texture analysis on 68Ga-prostate-specific membrane antigen (PSMA) PET/CT images. These techniques were able to distinguish between sclerotic bone lesions with a complete post-treatment response vs. metastatic bone disease, with an AUC of 0.76.
The use of multiple imaging modalities to evaluate bone tumours is known to improve the accuracy of diagnosis [3]. Machine learning models built using different modalities (multimodal models) have also been shown to improve diagnostic performance [193]. In breast radiology, Antropova et al. [194] developed a CNN method involving fusion-based classification using dynamic contrast-enhanced MRI, full-field digital mammography, and ultrasound. This method outperformed conventional CNN-based and CADx-based classifiers, with an AUC of 0.90. In prostate cancer imaging, Sedghi A. et al. [195] developed fully convolutional network models integrating data from temporal enhanced ultrasound along with the apparent diffusion coefficient (ADC) from MRI studies. The multimodal integration of information from both MRI and ultrasound achieved an AUC of 0.76 to 0.89 for the detection of prostate cancer foci, which outperformed the average unimodal predictions (AUC of 0.66–0.70). Future studies for bone tumours could adopt similar multimodal methods to further improve diagnostic accuracy, for example, CNN models using matched CT and MRI data to predict malignant potential.
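A late-fusion design of the general kind described above can be sketched as two modality-specific encoders whose embeddings are concatenated before a shared classification head. The PyTorch snippet below is a generic illustration under the assumption that per-modality feature vectors (for example, from CT and MRI) are already available; it is not a reimplementation of the cited models.

```python
import torch
import torch.nn as nn

class LateFusionClassifier(nn.Module):
    """Concatenate per-modality embeddings (e.g., CT and MRI) before classification."""

    def __init__(self, ct_dim: int = 128, mri_dim: int = 128) -> None:
        super().__init__()
        self.ct_encoder = nn.Sequential(nn.Linear(ct_dim, 64), nn.ReLU())
        self.mri_encoder = nn.Sequential(nn.Linear(mri_dim, 64), nn.ReLU())
        self.head = nn.Linear(64 + 64, 2)  # benign vs. malignant

    def forward(self, ct_feat: torch.Tensor, mri_feat: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.ct_encoder(ct_feat), self.mri_encoder(mri_feat)], dim=1)
        return self.head(fused)

model = LateFusionClassifier()
logits = model(torch.randn(4, 128), torch.randn(4, 128))  # dummy per-modality feature vectors
print(logits.shape)  # torch.Size([4, 2])
```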

5. Conclusions

Machine learning techniques to discriminate bone lesions have achieved relatively good performance across various imaging modalities, demonstrating high sensitivity, specificity, and accuracy for distinguishing between benign vs. malignant lesions in several cohort studies. These techniques could improve the management of bone tumours in two key ways. Firstly, benign lesions could be targeted for imaging follow-up rather than biopsy, which would reduce unnecessary referrals to specialist clinics and prevent biopsy complications. Secondly, suspected malignant lesions could be targeted for expedited orthopaedic oncology referral, and additional machine learning tools could aid in determining the tumour subtype and the location for the highest biopsy yield. Currently, the majority of studies are at a preliminary stage, limited by small sample sizes and the retrospective nature of the analyses. Further research is required, especially using multicenter external datasets, to assess the generalisability of machine learning techniques on a greater range of imaging data. Multimodal machine learning techniques are an especially exciting area of future work for bone tumour characterization, as they can fuse information from different complementary modalities (for example, radiographs, CT, MRI and PET/CT) with clinical data, similar to current radiologist assessment, and could yield even greater accuracy.

Author Contributions

Conceptualization, methodology, supervision and writing: W.O., Y.L.T., L.Z., B.C.O., E.C.T., J.H.T., B.A.V., N.K., S.T.Q., A.M. and J.T.P.D.H.; Investigation and project administration: W.O., Y.L.T., A.M., S.T.Q. and J.T.P.D.H.; Resources and software: W.O., Y.L.T., L.Z., B.C.O., E.C.T., J.H.T., N.K., B.A.V., S.T.Q., A.M. and J.T.P.D.H.; Formal analysis and validation: W.O., Y.L.T., L.Z., B.C.O., E.C.T. and J.T.P.D.H.; Discussion pointers: W.O., S.T.Q., A.M., J.H.T., N.K. and J.T.P.D.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by: Direct Funding from MOH/NMRC. This research is supported by the Singapore Ministry of Health National Medical Research Council under the NMRC Clinician-scientist individual research grant, new investigator grant (CS-IRG NIG); Grant Title: Deep learning pathway for the management of spine metastases (CNIG20nov-0011, MOH-000725).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ladd, L.M.; Roth, T.D. Computed Tomography and Magnetic Resonance Imaging of Bone Tumors. Semin. Roentgenol. 2017, 52, 209–226.
  2. Piperkova, E.; Mikhaeil, M.; Mousavi, A.; Libes, R.; Viejo-Rullan, F.; Lin, H.; Rosen, G.; Abdel-Dayem, H. Impact of PET and CT in PET/CT studies for staging and evaluating treatment response in bone and soft tissue sarcomas. Clin. Nucl. Med. 2009, 34, 146–150.
  3. Goyal, N.; Kalra, M.; Soni, A.; Baweja, P.; Ghonghe, N.P. Multi-modality imaging approach to bone tumors—State-of-the art. J. Clin. Orthop. Trauma. 2019, 10, 687–701.
  4. Hapani, H.; Kalola, J.; Hapani, J. Comparative role of CT scan and MR imaging in primary malignant bone tumors. IOSR J. Dent. Med. Sci. 2014, 13, 29–35.
  5. Berquist, T.; Dalinka, M.; Alazraki, N.; Daffner, R.; DeSmet, A.; el-Khoury, G.; Goergen, T.; Keats, T.; Manaster, B.; Newberg, A. Bone tumors. American college of radiology. ACR appropriateness criteria. Radiology 2000, 215, 261–264.
  6. Costelloe, C.M.; Rohren, E.M.; Madewell, J.E.; Hamaoka, T.; Theriault, R.L.; Yu, T.-K.; Lewis, V.O.; Ma, J.; Stafford, R.J.; Tari, A.M.; et al. Imaging bone metastases in breast cancer: Techniques and recommendations for diagnosis. Lancet Oncol. 2009, 10, 606–614.
  7. Karanian, M.; Coindre, J.M. Fourth edition of WHO classification tumours of soft tissue. Ann. Pathol. 2015, 35, 71–85.
  8. Vanel, D. General principles of imaging. In Diagnosis of Musculoskeletal Tumors and Tumor-like Conditions: Clinical, Radiological and Histological Correlations—The Rizzoli Case Archive; Picci, P., Manfrini, M., Donati, D.M., Gambarotti, M., Righi, A., Vanel, D., Dei Tos, A.P., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 27–30.
  9. Huang, P.-Y.; Wu, P.-K.; Chen, C.-F.; Lee, F.-T.; Wu, H.-T.; Liu, C.-L.; Chen, T.-H.; Chen, W.-M. Osteomyelitis of the femur mimicking bone tumors: A review of 10 cases. World J. Surg. Oncol. 2013, 11, 283.
  10. Gerber, E.; Said-Hartley, Q.; Gamieldien, R.; Hartley, T.; Candy, S. Accuracy of plain radiographs in diagnosing biopsy-proven malignant bone lesions. SA J. Radiol. 2019, 23, 1768.
  11. Priolo, F.; Cerase, A. The current role of radiography in the assessment of skeletal tumors and tumor-like lesions. Eur. J. Radiol. 1998, 27 (Suppl. S1), S77–S85.
  12. Picci, P. Epidemiology of Bone Lesions. In Diagnosis of Musculoskeletal Tumors and Tumor-like Conditions: Clinical, Radiological and Histological Correlations—The Rizzoli Case Archive; Picci, P., Manfrini, M., Donati, D.M., Gambarotti, M., Righi, A., Vanel, D., Dei Tos, A.P., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 3–9.
  13. Wu, J.S.; Hochman, M.G. Bone Tumors: A Practical Guide to Imaging; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012.
  14. Yang, J.; Li, K.; Deng, H.; Feng, J.; Fei, Y.; Jin, Y.; Liao, C.; Li, Q. CT cinematic rendering for pelvic primary tumor photorealistic visualization. Quant. Imaging Med. Surg. 2018, 8, 804–818.
  15. Zampa, V.; Roselli, G.; Beltrami, G. MRI of bone tumors: Advances in diagnosis and treatment assessment. Imaging Med. 2010, 2, 325–340.
  16. Abdel Razek, A.A.; Castillo, M. Imaging appearance of primary bony tumors and pseudo-tumors of the spine. J. Neuroradiol. 2010, 37, 37–50.
  17. Faiella, E.; Santucci, D.; Calabrese, A.; Russo, F.; Vadala, G.; Zobel, B.B.; Soda, P.; Iannello, G.; de Felice, C.; Denaro, V. Artificial Intelligence in Bone Metastases: An MRI and CT Imaging Review. Int. J. Env. Res. Public Health 2022, 19, 1880.
  18. Salamipour, H.; Jimenez, R.M.; Brec, S.L.; Chapman, V.M.; Kalra, M.K.; Jaramillo, D. Multidetector row CT in pediatric musculoskeletal imaging. Pediatr. Radiol. 2005, 35, 555–564.
  19. Zimmer, W.D.; Berquist, T.H.; McLeod, R.A.; Sim, F.H.; Pritchard, D.J.; Shives, T.C.; Wold, L.E.; May, G.R. Bone tumors: Magnetic resonance imaging vs. computed tomography. Radiology 1985, 155, 709–718.
  20. Nascimento, D.; Suchard, G.; Hatem, M.; de Abreu, A. The role of magnetic resonance imaging in the evaluation of bone tumours and tumour-like lesions. Insights Imaging 2014, 5, 419–440.
  21. Vande Berg, B.C.; Malghem, J.; Lecouvet, F.E.; Maldague, B. Classification and detection of bone marrow lesions with magnetic resonance imaging. Skelet. Radiol. 1998, 27, 529–545.
  22. Davies, A.M.; Sundaram, M.; James, S.L. Imaging of Bone Tumors and Tumor-like Lesions: Techniques and Applications; Springer: Berlin/Heidelberg, Germany, 2009.
  23. Rijswijk, C.S.P.v.; Geirnaerdt, M.J.A.; Hogendoorn, P.C.W.; Taminiau, A.H.M.; Coevorden, F.v.; Zwinderman, A.H.; Pope, T.L.; Bloem, J.L. Soft-Tissue Tumors: Value of Static and Dynamic Gadopentetate Dimeglumine–enhanced MR Imaging in Prediction of Malignancy. Radiology 2004, 233, 493–502.
  24. Tokuda, O.; Hayashi, N.; Taguchi, K.; Matsunaga, N. Dynamic contrast-enhanced perfusion MR imaging of diseased vertebrae: Analysis of three parameters and the distribution of the time-intensity curve patterns. Skelet. Radiol. 2005, 34, 632–638.
  25. Kajihara, M.; Sugawara, Y.; Sakayama, K.; Kikuchi, K.; Mochizuki, T.; Murase, K. Evaluation of tumor blood flow in musculoskeletal lesions: Dynamic contrast-enhanced MR imaging and its possibility when monitoring the response to preoperative chemotherapy—Work in progress. Radiat. Med. 2007, 25, 94–105.
  26. Costa, F.M.; Ferreira, E.C.; Vianna, E.M. Diffusion-Weighted Magnetic Resonance Imaging for the Evaluation of Musculoskeletal Tumors. Magn. Reson. Imaging Clin. N. Am. 2011, 19, 159–180.
  27. van Rijswijk, C.S.; Kunz, P.; Hogendoorn, P.C.; Taminiau, A.H.; Doornbos, J.; Bloem, J.L. Diffusion-weighted MRI in the characterization of soft-tissue tumors. J. Magn. Reson. Imaging 2002, 15, 302–307.
  28. Bischoff, M.; Bischoff, G.; Buck, A.; von Baer, A.; Pauls, S.; Scheffold, F.; Schultheiss, M.; Gebhard, F.; Reske, S.N. Integrated FDG-PET-CT: Its role in the assessment of bone and soft tissue tumors. Arch. Orthop. Trauma Surg. 2010, 130, 819–827.
  29. Charest, M.; Hickeson, M.; Lisbona, R.; Novales-Diaz, J.A.; Derbekyan, V.; Turcotte, R.E. FDG PET/CT imaging in primary osseous and soft tissue sarcomas: A retrospective review of 212 cases. Eur. J. Nucl. Med. Mol. Imaging 2009, 36, 1944–1951.
  30. Schulte, M.; Brecht-Krauss, D.; Heymer, B.; Guhlmann, A.; Hartwig, E.; Sarkar, M.R.; Diederichs, C.G.; Von Baer, A.; Kotzerke, J.; Reske, S.N. Grading of tumors and tumorlike lesions of bone: Evaluation by FDG PET. J. Nucl. Med. 2000, 41, 1695–1701.
  31. Dimitrakopoulou-Strauss, A.; Strauss, L.G.; Heichel, T.; Wu, H.; Burger, C.; Bernd, L.; Ewerbeck, V. The role of quantitative (18)F-FDG PET studies for the differentiation of malignant and benign bone lesions. J. Nucl. Med. 2002, 43, 510–518.
  32. Choi, H.S.; Yoo Ie, R.; Park, H.L.; Choi, E.K.; Kim, S.H.; Lee, W.H. Role of ¹⁸F-FDG PET/CT in differentiation of a benign lesion and metastasis on the ribs of cancer patients. Clin. Imaging 2014, 38, 109–114.
  33. Wang, L.J.; Wu, H.B.; Zhou, W.L.; Yu, S.R.; Wang, Q.S. Gummatous Syphilis Mimicking Malignant Bone Tumor on FDG PET/CT. Clin. Nucl. Med. 2019, 44, 313–316.
  34. Fan, X.; Zhang, H.; Yin, Y.; Zhang, J.; Yang, M.; Qin, S.; Zhang, X.; Yu, F. Texture Analysis of (18)F-FDG PET/CT for Differential Diagnosis Spinal Metastases. Front. Med. 2020, 7, 605746.
  35. Ma, L.D.; Frassica, F.J.; Scott, W.W., Jr.; Fishman, E.K.; Zerbouni, E.A. Differentiation of benign and malignant musculoskeletal tumors: Potential pitfalls with MR imaging. Radiographics 1995, 15, 349–366.
  36. Tran, K.A.; Kondrashova, O.; Bradley, A.; Williams, E.D.; Pearson, J.V.; Waddell, N. Deep learning in cancer diagnosis, prognosis and treatment selection. Genome Med. 2021, 13, 152.
  37. Liu, B.; Chi, W.; Li, X.; Li, P.; Liang, W.; Liu, H.; Wang, W.; He, J. Evolving the pulmonary nodules diagnosis from classical approaches to deep learning-aided decision support: Three decades’ development course and future prospect. J. Cancer Res. Clin. Oncol. 2020, 146, 153–185.
  38. Li, K.; Liu, K.; Zhong, Y.; Liang, M.; Qin, P.; Li, H.; Zhang, R.; Li, S.; Liu, X. Assessing the predictive accuracy of lung cancer, metastases, and benign lesions using an artificial intelligence-driven computer aided diagnosis system. Quant. Imaging Med. Surg. 2021, 11, 3629–3642.
  39. Raya-Povedano, J.L.; Romero-Martín, S.; Elías-Cabot, E.; Gubern-Mérida, A.; Rodríguez-Ruiz, A.; Álvarez-Benito, M. AI-based Strategies to Reduce Workload in Breast Cancer Screening with Mammography and Tomosynthesis: A Retrospective Evaluation. Radiology 2021, 300, 57–65.
  40. Graewingholt, A.; Duffy, S. Retrospective comparison between single reading plus an artificial intelligence algorithm and two-view digital tomosynthesis with double reading in breast screening. J. Med. Screen. 2021, 28, 365–368.
  41. Chakrabarty, S.; Sotiras, A.; Milchenko, M.; LaMontagne, P.; Hileman, M.; Marcus, D. MRI-based Identification and Classification of Major Intracranial Tumor Types by Using a 3D Convolutional Neural Network: A Retrospective Multi-institutional Analysis. Radiol. Artif. Intell. 2021, 3, e200301.
  42. Deepak, S.; Ameer, P.M. Brain tumor classification using deep CNN features via transfer learning. Comput. Biol. Med. 2019, 111, 103345.
  43. Díaz-Pernas, F.J.; Martínez-Zarzuela, M.; Antón-Rodríguez, M.; González-Ortega, D. A Deep Learning Approach for Brain Tumor Classification and Segmentation Using a Multiscale Convolutional Neural Network. Healthcare 2021, 9, 153.
  44. Dmitriev, K.; Kaufman, A.E.; Javed, A.A.; Hruban, R.H.; Fishman, E.K.; Lennon, A.M.; Saltz, J.H. Classification of Pancreatic Cysts in Computed Tomography Images Using a Random Forest and Convolutional Neural Network Ensemble. Med. Image Comput. Comput. Assist. Interv. 2017, 10435, 150–158.
  45. Massafra, R.; Fanizzi, A.; Amoroso, N.; Bove, S.; Comes, M.C.; Pomarico, D.; Didonna, V.; Diotaiuti, S.; Galati, L.; Giotta, F.; et al. Analyzing breast cancer invasive disease event classification through explainable artificial intelligence. Front. Med. 2023, 10, 1116354.
  46. Du, R.; Lee, V.H.; Yuan, H.; Lam, K.O.; Pang, H.H.; Chen, Y.; Lam, E.Y.; Khong, P.L.; Lee, A.W.; Kwong, D.L.; et al. Radiomics Model to Predict Early Progression of Nonmetastatic Nasopharyngeal Carcinoma after Intensity Modulation Radiation Therapy: A Multicenter Study. Radiol. Artif. Intell. 2019, 1, e180075.
  47. Bibault, J.E.; Giraud, P.; Housset, M.; Durdux, C.; Taieb, J.; Berger, A.; Coriat, R.; Chaussade, S.; Dousset, B.; Nordlinger, B.; et al. Deep Learning and Radiomics predict complete response after neo-adjuvant chemoradiation for locally advanced rectal cancer. Sci. Rep. 2018, 8, 12611.
  48. Kao, Y.S.; Hsu, Y. A Meta-Analysis for Using Radiomics to Predict Complete Pathological Response in Esophageal Cancer Patients Receiving Neoadjuvant Chemoradiation. In Vivo 2021, 35, 1857–1863.
  49. DiCenzo, D.; Quiaoit, K.; Fatima, K.; Bhardwaj, D.; Sannachi, L.; Gangeh, M.; Sadeghi-Naini, A.; Dasgupta, A.; Kolios, M.C.; Trudeau, M.; et al. Quantitative ultrasound radiomics in predicting response to neoadjuvant chemotherapy in patients with locally advanced breast cancer: Results from multi-institutional study. Cancer Med. 2020, 9, 5798–5806.
  50. Lin, M.; Momin, S.; Lei, Y.; Wang, H.; Curran, W.J.; Liu, T.; Yang, X. Fully automated segmentation of brain tumor from multiparametric MRI using 3D context deep supervised U-Net. Med. Phys. 2021, 48, 4365–4374.
  51. Primakov, S.P.; Ibrahim, A.; van Timmeren, J.E.; Wu, G.; Keek, S.A.; Beuque, M.; Granzier, R.W.Y.; Lavrova, E.; Scrivener, M.; Sanduleanu, S.; et al. Automated detection and segmentation of non-small cell lung cancer computed tomography images. Nat. Commun. 2022, 13, 3423.
  52. Chang, C.Y.; Buckless, C.; Yeh, K.J.; Torriani, M. Automated detection and segmentation of sclerotic spinal lesions on body CTs using a deep convolutional neural network. Skelet. Radiol. 2022, 51, 391–399.
  53. Goehler, A.; Harry Hsu, T.M.; Lacson, R.; Gujrathi, I.; Hashemi, R.; Chlebus, G.; Szolovits, P.; Khorasani, R. Three-Dimensional Neural Network to Automatically Assess Liver Tumor Burden Change on Consecutive Liver MRIs. J. Am. Coll. Radiol. 2020, 17, 1475–1484.
  54. Anderson, B.M.; Rigaud, B.; Lin, Y.-M.; Jones, A.K.; Kang, H.C.; Odisio, B.C.; Brock, K.K. Automated segmentation of colorectal liver metastasis and liver ablation on contrast-enhanced CT images. Front. Oncol. 2022, 12, 886517.
  55. Fathi Kazerooni, A.; Bagley, S.J.; Akbari, H.; Saxena, S.; Bagheri, S.; Guo, J.; Chawla, S.; Nabavizadeh, A.; Mohan, S.; Bakas, S.; et al. Applications of Radiomics and Radiogenomics in High-Grade Gliomas in the Era of Precision Medicine. Cancers 2021, 13, 5921.
  56. Saxena, S.; Jena, B.; Gupta, N.; Das, S.; Sarmah, D.; Bhattacharya, P.; Nath, T.; Paul, S.; Fouda, M.M.; Kalra, M.; et al. Role of Artificial Intelligence in Radiogenomics for Cancers in the Era of Precision Medicine. Cancers 2022, 14, 2860.
  57. Ren, M.; Yang, H.; Lai, Q.; Shi, D.; Liu, G.; Shuang, X.; Su, J.; Xie, L.; Dong, Y.; Jiang, X. MRI-based radiomics analysis for predicting the EGFR mutation based on thoracic spinal metastases in lung adenocarcinoma patients. Med. Phys. 2021, 48, 5142–5151.
  58. Darvish, L.; Bahreyni-Toossi, M.-T.; Roozbeh, N.; Azimian, H. The role of radiogenomics in the diagnosis of breast cancer: A systematic review. Egypt. J. Med. Hum. Genet. 2022, 23, 99.
  59. Zhou, Y.; Feng, B.-J.; Yue, W.-W.; Liu, Y.; Xu, Z.-F.; Xing, W.; Xu, Z.; Yao, J.-C.; Wang, S.-R.; Xu, D. Differentiating non-lactating mastitis and malignant breast tumors by deep-learning based AI automatic classification system: A preliminary study. Front. Oncol. 2022, 12, 997306.
  60. Kayode, A.A.; Akande, N.O.; Adegun, A.A.; Adebiyi, M.O. An automated mammogram classification system using modified support vector machine. Med. Devices 2019, 12, 275–284.
  61. Mohanty, A.K.; Senapati, M.R.; Beberta, S.; Lenka, S.K. Texture-based features for classification of mammograms using decision tree. Neural Comput. Appl. 2013, 23, 1011–1017.
  62. Liberini, V.; Laudicella, R.; Balma, M.; Nicolotti, D.G.; Buschiazzo, A.; Grimaldi, S.; Lorenzon, L.; Bianchi, A.; Peano, S.; Bartolotta, T.V.; et al. Radiomics and artificial intelligence in prostate cancer: New tools for molecular hybrid imaging and theragnostics. Eur. Radiol. Exp. 2022, 6, 27.
  63. Van Booven, D.J.; Kuchakulla, M.; Pai, R.; Frech, F.S.; Ramasahayam, R.; Reddy, P.; Parmar, M.; Ramasamy, R.; Arora, H. A Systematic Review of Artificial Intelligence in Prostate Cancer. Res. Rep. Urol. 2021, 13, 31–39.
  64. Wan, Y.L.; Wu, P.W.; Huang, P.C.; Tsay, P.K.; Pan, K.T.; Trang, N.N.; Chuang, W.Y.; Wu, C.Y.; Lo, S.B. The Use of Artificial Intelligence in the Differentiation of Malignant and Benign Lung Nodules on Computed Tomograms Proven by Surgical Pathology. Cancers 2020, 12, 2211.
  65. Heuvelmans, M.A.; van Ooijen, P.M.A.; Ather, S.; Silva, C.F.; Han, D.; Heussel, C.P.; Hickes, W.; Kauczor, H.-U.; Novotny, P.; Peschl, H.; et al. Lung cancer prediction by Deep Learning to identify benign lung nodules. Lung Cancer 2021, 154, 1–4.
  66. Li, X.; Hu, B.; Li, H.; You, B. Application of artificial intelligence in the diagnosis of multiple primary lung cancer. Thorac. Cancer 2019, 10, 2168–2174.
  67. Li, X.; Lu, Y.; Xiong, J.; Wang, D.; She, D.; Kuai, X.; Geng, D.; Yin, B. Presurgical differentiation between malignant haemangiopericytoma and angiomatous meningioma by a radiomics approach based on texture analysis. J. Neuroradiol. 2019, 46, 281–287.
  68. Ullah, N.; Khan, J.A.; Khan, M.S.; Khan, W.; Hassan, I.; Obayya, M.; Negm, N.; Salama, A.S. An Effective Approach to Detect and Identify Brain Tumors Using Transfer Learning. Appl. Sci. 2022, 12, 5645.
  69. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71.
  70. Liberati, A.; Altman, D.G.; Tetzlaff, J.; Mulrow, C.; Gøtzsche, P.C.; Ioannidis, J.P.; Clarke, M.; Devereaux, P.J.; Kleijnen, J.; Moher, D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: Explanation and elaboration. BMJ 2009, 339, b2700.
  71. Xiong, X.; Wang, J.; Hu, S.; Dai, Y.; Zhang, Y.; Hu, C. Differentiating between Multiple Myeloma and Metastasis Subtypes of Lumbar Vertebra Lesions Using Machine Learning-Based Radiomics. Front. Oncol. 2021, 11, 601699.
  72. Eweje, F.R.; Bao, B.; Wu, J.; Dalal, D.; Liao, W.H.; He, Y.; Luo, Y.; Lu, S.; Zhang, P.; Peng, X.; et al. Deep Learning for Classification of Bone Lesions on Routine MRI. EBioMedicine 2021, 68, 103402. [Google Scholar] [CrossRef]
  73. Zhong, X.; Li, L.; Jiang, H.; Yin, J.; Lu, B.; Han, W.; Li, J.; Zhang, J. Cervical spine osteoradionecrosis or bone metastasis after radiotherapy for nasopharyngeal carcinoma? The MRI-based radiomics for characterization. BMC Med. Imaging 2020, 20, 104. [Google Scholar] [CrossRef]
  74. Sun, W.; Liu, S.; Guo, J.; Liu, S.; Hao, D.; Hou, F.; Wang, H.; Xu, W. A CT-based radiomics nomogram for distinguishing between benign and malignant bone tumours. Cancer Imaging 2021, 21, 20. [Google Scholar] [CrossRef]
  75. Reinus, W.R.; Wilson, A.J.; Kalman, B.; Kwasny, S. Diagnosis of focal bone lesions using neural networks. Investig. Radiol. 1994, 29, 606–611. [Google Scholar] [CrossRef]
  76. Filograna, L.; Lenkowicz, J.; Cellini, F.; Dinapoli, N.; Manfrida, S.; Magarelli, N.; Leone, A.; Colosimo, C.; Valentini, V. Identification of the most significant magnetic resonance imaging (MRI) radiomic features in oncological patients with vertebral bone marrow metastatic disease: A feasibility study. Radiol. Med. 2019, 124, 50–57. [Google Scholar] [CrossRef]
  77. Yin, P.; Mao, N.; Zhao, C.; Wu, J.; Chen, L.; Hong, N. A Triple-Classification Radiomics Model for the Differentiation of Primary Chordoma, Giant Cell Tumor, and Metastatic Tumor of Sacrum Based on T2-Weighted and Contrast-Enhanced T1-Weighted MRI. J. Magn. Reson. Imaging 2019, 49, 752–759. [Google Scholar] [CrossRef] [PubMed]
  78. Acar, E.; Leblebici, A.; Ellidokuz, B.E.; Basbinar, Y.; Kaya, G.C. Machine learning for differentiating metastatic and completely responded sclerotic bone lesion in prostate cancer: A retrospective radiomics study. Br. J. Radiol. 2019, 92, 20190286. [Google Scholar] [CrossRef] [PubMed]
  79. He, Y.; Pan, I.; Bao, B.; Halsey, K.; Chang, M.; Liu, H.; Peng, S.; Sebro, R.A.; Guan, J.; Yi, T.; et al. Deep learning-based classification of primary bone tumors on radiographs: A preliminary study. EBioMedicine 2020, 62, 103121. [Google Scholar] [CrossRef] [PubMed]
  80. Reicher, J.J.; Do, B.H.; Nguyen, M.; Beaulieu, C.F. Single-input bone tumor diagnosis based on convolutional neural network classification of bone tumor matrix. In Proceedings of the SIIM Annual Meeting 2018, National Harbor, MD, USA, 31 May–2 June 2018. [Google Scholar]
  81. Li, Y.; Zhou, W.; Lv, G.; Luo, G.; Zhu, Y.; Liu, J. Classification of Bone Tumor on CT Images Using Deep Convolutional Neural Network. In Proceedings of the Artificial Neural Networks and Machine Learning—ICANN 2018, Rhodes, Greece, 4–7 October 2018; Lecture Notes in Computer Science. pp. 127–136. [Google Scholar]
  82. Park, C.W.; Oh, S.J.; Kim, K.S.; Jang, M.C.; Kim, I.S.; Lee, Y.K.; Chung, M.J.; Cho, B.H.; Seo, S.W. Artificial intelligence-based classification of bone tumors in the proximal femur on plain radiographs: System development and validation. PLoS ONE 2022, 17, e0264140. [Google Scholar] [CrossRef] [PubMed]
  83. Liu, R.; Pan, D.; Xu, Y.; Zeng, H.; He, Z.; Lin, J.; Zeng, W.; Wu, Z.; Luo, Z.; Qin, G.; et al. A deep learning-machine learning fusion approach for the classification of benign, malignant, and intermediate bone tumors. Eur. Radiol. 2022, 32, 1371–1383. [Google Scholar] [CrossRef]
  84. Pan, D.; Liu, R.; Zheng, B.; Yuan, J.; Zeng, H.; He, Z.; Luo, Z.; Qin, G.; Chen, W. Using Machine Learning to Unravel the Value of Radiographic Features for the Classification of Bone Tumors. Biomed. Res. Int. 2021, 2021, 8811056. [Google Scholar] [CrossRef]
  85. Hong, J.H.; Jung, J.Y.; Jo, A.; Nam, Y.; Pak, S.; Lee, S.Y.; Park, H.; Lee, S.E.; Kim, S. Development and Validation of a Radiomics Model for Differentiating Bone Islands and Osteoblastic Bone Metastases at Abdominal CT. Radiology 2021, 299, 626–632. [Google Scholar] [CrossRef]
  86. Von Schacky, C.E.; Wilhelm, N.J.; Schafer, V.S.; Leonhardt, Y.; Jung, M.; Jungmann, P.M.; Russe, M.F.; Foreman, S.C.; Gassert, F.G.; Gassert, F.T.; et al. Development and evaluation of machine learning models based on X-ray radiomics for the classification and differentiation of malignant and benign bone tumors. Eur. Radiol. 2022, 32, 6247–6257. [Google Scholar] [CrossRef]
  87. Perk, T.; Bradshaw, T.; Chen, S.; Im, H.J.; Cho, S.; Perlman, S.; Liu, G.; Jeraj, R. Automated classification of benign and malignant lesions in (18)F-NaF PET/CT images using machine learning. Phys. Med. Biol. 2018, 63, 225019. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  88. Do, B.H.; Langlotz, C.; Beaulieu, C.F. Bone Tumor Diagnosis Using a Naive Bayesian Model of Demographic and Radiographic Features. J. Digit. Imaging 2017, 30, 640–647. [Google Scholar] [CrossRef] [PubMed]
  89. Kahn, C.E., Jr.; Laur, J.J.; Carrera, G.F. A Bayesian network for diagnosis of primary bone tumors. J. Digit. Imaging 2001, 14, 56–57. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  90. Gitto, S.; Cuocolo, R.; van Langevelde, K.; van de Sande, M.A.J.; Parafioriti, A.; Luzzati, A.; Imbriaco, M.; Sconfienza, L.M.; Bloem, J.L. MRI radiomics-based machine learning classification of atypical cartilaginous tumour and grade II chondrosarcoma of long bones. EBioMedicine 2022, 75, 103757. [Google Scholar] [CrossRef]
  91. Gitto, S.; Cuocolo, R.; Albano, D.; Chianca, V.; Messina, C.; Gambino, A.; Ugga, L.; Cortese, M.C.; Lazzara, A.; Ricci, D.; et al. MRI radiomics-based machine-learning classification of bone chondrosarcoma. Eur. J. Radiol. 2020, 128, 109043. [Google Scholar] [CrossRef] [PubMed]
  92. Yin, P.; Mao, N.; Chen, H.; Sun, C.; Wang, S.; Liu, X.; Hong, N. Machine and Deep Learning Based Radiomics Models for Preoperative Prediction of Benign and Malignant Sacral Tumors. Front. Oncol. 2020, 10, 564725. [Google Scholar] [CrossRef] [PubMed]
  93. Georgeanu, V.A.; Mamuleanu, M.; Ghiea, S.; Selisteanu, D. Malignant Bone Tumors Diagnosis Using Magnetic Resonance Imaging Based on Deep Learning Algorithms. Medicina 2022, 58, 636. [Google Scholar] [CrossRef] [PubMed]
  94. Xu, R.; Kido, S.; Suga, K.; Hirano, Y.; Tachibana, R.; Muramatsu, K.; Chagawa, K.; Tanaka, S. Texture analysis on (18)F-FDG PET/CT images to differentiate malignant and benign bone and soft-tissue lesions. Ann. Nucl. Med. 2014, 28, 926–935. [Google Scholar] [CrossRef]
  95. Chianca, V.; Cuocolo, R.; Gitto, S.; Albano, D.; Merli, I.; Badalyan, J.; Cortese, M.C.; Messina, C.; Luzzati, A.; Parafioriti, A.; et al. Radiomic Machine Learning Classifiers in Spine Bone Tumors: A Multi-Software, Multi-Scanner Study. Eur. J. Radiol. 2021, 137, 109586. [Google Scholar] [CrossRef]
  96. Gitto, S.; Bologna, M.; Corino, V.D.A.; Emili, I.; Albano, D.; Messina, C.; Armiraglio, E.; Parafioriti, A.; Luzzati, A.; Mainardi, L.; et al. Diffusion-weighted MRI radiomics of spine bone tumors: Feature stability and machine learning-based classification performance. Radiol. Med. 2022, 127, 518–525. [Google Scholar] [CrossRef]
  97. Consalvo, S.; Hinterwimmer, F.; Neumann, J.; Steinborn, M.; Salzmann, M.; Seidl, F.; Lenze, U.; Knebel, C.; Rueckert, D.; Burgkart, R.H.H. Two-Phase Deep Learning Algorithm for Detection and Differentiation of Ewing Sarcoma and Acute Osteomyelitis in Paediatric Radiographs. Anticancer Res. 2022, 42, 4371–4380. [Google Scholar] [CrossRef] [PubMed]
  98. Zhao, K.; Zhang, M.; Xie, Z.; Yan, X.; Wu, S.; Liao, P.; Lu, H.; Shen, W.; Fu, C.; Cui, H.; et al. Deep Learning Assisted Diagnosis of Musculoskeletal Tumors Based on Contrast-Enhanced Magnetic Resonance Imaging. J. Magn. Reson. Imaging 2022, 56, 99–107. [Google Scholar] [CrossRef] [PubMed]
  99. Bradshaw, T.; Perk, T.; Chen, S.; Im, H.-J.; Cho, S.; Perlman, S.; Jeraj, R. Deep learning for classification of benign and malignant bone lesions in [F-18]NaF PET/CT images. J. Nucl. Med. 2018, 59, 327. [Google Scholar]
  100. Do, N.-T.; Jung, S.-T.; Yang, H.-J.; Kim, S.-H. Multi-Level Seg-Unet Model with Global and Patch-Based X-ray Images for Knee Bone Tumor Detection. Diagnostics 2021, 11, 691. [Google Scholar] [CrossRef] [PubMed]
  101. Masoudi, S.; Mehralivand, S.; Harmon, S.A.; Lay, N.; Lindenberg, L.; Mena, E.; Pinto, P.A.; Citrin, D.E.; Gulley, J.L.; Wood, B.J.; et al. Deep Learning Based Staging of Bone Lesions from Computed Tomography Scans. IEEE Access 2021, 9, 87531–87542. [Google Scholar] [CrossRef]
  102. Gitto, S.; Cuocolo, R.; Annovazzi, A.; Anelli, V.; Acquasanta, M.; Cincotta, A.; Albano, D.; Chianca, V.; Ferraresi, V.; Messina, C.; et al. CT radiomics-based machine learning classification of atypical cartilaginous tumours and appendicular chondrosarcomas. EBioMedicine 2021, 68, 103407. [Google Scholar] [CrossRef]
  103. Schacky, C.E.v.; Wilhelm, N.J.; Schäfer, V.S.; Leonhardt, Y.; Gassert, F.G.; Foreman, S.C.; Gassert, F.T.; Jung, M.; Jungmann, P.M.; Russe, M.F.; et al. Multitask Deep Learning for Segmentation and Classification of Primary Bone Tumors on Radiographs. Radiology 2021, 301, 398–406. [Google Scholar] [CrossRef]
  104. Rajkomar, A.; Dean, J.; Kohane, I. Machine Learning in Medicine. N. Engl. J. Med. 2019, 380, 1347–1358. [Google Scholar] [CrossRef]
  105. Thrall, J.H.; Li, X.; Li, Q.; Cruz, C.; Do, S.; Dreyer, K.; Brink, J. Artificial Intelligence and Machine Learning in Radiology: Opportunities, Challenges, Pitfalls, and Criteria for Success. J. Am. Coll. Radiol. 2018, 15, 504–508. [Google Scholar] [CrossRef]
  106. Hosny, A.; Parmar, C.; Quackenbush, J.; Schwartz, L.H.; Aerts, H. Artificial intelligence in radiology. Nat. Rev. Cancer 2018, 18, 500–510. [Google Scholar] [CrossRef]
  107. Ong, W.; Zhu, L.; Zhang, W.; Kuah, T.; Lim, D.S.W.; Low, X.Z.; Thian, Y.L.; Teo, E.C.; Tan, J.H.; Kumar, N.; et al. Application of Artificial Intelligence Methods for Imaging of Spinal Metastasis. Cancers 2022, 14, 4025. [Google Scholar] [CrossRef]
  108. Erickson, B.J.; Korfiatis, P.; Akkus, Z.; Kline, T.L. Machine Learning for Medical Imaging. Radiographics 2017, 37, 505–515. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  109. van Timmeren, J.E.; Cester, D.; Tanadini-Lang, S.; Alkadhi, H.; Baessler, B. Radiomics in medical imaging-“how-to” guide and critical reflection. Insights Imaging 2020, 11, 91. [Google Scholar] [CrossRef]
  110. Mannil, M.; von Spiczak, J.; Manka, R.; Alkadhi, H. Texture Analysis and Machine Learning for Detecting Myocardial Infarction in Noncontrast Low-Dose Computed Tomography: Unveiling the Invisible. Investig. Radiol. 2018, 53, 338–343. [Google Scholar] [CrossRef] [Green Version]
  111. Aerts, H.J. The Potential of Radiomic-Based Phenotyping in Precision Medicine: A Review. JAMA Oncol. 2016, 2, 1636–1642. [Google Scholar] [CrossRef] [PubMed]
  112. Lambin, P.; Leijenaar, R.T.H.; Deist, T.M.; Peerlings, J.; de Jong, E.E.C.; van Timmeren, J.; Sanduleanu, S.; Larue, R.; Even, A.J.G.; Jochems, A.; et al. Radiomics: The bridge between medical imaging and personalized medicine. Nat. Rev. Clin. Oncol. 2017, 14, 749–762. [Google Scholar] [CrossRef] [Green Version]
  113. Shur, J.D.; Doran, S.J.; Kumar, S.; ap Dafydd, D.; Downey, K.; O’Connor, J.P.B.; Papanikolaou, N.; Messiou, C.; Koh, D.-M.; Orton, M.R. Radiomics in Oncology: A Practical Guide. RadioGraphics 2021, 41, 1717–1732. [Google Scholar] [CrossRef]
  114. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef] [Green Version]
  115. Liu, Z.; Wang, S.; Dong, D.; Wei, J.; Fang, C.; Zhou, X.; Sun, K.; Li, L.; Li, B.; Wang, M.; et al. The Applications of Radiomics in Precision Diagnosis and Treatment of Oncology: Opportunities and Challenges. Theranostics 2019, 9, 1303–1322. [Google Scholar] [CrossRef]
  116. Ciresan, D.C.; Meier, U.; Masci, J.; Gambardella, L.M.; Schmidhuber, J. Flexible, high performance convolutional neural networks for image classification. In Proceedings of the 22nd International Joint Conference on Artificial Intelligence, Catalonia, Spain, 16–22 July 2011. [Google Scholar]
  117. Zaharchuk, G.; Gong, E.; Wintermark, M.; Rubin, D.; Langlotz, C.P. Deep Learning in Neuroradiology. AJNR Am. J. Neuroradiol. 2018, 39, 1776–1784. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  118. Kaka, H.; Zhang, E.; Khan, N. Artificial intelligence and deep learning in neuroradiology: Exploring the new frontier. Can. Assoc. Radiol. J. 2021, 72, 35–44. [Google Scholar] [CrossRef]
  119. Ziyad, S.R.; Radha, V.; Vayyapuri, T. Overview of Computer Aided Detection and Computer Aided Diagnosis Systems for Lung Nodule Detection in Computed Tomography. Curr. Med. Imaging Rev. 2020, 16, 16–26. [Google Scholar] [CrossRef]
  120. Roodman, G.D. Skeletal imaging and management of bone disease. Hematol. Am. Soc. Hematol. Educ. Program 2008, 2008, 313–319. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  121. Umer, M.; Hasan, O.H.A.; Khan, D.; Uddin, N.; Noordin, S. Systematic approach to musculoskeletal benign tumors. Int. J. Surg. Oncol. 2017, 2, e46. [Google Scholar] [CrossRef] [Green Version]
  122. Remotti, F.; Feldman, F. Nonneoplastic lesions that simulate primary tumors of bone. Arch. Pathol. Lab. Med. 2012, 136, 772–788. [Google Scholar] [CrossRef] [Green Version]
  123. Teo, H.E.; Peh, W.C. Primary bone tumors of adulthood. Cancer Imaging 2004, 4, 74–83. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  124. Miller, T.T. Bone tumors and tumorlike conditions: Analysis with conventional radiography. Radiology 2008, 246, 662–674. [Google Scholar] [CrossRef]
  125. Costelloe, C.M.; Madewell, J.E. Radiography in the initial diagnosis of primary bone tumors. AJR Am. J. Roentgenol. 2013, 200, 3–7. [Google Scholar] [CrossRef]
  126. Gemescu, I.N.; Thierfelder, K.M.; Rehnitz, C.; Weber, M.-A. Imaging Features of Bone Tumors: Conventional Radiographs and MR Imaging Correlation. Magn. Reson. Imaging Clin. N. Am. 2019, 27, 753–767. [Google Scholar] [CrossRef]
  127. Hayes, C.W.; Conway, W.F.; Sundaram, M. Misleading aggressive MR imaging appearance of some benign musculoskeletal lesions. Radiographics 1992, 12, 1119–1134; discussion 1135–1136. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  128. Camargo, O.P.; Croci, A.T.; Oliveira, C.R.; Baptista, A.M.; Caiero, M.T. Functional and radiographic evaluation of 214 aggressive benign bone lesions treated with curettage, cauterization, and cementation: 24 years of follow-up. Clinics 2005, 60, 439–444. [Google Scholar] [CrossRef] [Green Version]
  129. Fletcher, C.D.; Unni, K.; Mertens, F. World Health Organization classification of tumours. In Pathology and Genetics of Tumours of Soft Tissue and Bone; IARC Press: Lyon, France, 2002. [Google Scholar]
  130. Weber, M.A.; Papakonstantinou, O.; Nikodinovska, V.V.; Vanhoenacker, F.M. Ewing’s Sarcoma and Primary Osseous Lymphoma: Spectrum of Imaging Appearances. Semin. Musculoskelet Radiol. 2019, 23, 36–57. [Google Scholar] [CrossRef]
  131. McCarville, M.B.; Chen, J.Y.; Coleman, J.L.; Li, Y.; Li, X.; Adderson, E.E.; Neel, M.D.; Gold, R.E.; Kaufman, R.A. Distinguishing Osteomyelitis From Ewing Sarcoma on Radiography and MRI. AJR Am. J. Roentgenol. 2015, 205, 640–650; quiz 651. [Google Scholar] [CrossRef] [Green Version]
  132. Kuleta-Bosak, E.; Kluczewska, E.; Machnik-Broncel, J.; Madziara, W.; Ciupinska-Kajor, M.; Sojka, D.; Rogala, W.; Juszczyk, J.; Wilk, R. Suitability of imaging methods (X-ray, CT, MRI) in the diagnostics of Ewing’s sarcoma in children—Analysis of own material. Pol. J. Radiol. 2010, 75, 18–28. [Google Scholar]
  133. Mar, W.A.; Taljanovic, M.S.; Bagatell, R.; Graham, A.R.; Speer, D.P.; Hunter, T.B.; Rogers, L.F. Update on imaging and treatment of Ewing sarcoma family tumors: What the radiologist needs to know. J. Comput. Assist. Tomogr. 2008, 32, 108–118. [Google Scholar] [CrossRef]
  134. Vanel, D.; Ruggieri, P.; Ferrari, S.; Picci, P.; Gambarotti, M.; Staals, E.; Alberghini, M. The incidental skeletal lesion: Ignore or explore? Cancer Imaging 2009, 9, S38–S43. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  135. Caballes, R.L.; Caballes, R.A., Jr. Polyostotic giant enostoses with strongly positive radionuclide bone scan. Ann. Diagn. Pathol. 2004, 8, 247–251. [Google Scholar] [CrossRef]
  136. Ulano, A.; Bredella, M.A.; Burke, P.; Chebib, I.; Simeone, F.J.; Huang, A.J.; Torriani, M.; Chang, C.Y. Distinguishing Untreated Osteoblastic Metastases From Enostoses Using CT Attenuation Measurements. Am. J. Roentgenol. 2016, 207, 362–368. [Google Scholar] [CrossRef] [PubMed]
  137. Sala, F.; Dapoto, A.; Morzenti, C.; Firetto, M.C.; Valle, C.; Tomasoni, A.; Sironi, S. Bone islands incidentally detected on computed tomography: Frequency of enostosis and differentiation from untreated osteoblastic metastases based on CT attenuation value. Br. J. Radiol. 2019, 92, 20190249. [Google Scholar] [CrossRef] [PubMed]
  138. Dong, Y.; Zheng, S.; Machida, H.; Wang, B.; Liu, A.; Liu, Y.; Zhang, X. Differential diagnosis of osteoblastic metastases from bone islands in patients with lung cancer by single-source dual-energy CT: Advantages of spectral CT imaging. Eur. J. Radiol. 2015, 84, 901–907. [Google Scholar] [CrossRef] [PubMed]
  139. Schajowicz, F.; Sissons, H.A.; Sobin, L.H. The World Health Organization’s histologic classification of bone tumors. A commentary on the second edition. Cancer 1995, 75, 1208–1214. [Google Scholar] [CrossRef] [PubMed]
  140. Clézardin, P.; Coleman, R.; Puppo, M.; Ottewell, P.; Bonnelye, E.; Paycha, F.; Confavreux, C.B.; Holen, I. Bone metastasis: Mechanisms, therapies, and biomarkers. Physiol. Rev. 2021, 101, 797–855. [Google Scholar] [CrossRef]
  141. Rajiah, P.; Ilaslan, H.; Sundaram, M. Imaging of primary malignant bone tumors (nonhematological). Radiol. Clin. N. Am. 2011, 49, 1135–1161. [Google Scholar] [CrossRef] [PubMed]
  142. Strobel, K.; Exner, U.E.; Stumpe, K.D.; Hany, T.F.; Bode, B.; Mende, K.; Veit-Haibach, P.; von Schulthess, G.K.; Hodler, J. The additional value of CT images interpretation in the differential diagnosis of benign vs. malignant primary bone lesions with 18F-FDG-PET/CT. Eur. J. Nucl. Med. Mol. Imaging 2008, 35, 2000–2008. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  143. Cannavò, L.; Albano, D.; Messina, C.; Corazza, A.; Rapisarda, S.; Pozzi, G.; Di Bernardo, A.; Parafioriti, A.; Scotto, G.; Perrucchini, G.; et al. Accuracy of CT and MRI to assess resection margins in primary malignant bone tumours having histology as the reference standard. Clin. Radiol. 2019, 74, e713–e736. [Google Scholar] [CrossRef]
  144. James, S.L.; Panicek, D.M.; Davies, A.M. Bone marrow oedema associated with benign and malignant bone tumours. Eur. J. Radiol. 2008, 67, 11–21. [Google Scholar] [CrossRef]
  145. Oh, E.; Yoon, Y.C.; Kim, J.H.; Kim, K. Multiparametric approach with diffusion-weighted imaging and dynamic contrast-enhanced MRI: A comparison study for differentiating between benign and malignant bone lesions in adults. Clin. Radiol. 2017, 72, 552–559. [Google Scholar] [CrossRef]
  146. Hillengass, J.; Stieltjes, B.; Bäuerle, T.; McClanahan, F.; Heiss, C.; Hielscher, T.; Wagner-Gund, B.; Habetler, V.; Goldschmidt, H.; Schlemmer, H.P.; et al. Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) and diffusion-weighted imaging of bone marrow in healthy individuals. Acta Radiol. 2011, 52, 324–330. [Google Scholar] [CrossRef]
  147. Suh, C.H.; Yun, S.J.; Jin, W.; Lee, S.H.; Park, S.Y.; Ryu, C.-W. ADC as a useful diagnostic tool for differentiating benign and malignant vertebral bone marrow lesions and compression fractures: A systematic review and meta-analysis. Eur. Radiol. 2018, 28, 2890–2902. [Google Scholar] [CrossRef]
  148. Pozzi, G.; Albano, D.; Messina, C.; Angileri, S.A.; Al-Mnayyis, A.a.; Galbusera, F.; Luzzati, A.; Perrucchini, G.; Scotto, G.; Parafioriti, A.; et al. Solid bone tumors of the spine: Diagnostic performance of apparent diffusion coefficient measured using diffusion-weighted MRI using histology as a reference standard. J. Magn. Reson. Imaging 2018, 47, 1034–1042. [Google Scholar] [CrossRef]
  149. Sharma, G.; Saran, S.; Saxena, S.; Goyal, T. Multiparametric evaluation of bone tumors utilising diffusion weighted imaging and dynamic contrast enhanced magnetic resonance imaging. J. Clin. Orthop. Trauma. 2022, 30, 101899. [Google Scholar] [CrossRef] [PubMed]
  150. Kayhan, A.; Yang, C.; Soylu, F.N.; Lakadamyalı, H.; Sethi, I.; Karczmar, G.; Stadler, W.; Oto, A. Dynamic contrast-enhanced MR imaging findings of bone metastasis in patients with prostate cancer. World J. Radiol. 2011, 3, 241–245. [Google Scholar] [CrossRef] [PubMed]
  151. Jung, H.-S.; Jee, W.-H.; McCauley, T.R.; Ha, K.-Y.; Choi, K.-H. Discrimination of Metastatic from Acute Osteoporotic Compression Spinal Fractures with MR Imaging. RadioGraphics 2003, 23, 179–187. [Google Scholar] [CrossRef]
  152. WHO Classification of Tumours Editorial Board. Soft Tissue and Bone Tumours; International Agency for Research on Cancer: Lyon, France, 2020. [Google Scholar]
  153. Van Praag Veroniek, V.M.; Rueten-Budde, A.J.; Ho, V.; Dijkstra, P.D.S.; Fiocco, M.; van de Sande, M.A.J. Incidence, outcomes and prognostic factors during 25 years of treatment of chondrosarcomas. Surg. Oncol. 2018, 27, 402–408. [Google Scholar] [CrossRef]
  154. Skeletal Lesions Interobserver Correlation among Expert Diagnosticians (SLICED) Study Group. Reliability of Histopathologic and Radiologic Grading of Cartilaginous Neoplasms in Long Bones. J. Bone Jt. Surg. 2007, 89, 2113–2123. [Google Scholar] [CrossRef]
  155. Eefting, D.; Schrage, Y.M.; Geirnaerdt, M.J.; Le Cessie, S.; Taminiau, A.H.; Bovée, J.V.; Hogendoorn, P.C. Assessment of interobserver variability and histologic parameters to improve reliability in classification and grading of central cartilaginous tumors. Am. J. Surg. Pathol. 2009, 33, 50–57. [Google Scholar] [CrossRef] [PubMed]
  156. Verstraete, K.L.; Lang, P. Bone and soft tissue tumors: The role of contrast agents for MR imaging. Eur. J. Radiol. 2000, 34, 229–246. [Google Scholar] [CrossRef]
  157. Kransdorf, M.J. The use of gadolinium in the MR evaluation of musculoskeletal tumors. Top. Magn. Reson. Imaging 1996, 8, 15–23. [Google Scholar] [CrossRef]
  158. May, D.A.; Good, R.B.; Smith, D.K.; Parsons, T.W. MR imaging of musculoskeletal tumors and tumor mimickers with intravenous gadolinium: Experience with 242 patients. Skelet. Radiol. 1997, 26, 2–15. [Google Scholar] [CrossRef]
  159. Sundaram, M. The use of gadolinium in the MR imaging of bone tumors. Semin Ultrasound CT MR 1997, 18, 307–311. [Google Scholar] [CrossRef]
  160. Stacy, G.S.; Mahal, R.S.; Peabody, T.D. Staging of Bone Tumors: A Review with Illustrative Examples. Am. J. Roentgenol. 2006, 186, 967–976. [Google Scholar] [CrossRef] [PubMed]
  161. Mithal, L.B.; Patel, P.S.; Mithal, D.; Palac, H.L.; Rozenfeld, M.N. Use of gadolinium-based magnetic resonance imaging contrast agents and awareness of brain gadolinium deposition among pediatric providers in North America. Pediatr. Radiol. 2017, 47, 657–664. [Google Scholar] [CrossRef] [PubMed]
  162. Bhargava, R.; Hahn, G.; Hirsch, W.; Kim, M.J.; Mentzel, H.J.; Olsen, O.E.; Stokland, E.; Triulzi, F.; Vazquez, E. Contrast-enhanced magnetic resonance imaging in pediatric patients: Review and recommendations for current practice. Magn. Reson. Insights 2013, 6, 95–111. [Google Scholar] [CrossRef] [Green Version]
  163. Strauss, L.G.; Conti, P.S. The applications of PET in clinical oncology. J. Nucl. Med. 1991, 32, 623–648; discussion 649–650. [Google Scholar] [PubMed]
  164. Hoh, C.K.; Schiepers, C.; Seltzer, M.A.; Gambhir, S.S.; Silverman, D.H.; Czernin, J.; Maddahi, J.; Phelps, M.E. PET in oncology: Will it replace the other modalities? Semin. Nucl. Med. 1997, 27, 94–106. [Google Scholar] [CrossRef] [PubMed]
  165. Brock, C.S.; Meikle, S.R.; Price, P. Does fluorine-18 fluorodeoxyglucose metabolic imaging of tumours benefit oncology? Eur. J. Nucl. Med. 1997, 24, 691–705. [Google Scholar] [CrossRef]
  166. Alauddin, M.M. Positron emission tomography (PET) imaging with (18)F-based radiotracers. Am. J. Nucl. Med. Mol. Imaging 2012, 2, 55–76. [Google Scholar] [PubMed]
  167. Weber, D.A.; Greenberg, E.J.; Dimich, A.; Kenny, P.J.; Rothschild, E.O.; Myers, W.P.; Laughlin, J.S. Kinetics of radionuclides used for bone studies. J. Nucl. Med. 1969, 10, 8–17. [Google Scholar]
  168. Blake, G.M.; Park-Holohan, S.J.; Cook, G.J.; Fogelman, I. Quantitative studies of bone with the use of 18F-fluoride and 99mTc-methylene diphosphonate. Semin. Nucl. Med. 2001, 31, 28–49. [Google Scholar] [CrossRef]
  169. Schlonsky, J. Radioisotope scanning of bone. A review of the literature. Ohio. State Med. J. 1972, 68, 128–133. [Google Scholar]
  170. Bastawrous, S.; Bhargava, P.; Behnia, F.; Djang, D.S.W.; Haseley, D.R. Newer PET Application with an Old Tracer: Role of 18F-NaF Skeletal PET/CT in Oncologic Practice. RadioGraphics 2014, 34, 1295–1316. [Google Scholar] [CrossRef]
  171. Kern, K.A.; Brunetti, A.; Norton, J.A.; Chang, A.E.; Malawer, M.; Lack, E.; Finn, R.D.; Rosenberg, S.A.; Larson, S.M. Metabolic imaging of human extremity musculoskeletal tumors by PET. J. Nucl. Med. 1988, 29, 181–186. [Google Scholar] [PubMed]
  172. Adler, L.P.; Blair, H.F.; Makley, J.T.; Williams, R.P.; Joyce, M.J.; Leisure, G.; al-Kaisi, N.; Miraldi, F. Noninvasive grading of musculoskeletal tumors using PET. J. Nucl. Med. 1991, 32, 1508–1512. [Google Scholar] [PubMed]
  173. Griffeth, L.K.; Dehdashti, F.; McGuire, A.H.; McGuire, D.J.; Perry, D.J.; Moerlein, S.M.; Siegel, B.A. PET evaluation of soft-tissue masses with fluorine-18 fluoro-2-deoxy-D-glucose. Radiology 1992, 182, 185–194. [Google Scholar] [CrossRef]
  174. Dehdashti, F.; Siegel, B.A.; Griffeth, L.K.; Fusselman, M.J.; Trask, D.D.; McGuire, A.H.; McGuire, D.J. Benign vs. malignant intraosseous lesions: Discrimination by means of PET with 2-[F-18]fluoro-2-deoxy-D-glucose. Radiology 1996, 200, 243–247. [Google Scholar] [CrossRef] [PubMed]
  175. Palmer, W.E.; Rosenthal, D.I.; Schoenberg, O.I.; Fischman, A.J.; Simon, L.S.; Rubin, R.H.; Polisson, R.P. Quantification of inflammation in the wrist with gadolinium-enhanced MR imaging and PET with 2-[F-18]-fluoro-2-deoxy-D-glucose. Radiology 1995, 196, 647–655. [Google Scholar] [CrossRef]
  176. Zhang, Y.; Li, B.; Yu, H.; Shi, H. The value of skeletal standardized uptake values obtained by quantitative SPECT/CT in differential diagnosis of bone metastases. J. Nucl. Med. 2020, 61, 1527. [Google Scholar]
  177. Even-Sapir, E.; Metser, U.; Mishani, E.; Lievshitz, G.; Lerman, H.; Leibovitch, I. The detection of bone metastases in patients with high-risk prostate cancer: 99mTc-MDP Planar bone scintigraphy, single- and multi-field-of-view SPECT, 18F-fluoride PET, and 18F-fluoride PET/CT. J. Nucl. Med. 2006, 47, 287–297. [Google Scholar]
  178. Aoki, J.; Endo, K.; Watanabe, H.; Shinozaki, T.; Yanagawa, T.; Ahmed, A.R.; Takagishi, K. FDG-PET for evaluating musculoskeletal tumors: A review. J. Orthop. Sci. 2003, 8, 435–441. [Google Scholar] [CrossRef]
  179. Schwarzbach, M.H.; Dimitrakopoulou-Strauss, A.; Willeke, F.; Hinz, U.; Strauss, L.G.; Zhang, Y.M.; Mechtersheimer, G.; Attigah, N.; Lehnert, T.; Herfarth, C. Clinical value of [18-F] fluorodeoxyglucose positron emission tomography imaging in soft tissue sarcomas. Ann. Surg. 2000, 231, 380–386. [Google Scholar] [CrossRef]
  180. Shin, D.-S.; Shon, O.-J.; Han, D.-S.; Choi, J.-H.; Chun, K.-A.; Cho, I.-H. The clinical efficacy of (18)F-FDG-PET/CT in benign and malignant musculoskeletal tumors. Ann. Nucl. Med. 2008, 22, 603–609. [Google Scholar] [CrossRef]
  181. Li, Y.; Schiepers, C.; Lake, R.; Dadparvar, S.; Berenji, G.R. Clinical utility of (18)F-fluoride PET/CT in benign and malignant bone diseases. Bone 2012, 50, 128–139. [Google Scholar] [CrossRef] [PubMed]
  182. Rosenbaum, S.J.; Lind, T.; Antoch, G.; Bockisch, A. False-positive FDG PET uptake--the role of PET/CT. Eur. Radiol. 2006, 16, 1054–1065. [Google Scholar] [CrossRef]
  183. Long, N.M.; Smith, C.S. Causes and imaging features of false positives and false negatives on 18F-PET/CT in oncologic imaging. Insights Imaging 2011, 2, 679–698. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  184. Fan, X.; Zhang, X.; Zhang, Z.; Jiang, Y. Deep Learning-Based Identification of Spinal Metastasis in Lung Cancer Using Spectral CT Images. Sci. Program. 2021, 2021, 2779390. [Google Scholar] [CrossRef]
  185. Orlhac, F.; Thézé, B.; Soussan, M.; Boisgard, R.; Buvat, I. Multiscale Texture Analysis: From 18F-FDG PET Images to Histologic Images. J. Nucl. Med. 2016, 57, 1823–1828. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  186. Orlhac, F.; Soussan, M.; Maisonobe, J.-A.; Garcia, C.A.; Vanderlinden, B.; Buvat, I. Tumor Texture Analysis in 18F-FDG PET: Relationships between Texture Parameters, Histogram Indices, Standardized Uptake Values, Metabolic Volumes, and Total Lesion Glycolysis. J. Nucl. Med. 2014, 55, 414. [Google Scholar] [CrossRef] [Green Version]
  187. Stacy, G.S.; Dixon, L.B. Pitfalls in MR image interpretation prompting referrals to an orthopedic oncology clinic. Radiographics 2007, 27, 805–826. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  188. Trieu, J.; Sinnathamby, M.; Di Bella, C.; Pianta, M.; Perera, W.; Slavin, J.L.; Schlicht, S.M.; Choong, P.F. Biopsy and the diagnostic evaluation of musculoskeletal tumours: Critical but often missed in the 21st century. ANZ J. Surg. 2016, 86, 133–138. [Google Scholar] [CrossRef]
  189. Meek, R.D.; Mills, M.K.; Hanrahan, C.J.; Beckett, B.R.; Leake, R.L.; Allen, H.; Williams, D.D.; Tommack, M.; Schmahmann, S.; Hansford, B.G. Pearls and Pitfalls for Soft-Tissue and Bone Biopsies: A Cross-Institutional Review. RadioGraphics 2020, 40, 266–290. [Google Scholar] [CrossRef]
  190. Traina, F.; Errani, C.; Toscano, A.; Pungetti, C.; Fabbri, D.; Mazzotti, A.; Donati, D.; Faldini, C. Current concepts in the biopsy of musculoskeletal tumors: AAOS exhibit selection. J. Bone Jt. Surg. Am. 2015, 97, e7. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  191. Tan, A.E.; Ng, D.C. Differentiating osteoradionecrosis from nasopharyngeal carcinoma tumour recurrence using 99Tcm-sestamibi SPECT/CT. Br. J. Radiol. 2011, 84, e172–e175. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  192. Alhilali, L.; Reynolds, A.R.; Fakhran, S. Osteoradionecrosis after radiation therapy for head and neck cancer: Differentiation from recurrent disease with CT and PET/CT imaging. AJNR Am. J. Neuroradiol. 2014, 35, 1405–1411. [Google Scholar] [CrossRef] [Green Version]
  193. Lipkova, J.; Chen, R.J.; Chen, B.; Lu, M.Y.; Barbieri, M.; Shao, D.; Vaidya, A.J.; Chen, C.; Zhuang, L.; Williamson, D.F.K.; et al. Artificial intelligence for multimodal data integration in oncology. Cancer Cell 2022, 40, 1095–1110. [Google Scholar] [CrossRef]
  194. Antropova, N.; Huynh, B.Q.; Giger, M.L. A deep feature fusion methodology for breast cancer diagnosis demonstrated on three imaging modality datasets. Med. Phys. 2017, 44, 5162–5171. [Google Scholar] [CrossRef] [Green Version]
  195. Sedghi, A.; Mehrtash, A.; Jamzad, A.; Amalou, A.; Wells, W.M.; Kapur, T.; Kwak, J.T.; Turkbey, B.; Choyke, P.; Pinto, P.; et al. Improving detection of prostate cancer foci via information fusion of MRI and temporal enhanced ultrasound. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 1215–1223. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flowchart for the study (adapted from the PRISMA group, 2020), which describes the research article selection process.
Figure 2. A diagram highlighting the general framework and key steps for radiomics, namely image acquisition, image processing (including segmentation), feature extraction in the curated regions of interest (ROIs), feature selection, exploratory analysis, and finally modelling.
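For readers who wish to relate the steps in Figure 2 to software, the following is a minimal sketch of a feature-based radiomics pipeline in Python, assuming paired images and lesion masks and using the open-source PyRadiomics and scikit-learn libraries; the particular feature selector (SelectKBest) and classifier (logistic regression) are illustrative assumptions and do not correspond to the pipeline of any single study reviewed here.

```python
# Minimal feature-based radiomics sketch following the Figure 2 workflow:
# segmented ROI -> feature extraction -> feature selection -> modelling.
# File paths and model choices are illustrative assumptions only.
import numpy as np
from radiomics import featureextractor              # pip install pyradiomics
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Default settings extract shape, first-order, and texture features.
extractor = featureextractor.RadiomicsFeatureExtractor()


def extract_features(image_path: str, mask_path: str) -> dict:
    """Extract radiomic features from one image and its lesion mask (ROI)."""
    result = extractor.execute(image_path, mask_path)
    # Keep numeric feature values; drop the 'diagnostics_*' metadata entries.
    return {k: float(v) for k, v in result.items() if not k.startswith("diagnostics")}


def evaluate_radiomics_model(cases):
    """cases: list of (image_path, mask_path, label), with 1 = malignant, 0 = benign."""
    feats = [extract_features(img, msk) for img, msk, _ in cases]
    names = sorted(feats[0])
    X = np.array([[f[n] for n in names] for f in feats])
    y = np.array([label for _, _, label in cases])

    model = Pipeline([
        ("scale", StandardScaler()),
        ("select", SelectKBest(f_classif, k=min(20, X.shape[1]))),  # keep the most discriminative features
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    # Cross-validated AUC, the discrimination metric most studies in Table 1 report.
    return cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
```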
Figure 3. A diagram illustrating the difference between the two most popular techniques for machine learning: feature-based radiomics vs. deep learning for classifying lesions.
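To make the deep-learning branch of Figure 3 concrete, the sketch below fine-tunes an ImageNet-pretrained convolutional neural network for a benign-vs.-malignant task using PyTorch and torchvision, in the spirit of the transfer-learning approaches listed in Table 1 (e.g., EfficientNet-B0, ResNet50); the backbone, two-class head, and training details are assumptions for illustration rather than any specific study's implementation.

```python
# Transfer-learning sketch: fine-tune an ImageNet-pretrained CNN for
# benign-vs.-malignant classification of lesion image patches.
import torch
import torch.nn as nn
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load a pretrained backbone and replace the final layer with a 2-class head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 2)   # class 0 = benign, class 1 = malignant
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)


def train_one_epoch(loader):
    """One pass over a DataLoader yielding (image_batch, label_batch)."""
    model.train()
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()


@torch.no_grad()
def predict_malignancy_prob(image_batch):
    """Return the predicted probability of malignancy for each image in the batch."""
    model.eval()
    logits = model(image_batch.to(device))
    return torch.softmax(logits, dim=1)[:, 1]
```

In contrast to the radiomics sketch above, no hand-crafted features are computed; the network learns its own image representation directly from the lesion patches.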
Table 1. Key characteristics of the selected research articles.
| Authors | Artificial Intelligence Method | Publication Year | Main Objectives | Title of Journal | Main Imaging Modality Used | Performance |
| --- | --- | --- | --- | --- | --- | --- |
| Xiong et al. [71] | Machine learning–based classifiers (ANN) | 2021 | Differentiate multiple myeloma from metastases | Frontiers in Oncology | MRI | MCC: 0.605; Accuracy: 0.815; Sensitivity: 0.879; Specificity: 0.790 |
| Eweje et al. [72] | EfficientNet-B0 architecture and a logistic regression model (CNN) | 2020 | Differentiate between benign and malignant bone lesions | EBioMedicine | MRI | Accuracy: 0.760; Sensitivity: 0.790; Specificity: 0.750; AUC: 0.820 |
| Zhong et al. [73] | Radiomics using MRI machine learning techniques (SVM) | 2020 | Differentiate osteoradionecrosis from spinal metastases in nasopharyngeal carcinoma | BMC Medical Imaging | MRI | Accuracy: 0.737; Sensitivity: 0.843; Specificity: 0.614; AUC: 0.725 |
| Sun et al. [74] | Radiomics using CT machine learning techniques (SVM) | 2021 | Distinguish between benign and malignant bone tumours | Cancer Imaging | MRI | AUC: 0.892; NRI: 0.238; IDI: 0.163 |
| Reinus et al. [75] | Two-layer feed-forward neural network (ANN) | 1994 | Diagnosis of focal bone lesions | Investigative Radiology | X-ray | Accuracy: 0.835 |
| Filograna et al. [76] | Radiomics using MRI machine learning techniques (SVM) | 2019 | Differentiate between metastatic and non-metastatic vertebral bodies | La Radiologia Medica | MRI | AUC: 0.841–0.912 |
| Yin P. et al. [77] | Radiomics using MRI machine learning techniques | 2019 | Differentiation of lesions in the sacrum (e.g., chordoma, giant cell tumour, or metastatic lesions) | Journal of Magnetic Resonance Imaging | MRI | Accuracy: 0.810; AUC: 0.840 |
| Acar E. et al. [78] | Texture analysis and PET/CT machine learning techniques (weighted KNN algorithm) | 2019 | Differentiate metastatic and completely responded sclerotic bone lesions in prostate cancer | British Journal of Radiology | PET/CT | Accuracy: 0.735; Sensitivity: 0.735; Specificity: 0.737; AUC: 0.760 |
| He et al. [79] | EfficientNet-B0 convolutional neural network architecture | 2020 | Differentiate between benign and malignant bone lesions | EBioMedicine | X-ray | Accuracy: 0.734; AUC: 0.877 (benign), 0.916 (malignant) |
| Reicher et al. [80] | TensorFlow Inception-v3 convolutional neural network (CNN) pretrained on ImageNet | 2018 | Classifying bone tumour matrix | SIIM (Society for Imaging Informatics in Medicine) | X-ray | Accuracy: 0.93 |
| Y. Li et al. [81] | Super-label-guided convolutional neural network | 2018 | Distinguish between benign and malignant bone tumours and classify bone tumour matrix | Artificial Neural Networks and Machine Learning | CT | Accuracy: 0.740 |
| Park et al. [82] | ResNet50, GoogleNet Inception v3, and EfficientNet-b1, b2, and b3 | 2022 | Distinguish between benign, malignant, or no tumour in the proximal femur | PLoS ONE | X-ray | Accuracy: 0.853; Sensitivity: 0.822; Specificity: 0.912; Precision: 0.821; AUC: 0.953 |
| Liu et al. [83] | PyTorch 1.2.0 with Python 3.7.3, XGBoost, and SHapley Additive exPlanations (SHAP) values | 2021 | Classification of benign, intermediate, and malignant bone lesions | European Radiology | X-ray | AUC: 0.898 (benign), 0.894 (malignant), 0.865 (intermediate); macro-average AUC: 0.872 |
| Pan D. et al. [84] | Radiomics using X-ray machine learning techniques, SHapley Additive exPlanations (SHAP) | 2021 | Classification of benign, intermediate, and malignant bone lesions | Biomed Research International | X-ray | AUC: 0.970 (binary), 0.940 (tertiary); Accuracy: 0.947 (binary), 0.828 (tertiary) |
| Hong J. H. et al. [85] | Radiomics using CT machine learning techniques | 2021 | Distinguish between benign bone islands and osteoblastic metastases | Radiology | CT | AUC: 0.960; Sensitivity: 0.800; Specificity: 0.960; Accuracy: 0.860 |
| von Schacky et al. [86] | Radiomics using X-ray machine learning techniques, Python 3.7.7, scikit-learn 0.22.2, and fastai library | 2022 | Distinguish between benign and malignant bone tumours | European Radiology | X-ray | Accuracy: 0.80 (internal), 0.75 (external); Sensitivity: 0.75, 0.90; AUC: 0.79, 0.90 |
| T. Perk et al. [87] | Radiomics using PET/CT machine learning techniques | 2018 | Distinguish between benign and malignant bone tumours | Physics in Medicine and Biology | PET/CT | AUC: 0.950; Sensitivity: 0.880; Specificity: 0.890 |
| Do B. H. et al. [88] | Radiomics using X-ray machine learning techniques | 2017 | Classification of different bone tumours | Journal of Digital Imaging | X-ray | Primary accuracy: 0.440–0.620; Differential accuracy: 0.620–0.800 |
| Kahn et al. [89] | Bayesian network using X-ray machine learning techniques | 2001 | Classification of different bone tumours | Journal of Digital Imaging | X-ray | Accuracy: 0.890 (binary), 0.680 (tertiary) |
| Gitto et al. [90] | Radiomics using MRI machine learning techniques | 2022 | Distinguish between benign and malignant cartilaginous lesions | EBioMedicine | MRI | Accuracy: 0.98 (ACT), 0.80 (CS2); AUC: 0.94 (ACT), 0.90 (CS2) |
| Gitto et al. [91] | Radiomics using MRI machine learning techniques | 2020 | Distinguish between low-grade and high-grade cartilaginous lesions | European Journal of Radiology | MRI | Accuracy: 0.857 (training), 0.750 (test); AUC: 0.850 (training), 0.780 (test) |
| Yin et al. [92] | Radiomics using CT machine learning techniques, PyRadiomics Python package | 2020 | Distinguish between benign and malignant sacral tumours | Frontiers in Oncology | CT | Accuracy: 0.81 (Clinical-LR), 0.81 (Clinical-DNN); AUC: 0.84 (Clinical-LR), 0.83 (Clinical-DNN) |
| Georgeanu et al. [93] | Radiomics using MRI machine learning techniques, ResNet50 image classifiers | 2022 | Distinguish between benign and malignant bone tumours | Medicina (Kaunas) | MRI | Accuracy: 0.808 (training), 0.805 (test); AUC: 0.885 (training), 0.879 (test) |
| Fan et al. [34] | Radiomics using PET/CT machine learning techniques | 2021 | Distinguish between benign and metastatic vertebral lesions | Frontiers in Medicine | PET/CT | Accuracy: 0.875 (LR), 0.834 (SVM), 0.750 (decision tree) |
| Xu et al. [94] | Radiomics using PET/CT machine learning techniques | 2014 | Distinguish between malignant and benign bone and soft-tissue lesions | Annals of Nuclear Medicine | PET/CT | Accuracy: 0.825; Sensitivity: 0.864; Specificity: 0.772 |
| Chianca et al. [95] | Radiomics using MRI machine learning techniques, 3D Slicer heterogeneityCAD module (hCAD) and PyRadiomics | 2021 | Distinguish between benign, primary malignant, and metastatic vertebral lesions | European Journal of Radiology | MRI | Two-label classification accuracy: 0.94 (training), 0.86 (test); three-label classification accuracy: 0.80 (training), 0.69 (test) |
| Gitto et al. [96] | Radiomics using MRI machine learning techniques | 2022 | Distinguish between benign and metastatic vertebral lesions | La Radiologia Medica | MRI | Accuracy: 0.76; Sensitivity: 0.78; Specificity: 0.68; AUC: 0.78 |
| Consalvo et al. [97] | PyTorch 1.9.0, CUDA toolkit 11.1, and ResNet18 architecture | 2022 | Distinguish between Ewing sarcoma (ES) and acute osteomyelitis (OM) | Anticancer Research | X-ray | Accuracy: 0.867 (ES), 0.903 (OM); Sensitivity: 1.00 (ES), 0.930 (OM); Specificity: 0.76 (ES), 0.844 (OM) |
| Zhao et al. [98] | Radiomics using MRI machine learning techniques | 2022 | Distinguish between benign and malignant bone lesions | Journal of Magnetic Resonance Imaging | MRI | Sensitivity improved by 0.12–0.36 compared with manual assessment |
| Bradshaw et al. [99] | Deep convolutional neural network via VGG19 architecture | 2018 | Classifying benign and malignant bone lesions | Journal of Nuclear Medicine | PET/CT | Accuracy: 0.88; Sensitivity: 0.90; Specificity: 0.85 |
| Do et al. [100] | Deep convolutional neural network with a combination of global and patch-based models | 2021 | Classifying bone tumours in the knee into benign vs. malignant | Diagnostics | X-ray | Accuracy: 0.99; Average mean IoU: 0.848 |
| Masoudi et al. [101] | Deep convolutional neural network with 2D ResNet-50 and 3D ResNet-18 | 2021 | Classify benign or malignant bone lesions in prostate cancer | IEEE Access | CT | Accuracy: 0.922; F1 score: 0.923 |
| Gitto et al. [102] | Machine-learning classifier (LogitBoost) | 2021 | Classification of atypical cartilaginous tumours and higher-grade chondrosarcomas of long bones | EBioMedicine | CT | Accuracy: 0.750; AUC: 0.78 (validation set) |
| von Schacky et al. [103] | Mask region–based convolutional neural network (Mask-RCNN-X101) | 2021 | Classify benign or malignant bone lesions | Radiology | X-ray | Accuracy: 0.802; Sensitivity: 0.629; Specificity: 0.882 |

Abbreviations: ANN (Artificial Neural Network); MCC (Matthews correlation coefficient); CNN (Convolutional Neural Network); AUC (Area under curve); SVM (Support vector machine); NRI (Net reclassification index); IDI (Integrated discrimination index); IoU (Intersection over Union); DNN (Dense neural network); LR (Logistic regression).
Table 2. The performance of Artificial intelligence among various imaging modalities in distinguishing benign vs. malignant bone lesions.
| Imaging Modality | Accuracy | Sensitivity | Specificity | Area under Curve (AUC) |
| --- | --- | --- | --- | --- |
| X-ray | 0.44–0.99 | 0.75–1.00 | 0.78–0.91 | 0.79–0.95 |
| Computed Tomography (CT) | 0.74–0.92 | 0.80 | 0.96 | 0.78–0.96 |
| Magnetic Resonance Imaging (MRI) | 0.74–0.98 | 0.78–0.88 | 0.61–0.79 | 0.73–0.94 |
| Positron Emission Tomography with CT (PET/CT) | 0.74–0.88 | 0.84–0.90 | 0.74–0.85 | 0.76–0.95 |
| Overall | 0.44–0.99 | 0.63–1.00 | 0.73–0.96 | 0.73–0.96 |
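As a worked illustration of how the summary metrics in Table 2 are typically derived from a model's outputs, the short example below computes accuracy, sensitivity, specificity, and AUC with scikit-learn on a small set of made-up predictions; the numbers are purely illustrative and are not taken from any reviewed study.

```python
# Illustrative computation of the metrics summarised in Table 2
# (accuracy, sensitivity, specificity, AUC) from model outputs.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# Made-up ground truth (1 = malignant, 0 = benign) and predicted probabilities.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_prob = np.array([0.91, 0.20, 0.75, 0.40, 0.10, 0.35, 0.88, 0.60, 0.66, 0.15])
y_pred = (y_prob >= 0.5).astype(int)            # threshold probabilities at 0.5

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)                    # true-positive rate (malignant lesions detected)
specificity = tn / (tn + fp)                    # true-negative rate (benign lesions correctly ruled out)
auc = roc_auc_score(y_true, y_prob)             # threshold-independent discrimination

print(f"Accuracy {accuracy:.2f}, Sensitivity {sensitivity:.2f}, "
      f"Specificity {specificity:.2f}, AUC {auc:.2f}")
```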