Deep Learning Approaches Applied to Image Classification of Renal Tumors: A Systematic Review

Sandra Amador*, Felix Beuschlein, Vedant Chauhan, Judith Favier, David Gil, Phillip Greenwood, R. R. de Krijger, Matthias Kroiss, Samanta Ortuño-Miquel, Attila Patocs, Anthony Stell, Axel Walch

*Corresponding author for this work

Research output: Contribution to journal › Review article › peer-review


Renal cancer is among the ten most common cancers, affecting 65,000 new patients a year. Deep learning (DL) methods are now effective for predicting pathologies and classifying tumors, as well as for extracting high-performance features and handling segmentation tasks. This review focuses on studies applying DL techniques to the detection or segmentation of renal tumors in patients. The bibliographic search identified a total of 33 records in Scopus, PubMed and Web of Science. The results of the systematic review describe in detail the research objectives, the types of images used for analysis, the data sets used, whether each database is public or private, and the number of patients involved in the studies. The first paper applying DL to renal tumors appeared only in 2019, which is relatively recent compared with other tumor types. Because many studies rely on private databases, public collection and sharing of data sets are of utmost importance for advancing research in this field. We conclude that future research promises many benefits, such as sparing patients unnecessary incisions and enabling more accurate diagnoses. As research in this field grows, the amount of open data is expected to increase.

Original language: English
Pages (from-to): 615-622
Number of pages: 8
Journal: Archives of Computational Methods in Engineering
Issue number: 2
Publication status: Published - Mar 2024
