Sex estimation from maxillofacial radiographs using a deep learning approach

Hiroki Hase, Yuichi Mine, Shota Okazaki, Yuki Yoshimi, Shota Ito, Tzu-Yu Peng, Mizuho Sano, Yuma Koizumi, Naoya Kakimoto, Kotaro Tanimoto, Takeshi Murayama

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

The purpose of this study was to construct deep learning models for more efficient and reliable sex estimation. Two deep learning models, VGG16 and DenseNet-121, were used in this retrospective study. In total, 600 lateral cephalograms were analyzed. A saliency map was generated by gradient-weighted class activation mapping (Grad-CAM) for each output. The two deep learning models achieved high values for each performance metric: accuracy, sensitivity (recall), precision, F1 score, and area under the receiver operating characteristic curve. Both models showed substantial differences between male and female images in the positions highlighted in the saliency maps. The highlighted positions also differed between VGG16 and DenseNet-121, regardless of sex. This analysis of our proposed system suggested that sex estimation from lateral cephalograms can be achieved with high accuracy using deep learning.
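The pipeline described above — a CNN classifier whose decision is explained by gradient-weighted class activation mapping — can be sketched as follows. This is a minimal illustration, not the authors' code: a tiny CNN stands in for VGG16/DenseNet-121, and a random tensor stands in for a lateral cephalogram; the model and function names are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyCNN(nn.Module):
    """Small stand-in for VGG16/DenseNet-121 (hypothetical, for illustration)."""
    def __init__(self, n_classes=2):  # two classes: male / female
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(16, n_classes)

    def forward(self, x):
        fmap = self.features(x)          # last conv feature maps (B, 16, H/2, W/2)
        pooled = fmap.mean(dim=(2, 3))   # global average pooling
        return self.head(pooled), fmap

def grad_cam(model, image, target_class):
    """Gradient-weighted class activation map for one class output."""
    model.eval()
    logits, fmap = model(image)
    fmap.retain_grad()                   # keep gradients of the feature maps
    logits[0, target_class].backward()   # backprop the class score
    weights = fmap.grad.mean(dim=(2, 3), keepdim=True)  # per-channel importance
    cam = F.relu((weights * fmap).sum(dim=1))           # weighted sum, then ReLU
    return (cam / (cam.max() + 1e-8)).detach()          # normalize to [0, 1]

model = TinyCNN()
image = torch.randn(1, 1, 64, 64)        # stand-in for a lateral cephalogram
cam = grad_cam(model, image, target_class=0)
print(cam.shape)                         # saliency map at feature-map resolution
```

The resulting map would be upsampled to the input size and overlaid on the radiograph; high-weight regions indicate the anatomy the model relied on for its male/female decision.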

Original language: English
Pages (from-to): 394-399
Number of pages: 6
Journal: Dental Materials Journal
Volume: 43
Issue number: 3
Publication status: Published - 2024

Keywords

  • Artificial intelligence
  • Deep learning
  • Sex estimation
  • Maxillofacial radiograph
  • Lateral cephalogram

ASJC Scopus subject areas

  • Ceramics and Composites
  • General Dentistry
