Improved watershed transform for tumor segmentation: Application to mammogram image compression

Wei Yen Hsu

Research output: Contribution to journal › Article › peer-review

52 Citations (Scopus)

Abstract

In this study, an automatic image segmentation method is proposed for tumor segmentation from mammogram images by means of an improved watershed transform that uses prior information. The segmented regions are then compressed, lossily or losslessly according to the importance of their data, for storage efficiency. The method comprises two procedures: region segmentation and region compression. In the first procedure, the Canny edge detector locates the edge between the background and the breast. An improved watershed transform based on intrinsic prior information is then adopted to extract the tumor boundary. Finally, the mammogram is segmented into tumor, breast without tumor, and background. In the second procedure, vector quantization (VQ) with a competitive Hopfield neural network (CHNN) is applied to the three regions at different compression rates according to the importance of their data, so as to simultaneously preserve important tumor features and reduce the size of the mammogram for storage efficiency. Experimental results show that the proposed method gives promising results in compression applications.
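The two procedures can be illustrated with off-the-shelf components. The following Python sketch is an assumption-laden illustration, not the paper's implementation: scikit-image's standard watershed stands in for the improved watershed with prior information, a bright-percentile seed inside the breast stands in for the intrinsic tumor prior, and a simple per-region re-quantization (with illustrative 8/5/2 bit depths) stands in for VQ with a CHNN.

```python
# Minimal sketch of the abstract's two-procedure pipeline, under the
# assumptions stated above. Requires numpy, scipy, and scikit-image.
import numpy as np
import scipy.ndimage as ndi
from skimage import feature, filters, morphology, segmentation


def segment_mammogram(image):
    """Procedure 1: label a grayscale mammogram (float in [0, 1]) as
    background (1), breast without tumor (2), or tumor (3)."""
    # Canny edges mark the boundary between background and breast;
    # closing and hole-filling turn that contour into a breast mask.
    edges = feature.canny(image, sigma=2.0)
    breast_mask = ndi.binary_fill_holes(
        morphology.binary_closing(edges, morphology.disk(3)))

    # Watershed on the gradient image, seeded with three markers. The
    # bright-percentile seed is a hypothetical stand-in for the paper's
    # intrinsic prior information about the tumor.
    gradient = filters.sobel(image)
    markers = np.zeros(image.shape, dtype=np.int32)
    markers[morphology.binary_erosion(~breast_mask, morphology.disk(10))] = 1
    markers[morphology.binary_erosion(breast_mask, morphology.disk(10))] = 2
    markers[breast_mask & (image > np.percentile(image[breast_mask], 99.5))] = 3
    return segmentation.watershed(gradient, markers)


def compress_by_region(image, labels):
    """Procedure 2: re-quantize each region at a different rate, keeping
    the tumor at full depth while compressing the rest more aggressively."""
    # label -> bit depth; the 8/5/2 split is illustrative, not the paper's.
    bits = {3: 8, 2: 5, 1: 2}
    out = np.zeros_like(image)
    for label, b in bits.items():
        levels = 2 ** b
        region = labels == label
        out[region] = np.round(image[region] * (levels - 1)) / (levels - 1)
    return out
```

On a float image `img` scaled to [0, 1], `compress_by_region(img, segment_mammogram(img))` would return a version of the mammogram in which the candidate tumor region retains its original gray levels while the breast and background are coarsely quantized, mirroring the importance-weighted storage trade-off the abstract describes.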

Original language: English
Pages (from-to): 3950-3955
Number of pages: 6
Journal: Expert Systems with Applications
Volume: 39
Issue number: 4
DOIs
Publication status: Published - Mar 2012
Externally published: Yes

Keywords

  • Competitive Hopfield neural network (CHNN)
  • Image segmentation
  • Improved watershed transform
  • Mammogram
  • Tumor
  • Vector quantization

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • General Engineering
