A contourlet transform feature extraction scheme for ultrasound thyroid texture classification

Stamos Katsigiannis, Eystratios G. Keramidas, Dimitris Maroulis

Research output: Contribution to journal › Article

28 Citations (Scopus)

Abstract

Ultrasonography is an invaluable and widely used medical imaging tool. Nevertheless, automatic texture analysis of ultrasound images remains a challenging issue. This work presents and investigates a texture representation scheme for thyroid ultrasound images aimed at the detection of hypoechoic and isoechoic thyroid nodules, which present the highest malignancy risk. The proposed scheme is based on the Contourlet Transform (CT) and incorporates a thresholding approach for the selection of the most significant CT coefficients. A variety of statistical texture features is then evaluated, and optimal subsets are extracted through a selection process. A Gaussian-kernel Support Vector Machine (SVM) classifier is applied along with the Sequential Floating Forward Selection (SFFS) algorithm in order to identify the most representative set of CT features. For the experimental evaluation, two image datasets were utilized: one consisting of hypoechoic nodules and normal thyroid tissue, and another of isoechoic nodules and normal thyroid tissue. Comparative experiments show that the proposed methodology is more efficient than previous thyroid ultrasound representation methods proposed in the literature. The maximum classification accuracy reached 95% for the hypoechoic dataset and 92% for the isoechoic dataset. These results provide evidence that CT-based texture features can be successfully applied to the classification of different texture types in thyroid ultrasound images.
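
The sketch below illustrates the two stages the abstract describes: thresholding of contourlet coefficients followed by statistical feature extraction, and SFFS feature selection driven by a Gaussian-kernel SVM. It is a minimal illustration, not the paper's implementation: `contourlet_decompose` is a hypothetical placeholder for an external contourlet transform (no standard Python implementation is assumed), the statistical features and the keep-fraction threshold are representative examples rather than the exact ones used in the study, and SFFS is realized here via mlxtend's SequentialFeatureSelector with floating forward selection enabled.

```python
import numpy as np
from scipy import stats
from sklearn.svm import SVC
from mlxtend.feature_selection import SequentialFeatureSelector as SFS


def subband_features(coeffs, keep_fraction=0.1):
    """Statistical features from one contourlet subband after thresholding.

    Only the largest-magnitude coefficients (a `keep_fraction` of the
    subband) are retained, mimicking the significance-thresholding step.
    The feature list here is illustrative, not the paper's exact set.
    """
    c = np.abs(coeffs).ravel()
    threshold = np.quantile(c, 1.0 - keep_fraction)
    significant = c[c >= threshold]
    return np.array([
        significant.mean(),            # mean of significant coefficients
        significant.std(),             # standard deviation
        stats.skew(significant),       # skewness
        stats.kurtosis(significant),   # kurtosis
        np.sum(significant ** 2),      # energy
    ])


def image_feature_vector(image, decompose, levels=3):
    """Concatenate per-subband features for one ultrasound image block.

    `decompose` stands in for a contourlet transform returning a list of
    subband coefficient arrays (assumed to be provided externally).
    """
    subbands = decompose(image, levels=levels)
    return np.concatenate([subband_features(sb) for sb in subbands])


def select_and_train(X, y, k_features=10):
    """SFFS feature selection wrapped around a Gaussian-kernel SVM."""
    svm = SVC(kernel="rbf", C=1.0, gamma="scale")
    sffs = SFS(svm,
               k_features=k_features,
               forward=True,       # sequential forward selection ...
               floating=True,      # ... with floating steps (SFFS)
               scoring="accuracy",
               cv=5)
    sffs = sffs.fit(X, y)
    selected = list(sffs.k_feature_idx_)
    svm.fit(X[:, selected], y)
    return svm, selected
```

Given a feature matrix X (one row per image block) and binary labels y (nodule vs. normal tissue), `select_and_train(X, y)` returns the fitted classifier and the indices of the selected CT features; classification accuracy would then be estimated on held-out data, as in the paper's comparative evaluation.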
Original language: English
Pages (from-to): 138-145
Number of pages: 8
Journal: Engineering Intelligent Systems
Volume: 18
Issue number: 3/4
Publication status: Published - 2010
Externally published: Yes
