
 
 
Illumination and Rotation Invariant Feature of Texture Images Based on Hilbert-Huang Transform
Yang Zhihua 1, Zhang Qian 2, Yang Lihua 3
1. Information Science School, Guangdong Finance and Economics University, Guangzhou 510320
2. College of Mathematics and Computational Science, Shenzhen University, Shenzhen 518060
3. School of Mathematics, Sun Yat-sen University, Guangzhou 510275
Funding: Research Fund for the Doctoral Program of Higher Education of China (No.20130171110016)
Opened online: 28 April 2017
Citation: Yang Zhihua, Zhang Qian, Yang Lihua. Illumination and Rotation Invariant Feature of Texture Images Based on Hilbert-Huang Transform [OL]. [28 April 2017] http://en.paper.edu.cn/en_releasepaper/content/4726832
 
 
This paper presents a novel method to extract illumination- and rotation-invariant features for texture images based on the Hilbert-Huang transform. Texture images are usually quasi-periodic. It is shown in this paper that the main frequency of the Hilbert marginal spectrum of a texture image can measure the approximate period effectively and thus can serve as a good feature for texture classification. This feature is proved to be invariant to uneven illumination. With a modification, it is shown that this feature is also invariant to rotation. Experiments have been conducted to compare the feature with existing ones. They show that the proposed approach outperforms existing methods in both recognition rate and robustness to uneven illumination, rotation, and noise pollution.
Keywords: Empirical mode decomposition (EMD), Hilbert-Huang transform (HHT), Main frequency center, Texture Classification
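The core quantity the abstract describes, the main frequency of a Hilbert marginal spectrum, can be illustrated with a minimal one-dimensional sketch. This is not the authors' implementation: the EMD step is omitted (the signal is treated as a single mode), the binning resolution is arbitrary, and the main frequency is taken here as the amplitude-weighted centroid of the marginal spectrum, which is an assumption about the paper's "main frequency center".

```python
import numpy as np
from scipy.signal import hilbert  # FFT-based analytic signal

def main_frequency(signal, fs, n_bins=255):
    """Amplitude-weighted main frequency of a 1-D Hilbert marginal spectrum."""
    analytic = hilbert(signal)
    amp = np.abs(analytic)                              # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))               # instantaneous phase
    inst_freq = np.diff(phase) * fs / (2.0 * np.pi)     # instantaneous frequency (Hz)
    # Hilbert marginal spectrum: accumulate amplitude into frequency bins
    edges = np.linspace(0.0, fs / 2.0, n_bins + 1)
    spectrum, _ = np.histogram(inst_freq, bins=edges, weights=amp[:-1])
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Main frequency: centroid of the marginal spectrum
    return np.sum(centers * spectrum) / np.sum(spectrum)

fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
print(round(main_frequency(np.sin(2 * np.pi * 50 * t), fs), 1))  # prints 50.0
```

For a quasi-periodic texture row, this centroid tracks the dominant repetition rate; scaling the signal by a constant (even illumination change) scales amplitude and marginal spectrum alike, leaving the centroid unchanged, which is the intuition behind the claimed illumination invariance.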
 
 
 
