Iris Segmentation Based on Matting
Author: CHEN Qiru, WANG Qi, SUN Ting, WANG Ziyuan
Affiliation:

1. Northeastern University, No. 3-11, Wenhua Road, Heping District, Shenyang, China;
2. Tencent, Tencent Binhai Building, No. 33, Haitian Second Road, Nanshan District, Shenzhen, China

Abstract:

In this paper, we propose a novel and effective iris segmentation method that is robust to uneven illumination and to various kinds of noise, such as occlusions caused by light spots, eyelashes, eyelids, and spectacle frames. Unlike previous methods, the proposed method makes full use of the gray intensities of the iris image. Inspired by the matting algorithm, we make the basic assumption that both the foreground and the background of the iris image are locally smooth. Trimaps built according to the RST algorithm provide prior information. Under this assumption and prior, the optimal alpha matte is obtained by minimizing a least-squares loss function. A series of effective post-processing steps is then applied to the alpha matte to obtain a more precise iris segmentation. Experiments on the CASIA-Iris-Thousand database show that the proposed method performs considerably better than conventional methods, improving the iris segmentation rate by 20.5% and 26.4% over the well-known integro-differential operator and over edge detection combined with the Hough transform, respectively. The stability and validity of the proposed method are further demonstrated by complementary experiments on challenging iris images.

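To illustrate the kind of least-squares alpha-matte estimation with trimap priors described in the abstract, the following Python sketch builds a smoothness Laplacian from local gray-level affinities, constrains the pixels labelled by a trimap, and solves the resulting sparse linear system. It is a minimal illustration under assumed definitions, not the authors' implementation: the affinity weights, the regularization weight lam, and helper names such as build_affinity_laplacian and solve_alpha are hypothetical.

# Minimal sketch (not the paper's exact method): estimate an alpha matte for a
# grayscale iris image by minimizing
#   E(alpha) = alpha^T L alpha + lam * sum_{i known} (alpha_i - t_i)^2,
# where L is a smoothness Laplacian built from local intensity affinities and
# t_i are the hard trimap labels (1 = iris, 0 = non-iris).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla


def build_affinity_laplacian(gray, sigma=0.1):
    """4-neighbour graph Laplacian with intensity-based affinities (assumed form)."""
    h, w = gray.shape
    n = h * w
    idx = np.arange(n).reshape(h, w)
    rows, cols, vals = [], [], []
    for dy, dx in ((0, 1), (1, 0)):            # right and down neighbours
        a = idx[: h - dy, : w - dx].ravel()
        b = idx[dy:, dx:].ravel()
        diff = gray[: h - dy, : w - dx].ravel() - gray[dy:, dx:].ravel()
        wgt = np.exp(-(diff ** 2) / (2 * sigma ** 2))   # local smoothness prior
        rows += [a, b]
        cols += [b, a]
        vals += [wgt, wgt]
    W = sp.coo_matrix((np.concatenate(vals),
                       (np.concatenate(rows), np.concatenate(cols))),
                      shape=(n, n)).tocsr()
    D = sp.diags(np.asarray(W.sum(axis=1)).ravel())
    return D - W                                        # graph Laplacian


def solve_alpha(gray, trimap, lam=100.0):
    """Solve (L + lam*C) alpha = lam * C * t for the alpha matte.

    trimap: 1 = definite iris, 0 = definite non-iris, 0.5 = unknown.
    """
    L = build_affinity_laplacian(gray)
    known = (trimap != 0.5).ravel()
    C = sp.diags(known.astype(float))                   # selects constrained pixels
    t = trimap.ravel() * known
    alpha = spla.spsolve(L + lam * C, lam * t)
    return np.clip(alpha, 0.0, 1.0).reshape(gray.shape)


if __name__ == "__main__":
    # Toy example: a synthetic dark "iris" disc with an unknown ring in between.
    yy, xx = np.mgrid[:64, :64]
    r = np.hypot(yy - 32, xx - 32)
    gray = np.where(r < 20, 0.3, 0.8) + 0.02 * np.random.randn(64, 64)
    trimap = np.full((64, 64), 0.5)
    trimap[r < 12] = 1.0        # definitely iris
    trimap[r > 28] = 0.0        # definitely background
    alpha = solve_alpha(gray.astype(float), trimap)
    print("mean alpha inside/outside:", alpha[r < 12].mean(), alpha[r > 28].mean())

In this toy setup the solved alpha stays near 1 inside the disc and near 0 outside, with a soft transition in the unknown ring; thresholding such a matte and applying post-processing would yield a binary iris mask.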
Get Citation

CHEN Qiru, WANG Qi, SUN Ting, WANG Ziyuan. Iris Segmentation Based on Matting[J]. Instrumentation, 2022, 9(1): 12-22.

History
  • Online: June 14, 2022
License
  • Copyright (c) 2023 by the authors. This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.