Finger-vein Pattern Matching for Human Identification

DOI : 10.17577/IJERTV3IS21119


Iram Malik1, Rohini Sharma 2, Mohd. Junaid Mir3

1,2. Department of Information Technology, Maharishi Markandeshwar University (Mullana), Ambala, Haryana, India

3. Research Scholar, NIT Srinagar

Abstract: Biometrics is used to identify individuals based on their physical or behavioural characteristics, and has gained importance in today's world, where information security is essential. The finger vein, one of the best-known biometrics, is a promising pattern for personal identification and authentication in terms of security and convenience, and has attracted much attention among researchers for combining accuracy, universality and cost efficiency. Finger-vein based biometric systems are gaining acceptance in medium- to high-security applications. A biometric system for identifying individuals using the pattern of veins in a finger is proposed; among biometric techniques it offers a high level of security, efficiency and reliability. In this research a method is proposed that combines two previously developed approaches, enhancing the efficiency and reliability of pattern extraction for human identification. The proposed method, a combination of Repeated Line Tracking [2] and Even Gabor [4], extracts finger-vein features robustly from an unclear image, and the extracted pattern is then used for matching in order to identify individuals. The method achieves robust pattern extraction and matching and can be used effectively in personal identification.

Keywords: biometrics, finger-vein biometrics, pattern extraction, human identification, repeated line tracking, even Gabor.

  1. INTRODUCTION

    Personal identification technology is used in a wide range of systems for functions such as area access control, logins for PCs, bank ATM systems, surveillance, driver identification, e-commerce systems and many more. Biometric techniques for identifying individuals are attracting attention because conventional techniques such as keys, passwords and PINs carry the risks of being stolen, lost or forgotten. There has been considerable research in biometrics [8], [9] over the last two decades. The list of physiological and behavioural biometric characteristics that have to date been developed and implemented is long, and includes the face [10], [11], iris [12], [13], fingerprint [14], palm print [15], hand shape [16], voice [17], signature [18] and gait [19]. Notwithstanding this great and increasing variety of biometrics, no biometric has yet been developed that is perfectly reliable or secure. For example, fingerprints and palm prints are easily worn or frayed; voice, signatures, hand shapes and iris images can be forged; face recognition can be made difficult by occlusions or face-lifts; and biometrics such as fingerprint, iris and face recognition are susceptible to spoofing attacks [13], i.e., the biometric identifiers can be copied and used to create artefacts that deceive many currently available biometric devices. The great challenge for biometrics is thus to improve recognition performance while being maximally resistant to deceptive practices [14]. A biometric system using finger-vein patterns, that is, patterns inside the human body, is proposed. Blood vessels, as part of the circulatory system, transport blood throughout the body to sustain the metabolism, using a network of arteries, veins and capillaries. The use of such vascular structures in the palm, palm-dorsal surface, and fingers has been investigated in the biometrics literature with high success [1]-[3], [6], [7]. Finger-vein patterns are believed to be quite unique, even between identical twins and even between the different fingers of an individual [1]. The finger-vein pattern is thus a promising candidate for biometric-based personal identification. Various factors are cited for the preference for finger-vein biometrics [5]:

    Anti-counterfeit: finger veins lie underneath the skin, which makes vein-pattern duplication practically impossible.

    Active liveness: vein information disappears as biological tissue loses liveness, which makes artificial veins unusable in applications.

    User friendliness: finger-vein images can be captured non-invasively, without contagion or unpleasant sensations.

    This system has the advantage of being resistant to forgery. In this system, infrared light is transmitted from the rear of the hand. One finger is placed between the infrared light source and a camera, as shown in Fig. 1. As haemoglobin in the blood absorbs infrared light, the finger-vein patterns are captured as shadow patterns. The intensity of the light is adjusted using the captured image. After that, the outline of the finger is detected, the rotation of the image is corrected, and the vein pattern is extracted. A finger image captured using infrared light contains veins of various widths and brightnesses, which may change over time because of fluctuations in the amount of blood in the vein, caused by changes in temperature, physical condition, etc. To identify a person with high accuracy, the pattern of thin/thick and clear/unclear veins in an image must be extracted equally well. Furthermore, the pattern should be extracted with little or no dependence on vein width and brightness fluctuations. Conventional methods such as the matched filter [20] and morphological methods [21] can extract patterns if the widths of the veins are constant. However, these methods cannot extract veins that are narrower or wider than the assumed widths, which degrades the accuracy of personal identification. The normalized finger-vein images from the imaging setup depict a vascular network with varying thickness, clarity, and ambiguity in its topological imperfections and connections. The extraction of finger-vein features using Repeated Line Tracking [2], Maximum Curvature [3], the Gabor filter [4], etc., individually has given promising results. A new approach to finger-vein feature extraction combining Repeated Line Tracking and the Gabor filter is proposed. This makes pattern extraction robust and gives an improvement over the individual techniques.

    These two methods can individually extract finger-vein patterns, but the effectiveness and reliability of the extracted pattern can be increased by combining the two approaches. The pattern extracted by Repeated Line Tracking contains noise, which is tracked along with the dark lines and reduces its efficiency. The Even Gabor method can also be used for feature extraction on its own, but it is not optimal in accuracy and efficiency given the development of finger-vein recognition technology [4]. A method is therefore proposed that combines these two approaches, enhancing the efficiency and reliability of pattern extraction.

    F(x, y) is the intensity of the pixel (x, y), (xc, yc) is the position of the current tracking point of line tracking in the image, Rf is the set of pixels within the finger's outline, and Tr is the locus space. Suppose the pixel at the lower left of the image to be (0, 0), the positive direction of the x-axis to be rightward in the image, the positive direction of the y-axis to be upward within the image, and Tr(x, y) to be initialized to 0.

    Step 1: Determination of the start point for line tracking and the moving-direction attribute

    The start point for line tracking, (xs, ys), is a pair of uniform random numbers selected from Rf; that is, the initial value of the current tracking point (xc, yc) is (xs, ys). After that, the moving-direction attributes Dlr and Dud are determined. These are parameters that prevent the tracking point from following a path with excessive curvature. Dlr and Dud are independently determined as follows:

    Dlr = (1, 0)  if Rnd(2) < 1;  (−1, 0)  otherwise.    (1)

    Dud = (0, 1)  if Rnd(2) < 1;  (0, −1)  otherwise.    (2)

    Here Rnd(n) denotes a uniform random number in [0, n).

    Fig. 1 Human Identification process using Finger-Vein Biometrics
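Equations (1) and (2) can be sketched as one coin flip per attribute; this is a minimal illustration, and the function name and `rng` parameter are ours, not the paper's.

```python
import random

def moving_direction_attributes(rng=random):
    """Pick the moving-direction attributes Dlr and Dud as in Eqs. (1)-(2):
    each attribute is chosen uniformly between its two candidate directions
    (the condition Rnd(2) < 1 holds with probability 1/2)."""
    d_lr = (1, 0) if rng.random() * 2 < 1 else (-1, 0)   # left/right attribute
    d_ud = (0, 1) if rng.random() * 2 < 1 else (0, -1)   # up/down attribute
    return d_lr, d_ud
```

Each tracking round draws a fresh pair, so different rounds explore different curvature-limited paths.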

    The proposed technique for Human Identification is based on a series of steps involving image acquisition, pre-processing, feature extraction and matching, combining the Repeated Line Tracking and Gabor filter methods. This makes pattern extraction and matching robust. Repeated Line Tracking is applied first to extract finger-vein features from the unclear image, using line tracking that starts from various positions. Although noise may also be tracked, it is emphasized to a smaller degree than the dark lines, and the extracted pattern gives promising results. A multi-orientation Gabor filter is then applied to the pattern extracted by Repeated Line Tracking to make the extraction robust. Finally, matching is performed: the extracted pattern is converted into matching data, and these data are compared with recorded data for human identification.
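The pipeline above can be sketched end to end as follows. This is a minimal illustration with names of our own choosing; the toy binarisation stands in for the repeated-line-tracking and Gabor stages detailed in Section 2, and the mismatch measure stands in for the template matching described later.

```python
import numpy as np

def identify(infrared_image, enrolled_templates, threshold=0.5):
    """Return the enrolled identity whose binary template best matches, or None."""
    # 1. Pre-processing: rescale intensities to [0, 1] (outline detection
    #    and rotation correction are omitted from this sketch).
    lo, hi = float(infrared_image.min()), float(infrared_image.max())
    img = (infrared_image - lo) / max(hi - lo, 1e-9)
    # 2. Feature extraction placeholder: the paper extracts the vein pattern
    #    with repeated line tracking followed by Gabor filtering.
    pattern = img > 0.5          # toy binarisation standing in for extraction
    # 3. Matching: the smallest fraction of disagreeing pixels wins.
    best_id, best_ratio = None, 1.0
    for identity, template in enrolled_templates.items():
        ratio = float(np.mean(template != pattern))
        if ratio < best_ratio:
            best_id, best_ratio = identity, ratio
    return best_id if best_ratio <= threshold else None
```

A caller would enrol one binary template per person and accept the best match only if its mismatch ratio stays below the decision threshold.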

  2. ALGORITHMS

    1. Pattern Extraction Algorithm

      This algorithm extracts finger-vein features by combining two earlier feature-extraction methods, viz. Repeated Line Tracking [2] and Even Gabor [4], each of which can extract finger-vein patterns individually.

      Step 2: Detection of the direction of the dark line and movement of the tracking point

      This step is composed of several sub-steps.

      Step 2-1: Initialization of the locus-position table Tc

      The positions that the tracking point moves to are stored in the locus-position table, Tc. The table is initialized in this step.

      Step 2-2: Determination of the set of pixels Nc to which the current tracking point can move

      A pixel to which the current tracking point (xc, yc) moves must be within the finger region, must not have been a previous (xc, yc) within the current round of tracking, and must be one of the neighbouring pixels of (xc, yc). Therefore, Nc is determined as follows:

      Nc = T̄c ∩ Rf ∩ Nr(xc, yc)    (3)

      where T̄c is the set of pixels not yet in the locus-position table and Nr(xc, yc) is the set of neighbouring pixels of (xc, yc), selected as follows:

      Nr(xc, yc) =
          N3(Dlr)(xc, yc),  if 0 ≤ Rnd(100) < plr;
          N3(Dud)(xc, yc),  if plr ≤ Rnd(100) < plr + pud;
          N8(xc, yc),       if plr + pud ≤ Rnd(100) < 100.    (4)

      where N8(x, y) is the set of eight neighbouring pixels of a pixel (x, y) and N3(D)(x, y) is the set of three neighbouring pixels of (x, y) whose direction is determined by the moving-direction attribute D = (Dx, Dy).

      N3 (D) (x, y) can be described as follows:

      N3(D)(x, y) = { (Dx + x, Dy + y), (Dx − Dy + x, Dy − Dx + y), (Dx + Dy + x, Dy + Dx + y) }    (5)
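Equations (3)-(5) can be sketched as follows. The `inside_finger` predicate and `visited` set stand in for the finger region Rf and the locus-position table Tc; all names and defaults are illustrative, not the paper's.

```python
import random

def n8(x, y):
    """Eight-neighbourhood N8(x, y)."""
    return [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

def n3(d, x, y):
    """Three neighbours of (x, y) in moving direction D = (Dx, Dy), Eq. (5).
    E.g. for D = (1, 0) this yields (x+1, y), (x+1, y-1), (x+1, y+1)."""
    dx, dy = d
    return [(dx + x, dy + y),
            (dx - dy + x, dy - dx + y),
            (dx + dy + x, dy + dx + y)]

def candidate_pixels(xc, yc, d_lr, d_ud, p_lr=50, p_ud=25,
                     inside_finger=lambda p: True, visited=frozenset(),
                     rng=random):
    """Eqs. (3)-(4): choose Nr probabilistically (horizontal bias p_lr,
    vertical bias p_ud, else all eight neighbours), then intersect with the
    finger region and exclude already-tracked positions."""
    r = rng.random() * 100
    if r < p_lr:
        nr = n3(d_lr, xc, yc)
    elif r < p_lr + p_ud:
        nr = n3(d_ud, xc, yc)
    else:
        nr = n8(xc, yc)
    return [p for p in nr if inside_finger(p) and p not in visited]
```

With the paper's plr = 50 and pud = 25, half of the moves are biased along the finger's length, which favours the dominant vein direction.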

      Parameters plr and pud in (4) are the probabilities of selecting the three neighbouring pixels in the horizontal or vertical direction, respectively, as Nr(xc, yc). The veins in a finger tend to run in the direction of the finger's length. Therefore, if we increase the probability that N3(Dlr)(xc, yc) is selected as Nr(xc, yc), we obtain a faithful representation of the pattern of finger veins. In preliminary experiments, excellent results were produced with plr = 50 and pud = 25.

      Step 2-3: Detection of the dark-line direction near the current tracking point

      To determine the pixel to which the current tracking point (xc, yc) should move, the following equation, referred to as the line-evaluation function, is calculated. It reflects the depth of the valleys in the cross-sectional profiles around the current tracking point:

      Vl = max over (xi, yi) in Nc of { F(xi + r cos θi − (W/2) sin θi, yi + r sin θi + (W/2) cos θi) + F(xi + r cos θi + (W/2) sin θi, yi + r sin θi − (W/2) cos θi) − 2 F(xi + r cos θi, yi + r sin θi) }    (6)

      where W is the width of the profiles, r is the distance between (xc, yc) and the cross section, and θi is the angle between the line segments (xc, yc)-(xc + 1, yc) and (xc, yc)-(xi, yi). In consideration of the thickness of the veins visible in the captured images, these parameters are set at W = 11 and r = 1.

      Step 2-4: Registration of the locus in the locus-position table Tc and movement of the tracking point

      The current tracking point (xc, yc) is added to the locus-position table Tc. After that, if Vl is positive, (xc, yc) is updated to the (xi, yi) at which Vl is maximum.

      Step 2-5: Repeated execution of steps 2-2 to 2-4

      If Vl is positive, go to step 2-2; if Vl is negative or zero, (xc, yc) is not on a dark line, so leave step 2 and go to step 3.

      Step 3: Updating the number of times points in the locus space have been tracked

      The values of the elements in the locus space Tr(x, y) are incremented for all (x, y) in Tc.

      Step 4: Repeated execution of steps 1 to 3 (N times)

      Steps 1 to 3 are executed N times. If the number of repetitions N is too small, feature extraction is insufficient; if, on the other hand, N is too big, computational cost is needlessly increased. Through an experiment, we determined that N = 3000 is the lower limit for sufficient feature extraction.

      Step 5: Acquisition of the finger-vein pattern from the locus space

      The total number of times each pixel (x, y) has been the current tracking point in the repeated line-tracking operation is stored in the locus space Tr(x, y). Therefore, the finger-vein pattern is obtained as chains of high values of Tr(x, y). Figure 3.2 shows a result of finger-vein extraction: Figure 3.2.1 is the infrared image from which Figure 3.2.2 is produced as the distribution of values in Tr(x, y), with higher values shown as brighter pixels.

      Step 6: A Gabor filter can be viewed as a sinusoidal plane wave of a particular frequency and orientation, modulated by a Gaussian envelope. The even-symmetric Gabor filter is given by:

      G(x, y) = exp{ −(x′² + γ² y′²) / (2σ²) } cos(2π f0 x′)    (7)

      where

      x′ = x cos θ + y sin θ,   y′ = −x sin θ + y cos θ    (8)

      θ is the orientation of the Gabor filter, f0 denotes the filter centre frequency, σ and γ respectively represent the standard deviation (often called the scale) and the aspect ratio of the elliptical Gaussian envelope, and x′ and y′ are rotated versions of the coordinates x and y.

      Step 7: A set of Gabor filters is used as a pre-processing step for the image, and oriented image features are extracted. Veins are highly random in diameter and orientation, so an isotropic Gaussian function is considered, i.e., γ is set equal to one. To reduce the diameter deformation arising from an elliptic Gaussian envelope, θ varies from zero to π at a π/8 interval; that is, the even-symmetric Gabor filters are embodied in eight channels.

      Step 8: The generation of a combined feature map can be described as follows:

      F(x, y) = G(x, y) ∗ R(x, y)    (9)

      where F(x, y) denotes the filtered result (the combined output), R(x, y) is the output image obtained as a result of repeated line tracking, and ∗ denotes the 2D image convolution operation. It can be observed that the extracted vein structure F(x, y) fits the vascular topology of the original image quite well, and the accompanying noise is suppressed very well.
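Steps 6-8 can be sketched as follows. The kernel follows the even-symmetric form of Eqs. (7)-(8); merging the eight channel responses with a per-pixel maximum is our illustrative choice (the text only states that the channels are combined by convolution with R), and the parameter values are assumptions, not the paper's.

```python
import numpy as np

def even_gabor_kernel(size=15, f0=0.1, sigma=4.0, gamma=1.0, theta=0.0):
    """Even-symmetric Gabor filter of Eqs. (7)-(8): a Gaussian envelope
    modulated by a cosine carrier at centre frequency f0 and orientation
    theta. Parameter values here are illustrative defaults."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)      # rotated coordinates, Eq. (8)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * f0 * xr)

def combine_feature_map(r_image, n_orient=8):
    """Eq. (9): convolve the repeated-line-tracking output R(x, y) with a
    bank of eight even Gabor channels (theta = 0 .. pi in pi/8 steps) and
    keep the strongest response per pixel as one simple way to merge them."""
    h, w = r_image.shape
    out = np.full((h, w), -np.inf)
    for k in range(n_orient):
        kern = even_gabor_kernel(theta=k * np.pi / n_orient)
        half = kern.shape[0] // 2
        padded = np.pad(r_image, half, mode="edge")
        resp = np.empty((h, w), dtype=float)
        for i in range(h):                          # naive 2-D convolution
            for j in range(w):
                patch = padded[i:i + kern.shape[0], j:j + kern.shape[1]]
                resp[i, j] = float((patch * kern[::-1, ::-1]).sum())
        out = np.maximum(out, resp)
    return out
```

The naive loop is for clarity only; in practice an FFT-based convolution would be used for 320×240 images.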

    2. Matching Algorithm [2]

    In the matching process, the pattern is converted into matching data, and these data are compared with recorded data. Two methods are commonly used for matching line-shaped patterns: structural matching [23] and template matching [22]. Structural matching requires the additional extraction of feature points such as line endings and bifurcations. Since a finger-vein pattern has few of these points, template matching based on the comparison of pixel values is more appropriate for finger-vein pattern matching. The steps of the template-matching algorithm are:

    Step 1: Labelling of the locus space.

    Step 2: Spatial reduction and relabeling of the locus space.

    Step 3: Matching of data.

    In step 1, the locus space is binarised using a threshold: pixels labelled as parts of the background are set to 0, and pixels labelled as parts of the vein regions are set to 255. In step 2, the locus space is reduced to one third of its original size in both dimensions by taking the averages of all non-overlapping 3 × 3 pixel blocks; the binarised locus space thus becomes a smaller gray-scale image, which is re-binarised with the threshold simply set to 128. Finally, in step 3, a mismatch ratio Rm is calculated to examine whether or not two sets of data are correlated. The ratio is defined from the difference between the two sets of data to be matched. R(x, y) and I(x, y) are the values at position (x, y) of the registered and input matching data, w and h are the width and height of both sets of data, and cw and ch are the distances of motion in the horizontal and vertical directions, respectively, required to adjust the displacement between the two sets of data. The template data are defined as the rectangular region within R(x, y) whose upper-left position is R(cw, ch) and lower-right position is R(w − cw, h − ch). The value of mismatch Nm(s, t), which is the difference between the registered and input data at the positions where the template overlaps with I(s, t), is defined as follows:

    Nm(s, t) = Σ_{y=0..h−2ch−1} Σ_{x=0..w−2cw−1} φ( R(cw + x, ch + y), I(s + x, t + y) )    (10)

    where w = 320 and h = 240 in consideration of the finger size in the captured image; cw and ch are set at cw = 50 and ch = 50 in order to adjust the finger position in the captured image by up to about 1 cm; and φ in Equation (10) is a parameter that indicates whether a pixel labelled as part of the background region and a pixel labelled as part of a vein region overlap each other. With p1 defined as the pixel value of one pixel and p2 as the pixel value of the other, φ can be described as follows:

    φ(p1, p2) = 1  if |p1 − p2| = 255;  0  otherwise.    (11)

    The minimum value of mismatch Nm, which is the smallest Nm(s, t) calculated under the condition that the template overlaps with the input matching data I(x, y) at all positions, can be defined as follows:

    Nm = min_{0 ≤ s < 2cw, 0 ≤ t < 2ch} Nm(s, t)    (12)

    With the definitions given above, the mismatch ratio Rm is defined as follows:

    Rm = Nm / ( Σ_{y=ch..h−ch−1} Σ_{x=cw..w−cw−1} R(x, y)/255 + Σ_{y=t0..t0+h−2ch−1} Σ_{x=s0..s0+w−2cw−1} I(x, y)/255 )    (13)

    where s0 and t0 are the s and t at which Equation (12) is minimized. As Equation (13) shows, Rm is the ratio between Nm and the total number of pixels classified as belonging to the vein region in the two data sets.

  3. EXPERIMENTS

    To check the performance of the proposed technique, several parameters are used. These evaluate the technique in terms of Genuine Acceptance Rate (GAR) and False Acceptance Rate (FAR).

    1. Evaluation Parameters

      • False Acceptance Rate

      FAR is the ratio of the number of unauthorized users accepted by the biometric system to the total number of identification attempts made. Also known as a type 2 error, a false acceptance occurs when an imposter is accepted as a legitimate user; this happens when the system finds the presented biometric data similar to the template of a legitimate user. FAR is calculated by:

      FAR = (number of false acceptances) / (total number of imposter identification attempts)

      For example, FAR = 10^-1 means that out of 10 imposter identification attempts, 1 illegitimate user is identified as legitimate.

      • Genuine Acceptance Rate

      Genuine Acceptance Rate (GAR) is an overall accuracy measurement of a biometric system. It is calculated by the formula:

      GAR = 1 − FRR,

      where FRR is the False Rejection Rate, the fraction of legitimate users wrongly rejected.

      GAR is important because it is the chief measure of precision. Separate biometric systems can be compared directly by setting their thresholds so that the FAR is at a specific value; the GAR is then measured at this FAR. When comparing two systems, the system with the higher GAR is considered the more accurate. GAR is commonly measured at several FAR intervals.

    The proposed technique for Human Identification is based on a series of steps involving image acquisition, pre-processing, pattern extraction and matching, combining the Repeated Line Tracking and Even Gabor methods. The technique has been applied to gray-scale images and gives satisfactory results. It has been implemented on a set of 2 gray-scale 320×240 images with the .bmp extension. The performance of the technique has been evaluated and graphically represented on the basis of two measures, Genuine Acceptance Rate (GAR) and False Acceptance Rate (FAR). The experiments resulted in a good GAR for each corresponding FAR, showing improved reliability that frustrates spoofers by developing biometrics that are highly individuating yet, at the same time, highly effective and robust.

    The GAR and FAR of the input finger-vein images for the three approaches considered, viz. Repeated Line Tracking, the Even Gabor method and the Combined Method, are shown in Tables 1 and 2. Comparing the GAR at each corresponding FAR of the input finger-vein images, the combined approach gives the best results.
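The matching steps and Equations (10)-(13) can be sketched as follows, assuming already binarised 0/255 arrays. The window half-widths cw = ch = 2 are scaled down for a small demo (the paper uses 50 on 320×240 images), and the function name is ours.

```python
import numpy as np

def mismatch_ratio(R, I, cw=2, ch=2):
    """Template matching of Eqs. (10)-(13) on binarised (0/255) images:
    slide the central template of R over I within the (2*cw) x (2*ch)
    displacement window, count disagreeing background/vein pixel pairs,
    and normalise by the total number of vein pixels in the two regions."""
    h, w = R.shape
    template = R[ch:h - ch, cw:w - cw]             # template region of R
    best = None
    for t in range(2 * ch):                        # Eq. (12) search window
        for s in range(2 * cw):
            window = I[t:t + h - 2 * ch, s:s + w - 2 * cw]
            # Eq. (11): a pair mismatches exactly when |p1 - p2| == 255
            nm = int(np.sum(np.abs(template.astype(int)
                                   - window.astype(int)) == 255))
            if best is None or nm < best[0]:
                best = (nm, s, t)
    nm, s0, t0 = best
    # Eq. (13): divide by the vein-pixel counts of both regions
    veins = int((template == 255).sum()) + \
            int((I[t0:t0 + h - 2 * ch, s0:s0 + w - 2 * cw] == 255).sum())
    return nm / max(veins, 1)
```

Identical patterns give a ratio of zero at the best displacement, while disjoint patterns drive the ratio toward one.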

      TABLE 1
      Comparison of different human identification approaches (Repeated Line Tracking, Even Gabor and Combined Method) by means of GAR vs FAR of the input finger-vein image (index finger).

      Input image (320×240): index_1.bmp

      FAR    | GAR, Repeated Line Tracking (previous method) | GAR, Even Gabor (previous method) | GAR, Combined (Repeated Line Tracking + Even Gabor)
      0      | 0.53 | 0.50 | 0.73
      10^-4  | 0.55 | 0.65 | 0.83
      10^-3  | 0.65 | 0.78 | 0.88
      10^-2  | 0.80 | 0.85 | 0.89
      10^-1  | 0.94 | 0.97 | 0.97
      10^0   | 1.00 | 1.00 | 1.00

      TABLE 2
      Comparison of different human identification approaches (Repeated Line Tracking, Even Gabor and Combined Method) by means of GAR vs FAR of the input finger-vein image (middle finger).

      Input image (320×240): middle_1.bmp

      FAR    | GAR, Repeated Line Tracking (previous method) | GAR, Even Gabor (previous method) | GAR, Combined (Repeated Line Tracking + Even Gabor)
      0      | 0.25 | 0.33 | 0.50
      10^-4  | 0.30 | 0.40 | 0.50
      10^-3  | 0.40 | 0.50 | 0.60
      10^-2  | 0.60 | 0.70 | 0.79
      10^-1  | 0.86 | 0.92 | 0.95
      10^0   | 1.00 | 1.00 | 1.00

      Fig. 2 shows the ROC (Receiver Operating Characteristic) curves for the index finger. The ROC plots the GAR against its corresponding FAR for an input finger-vein image: the horizontal axis shows the FAR and the vertical axis the corresponding GAR. The red line shows the plot for the Combined Method (Repeated Line Tracking + Even Gabor), whereas the blue line shows the Even Gabor method and the pink line Repeated Line Tracking. The graph shows that the combined method gives the best result.

      Fig.2 Graphic representation of GAR versus FAR for index finger.

      Fig. 3 shows the ROC (Receiver Operating Characteristic) curves for the middle finger. The ROC plots the GAR against its corresponding FAR for an input finger-vein image: the horizontal axis shows the FAR and the vertical axis the corresponding GAR. The red line shows the plot for the Combined Method (Repeated Line Tracking + Even Gabor), whereas the blue line shows the Even Gabor method and the pink line Repeated Line Tracking. The graph shows that the combined method gives the best result.

      Fig.3 Graphic representation of GAR versus FAR for middle finger.

      The performance analysis of the whole process shows improved reliability for Human Identification with the proposed combined method (Repeated Line Tracking + Even Gabor), which gives effective and near-optimal results compared to the individual techniques, viz. Repeated Line Tracking and the Even Gabor method.
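The comparison methodology used above (fix the FAR, then read off the GAR) can be sketched as a threshold sweep over genuine and impostor match scores; the function and the test data are illustrative, not from the paper's experiments.

```python
def gar_at_far(genuine_scores, impostor_scores, target_far):
    """Report the genuine acceptance rate at the operating point whose
    false acceptance rate does not exceed target_far (higher score =
    better match; a probe is accepted when its score >= threshold)."""
    thresholds = sorted(set(genuine_scores) | set(impostor_scores),
                        reverse=True)
    best_gar = 0.0
    for th in thresholds:
        far = sum(s >= th for s in impostor_scores) / len(impostor_scores)
        if far <= target_far:
            gar = sum(s >= th for s in genuine_scores) / len(genuine_scores)
            best_gar = max(best_gar, gar)
    return best_gar
```

Evaluating this function at several FAR values (10^-4, 10^-3, ...) yields exactly the kind of GAR-vs-FAR rows tabulated in Tables 1 and 2.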

  4. CONCLUSION

    The new approach to Human Identification combining Repeated Line Tracking and the Gabor filter improves performance and reliability while remaining highly effective and robust. The proposed technique is based on a series of steps involving image acquisition, pre-processing, pattern extraction and matching, combining the Repeated Line Tracking and Even Gabor methods. This makes pattern extraction and matching robust. Repeated Line Tracking is applied first to extract finger-vein features from the unclear image, using line tracking that starts from various positions. Although noise may also be tracked, it is emphasized to a smaller degree than the dark lines, and the extracted pattern gives promising results. A multi-orientation Gabor filter is then applied to the pattern extracted by Repeated Line Tracking to make the extraction robust. Finally, matching is performed with the template-matching technique: the extracted pattern is converted into matching data, and these data are compared with recorded data for human identification. It is observed that the proposed technique yields good GAR values, giving promising results and an improvement over the earlier individual methods.

  5. FUTURE SCOPE

Although finger-vein biometrics is one of the latest biometric technologies, it has already established both technical and statistical feasibility. In the near future, the combination of finger-vein biometrics with other biometric technologies will result in a multimodal system with very high accuracy, especially for security concerns in sensitive areas.

REFERENCES

    1. Ajay Kumar, Senior Member, IEEE, and Yingbo Zhou, Human Identification Using Finger Images, IEEE transactions on image processing, Issue 4, Volume 21, April 2012.

    2. Naoto Miura, Akio Nagasaki, Takafumi Miyatake, Feature extraction of finger-vein patterns based on repeated line tracking and its application to personal identification, Machine Vision and Applications 15, page(s):194-203, 2004.

    3. Naoto Miura, Akio Nagasaki, Takafumi Miyatake, Extraction of Finger-Vein Patterns Using Maximum Curvature Points in Image Profiles, MVA2005 IAPR Conference on Machine Vision Applications, Tsukuba Science City, Japan, May 16-18, 2005.

    4. Jinfeng Yang, Yihua Shi and Renbiao Wu, Finger-Vein Recognition Based on Gabor Features, Biometric Systems, Design and Applications, Mr Zahid Riaz (Ed.), ISBN: 978-953-307-542-6, 2011.

    5. Jinfeng Yang, Yihua Shi, Finger-vein ROI localization and vein ridge enhancement, Pattern Recognition Letters 33, page(s):1569-1579, 2012.

    6. Mulyono, D., Horng Shi Jinn, A study of finger vein biometric for personal identification, ISBAST 2008, International Symposium on Biometrics and Security Technologies, Page(s):1-8, 23-24 April 2008.

    7. Lee, E.C., and Park, K.R., Restoration method of skin scattering blurred vein image for finger vein recognition, Electronics Letters 2009, Issue 21, Volume 45, Page(s):1074-1076, October 8 2009.

    8. Jain AK, Bolle R, Pankanti S; Biometrics: personal identification in networked society. Boston: Kluwer Academic; 1998.

    9. Zhang D; Automated biometrics: technologies and systems; Boston: Kluwer Academic; 2000.

    10. Li SZ, Juwei L; Face recognition using the nearest feature line method; IEEE Transactions on Neural Networks 1999;10:439-443.

    11. Abate AF, Nappi M, Riccio D, Sabatino G; 2D and 3D face recognition: a survey; Pattern Recognition Letters 2007;28:1885-1906.

    12. Daugman J; How iris recognition works; IEEE Transactions on Circuits and Systems for Video Technology 2004;14:21-30.

    13. Bowyer KW, Hollingsworth K, Flynn PJ; Image understanding for iris biometrics: a survey. Computer Vision and Image Understanding 2008;110(2):281-307.

    14. Ratha NK, Karu K, Shaoyun C, Jain AK; A real-time matching system for large fingerprint databases; IEEE Transactions on Pattern Analysis and Machine Intelligence 1996;18:799-813.

    15. Zhang D, Wai-Kin K, You J, Wong M; Online palm print identification. IEEE Transactions on Pattern Analysis and Machine Intelligence 2003;25:1041-1050.

    16. Sanchez-Reillo R, Sanchez-Avila C, Gonzalez-Marcos A; Biometric identification through hand geometry measurements; IEEE Transactions on Pattern Analysis and Machine Intelligence 2000;22:1168-1171.

    17. Wan V, Renals S; Speaker verification using sequence discriminant support vector machines; IEEE Transactions on Speech and Audio Processing 2005;13:203-210.

    18. Lee LL, Berger T, Aviczer E. Reliable online human signature verification systems. IEEE Transactions on Pattern Analysis and Machine Intelligence 1996;18:643-647.

    19. Wang L, Tan T, Ning H, Hu W; Silhouette analysis-based gait recognition for human identification; IEEE Transactions on Pattern Analysis and Machine Intelligence 2003;25:1505-1518.

    20. Hoover A, Kouznetsova V, Goldbaum M; Locating blood vessels in retinal images by piece-wise threshold probing of a matched filter response; Proc AMIA Symp. 1998, PMCID: PMC2232087, page(s):931-935, 1998.

    21. Maio D, Maltoni D; Direct gray-scale minutiae detection in fingerprints; IEEE Transactions on Pattern Analysis and Machine Intelligence, Issue 1, Volume 19, page(s):27-40, 1997.

    22. Jain AK, Duin RPW, Mao J; Statistical pattern recognition: a review; IEEE Trans Pattern Anal Mach Intell, 2000.

    23. Zhang W, Wang Y; Core-based structure matching algorithm of fingerprint verification; In: Proceedings of the IEEE International Conference on Pattern Recognition, 1:70-74, 2002.

    24. Hatim A. Aboalsamh, Vein and Fingerprint Biometrics Authentication- Future Trends, International journal of computers and communications Issue 4, Volume 3, 2009.

    25. Beining Huang, Shilei Liu, Wenxin Li, A finger posture change correction method for finger-vein recognition, Computational Intelligence for Security and Defence Applications (CISDA), IEEE Symposium, 2012.

    26. Jinfeng Yan, Minfu Yan, An improved method for finger-vein image enhancement, Signal Processing (ICSP), IEEE 10th International Conference 2010.

    27. Yihua Shi, Jinfeng Yang and Jucheng Yang, A New Algorithm for Finger-Vein Image Enhancement and Segmentation, This work was supported in part by the National Natural Science Foundation of China (Grant No.61073143 and 61063035) and CAUC project, No.07 kys01.

    28. Reddy, P.; Kumar, A., Rahman, S., Mundra, T., A New Antispoofing Approach for Biometric Devices, Biomedical Circuits and Systems, IEEE Transactions, Issue 4, volume 2, Page(s): 328 337, December 2008.

    29. Rongyang Xiao, Gongping Yang, Yilong Yin, and Lu Yang, A Novel Matching Strategy for Finger Vein Recognition, IScIDE 2012, LNCS 7751, Springer-Verlag Berlin Heidelberg, page(s):364-371, 2013.

    30. Naoto Miura, Akio Nagasaki, Takafumi Miyatake, Automatic Feature Extraction from non-uniform Finger Vein Image and its Application to Personal Identification, MVA2002 IAPR Workshop on Machine Vision Applications, Nara-ken New Public Hall, Nara, Japan, December 11-13, 2002.

    31. Bakhtiar Affendi Rosdi, Chai Wuh Shing and Shahrel Azmin Suandi, Finger Vein Recognition Using Local Line Binary Pattern, Sensors, 11, doi:10.3390/s111211357, ISSN 1424-8220, www.mdpi.com/journal/sensors, page(s):11357-11371, 2011.

    32. Yanagawa, Takashi, Aoki, Satoshi, Oyama, Tetsuji, Diversity of human finger vein patterns and its application to personal identification, Bulletin of informatics and cybernetics, URL http://hdl.handle.net/2324/21042, 41, page(s):9-12, 2009.

    33. Beining Huang, Yang gang Dai, Rongfeng Li, Darun Tang and Wenxin Li, Finger-vein Authentication Based on Wide Line Detector and Pattern Normalization, 2010 International Conference on Pattern Recognition, 1051-4651/10 © 2010, IEEE, DOI:10.1109/ICPR.2010.316, 2010.

    34. Gongping Yang, Xiaoming Xi and Yilong Yin, Finger Vein Recognition Based on a Personalized Best Bit Map, Sensors 2012, 12, doi:10.3390/s120201738, ISSN 1424-8220, page(s):1738-1757, December 2012.

    35. Anil K. Jain, Salil Prabhakar and Shaoyun Chen, Combining multiple matchers for a high security Fingerprint verification system, Pattern Recognition Letters 20 (1999), Published by Elsevier Science B.V. www.elsevier.nl/locate/patrec, page(s):1371-1379, 1999.

    36. Miyuki Kono, Hironori Ueki, and Shin-ichiro Umemura, Near-Infrared Finger Vein Patterns for Personal Identification, Applied Optics (2002), http://dx.doi.org/10.1364/AO.41.007429, Issue 35, Volume 41, page(s):7429-7436, 2002.

    37. Xianjing Meng, Gongping Yang, Yilong Yin and Rongyang Xiao, Finger Vein Recognition Based on Local Directional Code, Sensors 2012, doi:10.3390/s121114937, ISSN 1424-8220, www.mdpi.com/journal/sensors, page(s):14937-14952, December 2012.

    38. Kejun Wang, Hui Ma, Oluwatoyin P. Popoola and Jingyu Liu, Finger vein recognition, Biometrics, Dr. Jucheng Yang (Ed.), ISBN: 978-953-307-618-8, InTech, Available from: http://www.intechopen.com/books/biometrics/finger-vein-recognition, 2011.

    39. Azadeh Noori Hoshyar, Riza Sulaiman, Afsaneh Noori Houshyar, Smart access control with finger vein authentication and neural network, Journal of American Science, 2011; 7(9), http://www.americanscience.org, 2011.

    40. D. M. Weber and D. Casasent, Quadratic Gabor filters for object detection, IEEE Transactions on Image Processing, Issue 2, Volume 10, page(s):218-230, February 2001.

    41. Li Xueyan and Guo Shuxu, The Fourth Biometric – Vein Recognition, Pattern Recognition Techniques, Technology and Applications, Peng-Yeng Yin (Ed.), ISBN: 978-953-7619-24-4, InTech, Available from: http://www.intechopen.com/books/pattern_recognition_techniques_technology_and_applications/the_fourth_biometric_-_vein_recognition, 2008.
