Accession Number : ADA612426


Title :   Joint Sparse Representation for Robust Multimodal Biometrics Recognition


Descriptive Note : Journal article


Corporate Author : RICE UNIV HOUSTON TX


Personal Author(s) : Shekhar, Sumit ; Patel, Vishal M ; Nasrabadi, Nasser M ; Chellappa, Rama


Full Text : https://apps.dtic.mil/dtic/tr/fulltext/u2/a612426.pdf


Report Date : Jan 2014


Pagination or Media Count : 17


Abstract : Traditional biometric recognition systems rely on a single biometric signature for authentication. While the advantage of using multiple sources of information for establishing identity has been widely recognized, computational models for multimodal biometric recognition have only recently received attention. We propose a multimodal sparse representation method, which represents the test data by a sparse linear combination of training data, while constraining the observations from different modalities of the test subject to share their sparse representations. Thus, we simultaneously account for correlations as well as coupling information among biometric modalities. We modify our model so that it is robust to noise and occlusion. A multimodal quality measure is also proposed to weight each modality as it is fused. Furthermore, we kernelize the algorithm to handle non-linearity in the data. The optimization problem is solved using an efficient alternating direction method. Various experiments show that our method compares favorably with competing fusion-based methods.
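A minimal sketch of the shared-sparsity idea the abstract describes: each modality's test vector is coded over its own training dictionary, while an l1,2 (row-group) penalty forces the modalities to select the same training samples. This illustrative version uses plain proximal gradient descent rather than the paper's alternating direction method; the function names, parameters, and solver choice are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def joint_sparse_code(dicts, tests, lam=0.05, n_iter=300):
    """Jointly sparse coding across modalities.

    Solves (approximately, by proximal gradient descent):
        min_C  sum_m ||y_m - D_m c_m||^2  +  lam * ||C||_{1,2}
    where column m of C is the code for modality m, and the l1,2 norm
    sums the l2 norms of C's rows, so a training sample (row) is either
    used by all modalities or by none -- the shared-sparsity coupling.
    """
    n_atoms, M = dicts[0].shape[1], len(dicts)
    C = np.zeros((n_atoms, M))
    # Step size from the worst-case Lipschitz constant of the gradients:
    # grad of ||y - D c||^2 is 2 D^T (D c - y), Lipschitz 2 ||D^T D||_2.
    L = max(np.linalg.norm(D.T @ D, 2) for D in dicts)
    step = 1.0 / (2.0 * L)
    for _ in range(n_iter):
        # Gradient step, independently per modality.
        G = np.column_stack(
            [2.0 * dicts[m].T @ (dicts[m] @ C[:, m] - tests[m])
             for m in range(M)]
        )
        Z = C - step * G
        # Proximal step: row-wise group soft-thresholding, which zeroes
        # whole rows and thereby couples the modalities' supports.
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        C = Z * np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
    return C
```

Classification would then assign the test subject to the class whose training samples (rows of C) carry the largest joint energy, mirroring sparse-representation-based classification extended to multiple modalities.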


Descriptors :   *BIOMETRY , *DATA FUSION , ALGORITHMS , MULTIMODE


Subject Categories : Operations Research
      Cybernetics


Distribution Statement : APPROVED FOR PUBLIC RELEASE