Future Internet (Feb 2018)
Increasing Trustworthiness of Face Authentication in Mobile Devices by Modeling Gesture Behavior and Location Using Neural Networks
Abstract
Personal mobile devices currently have access to a significant portion of their user’s private, sensitive data and are increasingly used for processing mobile payments. Consequently, securing access to these mobile devices is a requirement for securing access to the sensitive data and potentially costly services. Face authentication is one of the promising biometrics-based user authentication mechanisms that has become widely available in this era of mobile computing. With built-in cameras on smartphones, tablets, and laptops, face authentication provides an attractive alternative to legacy passwords: it requires no memorization and can unlock the device faster than a fingerprint scan. Nevertheless, face authentication in the context of smartphones has proven to be vulnerable to attacks. In most current implementations, a sufficiently high-resolution face image displayed on another mobile device is enough to circumvent security measures and bypass the authentication process. To prevent such bypass attacks, we propose to additionally model gesture behavior and location. Gestures provide a faster and more convenient method of authentication than a complex password. The focus of this paper is to build a secure authentication system with face, location, and gesture recognition as components. Because user gestures and location data form time series, we propose to use unsupervised learning in a long short-term memory (LSTM) recurrent neural network to actively learn to recognize, group, and discriminate user gestures and locations. Moreover, a clustering-based technique is also implemented for recognizing gestures and locations.
Keywords