Authentication through Sensing of Tongue and Lip Motion via Smartphone
Current voice-based user authentication exploits unique characteristics of either the voiceprint or mouth movements, both of which are vulnerable to replay attacks. During speech, the vocal tract, tongue, and lips exhibit individual uniqueness in both their static shapes and dynamic movements, which adversaries can hardly imitate. Moreover, most voice-based authentication schemes are passphrase-dependent, which significantly degrades the user experience. Our work therefore employs the individual uniqueness of vocal-tract, tongue, and lip movement to realize user authentication on a smartphone. This paper presents a new authentication framework that identifies smartphone users through articulation, namely tongue and lip motion reading. The main idea is to capture acoustic and ultrasonic signals with a mobile phone and analyze the fine-grained impact of articulation movement on the uttered words. We develop a passphrase-independent authentication model by analyzing articulation in continuous speech across different usage scenarios.
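The underlying sensing principle, emitting an inaudible carrier from the phone's speaker and measuring the Doppler shift that moving articulators impose on the echo, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 20 kHz carrier, the articulator speed, and the simulated echo are all illustrative assumptions.

```python
import numpy as np

FS = 48_000        # sample rate (Hz), commonly supported on smartphones
CARRIER = 20_000   # near-inaudible carrier frequency (Hz); an assumed choice
N = FS             # one second of audio

# Simulated recording: the carrier reflected off a moving lip surface.
# An articulator moving toward the phone at speed v shifts the echo by
# f_d = (2 * v / c) * f0, about 3.5 Hz for v = 0.03 m/s at 20 kHz.
v = 0.03                     # articulator speed in m/s (illustrative)
c = 343.0                    # speed of sound in air, m/s
f_echo = CARRIER * (1 + 2 * v / c)
t = np.arange(N) / FS
rng = np.random.default_rng(0)
recording = np.cos(2 * np.pi * f_echo * t) + 0.01 * rng.standard_normal(N)

# Estimate the dominant frequency near the carrier by FFT peak picking,
# restricted to a narrow band around the carrier.
spectrum = np.abs(np.fft.rfft(recording * np.hanning(N)))
freqs = np.fft.rfftfreq(N, 1 / FS)
band = (freqs > CARRIER - 500) & (freqs < CARRIER + 500)
peak_freq = freqs[band][np.argmax(spectrum[band])]
doppler_shift = peak_freq - CARRIER
print(f"estimated Doppler shift: {doppler_shift:.1f} Hz")
```

In a real system, a short-time Fourier transform over the recording would yield a time series of such shifts, from which per-user articulation features could be derived; the single-FFT estimate above only shows the physical effect being measured.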