Ring-LWE Based Face Encryption and Decryption System on a GPU

This paper presents a novel method for implementing ring learning with errors (ring-LWE) cryptography for video-based face encryption and decryption on a graphics processing unit (GPU). By performing the ring arithmetic operations in parallel on a GPU, the processing time of these operations is significantly reduced, and ring-LWE encryption and decryption are accelerated accordingly. Simulation results obtained on GPU and CPU platforms using CUDA C++ show that the ring-LWE based face encryption and decryption operations implemented on a GPU are approximately 100 times faster than the same operations implemented on a CPU.