Can user gender privacy and recommendation performance be preserved simultaneously?

A recommendation system learns a user's interests from her historical purchasing or watching behavior, which is inevitably disclosed to the system. Such disclosure raises serious public concern about the leakage of users' private information. For instance, a person who watches many videos preferred more by women than by men can be inferred to be female. Recently, in response to this concern, several algorithms have been proposed that obfuscate users' historical behavior records to protect their privacy, at the cost of degraded recommendation accuracy. It is commonly believed that such degradation is inevitable. In this paper, however, we challenge this pessimistic belief, based on the observation that a person's interests are not necessarily limited to items geared toward a particular age, profession, or gender. Building on this idea, we propose a recommendation-friendly privacy-preserving method that introduces a privacy-preserving module between the recommendation system and the user side. By padding a user's historical records with a set of carefully selected fictitious records, the module effectively obfuscates the user's private attributes while letting her interest information pass through. Extensive experiments show that our algorithm not only obfuscates users' gender information effectively, but also maintains or even improves recommendation accuracy.
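As a rough illustration of the idea only (not the paper's actual algorithm), the sketch below shows one way a client-side obfuscation module might select fictitious records: it greedily adds items whose gender-propensity scores push the user's aggregate profile toward neutral, while restricting candidates to genres the user already interacts with so that interest signals remain intact. The catalog, item names, genres, and propensity scores are all hypothetical placeholders.

```python
# Hypothetical catalog: item -> (genre, gender_propensity).
# gender_propensity in [-1, 1]: negative leans "female-preferred",
# positive leans "male-preferred"; 0 is gender-neutral.
CATALOG = {
    "romance_01": ("romance", -0.8),
    "romance_02": ("romance", -0.6),
    "romance_03": ("romance",  0.0),
    "drama_01":   ("drama",   -0.2),
    "drama_02":   ("drama",   +0.1),
    "drama_03":   ("drama",   +0.3),
    "action_01":  ("action",  +0.7),
    "scifi_01":   ("scifi",   +0.6),
}


def profile_bias(history):
    """Mean gender propensity of the items in a history (0 = neutral)."""
    return sum(CATALOG[i][1] for i in history) / len(history)


def obfuscate_history(history, budget):
    """Greedily add up to `budget` fictitious items that move the
    aggregate gender propensity toward 0, drawn only from genres the
    user already interacts with (to avoid diluting interest signals)."""
    liked_genres = {CATALOG[i][0] for i in history}
    candidates = [i for i, (genre, _) in CATALOG.items()
                  if genre in liked_genres and i not in history]
    obfuscated = list(history)
    for _ in range(budget):
        if not candidates:
            break
        # Pick the candidate that best neutralizes the current bias.
        best = min(candidates,
                   key=lambda i: abs(profile_bias(obfuscated + [i])))
        obfuscated.append(best)
        candidates.remove(best)
    return obfuscated


if __name__ == "__main__":
    user_history = ["romance_01", "romance_02", "drama_01"]
    print("bias before:", round(profile_bias(user_history), 2))
    padded = obfuscate_history(user_history, budget=2)
    print("bias after: ", round(profile_bias(padded), 2))
    print("added items:", [i for i in padded if i not in user_history])
```

In this toy example the user's history leans strongly toward female-preferred items; the module pads it with near-neutral items from the same genres, reducing the inferable gender signal while the padded items remain plausible given the user's interests.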