Detecting and Mitigating the Effect of Manipulated Reputation on Online Social Networks

Online social networks (OSNs) are now used not only to communicate but also to build a public or social image. Artists, celebrities, and even ordinary users leverage social networks to build brand value and gain visibility, whether within a restricted circle or among the general public. To help users connect with others and gain following and appreciation, OSNs expose various social metrics such as Facebook likes, Twitter followers, and Tumblr reblogs; these metrics give OSN users a sense of social reputation. As more users try to leverage social media to build brand value and become more influential, spammers lure such users into manipulating their social reputation through paid services (blackmarkets) or collusion networks. In this work, we aim to build a robust alternative social reputation system and to detect users with manipulated social reputation. To do so, we first study the underlying structure of various sources of crowdsourced social reputation manipulation, such as blackmarkets, supply-driven microtask websites, and collusion networks. We then build a mechanism for the early detection of users with manipulated social reputation. Our initial results are encouraging and substantiate the feasibility of a robust social reputation system.
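The abstract does not disclose which signals the detection mechanism uses. As a purely illustrative sketch, one signal commonly associated with purchased followers is an abrupt spike in a user's daily follower gain relative to their own history; the function names, threshold, and sample data below are hypothetical and are not taken from this work.

    # Illustrative sketch only: flags days whose follower gain deviates sharply
    # from a user's typical growth, using a robust (median-based) z-score.
    # Names, threshold, and data are hypothetical, not the authors' method.
    from statistics import median

    def robust_z_scores(daily_gains):
        """Robust z-score for each day's follower gain, based on the
        median absolute deviation (MAD)."""
        med = median(daily_gains)
        mad = median(abs(g - med) for g in daily_gains) or 1.0  # avoid division by zero
        return [(g - med) / (1.4826 * mad) for g in daily_gains]

    def flag_suspicious_days(daily_gains, threshold=3.5):
        """Return indices of days whose gain is anomalously large."""
        return [i for i, z in enumerate(robust_z_scores(daily_gains)) if z > threshold]

    if __name__ == "__main__":
        # Mostly organic growth with one abrupt spike (e.g., a bulk follower purchase).
        gains = [12, 9, 15, 11, 10, 14, 980, 13, 12, 11]
        print(flag_suspicious_days(gains))  # -> [6]

A deployed system would combine several such behavioral and network features rather than a single threshold, but the sketch conveys the kind of early-detection signal the abstract alludes to.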
