The demand for storage grows rapidly day by day. Data deduplication is the process of eliminating repeated or duplicate copies of data. It is widely used in cloud storage to reduce storage space and upload bandwidth, and deduplication systems improve both storage utilization and reliability. At the same time, outsourcing sensitive data to the cloud raises privacy concerns. To address this security challenge, this paper makes the first attempt to formalize the notion of a distributed reliable deduplication system. The paper proposes distributed deduplication systems with two methods, file-level and block-level deduplication. A deduplication technique reduces storage cost at the server side and saves upload bandwidth at the user side. Deduplication has received much attention from both academia and industry because it improves storage utilization and saves storage space, especially for applications with high deduplication ratios such as archival storage systems.
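The block-level approach described above can be illustrated with a minimal sketch: a file is split into fixed-size blocks, each block is indexed by its SHA-256 digest, and a block already present in the store is never written twice. This is a simplified, hypothetical illustration (the class and method names are assumptions, not the paper's implementation); a real secure deduplication system would additionally apply convergent encryption and distribute blocks across multiple storage servers.

```python
import hashlib


class DedupStore:
    """Minimal block-level deduplication store: blocks are indexed by
    their SHA-256 digest, so identical blocks are stored only once.
    (Hypothetical sketch, not the paper's actual system.)"""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}   # digest -> block bytes, stored once
        self.files = {}    # file name -> ordered list of block digests

    def put(self, name, data):
        digests = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            d = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(d, block)  # skip write if duplicate
            digests.append(d)
        self.files[name] = digests

    def get(self, name):
        # Reassemble the file from its block digests.
        return b"".join(self.blocks[d] for d in self.files[name])
```

With file-level deduplication the same idea is applied to the whole file at once (one digest per file); block-level deduplication finds duplicates even between partially overlapping files, at the cost of a larger index.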