Data compression, also known as source coding, is the process of encoding information using fewer bits than the original representation by applying specific encoding schemes. Compression reduces the quantity of data used to represent content without excessively degrading its quality, and it lowers the number of bits required to store and/or transmit digital media, which makes it easier to store large amounts of data. Several compression techniques are available; in this paper, I analyze the Huffman algorithm and compare it with other common techniques such as Arithmetic coding, LZW, and Run-Length Encoding.
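To make the comparison concrete, the following is a minimal sketch of Huffman coding in Python. The helper names (build_huffman_codes, huffman_encode) and the sample string are illustrative assumptions, not code from the paper; the sketch builds a prefix code from symbol frequencies and reports the coded length against a fixed 8 bits per character.

import heapq
from collections import Counter

def build_huffman_codes(text):
    """Build a Huffman code table (symbol -> bit string) from symbol frequencies."""
    freq = Counter(text)
    # Each heap entry: (frequency, unique tie-breaker, {symbol: code-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    if len(heap) == 1:
        # Edge case: a single distinct symbol still needs a one-bit code.
        return {sym: "0" for sym in heap[0][2]}
    while len(heap) > 1:
        # Repeatedly merge the two least-frequent subtrees.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

def huffman_encode(text, codes):
    """Concatenate the code words for each symbol in the input."""
    return "".join(codes[ch] for ch in text)

if __name__ == "__main__":
    sample = "this is an example for huffman encoding"
    codes = build_huffman_codes(sample)
    encoded = huffman_encode(sample, codes)
    print(f"original bits (8 per char): {8 * len(sample)}")
    print(f"huffman-coded bits:         {len(encoded)}")

Unlike this symbol-by-symbol prefix code, Arithmetic coding encodes a whole message as a single fractional interval, LZW replaces repeated substrings with dictionary indices, and Run-Length Encoding replaces runs of identical symbols with a count; these differences drive the comparison in the paper.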