Image Compression using Huffman Coding

In 1952, David Huffman, then a graduate student at the Massachusetts Institute of Technology, developed an elegant algorithm for lossless compression as part of his coursework. The algorithm is now known as Huffman coding. Huffman coding can be used to compress all sorts of data; it is an entropy-based method that relies on an analysis of the frequency of symbols in the input data.

Algorithm of Huffman Coding

Image compression techniques fall into two main categories: lossy compression and lossless compression.

Lossless Compression

A technique in which the compressed image can be reconstructed without any loss of data is called lossless compression. It preserves image quality exactly, but typically achieves only modest compression ratios.

Lossy Compression

A technique in which the compressed image is reconstructed with some loss of data is called lossy compression. Lossy techniques discard information in exchange for higher compression ratios.

Huffman coding is a lossless technique with attractive properties for applications such as medical surveys and analysis, technical drawings, and other domains where data must be preserved exactly. These characteristics make it well suited to image compression.
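The core of the algorithm is: count how often each symbol (for an image, each pixel value) occurs, then repeatedly merge the two least frequent subtrees until a single prefix-code tree remains, so that frequent symbols get short codes. The Python sketch below is a minimal illustration of that idea under these assumptions; it is not the code in this repository, and the function and variable names are chosen here only for clarity.

```python
import heapq
from collections import Counter


def huffman_codes(symbols):
    """Build a Huffman code table from an iterable of symbols (e.g. pixel values)."""
    freq = Counter(symbols)
    if len(freq) == 1:
        # Degenerate case: a single distinct symbol still needs one bit.
        return {next(iter(freq)): "0"}

    # Each heap entry: (frequency, unique tie-breaker, {symbol: code-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)

    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # least frequent subtree
        f2, _, right = heapq.heappop(heap)  # second least frequent subtree
        # Prefix '0' to codes in the left subtree and '1' to the right,
        # then push the merged subtree back with the combined frequency.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1

    return heap[0][2]


if __name__ == "__main__":
    pixels = [12, 12, 12, 200, 200, 37, 37, 37, 37, 255]
    codes = huffman_codes(pixels)
    encoded = "".join(codes[p] for p in pixels)
    print(codes)
    print(f"{len(pixels) * 8} bits raw -> {len(encoded)} bits encoded")
```

Because the code is a prefix code, the encoded bitstream can be decoded unambiguously by walking the same tree, which is what makes the scheme lossless.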

Block Diagram Flow
Input image
Split into equal rows and columns
Apply Huffman coding to the individual rows and columns
Individual compressed images
Sum of the individual compressed images
Compressed image
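The flow above can be sketched as follows. The README does not specify exactly how the image is split, so this illustration simply divides it into horizontal strips of equal height, Huffman-codes each strip independently using the huffman_codes helper from the previous sketch, and sums the individual compressed sizes; all names here are illustrative, not taken from the repository.

```python
import numpy as np

# huffman_codes() is the helper defined in the Huffman sketch above.


def compress_blockwise(image, strip_height=64):
    """Split the image into horizontal strips, Huffman-code each strip
    independently, and return the summed compressed size in bits."""
    total_bits = 0
    for start in range(0, image.shape[0], strip_height):
        pixels = image[start:start + strip_height].ravel().tolist()
        codes = huffman_codes(pixels)                  # per-strip code table
        total_bits += sum(len(codes[p]) for p in pixels)
    return total_bits


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 8-bit image with only 8 distinct gray levels, so the
    # Huffman codes are short and the image actually compresses.
    image = (rng.integers(0, 8, size=(256, 256)) * 32).astype(np.uint8)
    raw_bits = image.size * 8
    packed_bits = compress_blockwise(image)
    print(f"raw: {raw_bits} bits, compressed: {packed_bits} bits, "
          f"ratio: {raw_bits / packed_bits:.2f}")
```

Coding each piece separately lets the code tables adapt to local pixel statistics; the trade-off is that every piece must also store its own code table, which this sketch does not account for.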
