Google makes Zopfli compression algorithm open source

Posted on Monday, March 04 2013 @ 10:41 CET by Thomas De Maesschalck
The Inquirer reports Google has open-sourced its Zopfli data compression algorithm. Google's Compression Team claims Zopfli produces files three to eight percent smaller than zlib's, but the catch is that it requires two to three orders of magnitude more CPU time than zlib at maximum quality. There is no performance hit on decompression, though. The source code of Zopfli is available for download.
Google's Zopfli algorithm is based on the Deflate algorithm but has been optimised to produce smaller file sizes at the expense of compression speed. The firm said the compression library, written in C, is based on iterative entropy modelling and a shortest-path search, adding that its output is bit-stream compatible with Deflate, meaning it can be consumed by existing gzip, Zip and, most importantly, HTTP clients.
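That bit-stream compatibility is the key point: a Zopfli-compressed stream needs no special decoder, because any standard Deflate implementation can decompress it. As a minimal sketch of this idea, the snippet below uses Python's standard-library zlib encoder at maximum effort as a stand-in for Zopfli (Zopfli itself is a separate C library and is not assumed here); the decompression side is exactly what a consumer of real Zopfli output would run.

```python
import zlib

data = b"example payload " * 100

# Stand-in for Zopfli: zlib's own encoder at maximum compression level.
# Real Zopfli would spend far more CPU time to shave a few more percent,
# but would emit the same kind of Deflate-compatible stream.
compressed = zlib.compress(data, 9)

# The consumer side needs nothing special -- plain zlib decompression
# recovers the original bytes, which is why Zopfli output works with
# unmodified gzip, Zip and HTTP clients.
restored = zlib.decompress(compressed)
assert restored == data

print(len(data), "->", len(compressed), "bytes")
```

This "compress once, decompress many times" asymmetry is exactly why Vandevenne positions Zopfli for static assets served over the network: the expensive encode is paid once, while every client pays only the ordinary Deflate decode cost.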

Lode Vandevenne, a software engineer on Google's Compression Team who implemented the Zopfli algorithm said, "Due to the amount of CPU time required - two to three orders of magnitude more than zlib at maximum quality - Zopfli is best suited for applications where data is compressed once and sent over a network many times."


About the Author

Thomas De Maesschalck

Thomas has been messing with computers since early childhood and firmly believes the Internet is the best thing since sliced bread. He enjoys playing with new tech, is fascinated by science, and is passionate about financial markets. When not behind a computer, he can be found with running shoes on or lifting heavy weights in the weight room.


