The Inquirer reports that Google has open-sourced its Zopfli data compression algorithm. Google's Compression Team claims Zopfli produces files three to eight per cent smaller than zlib's, but the catch is that it requires two to three orders of magnitude more CPU time than zlib at maximum quality. Decompression speed is unaffected, however. Google software engineer Lode Vandevenne explains that the algorithm is best suited to applications where data is compressed once and sent over a network many times.
The source code of Zopfli can be downloaded here.
Google's Zopfli algorithm is based on the Deflate algorithm but has been optimised to produce smaller file sizes at the expense of compression speed. The firm said the compression library, written in C, is based on iterative entropy modelling and a shortest-path search, adding that its output is bit-stream compatible with Deflate, meaning that it can be used with gzip, Zip and, most importantly, HTTP requests.
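Zopfli itself ships as a C library rather than as part of any scripting runtime, but the property described above - spend more CPU at compression time, keep the exact same decompressor - can be illustrated with Python's standard-library zlib module, which produces the same Deflate bit-stream format. This is only a sketch of the tradeoff using zlib's own effort levels, not Zopfli's algorithm:

```python
import zlib

# Some repetitive sample data, standing in for a web asset.
data = b"the quick brown fox jumps over the lazy dog " * 200

# Compress at zlib's fastest and maximum effort settings.
fast = zlib.compress(data, 1)
best = zlib.compress(data, 9)

# More compression effort yields a smaller stream...
assert len(best) <= len(fast)

# ...but both decompress with the same routine at the same speed,
# because the Deflate bit-stream format is unchanged. Zopfli pushes
# this further: far more CPU up front, still ordinary decompression.
assert zlib.decompress(fast) == data
assert zlib.decompress(best) == data
```

This is why the compress-once, serve-many-times use case fits: the extra CPU cost is paid a single time, while every download benefits from the smaller payload and no client needs new software.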
Lode Vandevenne, the software engineer on Google's Compression Team who implemented the Zopfli algorithm, said, "Due to the amount of CPU time required - two to three orders of magnitude more than zlib at maximum quality - Zopfli is best suited for applications where data is compressed once and sent over a network many times."