Google compression algorithm Zopfli may lead to faster Internet
Algorithm promises faster data transfer speeds and reduced Web page load times by compressing content up to 8 percent smaller than zlib.
Google has released a new data compression algorithm it hopes will make the Internet faster for everyone.
Dubbed Zopfli, the open-source algorithm will accelerate data transfer speeds and reduce Web page load times by compressing content up to 8 percent smaller than the zlib software library, Google said in a company blog post today. Named after a Swiss bread recipe, the new algorithm is an implementation of the Deflate algorithm, which is used in the ZIP archive format, as well as in gzip file compression.
"The higher data density is achieved by using more exhaustive compression techniques, which make the compression a lot slower, but do not affect the decompression speed," said Lode Vandevenne, a Zurich-based software engineer who implemented Zopfli as part of his "20 percent time" -- Google's policy under which workers can use up to 20 percent of their work week to delve into special projects.
Zopfli is a compression-only library; because its output is a standard Deflate bit stream, existing software can decompress the data without modification. That makes it compatible with zlib and gzip, as well as with formats and protocols built on Deflate, such as ZIP archives, PNG images, and HTTP content encoding.
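The compatibility property is the key point: a Deflate producer can be swapped out while the decompression path stays the same. Zopfli itself is not in the Python standard library (a third-party `zopfli` package exists), so the sketch below uses zlib at its highest setting as a stand-in compressor to illustrate the round trip through the standard decompressor.

```python
import zlib

# Illustrative only: zlib level 9 stands in for a Zopfli-style compressor.
# The point is that any Deflate-format output, however it was produced,
# is readable by the unchanged standard zlib decompressor.
data = b"the quick brown fox jumps over the lazy dog " * 100

compressed = zlib.compress(data, level=9)  # stand-in for Zopfli's exhaustive search
restored = zlib.decompress(compressed)     # unchanged decompression path

assert restored == data
print(len(data), "->", len(compressed), "bytes")
```

A real deployment would produce the Deflate stream with the Zopfli library offline and serve it to clients whose decompressors never know the difference.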
"The smaller compressed size allows for better space utilization, faster data transmission, and lower web page load latencies," he said, adding that mobile users will benefit from the smaller compression in the form of lower data transfer rates and reduced battery use.
Because the amount of CPU time required for compression is two to three orders of magnitude more than zlib's, Google believes Zopfli is best suited for applications where data is compressed once but delivered many times over, such as static content for the Web.
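The compress-once, serve-many argument can be made concrete with a back-of-the-envelope calculation. All figures below are illustrative assumptions, not benchmarks: an assumed zlib time per asset, a 100x slowdown (within the "two to three orders of magnitude" range), and the up-to-8-percent size reduction cited above.

```python
# Hypothetical figures to show why a one-time CPU cost is easily
# amortized when the same asset is downloaded many times.
zlib_compress_ms = 50                        # assumed zlib time for one asset
zopfli_compress_ms = zlib_compress_ms * 100  # assumed 100x slower compression
asset_kb = 500                               # assumed static asset size
savings_ratio = 0.08                         # "up to 8 percent smaller"

saved_kb_per_request = asset_kb * savings_ratio
requests = 1_000_000
total_saved_mb = saved_kb_per_request * requests / 1024

extra_cpu_s = (zopfli_compress_ms - zlib_compress_ms) / 1000
print(f"one-time extra CPU cost: {extra_cpu_s:.1f} s")
print(f"bandwidth saved over {requests:,} requests: {total_saved_mb:,.0f} MB")
```

Under these assumptions a few extra seconds of CPU time, paid once, buys tens of gigabytes of bandwidth savings, which is why the technique suits static content rather than data compressed on the fly per request.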