19

In my application I need to compress logs, which are text files.

It seems that bzip2 and gzip have the same compression ratio.

Is that correct?

user710818
  • xz (from xz-utils, or 7z from p7zip; it is very similar to LZMA) is the best. bzip2 is better than gzip. – osgx Dec 01 '11 at 12:36

7 Answers

13

The last update of maximumcompression.com was in June 2011 (this answer was last updated in October 2015).
Therefore that site does not mention
the current champion among text compressors:

      cmix

Details:
Byron Knoll has been actively developing cmix as free software (GPL) since 2013, based on the book Data Compression Explained by Matt Mahoney. Matt Mahoney also maintains several of the major compression benchmarks and provides ZPAQ, a command-line incremental archiver.


If you prefer a more standard tool (requiring less RAM), I recommend:

      lrzip

lrzip is an evolution of rzip by Con Kolivas.
The name stands for both Long Range ZIP and LZMA RZIP.
lrzip often compresses better than xz (another popular compression tool).
Alexander Riccio also recommends lrzip.
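
For reference, a minimal usage sketch, assuming the lrzip package is installed and a log file named app.log (the file name is just a placeholder):

lrzip app.log        # default LZMA backend, writes app.log.lrz
lrzip -z app.log     # -z selects the slower but stronger ZPAQ backend
lrunzip app.log.lrz  # decompress back to app.log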


My favorite is:

      zpaq

The "archiver expert", Matt Mahoney, has intensively worked on PAQ algorithms for ten years and provide the best compromise between CPU/memory resources and compression level.

However, the latest zpaq version is often not packaged or available on recent distros :-(
I always compile it from source when I have a new machine and need a very good compressor: https://github.com/zpaq/zpaq

git clone https://github.com/zpaq/zpaq
cd zpaq
g++ -O3 -march=native -Dunix zpaq.cpp libzpaq.cpp -pthread -o zpaq
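
Once built, a minimal usage sketch (file names are placeholders; -method 5 is the slowest/strongest setting as far as I know):

./zpaq add logs.zpaq /var/log/syslog -method 5   # create or append to an archive
./zpaq list logs.zpaq                            # list the archive contents
./zpaq extract logs.zpaq                         # restore the files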
oHo
6

Normally, bz2 gives a better compression ratio, combined with better recoverability features (bzip2 compresses in independent blocks, so bzip2recover can salvage the intact blocks of a damaged archive).

OTOH, gz is faster.

xz is said to be even better than bz2, but I don't know the timing behaviour.
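
A quick way to check both the ratio and the timing on your own logs (a sketch; app.log is a placeholder, and -k, which keeps the original file, needs a reasonably recent gzip):

time gzip -k app.log     # fastest, weakest
time bzip2 -k app.log    # slower, smaller
time xz -k app.log       # slowest, usually smallest
ls -l app.log*           # compare the resulting sizes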

glglgl
  • xz is slower than bzip2. – osgx Dec 01 '11 at 14:46
  • xz is not just slower, but much slower: a 300 MB file took about 30 seconds for bzip2 to compress, and I killed xz after it had been compressing for longer than 5 minutes – Tebe Jan 07 '17 at 01:29
  • @Копать_Шо_я_нашел I think it depends heavily on the compression level you choose. With `-1`, it is not so very slow, but with the default settings, it tends to be quite slow. – glglgl Jan 09 '17 at 09:59
4

I made a benchmark compressing the following:
a 204 MB folder (containing 1,600 HTML files)
Results:

7zip =>     2.38 MB
winrar =>   49.5 MB
zip =>      50.8 MB
gzip =>     51.9 MB

So 7-Zip is the best among them. You can get it from here:
http://www.7-zip.org/
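
To reproduce a comparison like this yourself, the commands look roughly like this (a sketch; html_folder is a placeholder and the default settings may differ from those used above):

7z a site.7z html_folder         # 7-Zip, LZMA2 by default
zip -r site.zip html_folder      # classic zip (deflate)
tar czf site.tar.gz html_folder  # gzip via tar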

4

Maybe you could have a look at those benchmarks, especially the part testing log file compression.

Cédric Julien
0

If you care more about compression ratio than compression speed, then brotli is the best I have found so far.

I have a 2 MB text file, and brotli compressed it to about half the size that bzip2 and gzip could manage.

On Linux, apt install brotli and check for yourself.
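
A minimal sketch with the command-line tool from that package (app.log is a placeholder):

brotli -q 11 app.log    # highest quality (0-11), writes app.log.br alongside the original
brotli -d app.log.br    # decompress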

Zibri
0

bz2 has tighter compression; the algorithm does more work looking for redundancy that it can compress away.

gzip is supported by many more tools and is more cross-platform. More Windows tools can deal with .gz files, and gzip is part of HTTP, so even web browsers can understand it.

On Linux, there are tools that let you work on compressed files directly: zgrep and bzgrep can search inside compressed files.
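
For example (a sketch; file names and search pattern are placeholders):

zgrep ERROR app.log.gz      # search a gzip-compressed log directly
bzgrep ERROR app.log.bz2    # the same for bzip2
zcat app.log.gz | less      # view without decompressing to disk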

If you are only on Linux, I'd use bzip2 for the slightly better compression ratio.

Rich Homolka
0

xz compresses much better than bz2, but takes more time. So if maximum compression is your goal and space on your hard drive is at a premium (which is my case, with one drive at 98% full while I reorganize my file systems), fire off a script to do the work, take a break, and come back in 5 minutes.

unxz is very fast at decompressing in my experience, which is a good thing for me on a daily basis.

bz2 is faster to compress than xz, but does not appear to achieve the compression results of xz.

The only way to make these assessments is to run benchmarks against a mix of the files you normally compress and decompress, varying the parameters to see which comes out on top.
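
A rough bash sketch along those lines (sample.log and the tool list are placeholders; adjust them to your own data):

f=sample.log
for cmd in "gzip -9" "bzip2 -9" "xz -9"; do
    cp "$f" t.log          # fresh copy so each tool sees the same input
    time $cmd t.log        # compresses in place, e.g. to t.log.gz
    ls -l t.log.*          # note the resulting size
    rm -f t.log t.log.*    # clean up before the next tool
done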

Tom