
I have a couple of images that are 32 MB in size, and I want to reduce them from 32 MB to 100 KB (or some other size in KB) without affecting their colours.

The commands I am trying are:

```
muhammad@muhammad-mohsin:~/scans$ find . -iname '*.png' -exec mogrify -format jpg "*.png" {} +

muhammad@muhammad-mohsin:~/$ find . -type f -iname \*.png -delete

muhammad@muhammad-mohsin:~/$ find . -iname '*.jpg' -exec mogrify -define jpeg:extent=300kb -strip -quality 90 -scale 90% *.jpg {} +
```

Here, the first command converts a PNG to a JPG, which reduces its size from 32 MB to 5.8 MB and everything stays the same, but when I use the third command, it removes the background colour from the image, making it a sort of grayscale and blurry.

However, while the text is still readable, the colours and the background logo do not survive.

How can I achieve this with `convert`, `mogrify`, or any other tool? I have tried every possible thing so far.

This is part of the original image

This is part of the changed image after the command

LearningROR
    30MB png to 300kb jpg → Example : `convert Sample.png -resize 22% S300.jpg` .... then you have very good quality, but a smaller image. – Knud Larsen Aug 08 '21 at 09:48
  • @KnudLarsen Thanks for it. Really helped me. Can we add some other feature to make image colours sharp/bright etc? I am using: `convert image.png -resize 35% S300.jpg` and it returns `763Kbs` size. We are close! – LearningROR Aug 08 '21 at 09:57
  • @KnudLarsen Can we make this a batch process, so that all images in a folder and its sub-folders get this command applied to them? I am trying `find . -iname '*.png' -exec convert -resize 60% -quality 60 "*.jpg" {} +` but that does not work. – LearningROR Aug 08 '21 at 10:35

2 Answers


It's because of how JPEG compression works. It attempts to round adjoining pixels that are similar to each other to similar values. This causes loss of detail, and blockiness.

This becomes more noticeable as you increase the compression level, which is exactly what you're doing. In addition, you're doing it in two steps:

  1. Lossless (PNG) to lossy (JPEG) compression.
  2. Lossy to lossy compression.

You will probably get a better result by going lossless to lossy at final quality, thus only applying lossy compression once, e.g. using `-define jpeg:extent=300kb -strip -quality 90 -scale 90%` in the first conversion.
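A single pass along those lines might look like this (a sketch, not a drop-in command for your exact files; note that `mogrify -format jpg` writes new `.jpg` files next to the originals rather than overwriting them, so the PNGs are left in place until you delete them):

```shell
# Convert every PNG under the current directory straight to JPG at the
# final target quality, so lossy compression is only applied once.
find . -iname '*.png' -exec mogrify -format jpg \
    -define jpeg:extent=300kb -strip -quality 90 -scale 90% {} +
```

Try it on a copy of the directory first and adjust `-quality` / `-scale` until the trade-off looks acceptable.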

Furthermore, you say nothing about the pixel dimensions of the image or its level of detail. It may not be feasible to get it down to 300kB and retain the desired quality.

To get rid of background blotches, you can try to apply thresholds to your document in some image editing software, forcing anything less than a certain shade of gray to be white, for instance.
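With ImageMagick itself, that thresholding idea could be sketched like this (the filenames are placeholders and the 85% level is a guess you would tune per document; be aware it can also wash out very light colours):

```shell
# Force any pixel channel brighter than 85% to pure white, pushing
# light-grey background blotches to white before the lossy JPEG step.
convert scan.png -white-threshold 85% -strip -quality 90 scan.jpg
```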

However, no matter what you do, compressing from a 30MB lossless format to a 300kB lossy format will lead to visibly reduced quality.

vidarlo
  • Thanks so much for the detailed response. What do you say in this case, now that the JPGs are at 5.8 MB? Can we use any compression tool to make the size a bit lower? – LearningROR Aug 08 '21 at 09:35
  • I have all files in PNGs. How can I use `jpeg:extent=300kb -strip -quality 90 -scale 90%` just once in this case? – LearningROR Aug 08 '21 at 09:38
  • You can use it in your first conversion. And yes; you can try with e.g. 2MB size, and it will lead to a better result than 300kB target. – vidarlo Aug 08 '21 at 09:43
  • Thank you. Makes sense. +1 for good response. :) – LearningROR Aug 08 '21 at 09:49
  • *encodes them as copies of each other.* - JPEG *only* does DCT quantization, without intra-prediction like "copy the block from 29 pixels in this direction". Maybe you're thinking of h.264 / h.265 I frames (https://en.wikipedia.org/wiki/High_Efficiency_Image_File_Format)? I guess a JPEG encoder doing trellis quantization might try to encode nearby blocks the same way so the final lossless compression step could save more bits? – Peter Cordes Aug 08 '21 at 18:14
  • Or maybe you mean on a very local scale, wanting to round off high frequency DCT components to zero, making adjacent pixels more like each other. That similar-pixel effect is *not* the result of literally encoding them as copies of each other, but actually of encoding with mostly low spatial frequency coefficients. – Peter Cordes Aug 08 '21 at 18:15
  • @PeterCordes Thanks for the correction :) – vidarlo Aug 08 '21 at 18:22
  • "*attempts to round adjoining pixels that are similar to eachother to similar values*" is still not how it actually works. That can be part of the result, but it's not that simple. JPEG's lossy compression happens in the frequency domain after [an 8x8 DCT](https://en.wikipedia.org/wiki/Discrete_cosine_transform#Compression_artifacts), not in terms of actual adjacent pixels. So you can get ["ringing" artifacts](https://en.wikipedia.org/wiki/Ringing_artifacts) around sharp edges, and the separate 8x8 block processing is why low qual JPEGs get blocky. – Peter Cordes Aug 08 '21 at 21:23
  • See also [Two EXACTLY the same .jpg images with one image more than twice the file size of the other - Why?](https://photo.stackexchange.com/a/125291) for more description of how JPEG compresses, and what makes some images harder or easier to compress without significant distortion. (I wrote that answer with technical details, but aimed at an audience that didn't already know how image-compression worked.) – Peter Cordes Aug 08 '21 at 21:25

> I am trying `find . -iname '*.png' -exec convert -resize 60% -quality 60 "*.jpg" {} +` but that does not work.

Ref. https://superuser.com/questions/71028/batch-converting-png-to-jpg-in-linux

```
$ ls -1 *.png | xargs -n 1 bash -c 'convert -quality 60 "$0" "${0%.*}.jpg"'
```

This converts my example 31MB.png to a 1.4MB .jpg. ... You may have to repeat with e.g. `$ ls -1 *.PNG | ...` etc.

Ref. comment by @steeldriver: "slightly better is `xargs -d '\n' -n 1`"
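As the comments note, a plain shell loop avoids both the per-file `bash` forks and the filename-whitespace pitfalls of `ls | xargs`. A sketch with the same `convert` settings (`"${f%.*}"` strips the extension):

```shell
# Convert each .png in the current directory to a .jpg of the same name;
# "${f%.*}" strips the extension, and quoting keeps spaces in names safe.
for f in *.png; do
    [ -e "$f" ] || continue   # skip the literal pattern when nothing matches
    convert -quality 60 "$f" "${f%.*}.jpg"
done
```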

Knud Larsen
  • Thank you, Mr. Knud. I had already done that with `find . -iname '*.png' -exec mogrify -resize 60% -quality 60 -format jpg *.png {} +`, but I'm accepting and upvoting your answer for putting me in the right direction. – LearningROR Aug 08 '21 at 11:50
  • Note that `ls -1 | xargs -n 1` will break if any of the filenames contains whitespace. You can make it *slightly* better using `xargs -d '\n' -n 1` which will work except for filenames containing newlines - you could handle those as well using null delimiters ex. `printf '%s\0' *.png | xargs -0 -n 1 ... `. However since you are forking a new bash shell for every file I wonder if the whole thing would be just as easily done using a shell loop `for f in *.png; do ... "$f" "${f%.*}.jpg"; done` – steeldriver Aug 08 '21 at 12:39
  • @steeldriver : OK, `ls -1` etc. etc. is just one possible set of options. I can add your suggestion to the answer. – Knud Larsen Aug 08 '21 at 14:38