I know that using tar with the -J option makes it possible to compress a folder at a high compression level, resulting in a .tar.xz file.
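For reference, this is the kind of command I mean (assuming GNU tar; backups/ stands in for my actual folder):

    # create an xz-compressed archive of the whole folder
    tar -cJf backups.tar.xz backups/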
I have a folder holding multiple backups of my workspace, each of which contains a lot of libraries (.so, .a, etc.) that are usually, but not always, the same files in each backup (duplicate files).
Is there a method that can compress my folder of backups while taking into account that it contains a lot of duplicate files, and therefore achieve a higher level of compression? Does passing the -J option to the tar command do the job?
I don't want to deal with the duplicate files inside each folder by hand every time. Is there a smart tool that treats all the duplicate files as a single file and then compresses that? If not, what are the best tool and options for compressing such a folder?