
Most efficient compression for Atari


xxl


I am looking for the most efficient compression method to use on Atari. At this point, neither the compression speed nor the size of the buffer or library matters.


If anyone can compress it better, please specify the compressor and parameters used (the results below use the defaults unless noted otherwise).

 

Are there any other (efficient) compressors available for Atari?


Who can pack it better?

 

source file: 7680 bytes

 

rle: 2789
lz4: 2219 --> decompressor for Atari: https://xxl.atari.pl/lz4-decompressor/
zpaq: 2189 (-m5)
FlashPack 3: 2174
Autogamy: 2148
pp: 2052 (Amiga PowerPacker)
Bongo: 1891
ARJBETA mode 4: 1841 --> decompressor for Atari: http://xxl.atari.pl/arj4-decompressor/
LZSS: 1839
lzsa: 1811 (lzsa.exe -f2 -r)
lzfse: 1810
ZX7: 1801 --> decompressor for Atari: http://xxl.atari.pl/zx7-decompressor/
BitBuster: 1774 --> decompressor for Atari: https://xxl.atari.pl/bitbuster-decompressor/
bz2: 1736
ARJBETA mode 7: 1657
apl: 1655 --> decompressor for Atari: https://xxl.atari.pl/aplib-decompressor/
ZX0: 1625 --> decompressor for Atari: https://xxl.atari.pl/zx0-decompressor/
def: 1598 --> decompressor for Atari: https://github.com/pfusik/zlib6502
PackFire: 1581 (-t tiny) --> decompressor for Atari: https://xxl.atari.pl/packfire-decompressor/
EXO: 1537 (exomizer raw -E)
Brotli: 1537
ZX5: 1532 --> decompressor for Atari: https://xxl.atari.pl/zx5-decompressor/
zstandard: 1522
Lzip: 1519 (lzip -9)
LZMA (7z ultra): 1498
7z: 1453 (7z a -t7z -m0=lzma -mx=9 -mfb=64 -mmf=bt4 -mlc=1 -mlp=0 -mpb=0)
PackFire: 1452 (-l large)
Shrinkler: 1412 (Shrinkler -d -p -9) --> decompressor for Atari: https://xxl.atari.pl/shrinkler-decompressor/
paq8px: 1121
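As a point of reference for the weakest entry in the list, a byte-oriented RLE scheme can be sketched in a few lines of Python. This is a hypothetical (count, byte) encoding for illustration only, not the exact packer that produced the 2789-byte result:

```python
def rle_encode(data: bytes) -> bytes:
    """Naive run-length encoding: each run becomes (count, byte), count <= 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes((run, data[i]))
        i += run
    return bytes(out)

def rle_decode(data: bytes) -> bytes:
    """Inverse: expand each (count, byte) pair back into a run."""
    out = bytearray()
    for count, value in zip(data[::2], data[1::2]):
        out += bytes([value]) * count
    return bytes(out)
```

RLE wins on long runs (common in screen memory) but doubles the size of non-repeating data, which is why every LZ-family packer in the list beats it comfortably.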

 

conan.gfx

Edited by xxl

My (somewhat older) version of WinRAR managed 1681 bytes (and 1703 with a quick play with 7-Zip).

Though I suspect it'd be handicapped somewhat for smaller files by its normal overheads and large dictionary size (I forced it to 64K, which was the smallest available).

 

I think for larger sizes the advantage on the Atari would vanish and probably go the other way.

 


Yes. Here I would like to learn about more efficient compressors, regardless of their speed of operation.

There may also be a compressor that does not yet have an Atari decompression routine (as long as it is more efficient than the others).
 


According to https://github.com/emmanuel-marty/lzsa/blob/master/pareto_graph.png, Shrinkler (https://github.com/askeksa/Shrinkler) is rather good; I have tried it, and indeed:

 

Shrinkler.exe -d -p -9 : 1440 bytes 

 

Sounds too good to be true!!

 

(exe is available here http://www.pouet.net/prod.php?which=64851)

 

Also: lzsa.exe -f2 : 1819 bytes

 


It actually works? Amiga rules... as usual.


The algorithm is written for a processor with 16-bit registers :( A lot of optimization is needed to make it run fast.

 

I compared with LZ4, which decompresses 8 KB in 11 frames; Shrinkler (I didn't play with optimization, dirty code) takes over 500 frames. The video shows how slowly it works :/ but it really is much better than all the other packers.
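To put those frame counts in perspective, a quick back-of-the-envelope calculation (assuming a PAL machine at 50 frames per second; these are rough figures derived from the numbers above, not measurements):

```python
PAL_FPS = 50   # PAL Atari displays ~50 frames per second
SIZE = 8192    # 8 KB payload

for name, frames in (("LZ4", 11), ("Shrinkler", 500)):
    seconds = frames / PAL_FPS
    rate = SIZE / seconds  # effective decompression throughput in bytes/second
    print(f"{name}: {frames} frames = {seconds:.2f} s, ~{rate:.0f} B/s")
```

So LZ4 decompresses at roughly 37 KB/s while the unoptimized Shrinkler port manages only about 0.8 KB/s, a ~45x speed gap for a ~36% size gain.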


Is that all? Is there anything better at packing data?

 

 


13 minutes ago, xxl said:

I am looking for a win64 binary of this:

 

https://www.nongnu.org/lzip/

 

---

1527

 

Many ways to skin a cat and all that. One way would be to use the Cygwin Linux-like environment under Windows. If Cygwin doesn't provide a package to install, you would need to build from source, which requires more setup but should produce a Windows executable. Another way would be to cross-compile for Windows on Linux using mingw-w64, but I don't currently have it installed.


I think it's not a "fair" comparison. You cannot compare single-file compressors with archivers. An archiver has way more overhead, like storing paths and filenames.

 

Also, there's only one sample file, which is the Conan title screen.

 

PPM (Prediction by Partial Matching) is currently state-of-the-art compression, but it's very memory hungry.
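A rough way to see why PPM's context modelling helps is to measure the order-k conditional entropy of a file, i.e. how many bits per byte remain once each byte is predicted from the k bytes before it. This is a hypothetical illustration only, not real PPM (which also needs escape handling and arithmetic coding):

```python
from collections import Counter, defaultdict
from math import log2

def order_k_entropy(data: bytes, k: int) -> float:
    """Average bits per byte when each byte is predicted from its k-byte context."""
    contexts = defaultdict(Counter)
    for i in range(k, len(data)):
        contexts[data[i - k:i]][data[i]] += 1
    bits = 0.0
    n = 0
    for counts in contexts.values():
        total = sum(counts.values())
        for c in counts.values():
            bits += c * -log2(c / total)  # -log2(p) bits for each occurrence
        n += total
    return bits / n if n else 0.0
```

On repetitive data the order-2 entropy is far below the order-0 entropy, and that gap is exactly the redundancy that PPM (and paq8px's mixed context models) exploit, at the cost of one statistics table per context.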

 

Example of the overhead of an archiver:

$ 7z a -t7z -m0=lzma -mx=9 -mfb=64 -mmf=bt4 -mlc=1 -mlp=0 -mpb=0 bla.7z test.gfx
...
Archive size: 1575 bytes (2 KiB)
...
$ 7z l bla.7z
...
2021-02-01 18:44:46 ....A         7680         1453  test.gfx
...

 

 

Edited by ivop

54 minutes ago, xxl said:

It actually works? Amiga rules... as usual.

The algorithm is written for a processor with 16-bit registers :( A lot of optimization is needed to make it run fast.

I compared with LZ4, which decompresses 8 KB in 11 frames; Shrinkler (I didn't play with optimization, dirty code) takes over 500 frames. The video shows how slowly it works :/ but it really is much better than all the other packers.


 

 

 

Wow, unlike Shrinkler decompression, you're fast!

 

All these results are consistent with those obtained by Emmanuel Marty:

 

[attached image: Emmanuel Marty's compression ratio comparison chart]

 

I don't know what the "EPC" compressor is. But in any case it seems to be more efficient than the others (Shrinkler excluded).

 

I find that lzsa allows a good compromise between decompression speed and compression ratio:

 

lzsa.exe -f2 -r : 1811

lzsa.exe -f1 -r : 1937

 


3 minutes ago, fantômas said:

I don't know what the "EPC" compressor is. But in any case it seems to be more efficient than the others (Shrinkler excluded).

The graph shows that 7z (LZMA?) should have the strongest compression.


Let's check a binary file against selected compressors.


River Raid: 8192    (http://www.atarimania.com/game-atari-400-800-xl-xe-river-raid_4388.html)

 

RLE: 7840
LZ4: 7414

LZSA: 6646
APL: 6366
DEF: 6200
lzip: 6129
Shrinkler: 6020


The results actually match...

Edited by xxl

Same thing as 7-Zip, but with xz, and no archive header overhead:

$ ls -l foo.gfx 
-rw-r--r-- 1 ivo ivo 7680 Feb  1 20:57 foo.gfx
$ xz -9 -Fraw -S.raw -v --lzma1=dict=4096,lc=1,lp=0,pb=0,mode=normal,nice=132,mf=bt4,depth=0 foo.gfx 
foo.gfx (1/1)
  100 %           1,458 B / 7,680 B = 0.190                                    
$ ls -l foo.gfx 
ls: cannot access 'foo.gfx': No such file or directory
$ ls -l foo.gfx.raw 
-rw-r--r-- 1 ivo ivo 1458 Feb  1 20:57 foo.gfx.raw
$ xz -d -Fraw -S.raw -v --lzma1=dict=4096,lc=1,lp=0,pb=0,mode=normal,nice=132,mf=bt4,depth=0 foo.gfx.raw 
foo.gfx.raw (1/1)
  100 %           1,458 B / 7,680 B = 0.190                                    
$ ls -l foo.gfx.raw 
ls: cannot access 'foo.gfx.raw': No such file or directory
$ ls -l foo.gfx 
-rw-r--r-- 1 ivo ivo 7680 Feb  1 20:57 foo.gfx

LZMA also requires quite a lot of memory. So far, I think Shrinkler is the way to go. Perhaps throw a bunch of .car and .xex files at all of those compressors with a shell script/batch file and see how they perform on real Atari 8-bit data.
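The batch idea above can be sketched with Python's standard-library codecs standing in for the real packers (zlib for deflate, bz2, and lzma for the 7z/Lzip family); the Atari-targeted compressors would have to be invoked as external tools instead, and the glob pattern is hypothetical:

```python
import bz2
import lzma
import zlib
from pathlib import Path

# Stand-in codecs; swap in subprocess calls to the real packers as needed.
CODECS = {
    "zlib -9": lambda d: zlib.compress(d, 9),
    "bz2 -9":  lambda d: bz2.compress(d, 9),
    "lzma":    lambda d: lzma.compress(d, format=lzma.FORMAT_ALONE),
}

def benchmark(files):
    """Print the packed size and ratio of every codec for every file."""
    for path in files:
        data = Path(path).read_bytes()
        print(f"{path}: {len(data)} bytes")
        for name, compress in sorted(CODECS.items()):
            packed = compress(data)
            print(f"  {name:8s} {len(packed):6d} ({len(packed) / len(data):.1%})")

# Hypothetical usage: benchmark(Path(".").glob("*.xex"))
```

Sorting the per-file results (or collecting them into a table) would reproduce the ranking in the first post across a whole corpus rather than a single screen.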


5 hours ago, xxl said:

I compared with LZ4 which 8KB decompresses into 11 frames

Just for completeness: "autogamy" (2148 bytes for "Conan") needs 9 frames (DMA off).

River Raid is at 7383 bytes.

 

You can find it in the thread linked in post #3.

 

Edited by Irgendwer

paq8px archiver v187 (c) 2020, Matt Mahoney et al.

Creating archive conan.gfx.paq8px187 in single file mode...

Filename: conan.gfx (7680 bytes)
Block segmentation:
 0           | default          |      7680 bytes [0 - 7679]
-----------------------
Total input size     : 7680
Total archive size   : 1121

Time 1.31 sec, used 1288 MB (1350729492 bytes) of memory

Just for the laughs, 1121 bytes, paq8px archiver.

 

There's no way PAQ8 would run on an Atari (1288 MB of memory used for this, lol), but that should be about the lowest you can get this file with a *general compressor*. PAQ is a consistent winner in compression efficiency among the math fiends who are fascinated by this stuff and are always testing.

 

You might be able to build a content-aware compressor that takes advantage of the format implied by the file and does better. Or, given that it's an image, a GAN-based compression approach that wouldn't be lossless but would be scarily efficient and look nearly the same.

 

Thought it might be interesting to know where the theoretical edge of efficiency lies.


9 hours ago, Irgendwer said:

Just for completeness: "autogamy" (2148 bytes for "Conan") needs 9 frames (DMA off).

 

added

 

3 hours ago, gnusto said:

There's no way PAQ8 would run an Atari (1288MB of memory used for this lol)

 

That much memory is needed to decompress this file? :D

 

PAQ8 wins so far.

