OVERVIEW
AGF is a sample pre-processor. It transforms the data into a form with very
little information content, which makes it much easier for compression programs
to pack it down to a small size. For example, a smoothly rising waveform turns
into a stream of small, highly repetitive prediction errors, which a packer can
store far more compactly than the raw samples. AGF combined with GZIP gives an
average compression of 50% and always beats any single compression method used
on its own. It is similar to ADPCM, but better :)
HISTORY
05-09-1999 : Released a version that works properly (more or less)
SUMMARY
AGF - Adaptive Gradient-descent FIR filter.
This is a neural-network-like adaptive FIR filter. The adaptation is
deterministic, which means that the sample can be recovered from the processed
file without needing to save the FIR coefficients to it as well. Adaptation is
done on-line, on a sample-by-sample basis.
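The sketch below illustrates the idea (it is not the actual AGF source; the
filter order, the adaptation rate and the rounding scheme are assumptions made
purely for illustration). The encoder outputs the prediction error of an
LMS-adapted FIR filter; the decoder runs the identical prediction and update,
so both filters stay in lockstep without any coefficients being stored.

  /* Minimal sketch of a deterministic LMS-adapted FIR predictor.
   * ORDER and MU are illustrative values, not the ones AGF uses. */
  #define ORDER 8              /* number of FIR coefficients (assumed) */
  #define MU    1e-9           /* adaptation rate (assumed)            */

  static double w[ORDER];      /* coefficients, start at zero          */
  static double hist[ORDER];   /* last ORDER reconstructed samples     */

  /* Predict the next sample from the reconstructed history. */
  static long predict(void)
  {
      double p = 0.0;
      int i;
      for (i = 0; i < ORDER; i++)
          p += w[i] * hist[i];
      return (long)(p + (p >= 0.0 ? 0.5 : -0.5));  /* round to integer */
  }

  /* Gradient step on the squared error, then shift in the new sample. */
  static void adapt(long err, long sample)
  {
      int i;
      for (i = 0; i < ORDER; i++)
          w[i] += MU * (double)err * hist[i];
      for (i = ORDER - 1; i > 0; i--)
          hist[i] = hist[i - 1];
      hist[0] = (double)sample;
  }

  /* Encoding: store the residual instead of the sample. */
  long encode(long sample)
  {
      long err = sample - predict();
      adapt(err, sample);
      return err;
  }

  /* Decoding: residual + identical prediction restores the sample,
   * and the identical update keeps the coefficients in sync.       */
  long decode(long err)
  {
      long sample = err + predict();
      adapt(err, sample);
      return sample;
  }

Because decode() repeats exactly the same prediction and coefficient update as
encode(), feeding the residual stream back through it reproduces the original
samples bit-for-bit, which is why no filter state has to be saved in the file.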
USAGE
AGF MODE sample processed_sample
The processed sample can then be efficiently packed with any kind of packer.
I recommend xpk (xGZIP or xSQSH). lha/lzx will also do :)
The results are always MUCH better than packing the raw sample directly.
Modes:
x : extract (decode) using a linear ANN
c : compress (encode) using a linear ANN
xd : extract (decode) using a static filter
cd : compress (encode) using a static filter
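For example, a typical round trip might look like this (the file names, and the
use of gzip as the packer, are just for illustration):

  AGF c drums.iff drums.agf      ; pre-process (encode) the sample
  gzip drums.agf                 ; pack the pre-processed file
  gzip -d drums.agf.gz           ; later: unpack ...
  AGF x drums.agf drums.iff      ; ... and decode back to the original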
TODO
Make an xpksublib out of it.
Add options for adjusting the number of coefficients and adaptation rate.
BUGS
Might have problems with samples that clip a lot.
Please send bug reports to olethros@geocities.com with "AGF BUG" as the subject.
SEE ALSO
See dev/basic/gasp.lha for a similar pre-processor in which the adaptive
process is controlled by a Genetic Algorithm.