constant bitrate approach with lossless data compression on an FPGA

Kurt Kaiser

Hi folks,

I've implemented a lossless data compression algorithm (a somewhat
experimental, proprietary scheme) in VHDL. To buffer the non-constant data
output I've chosen a FIFO in both the encoder and the decoder, although I'm
not very happy with that: because of the high variance, sooner or later the
FIFOs will underflow or overflow. Is there a common practice or trick for
enforcing a certain constant bitrate over a given period?

Any ideas? Answers and tips are much appreciated.



Regards Kurt
 
Michael Schöberl

Kurt Kaiser said:
I've implemented a lossless data compression algorithm (a somewhat
experimental, proprietary scheme) in VHDL. To buffer the non-constant data
output I've chosen a FIFO in both the encoder and the decoder, although I'm
not very happy with that: because of the high variance, sooner or later the
FIFOs will underflow or overflow. Is there a common practice or trick for
enforcing a certain constant bitrate over a given period?

Well ... this is called rate control.

To prevent overflow:
You would need to make your algorithm lossy (at least in the extreme cases
where it would produce too much data). This could be done with some
adjustable quantization, but that depends on your algorithm.

To prevent underflow:
Just don't read anything from your FIFO if there is not enough data. Or is
your output some kind of stream with a fixed data rate? Then you could
build "packets" and mark some of them as invalid, as sketched below.
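
(Not your format, obviously - just a minimal sketch of the fill-word idea,
assuming a first-word-fall-through FIFO and one out-of-band valid flag per
output word; every name here is made up:)

library ieee;
use ieee.std_logic_1164.all;

entity cbr_output is
  port (
    clk        : in  std_logic;
    fifo_empty : in  std_logic;
    fifo_data  : in  std_logic_vector(7 downto 0);
    fifo_rd    : out std_logic;
    tx_data    : out std_logic_vector(7 downto 0);
    tx_valid   : out std_logic  -- '0' marks a fill ("invalid") word
  );
end entity cbr_output;

architecture rtl of cbr_output is
begin
  -- One word leaves every clock cycle, so the line rate stays constant;
  -- the valid flag tells the decoder which words to discard.
  process (clk)
  begin
    if rising_edge(clk) then
      if fifo_empty = '0' then
        tx_data  <= fifo_data;
        tx_valid <= '1';
      else
        tx_data  <= (others => '0');  -- arbitrary fill pattern
        tx_valid <= '0';
      end if;
    end if;
  end process;

  -- Pop only when a word is present; assumes a first-word-fall-through
  -- FIFO, i.e. fifo_data is already valid while fifo_empty = '0'.
  fifo_rd <= not fifo_empty;
end architecture rtl;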


bye,
Michael
 
Kurt Kaiser

Michael Schöberl said:
Or is your output some kind of stream with a fixed data rate?

Unfortunately it is. It HAS to be lossless and it's based on streaming data.
 
Thomas Richter

Kurt said:
I've implemented a lossless data compression algorithm (a somewhat
experimental, proprietary scheme) in VHDL. To buffer the non-constant data
output I've chosen a FIFO in both the encoder and the decoder, although I'm
not very happy with that: because of the high variance, sooner or later the
FIFOs will underflow or overflow. Is there a common practice or trick for
enforcing a certain constant bitrate over a given period?

Any ideas? Answers and tips are much appreciated.

For overflow: I'm afraid there's nothing you can do. If you cannot compress
the data to the target rate you promised, and lossless is your only option,
then the code must fail. If you feed a high-entropy source into the codec,
you cannot expect a high compression ratio, no matter what.

What you can possibly do is reserve an escape symbol that lets you send the
data uncompressed if the buffer becomes too full. This bounds the worst-case
expansion, and with it the latency, at the encoder side. If you then have a
limited-bandwidth connection to send the data across - well, nothing can be
done at all.

For buffer underflow: that's easier. Just send another escape symbol
("line fill") to the receiver to have it stand by for more data.

So long,
Thomas
 
KJ

Kurt Kaiser said:
Unfortunately it is. It HAS to be lossless and it's based on streaming
data.
Then you need to:
1. Compute a guaranteed upper bound on the amount of output data produced
per input in your worst-case scenario.
2. Drain the FIFO fast enough, per the above computation, that it cannot
overflow.
3. If the output side has flow control, then when you perform #1 you also
have to take into account the worst-case amount of time the output may be
saying 'not ready', plus any latency in your logic in responding once it
finally does say 'ready' (see the sizing sketch below).
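
To make #1-#3 concrete, here is a small elaboration-time sanity check;
every number in it is a placeholder, not something derived from your
design:

entity rate_budget_check is
end entity rate_budget_check;

architecture chk of rate_budget_check is
  -- Made-up figures for illustration only.
  constant WC_BITS_PER_SAMPLE : natural := 9;    -- worst case from #1
                                                 -- (e.g. raw byte + escape)
  constant SAMPLE_RATE_MHZ    : natural := 100;  -- input sample rate
  constant LINK_RATE_MBPS     : natural := 1000; -- output link capacity
  constant STALL_CYCLES_MAX   : natural := 64;   -- worst 'not ready' (#3)
begin
  -- #2: the drain rate must cover worst-case production, or the FIFO
  -- eventually overflows no matter how deep it is.
  assert LINK_RATE_MBPS >= WC_BITS_PER_SAMPLE * SAMPLE_RATE_MHZ
    report "link is slower than worst-case coder output"
    severity failure;

  -- #3: the FIFO must also absorb the longest stall at worst-case fill.
  assert false
    report "minimum extra FIFO depth: "
           & natural'image(STALL_CYCLES_MAX * WC_BITS_PER_SAMPLE) & " bits"
    severity note;
end architecture chk;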

KJ
 
John_H

I'm an FPGA guy who has flirted with compression and decompression in the
past, so I have a good feel for your world. First, you must know that you
cannot compress all of the data all of the time. The comment that you'll
have to be able to go lossy at some point is valid, whether it's an
algorithm that tosses out some of the less significant information or the
link "comes down" until the data is back to normal. If your data set is
typically well behaved, you've got a good shot at keeping things lossless
most of the time, but there can always be a problem case: imagine 100
hawaiian-shirt-wearing leprechauns dancing across a teleconference screen.
The movement and virtual randomness of those colorful patterns across the
majority of the image will play havoc with an encoder that expects a static
background image.

If you can figure out the absolute worst-case scenario that your lossless
algorithm has to keep up with, you have your speed. If you want all random
data to compress losslessly at a fixed streaming rate, it cannot happen:
there is no FIFO big enough to guarantee you can cover all situations
losslessly without having a *higher* output rate than your input.
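
To put a number on that *higher* output rate (the 1024-bit block size is an
arbitrary example):

  with a 1-bit compressed/raw flag per 1024-bit block, the worst case is

    output per block      <= 1024 + 1 = 1025 bits
    required output rate  >= input rate * 1025/1024   (~0.1% headroom)

  and the headroom cannot be argued away: there are 2^1024 possible input
  blocks but only 2^1024 - 1 bit strings shorter than 1024 bits, so at
  least one block must expand.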
 
