Re: standard for steganography?
On Mon, 28 Feb 1994, Norman Hardy wrote:
> Has anyone done statistical studies of low bits of pixels or sound samples?
> I suspect that they are often far from random. A flat 50% distribution in
> the low bits might stand out like a sore thumb. I can imagine the low
> bit being distributed in a way that depends on such things as the next-to-low
> bits or 60-cycle power at the recorder. Some A/D converters are known to
> produce 60% ones or some such. Like mechanical typewriters, A/D systems
> probably have their own idiosyncrasies. Given a flat stream of cipher data,
> there are techniques to reversibly introduce such variations to mimic the
> biases of real A/D converters without much data expansion.
>
> It is my wild guess and conjecture that, with such statistical variation
> built in, there would be no effective statistical test to show that a given
> file contains hidden messages.
>
>
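To see how the sore thumb would stick out: an analyst only has to count the
ones in the low bits and compare them against a 50/50 split. A minimal sketch
in Python (the function name and the numbers are my own illustration, not
from any published study; the 60%-ones figure is just your example):

    import math

    def lsb_bias_zscore(samples):
        """Z-score of the ones-count in the low bits of integer samples.
        A value near 0 means a suspiciously flat 50/50 split; a recorder
        whose converter emits ~60% ones would sit far from 0."""
        n = len(samples)
        ones = sum(s & 1 for s in samples)
        expected = n / 2.0
        stddev = math.sqrt(n * 0.25)        # binomial std dev at p = 0.5
        return (ones - expected) / stddev

    # 10,000 samples at 60% ones give z of about 20; embedded cipher data,
    # being flat, gives z near 0 -- exactly the anomaly described below.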
Yes, pure white noise would be anomalous. I have suggested that one use
a Mimic function with a "garbage grammar". Implemented correctly, it should
withstand statistical analysis.
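Roughly what I have in mind, as a toy only (the grammar and word lists below
are invented for illustration; a real garbage grammar would be far larger,
with its productions weighted to match the statistics of ordinary text):

    # Ciphertext bits choose which production of a "garbage grammar" to
    # expand, so the output reads as (nonsense) prose rather than noise.
    GRAMMAR = {
        "S":    [["NOUN", "VERB", "NOUN", "."], ["NOUN", "VERB", "."]],
        "NOUN": [["the cat"], ["a dog"], ["the spy"], ["my radio"]],
        "VERB": [["sees"], ["hides"], ["records"], ["ignores"]],
    }

    def take_bits(bits, k):
        """Pop k bits from the front of the list, return them as an int."""
        val = 0
        for _ in range(k):
            val = (val << 1) | (bits.pop(0) if bits else 0)
        return val

    def mimic_encode(bits, symbol="S"):
        """Expand `symbol`, letting ciphertext bits pick each alternative.
        Alternative counts are powers of two, so the choices (and hence the
        bits) are recoverable by parsing the sentence back again."""
        choices = GRAMMAR[symbol]
        width = (len(choices) - 1).bit_length()
        picked = choices[take_bits(bits, width) % len(choices)]
        words = []
        for part in picked:
            words.append(mimic_encode(bits, part) if part in GRAMMAR else part)
        return " ".join(words)

    # Example: mimic_encode([1, 0, 1, 1, 0]) -> "a dog records ."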
What is an A/D converter? And what are the techniques you speak of that
mimic those A/D converters?
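My guess at the sort of technique meant (only a sketch I made up to
illustrate the idea, not necessarily what Norman has in mind): expand each
4 bits of flat cipher data into a 5-bit pattern drawn from a table whose
patterns average 60% ones. The mapping is reversible by table lookup, at the
cost of 25% expansion; arithmetic-coding tricks could do the same job with
less expansion but take longer to show.

    def build_table():
        """16 distinct 5-bit patterns containing 48 ones in total,
        i.e. exactly 60% ones on average (an invented target)."""
        by_pop = {k: [p for p in range(32) if bin(p).count("1") == k]
                  for k in range(6)}
        return by_pop[3] + by_pop[4][:4] + by_pop[1][:2]

    def to_bits(data):
        """Bytes -> list of bits, most significant bit first."""
        return [(b >> i) & 1 for b in data for i in range(7, -1, -1)]

    def bias_encode(cipher_bytes):
        """Flat cipher data -> bit stream averaging 60% ones."""
        table = build_table()
        bits, out = to_bits(cipher_bytes), []
        for i in range(0, len(bits), 4):
            nib = 0
            for b in bits[i:i + 4]:
                nib = (nib << 1) | b
            out.extend((table[nib] >> j) & 1 for j in range(4, -1, -1))
        return out

    def bias_decode(biased_bits):
        """Inverse lookup: recover the original flat cipher bits."""
        inverse = {pat: i for i, pat in enumerate(build_table())}
        out = []
        for i in range(0, len(biased_bits), 5):
            pat = 0
            for b in biased_bits[i:i + 5]:
                pat = (pat << 1) | b
            nib = inverse[pat]
            out.extend((nib >> j) & 1 for j in range(3, -1, -1))
        return out

The ones-density comes out right, but the bits inside each 5-bit group are
not independent, so an analyst looking at joint statistics could still spot
the block structure; that is where the dependence on neighbouring bits that
Norman mentions would have to be modelled as well.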
Sergey