Re: Decompression Bombs

In-Reply-To: <7DB0958915FDD611961400A0C98F18464E8DB1@WINTRIX.thermeon.com>

I feel.. software should check the actual integrity of the data first, before extracting the archive, instead of completely trusting the header information.

 [this will indeed slow down the decompression process to some extent...]
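
To make that concrete, here is a rough sketch in Python of what I mean (the 100 MB budget, the chunk size and the helper name are just my assumptions, not an existing tool): the decompressor counts the bytes it actually produces and stops at a hard limit, instead of believing whatever size the gzip header/trailer claims.

import zlib

CHUNK = 64 * 1024
MAX_OUTPUT = 100 * 1024 * 1024          # assumed policy limit, not from this thread

def careful_gunzip(path, max_output=MAX_OUTPUT):
    """Decompress a gzip file while counting the real output, ignoring
    whatever size the header/trailer claims."""
    decomp = zlib.decompressobj(16 + zlib.MAX_WBITS)   # 16+: expect a gzip wrapper
    produced = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            # Ask for at most one byte more than the remaining budget, so an
            # overrun is detected without ever materialising the whole bomb.
            data = decomp.decompress(chunk, max_output - produced + 1)
            produced += len(data)
            if produced > max_output:
                raise ValueError("decompression bomb: output limit exceeded")
            # ... hand `data` to the real consumer here ...
    if not decomp.eof:
        raise ValueError("truncated or corrupt gzip stream")
    # zlib itself verifies the trailer CRC32 when the stream ends, so a
    # corrupt member also raises zlib.error instead of passing silently.
    return produced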

But if the decompressor engine is used by anti-virus software etc., it should abort the operation and instead list it in its error logs.
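
Something like the following shows the "abort and log" behaviour I mean for a scanner that embeds a decompressor (again only a sketch; the log file name is made up, and the checked decompressor is passed in as a plain callable, e.g. the careful_gunzip() idea above):

import logging
import zlib

logging.basicConfig(filename="scanner-errors.log", level=logging.ERROR)
log = logging.getLogger("scanner")

def scan_archive(path, decompress):
    # `decompress` is any checked decompressor (e.g. the careful_gunzip()
    # sketch above); it is expected to raise on bombs or corrupt streams.
    try:
        decompress(path)
    except (ValueError, zlib.error) as exc:
        log.error("skipping %s: %s", path, exc)   # abort this file, note it in the log
        return False                              # treat the archive as suspicious
    return True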

But say a file is stored on a disk and the disk cluster that holds the header info of the archive gets corrupted! In that case, the only way to recover the data would be to check the integrity of the archive itself and recover from that.
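
As a hedged illustration of that recovery idea (assuming a plain 10-byte gzip header with no optional fields; real recovery code would have to hunt for the start of the deflate stream): ignore the damaged header completely, decompress the deflate data raw, and judge success by the CRC32 stored in the trailer.

import struct
import zlib

def recover_without_header(path):
    with open(path, "rb") as f:
        blob = f.read()
    deflate_data = blob[10:-8]                      # skip the (corrupt) header, drop the trailer
    crc_stored, = struct.unpack("<I", blob[-8:-4])  # trailer: CRC32 of the original data
    # NOTE: in practice this call should be capped like the earlier sketch,
    # or the recovery path itself becomes a bomb.
    data = zlib.decompressobj(-zlib.MAX_WBITS).decompress(deflate_data)
    if zlib.crc32(data) != crc_stored:
        raise ValueError("recovered data fails the stored CRC32 check")
    return data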

---the only solution could be using small digital signatures in the archive [say, something like a PGP public key]: only a trusted archive should be opened trusting its header... otherwise, the decompressor engine should check the real integrity of the data as well, before proceeding further.---
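
A sketch of that policy (the detached .sig convention, the gpg call and the two open callables are my assumptions, not part of any existing decompressor): verify a detached PGP signature first, and only a successfully verified archive gets the fast, header-trusting path; everything else goes through the fully checked one.

import os
import subprocess

def open_archive(path, trusted_open, untrusted_open):
    # trusted_open / untrusted_open are callables, e.g. a fast reader that
    # trusts header fields vs. the fully checked careful_gunzip() sketch above.
    sig = path + ".sig"
    if os.path.exists(sig):
        result = subprocess.run(["gpg", "--verify", sig, path],
                                capture_output=True)
        if result.returncode == 0:          # signature checks out against the local keyring
            return trusted_open(path)
    return untrusted_open(path)             # unsigned or unverifiable: check everything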

... I do want to know your views, because if you try to write code to secure against such problems, it will certainly slow down normal operation...

[..I feel the world will soon see digital certificates in JPG files (O; ]

--------------------------------------------------------------------- 



>> dd if=/dev/zero of=testfile count=10000&&gzip testfile&&ls -la testfile
>No need to fill up your own disk -- do this instead:
>dd if=/dev/zero bs=1k count=10000 | gzip - > testfile.gz 
>

