Compression Bombs

*Description*

See [CWE-409](http://cwe.mitre.org/data/definitions/409.html). libpng also has a [great discussion](http://libpng.sourceforge.net/decompression_bombs.html) of decompression bombs.

Similar to an XML bomb, compression bombs are primarily used for denial-of-service attacks: they fill up RAM or hard disk space. At a minimum, this crashes the process and causes denial of service. Poorly handled crashes, however, can also cause integrity problems (e.g. data corruption) or confidentiality problems (e.g. core dumps). What makes compression bombs especially challenging is how ubiquitous compression is and how hard the input is to validate: if you are doing input validation, you probably need to decompress first, so your decompression library is on the front lines of your attack surface.
*Further details & Example*

How does it work? A zip (compression/decompression) bomb is usually a small file, for ease of transport and to avoid suspicion. When the file is unpacked, however, its contents are more than the system can handle.

The infamous example is 42.zip, only 42,374 bytes zipped. The file contains 16 zipped files, each of which contains 16 zipped files, each of which contains 16 zipped files, each of which contains 16 zipped files, each of which contains 16 zipped files, each of which contains 1 file with a size of 4.3 GB. So, if you extract all the files, you will most likely run out of space :-)

16 x 4,294,967,295 = 68,719,476,720 (~68 GB)
16 x 68,719,476,720 = 1,099,511,627,520 (~1 TB)
16 x 1,099,511,627,520 = 17,592,186,040,320 (~17 TB)
16 x 17,592,186,040,320 = 281,474,976,645,120 (~281 TB)
16 x 281,474,976,645,120 = 4,503,599,626,321,920 (~4.5 PB)
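The lopsidedness of such an archive is visible even before extraction: the zip central directory records a declared uncompressed size for every member. Below is a minimal first-pass screen in Python using only the standard-library `zipfile` module; the thresholds are hypothetical policy values, and since the declared sizes are attacker-controlled, this is only a cheap sanity check, not a real defence (see the mitigations below).

```python
import zipfile

MAX_DECLARED_BYTES = 100 * 1024 * 1024  # hypothetical policy: reject > 100 MB declared output
MAX_DECLARED_RATIO = 100                # hypothetical policy: reject > 100:1 declared ratio

def looks_suspicious(path: str) -> bool:
    """First-pass screen based on the archive's own (attacker-controlled) metadata."""
    with zipfile.ZipFile(path) as zf:
        members = zf.infolist()
        declared = sum(info.file_size for info in members)
        stored = sum(info.compress_size for info in members) or 1
        if declared > MAX_DECLARED_BYTES or declared / stored > MAX_DECLARED_RATIO:
            return True
        # Nested archives (42.zip is five layers deep) hide their real expansion
        # from any single-layer check, so flag zip members inside the zip as well.
        return any(info.filename.lower().endswith(".zip") for info in members)
```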
What to do?

*Mitigations*

- Technically, a decompression library can mitigate this problem by keeping count of how many bytes have been decompressed and throwing an exception when that count exceeds a limit (the same mitigation used against XML bombs). In practice, this feature often does not exist in decompression libraries (sadly). Look for such limits in the libraries that you use; if they are missing, you can enforce one yourself, as in the byte-counting sketch at the end of this section.
- Avoid inputs where an arbitrary number of rounds of compression is allowed (e.g. this is possible with HTTP headers).
- Distrustful decomposition plus strict system resource limits can mitigate this too. For example, a PNG file that gets server-side processing might be processed in a separate process with tight memory-consumption constraints (see the process-isolation sketch at the end of this section). This adds complexity to your design and introduces concurrency complexities.

*Notes*

- Compression is everywhere. HTTP responses can be compressed at the web server-to-browser level (unbeknownst to web app developers). PNG and JPEG files are susceptible to bombing. MS Office documents are simply zip files of XML.
- Testing for this is very easy: just create a bomb with blank data and compress it heavily. Some compression tools might prevent you from over-compressing, so look up the maximum ratios of your compression algorithms instead of just trusting the compression tool.
- Due to the super-high ratios achieved by modern compression bombing, simply limiting the compressed input size is NOT a feasible approach.
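Where the library offers no built-in limit, you can impose one yourself by streaming the decompressed output and counting bytes as they are produced, instead of trusting the sizes declared in the archive. A minimal sketch of that byte-counting mitigation, in Python with the standard-library `zipfile` module; the cap, the exception class, and the function names are hypothetical choices for illustration.

```python
import zipfile

MAX_OUTPUT_BYTES = 50 * 1024 * 1024  # hypothetical hard cap on total decompressed output
CHUNK_SIZE = 64 * 1024

class DecompressionBombError(Exception):
    """Raised when an archive member expands past the configured cap."""

def extract_member_with_limit(archive_path: str, member: str, dest_path: str) -> int:
    """Stream-decompress one member, aborting as soon as the cap is exceeded."""
    written = 0
    with zipfile.ZipFile(archive_path) as zf, \
         zf.open(member) as src, open(dest_path, "wb") as dst:
        while True:
            chunk = src.read(CHUNK_SIZE)
            if not chunk:
                break
            written += len(chunk)
            if written > MAX_OUTPUT_BYTES:
                # Count actual output bytes; never trust the sizes declared in the archive.
                raise DecompressionBombError(
                    f"{member!r} produced more than {MAX_OUTPUT_BYTES} bytes")
            dst.write(chunk)
    return written
```

The same pattern works for raw deflate/zlib streams: `zlib.decompressobj()` accepts a `max_length` argument to `decompress()`, which lets you bound the output of each step and stop early.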
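For the distrustful-decomposition route, the untrusted decompression or image-processing step can run in a child process whose memory is capped by the operating system, so a bomb can only take down that child rather than the whole service. A Unix-only Python sketch; the `process_png` command and the 512 MB limit are hypothetical placeholders.

```python
import resource
import subprocess

def _cap_memory() -> None:
    # Runs in the child just before exec: cap its address space at ~512 MB (hypothetical limit).
    limit = 512 * 1024 * 1024
    resource.setrlimit(resource.RLIMIT_AS, (limit, limit))

# Process the untrusted PNG in an isolated, resource-limited child process.
subprocess.run(
    ["process_png", "untrusted.png", "thumbnail.png"],  # hypothetical helper program
    preexec_fn=_cap_memory,  # Unix-only hook, applied in the child before exec
    timeout=30,              # also bound wall-clock time, not just memory
    check=True,
)
```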