Hash collision vulnerability in popular web server-side technologies

HTTP request headers are typically parsed by the web server into a hash table. If the technology in use relies on a deterministic hashing scheme, it is vulnerable to a hash-collision-based denial-of-service (DoS) attack.

No matter how we choose our hash function, it is always possible to devise a set of keys that all hash to the same slot, degrading lookups from O(1) to O(n) — and processing n such keys to O(n^2). This is the basic idea behind a hash-collision-based DoS attack.
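A minimal sketch of the attack idea, using a toy modular hash (the table size and key construction here are illustrative assumptions, not any real server's implementation): if the bucket index is computed deterministically as `key % table_size`, anyone who knows `table_size` can generate arbitrarily many keys that land in the same bucket.

```python
# Toy deterministic placement: bucket = key % table_size.
# table_size and the key pattern are illustrative, not real server code.
table_size = 1024

# Keys of the form i*table_size + 7 all collide in bucket 7, so a
# chained hash table degenerates into a linked list for these keys.
colliding_keys = [i * table_size + 7 for i in range(1000)]
buckets = {k % table_size for k in colliding_keys}
print(buckets)  # {7} -- a single bucket absorbs every key
```

With a thousand such keys in one request, each insertion scans the ever-growing chain, which is exactly the quadratic blow-up the advisories below describe.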

To circumvent this, we can "randomize" the choice of hash function. In other words, we change the hash scheme from

    h = hash(key)

to h = hash(key, r), where r is a random number.

The random input r selects one function from a family of hash functions, and the key is then fed into that function. The overall scheme is therefore randomized with respect to the input key: an attacker who does not know r cannot precompute a set of colliding keys.

For more info on this vulnerability, see http://www.nruns.com/_downloads/advisory28122011.pdf

Microsoft published an advisory on this issue as well: http://technet.microsoft.com/en-us/security/advisory/2659883


The idea of "universal hashing" is to design a family of hash functions, each of which is a well-defined hash function, and to pick one at random at runtime.

For integers a popular family is h(k) = (a*k + b) mod p, where p is a predefined prime larger than any possible key, and a, b are integers modulo p chosen uniformly at random (with a != 0); the result is then reduced modulo the table size.
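A sketch of that family in Python; the choices of p and the table size m below are illustrative assumptions:

```python
import random

# p: a prime larger than any key we expect to hash (illustrative choice).
p = 2**31 - 1

# Draw one member of the family at startup: a uniform in 1..p-1, b in 0..p-1.
a = random.randrange(1, p)
b = random.randrange(0, p)

def universal_hash(k: int, m: int) -> int:
    # Classic Carter-Wegman construction: ((a*k + b) mod p) mod m.
    return ((a * k + b) % p) % m
```

Because a and b are drawn fresh each run, any fixed pair of distinct keys collides with probability at most about 1/m, regardless of how the keys were chosen.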

For more on universal hashing (e.g. hash families for strings), see http://en.wikipedia.org/wiki/Universal_hashing

posted @ 2013-01-13 23:49  qsort