Re: 2.2.6_andrea2.bz2

Hans-Joachim Baader (hans@grumbeer.inka.de)
Sun, 2 May 99 09:25 MET DST


In article <m2iuaca3hw.fsf@blinky.bfr.co.il> you write:
>Trees basically have worse average behavior but better worst case
>behavior. Part of the decision depends on how good a hash function
>you can find, and to what extent you're willing to live with its
>failures.

Did anybody think of using a secondary hash function for collisions?
I guess it's too expensive because it would require calculating
additional hash values in the worst case.
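In case it's unclear what I mean, here is a minimal sketch of a secondary
hash function used for collisions (open addressing with double hashing).
The table layout, constants, and names are made up purely for
illustration; this is not meant as kernel-ready code:

/*
 * Minimal sketch of open addressing with a secondary hash function
 * (double hashing). Everything here is invented for illustration.
 */
#include <stddef.h>

#define TABLE_SIZE 1024                   /* power of two, so we can mask */

struct entry {
        unsigned long key;
        void *data;
        int in_use;
};

static struct entry table[TABLE_SIZE];

static unsigned long hash1(unsigned long key)
{
        return key * 2654435761UL;        /* primary: multiplicative hash */
}

static unsigned long hash2(unsigned long key)
{
        /* secondary: forced odd so the probe sequence visits every slot */
        return ((key * 40503UL) & (TABLE_SIZE - 1)) | 1UL;
}

/* Returns the slot holding 'key', or the free slot where it belongs. */
static struct entry *probe(unsigned long key)
{
        unsigned long idx = hash1(key) & (TABLE_SIZE - 1);
        unsigned long step = hash2(key);
        size_t i;

        for (i = 0; i < TABLE_SIZE; i++) {
                struct entry *e = &table[idx];

                if (!e->in_use || e->key == key)
                        return e;
                idx = (idx + step) & (TABLE_SIZE - 1);
        }
        return NULL;                      /* table is full */
}

Because the table size is a power of two and the step is odd, the probe
sequence eventually touches every slot; the cost is the extra hash
calculation and the probing itself, which is why I suspect it's too
expensive for the kernel's common case.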

Another idea: Currently every bucket in the hash table has to be a
linked list. I haven't tried it, but the linked list could be replaced
by a tree in order to limit the worst case. Even better, a bucket could
stay a linked list as long as only one element is in the bucket (the
most common case). If a second element is added, it becomes the root
of a tree. Additional elements build the tree. If an element is removed
from the tree, the normal tree algorithms apply. If the first element
is removed, and there are other elements in the tree, one tree element
must be moved to become the new first element.

This algorithm would combine the hash table, list, and tree data
structures, so it would require some extra code to handle the special
cases. But I think it would be very fast, because it gets the advantages
of all three while adding nearly no overhead in the most common cases
(buckets with zero or one element). A rough sketch follows below.
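Here is a rough sketch of the "one element stays a plain entry, the
second one starts a tree" bucket. The tree is an unbalanced BST just to
show the state transitions; a real version would want a balanced tree,
and all names are invented for illustration:

/*
 * Sketch of a hybrid bucket: a single cheap slot for the common case,
 * plus an overflow tree once a second element arrives.
 */
#include <stddef.h>

struct node {
        unsigned long key;
        void *data;
        struct node *left, *right;        /* only used inside the tree */
};

struct bucket {
        struct node *first;               /* the common single element */
        struct node *tree;                /* overflow tree root, or NULL */
};

static void tree_insert(struct node **root, struct node *n)
{
        n->left = n->right = NULL;
        while (*root)
                root = (n->key < (*root)->key) ? &(*root)->left
                                               : &(*root)->right;
        *root = n;
}

/* Detach the smallest node of the tree (used to refill 'first'). */
static struct node *tree_take_min(struct node **root)
{
        struct node *n;

        while ((*root)->left)
                root = &(*root)->left;
        n = *root;
        *root = n->right;
        return n;
}

static void bucket_insert(struct bucket *b, struct node *n)
{
        if (!b->first && !b->tree) {      /* common case: empty bucket */
                n->left = n->right = NULL;
                b->first = n;
                return;
        }
        tree_insert(&b->tree, n);         /* second and later elements */
}

static void bucket_remove_first(struct bucket *b)
{
        b->first = NULL;
        if (b->tree)                      /* move a tree node up front */
                b->first = tree_take_min(&b->tree);
}

The empty and single-element cases touch only one pointer, so the
overhead over a plain linked list shows up only when a bucket actually
collides.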

hjb

-- 
"Every use of Linux is a proper use of Linux." -- John "Maddog" Hall
