Hi Sachin, Sergei!

One quick thing I wanted to point out. I haven't looked specifically at how these get called, but when defining constants I don't agree with:

> +#define HA_HASH_STR_LEN                 strlen(HA_HASH_STR)
> +#define HA_HASH_STR_INDEX_LEN           strlen(HA_HASH_STR_INDEX)

This hides a strlen() call behind something that looks like a constant. Better to make it a real compile-time constant. Perhaps the compiler is smart enough to optimize the call away, but why risk it?
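For illustration, here's a minimal sketch of the compile-time alternative. The "hash" value is a placeholder, not the actual string from the patch; sizeof on a string literal is evaluated at compile time and includes the terminating '\0', hence the -1:

```c
#include <string.h>

/* Placeholder value for illustration only; the real definition of
 * HA_HASH_STR lives in the patch. */
#define HA_HASH_STR      "hash"

/* sizeof(HA_HASH_STR) is a compile-time constant that counts the
 * trailing '\0', so subtract one to get the same result as strlen(). */
#define HA_HASH_STR_LEN  (sizeof(HA_HASH_STR) - 1)
```

This keeps the length in lockstep with the string while never calling strlen() at runtime.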

Another question is why not define them as const char * and const int? That also helps during debugging, since you can do:

(gdb) print HA_HASH_STR_INDEX_LEN

I know a lot of the code uses #define, but why not enforce a bit of type safety while we're at it?
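A sketch of that typed alternative, again with a placeholder value rather than the one from the patch (I've used size_t for the length, since that's what strlen() returns):

```c
#include <stddef.h>

/* Hypothetical typed constants; the names mirror the defines in the
 * patch, the value "hash" is a placeholder. Unlike preprocessor
 * defines, these are real objects with addresses and types, so a
 * debugger can print and inspect them directly. */
static const char *const HA_HASH_STR     = "hash";
static const size_t      HA_HASH_STR_LEN = sizeof("hash") - 1;
```

Being real symbols, they show up in gdb without needing macro debug info (-g3/-gdwarf with macro support), which plain #defines require.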

Just my 2 cents, feel free to disagree. :)