Hi Sachin,
Sergei's suggestion of the STRING_WITH_LEN macro, or sizeof("string"), should
fix the problem you're raising.
Regards,
Vicentiu
On Fri, 26 Aug 2016 at 18:11 Sachin Setia
Hi Vicențiu
Thanks, Vicențiu, for your comment. I agree with you, but defining #define HA_HASH_STR_LEN 4 or const int HA_HASH_STR_LEN = 4;
decouples the constant from the actual length of "hash". Since we are rarely going to change "hash", though, I think it is a good idea. What do you think, Sergei?
Regards,
Sachin
On Fri, Aug 26, 2016 at 7:26 PM, Vicențiu Ciorbaru wrote:
Hi Sachin, Sergei!
One quick thing I wanted to point out. I did not specifically look at how things get called, but, when defining constants, I don't agree with:
+#define HA_HASH_STR_LEN strlen(HA_HASH_STR)
Or:
+#define HA_HASH_STR_INDEX_LEN strlen(HA_HASH_STR_INDEX)
This hides an underlying strlen call. Better to make it a real constant value. Perhaps the compiler is smart enough to optimize it away, but why risk it?
Another point: why not define them as const char * and const int? This also helps during debugging, as you can do:
(gdb) print HA_HASH_STR_INDEX_LEN
I know that a lot of the code makes use of defines with #define, but why not enforce a bit of type safety while we're at it?
Just my 2 cents, feel free to disagree. :) Vicentiu