author	David S. Miller <davem@davemloft.net>	2011-07-11 10:28:12 +0200
committer	David S. Miller <davem@davemloft.net>	2011-07-11 10:28:12 +0200
commit	cd0893369ca85fd11bc517081b2d9079d2ef2f90 (patch)
tree	e2f46e4b4270c7403a895339df93d9bbefb7c363 /include/net
parent	skbuff: update struct sk_buff members comments (diff)
download	linux-cd0893369ca85fd11bc517081b2d9079d2ef2f90.tar.xz
	linux-cd0893369ca85fd11bc517081b2d9079d2ef2f90.zip
neigh: Store hash shift instead of mask.
And mask the hash function result by simply shifting it down so that only the "->hash_shift" most significant bits remain.

Currently, which bits we use is arbitrary, since jhash produces entropy evenly across the whole hash function result. But soon we'll be using universal hashing functions, and in those cases more entropy exists in the higher bits than in the lower bits, because they use multiplies.

Signed-off-by: David S. Miller <davem@davemloft.net>
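As an illustration only (this is not the kernel's neighbour code, and the helper names are made up), the following userspace C sketch contrasts the two bucket-selection schemes: a mask keeps the low bits of the hash, while a shift keeps the top "->hash_shift" bits, which is where multiply-based universal hashes concentrate their entropy.

/*
 * Illustrative sketch only -- not the kernel's actual neighbour code.
 * bucket_by_mask() shows the old scheme (low bits of the hash),
 * bucket_by_shift() the new one (top hash_shift bits).
 */
#include <stdint.h>
#include <stdio.h>

static unsigned int bucket_by_mask(uint32_t hash, unsigned int hash_mask)
{
	return hash & hash_mask;		/* old: index from the low bits */
}

static unsigned int bucket_by_shift(uint32_t hash, unsigned int hash_shift)
{
	return hash >> (32 - hash_shift);	/* new: index from the top hash_shift bits */
}

int main(void)
{
	uint32_t hash = 0xdeadbeef;		/* stand-in for a jhash/universal hash result */
	unsigned int shift = 3;			/* 1 << 3 == 8 buckets */

	printf("mask  -> bucket %u\n", bucket_by_mask(hash, (1u << shift) - 1));
	printf("shift -> bucket %u\n", bucket_by_shift(hash, shift));
	return 0;
}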
Diffstat (limited to 'include/net')
-rw-r--r--	include/net/neighbour.h	2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/include/net/neighbour.h b/include/net/neighbour.h
index 4014b623880c..6fe8c2cd5acb 100644
--- a/include/net/neighbour.h
+++ b/include/net/neighbour.h
@@ -142,7 +142,7 @@ struct pneigh_entry {
 
 struct neigh_hash_table {
 	struct neighbour __rcu	**hash_buckets;
-	unsigned int		hash_mask;
+	unsigned int		hash_shift;
 	__u32			hash_rnd;
 	struct rcu_head		rcu;
 };
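For context, here is a hedged sketch of how code holding this structure might size and index the table once it stores a shift instead of a mask. The struct and helper names below are hypothetical and do not come from this patch; the point is only that the bucket count follows from 1 << hash_shift and the bucket index comes from the top bits of the hash.

/* Hypothetical sketch: table sizing and lookup driven by hash_shift. */
#include <stdint.h>
#include <stdlib.h>

struct toy_hash_table {
	void		**hash_buckets;
	unsigned int	hash_shift;	/* number of buckets == 1 << hash_shift */
};

static struct toy_hash_table *toy_table_alloc(unsigned int shift)
{
	struct toy_hash_table *tbl = malloc(sizeof(*tbl));

	if (!tbl)
		return NULL;
	tbl->hash_shift = shift;
	tbl->hash_buckets = calloc(1UL << shift, sizeof(void *));
	if (!tbl->hash_buckets) {
		free(tbl);
		return NULL;
	}
	return tbl;
}

static void **toy_table_bucket(struct toy_hash_table *tbl, uint32_t hash)
{
	/* Pick the bucket from the top hash_shift bits of the 32-bit hash. */
	return &tbl->hash_buckets[hash >> (32 - tbl->hash_shift)];
}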