marcelk wrote:If the search employs the null move heuristic[2] such move will be considered an irreversible move that resets hm to zero when it is made on the internal board.
You might want to stress that one can still have a "repetition" across a null move (null Nf6 null Ng8), but that these should be ignored for the purposes here. [For that matter, you might mention castling rights: losing them is "reversible" for the 50-move rule, but not for threefold repetition (see this thread).]
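Concretely, the convention would look something like this (a toy sketch; the names are mine, not the paper's):

```python
# Toy sketch (my names, not the paper's): the null move is treated as
# irreversible, so it resets hm, and positions before it can never be
# reported as repetitions -- null Nf6 null Ng8 is silently ignored.

def has_repetition(key_stack, hm):
    """key_stack: Zobrist keys from the root to the current position (last).
    hm: plies since the last irreversible move (captures, pawn moves,
    and, per the convention above, null moves)."""
    current = key_stack[-1]
    # A repetition needs an even ply distance of at least 4.
    for d in range(4, hm + 1, 2):
        if key_stack[-1 - d] == current:
            return True
    return False
```

With hm reset to zero by the null move, the loop body never runs, so no stale "repetition" can be found.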
marcelk wrote:More importantly, in the nodes of interest presumably a complete move generation has been done already because another reversible move is already being searched from there and many move generators produce all such moves at once. (3.1, page 5)
I'm not sure about this: hash moves and killers are often reversible, yet aren't they usually tried before the "ordinary" moves, without a full move generation having been done?
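To illustrate the point, a typical staged move loop looks roughly like this (a sketch with toy names; real engines differ in the details):

```python
# Sketch of a staged move loop (toy names): the hash move and killers are
# tried before any full move generation, so at nodes where they cut off,
# no complete move list is ever built.

class ToyPosition:
    """Stand-in for a real position; just enough for the sketch."""
    def __init__(self, moves):
        self.moves = moves
    def is_pseudo_legal(self, m):
        return m in self.moves
    def generate_all(self):
        return list(self.moves)   # in a real engine, the expensive step

def picked_moves(pos, tt_move, killers):
    if tt_move and pos.is_pseudo_legal(tt_move):
        yield tt_move                         # stage 1: hash (TT) move
    for m in killers:                         # stage 2: killer moves
        if m != tt_move and pos.is_pseudo_legal(m):
            yield m
    for m in pos.generate_all():              # stage 3: full generation
        if m != tt_move and m not in killers:
            yield m
```

If stage 1 or 2 produces a cutoff, stage 3 never executes at that node.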
marcelk wrote:The problem of a regular hash table is that it either needs a lot of space to prevent collisions, or it needs an arbitrary number of probes.
I've used hash tables often in my own programming and have never found the cuckoo solution all that useful. Is keeping the worst case small really that important? Otherwise something simple like a "rolling" hash [linear probing with step 1, I guess] has good average performance: if slot X is filled, use X+1, and the average lookup takes roughly 1/(1-density) probes (the exact figure depends on clustering). Admittedly, capping the probe limit probably has some branch-prediction benefits. Also, in my applications (math/chess/...), I usually can't precompute all the data and instead receive it on an ongoing basis (e.g., if the data is found, return the hash key; otherwise store the data in the hash), and in one application [due to memory constraints] I ran far above 50% density: I ended up with 7636282800 entries in 2^33 bins [8 bytes each, so 64GB], i.e. 89% full. You discuss this partially in 3.2, but I think you could say a bit more [maybe bullet-point the desirable features compared to alternatives] about why cuckoo might be the solution of choice in this situation (other than having a cutesy name).
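For reference, the property that presumably makes cuckoo attractive here is the hard two-probe bound on lookups, regardless of load. A minimal sketch (toy hash functions and my names, not the paper's):

```python
# Cuckoo hashing sketch: every lookup touches at most two slots, however
# full the tables are. Insertion may displace ("kick out") occupants; a
# real implementation rehashes or grows when a kick chain runs too long.

SIZE = 1 << 10

t1 = [None] * SIZE
t2 = [None] * SIZE

def h1(k):
    return k % SIZE            # toy hash; real tables use strong hashes

def h2(k):
    return (k // SIZE) % SIZE  # toy second hash, independent of h1

def lookup(k):
    # Worst case: exactly two probes.
    return t1[h1(k)] == k or t2[h2(k)] == k

def insert(k, max_kicks=32):
    for _ in range(max_kicks):
        i = h1(k)
        k, t1[i] = t1[i], k    # place k in table 1, evict the occupant
        if k is None:
            return True
        j = h2(k)
        k, t2[j] = t2[j], k    # move the evictee to table 2
        if k is None:
            return True
    return False               # cycle/overflow: a real table rehashes here
```

In contrast, a linearly probed table at high density has no such bound on an individual probe chain.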
Also, what is the genesis of Figure 1 [the correlation of 15s versus 300s searches]? I have been wondering about such data for some time but couldn't find anything recent (my recollection is that an old ICCA article gave a predictive model, but I can't find it).
You might also mention that the idea bears some (vague) similarity to enhanced transposition cutoffs, though it seems the technical details of your repetition scheme are not found there.