Deterministic chess algorithm

stvs
Posts: 33
Joined: Thu Jun 10, 2010 9:53 am

Re: Deterministic chess algorithm

Post by stvs » Sat Jan 08, 2011 12:12 am

Just one question: why is it bad for an engine to be non-deterministic? Wouldn't it be interesting to add some randomness to the game?
I think it adds some flavor; isn't it boring to get the same moves in the same positions? I have been a computer chess fan for 25 years, and
from the 8-bit era until now I remember only Chessmaster offering randomness settings. The rating loss wouldn't be so big.
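Just as an illustration of what I mean (a minimal sketch with made-up names, not code from Chessmaster or any real engine): add a small random bonus, in centipawns, to each root move's score before the engine picks its move, so a low setting should only cost a little strength.

    // Hypothetical "randomness" option: jitter each root move's score by
    // 0..randomness_cp centipawns before choosing the move to play.
    #include <algorithm>
    #include <cstdio>
    #include <random>
    #include <vector>

    struct RootMove { const char* san; int score; };

    const RootMove& pick_move(std::vector<RootMove>& moves, int randomness_cp) {
        static std::mt19937 rng{std::random_device{}()};
        std::uniform_int_distribution<int> noise(0, randomness_cp);
        for (auto& m : moves)
            m.score += noise(rng);              // a setting of 0 changes nothing
        return *std::max_element(moves.begin(), moves.end(),
                                 [](const RootMove& a, const RootMove& b) {
                                     return a.score < b.score;
                                 });
    }

    int main() {
        std::vector<RootMove> moves = {{"e4", 25}, {"d4", 24}, {"Nf3", 20}};
        std::printf("playing %s\n", pick_move(moves, 10).san);  // small jitter
    }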

hyatt
Posts: 1242
Joined: Thu Jun 10, 2010 2:13 am
Real Name: Bob Hyatt (Robert M. Hyatt)
Location: University of Alabama at Birmingham

Re: Deterministic chess algorithm

Post by hyatt » Mon Jan 10, 2011 6:50 am

The main thing is that it is a PITA to debug. If you get different answers, how can you verify that a change you made did nothing but speed up the engine? Since that is a common kind of change, you will be hard-pressed to feel confident that it only made you faster and didn't unintentionally affect some part of the search or evaluation...

stvs
Posts: 33
Joined: Thu Jun 10, 2010 9:53 am

Re: Deterministic chess algorithm

Post by stvs » Mon Jan 10, 2011 11:55 pm

hyatt wrote:The main thing is that it is a PITA to debug. If you get different answers, how can you verify that a change you made did nothing but speed up the engine? Since that is a common kind of change, you will be hard-pressed to feel confident that it only made you faster and didn't unintentionally affect some part of the search or evaluation...
I got it. However, as a user I just like some randomness in the game, and it wouldn't hurt the engine's playing strength much.
Since the Spectrum era I have been curious why chess programmers don't add randomness settings.
Stockfish 1.7.1 had this, but since 1.8 it doesn't; I don't know why :( Thanks.

hyatt
Posts: 1242
Joined: Thu Jun 10, 2010 2:13 am
Real Name: Bob Hyatt (Robert M. Hyatt)
Location: University of Alabama at Birmingham

Re: Deterministic chess algorithm

Post by hyatt » Tue Jan 11, 2011 11:54 pm

stvs wrote:
hyatt wrote:The main thing is that it is a PITA to debug. If you get different answers, how can you verify that a change you made did nothing but speed up the engine? Since that is a common kind of change, you will be hard-pressed to feel confident that it only made you faster and didn't unintentionally affect some part of the search or evaluation...
I got it. However, as a user I just like some randomness in the game, and it wouldn't hurt the engine's playing strength much.
Since the Spectrum era I have been curious why chess programmers don't add randomness settings.
Stockfish 1.7.1 had this, but since 1.8 it doesn't; I don't know why :( Thanks.

There is a ton of randomness built in, because timing is a "jittery" thing to measure. Changing the number of nodes searched for a move will produce different moves and scores. Just play two programs against each other at 5 secs/move, from the same starting position with no opening books, and see if you can produce the identical game twice. It is _very_ difficult to do.
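To show where that jitter comes in, here is a minimal sketch (hypothetical code, not taken from any particular engine) of a time-limited search: the stop test reads the wall clock, so machine load, cache state and clock granularity decide how many nodes fit into the 5 seconds, and a different node count can mean a different depth and a different move.

    #include <chrono>

    struct TimedSearch {
        using Clock = std::chrono::steady_clock;
        Clock::time_point start = Clock::now();
        std::chrono::milliseconds budget{5000};   // "5 secs/move"
        long long nodes = 0;
        bool stop = false;

        // Polled every few thousand nodes; whether the deadline has already
        // passed at this exact node varies from run to run, even from the
        // identical position with the identical code.
        void check_time() {
            if ((nodes & 4095) == 0 && Clock::now() - start > budget)
                stop = true;
        }

        int search(int depth) {
            ++nodes;
            check_time();
            if (stop || depth == 0)
                return 0;                          // stand-in for a real evaluation
            // ... generate moves and recurse; here just one child per node ...
            return search(depth - 1);
        }
    };

    int main() { TimedSearch s; return s.search(20); }  // tiny driver so it runs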

Hood
Posts: 200
Joined: Thu Jun 10, 2010 2:36 pm
Real Name: Krzych C.

Re: Deterministic chess algorithm

Post by Hood » Wed Jan 26, 2011 3:05 pm

hyatt wrote:Not "slightly decreased". _significantly_ decreased. Shared hash is a big win.
But there is another advantage if we decide on determinism: assume we are making small adjustments to the algorithm. In that case we are able to observe and estimate the result.
Smolensk 2010. Murder or accident... Cui bono?

There are no bug-free programs. There are programs with undiscovered bugs.
Alleluia.

hyatt
Posts: 1242
Joined: Thu Jun 10, 2010 2:13 am
Real Name: Bob Hyatt (Robert M. Hyatt)
Location: University of Alabama at Birmingham

Re: Deterministic chess algorithm

Post by hyatt » Wed Jan 26, 2011 5:04 pm

Hood wrote:
hyatt wrote:Not "slightly decreased". _significantly_ decreased. Shared hash is a big win.
But there is another advantage if we decide on determinism: assume we are making small adjustments to the algorithm. In that case we are able to observe and estimate the result.
If you want real determinism, you either have to dump the hash table completely, which would really hurt performance, or you have to toss out time per move and use a fixed depth or a fixed number of nodes, neither of which is fair. For testing a simple change, I always take old and new and run a few positions to a fixed depth to see what the change did to the shape/size of the tree. I then cluster-test at normal time controls and play enough games to wash out the non-determinism influence...
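A rough sketch of that fixed-depth comparison (the search callbacks and the position list below are placeholders, not my actual test harness):

    #include <cstdint>
    #include <cstdio>
    #include <functional>
    #include <string>
    #include <vector>

    struct TreeStats { std::uint64_t nodes; std::string best_move; };
    using SearchFn = std::function<TreeStats(const std::string& fen, int depth)>;

    // Search the same positions to the same fixed depth with the old and the
    // new code and print where node count or best move diverge, so the effect
    // of a change on the shape/size of the tree is visible before any games
    // are played.
    void compare_trees(const SearchFn& old_search, const SearchFn& new_search,
                       const std::vector<std::string>& fens, int depth) {
        for (const auto& fen : fens) {
            TreeStats a = old_search(fen, depth);
            TreeStats b = new_search(fen, depth);
            bool same = a.nodes == b.nodes && a.best_move == b.best_move;
            std::printf("%s\n  old %12llu %-6s new %12llu %-6s %s\n", fen.c_str(),
                        (unsigned long long)a.nodes, a.best_move.c_str(),
                        (unsigned long long)b.nodes, b.best_move.c_str(),
                        same ? "" : "<-- differs");
        }
    }

    int main() {
        // Stub searches standing in for the engine before and after the change.
        SearchFn oldv = [](const std::string&, int) { return TreeStats{123456, "e2e4"}; };
        SearchFn newv = [](const std::string&, int) { return TreeStats{118742, "e2e4"}; };
        compare_trees(oldv, newv,
                      {"rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"}, 12);
    }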
