Normalize book weights #1109
Conversation
---
I don't understand how this PR changes anything. Scaling the weights shouldn't matter, since all that matters is their relative sizes: multiplying every weight by the same number should result in the same probabilities. After reading the linked issue, are these changes about making `min_weight` relative to the weights in the book? I think this is equivalent to the change in the PR:

```python
min_weight = polyglot_cfg.min_weight
normalization = polyglot_cfg.normalization
entries = reader.find_all(board)
scalar = (100 if normalization == "none"
          else sum(entry.weight for entry in entries) if normalization == "sum"
          else max(entry.weight for entry in entries))  # normalization == "max"
min_weight = min_weight * scalar / 100
# The rest of the function is unchanged.
if selection == "weighted_random":
    move = reader.weighted_choice(board).move
elif selection == "uniform_random":
    move = reader.choice(board, minimum_weight=min_weight).move
elif selection == "best_move":
    move = reader.find(board, minimum_weight=min_weight).move
```
---
The goal was to normalize the weights of an opening book, as they can be anywhere in the range 0-65535, and even within the same opening book some positions may have a sum of weights that is orders of magnitude larger than others.
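A minimal sketch of that idea, assuming the PR's `"none"`/`"sum"`/`"max"` normalization modes: `min_weight` is treated as a value on a 0-100 scale and converted into an absolute cutoff per position, so the same configured value means the same thing in books of very different magnitudes (the weight lists below are invented).

```python
def absolute_min_weight(weights, min_weight, normalization):
    """Convert a 0-100 min_weight into an absolute cutoff for this position."""
    if normalization == "none":
        scalar = 100
    elif normalization == "sum":
        scalar = sum(weights)
    else:  # "max"
        scalar = max(weights)
    return min_weight * scalar / 100

# One position with large weights, one with small weights:
big = [40000, 20000, 5000]
small = [40, 20, 5]

# With normalization="sum", min_weight=10 means "at least 10% of the total
# weight" in both positions, despite the thousand-fold size difference.
assert absolute_min_weight(big, 10, "sum") == 6500.0
assert absolute_min_weight(small, 10, "sum") == 6.5
```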
---
Is it possible that the maximum score will be negative? Also, how are checkmates handled in books? If they have an infinite score, do the book functions correctly handle that?
---
The weights in polyglot opening books are unsigned 16-bit integers, so they take values from 0 to 65535.
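For reference, a polyglot book entry is 16 bytes in big-endian order: an 8-byte Zobrist key, a 2-byte move, the 2-byte unsigned weight, and a 4-byte learn value. A small sketch of decoding one entry with the standard library (the entry bytes here are packed by hand for illustration, not read from a real book):

```python
import struct

# Big-endian: uint64 key, uint16 move, uint16 weight, uint32 learn.
ENTRY = struct.Struct(">QHHI")

# Pack an illustrative entry whose weight is the maximum possible value.
raw = struct.pack(">QHHI", 0x463B96181691FC9C, 0x0000, 65535, 0)

key, move, weight, learn = ENTRY.unpack(raw)
assert ENTRY.size == 16    # every entry is exactly 16 bytes
assert weight == 65535     # weight is an unsigned 16-bit integer
```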
Type of pull request:

Description:
Adds the ability to normalize the weights of an opening book, so that `min_weight` can be used more easily across different books.

Related Issues:
#1107

Checklist:

Screenshots/logs (if applicable):