

Blondie24 by David B. Fogel

Book Review, © Copyright 2003, Jim Loy

Blondie24 is a checkers program that uses neural nets to teach itself to play checkers, with the help of its opponents. Blondie has apparently learned to play losing openings, if the example games are any indication. Blondie has a 2000 (Expert) rating on Microsoft's Gaming Zone, and, again judging from the example games, 2000 on the Gaming Zone would appear to be "rank beginner." I expect that many of those players play very well when they crank up their computer checkers programs, but they didn't do that in this book. There are three or four relatively good games here, including a nice win against the WWW version of Chinook, but most games are riddled with mistakes. Here is an example (which I will annotate briefly):

Neural Net - human opponent (rating 2207 "master")
11-16 22-18 16-19 24-15 10-19 23-16 12-19 25-22 7-10 21-17?? (Dr. Fogel correctly says, "Not a good move for a master") 9-14 18-9 5-21 22-18 8-11 27-24 11-16 24-15 10-19 31-27 3-7 18-15 1-5 29-25 4-8 27-24 6-9? 24-20 8-12 20-11 7-16 15-11 19-23 26-19 16-23

Let me stop here. Is there a checkers player anywhere on earth who doesn't see the need for 25-22 in this position? Surprisingly, it would seem that 25-22 draws. Only a beginner would play 11-8???, which is what White did. Then 2-6??? (instead of 23-26), then 8-3???, and then Blondie moved 23-26 with a win. The game ended in a draw with three kings vs. two kings, because Blondie has no endgame database. Most programs would win such an ending without an endgame database. But Blondie's neural nets have not taught her (it?) to crowd the opponent when ahead in an ending. Then what good are these neural nets?

I'll bet there is much of interest about neural nets for the programmer here. But Blondie's opponents have let her get away with opening mistakes like 11-16 23-19? to the point that the neural nets probably recommend that move as best. Of course, this opening is difficult for Red to win (apparently impossible if White plays very accurately, which is not the case here), but Blondie never seems to get punished for bad moves. The example games are mostly of very poor beginner quality.


One reader commented:

I am admittedly a CS student with absolutely no knowledge of checkers (except for the rules). However, I noticed in your book review of Blondie24 by David Fogel that the game you chose to quote is one of the games of Fogel and Kumar's second neural network (rated 1914.3 and called Obi_WanTheJedi, rather than Blondie24, rated 2045.85). This game was one of only two master- or expert-level draws for Obi (probably because of the stupid move), so perhaps your analysis of theZone.com's competition is skewed (the rest of the experts seem fine). What is your analysis of the final neural network's play? Did the addition of spatial-analysis neurons actually help Blondie's game? In your conclusion, you ask what good neural networks are anyway. Note that the game you analyzed was played by a 250th-generation neural network. How good would you be after playing 250 sets of 5 games in which you did not even know which games you won? Increasing the generation count and number of neurons significantly increased the play of the network (from 1750 to 1914 for 100 to 250 generations, and from 1929 to 2045 for 250 to 840 generations); since the networks learn more in a longer time, it may be the case that increasing the generation count further can further increase the level of play of the network. Also, networks may take a lot of training to compete with an expert system like Chinook, but anybody can grow (or add to) their own on their home PC's extra clock cycles. Digenetics (www.digenetics.com) is soon going to release a program that allows individuals to do just that!
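For readers curious what "250 generations of 5 games each" means in practice, here is a minimal sketch of that kind of coevolutionary loop. Everything specific in it is an assumption for illustration, not taken from the book or the comment: the tiny network shape, the stand-in "game" (which just compares two nets' scores of a random position instead of playing real checkers), and the exact selection scheme.

```python
# Sketch of a coevolutionary training loop of the kind described above.
# ASSUMPTIONS: toy 32 -> 4 -> 1 evaluator, fake "games", simplified selection.
import random

random.seed(0)

BOARD_SQUARES = 32          # playable squares on a checkers board
HIDDEN = 4                  # toy hidden-layer size

def new_net():
    """Random weights for a 32 -> HIDDEN -> 1 board evaluator."""
    return {
        "w1": [[random.gauss(0, 1) for _ in range(BOARD_SQUARES)]
               for _ in range(HIDDEN)],
        "w2": [random.gauss(0, 1) for _ in range(HIDDEN)],
    }

def evaluate(net, board):
    """Score a board vector (higher = better for the side to move)."""
    hidden = [max(0.0, sum(w * x for w, x in zip(row, board)))
              for row in net["w1"]]
    return sum(w * h for w, h in zip(net["w2"], hidden))

def mutate(net, sigma=0.1):
    """Offspring = parent weights plus Gaussian noise."""
    return {
        "w1": [[w + random.gauss(0, sigma) for w in row] for row in net["w1"]],
        "w2": [w + random.gauss(0, sigma) for w in net["w2"]],
    }

def play_game(a, b):
    """Stand-in for a real game: whichever net scores a random position
    higher 'wins'. Returns +1 / -1 from a's perspective."""
    board = [random.choice([-1, 0, 1]) for _ in range(BOARD_SQUARES)]
    return 1 if evaluate(a, board) > evaluate(b, board) else -1

def one_generation(population, games_per_net=5):
    """Each net plays 5 games against random peers, scored +1 for a win
    and -2 for a loss; the top half survive and produce mutated offspring.
    Nets never see individual game results, only their total score."""
    scores = [0] * len(population)
    for i, net in enumerate(population):
        for _ in range(games_per_net):
            j = random.choice([k for k in range(len(population)) if k != i])
            scores[i] += 1 if play_game(net, population[j]) > 0 else -2
    ranked = [net for _, net in sorted(zip(scores, population),
                                       key=lambda p: p[0], reverse=True)]
    survivors = ranked[:len(population) // 2]
    return survivors + [mutate(p) for p in survivors]

population = [new_net() for _ in range(10)]
for generation in range(3):     # the book reportedly ran hundreds
    population = one_generation(population)
print(len(population))          # population size stays constant
```

The key point the comment makes is visible in `one_generation`: selection acts only on aggregate scores over a handful of games, so learning is necessarily slow, and the quality of the opposition directly shapes what survives.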

My response:

Let me explain my reaction to the book more clearly. That book is, by far, the stupidest bunch of checkers games ever published. Most of the bogus masters and experts played like rank beginners. Blondie never learned to play novice-level checkers, if the book is any indication. I don't really know what neural networks are good for (plenty, perhaps), but neural networks would be a joke if this book is the evidence. Blondie's opponents blundered badly and repeatedly, whether Blondie was in strong or weak positions; what kind of training is that? I suspect that Blondie could have learned something from playing against actual checkers players.

