I've been thinking for quite some time now that maybe I should stop posting on this blog.
There are many reasons behind this decision and maybe I'll start another blog somewhere down the line. But I simply don't believe that this is the right format anymore (maybe it never was, I don't know) for whatever it is that I'm trying to achieve with // comments are lies! This will therefore be my last post here.
I want to thank everyone who participated in the discussions we've had here. I've learned a LOT. Thank you.
See you @ Twitter, SourceForge, Delicious, Github, Google Groups, Facebook, LinkedIn, ...
Google.org recently announced Google Flu Trends, a web site that provides estimates of flu activity in the US based on search queries.
I'm not sure whether this is a good thing or just plain scary and I'm not surprised that they've found such a strong correlation between their data and the government's either. I am, however, quite interested in knowing what else is in there.
Ian Ayres' book Super Crunchers: Why Thinking-by-Numbers Is the New Way to Be Smart mentions that Wal-Mart uses data mining techniques to match shelf space against predicted customer demand. Pretty cool. And they're not alone. But these are closed data sets: their own, of course, and data that they buy from companies like Terabyte.
I hope that more companies and governments follow Google's example and open up (at least parts of) their data stores for public use. There are lots and lots of mash-ups available that show the potential of combining data sets to create services that are useful to the public. Of course, most companies first have to realise that they actually have interesting data in their systems.
"If the only tool you have is a hammer, you will see every problem as a nail." - Abraham Maslow
This week, I've been experimenting with Monte Carlo methods for a Pentago AI player. I've written pattern-based AI players for Pentago before and (somewhere) I've got an unfinished implementation that uses Minimax as well.
I didn't really expect much from this little experimental player (it's only about 100 lines of Python code) but it has turned out to be "not too bad". Despite the fact that it only considers one game state at a time... maybe that says more about my other players though ;-)
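For illustration, here's a minimal sketch of that kind of flat Monte Carlo player in Python. Since the Pentago rules would take too much space, a toy take-1-2-or-3 Nim game stands in for the real board; the Nim class and all the names below are my own, not from the actual player:

```python
import random

class Nim:
    """Toy stand-in for Pentago: take 1-3 stones, whoever takes the last one wins."""
    def __init__(self, stones, to_move=0):
        self.stones = stones
        self.to_move = to_move

    def legal_moves(self):
        return list(range(1, min(3, self.stones) + 1))

    def play(self, move):
        return Nim(self.stones - move, 1 - self.to_move)

    def winner(self):
        # Whoever took the last stone -- the player NOT about to move -- wins.
        return 1 - self.to_move if self.stones == 0 else None

def random_playout(state, player):
    """Play uniformly random moves to the end; True if `player` won."""
    while state.winner() is None:
        state = state.play(random.choice(state.legal_moves()))
    return state.winner() == player

def monte_carlo_move(state, playouts_per_move=200):
    """Flat Monte Carlo: estimate each legal move's win rate with random
    playouts from the resulting position, then pick the best move."""
    player = state.to_move

    def win_rate(move):
        after = state.play(move)
        wins = sum(random_playout(after, player) for _ in range(playouts_per_move))
        return wins / playouts_per_move

    return max(state.legal_moves(), key=win_rate)
```

Like the player described above, this only ever looks at one game state at a time; all of its "knowledge" comes from the playout statistics.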
I'll try to add MCTS (Monte Carlo Tree Search) during this week, maybe even UCT (Upper Confidence bounds applied to Trees) which ought to make the player quite a bit stronger.
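At the heart of UCT is the UCB1 rule for deciding which child node to descend into during the selection phase of the tree search. A small sketch follows; representing a node as a plain (wins, visits) pair is my assumption for illustration, not taken from any particular implementation:

```python
import math

def ucb1(wins, visits, parent_visits, c=1.4):
    """UCB1 score used by UCT to choose which child to descend into:
    exploitation (win rate) plus an exploration bonus that shrinks
    the more often a child has been visited."""
    if visits == 0:
        return float("inf")          # always try unvisited children first
    return wins / visits + c * math.sqrt(math.log(parent_visits) / visits)

def select_child(children):
    """children is a list of (wins, visits) pairs; returns the index of
    the child that UCB1 says to explore next."""
    parent_visits = sum(v for _, v in children) or 1
    scores = [ucb1(w, v, parent_visits) for w, v in children]
    return scores.index(max(scores))
```

The constant c trades off exploration against exploitation; 1.4 (roughly the square root of 2) is a common starting point.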
Interested in Computer Go? Then you probably already know about Don Dailey's excellent work on providing a number of Monte Carlo reference implementations. So far, he has managed to implement it in
three different languages (Java, C and a language called Vala).
The announcement is here and the Java bot is available here. It's not the prettiest Java code out there, but Don is a C programmer (I think) so he is excused. It was very interesting and educational to study the code. I've been reading a lot of the Monte Carlo Computer Go papers, but I've always felt as if I'd missed something fundamental, since my understanding of how it works felt too... simple.
Now that I've seen the code I can say, with confidence, that it *is* quite simple. Implementing a Monte Carlo method, that is. Getting a decent (let alone, good) AI-player for 19x19 Go seems quite far from simple.
Ok, so the pattern matching idea kind of backfired.
I was under the impression that a function's default arguments wouldn't be evaluated until the function was called, which would have meant that I could rewrite them into valid Python code inside the decorator function. But that's not the way it works: default argument expressions are evaluated once, when the def statement executes.
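This evaluation rule is easy to demonstrate with the classic mutable-default example:

```python
def append_to(item, bucket=[]):
    """The [] default is evaluated once, when this def runs, so every
    call that omits `bucket` shares the SAME list object."""
    bucket.append(item)
    return bucket

first = append_to(1)
second = append_to(2)
# first and second are the same list: [1, 2]

# It's also why an unbound name in a default raises NameError at
# definition time, not at call time.
```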
My first attempt at something more interesting than matching a constant failed miserably. I wanted to use a variable to constrain the matching process. For example:

>>> @patternmatched
... def foo(lst = [__, var, var, __]):

should match any lst which is a list with, somewhere in it, the same element twice in a row. Both [1,1,2,3,4] and [3,2,1,1,2] should match, whilst [1,2,1,3,1] shouldn't.
Unfortunately, trying the above results in a NameError, since both __ and var are unbound. And, seeing that, I remembered that this is *exactly* what happened about three years ago when I was trying out various syntaxes for PyRete. It's the reason I decided to put all of the matching in the if-statement instead of in the function arguments.
But, it's not quite time yet to give up. There are several possible ways to go about this. The "best" choice is probably to use Andrew's python4ply, or maybe PyPy. Either way, I'll get a chance to compare and contrast the work of implementing a DSL in Python versus Common Lisp.
I've been rewriting the deftemplate code for MPS more or less from scratch... but I'll have to talk about that some other time. It's high time to focus on the defrule macro(s). I'll break this up into several posts and I'll start with the function for executing the RHS.
The MPS syntax (a subset of CLIPS') for defrule looks like this:

defrule-construct ::= (defrule rule-name
                        conditional-element*
                        =>
                        action*)
conditional-element ::= template-pattern-CE ¦ assigned-pattern-CE ¦
                        not-CE ¦ and-CE ¦ or-CE ¦ test-CE ¦
                        exists-CE ¦ forall-CE
assigned-pattern-CE ::= single-field-variable <- template-pattern-CE
not-CE ::= (not conditional-element)
and-CE ::= (and conditional-element+)
or-CE ::= (or conditional-element+)
test-CE ::= (test function-call)
exists-CE ::= (exists conditional-element+)
forall-CE ::= (forall conditional-element conditional-element+)
template-pattern-CE ::= (deftemplate-name single-field-LHS-slot*)
single-field-LHS-slot ::= (slot-name constraint)
constraint ::= ? ¦ connected-constraint
connected-constraint ::= single-constraint ¦
                         single-constraint & connected-constraint ¦
                         single-constraint | connected-constraint
single-constraint ::= term ¦ ~term
term ::= constant ¦ single-field-variable ¦
         :function-call ¦ =function-call

I'm quite far from having all of the above supported, but I've got enough to show how the function that executes the RHS of a rule works.
The defrule macro works by expanding into a series of macro calls which handle more and more specific cases in the processing. The macro itself is quite simple:

(defmacro defrule (name &body body)
  (let ((rhs (cdr (member '=> body)))
        (lhs (ldiff body (member '=> body))))
    `(progn
       (compile-lhs ,name ,@lhs)
       (compile-rhs ,name ,@rhs))))

As you can see, it expands into two other macro calls: compile-lhs and compile-rhs. The first parses the conditional elements and, in the process, creates two symbol tables (fact-bindings and variable-bindings) with all of the variables it can find.
The second macro handles the RHS and looks like this:
(defmacro compile-rhs (name &body rhs)
  (when (null rhs)
    (setf rhs '(t)))
  `(defun ,(make-sym "RHS/" name) (activation)
     (let* ((token (activation-token activation))
            ,@(mapcar #'make-fact-binding (reverse fact-bindings))
            ,@(mapcar #'make-variable-binding (reverse variable-bindings)))
       ,@rhs)))

So, if we evaluate this:
MPS> (defrule foobar
       ?foo <- (foo (bar ?bar) (baz 1))
       =>
       (format t "~%~A ~A" ?foo ?bar))

it expands into this (among other things):
(DEFUN RHS/FOOBAR (ACTIVATION)
  (LET* ((TOKEN (ACTIVATION-TOKEN ACTIVATION))
         (?FOO (NTH 0 TOKEN))
         (?BAR (DEFTEMPLATE/FOO-BAR ?FOO)))
    (FORMAT T "~%~A ~A" ?FOO ?BAR)))

Most of the work lies in figuring out how to assign a value to each of the variables. I'm using a simple list as the structure for tokens: WMEs (fact objects) are added in the order in which they appear in the list of conditional elements. Once all of the WMEs are bound, the variable bindings are established using the automatically constructed accessor functions for those structs.
The actions in the RHS are simply spliced into the let* form at macro-expansion time which completes the function definition. It is then evaluated and stored with the rule's production node. Later, when that production node is left-activated, it creates an activation (with the WMEs and some additional meta-data like timestamps and such) and places it in the conflict-set. If that activation ever triggers a rule, it is passed as an argument to the RHS function.
Since I haven't got enough of "the rest" of MPS in place, we're going to have to mock an activation and call the function directly to see whether or not it works.
MPS> (rhs/foobar (make-activation :token (list (foo (bar 1) (baz 1)))))
#S(DEFTEMPLATE/FOO :BAR 1 :BAZ 1) 1

There's really not much more to say about the RHS, so I'll stop here. The LHS is a bit more complicated and I hope that I can show some of that code soon.
All those who have tried to write a rules-based program know that there are, at least, two differences between "regular" programming and rules-based programming. The first is that you have no direct control over the flow of execution and the second is that, instead of specifying input parameters, you specify patterns that should be matched for a certain "function" (or rule) to be invoked.
The bit about giving up control is quite difficult for most programmers (at least the ones I know) and it doesn't translate well to other types of programming anyway. But it would be interesting to try and add pattern matching functions to a language such as Python and see whether it could be useful (or at least fun to experiment with).
It's not a new idea. The first language to support pattern matching this way was Snobol (version 4, I think), which was introduced in the late 60s/early 70s. Both Haskell and Erlang have it, and there are a bunch of others as well: Qi and Prolog, to name a few.
I'm thinking about something along the lines of:
>>> from patternmatching import *
>>> @patternmatched
... def fac(n = 0):
...     return 1
>>> @patternmatched
... def fac(n = int):
...     return n * fac(n = n-1)

The idea is that the first function would handle any calls to fac where the parameter n is 0, and the second would handle any calls where n is an int (but not 0). And, yes, I've got the above working already. The tricky bits are providing more elaborate forms of pattern matching within the available syntax.
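For the curious, here is one way such a decorator could be implemented. This is my own sketch, not the actual patternmatching module: keep a per-name registry of overloads and match call-time keyword arguments against each overload's default values (constants by equality, types by isinstance), dispatching to the first overload that fits.

```python
import inspect

_registry = {}   # function name -> shared list of (patterns, implementation)

def patternmatched(fn):
    """Toy dispatcher: use each overload's default values as patterns."""
    sig = inspect.signature(fn)
    patterns = {name: p.default for name, p in sig.parameters.items()
                if p.default is not inspect.Parameter.empty}
    overloads = _registry.setdefault(fn.__name__, [])
    overloads.append((patterns, fn))

    def dispatch(**kwargs):
        # First overload whose patterns match all the keyword arguments wins.
        for pats, impl in overloads:
            if all(_matches(pats.get(k), v) for k, v in kwargs.items()):
                return impl(**kwargs)
        raise TypeError("no matching pattern for %r" % kwargs)
    return dispatch

def _matches(pattern, value):
    if isinstance(pattern, type):     # e.g. n = int acts as a type pattern
        return isinstance(value, pattern)
    return pattern == value           # e.g. n = 0 acts as a constant pattern

@patternmatched
def fac(n=0):
    return 1

@patternmatched
def fac(n=int):
    return n * fac(n=n - 1)
```

Because overloads are tried in definition order, the constant pattern n=0 shadows the broader n=int pattern, exactly as the base case of the factorial requires.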
I'll be back shortly with some code...