Thursday, July 16, 2009

News on ANNE

The toughest task in a programmer's life is not the actual algorithm but the bug hunting. I noticed that something was still weird in ANNE and found some more bugs in my code, but oh well.

I implemented some improved trainers for the XOR operation and for the emotion part, but this time not by running a fixed number of epochs, but by checking the LMS (least mean squares) error over the training sets, which actually works nicely. Setting the trainer to 0.01% LMS (notation: ann structure = input:(hidden layers):output):

* XOR works after 11422 epochs with 2 hidden layers and after 11855 epochs with 1 hidden layer, i.e. 2:(2:2):1 versus 2:(2):1. Precision of the two seems on par, within 1%.

* Emotions are trained after 25201 epochs with 2 hidden layers and after 18387 epochs with 1 hidden layer; the first is 6:(6:6):6 and the other is 6:(7):6. Their precision is about on par for learned values, but the first is much more decisive towards one emotion or the other when interpreting new values, while the second allows for somewhat more varied emotions (like primary and secondary).
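The stop-on-error idea is simple enough to sketch. Below is a Python illustration (ANNE itself is Erlang, and the "model" here is a single linear weight rather than a real ann — it only exists to show the stopping criterion):

```python
# Sketch: instead of a fixed number of epochs, train until the mean
# squared error over all training sets drops below a threshold.
# The model is a trivial one-weight linear unit (y = w * x), purely
# illustrative -- not ANNE's trainer.

def train_until_threshold(samples, threshold=1e-4, lr=0.1, max_epochs=100_000):
    w = 0.0  # single weight
    for epoch in range(1, max_epochs + 1):
        total = 0.0
        for x, target in samples:
            out = w * x
            err = target - out
            w += lr * err * x          # gradient step
            total += err * err
        mse = total / len(samples)
        if mse < threshold:            # stop once the error is small enough
            return w, epoch
    return w, max_epochs

samples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # learn y = 2x
w, epochs = train_until_threshold(samples)
print(round(w, 2), epochs)
```

The number of epochs now depends on how hard the function is and how tight the threshold is, which matches the varying epoch counts reported above.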

My interpretation here is that while for most practical purposes a second layer is not required, it does add a lot of precision if you care for it and don't mind the extra training time.

I believe ANNE is now working correctly. Next I will start trying different AI scenarios for NPCs; anns seem well suited to be the brains of an NPC, let's see.



Wednesday, July 8, 2009

ANNE to ELIZA: Think before you talk !!

As I said before, there are many, many components that need to be built for the complete SMASH. One that I have barely mentioned so far is the AI part; no game would ever be really complete without some intelligence. So let's talk smart today.

Eliza will most likely be part of the NPC perception system, and it will be important for stateful conversations with the added context info (which will later form a sort of decision tree). When I coded Eliza, I had the idea of also including something else: one thing I find about basic Eliza is that it feels too cold, there are no emotional reactions behind it. We humans associate words with feelings, like it or not, so I started looking into that.

Now, how do we translate input into emotions ??

In order to make that association I dug up a couple of AI books from my shelf and found a nice model of how an NPC can translate situational information into emotions (I recommend the book: Programming Believable Characters for Computer Games by Dr. Penny Baillie-de Byl), which turns out to be an artificial neural network (ann). So after some really messy programming sessions ANNE was born: Artificial Neural Network Entity.

ANNE is a fully connected multilayer feedforward network with standard error backpropagation. I will spare you yet another explanation of how anns (sometimes also called multilayer perceptrons) work math-wise; I found, however, this very insightful page: Introduction to Neural Networks, which does an awesome job of explaining them.
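For the curious, here is a rough Python sketch of the same idea — a fully connected net with one hidden layer, trained by plain error backpropagation on XOR. This is an illustration, not ANNE's actual Erlang code; layer size, seed and learning rate are arbitrary choices of mine:

```python
import math
import random

# Sketch of a fully connected feedforward net with one hidden layer,
# trained by error backpropagation. Illustrative only -- ANNE itself
# is implemented in Erlang as a list of lists.

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def make_layer(n_in, n_out):
    # one weight per input plus a trailing bias weight, per neuron
    return [[random.uniform(-1.0, 1.0) for _ in range(n_in + 1)]
            for _ in range(n_out)]

def forward(layer, inputs):
    # zip() pairs the input weights with the inputs; neuron[-1] is the bias
    return [sigmoid(sum(w * x for w, x in zip(neuron, inputs)) + neuron[-1])
            for neuron in layer]

def train_xor(n_hidden=4, lr=0.5, epochs=30000):
    samples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    hidden = make_layer(2, n_hidden)
    output = make_layer(n_hidden, 1)
    for _ in range(epochs):
        for inputs, target in samples:
            h = forward(hidden, inputs)
            o = forward(output, h)[0]
            # delta at the output: error times sigmoid derivative
            d_out = (target - o) * o * (1.0 - o)
            # deltas at the hidden layer, backpropagated through output weights
            d_hid = [h[j] * (1.0 - h[j]) * d_out * output[0][j]
                     for j in range(n_hidden)]
            # weight updates (bias weight last)
            for j in range(n_hidden):
                output[0][j] += lr * d_out * h[j]
            output[0][-1] += lr * d_out
            for j, neuron in enumerate(hidden):
                for i in range(2):
                    neuron[i] += lr * d_hid[j] * inputs[i]
                neuron[-1] += lr * d_hid[j]
    return hidden, output

hidden, output = train_xor()
for inputs, _ in [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]:
    print(inputs, round(forward(output, forward(hidden, inputs))[0]))
```

The same shape scales to any layer sizes; only the two delta computations change when you stack more hidden layers.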

While it's tempting to implement each neuron in Erlang as a process, I didn't, because I do not like to have processes just lying around doing nothing 99% of the time. Instead, I implemented this as a list of lists, which can now very conveniently be saved to KVS*. This way you don't have to retrain it every time: just save a trained version on KVS2, and with KVS* replication the model is ready to use across all SMASH nodes.
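Since the whole net is just nested lists of numbers, persisting it is plain serialization. A Python sketch of the idea, with a JSON file standing in for KVS2 (the weight values below are made up for illustration):

```python
import json

# A trained net is just [layer][neuron][weight] -- nested lists of floats.
# These toy values are made up; a real trained net would go here.
trained_ann = [
    [[0.5, -0.3, 0.1], [0.2, 0.8, -0.6]],   # hidden layer (last entry = bias)
    [[1.2, -1.1, 0.4]],                      # output layer
]

# "Save to KVS": any serializer works on plain nested lists.
with open("anne_weights.json", "w") as f:
    json.dump(trained_ann, f)

# Any node can load the trained net back, no retraining needed.
with open("anne_weights.json") as f:
    restored = json.load(f)

print(restored == trained_ann)
```

No custom encoding step is needed at all, which is the whole point of keeping the net as plain data instead of live processes.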

I wanted to see just how many neurons ANNE could support, so after some partial testing I can say that I was able to create, for example, an ann with 500 inputs, 2 hidden layers with 5000 neurons each, and 500 outputs. Let me tell you, that is a hell of a list of lists, with a total of 30 Mio internal weights, so be patient during the training process. Most literature also does not say how many iterations are required to train an ann, but it seems to me that it takes about 50'000+ epochs (1 epoch = 1 run over all training sets) to see the weights converge. Erlang impressed me yet again; I was unsure whether it could hold THAT much data in a single list .... but it did.
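That weight count checks out: in a fully connected net it is just the sum of the products of adjacent layer sizes (biases left out here):

```python
# Weight count for the 500:(5000:5000):500 stress-test network.
# Fully connected means every neuron links to every neuron in the
# next layer, so weights = sum of products of adjacent layer sizes.
layers = [500, 5000, 5000, 500]
weights = sum(a * b for a, b in zip(layers, layers[1:]))
print(weights)   # 500*5000 + 5000*5000 + 5000*500
```

Almost all of it comes from the 5000-to-5000 middle connection, which alone is 25 Mio weights.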

Now, let's get a feel for it:
My first test was, as suggested by Dr. Baillie-de Byl, a simple XOR operation (with 2 inputs, 2 hidden neurons in a single layer and 1 output neuron), which after sufficient training works just fine. Once it passed the initial runs I grabbed the emotional model described in the book (chapter 7.5.2), which looks like this:

What you see here on the left is the lower part of the trained ann with 6 inputs, 1 hidden layer with 7 neurons, and 6 output neurons. The ann is then trained with 6 x 6 samples for about 50'000 epochs, returning the new trained ann. The output values should be loosely binary as intended, meaning 5 zeros and a single 1.

The call to anne:ann([INPUT], ANN) accesses the ann's knowledge, so I fed the values from the book into it, knowing that those values would need to be interpreted and hoping that the 4th value would clearly be the highest of them .... indeed the 4th parameter is the highest at 0.88 ... which is correct !!! In this example it represents "fear"; also note that the 3rd value is 0.24, which stands for "anger". The beauty of an ann is that it states the most prevalent emotion, but you could also take the 2nd strongest one into account, so maybe this input will cause the NPC to run away in fear, cursing angrily.
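Picking the primary and secondary emotion out of the output vector is a two-line job. In this Python sketch only "fear" (0.88) and "anger" (0.24) come from the example above; the other labels and values are made-up fillers, not the book's:

```python
# Interpreting the ann's output vector: strongest value = primary
# emotion, runner-up = secondary. Only "fear"/0.88 and "anger"/0.24
# are from the worked example; the rest are placeholder fillers.
emotions = ["joy", "sadness", "anger", "fear", "disgust", "surprise"]
output = [0.05, 0.02, 0.24, 0.88, 0.07, 0.01]

ranked = sorted(zip(output, emotions), reverse=True)
primary, secondary = ranked[0][1], ranked[1][1]
print(primary, secondary)   # fear anger
```

An NPC could then blend the two — act on the primary but color its dialogue with the secondary.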

I now need to find some example with 2 hidden layers to test ANNE on, but so far everything works just fine.

I have not decided yet how a fully grown NPC will work; it might be a sort of hierarchical finite state machine (HFSM) with Eliza and ANNE integrated as modules, or just ANNE + Eliza, but I believe this is already good progress on the NPC AI.

One more nice thing: if you embed an ann into a looped process, you would only need a single instance on all of the SMASH servers to interpret that input for all NPCs. Maybe several anns will feed their data into an HFSM and make it trigger its actions and responses.
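A Python sketch of that single-looped-process idea, with a thread plus queue playing the role of the Erlang process and mailbox, and a dummy function standing in for the real ann:

```python
import queue
import threading

# One looped worker holds the (stand-in) ann and answers interpretation
# requests from any number of NPCs. The queue plays the role of the
# Erlang mailbox; dummy_ann stands in for anne:ann([INPUT], ANN).

requests = queue.Queue()

def dummy_ann(inputs):
    # placeholder "network": just scales the inputs
    return [x * 0.5 for x in inputs]

def ann_loop():
    while True:
        inputs, reply = requests.get()
        if inputs is None:          # shutdown signal
            break
        reply.put(dummy_ann(inputs))

worker = threading.Thread(target=ann_loop)
worker.start()

def interpret(inputs):
    # called by every NPC: send a request, wait for the answer
    reply = queue.Queue()
    requests.put((inputs, reply))
    return reply.get()

result = interpret([0.2, 0.8])
print(result)

requests.put((None, None))          # stop the worker
worker.join()
```

Every NPC shares the one worker, so the trained net lives in memory exactly once per node — the same economy the Erlang version gets from a single looped process.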

ANNE is still an uncooked meal, but the ingredients look fine. And that's all for now folks. Until next time.