I'm halfway there with the new AI system, though that doesn't mean it will work yet. I've set up the basic outline and the functions that drive the AI, and it's all looking pretty good. The only problem is that it's insanely complex. Then again, so is human behavior. The front end, at least, is very easy to use: writing the library will be the hard part, and implementation will be a piece of cake.

My test today was pretty simple. I had an AI by the name of "Josh" set up and initialized in a very basic "Reality." I gave him some basic actions. He had the ability to do the following things: "get out of the chair," "walk from the Pilot Plant Building to the Main Building," "walk from the Pilot Plant Building to the covered walkway," "walk from the covered walkway to the Main Building," and "climb the stairs in the Main Building." All the actions were set up so that they could only be performed in their respective locations, and each had the effect of changing his location (or his seated status, in the case of the first action).
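One way to picture this setup is to model each action as a precondition on the world state plus an effect that transforms it. This is just my own sketch of the idea, not the actual library; the state keys ("seated," "location," "weather") and every function name here are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Dict

State = Dict[str, object]

@dataclass
class Action:
    name: str
    precondition: Callable[[State], bool]   # may this action run in this state?
    effect: Callable[[State], State]        # the state that results from running it

def set_value(key: str, value: object) -> Callable[[State], State]:
    """Build an effect that changes one field of the state, leaving the rest alone."""
    def apply(state: State) -> State:
        new = dict(state)
        new[key] = value
        return new
    return apply

# Two of Josh's actions, encoded this way (names taken from the post):
get_out_of_chair = Action(
    "get out of the chair",
    lambda s: s["seated"],
    set_value("seated", False),
)

walk_ppb_to_main = Action(
    "walk from the Pilot Plant Building to the Main Building",
    lambda s: (not s["seated"]
               and s["location"] == "Pilot Plant Building"
               and s["weather"] == "sunny"),   # cutting across the asphalt
    set_value("location", "Main Building"),
)
```

The point of the split is that the AI never needs to know what an action "means" — it only needs to check preconditions and chain effects.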

I then gave "Josh" an aspiration: "get to the upstairs part of the Main Building." I initialized the Reality with Josh seated in his chair in the Pilot Plant Building (the very location from where he was being programmed, oddly enough). To achieve his aspiration, Josh would have to figure out that he needed to get out of his chair (because you can't walk around while seated), walk from the PPB to either the Main Building or the covered walkway (depending on the weather), walk from the covered walkway to the Main Building (if the weather is inclement), then climb the stairs to finally reach the upstairs. Of course, Josh was not given any information indicating that this sequence would have the desired effect. It was the AI's job to examine the Reality and the actions available to it and determine the best course of action to achieve the aspiration.
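The search the AI has to do here can be sketched as a breadth-first search over world states: from the initial state, try every action whose precondition holds, and stop at the first state that satisfies the aspiration. Again, this is a minimal sketch of the idea under my own assumptions (state keys, the "PPB" shorthand, effects written as dict updates), not the author's implementation.

```python
from collections import deque

def plan(initial, actions, goal):
    """Return the shortest sequence of action names that reaches a goal state."""
    frozen = lambda s: tuple(sorted(s.items()))  # hashable snapshot for the seen-set
    queue = deque([(initial, [])])
    seen = {frozen(initial)}
    while queue:
        state, path = queue.popleft()
        if goal(state):
            return path
        for name, pre, effect in actions:
            if pre(state):
                nxt = dict(state)
                nxt.update(effect)
                if frozen(nxt) not in seen:
                    seen.add(frozen(nxt))
                    queue.append((nxt, path + [name]))
    return None  # aspiration unreachable

# Josh's five actions: (name, precondition, effect as a dict update).
ACTIONS = [
    ("get out of the chair",
     lambda s: s["seated"],
     {"seated": False}),
    ("walk from the PPB to the Main Building",            # only across dry asphalt
     lambda s: not s["seated"] and s["location"] == "PPB"
               and s["weather"] == "sunny",
     {"location": "Main Building"}),
    ("walk from the PPB to the covered walkway",
     lambda s: not s["seated"] and s["location"] == "PPB",
     {"location": "covered walkway"}),
    ("walk from the covered walkway to the Main Building",
     lambda s: not s["seated"] and s["location"] == "covered walkway",
     {"location": "Main Building"}),
    ("climb the stairs in the Main Building",
     lambda s: not s["seated"] and s["location"] == "Main Building",
     {"floor": "upstairs"}),
]

# The aspiration: be upstairs in the Main Building.
goal = lambda s: s["location"] == "Main Building" and s["floor"] == "upstairs"

sunny = {"seated": True, "location": "PPB",
         "weather": "sunny", "floor": "downstairs"}
print(plan(sunny, ACTIONS, goal))
print(plan(dict(sunny, weather="rainy"), ACTIONS, goal))
```

Because the search works purely on preconditions and effects, the weather branch falls out for free: when it rains, the direct walk's precondition fails and the walkway route becomes the shortest plan.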

Josh performed the aforementioned sequence of actions flawlessly. When I set the weather to sunny, Josh cut across the asphalt accordingly, and when I made it rain, he used the covered walkway to get to the Main Building. All of the implementation was accomplished in only about 50 lines of simple code.

Although it may seem trivial, this is definitely a success. The AI "intelligently" navigated its environment to achieve something it wanted. It's not fully AI yet: it's not smart enough to achieve a goal that's too many steps away, and it doesn't know what to do if it gets stuck. But all of that will be filled in soon enough, and then I'll be able to test a more complex system (maybe even one whose behavior I can't predict).

I wonder if Josh will be able to make good music while he walks to the Main Building?