Friday 29 April 2011

The following is the code that will let you access the RexBot code from the botAPI.

I've been scratching my head trying to figure out how to access the RexBot code from inworld.

Obviously when things are stable and the game AI is built into the RexBot code it should run smoothly, but there will be a need for hooks into the bot itself from inworld so that I can test it. That means I need to be able to access the methods and properties of the RexBot code from within the botAPI. The problem is that they exhibit two different interfaces: RexBot implements IClientAPI and the botAPI doesn't.

Luckily, however, we can get access to the RexBot code through the botmanager code. What I had to do was add a function to the botmanager code which pulls the appropriate RexBot out of its internal private m_bots collection. From there I can get access to the public members.
Which is exactly what I need to do in order to be able to easily test. So *now* I believe I'm well on the way to being able to make the changes I want to make.
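The manager-side accessor pattern can be sketched language-agnostically. Here's a hedged Python sketch of the same idea (the class and method names are my stand-ins, not the actual Aurora API): the manager keeps its bots in a private collection, and a single accessor exposes one bot so callers can reach its public members.

```python
import uuid

class RexBot:
    """Stand-in for the real RexBot; only a state attribute matters here."""
    def __init__(self):
        self.state = "Idle"

class BotManager:
    """Keeps its bots in a private dict keyed by UUID, like m_bots."""
    def __init__(self):
        self._bots = {}

    def add_bot(self, bot_id, bot):
        self._bots[bot_id] = bot

    def get_bot(self, bot_id):
        # The new accessor: expose a single bot so callers can reach its
        # public members without touching the private collection directly.
        return self._bots.get(bot_id)

manager = BotManager()
bot_id = uuid.uuid4()
manager.add_bot(bot_id, RexBot())
manager.get_bot(bot_id).state = "Walking"
print(manager.get_bot(bot_id).state)  # Walking
```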

public void botSetState(string bot, string State)
{
    // This is a test case of being able to get access to a property
    // of the actual RexBot itself through the botAPI.
    IBotManager manager = World.RequestModuleInterface<IBotManager>();
    if (manager != null)
    {
        RexBot rxbot = (RexBot)manager.GetBot(UUID.Parse(bot));
        switch (State.ToLower())
        {
            case "walking":
                rxbot.State = RexBot.RexBotState.Walking;
                break;
            case "idle":
                rxbot.State = RexBot.RexBotState.Idle;
                break;
            case "flying":
                rxbot.State = RexBot.RexBotState.Flying;
                break;
            default:
                rxbot.State = RexBot.RexBotState.Unknown;
                break;
        }
    }
}

Thursday 28 April 2011

Code to animate the avatar

Here's a rebased version of Christy Lock's code which will animate the bot:

Add the following code to the Bot_API.cs in the Botmanager module:

public void botAnimate(string bot, string AnimationUUID)
{
    m_host.ParentEntity.Scene.ForEachScenePresence(delegate(IScenePresence sp)
    {
        // This should be the bot's id
        if (sp.UUID == UUID.Parse(bot))
        {
            sp.Animator.AddAnimation(UUID.Parse(AnimationUUID), UUID.Zero);
        }
    });
}

Code to determine the distance between NPC and human's avatar

One of the code snippets I need to do is determine the distance between the NPC and the human avatar in order to determine if I should be running the animation code or not. There's already existing code by Christy Lock to do that for her AStar pathfinding class. This is a rebased version of it which should do the trick hopefully:

Vector3 diffAbsPos = FollowedSP.AbsolutePosition - m_botscenePresence.AbsolutePosition;
if (Math.Abs(diffAbsPos.X) < m_closeToPoint || Math.Abs(diffAbsPos.Y) < m_closeToPoint)
{
    // Run the attack code here, which includes animation and reducing the
    // health points of the attacked human, possibly pushing, et cetera,
    // depending on the force applied.
}

So what this does is get a vector between the scene presence of the followed avatar (i.e. the human) and the scene presence of the bot, then check whether either the X or the Y component of that vector is less than m_closeToPoint (which is 1, i.e. one unit). Depending on the animation to be run this could vary, but 1 unit is good for now.
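The proximity check above can be sketched in Python for clarity. Note a quirk of the either/or form: a bot 0.5 units off on X but 50 units off on Y still counts as "near". A Euclidean-distance check (shown second, as an alternative, not what the snippet above does) avoids that. Names and positions here are my own illustration:

```python
import math

CLOSE_TO_POINT = 1.0  # one unit, as in m_closeToPoint

def near_enough(followed_pos, bot_pos, threshold=CLOSE_TO_POINT):
    """Per-axis proximity check, mirroring the C# snippet above:
    true if EITHER the X or the Y difference is under the threshold."""
    dx = followed_pos[0] - bot_pos[0]
    dy = followed_pos[1] - bot_pos[1]
    return abs(dx) < threshold or abs(dy) < threshold

def near_enough_euclidean(followed_pos, bot_pos, threshold=CLOSE_TO_POINT):
    """Stricter alternative: straight-line distance under the threshold."""
    dx = followed_pos[0] - bot_pos[0]
    dy = followed_pos[1] - bot_pos[1]
    return math.hypot(dx, dy) < threshold

print(near_enough((10.5, 60.0), (10.0, 10.0)))            # True (X is close)
print(near_enough_euclidean((10.5, 60.0), (10.0, 10.0)))  # False
```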

On decision trees and goal oriented planning

I like to do a lot of research before I actually start coding so I read up on the latest in AI gaming theory and it turns out that the most lifelike NPCs utilize something called goal oriented planning. I kind of know what this is: it's a sequence of steps to get to a goal i.e. a decision tree.

Where it differs from a decision tree is that a decision tree is typically a hard coded best-estimate of the route to get from where you are now to a planned goal (whether it's a physical location or a state or an action or whatever).

But if we switch it up a little and allow competing sub-goals to race for the fastest path to the goal, updating the in-flight sub-goals and the various choices of paths as we go, then we have something more interesting, and this is in fact the goal oriented planning situation I'm reading about.

What's interesting about this is that it means pathfinding and goal oriented planning are effectively the same thing: in other words, A Star (or A*). So I need to dig into A* a little more.

Interestingly, Christy Lock has been bending my ear about A* for pathfinding already and she has already coded some A* into the code, so maybe it could be switched up a little to handle pathfinding to a goal.

But that's for later. For now I'm going to stick to a simplified attack decision tree for the first step.

In any case, here is pseudocode (from Wikipedia) for the A* algorithm, in case anyone is interested.

function A*(start, goal)
    closedset := the empty set                 // The set of nodes already evaluated.
    openset := set containing the initial node // The set of tentative nodes to be evaluated.
    came_from := the empty map                 // The map of navigated nodes.

    g_score[start] := 0                        // Cost from start along best known path.
    h_score[start] := heuristic_cost_estimate(start, goal)
    f_score[start] := h_score[start]           // Estimated total cost from start to goal.

    while openset is not empty
        x := the node in openset having the lowest f_score[] value
        if x = goal
            return reconstruct_path(came_from, came_from[goal])

        remove x from openset
        add x to closedset
        foreach y in neighbor_nodes(x)
            if y in closedset
                continue
            tentative_g_score := g_score[x] + dist_between(x, y)

            if y not in openset
                add y to openset
                tentative_is_better := true
            else if tentative_g_score < g_score[y]
                tentative_is_better := true
            else
                tentative_is_better := false

            if tentative_is_better = true
                came_from[y] := x
                g_score[y] := tentative_g_score
                h_score[y] := heuristic_cost_estimate(y, goal)
                f_score[y] := g_score[y] + h_score[y]

    return failure


function reconstruct_path(came_from, current_node)
    if came_from[current_node] is set
        p = reconstruct_path(came_from, came_from[current_node])
        return (p + current_node)
    else
        return current_node
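The pseudocode above translates fairly directly into Python. Here's a minimal runnable sketch on a 4-connected grid with unit step costs and a Manhattan-distance heuristic; the grid, walls, and helper names are mine for illustration, not anything from the Aurora code:

```python
import heapq

def a_star(start, goal, neighbors, heuristic):
    """Minimal A*: follows the pseudocode above, using a heap as the open set."""
    open_heap = [(heuristic(start, goal), start)]
    came_from = {}
    g_score = {start: 0}
    closed = set()

    while open_heap:
        _, x = heapq.heappop(open_heap)
        if x == goal:
            # Walk came_from back to the start to rebuild the path.
            path = [x]
            while x in came_from:
                x = came_from[x]
                path.append(x)
            return path[::-1]
        if x in closed:
            continue
        closed.add(x)
        for y in neighbors(x):
            tentative = g_score[x] + 1  # uniform step cost on a grid
            if y not in closed and tentative < g_score.get(y, float("inf")):
                came_from[y] = x
                g_score[y] = tentative
                heapq.heappush(open_heap, (tentative + heuristic(y, goal), y))
    return None  # failure: open set exhausted without reaching the goal

# Example: a 5x5 grid with a short wall, start top-left, goal bottom-right.
walls = {(1, 1), (1, 2), (1, 3)}

def neighbors(p):
    x, y = p
    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if 0 <= nx < 5 and 0 <= ny < 5 and (nx, ny) not in walls:
            yield (nx, ny)

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

path = a_star((0, 0), (4, 4), neighbors, manhattan)
print(path)
```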

K so Christy solved the animation problem

The Awesome Christy Lock solved the animation thing even while I was writing the last post. I'll post the code later on tonight for how to do it.

So for me the next thing is to implement a simple spawn/locate human/move closer/attack/defend/withdraw decision tree.

The code to get the position of the nearest avatar could be in the chat code, because I'm fairly sure I saw a function that returns the location of the avatar who chatted the message.

If that code is generic, it should be useable to determine the distance to the avatar so the decision tree should look like this:

Spawn!
Is human near enough to me that I can start the attack animation?
Nope
Move closer
Is human near enough to me that I can start the attack animation?
Yup
Start the attack animation
etc

Alternatively I could just do this:
Spawn!
Is human near enough to me that I can start the attack animation?
Nope
Run the follow avatar code
Is human near enough to me that I can start the attack animation?
Yup
Start the attack animation
etc
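The loop above can be sketched as a tiny tick function: check proximity, attack if close enough, otherwise take a step toward the human (the "follow avatar" branch). This is a hedged sketch under my own assumptions; every name here is a hypothetical stand-in, not Aurora API:

```python
def tick(bot, human, threshold=1.0):
    """One tick of the spawn / is-near? / move-closer / attack tree.
    Positions are (x, y) tuples; returns the new bot position and the action."""
    dx = human[0] - bot[0]
    dy = human[1] - bot[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return bot, "attack"  # near enough: start the attack animation
    # Not near enough: take one unit step toward the human.
    step = lambda d: (d > 0) - (d < 0)
    return (bot[0] + step(dx), bot[1] + step(dy)), "approach"

# Run the loop until the bot closes in.
bot, human = (0.0, 0.0), (3.0, 2.0)
actions = []
for _ in range(10):
    bot, action = tick(bot, human)
    actions.append(action)
    if action == "attack":
        break
print(actions)
```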

Then after that it would hook up to some kind of code that determines the human's hit points and the damage he/she is taking, etc., or feeds it directly into the health monitor you see in damage-enabled areas when using Imprudence.

Animations: k so that's interesting

So that's interesting.

I've dug through the code of osAvatarPlayAnimation and it ultimately leads back to SendAnimation, which is a virtual member of the IClientAPI interface, which is implemented by RexBot.cs and the old GenericNPCCharacter.cs.

But there's nothing in it, i.e. no implementation.

And yet I know that osAvatarPlayAnimation works, because I've already called it from a script inworld on an Aurora Sim.

So I guess I have to try it again but leave a breakpoint in the code to see where it goes. Maybe then I can cut and paste the relevant sections of the code into the bot so we don't need to set the threat level to high and call it inworld, but instead have it run on the server code.

Coding up animations

One of the things that it will be useful for an NPC to do is run animations.

Consider:

A human player walks down a street past a bot-spawning prim. A script in this prim detects the human player and boots up the decision tree which cascades through the criteria until the decision is taken to spawn an NPC in an angry hostile state with the goal of attacking the human interloper.

So the bot spawns, moves forward into range and then needs to attack. One of the things it should do when attacking (in order to make the experience enjoyable for the human player) is play a reasonable animation which is relevant to the attack.

So... it's necessary to code in some kind of animations to the bot. One of the things we'll need to do is make sure that it's only bots that can be animated or else the potential for griefing would be astronomical. We obviously don't want the ability to script any other player to perform a random animation without their permission, so we need a permissions check built in to make sure that it is in fact a bot and not a human player. That's easy enough to do though.

So the next piece is the actual animation code. Currently that's sitting in OSSL_API.cs, which I think is the wrong place for bot code, because to run it you need the threat level set to high. Animating a bot is not a high-threat operation, even though animating another human character against their will might be. In any case, I'm going to see if I can't hack up something from that code and put it into the bot code. The following are the two relevant functions, osAvatarPlayAnimation and osAvatarStopAnimation:

public void osAvatarPlayAnimation(string avatar, string animation)
{
    ScriptProtection.CheckThreatLevel(ThreatLevel.VeryHigh, "osAvatarPlayAnimation", m_host, "OSSL");

    UUID avatarID = (UUID)avatar;

    if (World.Entities.ContainsKey(avatarID) && World.Entities[avatarID] is ScenePresence)
    {
        ScenePresence target = (ScenePresence)World.Entities[avatarID];
        if (target != null)
        {
            UUID animID = UUID.Zero;
            lock (m_host.TaskInventory)
            {
                // Look the animation up by name in the host prim's inventory.
                foreach (KeyValuePair<UUID, TaskInventoryItem> inv in m_host.TaskInventory)
                {
                    if (inv.Value.Name == animation)
                    {
                        if (inv.Value.Type == (int)AssetType.Animation)
                            animID = inv.Value.AssetID;
                        continue;
                    }
                }
            }
            if (animID == UUID.Zero)
                target.Animator.AddAnimation(animation, m_host.UUID);
            else
                target.Animator.AddAnimation(animID, m_host.UUID);
        }
    }
}

public void osAvatarStopAnimation(string avatar, string animation)
{
    ScriptProtection.CheckThreatLevel(ThreatLevel.VeryHigh, "osAvatarStopAnimation", m_host, "OSSL");

    UUID avatarID = (UUID)avatar;

    if (World.Entities.ContainsKey(avatarID) && World.Entities[avatarID] is ScenePresence)
    {
        ScenePresence target = (ScenePresence)World.Entities[avatarID];
        if (target != null)
        {
            UUID animID = UUID.Zero;
            lock (m_host.TaskInventory)
            {
                // Look the animation up by name in the host prim's inventory.
                foreach (KeyValuePair<UUID, TaskInventoryItem> inv in m_host.TaskInventory)
                {
                    if (inv.Value.Name == animation)
                    {
                        if (inv.Value.Type == (int)AssetType.Animation)
                            animID = inv.Value.AssetID;
                        continue;
                    }
                }
            }

            if (animID == UUID.Zero)
                target.Animator.RemoveAnimation(animation);
            else
                target.Animator.RemoveAnimation(animID);
        }
    }
}

Wednesday 27 April 2011

K so here's the AIML stuff as promised

Do this.

Go get the AIMLBot source here: http://ntoll.org/file_download/18

Next, unzip the contents and open the solution with Visual C# 2008. It will ask you to convert it to a C# 2008 solution. Do it.

Next, go take a look inside the folder where you extracted the AIMLBot solution. There will be a file called "AIML.zip". Extract this file into a folder in the aurora bin directory called aiml.
i.e. aurora\bin\aiml

You will have to create the aiml folder first of course.

Next step is to open up the aurora solution and select the main solution right at the top. Right-click this main solution and choose "add new project". Browse to the C# 2008 AIMLBot project that you created earlier and add it to the solution.

Next find the subproject in the aurora solution called Aurora.botmanager and right click it and choose "Add Reference". On the Add Reference window choose the "projects" tab (should be the third tab at the top) and find and select the AIMLBot project you just previously added as a new project to the main aurora solution.

Next open the C# source file RexBot.cs and add the line "using AIMLBot;" underneath the line which says using System.IO; right at the top of the file after the copyright stuff.

At this stage we are ready to rock and roll and you have successfully added AIMLBot to the project. Now we need to enable it. Do the following steps:


Just before the first C# property in RexBot.cs ("public RexBotState State"), add a declaration for the AIMLBot: add this line
private cBot myBot;


Next you have to actually instantiate the bot. So find the following constructor function:
// creates new bot on the default location
public RexBot(Scene scene, AgentCircuitData data)

Right at the end of this constructor function after "UniqueId++;", add the following line:
myBot = new cBot(false);

Now we have the bot instantiated. So we just need to listen for it now.
So find the function SendChatMessage which says this:
"public void SendChatMessage (string message, byte type, Vector3 fromPos, string fromName, UUID fromAgentID, byte source, byte audible)"

Add the following lines of code in the function:

String fromNameLower = fromName.ToLower();
String firstNameLower = m_firstName.ToLower();
String lastNameLower = m_lastName.ToLower();
String NameLower = firstNameLower + " " + lastNameLower;
if (fromNameLower.Contains(NameLower) || message.Length == 0 || source != (byte)ChatSourceType.Agent)
{
    return;
}

cResponse reply = myBot.chat(message, NameLower);

this.SendChatMessage(1, reply.getOutput());



Your bots are now self aware. Long live Skynet!

AIML: Success!

So I managed to get it going. It's a complete hack and I don't have the code handy right now (I'll post it later tonight) but the gist of it is this:

I tracked down an older and simpler copy of the AIMLBot C# source which was written against the really, really old (in computing terms) .NET 1.0 library. I figured that given it was so ancient, the 2008 C# compiler should be able to upgrade it easily, since there have been a lot of years in between.

I was right. So I downloaded the code, made it into a C# 2008 project and added said project to the aurora sim codebase. Then I added a reference in the aurora.botmanager subproject to the AIMLBot project and added a line of Using AIMLBot; to the namespace references at the top.

Then I was able to access the functions in AIMLBot so it was then a case of looking for "what has the user said to the bot?" and "what is the bot's response to what the user has said?" types of calls in the AIMLBot code.

Luckily there was a fairly straightforward piece of sample code: a simple form where you typed into a text box and the bot responded in another text box with its answer. So I cut and pasted that code into the right places in the RexBot.cs class and ran it.

No joy. It still crashed out because the bot decided to talk itself into an infinite loop and crashed the stack.

Luckily I was able to track down the venerable Latif Khalifa, author of the awesome piece of client side software, radegast. I know he had basically solved the infinite feedback loop because the AIMLBot code is already in radegast, just I couldn't figure out the hooks.
He pointed me to the one line of code in radegast and I just translated that into the equivalent in the rexbot.

Basically it's a check to make sure that it's a human agent speaking, that the speaker doesn't have the same name as the bot, and that the message isn't empty. That takes care of all the junk messages which send the bot into an infinite tailspin.
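The guard amounts to a single predicate. Here's a hedged Python sketch of the same check (function and parameter names are mine, not Radegast's or Aurora's): drop the message if it came from the bot itself, is empty, or wasn't spoken by a live agent, so the bot can never answer its own chat and loop forever.

```python
def should_ignore(from_name, bot_name, message, source_is_agent):
    """Echo guard: skip messages from the bot itself, empty messages,
    and anything not spoken by a live agent."""
    return (bot_name.lower() in from_name.lower()
            or len(message) == 0
            or not source_is_agent)

print(should_ignore("Test Bot", "test bot", "hi", True))    # True (bot hearing itself)
print(should_ignore("Human User", "test bot", "hi", True))  # False (answer this one)
```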

Result was I got a server side bot that talks. Suh-weet.

It still has a glitch though in that the bot refers to *everyone* as "unknown user".
But that's positively a minor issue for now which I'll work on later.

Later tonight I'll post the code on here because I'm too dumb to figure out how to commit to github without overwriting my own changes LOL.

Saturday 23 April 2011

So a small piece of progress

Trying to hunt down where/how the bot code listens for chat messages.
In the Bot_API.cs code there is a botSendChatMessage function which you can use to send messages from the bot. What I wanted was the event whereby chat messages are actively listened for by the bot, so I could intercept it. Obviously botSendChatMessage isn't it, since that's a function you call rather than a function that responds to chat events.

So where is it?

I couldn't find it and had to ask the guys. Rev Smythe (genius) told me it was in the EventManager code so all I had to do was put a breakpoint there, type a chat message in and watch what happens. So the event fires when someone chats. It goes and gets a list of scenepresences and then fires a SendChatMessage function to each of the individual scenepresences.

Our particular scenepresence (i.e. the bot) was indeed found and it called RexBot.cs's implementation of SendChatMessage (which is empty....) so perhaps something could go there.

But wait (I'm typing this as I go through the code LOL) there's something else.... in the RexBot.cs code is an event called OnBotChatFromViewer.

Hmmmm I wonder if that's what I'm looking for.

Hmmm no. It didn't get called. The only code that does get called is RexBot.cs SendChatMessage. OK so that's where I'm going to have to intercept the chats and send them to pandorabots...

Friday 22 April 2011

AIML: Progress of a sort

So I played around with the Aurora NPC code trying to implement the AIML code. I didn't really come to any firm conclusions and have more questions than answers or results, so this post is more of a ramble than anything else; in other words, I'm collecting my thoughts and dumping them out as an unparsed stream here...

Some observations are that Ken's code (found here): http://kennethrougeau.com/geekery/opensim-server-side-npc-experiments-day-three/

is actually client side code which hooks up with the website pandorabots. Ken explains it better than I do on the page, but the gist of it is: you sign up for a pandorabot, get an ID for it (generated by the site), then you send the text you chatted to the NPC bot up to the website via an LSL HTTP request POST and get a response back, which you then parse and pass to the NPC say command. Though it's functional, the drawback is you need a prim for each bot with a listening script in it. Don't like that too much, to be honest.

The way Radegast does it is much cleaner: the Radegast bot itself listens and then calls the AIMLBot code which is on SourceForge. Problem is: it's *seriously* complicated. I was looking for something which would basically be *call this*, *send this back*. Unfortunately I couldn't find anything like that on the first hack attempt. Basically my head exploded. So the interim step is to get Ken's LSL client code working with the aurora bots for now, kind of as a proof of concept. Then I'll try to dig deeper into Radegast or else ask Khalifa. I believe I've seen his sig on the aurora-dev channel, so who knows, he might answer enough to be able to push it ahead.

Unanswered questions: if I were to implement some kind of mash-up whereby I get the Radegast listen code working but, instead of implementing an AIML parse, I code up some server-side HTTP requests and responses, then as far as I know those are synchronous calls. So if I send an HTTP POST request up to a website, will the region code just hang there until the response comes back? That would be kind of ugly. There ought to be an asynchronous call, but unfortunately my .NET coding skillz aren't up to speed on that kind of thing, so it might not be the best approach. In an ideal world, the AIMLBot guy would explain how to parse the AIML so I didn't need to futz around with HTTP requests etc. and could just use AIML directly. On the other hand, I could do a metric ton of reading about how AIML works and just code something up myself....
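The blocking worry can be dodged by doing the slow call on a worker thread and handing the reply back through a callback, so the region's chat handler returns immediately. Here's a hedged, language-agnostic sketch in Python; chatbot_reply is a made-up stand-in for the remote service (not pandorabots' actual API), and the threading pattern is the general idea, not the .NET specifics:

```python
import threading

def chatbot_reply(message):
    """Stand-in for the slow remote call (e.g. an HTTP POST to a chat service)."""
    return "echo: " + message

def ask_async(message, on_reply):
    """Fire the request on a worker thread so the caller (think: the region's
    chat handler) returns immediately instead of blocking on the network."""
    def worker():
        on_reply(chatbot_reply(message))
    t = threading.Thread(target=worker)
    t.start()
    return t

result = []
t = ask_async("hello", result.append)  # returns right away
t.join()                               # only the demo waits; the region wouldn't
print(result[0])  # echo: hello
```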

Anyways. Onwards....

Back on the case

So got the code compiled, a dev environment set up.
Test scenario worked: I can create bots etc. Now to hack the code.

First thing I'm going to do is see if I can't hack Ken's AIML client side code into the bot API...

Friday 8 April 2011

Not done too much recently but will resume in about 10 days

After the initial burst of activity I've slowed down a little due to RL stuff and should get back up to speed in about 10 days when I will have a high end windows box to play with. At that point I should be able to run a local version of aurora inside of Visual Studio and run the imprudence viewer at the same time, whereas that currently doesn't work inside my virtual machine on my mac.

It will be *very* entertaining to finally be able to hack some decent gamer AI into the bot code.