K so I tinkered around and it turns out that it wasn't the server at all, it was Imprudence.
When I went to LBSA plaza nobody could see me. I was a cloud to them. So it would seem that although I could see myself in Imprudence, everyone else saw me as a cloud. Stands to reason then that if I loaded my current appearance (a cloud) into an NPC, then I would also see the NPC as a cloud.
Turns out the simple fix (from Qandy Saw at LBSA plaza) is this:
Go to Preferences > Graphics, turn Anisotropic filtering on, then save settings.
Everyone else will then be able to see you.
At that point you can rezz the NPCs.
Monday, 5 September 2011
Sunday, 4 September 2011
Update to Opensim NPCs
Looks like Justin Clark-Casey did some work on the vanilla OpenSim end.
From my reading of it, it doesn't look like it's as far along as the Aurora code. That said, if it can be gotten to work, most of the Aurora code could theoretically be ported.
The advantage of getting it to work with OpenSim is of course that most of the open metaverse servers run on OpenSim instead of Aurora, and you can connect it to OSGrid.
Unfortunately, when I tried it by bringing up my own region on OSGrid, the NPCs, while they do appear, remain clouds. I must be doing something wrong but don't have time right now to go figure out what.
That said, a few days ago I got a demo from Haplo Voss and the NPC on his region did load the appearance so I'm probably just doing something stupid.
Justin Clark-Casey's wiki page for OpenSim NPCs is here: http://opensimulator.org/wiki/OSSLNPC
Also: an interesting rumor is that the big grid (i.e. the Linden grid) is going to implement server-side NPCs at some point.
Saturday, 18 June 2011
Nearly there
So I think I have basically all the pieces for a basic zombie game. Right now it works on my local so I've been looking for a place that will give me a full grid with root access. Looks like opensimhosting is the guy. Darren Williams was super helpful and got me up and running quickly.
Next piece is to re-hack my spawn-follow-attack code so that it works with aurora-master. Revolution Smythe moves so fast that it's hard to keep up lol. I move at a snail's pace.
Anyways, I have a gun script that works as long as you have osCauseDamage enabled (wherever that lives in the .ini files, looking now) and also have damage enabled for your region.
Also I'm playing with ZBrush so I can make my own meshes/sculpties. So far I've been able to make an ugly-assed set of wooden hair that is about as bad as the worst freebies, and likewise I've been able to make a pretty clunky looking rifle, but it's early days and a lot of fun.
Monday, 30 May 2011
Update
So I got the NPC code up to the point where it can follow you pretty successfully and then kill you. Additionally, I wrote a piece of code that reads a ZHAO II formatted notecard out of the template avatar's inventory and uses the UUIDs it finds there for animations such as MeleeAttack, ProjectileAttack, Die and so on. That makes it much easier for someone inworld to customize the appearance and animation behavior of NPCs without needing access to the code-base. It just makes things easier, really.
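For reference, here's a minimal sketch of the parsing side, assuming ZHAO II-style lines such as [ MeleeAttack ]uuid1|uuid2 (how the notecard text gets pulled from the template avatar's inventory is left out, and none of this is the committed code):
// Sketch only: turn ZHAO II-style lines like "[ MeleeAttack ]uuid1|uuid2"
// into a map of state name -> animation UUIDs.
public static Dictionary<string, List<UUID>> ParseZhaoCard(string notecardText)
{
    Dictionary<string, List<UUID>> anims = new Dictionary<string, List<UUID>>();
    foreach (string rawLine in notecardText.Split('\n'))
    {
        string line = rawLine.Trim();
        if (line.Length == 0 || !line.StartsWith("[")) continue; // skip blanks and comments
        int close = line.IndexOf(']');
        if (close < 0) continue;
        string state = line.Substring(1, close - 1).Trim();      // e.g. "MeleeAttack"
        List<UUID> ids = new List<UUID>();
        foreach (string tok in line.Substring(close + 1).Split('|'))
        {
            UUID id;
            if (UUID.TryParse(tok.Trim(), out id))               // keep only valid UUIDs
                ids.Add(id);
        }
        if (ids.Count > 0) anims[state] = ids;
    }
    return anims;
}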
Anyways, then Revolution Smythe, who is a complete and total genius, extended Christy's A* pathfinding code so that the follow code is better: the NPCs can follow you up stairs, around walls, over walls, fly after you and so on.
Still outstanding are getting the avatars to die (right now they won't die and they mercilessly kill you over and over lol) and also some client-side weapon/gun scripts.
The piece I'm working on currently is trying to get the avatars to play a "dying" animation once their health points go to zero or below. It's not working yet and it's been especially entertaining trying to get a chance to run a "kill" script while being relentlessly pursued by NPCs intent on your gory death. Fun times were had by all LOL.
Also: I borked my branch on GitHub somehow when I tried to merge the code from the main branch over the weekend. Not being a GitHub expert, I don't know how to clean it up and it's just a big hassle really, so I took the easy way out and just created a new repository with just the NPC project (i.e. aurora/botmanager). Since my code should just be a drag-and-drop replacement for the aurora/botmanager in the main branch, that should make things much easier, especially for those who have been unable to get my branch to compile (nod of the head to Enrico).
Saturday, 28 May 2011
How to add 32 bit launcher into the project
If you get a fresh virgin copy of aurora from the GitHub master branch, it will compile just fine in VS2008, but depending on the machine you have it might not run. This is the case on my machine, so I spoke to Rev and he said I need to add the 32-bit launcher to be able to debug it in VS2008.
So on the right-hand side, in the Solution Explorer panel, right-click on "solution 'Aurora'" right at the top and choose "add reference", then go looking for OpenSim/Tools/32bitlauncher and find the project.
You might also have problems with SmartThreadPool, which is found in OpenSim/Tools/SmartThreadPool.
Tuesday, 10 May 2011
k so proof of concept of killer NPC attack bot
After struggling with the animation code and non-active damage code, I eventually got a proof-of-concept attack bot to work.
It's hard-coded at the minute. Basically it follows the prey avatar and, when it gets close enough, if it's in attack mode (which it is) it plays the hard-coded attack animation. It also reduces the prey avatar's health points by 10 each time until they reach zero. It will continue to dog the avatar, even if the avatar moves away, until it kills them. At that point the predator NPC goes out of attack mode and just sits there.
Obviously some more complicated AI would be good. For example, it would be good if an idle bot also attacked. So maybe what needs to happen is that the sensor prim makes all non-attacking bots attack the nearest avatar. Anyways, the proof of concept works, so now I need to think about how to improve on it and especially remove the hard-coded animations to make it easier for others. Probably the next step is to get it to be able to use ZHAO cards, or a modified version of them, for attack animations et cetera.
Anyways the code is here:
https://github.com/x8ball/Aurora-Sim/commit/4d9ef599515a770be20602c934854ad51d9309ed#diff-1
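For anyone who doesn't want to dig through the diff, the behavior described above boils down to roughly this (a reconstruction from the description, not the committed code; the member names here are placeholders):
// Sketch of the proof-of-concept attack tick (placeholder names, not the actual commit).
private void AttackTick(IScenePresence prey)
{
    if (!m_attackMode)
        return;

    Vector3 diff = prey.AbsolutePosition - m_scenePresence.AbsolutePosition;
    if (Math.Abs(diff.X) > m_closeToPoint || Math.Abs(diff.Y) > m_closeToPoint)
        return; // still following, not close enough to swing yet

    // Play the (currently hard-coded) attack animation on the NPC.
    m_scenePresence.Animator.AddAnimation(m_attackAnimID, UUID.Zero);

    // Knock 10 points off the prey's health until it hits zero,
    // then drop out of attack mode and just sit there.
    m_preyHealth -= 10;
    if (m_preyHealth <= 0)
    {
        m_attackMode = false;
        State = RexBotState.Idle;
    }
}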
Wednesday, 4 May 2011
Server side AO on the bot
Additionally on the to-do list is a server side AO for the bot using a standard ZHAO notecard.
The plan is to create a fixed directory called aurora/bin/ao, whereby each bot's AO animation list lives in a notecard in standard ZHAO format, named after the avie used to seed the bot; i.e. a bot called "test bot" seeded from avie John Doe would have a notecard called JohnDoe.txt (for example) inside aurora/bin/ao.
This code is lower down the priority list in the to-do's but it's in there. Before I do this I'm going to finish the detect-spawn-follow-attack code which will also involve digging into the LSL collision or sensor stuff. The scripting engine seems to be a little flaky in that scripts sometimes don't work (I tried the collision code last night and it didn't work very well so as a backup I could write something into the botmanager code but I digress).
So returning to the Server side AO, we will need code to pick up directories and code to read files etc. Clues to that code should be in the following code snippet:
private void FindDefaultLSLScript()
{
    if (!Directory.Exists(ScriptEnginesPath))
    {
        try
        {
            Directory.CreateDirectory(ScriptEnginesPath);
        }
        catch (Exception)
        {
        }
    }
    string Dir = Path.Combine(Path.Combine(Environment.CurrentDirectory, ScriptEnginesPath), "default.lsl");
    if (File.Exists(Dir))
    {
        string defaultScript = File.ReadAllText(Dir);
        foreach (Scene scene in m_Scenes)
        {
            ILLClientInventory inventoryModule = scene.RequestModuleInterface<ILLClientInventory>();
            if (inventoryModule != null)
                inventoryModule.DefaultLSLScript = defaultScript;
        }
    }
}
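Adapted for the AO case, the loader might end up looking something like this (just a sketch following the naming scheme above; GetBotAO and the relative "ao" path are my own placeholders, not existing Aurora code):
// Sketch: load the ZHAO notecard text for a bot seeded from "John Doe",
// i.e. bin/ao/JohnDoe.txt. Returns null if no AO card exists for that avie.
private string GetBotAO(string seedFirstName, string seedLastName)
{
    string aoPath = Path.Combine(Environment.CurrentDirectory, "ao");
    if (!Directory.Exists(aoPath))
    {
        try { Directory.CreateDirectory(aoPath); }
        catch (Exception) { }
    }

    string file = Path.Combine(aoPath, seedFirstName + seedLastName + ".txt");
    return File.Exists(file) ? File.ReadAllText(file) : null;
}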
Working on the detect-spawn-follow-attack code
So what I'm working on (loosely) at the moment is basically a detect-spawn-and-attack method a la Left 4 Dead or Call of Duty. So basically an avie walks close to a spawning point, the spawning region detects the avie's presence, spawns a bot, sets the bot to follow the avie and sets the bot's belligerence level to "hostile".
Then the bot will follow the avie till it gets close enough, and then it will attack, which means running the appropriate attack anim and then calling the code to do damage to the avie.
Rather than do all of this code myself, I can piggyback on the existing follow code and at the same time change the belligerence state to hostile. Then I can mod the code that stops the bot from walking when it's close enough (in the follow code) to instead check the attack state and take the appropriate action.
So here's the follow code in rexbot.cs with the appropriate places to mod commented:
//this is the follow code
CurrentFollowTimeBeforeUpdate++;
if (CurrentFollowTimeBeforeUpdate == FollowTimeBeforeUpdate)
{
    Vector3 diffAbsPos = FollowSP.AbsolutePosition - m_scenePresence.AbsolutePosition;
    if (Math.Abs (diffAbsPos.X) > m_closeToPoint || Math.Abs (diffAbsPos.Y) > m_closeToPoint)
    {
        NavMesh mesh = new NavMesh ();
        bool fly = FollowSP.PhysicsActor == null ? ShouldFly : FollowSP.PhysicsActor.Flying;
        mesh.AddEdge (0, 1, fly ? TravelMode.Fly : TravelMode.Walk);
        mesh.AddNode (m_scenePresence.AbsolutePosition); //Give it the current pos so that it will know where to start
        mesh.AddEdge (1, 2, fly ? TravelMode.Fly : TravelMode.Walk);
        mesh.AddNode (FollowSP.AbsolutePosition); //Give it the new point so that it will head toward it
        SetPath (mesh, 0, false, 10000, false); //Set and go
    }
    else //distance is less than the follow range.
         //in here (the else block) we need to check if we are to attack
         //(if the bot is hostile we attack)
         //otherwise we just stop the bot
    {
        //if the bot is hostile we attack
        //attack code goes here
        //else
        //Stop the bot then
        State = RexBotState.Idle;
        m_walkTime.Stop ();
        m_startTime.Stop ();
    }
    //Reset the time
    CurrentFollowTimeBeforeUpdate = -1;
}
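A sketch of how that else block could be filled in, using an assumed m_belligerence flag and placeholder attack/damage helpers (not committed code, just what I have in mind):
else // distance is less than the follow range
{
    if (m_belligerence == Belligerence.Hostile)
    {
        // Attack: play the attack anim and hurt the followed avatar.
        // ApplyDamage() is a placeholder for whatever health/damage hook gets used.
        m_scenePresence.Animator.AddAnimation(m_attackAnimID, UUID.Zero);
        ApplyDamage(FollowSP, 10);
    }
    else
    {
        // Not hostile: just stop the bot, as the original code does.
        State = RexBotState.Idle;
        m_walkTime.Stop();
        m_startTime.Stop();
    }
}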
Friday, 29 April 2011
The following is the code that will let you access rexbot code from the botAPI
I've been scratching my head trying to figure out how to access the RexBot code from inworld.
Obviously when things are stable and the game AI is built into the RexBot code it should run smoothly, but there will be a need for hooks into the bot itself from inworld so that I can test it. That leads to the need for me to be able to access the methods and properties of the RexBot code from within the botAPI. Basically they expose two different interfaces: RexBot is an IClientAPI and the botAPI doesn't have IClientAPI.
Luckily, however, we can get access to the RexBot code through the botmanager code. What I had to do was add a function to the botmanager code which pulls the appropriate RexBot out of the internal private m_bots collection of bots. From there I can get access to the public members.
Which is exactly what I need to do in order to be able to easily test. So *now* I believe I'm well on the way to being able to make the changes I want to make.
public void botSetState(string bot, string State)
{
    //this is a test case of being able to get access to a property
    //of the actual rexbot itself through the botAPI
    IBotManager manager = World.RequestModuleInterface<IBotManager>();
    if (manager != null)
    {
        RexBot rxbot;
        rxbot = (RexBot)manager.GetBot(UUID.Parse(bot));
        //follow up with this
        switch (State.ToLower())
        {
            case "walking":
                rxbot.State = RexBot.RexBotState.Walking;
                break;
            case "idle":
                rxbot.State = RexBot.RexBotState.Idle;
                break;
            case "flying":
                rxbot.State = RexBot.RexBotState.Flying;
                break;
            default:
                rxbot.State = RexBot.RexBotState.Unknown;
                break;
        }
    }
}
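The BotManager side of that is tiny; something along these lines (a sketch of the helper described above, assuming m_bots is a dictionary keyed by the bot's UUID):
// Inside BotManager: expose a single bot from the private m_bots collection
// so the API layer can reach the RexBot's public members.
public IClientAPI GetBot(UUID botID)
{
    RexBot bot;
    if (m_bots.TryGetValue(botID, out bot))
        return bot;
    return null;
}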
Thursday, 28 April 2011
Code to animate the avatar
Here's a rebased version of Christy Lock's code which will animate the bot:
Add the following code to the Bot_API.cs in the Botmanager module:
public void botAnimate(string bot, string AnimationUUID)
{
    m_host.ParentEntity.Scene.ForEachScenePresence(delegate(IScenePresence sp)
    {
        // this should be the bot id
        if (sp.UUID == UUID.Parse(bot))
        {
            sp.Animator.AddAnimation(UUID.Parse(AnimationUUID), UUID.Zero);
        }
    });
}
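A matching stop call would be the mirror image, using RemoveAnimation instead (a sketch; botStopAnimate isn't in the Bot_API yet):
public void botStopAnimate(string bot, string AnimationUUID)
{
    m_host.ParentEntity.Scene.ForEachScenePresence(delegate(IScenePresence sp)
    {
        // this should be the bot id
        if (sp.UUID == UUID.Parse(bot))
        {
            sp.Animator.RemoveAnimation(UUID.Parse(AnimationUUID));
        }
    });
}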
Code to determine the distance between NPC and human's avatar
One of the code snippets I need is one that determines the distance between the NPC and the human avatar, in order to decide whether I should be running the animation code or not. There's already existing code by Christy Lock that does this for her A* pathfinding class. This is a rebased version of it which should hopefully do the trick:
Vector3 diffAbsPos = FollowedSP.AbsolutePosition - m_botscenePresence.AbsolutePosition;
if (Math.Abs (diffAbsPos.X) < m_closeToPoint || Math.Abs (diffAbsPos.Y) < m_closeToPoint)
{
    // run the attack code here which includes animation and reducing the health points of
    // the attacked human, possibly pushing et cetera depending on the force applied
}
So what this does is get a vector between the scenepresence of the followed avatar (i.e. the human) and the scenepresence of the bot. Then it does some math which amounts to checking whether either the X or Y coordinate of the vector is less than m_closeToPoint (which is 1, i.e. one unit). Depending on the animation to be run, this could vary, but 1 unit is good for now.
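A slightly more robust variant would compare the true 3D distance rather than the individual axes, so a bot that is close on X but still far away on Y doesn't start swinging early (a sketch, same idea otherwise):
// Alternative: use the straight-line distance between the two scene presences.
float dist = Vector3.Distance(FollowedSP.AbsolutePosition, m_botscenePresence.AbsolutePosition);
if (dist < m_closeToPoint)
{
    // run the attack code here (animation, health reduction, etc.)
}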
On decision trees and goal oriented planning
I like to do a lot of research before I actually start coding, so I read up on the latest in AI gaming theory and it turns out that the most lifelike NPCs utilize something called goal-oriented planning. I kind of know what this is: it's a sequence of steps to get to a goal, i.e. a decision tree.
Where it differs from a decision tree is that a decision tree is typically a hard-coded best estimate of the route to get from where you are now to a planned goal (whether it's a physical location or a state or an action or whatever).
But if we switch it up a little bit and allow sub-goals to compete for the fastest path to the goal, and update the running sub-goals and the various choices of paths to the goal as we go, then we have something more interesting, and this is in fact the goal-oriented planning situation I'm reading about.
What's interesting about this is that it effectively means that pathfinding and goal-oriented planning are exactly the same thing. In other words, A Star (or A*). So I need to dig into A* a little more.
Interestingly, Christy Lock has been bending my ear about A* for pathfinding already, and she has already coded some A* into the code, so maybe it could be switched up a little to handle pathfinding to a goal.
But that's for later. For now I'm going to stick to a simplified attack decision tree for the first step.
In any case, here is pseudocode (from Wikipedia) for the A* algorithm, in case anyone is interested.
function A*(start, goal)
    closedset := the empty set                  // The set of nodes already evaluated.
    openset := set containing the initial node  // The set of tentative nodes to be evaluated.
    came_from := the empty map                  // The map of navigated nodes.
    g_score[start] := 0                         // Cost from start along best known path.
    h_score[start] := heuristic_cost_estimate(start, goal)
    f_score[start] := h_score[start]            // Estimated total cost from start to goal through y.
    while openset is not empty
        x := the node in openset having the lowest f_score[] value
        if x = goal
            return reconstruct_path(came_from, came_from[goal])
        remove x from openset
        add x to closedset
        foreach y in neighbor_nodes(x)
            if y in closedset
                continue
            tentative_g_score := g_score[x] + dist_between(x, y)
            if y not in openset
                add y to openset
                tentative_is_better := true
            else if tentative_g_score < g_score[y]
                tentative_is_better := true
            else
                tentative_is_better := false
            if tentative_is_better = true
                came_from[y] := x
                g_score[y] := tentative_g_score
                h_score[y] := heuristic_cost_estimate(y, goal)
                f_score[y] := g_score[y] + h_score[y]
    return failure

function reconstruct_path(came_from, current_node)
    if came_from[current_node] is set
        p = reconstruct_path(came_from, came_from[current_node])
        return (p + current_node)
    else
        return current_node
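For a region grid, the heuristic_cost_estimate and dist_between functions in that pseudocode can just be distances between positions; e.g. (a sketch, assuming each node carries a Vector3 position):
// Straight-line heuristic and step cost for A* over region positions.
// Using the real distance keeps the heuristic admissible (it never overestimates).
static float HeuristicCostEstimate(Vector3 node, Vector3 goal)
{
    return Vector3.Distance(node, goal);
}

static float DistBetween(Vector3 a, Vector3 b)
{
    return Vector3.Distance(a, b);
}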
K so Christy solved the animation problem
The Awesome Christy Lock solved the animation thing even while I was writing the last post. I'll post the code later on tonight for how to do it.
So for me the next thing is to implement a simple spawn/locate human/move closer/attack/defend/withdraw decision tree.
The code to get the position of the nearest avatar could be in the chat code, because I'm vaguely sure I saw a function that returns the location of the avatar who chatted the message.
If that code is generic, it should be useable to determine the distance to the avatar so the decision tree should look like this:
Spawn!
Is human near enough to me that I can start the attack animation?
Nope
Move closer
Is human near enough to me that I can start the attack animation?
Yup
Start the attack animation
etc
Alternatively I could just do this:
Spawn!
Is human near enough to me that I can start the attack animation?
Nope
Run the follow avatar code
Is human near enough to me that I can start the attack animation?
Yup
Start the attack animation
etc
Then after that it would hook up to some kind of code that determines the human's hit points and the damage he/she is taking, etc., or feeds it directly into the health monitor you see in damage-enabled areas when using Imprudence.
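Sketched as code, that second decision tree is just a loop around the existing follow code (placeholder helper names, nothing here is real Aurora API):
// Sketch of the simplified decision tree above.
private void ZombieThink(IScenePresence human)
{
    if (!WithinAttackRange(human))
    {
        FollowAvatar(human);      // "Run the follow avatar code"
        return;                   // check the range again on the next tick
    }

    PlayAttackAnimation();        // "Start the attack animation"
    ApplyDamage(human, 10);       // later feeds the health monitor / hit points code
}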
Animations: k so that's interesting
So that's interesting.
I've dug through the code of osAvatarPlayAnimation and it ultimately leads back to SendAnimation, which is part of the IClientAPI interface implemented by RexBot.cs and the old GenericNPCCharacter.cs.
But there's nothing in it, i.e. no implementation.
And yet I know that osAvatarPlayAnimation works, because I've already called it from a script inworld on an Aurora Sim.
So I guess I have to try it again but leave a breakpoint in the code to see where it goes. Maybe then I can cut and paste the relevant sections of the code into the bot so we don't need to set the threat level to high and call it inworld, but instead have it run on the server code.
Coding up animations
One of the things it will be useful for an NPC to do is run animations.
Consider:
A human player walks down a street past a bot-spawning prim. A script in this prim detects the human player and boots up the decision tree which cascades through the criteria until the decision is taken to spawn an NPC in an angry hostile state with the goal of attacking the human interloper.
So the bot spawns, moves forward into range and then needs to attack. One of the things it should do when attacking (in order to make the experience enjoyable for the human player) is play a reasonable animation which is relevant to the attack.
So... it's necessary to code in some kind of animations to the bot. One of the things we'll need to do is make sure that it's only bots that can be animated or else the potential for griefing would be astronomical. We obviously don't want the ability to script any other player to perform a random animation without their permission, so we need a permissions check built in to make sure that it is in fact a bot and not a human player. That's easy enough to do though.
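The check itself can be as simple as asking the bot manager whether the target UUID is one of ours before doing anything (a sketch, reusing the IBotManager/GetBot lookup from the botSetState post above):
// Sketch: only animate UUIDs that the bot manager knows about, so a script
// can never play animations on a real player's avatar.
private bool IsBot(UUID targetID)
{
    IBotManager manager = World.RequestModuleInterface<IBotManager>();
    return manager != null && manager.GetBot(targetID) != null;
}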
So the next piece is the actual animation code. Currently that's sitting in OSSL_API.cs, which I think is not the right place for the bot code, because to run it you need the threat level set to VeryHigh. Animating a bot is not a high-threat operation, even though animating another human character against their will might be. Any case, I'm going to see if I can't hack up something from that code and put it into the bot code. The following are the two relevant functions, osAvatarPlayAnimation and osAvatarStopAnimation:
public void osAvatarPlayAnimation(string avatar, string animation)
{
    ScriptProtection.CheckThreatLevel(ThreatLevel.VeryHigh, "osAvatarPlayAnimation", m_host, "OSSL");
    UUID avatarID = (UUID)avatar;
    if (World.Entities.ContainsKey((UUID)avatar) && World.Entities[avatarID] is ScenePresence)
    {
        ScenePresence target = (ScenePresence)World.Entities[avatarID];
        if (target != null)
        {
            UUID animID = UUID.Zero;
            lock (m_host.TaskInventory)
            {
                foreach (KeyValuePair<UUID, TaskInventoryItem> inv in m_host.TaskInventory)
                {
                    if (inv.Value.Name == animation)
                    {
                        if (inv.Value.Type == (int)AssetType.Animation)
                            animID = inv.Value.AssetID;
                        continue;
                    }
                }
            }
            if (animID == UUID.Zero)
                target.Animator.AddAnimation(animation, m_host.UUID);
            else
                target.Animator.AddAnimation(animID, m_host.UUID);
        }
    }
}
public void osAvatarStopAnimation(string avatar, string animation)
{
    ScriptProtection.CheckThreatLevel(ThreatLevel.VeryHigh, "osAvatarStopAnimation", m_host, "OSSL");
    UUID avatarID = (UUID)avatar;
    if (World.Entities.ContainsKey(avatarID) && World.Entities[avatarID] is ScenePresence)
    {
        ScenePresence target = (ScenePresence)World.Entities[avatarID];
        if (target != null)
        {
            UUID animID = UUID.Zero;
            lock (m_host.TaskInventory)
            {
                foreach (KeyValuePair<UUID, TaskInventoryItem> inv in m_host.TaskInventory)
                {
                    if (inv.Value.Name == animation)
                    {
                        if (inv.Value.Type == (int)AssetType.Animation)
                            animID = inv.Value.AssetID;
                        continue;
                    }
                }
            }
            if (animID == UUID.Zero)
                target.Animator.RemoveAnimation(animation);
            else
                target.Animator.RemoveAnimation(animID);
        }
    }
}
Wednesday, 27 April 2011
K so here's the AIML stuff as promised
Do this.
Go get the AIMLBot source here: http://ntoll.org/file_download/18
Next, unzip the contents and open the solution with Visual C# 2008. It will ask you to convert it to a Visual C# 2008 solution. Do it.
Next, go take a look inside the folder where you extracted the AIMLBot solution. There will be a file called "AIML.zip". Extract this file into a folder in the aurora bin directory called aiml.
i.e. aurora\bin\aiml
You will have to create the aiml folder first of course.
Next step is to open up the aurora solution and select the main solution right at the top. Right-click this main solution and choose "Add" > "Existing Project", then browse to the AIMLBot C# 2008 project you converted earlier and add it to the solution.
Next find the subproject in the aurora solution called Aurora.botmanager and right click it and choose "Add Reference". On the Add Reference window choose the "projects" tab (should be the third tab at the top) and find and select the AIMLBot project you just previously added as a new project to the main aurora solution.
Next open the C# source file RexBot.cs and add the line "using AIMLBot;" underneath the line which says using System.IO; right at the top of the file after the copyright stuff.
At this stage we are ready to rock and roll and you have successfully added AIMLBot to the project. Now we need to enable it. Do the following steps:
Just before the first C# property in RexBot.cs "public RexBotState State", add a declaration for the AIMLBot: add this line
private cBot myBot;
Next you have to actually instantiate the bot. So find the following constructor function:
// creates new bot on the default location
public RexBot(Scene scene, AgentCircuitData data)
Right at the end of this constructor function after "UniqueId++;", add the following line:
myBot = new cBot(false);
Now we have the bot instantiated. So we just need to listen for it now.
So find the function SendChatMessage which says this:
"public void SendChatMessage (string message, byte type, Vector3 fromPos, string fromName, UUID fromAgentID, byte source, byte audible)"
Add the following lines of code in the function:
String fromNameLower = fromName.ToLower();
String firstNameLower = m_firstName.ToLower();
String lastNameLower = m_lastName.ToLower();
String NameLower = firstNameLower + " " + lastNameLower;
if (fromNameLower.Contains(NameLower) || message.Length==0 || source!=(byte)ChatSourceType.Agent )
{
return;
}
cResponse reply = myBot.chat(message, NameLower);
this.SendChatMessage(1, reply.getOutput());
Your bots are now self aware. Long live Skynet!
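Pulled together, the RexBot.cs side of all of the above looks roughly like this (a condensed sketch of the steps, not a drop-in patch; cBot and cResponse are the classes from the AIMLBot download):
using AIMLBot;

// Field declared next to the other RexBot members:
private cBot myBot;

// ...and instantiated at the end of the RexBot constructor, after UniqueId++:
// myBot = new cBot(false);

public void SendChatMessage(string message, byte type, Vector3 fromPos,
    string fromName, UUID fromAgentID, byte source, byte audible)
{
    string nameLower = (m_firstName + " " + m_lastName).ToLower();

    // Ignore our own chat, empty messages and anything not said by a real agent,
    // otherwise the bot talks itself into an infinite loop.
    if (fromName.ToLower().Contains(nameLower) || message.Length == 0 ||
        source != (byte)ChatSourceType.Agent)
        return;

    cResponse reply = myBot.chat(message, nameLower);
    this.SendChatMessage(1, reply.getOutput());
}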
AIML: Success!
So I managed to get it going. It's a complete hack and I don't have the code handy right now (I'll post it later tonight) but the gist of it is this:
I tracked down an older and simpler copy of the AIMLBot C# source which was written against the really, really old (in computing terms) .NET 1.0 library. I figured that, given it was so ancient, the C# 2008 compiler should be able to upgrade it easily, since there have been a lot of years in between.
I was right. So I downloaded the code, made it into a C# 2008 project and added said project to the aurora sim codebase. Then I added a reference in the aurora.botmanager subproject to the AIMLBot project and added a line of using AIMLBot; to the namespace references at the top.
Then I was able to access the functions in AIMLBot so it was then a case of looking for "what has the user said to the bot?" and "what is the bot's response to what the user has said?" types of calls in the AIMLBot code.
Luckily there was a fairly straightforward piece of sample code: a simple form where you typed into a text box and the bot responded in another text box with its answer. So I cut and pasted that code into the right places in the RexBot.cs class and ran it.
No joy. It still crashed out because the bot decided to talk itself into an infinite loop and crashed the stack.
Luckily I was able to track down the venerable Latif Khalifa, author of the awesome piece of client-side software, Radegast. I knew he had basically solved the infinite feedback loop because the AIMLBot code is already in Radegast; I just couldn't figure out the hooks.
He pointed me to the one line of code in radegast and I just translated that into the equivalent in the rexbot.
Basically it's a check to make sure that it's a human speaking, that the speaker doesn't have the same name as the bot, and that the message isn't empty. That basically takes care of all the crappy messages which send the bot into an infinite tailspin.
Result was I got a server side bot that talks. Suh-weet.
It still has a glitch though in that the bot refers to *everyone* as "unknown user".
But that's positively a minor issue for now which I'll work on later.
Later tonight I'll post the code on here because I'm too dumb to figure out how to commit to github without overwriting my own changes LOL.
Saturday, 23 April 2011
So a small piece of progress
Trying to hunt down where/how the bot code listens for chat messages.
In the Bot_API.cs code there is a botSendChatMessage function which you can use to send messages from the bot. What I wanted was the event to intercept whereby chat messages are being actively listened for by the bot. Obviously the botSendChatMessage isn't it since this is a function you call rather than a function that responds to chat events.
So where is it?
I couldn't find it and had to ask the guys. Rev Smythe (genius) told me it was in the EventManager code so all I had to do was put a breakpoint there, type a chat message in and watch what happens. So the event fires when someone chats. It goes and gets a list of scenepresences and then fires a SendChatMessage function to each of the individual scenepresences.
Our particular scenepresence (i.e. the bot) was indeed found and it called RexBot.cs's implementation of SendChatMessage (which is empty....) so perhaps something could go there.
But wait (I'm typing this as I go through the code LOL) there's something else.... in the RexBot.cs code is an event called OnBotChatFromViewer.
Hmmmm I wonder if that's what I'm looking for.
Hmmm no. It didn't get called. The only code that does get called is RexBot.cs SendChatMessage. OK so that's where I'm going to have to intercept the chats and send them to pandorabots...
Friday, 22 April 2011
AIML: Progress of a sort
So I played around with the Aurora NPC code trying to implement the AIML code. I didn't really come to any firm conclusions and have more questions than answers or results, so this post is more of a ramble than anything else; in other words, I'm collecting my thoughts and dumping them out as an unparsed stream here...
Some observations are that Ken's code (found here): http://kennethrougeau.com/geekery/opensim-server-side-npc-experiments-day-three/
is actually client-side code which hooks up with the website pandorabots. Ken explains it better than I do on that page, but the gist of it is you sign up for a pandorabot and get an ID for it (which is generated by the site); then you send the text you chatted at the NPC bot to the web site via an LSL HTTP POST request and get a response back, which you then parse, calling the NPC say command with the result. Though it's functional, the drawback is you need a prim for each bot with a listening script in it. Don't like that too much to be honest.
The way Radegast does it is much cleaner: the Radegast bot itself listens and then calls the AIMLBot code which is on SourceForge. Problem is: it's *seriously* complicated. I was looking for something which would be basically *call this*, *send this back*. Unfortunately I couldn't find anything like that on the first hack attempt. Basically my head exploded. So the interim step is to get Ken's LSL client code working with the aurora bots for now, kind of as a proof of concept. Then I'll try to dig deeper into Radegast or else ask Khalifa. I believe I've seen his sig on the aurora-dev channel, so who knows, he might answer enough to be able to push it ahead.
Unanswered questions: if I were to implement some kind of mash-up whereby I get the Radegast listen code to work, but instead of implementing an AIML parse I code up some HTTP requests and responses server-side, then as far as I know those are synchronous calls. So if I send an HTTP POST request up to a website, will the region code just hang there until the response comes back? That would be kind of ugly. There ought to be an asynchronous call, but unfortunately my .net coding skillz aren't up to speed on that kind of thing, so it might not be the best. In an ideal world, the AIMLBot guy would explain how to parse the AIML so I didn't need to futz around with HTTP requests etc. and could just use AIML directly. On the other hand, I could do a metric ton of reading about how AIML works and just code something up myself....
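On the synchronous-call worry: .NET does have an asynchronous pattern for exactly this, so the region thread wouldn't have to block while pandorabots answers. Something like this (a sketch using the stock HttpWebRequest API, not wired into the bot code, and with the POST body omitted):
// Sketch: fire the request and handle the reply on a callback thread,
// so the calling (region) thread never blocks waiting on the web site.
static void AskBotService(string url, Action<string> onReply)
{
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
    req.Method = "POST";
    req.ContentLength = 0; // a real call would write the form-encoded question here

    req.BeginGetResponse(delegate(IAsyncResult ar)
    {
        using (WebResponse resp = req.EndGetResponse(ar))
        using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
        {
            onReply(reader.ReadToEnd()); // hand the raw reply back to the caller
        }
    }, null);
}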
Anyways. Onwards....
Back on the case
So got the code compiled, a dev environment set up.
Test scenario worked: I can create bots etc. Now to hack the code.
First thing I'm going to do is see if I can't hack Ken's AIML client side code into the bot API...
Friday, 8 April 2011
Not done too much recently but will resume in about 10 days
After the initial burst of activity I've slowed down a little due to RL stuff, and should get back up to speed in about 10 days, when I will have a high-end Windows box to play with. At that point I should be able to run a local version of aurora inside Visual Studio and run the Imprudence viewer at the same time, whereas that currently doesn't work inside my virtual machine on my Mac.
It will be *very* entertaining to finally be able to hack some decent gamer AI into the bot code.
Wednesday, 30 March 2011
LLDialog NPC HUD controller work-in-progress
K so the code doesn't work yet because it's building the button list straight from the UUIDs, and you can't have more than 24 characters on a button (which at least indicates it's working for the most part). What we should be doing is keeping two separate lists associated with each other by index, one for botName and one for botUUID: that way we can choose the bot by name, and the index gives us the corresponding UUID (rough sketch of that after the script below). Anyways, I'll work on it another time since it's time for bed....
list lstBotList;
list main_menu = ["Blue", "Red"];
list blue_menu = ["Blue Stuff 1", "Blue Stuff 2", "Back"];
list red_menu = ["Red Stuff 1", "Red Stuff 2", "Back"];
integer submenu = FALSE;
integer listen_channel = 1;
default
{
state_entry()
{
llSetText("Touch to split parse the string", <1,0,0>, 1.0);
}
touch_end(integer num)
{
list a = osGetAvatarList();
integer i;
integer intCount= 1;
integer s = llGetListLength(a);
do
{
if (intCount == 4)
{
intCount = 1;
}
if (intCount == 1)
{
string strListItem = llList2String(a,i);
llSay(0,strListItem);
lstBotList = [strListItem] + lstBotList;
}
intCount++;
}
while(s>++i);
llSay(0, "Touched.");
llDialog(llDetectedKey(0),"Try a selection...", lstBotList, listen_channel);
state stReadyToGenerateLLDialog;
}
}
state stReadyToGenerateLLDialog
{
state_entry()
{
llListen(listen_channel,"",llGetOwner(),"");
//llSay(0, "Hello, Avatar!");
llSetText("Sample menu using LLDialog", <1,0,0>, 1.0);
}
listen(integer channel, string name, key id, string message)
{
if (submenu == FALSE)
{
// Use a main menu verification
if (message == "Blue")
{
llSay(0,"Thanks for picking " + message);
llDialog(id,message + " Dialog", blue_menu, listen_channel);
}
if (message == "Red")
{
llSay(0,"Thanks for picking " + message);
llDialog(id,message + " Dialog", red_menu, listen_channel);
}
submenu = TRUE;
llSetTimerEvent(20.0);
}
else
{
// Use a sub menu verification
llSetTimerEvent(20.0);
if (message == "Back")
{
llDialog(id,message + " Dialog", main_menu, listen_channel);
submenu = FALSE;
}
else
{
llSay(0,"You picked " + message);
//might want to verify which sub-menu was being used to redisplay here, etc
submenu = FALSE;
}
}
}
timer()
{
llSetTimerEvent(0.0);
llSay(0,"You waited too long to pick, resetting menu.");
submenu = FALSE;
}
}
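Here's the two-parallel-lists version I have in mind -- a rough sketch only, assuming osGetAvatarList() keeps returning strides of UUID, position, name and that the bots show up in it the same way they do in the script above:
// Rough sketch: button labels come from the name list, and the index of the
// picked name gives us the matching UUID from the second list.
list botNames;
list botUUIDs;
integer listen_channel = 1;
default
{
    state_entry()
    {
        llListen(listen_channel, "", llGetOwner(), "");
        llSetText("Touch to pick a bot by name", <1,0,0>, 1.0);
    }
    touch_start(integer num)
    {
        botNames = [];
        botUUIDs = [];
        list a = osGetAvatarList();
        integer i;
        integer len = llGetListLength(a);
        for (i = 0; i < len; i += 3) // stride of 3: UUID, position, name
        {
            botUUIDs += llList2String(a, i);
            // llDialog buttons max out at 24 characters, so trim the name
            botNames += llGetSubString(llList2String(a, i + 2), 0, 23);
        }
        // llDialog also tops out at 12 buttons, so this only works for small bot counts
        if (llGetListLength(botNames) > 0 && llGetListLength(botNames) <= 12)
            llDialog(llDetectedKey(0), "Pick a bot...", botNames, listen_channel);
    }
    listen(integer channel, string name, key id, string message)
    {
        integer idx = llListFindList(botNames, [message]);
        if (idx != -1)
        {
            string uuid = llList2String(botUUIDs, idx);
            llSay(0, "You picked " + message + " (" + uuid + ")");
            // ...from here hand uuid to botSetMap / botRemoveBot / whatever
        }
    }
}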
Tuesday, 29 March 2011
There's an error in the osAvatarPlayAnimation code
So the deal is: you dump an animation into the prim along with the script, the script contains the UUID of the bot, and then you touch the prim and the bot plays the animation. Unfortunately it doesn't work, because there's a permissions error (trace below, and a config sketch after it).
[09:39 PM] Object: Error script: System.Exception: Runtime Error: osAvatarPlayAnimation permission denied. Prim owner is not in the list of users allowed to execute this function.
at Aurora.ScriptEngine.AuroraDotNetEngine.ScriptProtectionModule.Error(String surMessage, String msg) in d:\works37\Aurora\AuroraDotNetEngine\ScriptProtectionModule.cs:line 178
at Aurora.ScriptEngine.AuroraDotNetEngine.ScriptProtectionModule.CheckThreatLevel(ThreatLevel level, String function, ISceneChildEntity m_host, String API) in d:\works37\Aurora\AuroraDotNetEngine\ScriptProtectionModule.cs:line 169
at Aurora.ScriptEngine.AuroraDotNetEngine.APIs.OSSL_Api.osAvatarPlayAnimation(String avatar, String animation) in d:\works37\Aurora\AuroraDotNetEngine\APIs\OSSL_Api.cs:line 797
at Script.ScriptClass.d__2.MoveNext() in c:\temp\hegrd1sn.0.cs:line 36
at Aurora.ScriptEngine.AuroraDotNetEngine.Runtime.Executor.FireAsEnumerator(EnumeratorInfo Start, MethodInfo ev, Object[] args, Exceptio
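That trace is the OSSL threat-level/allow-list check firing, so the fix should be config rather than code. In vanilla OpenSim you whitelist the function in the ini along the lines below; I'm assuming Aurora's AuroraDotNetEngine has an equivalent allow-list, but I haven't verified its section or key names, so treat this purely as the OpenSim-style sketch:
; OpenSim-style OSSL whitelist (OpenSim.ini). Aurora's script engine config
; should have something equivalent -- these section/key names are the OpenSim ones.
[XEngine]
    AllowOSFunctions = true
    ; either open the function up completely...
    Allow_osAvatarPlayAnimation = true
    ; ...or restrict it to specific owner UUIDs (comma separated):
    ; Allow_osAvatarPlayAnimation = <your avatar UUID here>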
Sample Aurora NPC Scripts: Remove a bot from the sim
In order to use this one you need to know the UUID of the bot you want to nuke. If you're nuking a single bot in imprudence, right-click it, open its profile and use "get key", then copy the UUID in here and nuke it.
Another possibility would be to generate the list of UUIDs and step through them, nuking each one in turn (rough sketch of that after the script below).
default
{
state_entry()
{
llSetText("Zap bot by UUID", <1,0,0>, 1.0);
}
touch_start(integer a)
{
string bot = "c8269ef3-873d-4b68-a667-a2938ffc90af"; //llGetObjectDesc();
botRemoveBot(bot);
}
}
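And here's the "nuke them all" idea sketched out -- it assumes osGetAvatarList() returns strides of UUID, position, name and that botRemoveBot is happy being called in a loop. Careful: the list also contains real avatars (everyone but the owner), so in practice you'd want to filter by name or keep your own list of bot UUIDs.
default
{
    state_entry()
    {
        llSetText("Touch to nuke every bot in the region", <1,0,0>, 1.0);
    }
    touch_start(integer a)
    {
        list avatars = osGetAvatarList(); // strides of UUID, position, name
        integer i;
        integer len = llGetListLength(avatars);
        for (i = 0; i < len; i += 3)
        {
            string uuid = llList2String(avatars, i);
            string name = llList2String(avatars, i + 2);
            llSay(0, "Removing " + name + " (" + uuid + ")");
            botRemoveBot(uuid);
        }
    }
}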
Aurora NPC Sample Scripts: Create a menu
This one builds a menu from a list and then, depending on your selection, performs other actions from a list. It could be modified to accept a list of bots passed in from the generate-bots-list script and then choose actions for them to perform (a rough sketch of the action-dispatch half follows the script below).
list main_menu = ["Blue", "Red"];
list blue_menu = ["Blue Stuff 1", "Blue Stuff 2", "Back"];
list red_menu = ["Red Stuff 1", "Red Stuff 2", "Back"];
integer submenu = FALSE;
integer listen_channel = 1;
default
{
state_entry()
{
llListen(listen_channel,"",llGetOwner(),"");
//llSay(0, "Hello, Avatar!");
llSetText("Sample menu using LLDialog", <1,0,0>, 1.0);
}
touch_start(integer total_number)
{
llSay(0, "Touched.");
llDialog(llDetectedKey(0),"Try a selection...", main_menu, listen_channel);
}
listen(integer channel, string name, key id, string message)
{
if (submenu == FALSE)
{
// Use a main menu verification
if (message == "Blue")
{
llSay(0,"Thanks for picking " + message);
llDialog(id,message + " Dialog", blue_menu, listen_channel);
}
if (message == "Red")
{
llSay(0,"Thanks for picking " + message);
llDialog(id,message + " Dialog", red_menu, listen_channel);
}
submenu = TRUE;
llSetTimerEvent(20.0);
}
else
{
// Use a sub menu verification
llSetTimerEvent(20.0);
if (message == "Back")
{
llDialog(id,message + " Dialog", main_menu, listen_channel);
submenu = FALSE;
}
else
{
llSay(0,"You picked " + message);
//might want to verify which sub-menu was being used to redisplay here, etc
submenu = FALSE;
}
}
}
timer()
{
llSetTimerEvent(0.0);
llSay(0,"You waited too long to pick, resetting menu.");
submenu = FALSE;
}
}
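Here's a rough sketch of the action-dispatch half: the bot UUID is hard-coded like in the other samples, and the dialog just hands off to botSetMap / botFollowAvatar / botPause, which are the calls used elsewhere on this blog. The menu entries are made up.
string botID = "271419d4-a01d-4432-83f7-64cae13c86bf"; // placeholder: put your bot's UUID here
list action_menu = ["Patrol", "Follow me", "Stop"];
integer listen_channel = 2;
default
{
    state_entry()
    {
        llListen(listen_channel, "", llGetOwner(), "");
        llSetText("Touch to give the bot an order", <1,0,0>, 1.0);
    }
    touch_start(integer num)
    {
        llDialog(llDetectedKey(0), "What should the bot do?", action_menu, listen_channel);
    }
    listen(integer channel, string name, key id, string message)
    {
        if (message == "Patrol")
        {
            // Same walk-a-small-square pattern as the "make bot move" sample
            list positions = [llGetPos(), llGetPos() + <0, 2, 0>, llGetPos() + <2, 0, 0>];
            list types = [0, 0, 0]; // 0 = walk between targets
            botSetMap(botID, positions, types);
        }
        else if (message == "Follow me")
        {
            botFollowAvatar(botID, llGetOwner());
        }
        else if (message == "Stop")
        {
            botPause(botID);
        }
    }
}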
Sample Aurora NPC Scripts: Make bot move
This one will make the bot (whose UUID is specified in the string) move 2 units Yward and then 2 units Xward by walking.
default
{
state_entry()
{
llSetText("Make bot move", <1,0,0>, 1.0);
}
touch_start(integer a)
{
string botID = "271419d4-a01d-4432-83f7-64cae13c86bf";
//Now give it a list of positions to go around
list positions = [llGetPos(), llGetPos() + <0, 2, 0>, llGetPos() + <2, 0, 0>];
//Now tell it how it will get there
//0 - Walk to the next target
//1 - Fly to the next target
list types = [0,0,0];
//Now tell the bot what to do
botSetMap(botID, positions, types);
}
}
Sample Aurora NPC Scripts: Get a list of all bots in region
This one will generate a list of all the bots in the region along with locations.
This in theory could be used to parse out the bots and pass them to some kind of menu list from which you could choose a bot's actions (see the strided-list parsing sketch after the script below).
// ----------------------------------------------------------------
// Example / Sample Script to show function use.
//
// Script Title: osGetAgents.lsl
// Script Author: WSM
// Threat Level: None
// Script Source: SUPPLEMENTAL http://opensimulator.org/wiki/osGetAgents
//
// Notes: See Script Source reference for more detailed information
// This sample is full opensource and available to use as you see fit and desire.
// Threat Levels only apply to OSSL & AA Functions
// See http://opensimulator.org/wiki/Threat_level
//================================================================
// C# Source Line: public LSL_List osGetAgents()
// Inworld Script Line: list osGetAgents();
//
// Example of osGetAgents
//
//default
//{
// state_entry()
// {
// llSay(0, "Touch to get a List of Avatars on this Region using osGetAgents");
// }
// touch_start(integer num)
// {
// llSay(0, "The Avatars located here are: "+ llList2CSV(osGetAgents()));
// }
//}
default
{
state_entry()
{
llSetText("Touch to get a list of bots and UUIDs", <1,0,0>, 1.0);;
}
touch_start(integer total_number)
{
list avatars = osGetAvatarList(); //creates a Strided List (3 strides)
llSay(0, "UUID, Position, AvatarName, on this Region (without the owner):\n" + llList2CSV(avatars));
}
}
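The script above just CSV-dumps the strided list. To actually parse it, step through in threes and pull the pieces out with llList2String / llList2Vector -- a rough sketch, assuming the stride really is UUID, position, name as the comment says:
default
{
    state_entry()
    {
        llSetText("Touch to list avatars/bots with distance", <1,0,0>, 1.0);
    }
    touch_start(integer total_number)
    {
        list avatars = osGetAvatarList();
        integer i;
        integer len = llGetListLength(avatars);
        for (i = 0; i < len; i += 3)
        {
            string uuid = llList2String(avatars, i);
            vector pos = llList2Vector(avatars, i + 1);
            string name = llList2String(avatars, i + 2);
            llSay(0, name + " (" + uuid + ") is "
                + (string)llVecDist(pos, llGetPos()) + "m from this prim");
        }
    }
}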
Sample Useful scripts for Aurora NPCs: Followbot
This one will generate a bot using the UUID of the originating avatar. The generated bot will appear wearing the clothes, skin and attachments that avatar was wearing the last time it logged in. The bot will then follow the avatar whose UUID is given in toFollow.
string first = "Test";
string last = "Bot";
key userToDuplicate;
string botID;
string toFollow;
default
{
state_entry()
{
llSetText("Create followbot by UUID", <1,0,0>, 1.0);
//On startup, we'll generate a new bot, then make it move when we touch it
//Create the bot with the given first/last name and the user whose appearance it will duplicate
//userToDuplicate = llGetOwner();
userToDuplicate = "121419d4-a04d-4018-83f7-64cae13c86bf";
botID = botCreateBot(first, last, userToDuplicate);
//You can either put an avatar's name or UUID here
//botFollowAvatar(botID, llGetOwner());
toFollow = "f13f4fb8-035a-4bca-b9f5-553d5b773f86";
botFollowAvatar(botID,toFollow);
}
touch_start(integer a)
{
botRemoveBot(botID);
}
}
string first = "Test";
string last = "Bot";
key userToDuplicate;
string botID;
string toFollow;
default
{
state_entry()
{
llSetText("Create followbot by UUID", <1,0,0>, 1.0);
//On startup, we'll generate a new bot, then make it move when we touch it
//Create the bot with the given first/last name and the user whose appearance it will duplicate
//userToDuplicate = llGetOwner();
userToDuplicate = "121419d4-a04d-4018-83f7-64cae13c86bf";
botID = botCreateBot(first, last, userToDuplicate);
//You can either put an avatar's name or UUID here
//botFollowAvatar(botID, llGetOwner());
toFollow = "f13f4fb8-035a-4bca-b9f5-553d5b773f86";
botFollowAvatar(botID,toFollow);
}
touch_start(integer a)
{
botRemoveBot(botID);
}
}
Saturday, 26 March 2011
So what can the basic aurora bots do?
Lolz I'm having fun tonight. Rev's code works and I loaded up 3 bots.
How you do it is rezz a prim and dump the script from the previous post in. That rezzes the bot. Then there are chat commands you can give it.
!stop makes the bot stop what it's doing and not move any more.
!go forward makes the bot take a step forward
!go back makes the bot take a step backward
!go left makes the bot take a step to the left
!go right makes the bot take a step to the right
!fly makes the bot fly.
!teleport makes the bot teleport.
Obviously the code could be beefed up (I'm thinking of Ken Rougeau's AIML chatbot script for one). It would be *very* interesting to add in some decent gaming AI; then Aurora would be a pretty reasonable gaming platform, given how easy it is to build things and dress up/redesign the avatar.
Little glitches I've noticed: all the bots respond to the same command at the same time (a rough sketch of one workaround below). Also, the bots wear the last outfit their source avatar was wearing when someone last logged in as that user, BUT they *don't* wear attachments. That's definitely going to have to be looked into.
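One way round the "everyone obeys at once" glitch would be to address commands to a bot by name ("Test stop", "Test fly", ...) and have each bot's control script ignore chat that isn't aimed at it. A minimal sketch of just the listen part, assuming the script knows the firstname the bot was created with:
string botFirstName = "Test"; // whatever first name the bot was created with
default
{
    state_entry()
    {
        llListen(0, "", NULL_KEY, "");
    }
    listen(integer channel, string name, key id, string message)
    {
        string prefix = botFirstName + " ";
        if (llSubStringIndex(message, prefix) != 0)
            return; // not addressed to this bot, ignore it
        string command = llGetSubString(message, llStringLength(prefix), -1);
        llSay(0, botFirstName + " got command: " + command);
        // ...then branch on command and call botPause / botResume / botSetMap etc.
    }
}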
Anyways, good times.
Friday, 25 March 2011
This is the LSL script to initialize the Aurora NPC bots
Dump this LSL into a prim and then touch it.... it should create a bot in your own image.
string first = "Test";
string last = "Bot";
key userToDuplicate;
string botID;
default
{
state_entry()
{
//On startup, we'll generate a new bot, then make it move when we touch it
//Create the bot with the given first/last name and the user whose appearance it will duplicate
userToDuplicate = llGetOwner();
botID = botCreateBot(first, last, userToDuplicate);
llListen( 0, "", NULL_KEY, "" );
}
touch_start(integer number)
{
//Now give it a list of positions to go around
list positions = [llGetPos(), llGetPos() + <0, 20, 20>, llGetPos() + <20, 0, 20>];
//Now tell it how it will get there
//0 - Walk to the next target
//1 - Fly to the next target
list types = [1,1,1];
//Now tell the bot what to do
botSetMap(botID, positions, types);
}
listen( integer channel, string name, key id, string message )
{
if ( id == llGetOwner() )
{
if(message == "pause")
{
//This disables the bots movement, however, the bot will warp to its next location once the alloted time runs out for movement
botPause(botID);
}
if(message == "resume")
{
//This reenables movement for the bot and does not turn on the movement timer
botResume(botID);
}
if(message == "stop")
{
//This disables the bots movement, as well as the auto warp that will occur if the bot does not get to its position in the alloted period of time
botStop(botID);
}
if(message == "start")
{
//This reenables movement for the bot and does turn on the movement timer
botStart(botID);
}
}
}
}
string first = "Test";
string last = "Bot";
key userToDuplicate;
string botID;
default
{
state_entry()
{
//On startup, we'll generate a new bot, then make it move when we touch it
//Create the bot with the given first/last name and the user whose appearance it will duplicate
userToDuplicate = llGetOwner();
botID = botCreateBot(first, last, userToDuplicate);
llListen( 0, "", NULL_KEY, "" );
}
touch_start(integer number)
{
//Now give it a list of positions to go around
list positions = [llGetPos(), llGetPos() + <0, 20, 20>, llGetPos() + <20, 0, 20>];
//Now tell it how it will get there
//0 - Walk to the next target
//1 - Fly to the next target
list types = [1,1,1];
//Now tell the bot what to do
botSetMap(botID, positions, types);
}
listen( integer channel, string name, key id, string message )
{
if ( id == llGetOwner() )
{
if(message == "pause")
{
//This disables the bots movement, however, the bot will warp to its next location once the alloted time runs out for movement
botPause(botID);
}
if(message == "resume")
{
//This reenables movement for the bot and does not turn on the movement timer
botResume(botID);
}
if(message == "stop")
{
//This disables the bots movement, as well as the auto warp that will occur if the bot does not get to its position in the alloted period of time
botStop(botID);
}
if(message == "start")
{
//This reenables movement for the bot and does turn on the movement timer
botStart(botID);
}
}
}
}
Rev may have fixed the cloud rezzing problem at least in Aurora
https://github.com/aurora-sim/Aurora-Sim/compare/e09d77b...04fef5c
I'm gonna see if I can't get that hacked into my branch of aurora.
Hopefully if this code works, it will find its way into the master branch of aurora soon.
Maybe NPCs have been picked up again by the Aurora guys
Just hanging out with the Aurora guys for a bit, listening in on their conversation, and it turns out they've been playing with the NPC code over the last little while, same as me. Rev Smythe is doing some nice work. Seems there is an exception when fetching the appearance, which is related to XML.
Digging around I found some forum posts confirming the bake theory:
Bots largely depend on viewer compositing: they send the individual clothing layers, but no baked ones, and the viewing user's viewer composites them into a viewable avatar.
Starting from 1.23, the LL viewer no longer sends, or expects, compositing layers. Avatars no longer carry texture information about their individual clothing items. This is to combat theft; avatars now carry only the 3 baked textures. Therefore, Copybot can no longer rip clothing from an avatar it sees.
Because of this, the "Cloud" stage is perceived to be longer for 1.23 users, as avatars don't render until the bakes have been pushed to the server and then downloaded by the other clients. An avatar counts as unloaded if its visual params are the default ones OR it has no baked textures. Previously, it would count as unloaded only if it had no textures at all, or all default visual params.
Since bots don't bake, they will never render for users of 1.23, since it neither creates nor expects compositing layers.
The following are the minimum textures that need to be baked for an avatar to be considered "complete":
TEX_HEAD_BAKED
TEX_UPPER_BAKED
TEX_LOWER_BAKED
TEX_EYES_BAKED
TEX_HAIR_BAKED*
This is the radegast bake code
The following is the radegast bake code. Note that this is not all the code you need, because it calls out to Baker.Bake(), Baker.AddTexture() et cetera, which live elsewhere.
That said, here it is for posterity:
/// <summary>
/// Blocking method to create and upload baked textures for all of the
/// missing bakes
/// </summary>
/// <returns>True on success, otherwise false</returns>
private bool CreateBakes()
{
    bool success = true;
    List<BakeType> pendingBakes = new List<BakeType>();

    // Check each bake layer in the Textures array for missing bakes
    for (int bakedIndex = 0; bakedIndex < BAKED_TEXTURE_COUNT; bakedIndex++)
    {
        AvatarTextureIndex textureIndex = BakeTypeToAgentTextureIndex((BakeType)bakedIndex);
        if (Textures[(int)textureIndex].TextureID == UUID.Zero)
        {
            // If this is the skirt layer and we're not wearing a skirt then skip it
            if (bakedIndex == (int)BakeType.Skirt && !Wearables.ContainsKey(WearableType.Skirt))
                continue;
            pendingBakes.Add((BakeType)bakedIndex);
        }
    }

    if (pendingBakes.Count > 0)
    {
        DownloadTextures(pendingBakes);
        Parallel.ForEach<BakeType>(Math.Min(MAX_CONCURRENT_UPLOADS, pendingBakes.Count), pendingBakes,
            delegate(BakeType bakeType)
            {
                if (!CreateBake(bakeType))
                    success = false;
            }
        );
    }

    // Free up all the textures we're holding on to
    for (int i = 0; i < Textures.Length; i++)
    {
        Textures[i].Texture = null;
    }

    // We just allocated and freed a ridiculous amount of memory while
    // baking. Signal to the GC to clean up
    GC.Collect();
    return success;
}

/// <summary>
/// Blocking method to create and upload a baked texture for a single
/// bake layer
/// </summary>
/// <param name="bakeType">Layer to bake</param>
/// <returns>True on success, otherwise false</returns>
private bool CreateBake(BakeType bakeType)
{
    List<AvatarTextureIndex> textureIndices = BakeTypeToTextures(bakeType);
    Baker oven = new Baker(bakeType);

    for (int i = 0; i < textureIndices.Count; i++)
    {
        AvatarTextureIndex textureIndex = textureIndices[i];
        TextureData texture = Textures[(int)textureIndex];
        texture.TextureIndex = textureIndex;
        oven.AddTexture(texture);
    }

    int start = Environment.TickCount;
    oven.Bake();
    Logger.DebugLog("Baking " + bakeType + " took " + (Environment.TickCount - start) + "ms");

    UUID newAssetID = UUID.Zero;
    int retries = UPLOAD_RETRIES;
    while (newAssetID == UUID.Zero && retries > 0)
    {
        newAssetID = UploadBake(oven.BakedTexture.AssetData);
        --retries;
    }
    Textures[(int)BakeTypeToAgentTextureIndex(bakeType)].TextureID = newAssetID;

    if (newAssetID == UUID.Zero)
    {
        Logger.Log("Failed uploading bake " + bakeType, Helpers.LogLevel.Warning);
        return false;
    }
    return true;
}

/// <summary>
/// Blocking method to upload a baked texture
/// </summary>
/// <param name="textureData">Five channel JPEG2000 texture data to upload</param>
/// <returns>UUID of the newly created asset on success, otherwise UUID.Zero</returns>
private UUID UploadBake(byte[] textureData)
{
    UUID bakeID = UUID.Zero;
    AutoResetEvent uploadEvent = new AutoResetEvent(false);

    Client.Assets.RequestUploadBakedTexture(textureData,
        delegate(UUID newAssetID)
        {
            bakeID = newAssetID;
            uploadEvent.Set();
        }
    );

    // FIXME: evalute the need for timeout here, RequestUploadBakedTexture() will
    // timout either on Client.Settings.TRANSFER_TIMEOUT or Client.Settings.CAPS_TIMEOUT
    // depending on which upload method is used.
    uploadEvent.WaitOne(UPLOAD_TIMEOUT, false);
    return bakeID;
}
Hunting down the Bake code.
So after a bit of sweat and tears I tracked down the client side bake code.
I suspect the reason the NPCs stay as clouds (in addition to all the other properties I've not set right, e.g. in the AgentCircuitData) is that, due to the convoluted way baked textures get passed around, the NPC code is missing the bake step.
The way it appears to work, in order to display an avatar to the rest of the connected clients, is this: the Scene gets a request to create an avatar as a ScenePresence, so it sends a request for baked textures to the client, because the baking isn't done on the server. (Was it done on the server before? i.e. why did it work in 0.6.9? Need to investigate how 0.6.9 did it.)
Anyways, the way it works *now* is that the client receives the textures from the server, or picks them up from its cache, and then runs the baking code, effectively compressing all the different jpeg (or whatever) layers into a single image. This "baked" image is then sent back up to the Scene on the server. This is done for each section of the avatar's body (i.e. head, upper body and lower body). There are also alpha layers and whatnot, but I don't fully understand how those work.
Once the Scene has received the appropriate baked textures it somehow allocates them as temporary assets and associates them with the agent or with the ScenePresence(?) - a little unclear on the *exact* mechanics, but more or less that's it.
Next it sends the baked textures down to all the other ScenePresences in the scene. What that means is that only your own client has the real textures for your avatar; everybody else only gets the baked textures, which makes it difficult for other viewers to rip your content out of the scene.
Anyways, the corresponding bake code is in libopenmetaverse. I can't find any bake code like that in imprudence - it has to be there, but I can't find it. In any case, in an ideal world we want C# rather than having to convert from C++ to C#, so libopenmetaverse is probably the best choice.
The code module is libopenmetaverse/trunk/openmetaverse/imaging/bakelayer.cs
How this code is used can be found in the radegast code: look for calls like oven.Bake and oven.AddTexture to locate it (appearancemanager.cs in the radegast trunk). So in theory that's all the code you need to get the baking implemented (a very rough sketch of how it might be driven follows the usings list below).
I was hoping there was an easier way to do it than this, but if not well at least I know where to look for the relevant code.
Also: opensim does use some libraries from openmetaverse, but I'm unclear on whether the package required to do the baking, OpenMetaverse.Imaging, is included or not.
If it's not, note that it has a further dependency on OpenMetaverse.Assets.
The following are the required "usings" in the bakelayer.cs code:
using System;
using System.Collections.Generic;
using System.IO;
using System.Drawing;
using OpenMetaverse.Assets;
namespace OpenMetaverse.Imaging
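While I'm at it, here's the kind of thing I imagine the server-side call ending up as -- a very rough sketch based purely on the calls visible in the radegast snippet above (new Baker(bakeType), AddTexture, Bake, BakedTexture). TextureData here is the AppearanceManager.TextureData struct from that snippet, and I'm assuming BakedTexture comes back as an OpenMetaverse.Assets.AssetTexture. Actually populating those TextureData entries from the asset service is the part I haven't figured out, so this just takes them as a parameter:
// Very rough sketch of driving the libopenmetaverse Baker the way radegast does,
// but from server-side code. Only calls visible in the snippet above are used.
using System.Collections.Generic;
using OpenMetaverse;
using OpenMetaverse.Assets;
using OpenMetaverse.Imaging;

public class NpcBakeSketch
{
    public AssetTexture BakeLayer(BakeType bakeType, List<AppearanceManager.TextureData> layers)
    {
        Baker oven = new Baker(bakeType);

        // Feed each clothing/skin layer for this body section into the oven
        foreach (AppearanceManager.TextureData layer in layers)
            oven.AddTexture(layer);

        oven.Bake();

        // The composited result: this is what would get uploaded as the NPC's
        // baked texture asset and stuffed into its TextureEntry
        return oven.BakedTexture;
    }
}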
So hacking the way through the TODO list for today
I couldn't easily find the code for [SCENE] Incoming client, which appears to be Scene.VerifyClient, because I don't have my laptop here, but I did find the code for LindenUDP.LLUDPServer.HandleUseCircuitCode.
Now this code is possibly not exactly what I have on my test box at home, and I'm looking at both vanilla opensim and aurora, so it's not a 100% match, but it should give some idea of what's going on. Anyways, let's dig through the code:
2.1 Looking at HandleUseCircuitCode(object o) (follows). You will notice that this calls AddNewClient with a UseCircuitCode packet and a remote endpoint, so these are possibly necessary. Right now we don't call this function in our code; instead we mimic it in the background, and our circuitcode is built up manually and passed to the scene manually instead of coming in as a packet. We also have a null endpoint. So the flow here is definitely different even though the steps are similar. Next, jump past the code to section 2.2 to see what the two-parameter AddNewClient does.
private void HandleUseCircuitCode(object o)
{
object[] array = (object[])o;
UDPPacketBuffer buffer = (UDPPacketBuffer)array[0];
UseCircuitCodePacket packet = (UseCircuitCodePacket)array[1];
IPEndPoint remoteEndPoint = (IPEndPoint)buffer.RemoteEndPoint;
// Begin the process of adding the client to the simulator
AddNewClient((UseCircuitCodePacket)packet, remoteEndPoint);
// Acknowledge the UseCircuitCode packet
SendAckImmediate(remoteEndPoint, packet.Header.Sequence);
}
2.2 The two-parameter AddNewClient. What does this do? Let's see... It checks that the circuit code is valid, pulling out a sessionID, a remoteEndPoint and sessionInfo, which it uses to call AddClient. Right off the bat, our sessionID is all zeros, our remoteEndPoint is null and we haven't set sessionInfo either, so we probably need to look into that. It also authenticates the response with sessionInfo and checks that the client is authorized before it calls AddClient(circuitCode, agentID, sessionID, remoteEndPoint, sessionInfo). So let's jump past this code block to 2.3.
private void AddNewClient(UseCircuitCodePacket useCircuitCode, IPEndPoint remoteEndPoint)
{
UUID agentID = useCircuitCode.CircuitCode.ID;
UUID sessionID = useCircuitCode.CircuitCode.SessionID;
uint circuitCode = useCircuitCode.CircuitCode.Code;
if (m_scene.RegionStatus != RegionStatus.SlaveScene)
{
AuthenticateResponse sessionInfo;
if (IsClientAuthorized(useCircuitCode, out sessionInfo))
{
AddClient(circuitCode, agentID, sessionID, remoteEndPoint, sessionInfo);
}
else
{
// Don't create circuits for unauthorized clients
m_log.WarnFormat(
"[LLUDPSERVER]: Connection request for client {0} connecting with unnotified circuit code {1} from {2}",
useCircuitCode.CircuitCode.ID, useCircuitCode.CircuitCode.Code, remoteEndPoint);
}
}
else
{
// Slave regions don't accept new clients
m_log.Debug("[LLUDPSERVER]: Slave region " + m_scene.RegionInfo.RegionName + " ignoring UseCircuitCode packet");
}
}
2.3 In this section we're looking at AddClient. There are a bunch of things we are not doing here, such as hooking up a LogoutHandler event and calling client.Start() as the vanilla code does. I'd like to get a look at client.Start to see what it does, but I don't have that code here, so instead I'm going to look at the Aurora version, which I can see more easily on the web. The Aurora version has the line m_scene.AddNewClient(client), which is an overridden single-parameter version. So in 2.4 we'll look at that: m_scene.AddNewClient.
protected virtual void AddClient(uint circuitCode, UUID agentID, UUID sessionID, IPEndPoint remoteEndPoint, AuthenticateResponse sessionInfo)
{
// Create the LLUDPClient
LLUDPClient udpClient = new LLUDPClient(this, m_throttleRates, m_throttle, circuitCode, agentID, remoteEndPoint, m_defaultRTO, m_maxRTO);
IClientAPI existingClient;
if (!m_scene.TryGetClient(agentID, out existingClient))
{
// Create the LLClientView
LLClientView client = new LLClientView(remoteEndPoint, m_scene, this, udpClient, sessionInfo, agentID, sessionID, circuitCode);
client.OnLogout += LogoutHandler;
// Start the IClientAPI
client.Start();
}
else
{
m_log.WarnFormat("[LLUDPSERVER]: Ignoring a repeated UseCircuitCode from {0} at {1} for circuit {2}",
udpClient.AgentID, remoteEndPoint, circuitCode);
}
}
2.4 This is the aurora code for m_scene.AddNewClient, which is pretty interesting and might in fact have most of what we need. I should look at this and compare it with what I have in the relevant section of NPCModule, because this might be the majority of the code needed, and it gives a significant number of clues as to what we might be missing. For example: it pulls the appearance out of the circuit data, whereas I know we have null for appearance in our circuit data. Maybe I need to run a test and intercept this code to see what a real client has in its circuit data (a rough note-to-self sketch of the circuit data I think we need follows the code below)….
///
/// Adding a New Client and Create a Presence for it.
///
public void AddNewClient(IClientAPI client)
{
    System.Net.IPEndPoint ep = (System.Net.IPEndPoint)client.GetClientEP();
    AgentCircuitData aCircuit = AuthenticateHandler.AuthenticateSession(client.SessionId, client.AgentId, client.CircuitCode, ep);

    if (aCircuit == null) // no good, didn't pass NewUserConnection successfully
        return;

    //Create the scenepresence, then update it with any info that we have about it
    ScenePresence sp = m_sceneGraph.CreateAndAddChildScenePresence(client);
    lock (m_incomingChildAgentData)
    {
        if (m_incomingChildAgentData.ContainsKey(sp.UUID))
        {
            //Found info, update the agent then remove it
            sp.ChildAgentDataUpdate(m_incomingChildAgentData[sp.UUID]);
            m_incomingChildAgentData.Remove(sp.UUID);
        }
    }

    //Make sure the appearanace is updated
    if (aCircuit != null)
        sp.Appearance = aCircuit.Appearance;
    sp.IsChildAgent = aCircuit.child;

    m_clientManager.Add(client);

    //Trigger events
    m_eventManager.TriggerOnNewPresence(sp);

    if (GetScenePresence(client.AgentId) != null)
    {
        EventManager.TriggerOnNewClient(client);
        if ((aCircuit.teleportFlags & (uint)Constants.TeleportFlags.ViaLogin) != 0)
            EventManager.TriggerOnClientLogin(client);
    }

    //Add the client to login stats
    ILoginMonitor monitor = (ILoginMonitor)RequestModuleInterface().GetMonitor("", "LoginMonitor");
    if ((aCircuit.teleportFlags & (uint)Constants.TeleportFlags.ViaLogin) != 0 && monitor != null)
    {
        monitor.AddSuccessfulLogin();
    }
}
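And the note-to-self sketch: the kind of AgentCircuitData I *think* the NPC module should be building before the scene gets it, with a non-zero session ID and a cloned appearance. The field names are the OpenSim ones I can see in the code I have; Aurora's AgentCircuitData may differ, and cloning the appearance properly is the real unknown:
// Rough sketch only -- based on the zero-session-id / null-appearance console messages.
using OpenMetaverse;
using OpenSim.Framework;

public static class NpcCircuitSketch
{
    public static AgentCircuitData BuildCircuit(UUID npcID, uint circuitCode,
        string first, string last, AvatarAppearance clonedAppearance)
    {
        AgentCircuitData acd = new AgentCircuitData();
        acd.AgentID = npcID;
        acd.firstname = first;
        acd.lastname = last;
        acd.circuitcode = circuitCode;

        // The console showed the NPC reporting session 000000... -- give it real ones
        acd.SessionID = UUID.Random();
        acd.SecureSessionID = UUID.Random();

        // ...and the [APPEARANCE]/[ATTACHMENT] warnings suggest this was left null:
        // clone the appearance from the avatar we're duplicating before AddNewClient
        acd.Appearance = clonedAppearance;
        acd.child = false; // we want a root agent, not a child agent

        return acd;
    }
}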
Thursday, 24 March 2011
OK after some hunting through the console logs a new TODO list
I tracked down each of the messages logged in the console, comparing x8Ball and Jane Doe. Luckily the message texts map 1:n (where n is low) to function calls in the code base, so I was able to track down each of the functions.
Looks like my hunch was right about the baking. The server requests baked textures from the client, the client does the baking and sends the baked textures up to the server, which receives them and probably distributes them down to the other clients via one of the ScenePresence methods. That's my hunch.
Anyways, here is the list of associated function calls for each of the console messages, which I will use as a TODO.
TODO: Go take a look at each of the function calls made when a real client was connecting... The list of function calls follows:
[SCENE]: Incoming client x8Ball in region chibacity1 via regular login. Client IP verification not performed.
--> This could be OpenSim.Region.Framework.Scenes.Scene.VerifyClient(AgentCircuitData..
[LLUDPSERVER]: Handling UseCircuitCode packet from 10.211.55.3:1076
--> This is OpenSim.Region.ClientStack.LindenUDP.LLUDPServer
[SCENE]: Adding new agent x8Ball to scene chibacity1
--> This is OpenSim.Region.Framework.Scenes.Scene.AddNewClient
[SCENE]: Upgrading child to root agent for x8Ball in chibacity1
--> This is OpenSim.Region.Framework.Scenes.ScenePresence.MakeRootAgent(Vector3 pos, bool isFlying)
[PRESENCE DETECTOR]: Detected root presence 3efbb294-e1d6-430e-a508-fa5ca4748dbb in chibacity1
--> This is OpenSim.Region.CoreModules.ServiceConnectorsOut.Presence.PresenceDetector.OnMakeRootAgent(ScenePresence sp)
[PRESENCE SERVICE]: ReportAgent with session .... in region ....
--> This is OpenSim.Services.PresenceService.PresenceService.ReportAgent(UUID sessionID, UUID regionID)
[ACTIVITY DETECTOR]: Detected root presence 3efbb.... in chibacity1
--> This is OpenSim.Region.CoreModules.ServiceConnectorsOut.GridUser.ActivityDetector.OnMakeRootAgent(ScenePresence sp)
[SCENE]: Received request for wearables of x8Ball
--> This is OpenSim.Region.Framework.Scenes.ScenePresence
[CAPS]: UploadBakedTexture Request in region: chibacity1
--> This is OpenSim.Framework.Capabilities.Caps.UploadBakedTexture(string request, string path, string param, OSHttpRequest httpRequest, OSHttpResponse httpResponse)
[CAPS]: Received baked texture ef7b9.....
--> This is OpenSim.Framework.Capabilities.Caps.BakedTextureUploaded(UUID assetID, byte[] data)
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture 83a04...
[CAPS]: Received baked texture 3ce3adc2...
[CAPS]: Received baked texture 47a32....
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture 9308...
[CAPS]: Received baked texture d331...
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture 5a84...
[CAPS]: Received baked texture 48cd...
[SCENE]: Adding new agent Jane Doe to scene chibacity1
[APPEARANCE]: Appearance not found in chibacity1, returning default
--> This is OpenSim.Region.Framework.Scenes.Scene.GetAvatarAppearance(IClientAPI client, out AvatarAppearance appearance)
[SCENE]: Upgrading child to root agent for Jane Doe in chibacity1
[ATTACHMENT]: Appearance has not been initialized for agent 8d69...
--> This is OpenSim.Region.Framework.Scenes.ScenePresence.RezAttachments()
[SCENE PRESENCE]: null appearance in MakeRoot in chibacity1
--> OpenSim.Region.Framework.Scenes.ScenePresence.MakeRootAgent(Vector3 pos, bool isFlying)
[PRESENCE DETECTOR]: Detected root presence 8d69... in chibacity1
[PRESENCE SERVICE]: ReportAgent with session 000000.... in region
003fdfc7.....
[ACTIVITY DETECTOR]: Detected root presence 8d693dc7.... in chibacity1
[SCENE]: Received request for wearables of Jane Doe
Looks like my hunch was right about the baking. The server requests baked textures from the client, the client does it and then sends the baked textures up to the server which receives them and probably distributes them down to the other clients via one of the scenepresence methods. That's my hunch.
Anyways, here is the list of associated function calls for each of the console messages, which I will use as a TODO.
TODO: Go take a look at each of the function calls made when a real client was connecting... The list of function calls follows:
[SCENE]: Incoming client x8Ball in region chibacity1 via regular login. Client IP verification not performed.
--> This could be OpenSim.Region.Framework.Scenes.Scene.VerifyClient(AgentCircuitData..
[LLUDPSERVER}: Handling UseCircuitCode packet from 10.211.55.3:1076
--> This is OpenSim.Region.ClientStack.LindenUDP.LLUDPServer
[SCENE]: Adding new agent x8Ball to scene chibacity1
--> This is OpenSim.Region.Framework.Scenes.Scene.AddNewClient
[SCENE]: Upgrading child to root agent for x8Ball in chibacity1
--> This is OpenSim.Region.Framework.Scenes.ScenePresence.MakeRootAgent(Vector3 pos, bool isFlying)
[PRESENCE DETECTOR]: Detected root presence 3efbb294-e1d6-430e-a508-fa5ca4748dbb in chibacity1
--> This is OpenSim.Region.CoreModules.ServiceConnectorsOut.Presence.PresenceDetector.OnMakeRootAgent(ScenePresence sp)
[PRESENCE SERVICE]: ReportAgent with session .... in region ....
--> This is OpenSim.Services.PresenceService.PresenceService.ReportAgent(UUID sessionID, UUID regionID)
[ACTIVITY DETECTOR]: Detected root presence 3efbb.... in chibacity1
--> This is OpenSim.Region.CoreModules.ServiceConnectorsOut.GridUser.ActivityDetector.OnMakeRootAgent(ScenePresence sp)
[SCENE]: Received request for wearables of x8Ball
--> This is OpenSim.Region.Framework.Scenes.ScenePresence
[CAPS]: UploadBakedTexture Request in region: chibacity1
--> This is OpenSim.Framework.Capabilities.Caps.UploadBakedTexture(string request, string path, string param, OSHttpRequest httpRequest, OSHttpResponse httpResponse)
[CAPS]: Received baked texture ef7b9.....
--> This is OpenSim.Framework.Capabilities.Caps.BakedTextureUploaded(UUID assetID, byte[] data)
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture 83a04...
[CAPS]: Received baked texture 3ce3adc2...
[CAPS]: Received baked texture 47a32....
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture 9308...
[CAPS]: Received baked texture d331...
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture 5a84...
[CAPS]: Received baked texture 48cd...
[SCENE]: Adding new agent Jane Doe to scene chibacity1
[APPEARANCE]: Appearance not found in chibacity1, returning default
--> This is OpenSim.Region.Framework.Scenes.Scene.GetAvatarAppearance(IClientAPI client, out AvatarAppearance appearance)
[SCENE]: Upgrading child to root agent for Jane Doe in chibacity1
[ATTACHMENT]: Appearance has not been initialized for agent 8d69...
--> This is OpenSim.Region.Framework.Scenes.ScenePresence.RezAttachments()
[SCENE PRESENCE]: null appearance in MakeRoot in chibacity1
--> OpenSim.Region.Framework.Scenes.ScenePresence.MakeRootAgent(Vector3 pos, bool isFlying)
[PRESENCE DETECTOR]: Detected root presence 8d69... in chibacity1
[PRESENCE SERVICE]: ReportAgent with session 000000.... in region
003fdfc7.....
[ACTIVITY DETECTOR]: Detected root presence 8d693dc7.... in chibacity1
[SCENE]: Received request for wearables of Jane Doe
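As a quick sanity check I want a way to compare what a real login does against what the NPC path does, right at the point both of them become root agents. A minimal diagnostic sketch is below; it assumes the scene's EventManager exposes the OnMakeRootAgent event that the PresenceDetector/ActivityDetector handlers above are hooked to, and the class name and log tag are just mine:
using System.Reflection;
using log4net;
using OpenSim.Region.Framework.Scenes;

// Diagnostic sketch: log whether a presence becomes root with a usable
// appearance. Assumes EventManager.OnMakeRootAgent hands us the ScenePresence,
// as the PresenceDetector/ActivityDetector handlers in the log above suggest.
public class AppearanceProbe
{
    private static readonly ILog m_log =
        LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType);

    public void Hook(Scene scene)
    {
        scene.EventManager.OnMakeRootAgent += OnMakeRootAgent;
    }

    private void OnMakeRootAgent(ScenePresence sp)
    {
        m_log.DebugFormat("[APPEARANCE PROBE]: {0} became root, appearance {1}",
            sp.UUID, sp.Appearance == null ? "is null" : "is set");
    }
}
Calling Hook(scene) from a module's Initialise should then print one line for the real login and one for the NPC, which makes the null-appearance difference easy to spot.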
Experiments with ScenePresence.Appearance.
So I got the Jane Does to appear as clouds and wanted to experiment. I added the following lines of code (totally blindly) to the bottom of the NPC instantiation I posted earlier tonight:
sp.Appearance.ClearAttachments();
sp.Appearance.ClearWearables();
sp.Appearance.SetDefaultWearables();
sp.SendInitialFullUpdateToAllClients();
sp.SendWearables();
sp.SendFullUpdateToAllClients();
Sadly it didn't work. But what *was* of interest were the differences in the console logs between watching x8Ball log in via Imprudence and Jane Doe suddenly appearing in the region as an NPC. Following is a record of the events that took place in the console. You will notice there is a lot more output for x8Ball than for Jane Doe. A couple of points that stand out are the error messages about null appearances and a zero session ID. Maybe the session ID is important (a small experiment on that is sketched after the logs). Anyway, we'll see. Here are the logs below:
[SCENE]: Incoming client x8Ball in region chibacity1 via regular login. Client IP verification not performed.
[LLUDPSERVER]: Handling UseCircuitCode packet from 192.168.1.3:1076
[SCENE]: Adding new agent x8Ball to scene chibacity1
[SCENE]: Upgrading child to root agent for x8Ball in chibacity1
[PRESENCE DETECTOR]: Detected root presence 3efbb294-e1d6-430e-a508-fa5ca4748dbb in chibacity1
[PRESENCE SERVICE]: ReportAgent with session .... in region ....
[ACTIVITY DETECTOR]: Detected root presence 3efbb.... in chibacity1
[SCENE]: Received request for wearables of x8Ball
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture ef7b9.....
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture 83a04...
[CAPS]: Received baked texture 3ce3adc2...
[CAPS]: Received baked texture 47a32....
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture 9308...
[CAPS]: Received baked texture d331...
[CAPS]: UploadBakedTexture Request in region: chibacity1
[CAPS]: Received baked texture 5a84...
[CAPS]: Received baked texture 48cd...
[SCENE]: Adding new agent Jane Doe to scene chibacity1
[APPEARANCE]: Appearance not found in chibacity1, returning default
[SCENE]: Upgrading child to root agent for Jane Doe in chibacity1
[ATTACHMENT]: Appearance has not been initialized for agent 8d69...
[SCENE PRESENCE]: null appearance in MakeRoot in chibacity1
[PRESENCE DETECTOR]: Detected root presence 8d69... in chibacity1
[PRESENCE SERVICE]: ReportAgent with session 000000.... in region
003fdfc7.....
[ACTIVITY DETECTOR]: Detected root presence 8d693dc7.... in chibacity1
[SCENE]: Received request for wearables of Jane Doe
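One cheap experiment that zero session ID suggests: give the NPC circuit a random, non-zero session ID instead of UUID.Zero and see whether anything downstream behaves differently. A tiny sketch against the instantiation code (purely a guess on my part; I don't know yet whether anything actually validates this against the presence service):
// Sketch: hand the NPC circuit a real-looking session instead of UUID.Zero.
// Whether anything downstream checks this against PresenceService is unknown.
UUID npcSession = UUID.Random();
ACD.SessionID = npcSession;
ACD.SecureSessionID = UUID.Random();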
OK - Partial Success!!!
K so I've been hacking on vanilla OpenSim 0.7.2, which initially wasn't doing anything. Discovered that for whatever reason the osNpcCreate commands aren't working. Probably something to do with the %$%^#$#%-ing virtual machine I'm running on my trusty little Mac (which is more than a *little* bit slow).
Anyways, I digress. I'm now back to where Haplo Voss got us near Christmas: the code below the picture of success (yay!) rezzes a nice couple of physical (though cloud-like) Jane Doe NPCs...
void m_timer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
lock (p_lock)
{
if (p_inUse)
{
p_inUse = false;
p_scene = m_scene;
p_firstname = "Jane";
p_lastname = "Doe";
p_position.X = 128;
p_position.Y = 128;
p_position.Z = 23;
NPCAvatar npcAvatar = new NPCAvatar(p_firstname,
p_lastname, p_position, p_scene);
//
AgentCircuitData ACD = new AgentCircuitData();
uint circuitcode;
circuitcode = (uint) Util.RandomClass.Next(0, int.MaxValue);
ACD.circuitcode = circuitcode;
npcAvatar.CircuitCode = circuitcode;
ACD.firstname = p_firstname;
ACD.lastname = p_lastname;
ACD.startpos = p_position;
ACD.AgentID = npcAvatar.getAgentId();
ACD.SessionID = UUID.Zero;
ACD.SecureSessionID = UUID.Zero;
ACD.child = false; //should be a root instead of a child - yes, because it's being instantiated for the first time
ACD.Viewer = "NPC";
//ACD.Appearance = ; //this should be set to something I think...
// not setting everything in the AgentCircuitData: missing out the following:
// child, InventoryFolder,
// BaseFolder, CapsPath, ChildrenCapSeeds
// Are they needed? Don't know....
//dan
p_scene.AuthenticateHandler.AddNewCircuit(npcAvatar.CircuitCode,ACD);
//
p_scene.AddNewClient(npcAvatar);
ScenePresence sp;
ScenePresence sp_tmp;
//up to this next line works - but this code returns a null scenepresence...
//find out why....
if (p_scene.TryGetScenePresence(npcAvatar.AgentId, out sp))
{
p_scene.TryGetAvatarByName("Jane Doe", out sp_tmp);
p_cloneAppearanceFrom = sp_tmp.UUID;
AvatarAppearance x =
GetAppearance(p_cloneAppearanceFrom, p_scene);
sp.SetAppearance(x.Texture,
(byte[])x.VisualParams.Clone());
}
m_avatars.Add(npcAvatar.AgentId, npcAvatar);
p_returnUuid = npcAvatar.AgentId;
}
}
}
TO DO List as of today
OK, so skimming through our version of the code (which I think is the last working version), I made this checklist of the calling sequence needed to get an NPC instantiated and good to go.
Note that this does not include any of the other leads to check. This is a preliminary before I can do any of that. (A consolidated sketch with these checks folded in as guard clauses follows the list.)
Calling sequence points to check when an NPC appears in a region:
• First part is register the client interface with the scene.
○ i.e. Scene.RegisterModuleInterface<INPCModule>(this);
§ here check the Scene is valid
• next is create the avatar instance
○ npcAvatar = new NPCAvatar(first, last, pos, p_scene)
§ here you need to check the p_scene is valid
• next is create the circuit data
○ check each piece of the circuit data object
• next is adding the circuit to the scene
○ p_scene.AuthenticateHandler.AddNewCircuit(circuitcode, ACD)
§ make sure p_scene is valid
§ make sure ACD is valid
§ make sure circuitcode has been set
• next is add the client (NPCAvatar) to the scene
○ p_scene.addnewclient(npca)
§ make sure p_scene is valid
§ make sure npca is valid
• next is get the scenepresence back out of the scene
○ p_scene.TryGetScenePresence(npca.agentID, out sp)
§ check p_scene is valid
§ check npca is valid
§ check npca.agentID is not null
§ make sure you get a scenepresence back in sp
• next is get the avatar appearance
○ AvatarAppearance x = GetAppearance(p_cloneappearancefrom, p_scene)
§ check p_cloneappearancefrom is a valid and existing uuid
§ check p_scene is valid
• get the avatar data - what is this? what does avatar data mean?
○ AvatarData adata = scene.AvatarService.GetAvatar(agentID)
§ make sure adata returns a non-null AvatarData - a new one at the worst
§ make sure scene is valid
§ make sure agentID is not null
• then set the appearance of the avatar corresponding to the scenepresence
○ sp.setappearance(x.texture, (byte[])x.VisualParams.Clone());
§ what does this mean?
§ check sp is valid
§ check x.texture is valid
§ check x.VisualParams.Clone returns something
• Add the newly rezzed avatar to the list of npcs
○ m_avatars.add(npca.AgentID, npca)
§ make sure m_avatars isn't null
§ check npca.AgentID is valid
§ check npca is valid
• get the UUID of the NPC Avie
○ P_returnUUID = npca.agentID
§ make sure npca is valid
§ make sure npca.agentID is not null
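To make the checklist concrete, here is roughly what the sequence looks like with the checks folded in as guard clauses. This is a sketch only: the names (p_scene, p_cloneAppearanceFrom, m_avatars, GetAppearance) match the module code elsewhere on this page, and the early-return style is just my way of making each checkpoint explicit.
// Sketch: the calling sequence from the checklist, with each check made explicit.
private UUID InstantiateNpcChecked(string first, string last, Vector3 pos)
{
    if (p_scene == null)
        return UUID.Zero;                               // scene must be valid

    NPCAvatar npcAvatar = new NPCAvatar(first, last, pos, p_scene);

    AgentCircuitData ACD = new AgentCircuitData();
    ACD.circuitcode = (uint)Util.RandomClass.Next(0, int.MaxValue);
    npcAvatar.CircuitCode = ACD.circuitcode;
    ACD.firstname = first;
    ACD.lastname = last;
    ACD.startpos = pos;
    ACD.AgentID = npcAvatar.getAgentId();
    ACD.SessionID = UUID.Zero;
    ACD.SecureSessionID = UUID.Zero;
    ACD.child = false;                                  // root, not child

    p_scene.AuthenticateHandler.AddNewCircuit(npcAvatar.CircuitCode, ACD);
    p_scene.AddNewClient(npcAvatar);

    ScenePresence sp;
    if (!p_scene.TryGetScenePresence(npcAvatar.AgentId, out sp) || sp == null)
        return UUID.Zero;                               // no presence came back

    AvatarAppearance x = GetAppearance(p_cloneAppearanceFrom, p_scene);
    if (x != null && x.Texture != null && x.VisualParams != null)
        sp.SetAppearance(x.Texture, (byte[])x.VisualParams.Clone());

    m_avatars.Add(npcAvatar.AgentId, npcAvatar);
    return npcAvatar.AgentId;
}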
I keep circling round OpenSim.Framework.AvatarAppearance
I wonder if this might be where the relevant code is to at least force the NPC to be a Ruth, so we have something to work with while figuring out how to get the correct appearance we want.
The Aurora documentation says the following (cryptic) little piece about AvatarAppearance:
"Contain's the Avatar's appearance and methods to manipulate the appearance".
Certainly does sound promising.
Once I get the code working (again) to the point where it at least rezzes an NPC cloud, I'm going to hack in some of these methods one at a time to see what happens. If that works, I'm going to start trying to hack the movement code we had from 0.6.9 back in.
Once I get that working (lol - not too much to ask, right?) I'm going to hack that code into the rexbot implementation in Aurora.
Wish me luck....
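If forcing a Ruth is the goal, the first thing I'll try is the default-wearables path AvatarAppearance already exposes, applied to the NPC's presence right after it is added to the scene. This is only a sketch reusing the calls from the blind experiment earlier on this page; whether the viewers actually rebake from it is exactly what needs testing.
// Sketch: force the NPC back to default ("Ruth") wearables and push updates out.
ScenePresence sp;
if (p_scene.TryGetScenePresence(npcAvatar.AgentId, out sp))
{
    sp.Appearance.ClearWearables();
    sp.Appearance.SetDefaultWearables();
    sp.SendWearables();
    sp.SendFullUpdateToAllClients();
}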
Maybe this is something to do with why the 0.7 code fails but the 0.6.9 code works
The following code appears to set the defaults for a newly initialized avatar. Conspicuously, they have turned *off* the old default (effectively BAKE AVATAR = YES) and reset it to BAKE AVATAR = NO. To my mind that suggests the baking mechanism is handed off to the viewer, and thus it's impossible to get a baked avatar (i.e. it will always be a cloud) unless the viewer bakes the textures and then sends them up to the server.
I'm clutching at straws, but this might be it, since it's right in the zone where I think the problem is and it's also exactly the opposite of the way it worked in 0.6.9, where it *did* work.
http://opensimulator.org/viewgit/?a=commitdiff&p=opensim&h=9668fb4e31c612ce457fc4d6e7708ea43234dbac
The relevant change is the following: in the old code it was m_serial = 1, and it's now set to m_serial = 0. I don't know exactly what the serial means, but it *may* have something to do with it.
- protected int m_serial = 1;
+ protected int m_serial = 0;
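If m_serial really is the switch, one way to test it without patching core would be to bump the serial on the appearance handed to the NPC before calling SetAppearance. This assumes AvatarAppearance exposes the field through a public Serial property, which I haven't verified; if it doesn't, it would need a one-line core patch instead.
// Sketch: pretend the appearance has already been "set once" by giving it a
// non-zero serial before applying it to the NPC. The Serial property is an
// assumption - check AvatarAppearance in your tree before trying this.
AvatarAppearance x = GetAppearance(p_cloneAppearanceFrom, p_scene);
x.Serial = 1;
sp.SetAppearance(x.Texture, (byte[])x.VisualParams.Clone());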
OK so this is interesting
Scene.TryGetScenePresence goes through a bunch of wrapper calls but ultimately calls ScenePresence.AddToPhysicalScene.
What's interesting about that is that it checks whether the presence has a physics actor. If the actor is null it doesn't try to add the avatar to the physical scene; if it *is* a physics actor, it then checks that it is NOT a child agent. So that suggests AgentCircuitData.child has to be set to false in order to add the av to the scene.
If all goes well, SetAppearance ultimately calls m_controllingClient.SendAvatarDataImmediate. In our case our version of the client doesn't have an implementation of SendAvatarDataImmediate: the wrapper function is there, but there's no code in it. I wonder if that's the code we need to look at in the other IClientAPI implementations.
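Since our NPCAvatar's SendAvatarDataImmediate is an empty wrapper, the cheapest next step is to make it log something so we can see whether SetAppearance ever reaches it. A sketch of the stub is below; I'm going from memory on the parameter being the scene entity, so check the IClientAPI declaration in your tree before copying.
// Sketch: make the empty wrapper visible instead of silent. The parameter type
// is an assumption - match whatever IClientAPI declares in your source tree.
// Assumes NPCAvatar already has the usual m_log logger field.
public void SendAvatarDataImmediate(ISceneEntity avatar)
{
    m_log.Debug("[NPCAVATAR]: SendAvatarDataImmediate fired");
}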
So this is Haplo's last word on where the 0.7.2 code is at
Ok sure... so the hair deal is this:
The code as you sent it to me allows a 'blank' NPC to be created and respond to Say and Remove commands. It retains the correct first and last name information from the osNpcCreate call and carries a unique UUID. Using the for loop to create 10 NPCs and a list of their UUIDs (or keys), each one retains its name and unique ID and can be commanded to Say and Remove respectively.
Notes: Depending on the viewer, these NPCs will either be nothing but a puffy cloud, or the 'witch king' (which is basically how Hippo, the first-gen Second Life viewer, and some other viewers render a failed AV load. It is what all hair settings at '0' would look like with no body present. It looks like a grey Helm of the Witch King floating in space lol).
Now then - if you change this:
ACD.AgentID = p_cloneAppearanceFrom; //npcAvatar.getAgentId();
then for some reason any viewer will load a default hair floating in space for every NPC. It is textured, colored, and loads completely. No body, skin, etc. Just friggin hair by itself. It does not make any sense that assigning an entirely different ID would make only this small a difference (rather than just work or break entirely), especially since the NPC still obeys all working commands.
p_scene: Ok, so this will look really shitty in an email - but paste it into a code editor and it will make better sense. I commented out things, leaving them in place so I knew where to just 'crop' them back in later.
As you can see below, I hacked out the check on the scene presence in an attempt to force the current code to kick in. If I leave it in its original 'if' check, the function never fires according to the debug message. Obviously the way I have it here it always fires without fail, since I am forcing it to do so.
p_scene.AddNewClient(npcAvatar);
ScenePresence sp;
//if (
p_scene.TryGetScenePresence(p_cloneAppearanceFrom, out sp); //)
//{
    AvatarAppearance x = GetAppearance(p_cloneAppearanceFrom, p_scene);
    m_log.Debug("[TryGetScenePrescence]: Function Has Fired"); // Debug
    sp.SetAppearance(x.Texture, (byte[])x.VisualParams.Clone());
//}
m_avatars.Add(npcAvatar.AgentId, npcAvatar);
p_returnUuid = npcAvatar.AgentId;
Also, if you want console logging enabled you need the following at the top of your file: using log4net; This will output debug messages into your console window. Nice to have: when you run the osNPC* commands you can flip to your console and see what the hell is going on ;)
ANYway ... so that is where I am at. Flat busted. Really, none of the changes I made ended up taking me anywhere but back to the original code you sent me. I also tried creating a user, then taking that user, folder, and asset ID information and hard-coding it into what I thought would be the appropriate slots for the information we weren't sure we needed. Results were the same - still worked fine, but no 'physical' appearance. Just a floaty name in space. So I am probably not coding it in properly, or something is missing entirely.
Also, I don't know if it matters or not, but just in case: there is definitely an actual 'ghost presence' being created. You can bump into and knock around the NPC even though it is just a point in space - so the capsule is there.
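For the record, the "using log4net;" line Haplo mentions isn't quite enough on its own; you also need the logger field that every OpenSim module declares. This is the stock pattern used throughout the OpenSim source:
// Usings go at the top of the file, the field inside your module class.
// m_log.Debug / m_log.DebugFormat then write to the region console.
using System.Reflection;
using log4net;

private static readonly ILog m_log =
    LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType);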
*This* is the latest version of the NPCModule code
This version of the code should work to instantiate a cloud NPC in 0.7.2. I haven't tried any version of this in Aurora yet.
/*
* Copyright (c) Contributors, http://opensimulator.org/
* See CONTRIBUTORS.TXT for a full list of copyright holders.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are met:
* * Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* * Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* * Neither the name of the OpenSimulator Project nor the
* names of its contributors may be used to endorse or promote products
* derived from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE DEVELOPERS ``AS IS'' AND ANY
* EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
* WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
* DISCLAIMED. IN NO EVENT SHALL THE CONTRIBUTORS BE LIABLE FOR ANY
* DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
* ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
* SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
using System.Collections.Generic;
using System.Threading;
using OpenMetaverse;
using Nini.Config;
using OpenSim.Region.Framework.Interfaces;
using OpenSim.Region.Framework.Scenes;
using OpenSim.Region.CoreModules.Avatar.NPC;
using OpenSim.Framework;
using Timer=System.Timers.Timer;
using OpenSim.Services.Interfaces;
namespace OpenSim.Region.OptionalModules.World.NPC
{
public class NPCModule : IRegionModule, INPCModule
{
// private const bool m_enabled = false;
private Mutex m_createMutex;
private Timer m_timer;
private Dictionary<UUID, NPCAvatar> m_avatars = new Dictionary<UUID, NPCAvatar>();
private Dictionary<UUID, AvatarAppearance> m_appearanceCache = new Dictionary<UUID, AvatarAppearance>();
// Timer vars.
private bool p_inUse = false;
private readonly object p_lock = new object();
// Private Temporary Variables.
private string p_firstname;
private string p_lastname;
private Vector3 p_position;
private Scene p_scene;
private UUID p_cloneAppearanceFrom;
private UUID p_returnUuid;
private AvatarAppearance GetAppearance(UUID target, Scene scene)
{
if (m_appearanceCache.ContainsKey(target))
return m_appearanceCache[target];
AvatarData adata = scene.AvatarService.GetAvatar(target);
if (adata != null)
{
AvatarAppearance x = adata.ToAvatarAppearance(target);
m_appearanceCache.Add(target, x);
return x;
}
return new AvatarAppearance();
}
public UUID CreateNPC(string firstname, string lastname,Vector3 position, Scene scene, UUID cloneAppearanceFrom)
{
// Block.
m_createMutex.WaitOne();
// Copy Temp Variables for Timer to pick up.
lock (p_lock)
{
p_firstname = firstname;
p_lastname = lastname;
p_position = position;
p_scene = scene;
p_cloneAppearanceFrom = cloneAppearanceFrom;
p_inUse = true;
p_returnUuid = UUID.Zero;
}
while (p_returnUuid == UUID.Zero)
{
Thread.Sleep(250);
}
m_createMutex.ReleaseMutex();
return p_returnUuid;
}
public void Autopilot(UUID agentID, Scene scene, Vector3 pos)
{
lock (m_avatars)
{
if (m_avatars.ContainsKey(agentID))
{
ScenePresence sp;
scene.TryGetScenePresence(agentID, out sp);
sp.DoAutoPilot(0, pos, m_avatars[agentID]);
}
}
}
public void Say(UUID agentID, Scene scene, string text)
{
lock (m_avatars)
{
if (m_avatars.ContainsKey(agentID))
{
m_avatars[agentID].Say(text);
}
}
}
public void DeleteNPC(UUID agentID, Scene scene)
{
lock (m_avatars)
{
if (m_avatars.ContainsKey(agentID))
{
scene.RemoveClient(agentID);
m_avatars.Remove(agentID);
}
}
}
public void Initialise(Scene scene, IConfigSource source)
{
m_createMutex = new Mutex(false);
m_timer = new Timer(500);
m_timer.Elapsed += m_timer_Elapsed;
m_timer.Start();
scene.RegisterModuleInterface<INPCModule>(this);
}
void m_timer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
lock (p_lock)
{
if (p_inUse)
{
p_inUse = false;
NPCAvatar npcAvatar = new NPCAvatar(p_firstname, p_lastname, p_position, p_scene);
//
AgentCircuitData ACD;
uint circuitcode;
ACD = new AgentCircuitData();
circuitcode = (uint) Util.RandomClass.Next(0, int.MaxValue);
ACD.circuitcode = circuitcode;
npcAvatar.CircuitCode = circuitcode;
ACD.firstname = p_firstname;
ACD.lastname = p_lastname;
ACD.startpos = p_position;
ACD.AgentID = npcAvatar.getAgentId();
ACD.SessionID = UUID.Zero;
ACD.SecureSessionID = UUID.Zero;
ACD.Viewer = "NPC";
// not setting everything in the AgentCircuitData: missing out the following:
// child, InventoryFolder,
// BaseFolder, CapsPath, ChildrenCapSeeds
// Are they needed? Don't know....
p_scene.AuthenticateHandler.AddNewCircuit(npcAvatar.CircuitCode, ACD);
//
p_scene.AddNewClient(npcAvatar);
ScenePresence sp;
if (p_scene.TryGetScenePresence(npcAvatar.AgentId, out sp))
{
AvatarAppearance x = GetAppearance(p_cloneAppearanceFrom, p_scene);
sp.SetAppearance(x.Texture, (byte[])x.VisualParams.Clone());
}
m_avatars.Add(npcAvatar.AgentId, npcAvatar);
p_returnUuid = npcAvatar.AgentId;
}
}
}
public void PostInitialise()
{
}
public void Close()
{
}
public string Name
{
get { return "NPCModule"; }
}
public bool IsSharedModule
{
get { return true; }
}
}
}
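For anyone wanting to drive this from another region module rather than from the osNpc* script commands, the interface registered in Initialise can be fetched back out of the scene and called directly. A rough usage sketch: it assumes Scene.RequestModuleInterface<T>() (the normal counterpart to RegisterModuleInterface) and that cloneFrom is the UUID of an avatar the AvatarService already knows about.
// Usage sketch: fetch the registered INPCModule and ask it for an NPC that
// clones the appearance of an existing avatar. cloneFrom is assumed to be the
// agent ID of a real account known to the AvatarService.
INPCModule npcModule = scene.RequestModuleInterface<INPCModule>();
if (npcModule != null)
{
    UUID npcId = npcModule.CreateNPC("Jane", "Doe",
        new Vector3(128f, 128f, 23f), scene, cloneFrom);
    npcModule.Say(npcId, scene, "Hello from an NPC");
}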