Off to a very slow start today.
I started off with a little dialog cleanup today. I have never really done very much character dialog and I'm not really sure if what I'm writing is any good, so I keep making revisions. I'm my own worst critic (at least, I hope I'm my own worst critic, or I'm going to get utterly roasted alive) so it's difficult to leave dialog in place for more than a day before I decide that it's utter crap and needs to be revised. Some of the revision is certainly good, helping to clarify things that are clearly not clear, but some of it is probably arbitrary.
So, escaping that cycle for a moment, I started working on a tree view control so I can actually see the node graph of all of the components I have loaded, and eventually tie that into the ability to edit data in an attribute file. To some extent, it's mostly just piecing together bits of other controls that I already have. The main problem at the moment is that it's fairly easy to specify a hierarchy of controls from an attribute file, but that is static, and I need a "pleasant" way to generate a control set from more dynamic data.
I also realized that having a simple "scroll box" would be useful (basically a control that automatically puts up scroll bars if the size of its contents exceeds the size of the control) and did some work to split that out from the few controls that duplicate this behavior.
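The scroll-box rule can be sketched in a few lines (names here are illustrative, not the engine's actual API). One wrinkle worth handling: showing a bar on one axis shrinks the usable area on the other axis, which can in turn force the second bar to appear.

```cpp
struct Size       { float w, h; };
struct ScrollBars { bool horizontal, vertical; };

// Decide which scroll bars a scroll box needs: a bar appears on an axis
// only when the content outgrows the control on that axis. A bar on one
// axis eats into the other axis's space, which can force the second bar.
ScrollBars NeededScrollBars(Size content, Size control, float barThickness)
{
    ScrollBars bars { content.w > control.w, content.h > control.h };
    if (bars.horizontal && !bars.vertical)
        bars.vertical = content.h > control.h - barThickness;
    if (bars.vertical && !bars.horizontal)
        bars.horizontal = content.w > control.w - barThickness;
    return bars;
}
```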
That actually sounds like I did a lot more than I think I did... now, if I could write a Python script to turn blog posts directly into C++ code, maybe I'd be getting somewhere...
Monday, September 30, 2013
Saturday, September 28, 2013
It's A Sign
You know, it's kind of a shame that C by default makes you type out "unsigned &lt;type&gt;" when you want an unsigned variable.
OK, most of the time it doesn't make any difference, but I've found it makes the intention of some code clearer.
I put these in my "top level" header that gets included everywhere:
typedef unsigned long  ulong;
typedef unsigned int   uint;
typedef unsigned short ushort;
typedef unsigned char  uchar;
These too, so they can be redefined if the compiler you're on has different ideas about word sizes and you need to read binary files across platforms:
typedef __int64          int64;
typedef unsigned __int64 uint64;
typedef int              int32;
typedef unsigned int     uint32;
typedef short            int16;
typedef unsigned short   uint16;
typedef char             int8;
typedef unsigned char    uint8;
I find I use uint a lot. There are a lot of things, like array indices or container sizes, that can never be negative (or if they are, you have a problem), so uint makes more sense.
On the other hand, I have found myself using iterators a lot more than indices recently.
One caveat: The conversion from unsigned-to-float is pretty expensive on PC compared to signed-to-float. You may want to keep that in mind if you're mixing floating point and unsigned values.
I even had one case where I tried to cast the unsigned to int first, even going so far as to pass it through an inline function, and Visual Studio just said, "oh, you REALLY meant to convert from unsigned to float" and went through the slow-path conversion code.
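For what it's worth, one workaround that sometimes dodges the slow path is widening to a signed 64-bit type first: every 32-bit unsigned value fits losslessly in an int64, and the signed conversion takes the fast path. Whether the optimizer keeps or defeats this is compiler-dependent, as the anecdote above shows. A sketch (the helper name is mine):

```cpp
#include <cstdint>

// Hypothetical helper: convert uint32 -> float via a signed 64-bit
// intermediate. The widening is exact for any uint32_t value, and the
// int64 -> float conversion is the cheaper one on x86. Results vary by
// compiler; verify in the disassembly before relying on it.
inline float UintToFloat(uint32_t v)
{
    return static_cast<float>(static_cast<int64_t>(v));
}
```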
Friday, September 27, 2013
Resolution-Independent Coordinate System
Today, I'm thinking about UI issues.
I have dialogue displaying, but the basic layout was tied to the screen resolution (currently 1280x720, because it fits nicely on my 1920x1080 monitor in windowed mode). I began wondering what an ideal "virtual" coordinate system would look like, where you just use one coordinate system and scale it for the actual screen resolution.
This causes some problems with fonts, which should really be rendered at screen resolution, so I'm leaving those out of the equation for now. You can make some progress playing with point sizes, but in the end it probably won't be exactly the same on different resolutions.
So, what would a good "virtual" coordinate system look like?
You could have 0..1 in each axis, but then you have to work with a lot of fiddly decimals, and that doesn't take into account aspect ratio.
I think ideally, the coordinates should divide evenly by a large number of primes so you can create symmetric divisions without requiring fractions.
You could maybe have something like 1600x900, which is good for a 16x9 aspect ratio. 1600 doesn't divide evenly by 3, though.
Since 1920x1080 is my target resolution, I decided to just go with that. It's pixel-accurate at the target, and both divide evenly by multiple powers of 2, as well as 3 and 5.
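The mapping itself is just a per-axis scale; a minimal sketch (names are illustrative), where layout code works entirely in virtual units and only the final draw touches the real backbuffer size:

```cpp
struct Vec2 { float x, y; };

// 1920x1080 virtual UI space: divisible by many powers of 2, plus 3 and 5,
// and pixel-exact at the target resolution.
constexpr float kVirtualW = 1920.0f;
constexpr float kVirtualH = 1080.0f;

Vec2 VirtualToScreen(Vec2 v, float screenW, float screenH)
{
    return { v.x * (screenW / kVirtualW),
             v.y * (screenH / kVirtualH) };
}
```

At the 1280x720 window size, the virtual center (960, 540) lands at (640, 360); at 1920x1080 the mapping is the identity.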
The rest of today consisted of a little bit of modeling, and I picked up my guitar for the first time in a couple months - I clearly need to get my hands back in shape again. :)
Thursday, September 26, 2013
Back Into Gameplay Land
Finally back into gameplay land today, which means I'm doing a whole bunch of random stuff.
First up, brief tests at conversation text suggest that I need to differentiate the speakers a bit more. I don't have character portraits up yet, which would help, but I'm wondering if choosing a unique font for each character would be a good idea. I could go with a different color for each character, but using a different font for each that helps convey their personality might be interesting.
That led me to Google Fonts, which looks to have a decent variety with reasonable licensing requirements. If anyone has any other suggestions, let me know.
Of course, this led to the need to modify the game code to support a different font for each speaker. That's not too big a deal, but causes a bit of trouble because line heights are no longer constant.
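The variable-line-height layout amounts to accumulating each line's own height instead of multiplying by a constant; a sketch with stand-in types (not the actual engine structures):

```cpp
#include <string>
#include <vector>

// Illustrative stand-ins for per-speaker styling.
struct SpeakerStyle { std::string fontName; float lineHeight; };
struct DialogLine   { const SpeakerStyle* style; std::string text; };

// Returns the y offset of each line. The running total advances by each
// line's own height, so mixing fonts with different metrics just works.
std::vector<float> LayoutDialog(const std::vector<DialogLine>& lines)
{
    std::vector<float> offsets;
    float y = 0.0f;
    for (const DialogLine& line : lines) {
        offsets.push_back(y);
        y += line.style->lineHeight;
    }
    return offsets;
}
```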
The level script is coming along though. It navigates around the level with appropriate conversation text displaying (and not displaying) at the appropriate times.
I need to work on object placement a bit more, so next up is probably getting a "bird's eye view" of the level with some sort of transform editing. We'll see how that goes. I don't want to write full serialization for every component, so I might see if I can modify the attribute file that spawned the object directly, because that's easy to serialize.
Wednesday, September 25, 2013
Zzzzzzzzzzzzz........
Kind of slow day today.
The parts of the day that weren't spent staring blankly at the screen consisted primarily of playing around with shader code.
I incorporated spherical harmonics for diffuse lighting and cleaned up the code a little bit, splitting off the Phong lighting code completely. That whole process actually took way too long and generated way too many bugs. Most of the problem was getting the diffuse and specular intensities to be something reasonable.
Other than that, I was trying to gather some more reference images. Why is it so difficult to find decent pictures of Starbuck in the cockpit? I'm just going to have to go back and watch the entire Battlestar Galactica series again apparently. (Note to self: do not tempt self.)
That's all I've got for today!
Tuesday, September 24, 2013
The Intersection of Head and Desk
A bit of a slow day today.
I spent much too much time trying to figure out why the normal map on one ship seemed to be working fine, but on a different ship it looked like all of the normals were pointing the wrong direction.
Turns out I had the texture linked to an old (and very wrong) version of the normal map with the same name in a different directory.
Head, meet desk.
Anyway, once that was sorted out, I decided that since I actually have meaningful values coming through the pipeline now, maybe I should use them. I spent a bit of time getting the Cook-Torrance shader working. I think it turned out alright:
Actually, I need to get some proper lights in the scene. It's progress anyway.
I also poked and prodded at the level script, dialog and design. The first level of the game has a lot of dialog and scripted sequences, so although it will be an excellent test level as far as preparing for the rest of the game, it's going to take a while.
Monday, September 23, 2013
Rendering OD
I spent most of the weekend working on getting the shader graph out of Blender and into the pipeline.
I had expected generating the shader from the graph to be the difficult part, but that was actually surprisingly simple.
A simple emissive shader looks like this:
EngineGlow
{
    Nodes
    {
        Material_Output
        {
            Type = OUTPUT_MATERIAL
            Inputs
            {
                Surface { Type = SHADER }
                Volume { Type = SHADER }
                Displacement { Type = VALUE Default = 0.000000 }
            }
            Outputs { }
        }
        Emission
        {
            Type = EMISSION
            Inputs
            {
                Color { Type = RGBA Default = 0.000000, 0.405629, 0.800000, 1.000000 }
                Strength { Type = VALUE Default = 400.000000 }
            }
            Outputs
            {
                Emission { Type = SHADER }
            }
        }
    }
    Links
    {
        0
        {
            FromNode = Emission
            FromSocket = Emission
            ToNode = Material_Output
            ToSocket = Surface
        }
    }
}
Because this is just my attribute file format, after loading it, it's trivial to take the Links section and insert them directly into the Inputs section. Then you just find the MATERIAL_OUTPUT node and recursively work your way back up the nodes, using default values if there is no link.
Each node type typically ends up generating only one or two lines of code, and a handful of variables. Generating braces at the beginning and end of each node keeps all variables declarations local, and you just pass the name of the variable you want to set to the recursive call. For BSDF nodes, just emit a function call to a pre-defined lighting function (which all just reduce to Phong at the moment, but that can be dealt with later).
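A minimal sketch of that recursive walk. The types here are stand-ins for my attribute-file structures, and every socket is flattened to a float to keep it short; the real generator emits node-specific code rather than a bare function call for every node type.

```cpp
#include <string>
#include <vector>

struct Node;

struct Input {
    std::string name;
    Node* link = nullptr;     // null -> fall back to the default value
    std::string defaultValue;
};

struct Node {
    std::string name;
    std::vector<Input> inputs;
};

// Emits shader source into 'out'. The caller owns 'resultVar'; braces keep
// each node's input variables local, and each node reduces to a call to a
// predefined function (a lighting function in the case of BSDF nodes).
void EmitNode(const Node& node, const std::string& resultVar, std::string& out)
{
    out += "{\n";
    std::string args;
    for (const Input& in : node.inputs) {
        std::string var = node.name + "_" + in.name;
        out += "float " + var + ";\n";       // all sockets are floats here
        if (in.link)
            EmitNode(*in.link, var, out);    // the linked node assigns 'var'
        else
            out += var + " = " + in.defaultValue + ";\n";
        if (!args.empty()) args += ", ";
        args += var;
    }
    out += resultVar + " = " + node.name + "(" + args + ");\n";
    out += "}\n";
}
```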
So, the hard part was not generating the actual fragment shader.
Mapping UV channels initially seemed like it was going to be difficult, because they are mapped by name and Assimp doesn't preserve this information. Then I realized I could do the name lookup when I export the shader graph, and as long as Assimp preserves the channel order I can just embed the UV index in the attribute file.
The only part that really caused trouble was trying to get the textures to map to materials properly. The Blender COLLADA exporter only looks at the "legacy" shaders, and only exports a single texture (and maybe a normal map if you have things set correctly). Ultimately this came down to ignoring textures from the COLLADA file and gathering, converting and binding textures directly from the shader graph. The only other thing was telling Assimp not to merge materials, because it doesn't have an accurate picture of things.
In any case, this all seems to be working. It's a lot more flexible than what I had before, and I can actually set proper emissive ranges for HDR.
So, since you bothered to read this far, I give you an in-game screenshot:
Friday, September 20, 2013
The Monte Carlo Algorithm Applied To Game Development
One of the interesting things about being the only person on a project is that everything that might need to be done is your responsibility. There is no one to pick up the slack, and nothing gets done unless you do it yourself.
I occasionally wonder if my shotgun approach to game development is going to work out. In case you haven't noticed, I tend to work on a lot of different things from day to day. Sometimes it feels unfocused, but I'm basically working on the assumption that, as long as what I'm doing needed to be done at some point, it doesn't really matter. Additionally, if I'm interested or excited by what I'm working on, it will turn out better and faster. And I'll be happier, which is kind of the point of this endeavor.
So, today ended up being a modeling day. I did a little bit of script/dialog for the first level, but that was going slowly. After learning quite a bit about modeling on my latest ship, I decided to revisit my old fighter model (see first post ever) and clean it up a bit.
I actually enjoy 3D modeling. I really can't draw, but I seem to be OK at making models. Texturing is kind of fun too, though I'm currently kind of hit-and-miss on that front.
Anyway, the point was, I think having a lot of things to work on at once is a good thing, because if you get stuck, or bored, there is always something else productive you can be doing. I have had jobs before where I needed to work on a very narrow area, and if you run into a roadblock, nothing gets done.
Weekend time - not sure if I'm getting a weekend edition out, I don't have any topic in mind yet. We'll see, have a good one!
Thursday, September 19, 2013
Rendering Addiction
The first step in dealing with your addiction is admitting that you have a problem.
My problem is that the default material options in Blender are somewhat limited, and the node shader system is much nicer to work with. My other problem is that I am not aware of any way to export the node graph to a standard file format.
OK, I'm not sure how admitting that is going to help me get over my rendering addiction and get back to gameplay programming. All that did is make me write an exporter in Python so I can get all of that lovely shader node graph data into the pipeline. If I can find a way to bind the shader UV channel names to the Assimp/COLLADA imported mesh, I should be able to use that to generate a shader.
Then I can hopefully start to work towards some prettier lighting:
Wednesday, September 18, 2013
The Day Nothing Worked
OK, the title isn't entirely accurate.
I actually did get the normal map texture pipeline straightened out. Multitexturing seems to work OK now, though there are probably some bugs lurking in the darkness.
Things that didn't work today:
The Blender COLLADA exporter seems to have a problem. It insists that my normal map appears in all materials, even those materials with no textures at all. Fortunately, I can probably fix it myself, Open Source and all, but that's not helping productivity.
Then I ran into a really strange problem with various tools getting deadlocked during the asset build. The shader compiler was getting stuck in fflush(), and a system call to mkdir was hanging as well. My build tool just runs a Python script, which seemed to work from the command line but not from the tool. I'm a little confused... it was working just fine, so it must be a problem with capturing the output? Who knows.
Anyway, I need to stop playing with rendering and do more gameplay. Let's see how much willpower I have. :)
Tuesday, September 17, 2013
In the Pipeline
Somehow, today turned into pipeline day.
After putting normal maps on an object, I realized that my texture support in the art pipeline was severely lacking.
Currently, the art pipeline looks like this:
- Blender is used to create the asset
- Export to COLLADA format (.dae)
- Asset compiler (custom tool)
- Using Assimp to import the data
- Convert textures for target platform
- Generate shaders for target platform
- Write it to a custom game data format
- Import it into the game
This is much improved, though I'm still working on binding the textures to the shaders. Hopefully I'll get that working tomorrow and maybe finally get back to, you know, making the game part of the game.
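The pipeline above is really just a linear stage driver; sketched with stand-in names (the real tool's internals are obviously more involved):

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

// Each stage (Assimp import, texture conversion, shader generation,
// game-format write) is a function; the driver stops at the first failure
// so a broken export doesn't feed garbage into later stages.
struct Stage {
    std::string name;
    std::function<bool(const std::string&)> run;
};

bool CompileAsset(const std::string& daePath, const std::vector<Stage>& stages)
{
    for (const Stage& s : stages) {
        if (!s.run(daePath)) {
            std::fprintf(stderr, "asset build failed at stage: %s\n",
                         s.name.c_str());
            return false;
        }
    }
    return true;
}
```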
Monday, September 16, 2013
Something different...
OK, I'm slowly getting back to work. I haven't been doing much programming recently though - today I'm pretending to be an artist! :)
I spent a bit of time last week putting together some character concept reference. That also involved a bit of character bio writing, and sifting through months of notes - good ideas, bad ideas, ideas I somehow forgot about but shouldn't have, ideas that should probably be deleted, burned, and launched into the sun...
Then, since I didn't really feel like programming, I decided to turn this:
into this:
Not quite there yet, but a decent start.
I've been learning quite a bit about Blender through the process of making a few models. I was having some trouble with the X-Mirror mode, because subdividing an edge is not a mirrored operation.
It was easiest to just put a Mirror modifier on the whole mesh and model just the left half of the ship. The .dae exporter even supports this without having to generate the whole right side first, which was nice.
I also figured out that selecting two adjacent faces and scaling to zero in normal space is a good way to make two faces coplanar.
I also learned a lot about baking textures and materials, but, being Blender, I have now forgotten it again. :)
Wednesday, September 11, 2013
Do You Remember?
You remember those weeks when you're hyper-focused, there are no distractions and everything is working out the way you planned?
This is not one of those weeks.
Regularly scheduled updates will resume once there is something to update about.
(I did get the line thing working. And some tweaks to the expressions.)
Monday, September 9, 2013
Lines and Expressions
Short update today, yet another sick day. Today was mostly some cleanup work.
I have been working on a "thick line" class so I can draw lines that are more than a single pixel wide. I'm basically taking a default 1m square plane and then doing the appropriate math to scale and rotate it so that it falls along the desired line, and then rotating it along that line so that it faces towards the camera. Some shader tricks and I should be able to get a nice falloff at the edges.
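The math from that paragraph, sketched with a minimal vector type (not my actual math library): the quad's thickness direction is perpendicular to both the segment and the direction to the camera.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b)  { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Vec3 sub(Vec3 a, Vec3 b)  { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 mul(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }
static Vec3 cross(Vec3 a, Vec3 b)
{
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
static Vec3 normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return mul(v, 1.0f / len);
}

// Computes the four corners of a camera-facing quad along segment a->b.
// 'side' is perpendicular to both the line and the direction to the
// camera, which is what gives the line its on-screen thickness.
// (Degenerate if the camera lies on the line itself.)
void ThickLineQuad(Vec3 a, Vec3 b, Vec3 camera, float width, Vec3 out[4])
{
    Vec3 axis = normalize(sub(b, a));
    Vec3 mid  = mul(add(a, b), 0.5f);
    Vec3 side = mul(normalize(cross(axis, sub(camera, mid))), width * 0.5f);
    out[0] = add(a, side); out[1] = sub(a, side);
    out[2] = add(b, side); out[3] = sub(b, side);
}
```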
I had this embedded in a game-side component, but I thought it would be better to move it into a generic Line Component. While doing that, I thought I would look into binding expressions to the component construction process. At the moment I'm just using a simple type parser (vector, transform, float, string) to get the values from attribute files. Using expressions would make it possible to do more complex math, and more importantly, reference variables so I don't have to change the same constant all over the place.
As far as referencing variables, at the moment I only have variables that are bound from C++. I started on making it possible to declare variables more dynamically - for starters, declaring a bunch of global variables in an attribute file. This isn't quite ready to go yet.
So, hopefully finish that off tomorrow if I'm feeling up to it.
Saturday, September 7, 2013
Weekend Edition - Update Phases
How do you keep AI, rendering and physics all in sync with each other?
In most game architectures I have seen, you typically have two main "phases" - Update and Render.
During the Update phase, you do all of your game AI, animation, input handling - basically, do anything that is going to change the state of the game over a single frame. By the end of the update, you want everything to be where it is supposed to be for drawing the frame.
During the Render phase, you want to take all of the objects in the game and draw them where they are supposed to be on that frame. Ideally, during the Render phase, absolutely no changes are made to the game state. This is fairly important, because sometimes you want to render exactly the same scene multiple times - if the game is paused, to generate shadow or reflection maps, etc.
So far this seems relatively straightforward, but things are not always that easy. In particular, to synchronize AI with physics, the order that things happen in during the Update phase can be critically important. To make sure things happen in the right order, I tend to break the Update phase into several sub-phases:
- Update - called once per frame
- Process input
- Do animation
- Apply forces to physics objects
- If the game object transform does not match the physics transform, forcibly move the physics object
- PhysicsStep - may be called multiple times if frame time is greater than physics step time
- PhysicsUpdate
- Called before every physics step
- Apply forces that vary based on object position / velocity
- OnCollision
- You may get collision callbacks during the physics simulation
- PostPhysicsUpdate
- Called after every physics step
- Limit motion, velocities
- PostUpdate
- Respond to any collisions you received during the PhysicsStep
- Update game object positions from the current physics positions
At this point, hopefully you have everything where it is supposed to be and can give the Rendering phase the most up-to-date positions for all of your objects and are ready to get on to the next frame's Update.
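A minimal sketch of that phase ordering, with counters standing in for the real dispatch to component lists. The fixed-timestep accumulator is the usual way to get "may be called multiple times if frame time is greater than physics step time":

```cpp
// Phase stubs; the counters make the ordering checkable. A real engine
// would dispatch each phase to its registered components.
static int g_updates = 0, g_physicsSteps = 0, g_postUpdates = 0;

const float kPhysicsStep = 1.0f / 60.0f;
static float g_accumulator = 0.0f;

void Update(float)       { ++g_updates; }      // input, animation, per-frame forces
void PhysicsUpdate()     { }                   // position/velocity-dependent forces
void StepPhysics(float)  { ++g_physicsSteps; } // may fire OnCollision callbacks
void PostPhysicsUpdate() { }                   // clamp velocities, limit motion
void PostUpdate()        { ++g_postUpdates; }  // respond to collisions, sync transforms
void Render()            { }                   // draw only; no game-state changes

void Frame(float dt)
{
    Update(dt);
    g_accumulator += dt;
    while (g_accumulator >= kPhysicsStep) {    // multiple steps on long frames
        PhysicsUpdate();
        StepPhysics(kPhysicsStep);
        PostPhysicsUpdate();
        g_accumulator -= kPhysicsStep;
    }
    PostUpdate();
    Render();
}
```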
Friday, September 6, 2013
Brain Reactivate, Fix Stupid Bug
Camera guidance - check
Multiple nav points - check
Ship operates in parent object's space - check
Ship weapons fire emits in correct frame of reference - check
AABB limiter works in correct frame of reference (keep ship on screen) - check
Considering I really only had half a work day, today has somehow been relatively productive.
Mostly it came down to fixing the one bug where the physics was not syncing correctly when the parent game object to the ship's game object was moving, rather than the ship itself. Ultimately the biggest problem was that the Update event was happening in the ModelComponent before the NavComponent, causing the world space transform of the object to be inconsistent before and after the physics update.
Or, for the non-technical reader: Did bad stupid thing. Fixed bad stupid thing. Is good. Have post in queue for Weekend Edition. You read, all make sense.
Best moment of the day:
Ship turns and accelerates towards the first nav point. Velocity vector aligns with the target direction, all is good. Approaching the model at the target point, we hit the first nav point and... yes, proximity check works, we're headed to the second nav point! Ship turns, velocity vector aligns, here comes the enemy ship I put at the next nav point and....
My ship explodes. Oh, right, apparently this one has a collision object attached to it. :) The camera didn't care though, it was all just, "OK, next nav point, let's go..."
Well, it was funny when it happened.
Thursday, September 5, 2013
Bug Day
Productivity today has been, shall we say, sub-optimal.
I made some progress on getting the "frame of reference" for the game to follow waypoints through the world. The real trick has been getting the rest of the game to respect that frame of reference, especially the physics. This also exposed a few bugs with the model / game object transform synchronization, particularly how that works in a hierarchy of game objects.
However, between illness, medication, telemarketers, family interruptions, lightning storms and power flickering out, today has not been a good day for concentrating on work.
It has, however, been a relatively good day for zoning out on YouTube. Here, have some Muppets:
Wednesday, September 4, 2013
Coroutines, Conversations and Camera Control
In the quest to build an actual level, I have kind of been bouncing all over the place with what I'm working on. On one hand, it feels a little random and undirected. On the other hand, there hasn't really been any effort that I would call wasted. And if I had a third hand, I would say, at least I'm having fun. :)
In any case, before you can really build a level editor, you need to know what data it is you want to edit. While trying to rough in a basic level by hand (i.e. editing text attribute files), I realized there are a couple of fairly key components I'm missing.
The first one is character dialogue. RPG-style conversations and dialogue are one of the key things I want in the game. I had this working in Lua months ago, but I was using coroutines, so it doesn't exactly port easily to C++.
Basically, I would like to write a script like this:
say("Luke: You've picked one up...watch it!")
say("Biggs: I can't see it! He's on me tight, I can't shake him.")
say("Luke: I'll be right there.")
set_nav_point(0)
The call to say(...) shouldn't actually return until you're done displaying the dialogue, which will take multiple frames. With coroutines this is easy - just run the whole script in a coroutine. Call resume() during the game's update and yield() in the say() function after setting up the frame's dialogue.
There are ways to get coroutines working in C++. Getting a stack frame allocated for your coroutine seems to be a bit of a messy business though, so I didn't bother with that and did it in the stupidest way possible: each coroutine creates a new thread and two thread event handles. The thread waits on one of the handles. The coroutine's resume() function signals the thread to continue and waits on the other handle. The yield() function signals the main thread to continue and waits on the first handle again.
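A portable sketch of that thread-backed trick, using std::thread plus a mutex/condition-variable pair in place of the two event handles (the original sounds like Win32 events; the substitution and all the names here are my assumptions):

```cpp
#include <condition_variable>
#include <functional>
#include <mutex>
#include <thread>

// Thread-backed "coroutine": control ping-pongs between the main thread and
// the worker via a turn flag, so only one side ever runs at a time.
class ThreadCoroutine {
public:
    explicit ThreadCoroutine(std::function<void(ThreadCoroutine&)> body)
        : worker([this, body] {
              WaitForTurn(true);      // block until the first resume()
              body(*this);
              { std::lock_guard<std::mutex> lk(m); finished = true; }
              PassTurn(false);        // hand control back one last time
          }) {}

    ~ThreadCoroutine() { worker.join(); }

    // Main thread: let the coroutine run until it yields or finishes.
    // Returns false once the body has run to completion.
    bool resume() {
        { std::lock_guard<std::mutex> lk(m); if (finished) return false; }
        PassTurn(true);               // signal the worker...
        WaitForTurn(false);           // ...and wait for it to yield
        std::lock_guard<std::mutex> lk(m);
        return !finished;
    }

    // Inside the coroutine body: hand control back to resume().
    void yield() {
        PassTurn(false);
        WaitForTurn(true);
    }

private:
    void PassTurn(bool toWorker) {
        { std::lock_guard<std::mutex> lk(m); workersTurn = toWorker; }
        cv.notify_all();
    }
    void WaitForTurn(bool worker) {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [&] { return workersTurn == worker; });
    }

    std::mutex m;
    std::condition_variable cv;
    bool workersTurn = false;
    bool finished = false;
    std::thread worker;  // declared last so the locks exist before it starts
};
```

A say() built on this would set up the frame's dialogue and then call yield(); the game's update calls resume() once per frame until the script finishes. Each coroutine pays for a full OS thread, which is exactly the simplicity-over-cost trade-off described above.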
So, that took very little time to write and seems to work great.
The second thing I need, as hinted above, is some sort of navigation system. Because the other part of the game is basically a scrolling shooter, I need a good way to get the camera to move around the world. I've been going back and forth on whether the camera should navigate the world, or if it should be fixed near the origin and have "the world come to the player," and I'm currently leaning towards having it navigate through the world.
I had this working in Lua too, though I wasn't entirely happy with it - following a Bezier curve works, but it's too easy to make corners that feel abrupt and unnatural. I'm going to try using some of the guidance control ideas for navigating between points under constant angular acceleration to see if that looks/feels better.
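As an illustration of the guidance idea (simplified to a constant turn-rate limit rather than angular acceleration, with names that are mine, not the project's), steering can rotate the current heading toward the target by at most a fixed amount each step, which naturally rounds off corners instead of snapping through them:

```cpp
#include <cmath>

struct Vec2 { float x, y; };

// Returns the new heading angle (radians) after one step of dt seconds,
// turning toward the target no faster than maxTurnRate (radians/sec).
float SteerToward(float heading, Vec2 pos, Vec2 target,
                  float maxTurnRate, float dt) {
    const float kPi = 3.14159265f;
    float desired = std::atan2(target.y - pos.y, target.x - pos.x);
    float diff = desired - heading;
    // Wrap the difference into [-pi, pi] so we turn the short way around.
    while (diff >  kPi) diff -= 2.0f * kPi;
    while (diff < -kPi) diff += 2.0f * kPi;
    // Clamp the turn to what the rate limit allows this step.
    float maxStep = maxTurnRate * dt;
    if (diff >  maxStep) diff =  maxStep;
    if (diff < -maxStep) diff = -maxStep;
    return heading + diff;
}
```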
Tuesday, September 3, 2013
Timer Component (Updated)
What I did today:
Meh.
Another sick day (well, more like, I never got over what I had two weeks ago).
So instead, let's talk about useful components.