Thursday, December 19, 2013

Slack

OK, I've been slacking on daily updates again.  :)

There hasn't been a lot of bloggable stuff going on, though.  I've mostly been focusing on greyblocking all of the levels and putting dialog in.

I made a few additions to the level editor to support some of the things I was doing.  I added clone/duplicate to the editor, which has made placing multiple objects a lot easier.  This was pretty easy to do because cloning is basically an inherent feature in the game object / component system.

There are still some changes that I need to make there.  Most importantly, I need a better way to select objects.  The mouse clicking actually works OK right now, except I'm not displaying bounding boxes for everything, so sometimes it's hard to know what you're clicking on. 

Overlapping objects are a bit of a pain.  What I ended up doing was, when you click on multiple overlapping objects, you select the one with the smallest bounding box.  Most of the time this actually seems to do what you expect.

Some bounding boxes overlap completely, though.  Worse, some objects really don't have a bounding box because they're either points or just stuff in the scene that's doing stuff that doesn't really exist at a specific location in space (such as the main music track).  These all just kind of overlap at the origin.

My UI controls are kind of in a state of brokenness.  I suppose I should just fix them, then I can choose from a list of objects.  I had started work on a tree view, but didn't get it finished yet.  Or I can do the hacker thing and put in next/prev object keys.

Sounds like I have code to write, yay! :)  I like doing level design and dialog, but it doesn't flow as easily as code.


Oh, and after nine years of purposefully avoiding it, World of Warcraft has finally got its hooks into me.  So far it's only cutting into lunch and overtime hours.  :)

Tuesday, December 10, 2013

Weak Parent

Blogging on the go today.  Typing a whole post with the iPhone keyboard is far from fun, but not much choice today.

Today started with a little modeling work. There were a couple "vents" on one of my spacecraft I wasn't too keen about.  I'm still not liking it too much, but it's better.  I'll probably have another go at it tomorrow and maybe get an image up.

Most of the rest of the day was trying to get everything in place to allow proper level transitions.  It's about time to rough in all of the planned levels, but things are currently a bit debuggish and I can't chain level loads yet.

I also put in a new "weak parent" component that basically just copies the transform of one game object to or from another.  After parenting the game camera to the ship, I discovered that the game stops rendering when the player ship explodes because the camera goes with it as the child of the ship.  I suppose you can debate whether this is a good arrangement, or if children should just be reparented.  This way makes a little more sense to me, and adding reparenting as a separate operation from destruction would work fine.
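The component can be sketched in a few lines. This is an illustration of the concept under my own assumed names (the real component presumably handles full transforms, not just position):

```cpp
#include <cassert>

// Minimal transform stand-in (position only, for illustration).
struct Transform { float x, y, z; };

struct GameObject { Transform transform; };

// Hypothetical "weak parent" component: copies the source object's
// transform every update without a real parent/child link, so the
// follower survives if the source object is destroyed.
struct WeakParentComponent {
    GameObject* source;   // may be nulled when the source dies
    GameObject* target;

    void Update() {
        if (source && target)
            target->transform = source->transform;
    }
};
```

With the camera as the target, destroying the ship just leaves the camera where it was, instead of taking it down with the ship.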

So, busy day, working towards getting levels roughed in.  More of the same tomorrow.

Monday, December 9, 2013

Gray Day

After a weekend of Bioshock: Infinite (which was just excellent, in case you are like me and way, WAY behind the times) it's time to get back to work.

After doing a bunch of art-related stuff this morning, I decided to get back to level creation stuff.

Using the drum track of the level's music last week was a good start, but I needed to introduce some more randomness into the equation.

I assigned a spawn range to each drum note and then used a Gray code to determine a spawn position inside each range.  This is a little nicer than just using completely random numbers, because you're guaranteed not to have a repeated number (each new number differs from the previous by exactly one bit).  But it also gives the appearance of randomness at the scale that I'm working at.
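A quick sketch of the idea, with an assumed 5-bit sequence and hypothetical function names:

```cpp
#include <cassert>
#include <cstdint>

// Standard binary-reflected Gray code: consecutive outputs differ by
// exactly one bit, and all 2^n values appear before any value repeats.
uint32_t GrayCode(uint32_t i) { return i ^ (i >> 1); }

// Hypothetical mapping of the i-th note onto a spawn position inside
// [rangeMin, rangeMax), using a 5-bit Gray sequence (32 slots).
float SpawnPosition(uint32_t i, float rangeMin, float rangeMax) {
    const uint32_t slots = 32;
    uint32_t g = GrayCode(i % slots);
    return rangeMin + (rangeMax - rangeMin) * (float)g / (float)slots;
}
```

Because the sequence visits every slot exactly once per cycle, positions look scattered but never land on the same spot twice in a row.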

Result:  pretty fun, actually.  :)  New problem:  it's pretty difficult, too; will need some testing to determine if it's appropriate for just the second level.

Oh, and the art related stuff.  Kicking around "movie poster" type ideas.  I was looking at a bunch of minimalist movie posters and thought they were quite striking.  This is not quite so minimalist, and will probably make no sense to you at the moment, but hopefully it stands out:


Friday, December 6, 2013

"End of stream" means something else apparently

Trying to get the audio synchronization to work.

I was having some trouble last time I looked at it with getting a proper sample cursor position on the stream voice.  The music stream gets read in chunks, so there are multiple buffers getting filled and submitted to the audio system.  I seemed to be having the problem that every time a buffer was started, the current play position was reset, even though the docs said it should be the count until an end of stream marker is reached.

Debugging audio can be a pain in the butt, because there are all sorts of things all happening at the same time with multiple threads, hardware interrupts, etc.  It's hard to debug things in a properly "stopped" state sometimes, because every time you step an instruction, all sorts of other stuff happens.

In any case, the problem should have been obvious:  the end of stream marker doesn't actually stop anything, it just keeps playing submitted buffers happily.  And the end of stream marker was set on every single buffer.

Then I got into looking at the shader node stuff in Blender.  I was trying to set the transparency based on the dot product of the view vector and the surface normal to give a nice smooth falloff.

First off, the shader system is completely awesome, because it actually has input pins for normal, view vector, etc., and math nodes for doing dot products and other vector operations.  So, in theory, it should actually be possible to compute that.

Two problems:

  1. Specifying transparency is a bit difficult, because the renderer is using path tracing, so there isn't really an "alpha channel" you can specify for opacity, it's more about probability that each of the R/G/B channels will be passed through.
  2. The above math operations don't seem to be working as expected.
I'm not sure what the problem is with getting a dot product result.  Possibly some difficulty dealing with negative results, or maybe the vectors aren't in the same space (camera vs. world?).

Plus, it seems that different operations might give output on different pins.  The dot product results in a scalar; it would be nice if it were output to all channels of the vector pin or something too.  Or if it was documented at all.  Or if the source code wasn't so opaque.  (haha, opaque... stopping now)

Oh, I also wrote a tool to convert a MIDI file into script commands that spawn objects at specific time values.  This is halfway to cool, except drum tracks only have a few different notes, which is presently kind of boring.  Still working on a good solution to that.
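The core of such a tool is just the tick-to-seconds conversion. This is a sketch under my own assumptions (the `NoteEvent` struct and `Spawn(...)` command syntax are made up, not the author's actual script format); tempo here is the standard MIDI Set Tempo value in microseconds per quarter note:

```cpp
#include <cassert>
#include <cstdio>
#include <string>
#include <vector>

// A note event as it might come out of a MIDI parser (hypothetical).
struct NoteEvent { long tick; int note; };

// Convert a MIDI tick to seconds: tempoUs is microseconds per quarter
// note (a MIDI Set Tempo value), ppq is ticks per quarter note.
double TickToSeconds(long tick, long tempoUs, int ppq) {
    return (double)tick * tempoUs / (ppq * 1000000.0);
}

// Emit one hypothetical script command per note, spawning an object
// keyed off the note number at the note's time in seconds.
std::vector<std::string> EmitSpawnCommands(const std::vector<NoteEvent>& notes,
                                           long tempoUs, int ppq) {
    std::vector<std::string> commands;
    for (const NoteEvent& e : notes) {
        char buf[64];
        std::snprintf(buf, sizeof(buf), "Spawn(%.3f, %d)",
                      TickToSeconds(e.tick, tempoUs, ppq), e.note);
        commands.push_back(buf);
    }
    return commands;
}
```

At 120 BPM (500,000 µs per quarter) and 480 ticks per quarter, tick 480 lands at 0.5 seconds.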

Thursday, December 5, 2013

Smooth, very smooth

Back to work.

I basically took the last two days to recharge.  Motivation and energy have been in short supply recently for some reason, so I took some time to play some games and just record some music for fun.  It's nice to be able to take a mental break when you need it, and today has been relatively productive.

A while back while playing with Shadertoy(1) I came across the smoothstep shader command, which for some reason I hadn't come across before.  Microsoft's documentation is a little opaque, but the Wikipedia article is quite informative.

So basically, rather than doing a linear interpolation with lerp(), you can do a smooth interpolation with smoothstep().  The function it uses is f(x) = 3*x^2 - 2*x^3, which at first glance looks like a bunch of numbers somebody just made up, but as you can read in the article, they're the numbers you come up with if you figure out the coefficients on a generic 3rd order equation that has f(0) = 0, f(1) = 1, f'(0) = 0 and f'(1) = 0.  How cool is that?

You can also do the same thing for a 5th order equation that also has zeros at the second derivative, for a "smootherstep."
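Both versions are only a few lines each. A minimal sketch (not the author's library code; function names are mine):

```cpp
#include <cassert>

// f(x) = 3x^2 - 2x^3: the cubic with f(0)=0, f(1)=1, f'(0)=f'(1)=0.
// Expects x already clamped to [0, 1].
float Smoothstep(float x) { return x * x * (3.0f - 2.0f * x); }

// 5th-order "smootherstep", also flat in the second derivative at the
// endpoints: f(x) = 6x^5 - 15x^4 + 10x^3.
float Smootherstep(float x) {
    return x * x * x * (x * (x * 6.0f - 15.0f) + 10.0f);
}

// Interpolate between a and b with the smoothed parameter.
float SmoothLerp(float a, float b, float t) {
    return a + (b - a) * Smoothstep(t);
}
```

Note that the HLSL intrinsic also takes min/max edge parameters and clamps; the bare polynomial above assumes a normalized input.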

So, I inserted that into my vector and curve libraries, because that's just very useful.  :)

Then, onto some real work.

I want blue laser "bullets," but they really weren't showing up very well against my background.  What I really want is the "lightsaber" look - bright white inner "core" with a blue halo around it.  I was trying to put a really bright blue bullet into the scene hoping the bloom filter would blur it out, but that doesn't really give the white core. 

I modified the bullet model to have a white core with a transparent blue shell around it.  That's an improvement, but not quite there yet.  This uncovered some unimplemented features in the shader graph (the "mix" node wasn't working properly).  This also prompted me to fix the alpha sorting, which wasn't properly detecting when transparent shader nodes were being used and setting the correct flags.  Hopefully that's it for pipeline work for a while.

Regardless, I now have a white core... with a blue shell around it.  Which looks OK, but doesn't have the falloff I want.  Ideally I would adjust the alpha based on the dot product of the view vector and the normal at each pixel to give more falloff at the edges.  That's an easy enough shader to write, but I'll need to find a way to specify it in the shader graph.

Then I wondered what the game would look like if I set the parent of the game camera to the ship rather than the world as I fly down and around the "hyperspace tunnel."  I think this is going to have to be part of the game.


1:  In Firefox, I needed to go to about:config and set webgl.prefer-native-gl to true, or that site would hang Firefox basically forever and I would have to kill it from the process manager.  From what I was able to determine, the translation layer that converts from WebGL to DirectX doesn't work very well on some code.

Monday, December 2, 2013

I Used To See Four Lights

So, what's going on today...

I decided to put some proper lighting code into the component system.  I've been using the same four hard-coded lights for quite some time, and it seemed like maybe now was the time to do something a little better.  :)

So, adding in a couple of light components.  First, the lights themselves, which are attached to a transform and can therefore be attached to other objects or otherwise moved around the scene.  And second, a collector, which can be attached to a model object and determine the influence of the lights in the scene on the object.

This is pretty basic, we'll see how well it works out.  Currently I'm just accumulating all of the lights into a single set of spherical harmonic coefficients for each object.  That might become performance-prohibitive eventually, though any sort of dynamic spatial partitioning probably won't be cheap either.
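One common way to do the accumulation (a sketch of the general technique, not the engine's actual code) is to project each directional light onto the first four spherical harmonic basis functions and just sum the coefficients:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };
struct DirectionalLight { Vec3 dir; float intensity; };  // dir assumed normalized

// Order-1 SH projection: accumulate every light in the scene into four
// coefficients per object (constants are the standard Y00/Y1 basis values).
void AccumulateSH(const DirectionalLight* lights, int count, float sh[4]) {
    sh[0] = sh[1] = sh[2] = sh[3] = 0.0f;
    for (int i = 0; i < count; ++i) {
        const DirectionalLight& l = lights[i];
        sh[0] += 0.282095f * l.intensity;            // Y00 (constant term)
        sh[1] += 0.488603f * l.dir.y * l.intensity;  // Y1-1
        sh[2] += 0.488603f * l.dir.z * l.intensity;  // Y10
        sh[3] += 0.488603f * l.dir.x * l.intensity;  // Y11
    }
}
```

In practice you would do this per color channel, but the shape of the loop is the same.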

So, today was mostly spent brushing up on the math for all of that again... and then remembering that DirectX has a bunch of functions for this already... and then remembering that I already wrote code that was using them months ago.  I seriously need to defrag my brain or something.

Thursday, November 28, 2013

Pushing

So many things going on all at once... not sure which of them might be interesting.  :)

The last couple of days have been spent jumping around all over the place:
  • Reviewing concept character art - I now know what my lead characters look like (!)
  • Writing character dialog
  • Working on the "hyperspace" level design, resurrecting some old test code
  • Arranging music
  • Adding "cylinder" primitive drawing to the render library
  • Putting local variables into the script system
  • Music sync and object spawn sync
  • Dealing with physics issues
The physics issues were perhaps the biggest problem.  Currently, Blender doesn't let you set the mass of anything to more than 10000 units (say, kg) or less than something like 0.01.  This was causing some problems with low-mass bullets pushing high-mass objects around in unpleasing ways simply because they move fast enough to carry a lot of energy.

I tried a few different methods for dealing with this, but none of them were working very well.  Disabling collision response kind of works, but isn't selective enough, and actually lets some objects pass through each other when I do want them to collide.

I tried making the object Kinematic, which means it exists in the collision system but doesn't have its motion affected by collisions.  Unfortunately, in Bullet, it doesn't actually have any built-in motion of any sort, it requires game-side key-frame animation, where I just wanted to set a velocity and have it go.

So, I ended up just making it a regular dynamic rigid body with a really, really large mass.  This may cause weird numerical instability, but my immediate tests seem to work just fine - it doesn't get pushed around by bullets, and nudges the player ship out of the way during collisions.

Then, I was trying to get a constraint to work in the physics system, but it was all going horribly wrong... until I finally figured out that some code forcing the ship orientation was causing the constraint to be violated.  Removing that and just letting physics "do its thing" fixed everything nicely.


I would say there has been some real progress on getting the level done.  Getting music, spawning and rendering all in sync still needs some work, but the pieces are really falling into place now.

Monday, November 25, 2013

Sonic Screwed Up Driver

It was a Doctor Who weekend - meaning, I didn't exactly get a great deal done.  That is to say, I got an immense amount of "research" done, but not a lot else.

I did listen to a few of the bits of music I put together last week several times.  It's vitally important to listen to things you've been recording at 2:00am with fresh ears in the cold light of day.  While I like some of them quite a bit, it will be a while before they're a proper track.

I put what I have into the engine today anyway.  As predicted, horns and strings by themselves show up nicely in the spectrum, while guitar, bass and drums are all kind of squashed into the same frequency band, and drown out everything else.  I will need to play with the EQ to see if I can find something that both looks and sounds good.

In any case, I started doing the script to spawn a few asteroids in time with the music.  That's working OK, except for being slightly out of sync.  The big problem is that the texture scroll and the object motion seem to be slightly out of phase.  Motion in games is inherently jittery, but they're not jittering at the same time, which kind of stands out.  This probably has something to do with the texture being updated based on when new audio buffers become available, and the physics is just updated off the system timer.  I've been meaning to run the game update time off of the audio play cursor, which will probably help, but I imagine there will be some wrangling to make that work.
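The "game time off the audio play cursor" idea mentioned above can be sketched very simply; this is an illustration of the concept with hypothetical names, not the engine's actual timing code:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical audio clock: derive game time from the number of samples
// the audio device has actually played, instead of the system timer, so
// texture scroll and physics advance off the same timebase.
struct AudioClock {
    uint64_t samplesPlayed = 0;  // updated from the audio play cursor
    int sampleRate = 44100;

    double GameTime() const {
        return (double)samplesPlayed / sampleRate;
    }
};
```

The wrangling comes from the play cursor advancing in buffer-sized jumps; some smoothing or interpolation between updates is usually needed before this drives a physics step.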

Also, the whole thing could hardly be called "frantic" at this point... which is fine for the first level of trying to ease the player into the game, but will need to be worked out at some point.  I had some old gameplay tests with a different camera angle and objects moving faster, I'll need to give that a try.

Which will happen shortly after I stop finding interesting* stuff on the internet and actually focus for five minutes.

Which makes game development kind of like a Doctor Who episode.  All running about and getting distracted by nonsense, and then five minutes of brilliance saves the day.  Most of the time.



*not actually interesting, so why is it distracting?

Friday, November 22, 2013

Extra Dimensions

Music AND art day today.  Oh my.

What I learned today:  My non-artist brain likes to think of hair as strands.  This is not at all helpful when trying to make a rough pencil sketch, where you really just want to trace the outline and overlaps. 

Most of the day was spent composing "Hyperspace," a high energy little tune for those times when you are hurtling though extra-dimensional space with reckless abandon and trying not to die in a giant fireball from an asteroid collision.  Seriously, we've all been there.

It's coming along.  I have a few of the basic bits worked out, though there's quite a bit to go, and I'm not "feeling the love" yet.  The main problem with trying to compose a specific song for a specific purpose is that if it's not working, you're still kind of committed.  I start a lot of pieces, but only a handful ever get finished because they'll have one cool riff that just doesn't go anywhere.  We'll see how it goes, hopefully I don't have to reboot it.  :)

The other thing of note is that my giant snare from yesterday is probably a bit airy for something that wouldn't sound too out of place coming from Symphony X or Metallica, so that might need some work.  We'll save that for some heavy, plodding, real-space themed song.  :)

Thursday, November 21, 2013

Heavy Metal Games

It's a music day today.

I spent most of the day figuring out how to create a decent heavy metal drum kit.  I think I'm basically happy with the basic sound now.  The toms could maybe use some work, but the kick and snare are sounding decent.

So, things learned today:

The kick drum doesn't really need much of its low frequency range.  EQing up anything below 75Hz just muddies things up, though you do need a bit of low end.  The interesting parts of the kick actually occur closer to 4kHz and 8kHz, where the "click" actually happens and makes the kick punch through the mix.

Making the giant 80s snare sound basically calls for a gated reverb.  That's basically an effect chain with a reverb followed by a noise gate.  Getting the noise gate to work is a bit of a trick, because you want it keyed off the initial sound level, not the reverb level.  I'm using Reaper, which has a fairly easy method of doing this:
  1. Send the snare to a separate track for just the snare reverb
  2. Give that track 4 channels
  3. Effect chain:
    • Duplicate the input channels 1/2 to channels 3/4 (Utility/chanmix2)
    • Add the reverb
    • Add the gate (ReaGate)
      • Set the Detector input to Auxiliary Input

The main trick here is that ReaGate lets you use a channel for volume detection that is separate from the main signal.  There are probably other methods in other recording software, but this works nicely here.

Then I did a little playing with parallel compression, which is just mixing the original uncompressed sound with a heavily compressed version of it. I still need to look into this some more.

This took a lot of tweaking, and then recording a guitar and bass loop so I could see how they sounded mixed together.  It's quite interesting how things can sound vastly different (for better or for worse) once you actually mix them all together.

So, tomorrow will hopefully be composing a test piece for a level and maybe actually getting object spawning synced with it.

Wednesday, November 20, 2013

Title Goes Here

Sometimes, figuring out a clever title for the blog post is the hardest part.  Yesterday's drawing failure, power failure and failure to fail made for a fairly obvious choice though.  :)

Then there are days like today that just devolve into a bug hunt and "oh I didn't implement that yet did I" moments.

I can at least sequence object spawning from a script now.

Other things on today's menu included trying to coax a good heavy metal drum kit out of Native Instruments' Studio Drummer.  I had a kit set up with another plugin that I really liked, but it tends to lose its settings and otherwise behave badly.  The drums in Studio Drummer sound really, really nice, but they're a little subdued for what I was going for.  I'm sure it can be fixed with the proper application of EQ, compression and reverb, but it's going to take some work.

So, now that I'm able to spawn an object, I discovered... I really need to figure out my enemy design.  :)  I'm trying to figure out a basic "building block" piece that can be duplicated and assembled into multiple different structures.  I haven't quite figured out exactly what that shape is yet.

Tuesday, November 19, 2013

Fail, Fail, Fail

OK, busy day.  Productive, maybe not so much, but busy.

First, I was playing around in GIMP trying to figure out why I Can't Draw.  Today's discovery:  I can draw horizontal lines relatively well.  Just drawing over the same horizontal line has a very tolerable margin of error.  I can't draw vertical lines at all.  They're not straight, and trying to draw over the same line has like 1000% error.  Minimum.  Tracing circles is interesting too.  There are certain quadrants that I can trace very accurately.  Then I get to, primarily, the lower right quadrant and my pen wobbles all over the place, no matter how slowly or carefully I try.  This requires more study, because I can't yet detect the subtlety that is causing things to go wrong.

Then, the wind knocked the power out.  Not helping.  Then I even briefly lost contact with the cell phone network.  Seriously, I nearly started sharpening sticks to go hunt wooly mammoths during this brush with the stone age.

Fortunately this brush with abject terror only lasted about 90 minutes, but the UPS couldn't handle it and my Linux uptime is ruined.


Next, time to do some actual work.  :)  I started working on some language features to make object spawn scripting work better.  Primarily this involved adding some coroutine support so I can make a timed event wait for that time to arrive.  The system works more or less fine for sequencing things without this, but it's hard to get things to trigger only once.

I'm still debating how I want to do certain things.  On one hand, it would be nice to spawn a specific object at a specific place at a specific time with specific AI.  On the other, pseudo-randomly generating an entire level has a certain appeal.  Unfortunately, the latter has had me stalled for much too long and it's time to press on.

The idea was for this first game to fail quickly while I built up technology, so I could make the second a success.  It's not failing quickly enough... or, something.

Monday, November 18, 2013

Always One More Thing. Or Two.

Last week's daily updates were not exactly daily (and therefore, I suppose, not updates either).  Let's see if this week goes better.  :)

At least you got to miss all of the updates that can be roughly summarized as: "I don't feel well today, YouTube, brain fog, Twitter, I can't figure this out, Facebook, what was I doing again? twitch.tv, this is never going to work, oh I got something working when did that happen?"

I got most of the major overhaul of the scripting system done, so I can basically map any C object directly into the scripting system.  I'm sure that's giving some security expert somewhere palpitations or convulsions, but since it's all bound to specifically named objects and data members, it should be fine.

Overall, this is actually kind of handy, although I'm still not "feeling the love" for a good way to define level scripts.  Somehow, writing out every enemy spawn, by hand, at specific locations, just feels a bit meh.  A quick perusal around the web of methods other people have used pretty much comes down to exactly that, though.

So, I suppose it's time to actually do some real scripting.  But first, I need to... make a couple more tweaks to the scripting system (you didn't see that coming, did you? :-p )

No, really, they're good ones.

First off, I'm kind of trying to sync the levels to music, so I'm going to add something to natively convert measure/beat time codes to seconds.  Easy enough.
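The conversion really is easy; a sketch, assuming 1-based measures and beats (so measure 1, beat 1 is time zero), which is my assumption rather than anything stated in the post:

```cpp
#include <cassert>

// Convert a measure/beat time code to seconds. bpm and beatsPerMeasure
// describe the track's tempo and time signature.
double BeatTimeToSeconds(int measure, double beat, double bpm,
                         int beatsPerMeasure) {
    double totalBeats = (measure - 1) * beatsPerMeasure + (beat - 1.0);
    return totalBeats * 60.0 / bpm;
}
```

At 120 BPM in 4/4, measure 2 beat 1 is four beats in, or exactly two seconds.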

Second, one of the main suggestions I have run across regarding scripting enemy spawns is that hard-coding the numbers is bad because if you want to shift sections around, you have to change a lot of numbers.  So I'm going to build in the concept of time regions.  Basically, there is a global time based on how long the level has been running, and you can recursively open a new local time scope relative to the parent time.  I've used this before, it's pretty slick for sequencing or animating things.  It should make it fairly easy to move around sections anyway.
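A minimal sketch of the time-region idea, with hypothetical names: each region opens relative to its parent, so a whole section of spawns can be shifted by changing one offset.

```cpp
#include <cassert>
#include <vector>

// Recursive local time scopes: Push() opens a region at an offset
// relative to the enclosing region; ToGlobal() converts a local time
// code into global level time.
class TimeScope {
public:
    void Push(double offset) { stack_.push_back(Base() + offset); }
    void Pop() { stack_.pop_back(); }
    double ToGlobal(double localTime) const { return Base() + localTime; }

private:
    double Base() const { return stack_.empty() ? 0.0 : stack_.back(); }
    std::vector<double> stack_;
};
```

Spawn scripts then only ever use local times, and rearranging sections is just a matter of moving the Push() offsets.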

With those things and a Spawn() function, hopefully I'm off and running.

Why it took so long to get to this point is one of the great mysteries of the universe.  I suspect it has something to do with the universal programming equation:

t = t * 2
where:
t is how long you estimate something is going to take and
2 is some constant, possibly larger than -INF and smaller than +INF, but quite possibly NAN.

Being recursive, it causes some consternation amongst project managers and people trying to get things done.

Wednesday, November 13, 2013

The Code That Binds

I have been spending the last little while trying to get the data binding stuff worked out.

After several false starts, I have finally settled on using the expression parser / compiler to do the data binding.  Ultimately, this seemed to make the most sense.  Copying data from one arbitrary place to another is basically what a script is supposed to do, plus it gives some additional flexibility.

Quite often it seems that the best approach to solving problems is to ask, "how do I want to interact with this system" rather than, "how do I want to implement it?"  Basically, I decided I wanted to specify data binding something like this:

    ScriptComponent
    {
        Context
        {
            src = BezierFunctionComponent
            dst = SpawnComponent
        }
        Script = dst.Rate = src.Value(t)
    }
That basically lets me map specific components to variable names in the script environment, and then operate on them.  Since this all gets compiled at load time, and the above example only generates a tiny number of nodes in the parse tree (say, 4 or 5), this should execute insanely fast and be very flexible.

Unfortunately, the variable system I had in place was somewhat limited.  So, I'm about half way through getting a proper symbol table system implemented.  This basically lets me memory-map a component into the script system, with the added benefit of putting structure definitions and stack frames (local variables) into the scripting system in general.
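The "memory-map a component" trick can be sketched with a byte-offset symbol table; this is an illustration of the general technique (all names are mine, not the actual engine API):

```cpp
#include <cassert>
#include <cstddef>
#include <map>
#include <string>

// A component the script system should be able to touch directly.
struct SpawnComponent { float Rate; int Count; };

struct Symbol { size_t offset; };

// Maps member names to byte offsets, so the script can read or write a
// member through (base pointer + offset) without any copying.
class SymbolTable {
public:
    void Add(const std::string& name, size_t offset) { table_[name] = {offset}; }
    float* ResolveFloat(void* base, const std::string& name) const {
        auto it = table_.find(name);
        if (it == table_.end()) return nullptr;
        return (float*)((char*)base + it->second.offset);
    }

private:
    std::map<std::string, Symbol> table_;
};
```

Registering `offsetof(SpawnComponent, Rate)` under the name "Rate" is what lets a compiled expression like `dst.Rate = src.Value(t)` write straight into the live component.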

Still a ways to go, but making progress.

Friday, November 8, 2013

Time to Reflect

It's time to start putting some enemies into the game.

This raises some questions about what exactly is the best way to do that.

The basic components:
  • Enemy spawner - creates enemies from one of several "prototypes"
  • Enemy path - curve enemy should follow
  • Path chooser - binds one of multiple possible paths when an enemy spawns
I've been trying to figure out the best way to implement this in the component system.

Ideally, each of the path components would exist separately from the enemy object, so they only need to be created once.  The Bezier curve component does a lot of pre-tessellation of the curve that I don't want to duplicate every time an enemy is spawned, so it can't live directly inside each enemy instance.

I already have a fairly decent system for binding objects by name, so I can probably use that.  The question is whether to create a unique GameObject for each path and link to the GameObject, or to embed a set of the BezierPathComponents in a single GameObject.  The system for handling multiple components of the same type in a single GameObject is slightly underdeveloped, but this is a pretty obvious way to create "sets" of paths.

The next thing then would probably be to create some sort of "chooser" component that can select a particular component at random and plug it into the GameObject.

Along this line, I've also been wondering about the best way to animate values in certain components.


For instance, in the spawner itself, there is a Rate value that specifies the number of GameObjects to spawn per second.  Ideally, I would like to animate this value on a curve so I can essentially turn it on and off at different points in the level.  My typical approach would have been to change the Rate variable from a float (effectively constant once the object is spawned) to a different type that can either bind to a curve or script or something similar and be evaluated each time it is referenced.

I'm thinking this is the wrong way to do things.

What I want is a separate component that animates the value and writes it into the Rate variable directly.  The advantages to this are that any variable can potentially be animated, and there is no overhead for variables that are not animated.
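A minimal sketch of that animator idea, with made-up names and a trivial linear curve standing in for a real curve type:

```cpp
#include <cassert>

// Stand-in for a real curve/Bezier evaluator.
struct LinearCurve {
    float start, end, duration;
    float Evaluate(float t) const {
        if (t <= 0.0f) return start;
        if (t >= duration) return end;
        return start + (end - start) * (t / duration);
    }
};

// Hypothetical animator component: holds a pointer to the target
// variable and writes the evaluated curve into it each update, so
// un-animated variables stay plain floats with zero overhead.
struct ValueAnimatorComponent {
    float* target;   // e.g. &spawner.Rate
    LinearCurve curve;
    void Update(float t) { *target = curve.Evaluate(t); }
};
```

The binding step (getting `target` to point at the right member by name) is exactly the reflection problem described below.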

The trick is to do the data binding properly.  I don't really have any sort of reflection available, so I don't have any data-driven way to bind to a specific member variable by name.  I either need to add that or find a good place to add custom code that binds the source and target together. 

I'm not exactly enthused about doing either of those :) but some sort of reflection interface seems like the "correct" choice.  Ideally I could hook that up to the construction code that pulls things out of the attribute files too, but it will have to be introduced piecemeal, because I'm not going back to put it everywhere today.

Thursday, November 7, 2013

Music Collection

Exploring gameplay possibilities today.

Visualizing the frequency graph of the music was working out pretty well, as was finding the peaks in it, but it wasn't exactly interactive yet.  In fact, pretty much all of the data about the peaks only ever existed inside the shader, so...

I moved a bunch of the peak finding stuff back to the CPU and stored it so I can use it game-side.

First experiment:  fly around and collect the peaks!  My ship can now fly around and carve a swathe through the audio data.  Currently it's just using the bounding box for the ship.  It shouldn't be too difficult to give it some sort of masking pattern, but let's see how this works out first.

First problem encountered:  ship acceleration and top speed are way too high for precision maneuvering.  I knew this before, but it wasn't a big concern until I needed to stop at a very specific location.  The biggest problem is the minimum distance you could move was a little bit more than a ship's width, and the desired precision is a lot smaller than that.  So, I added acceleration curves to the ship and tweaked some of the values there.  It probably still needs more work, but it's definitely easier to handle now.

Next problem encountered:  this is kind of cool, but I'm not sure it's exactly fun yet.  :)  I think I want to look into bits you do want to collect, and bits you don't want to collect, and maybe the ability to shoot bits away.

I also looked briefly into spawning objects based on the frequency data, but that was resulting in too many objects, or badly spaced objects.  It probably has a place, just needs more control over the spawn rules.


So, next, can we make this fun?  :)

Wednesday, November 6, 2013

Frequently Analytic

Still monkeying around with frequency analysis, trying to brainstorm up interesting ideas of things to do with all of this data I'm generating.

I finally worked out a really good method for getting some very nice, representative data for the "main" frequencies in music.  It was actually pretty easy - basically, taking the difference between the level of a particular frequency and the RMS of all frequencies in a particular frame really makes the peaks jump out (with some logarithms and scales to smooth things out).  I was previously using a rolling average of each frequency band, and while it worked "under laboratory conditions," it was just unreliable.
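The core of the peak test is small; this sketch keeps just the level-minus-RMS comparison and omits the logarithm/scale smoothing mentioned above (names are mine):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// A band is a "peak" when its level exceeds the RMS of all bands in
// the frame by some margin.
std::vector<int> FindPeaks(const std::vector<float>& bands, float margin) {
    double sumSq = 0.0;
    for (float b : bands) sumSq += (double)b * b;
    float rms = (float)std::sqrt(sumSq / bands.size());

    std::vector<int> peaks;
    for (size_t i = 0; i < bands.size(); ++i)
        if (bands[i] - rms > margin)
            peaks.push_back((int)i);
    return peaks;
}
```

Comparing against the whole-frame RMS instead of a per-band rolling average is what makes the test robust across loud and quiet passages.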

One slight annoyance is that at 44.1kHz, smallish FFT window sizes (say, 1024 samples) mean that each frequency band is 22Hz in width, which is fine for high frequencies, but is much less than the difference between adjacent notes at low frequencies.  Equally unfortunately, you get frequency data all the way out to 22kHz, when most musically interesting content stops around 4kHz.

So, I'm using an FFT window size of 4096 and taking the lowest 256 frequency bins, which gets me fairly meaningful data out to 5.5kHz, while expanding out the lower frequencies.
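The engine code is C++, but the level-minus-RMS trick from the previous paragraphs can be sketched in a few lines of Python (the function name and epsilon are mine, and this operates on an already-computed magnitude spectrum rather than doing the FFT itself):

```python
import math

def peak_metric(spectrum, floor=1e-12):
    """Per-band 'peakiness': the band's level in dB minus the RMS of the
    whole frame in dB.  Bands that stand well above the frame's overall
    energy come out strongly positive; everything near the floor goes
    negative, which makes the peaks jump out."""
    rms = math.sqrt(sum(m * m for m in spectrum) / len(spectrum))
    rms_db = 20.0 * math.log10(rms + floor)
    return [20.0 * math.log10(m + floor) - rms_db for m in spectrum]

# A frame with one dominant band: the metric isolates that band.
frame = [0.1] * 16
frame[5] = 1.0
metric = peak_metric(frame)
```

Compared to a per-band rolling average, using the frame's own RMS as the reference means a loud frame raises the bar for every band at once, which is what keeps it reliable outside "laboratory conditions."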

The next interesting thing is that different styles of music and mixing can apparently give wildly different visual results:

Volbeat - Heaven Nor Hell
  • Kick / Snare / Bass / Guitar all get kind of muddled into the same narrow low frequency bands
  • Drum-only sections really jump out though
  • Singer's voice shows up amazingly well - interesting harmonic ringing as well (why?)
  • Harmonica creates some amazing high frequency sections that wobble back and forth :)
Star Wars - Main Theme
  • Orchestras apparently cover a HUGE frequency range with meaningful music frequencies.
  • There are peaks across the entire spectrum that clearly correspond with the music.
  • Loud and quiet passages are also clearly visible.
  • Interestingly, there is almost nothing below ~80Hz, an area that's clogged in my own mixes
Laura Shigihara - Jump
  • The instruments give amazingly clear individual notes
  • Occasionally her voice jumps out, but most of the time it gets lost in the other instruments
  • Sometimes, though, I can see two parallel frequency bands corresponding with vocal harmony parts, which is cool :)
  • There's actually not much OTHER than peaks here, very clean and quiet frequency map.
Me - Rock7
  • Kick / Snare / Guitar / Bass are just kind of jumbled into the low bands
  • The artificial harmonics really stand out...
  • But there are actually a huge number of peaks that jump out semi-randomly across the entire spectrum.  They kind of correspond with the lead guitar, and maybe cymbal / hi-hat hits.
  • Unfortunately, the lead guitar kind of gets lost among the noise.
Some of my other rock tracks exhibit the same sort of frequency clustering below 200Hz.  I suspect this is maybe just mixing low frequencies too hot, pushing up the RMS such that only these frequencies get above it.  There are some higher frequency synth sections that manage to jump out of the mess and show up very clearly though.

It will probably take a bit of mixing experimentation to figure out what works well here.  Actually, I could probably consider dropping everything below 60-80Hz from the RMS calculation... will need to investigate that...

Tuesday, November 5, 2013

Visualization

More visualizing music today.

I started off getting the music synced to the rendering.  This wasn't too difficult, but I need to read ahead quite a large amount of the audio file to "fill up" the data buffer before the music starts.

Then, I spent most of the rest of the day playing around with different settings to see if I could find something I like, and that will work for multiple different music files.

Mostly, I want to detect the current peaks in the frequency data, and detect when beats occur.  This is a little difficult, as frequency data is messy and noisy, but maintaining a running average for each frequency and flagging when the current level rises above that average seems to work reasonably well.  There is a bit of magic scaling going on to produce more or fewer peaks, and it's difficult to pick a value that works nicely for all files.  I suspect I can automatically tweak it to gradually move towards a target number of peaks-per-second.
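A rough Python sketch of that idea — running average per band, a scale factor as the threshold, and the factor nudged each frame toward a target peak rate.  All the names and default values here are mine, not the engine's:

```python
class PeakDetector:
    """Flag a band as a 'peak' when its level exceeds its own running
    average times a scale factor; nudge the factor each frame toward a
    target peaks-per-second rate so one setting works across files."""
    def __init__(self, bands, scale=1.5, smoothing=0.95,
                 target_pps=8.0, fps=60.0, adapt=0.01):
        self.avg = [0.0] * bands
        self.scale = scale
        self.smoothing = smoothing
        self.target_per_frame = target_pps / fps
        self.adapt = adapt

    def update(self, levels):
        peaks = []
        for i, level in enumerate(levels):
            if level > self.avg[i] * self.scale:
                peaks.append(i)
            # Exponential moving average of each band's level.
            self.avg[i] = (self.smoothing * self.avg[i]
                           + (1.0 - self.smoothing) * level)
        # Too many peaks this frame -> raise the threshold; too few -> lower it.
        self.scale += self.adapt * (len(peaks) - self.target_per_frame)
        self.scale = max(1.0, self.scale)
        return peaks
```

The feedback on `scale` is deliberately slow, so a dense passage doesn't whipsaw the threshold within a few frames.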

Now, to figure out what to do with that.  There is a game in here just waiting to get out, I can feel it.  :) 

Monday, November 4, 2013

Audio From Before Time

Time for a change of pace.

Several months ago I did some prototyping work on using music files to, to some extent, generate levels.  I've been planning on using this for various later parts of the game, but since I'm feeling a loss of momentum at the moment, it seemed like a good time to go back and bring some of that code up to date.

Since this code predates the last major engine architecture change (and before I started blogging regularly), it's fairly out of date.  At the time I was using a horrible, cobbled-together .wav streamer.  Since then, I completely re-wrote the audio streaming system (now supporting .ogg files, yay) and now have a much better system for dealing with streamed audio buffers.  Today has mostly been a bunch of cut-and-paste to move everything into a proper component and hook it up to the streamer callback.

So, that's coming along, and now sends everything through the FFT to get a spectrum out the other end, and dumps it into a texture so I can actually visualize the audio. 

I had a few new ideas for gameplay with this system...  I'm not sure how I work it into the fiction of what I want to do for the rest of the game, but I'm sure that can be worked out.  :)  I'm half-inclined to make it a game of its own and maybe get something actually, you know, done and released.  We'll see about that...

Friday, November 1, 2013

Shaders and Pipeline

Yesterday was a bit of a distraction with Halloween, and hardware upgrades gone awry (not to my PC, my PC is still sufficiently awesome).  That's all dealt with, so back to some real work today.

I was still unhappy with the way materials and shaders were coming out of the pipeline, so I spent most of today cleaning all of that up.  It makes it quite a bit easier to modify specific shader parameters on specific materials, which I really need.

The main problem is that when I converted the shaders to the shader node / graph system, I was basically baking the shader constant values directly into the shader.  This basically meant there was one shader compiled for every material on every mesh.  On one hand, this wasn't so bad, because I don't have that many models really.  On the other, it's just awful.  :)  So, I put in proper shader sharing within a single model and added material / shader constant blocks that are set (and potentially modified) at runtime.
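The actual pipeline is C++, but the shape of the fix — share one compiled shader between identical materials, and move the constants out into per-material blocks — might look something like this Python sketch (the class and parameter names are mine):

```python
class ShaderCache:
    """Share one compiled shader among all materials whose generated
    source is identical, instead of baking constant values into the
    source and compiling one shader per material per mesh."""
    def __init__(self, compile_fn):
        self._compile = compile_fn      # injected; stands in for the real compiler
        self._shaders = {}

    def get(self, source):
        if source not in self._shaders:
            self._shaders[source] = self._compile(source)
        return self._shaders[source]

class Material:
    """Constants live in a per-material block that is set (and possibly
    modified) at runtime, so the shader itself stays shareable."""
    def __init__(self, shader, constants):
        self.shader = shader
        self.constants = dict(constants)

    def set_constant(self, name, value):
        self.constants[name] = value
```

With constants out of the source, two materials that differ only in, say, a tint color collapse to a single shader plus two constant blocks.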

That's coming along well.  There is maybe a little cleanup required, and a few game-side functions to add to access the material data in a friendlier fashion, but that should all be done by the end of the weekend.

Wednesday, October 30, 2013

The Constants of Change

Another busy level editing day.

The BezierPathComponent thing is working out decently well.  Travelling along the path and allowing starting/stopping at specific endpoints is doing its thing, and being able to specify a speed along the curve, rather than relying on spacing the points consistently, is very nice.

It's actually kind of nice to be able to play basically the entire level with nav points and dialog now.  :)  The only things not really working are the opening camera pan and the small puzzle right at the end.

The opening camera pan... well, I need to get a better method for animating the camera.  I partially wonder if it would be worth getting animation export working from Blender so I can just animate a scene there and replay it in-game...but that seems a bit excessive at the moment.

The puzzle at the end presented some more troubles.  I basically just want the player to arrange a bunch of objects into the correct sequence.  The arrangement part isn't too hard, but when I went to make a test model for it, I realized that I want to be able to modify certain shader constants to be able to change the colors of certain meshes at runtime.  Unfortunately, my shader pipeline was not exactly set up to support this, so I dove back into the asset compiler again to make that work.  The shader generation is fixed, but I still need to deal with the runtime side of it.


Tuesday, October 29, 2013

So, you want to travel along a Bezier curve

So today was a bunch more playing around with Bezier curves.

I got the curve editor basically working well, but how to actually travel along the curve?  The thing about Bezier curves is that, as you evaluate them over a specific t∈[0,1], the change in distance travelled is not directly proportional to the change in t.

Even worse, if you are joining multiple curves together, using time elapsed as the t value could give wild discontinuities in velocity as you cross the boundary between a long curve and a short curve.  It's also very difficult to control acceleration at the beginning and end of travel along the curve.

So, today has been about evaluating a Bezier curve in terms of distance travelled, rather than elapsed time.
It appears that evaluating the length of a Bezier curve is fairly expensive as there is no closed form for computing it.  You basically have to recursively divide it up into a bunch of line segments and sum up their lengths, with improved accuracy the smaller you make the segments.  The error is just the difference between the sum of the lengths between the control points vs. the length between the first and last points - that is, keep subdividing until the segment is close enough to a straight line.

This is fairly straightforward, so my current approach is:
  • The BezierPathComponent takes an associated BezierCurveComponent and breaks it down into a bunch of segments (storing length, t0 and t1 for each segment)
  • Store the sum of the lengths of all of the segments for a total curve length
  • Set a constant target distance along curve, target (maximum) speed and acceleration value
  • Keep track of the total distance travelled and the current speed
Then, we can update the speed each frame, update the total distance travelled, and figure out which segment we are in based on the total distance travelled so far.  After figuring out which segment we are in, we can either linearly interpolate along the segment, or figure out a t value based on the stored range of the segment to compute the position directly from the Bezier curve.

That should let us move at a controlled speed along a set of curves of varying shapes and lengths.
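The distance-to-parameter bookkeeping from those steps might be sketched like this in Python (class and method names are mine; the real component is C++, and this assumes every segment has positive length):

```python
class PathTraversal:
    """Map 'total distance travelled' back to a curve parameter, using
    precomputed segments of (length, t0, t1) from the flattening step."""
    def __init__(self, segments):
        self.segments = segments                    # [(length, t0, t1), ...]
        self.total_length = sum(s[0] for s in segments)
        self.distance = 0.0
        self.speed = 0.0

    def update(self, dt, accel, max_speed):
        # Accelerate toward the target speed, advance the total distance,
        # then convert distance back into a t value on the curve.
        self.speed = min(self.speed + accel * dt, max_speed)
        self.distance = min(self.distance + self.speed * dt,
                            self.total_length)
        return self.t_at(self.distance)

    def t_at(self, distance):
        # Walk the segments to find which one contains this distance,
        # then interpolate within the segment's stored t range.
        for length, t0, t1 in self.segments:
            if distance <= length:
                return t0 + (t1 - t0) * (distance / length)
            distance -= length
        return self.segments[-1][2]
```

The returned t can either feed the exact Bezier evaluation or, for short segments, a plain linear interpolation between the segment endpoints.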

Monday, October 28, 2013

Curves

Lots of little things going on today.

Some cleanup of the level editor interface.  I'm still working on figuring out exactly what the "template code" (not actually using templates) should look like as far as handling input and object selection.  I decided on having the actual editing be a different mode from selection, as there is typically all sorts of stuff overlapping on the screen and it's too easy to select a different object when you're trying to move, say, a control point on a Bezier curve.

Which leads to, yes, I got a Bezier curve editor up and running, after a fashion.  It looks like it will work well for plotting paths in 2D, though it's not currently suited to doing something like camera animation.  As just a curve in space, you can get a point, and a tangent/direction, but a camera needs to be able to have a clearer sense of "up" and can be rotated differently from its direction of motion.

I currently have a nav point component, which basically uses physics calculations to "fly" a ship to a particular point.  This works tolerably well, and is good for starting from an arbitrary location, but for flying a fixed path through a level, it's a bit much.  Plus it's a bit flaky when arriving at a target - I have an approach vector specified, but it's not used properly yet, and overshooting the target causes frantic turns to compensate.  :)  It's good, but needs some work still.

So, that's in progress.  Then I fiddled around with generating some sound effects for various events, and possibly some ambient sounds.  I have the Native Instruments Komplete 9 package, which has a TON of stuff, although it's not really geared for sound effects.  The Massive synth actually seemed pretty easy to use for building some generic FM sounds, and the Evolve Mutations packages have a lot of cool stuff in them.  The Giant piano also has some really cool piano noises, though I haven't exactly found a good place to use them yet.

So, decent start to the week.  Hopefully I can mostly focus on laying down objects and paths this week.

Thursday, October 24, 2013

Late post

Somehow I keep missing my blog schedule.  :)

There has been so much stuff going on, not all work related.  I have pretty much been focused on level editing and content creation, while patching up and improving the editor.

I finally ended up doing what I suggested a while ago, and am loading the "root" level attribute file in and editing the text directly in the level editor.  This works pretty well for item placement and nav point editing.  It took a lot of tweaking things to get it working well, but it cleaned up a bunch of code.  I made improvements to the base class for game objects and components to allow for identification by type or unique string ID, and added insertion functionality into my queue container, which I seem to have neglected to do previously. 

I spent a bit of time putting together a small piano piece for the introduction.  That actually turned out OK and went into the game pretty easily.  Then I got sucked into playing around with music software and didn't get a whole lot more done.



Next up, I may look at putting a Bezier curve editor in, and adding editor functionality for some other existing components.  Mostly I'm focused on level layout and navigation though.

Tuesday, October 22, 2013

Bug Day

I picked up an inexpensive graphics tablet to give it a try.  I was starting to feel the limitations of using just a mouse - touch sensitivity is pretty nice!  Things have come a long way since the last time I tried anything that resembled a tablet.  :)

After spending some time getting acquainted and configuring a few applications to cooperate, it was time to get down to some programming again.

At which point I ran headlong into stupid pipeline problems.  Again. 

It seems that the way I'm exporting model data doesn't really allow for having multiple "root" objects.  This is a little bit important because of various other stupid limitations of Blender/Bullet that require independent rigid bodies to be parented to nothing at all (not even an empty transform), otherwise they just become a single physics object.  >Headache<

All of this makes arranging a scene tree in a sensible fashion pretty much impossible.  There are ways to work around it in Blender, but they don't export properly.  >Headache<


I can modify my asset compiler to handle multiple root objects at least.  Going to be a long day...

Monday, October 21, 2013

Deviations


Sorry, nothing too exciting today, just gritty details of game development.  :)
Today was kind of all over the place.  I started off with a good push towards doing some level editing, but got sidetracked on some pipeline bugs and miscellaneous other things.

I made some progress getting the nav point editor started.  At least it displays the nav points now.  :)  I then went to make some placeholder art for a broken object in the scene.  After taking a few minutes to patch it up properly, because the cell fracturing tool does not get along with non-manifold meshes, I finally got it into the game and quickly discovered that the collision mesh and render meshes were not at all aligned.

I had discovered this problem earlier with some other models, but didn't really take the time to look into it.  Today it was severe enough to call for another look.  It turns out to have been a problem with the way I'm converting from the right-handed to left-handed coordinate system in the pipeline.  The Bullet importer doesn't really support this, so I have to go in and manually transform everything.  There was a bug when the pivot for the object was not at the origin, and basically everything was getting doubly-offset on the Y-axis, for mathematical reasons too tedious to explain right now.

Then I encountered a problem with binding the rigid body to the render mesh, which is done by name.  The COLLADA exporter replaces all periods in names with an underscore.  The Bullet exporter does not.  Not a really big deal, but took some time to track down and fix.

So, at last, I actually got a little work done on adding some new components to the engine.  The one thing I really wanted to add was the ability to script various operations on models.  In particular, I wanted to give every individual piece of an "exploded" model some random angular velocity when it is spawned, so I'm looking into a ModifyPhysicsComponent that will spin over all of the rigid bodies in a model and use a script/expression to modify them.  We'll see how it goes, looks fairly good so far.


So, overall, progress, but I really need to get past all of the expository bits in the first few levels of the game and get to the blowy-up bits.  :)

Saturday, October 19, 2013

Character Modeling, Take Four

I should have posted this a couple days ago:






After much mucking about with creases and cheek bones and eyebrow ridges and eyelids, I just took a giant Smooth brush and flattened everything.  Suddenly, everything looked right.  :)

I think the anime look works much better.  Next up, maybe I'll try for some better hair.

Friday, October 18, 2013

Bullet Trigger

Today started off with a daring rescue operation after the main mission ran out of fuel and the previous rescue operation went awry due to lack of power, leaving two astronauts stranded in lunar orbit.  All three astronauts have returned home safely and are eager to perform more ill-conceived celestial antics science.

But enough Kerbal Space Program.

I spent most of the day putting trigger volume support into the game engine.  Integrating it into Bullet was a little troublesome.  Bullet has callbacks for contact added/destroyed... but they don't actually seem to do what you would expect from the names.  Actually, the Add only works if you set special flags, and I have no idea what's up with the Destroy - it doesn't even take the objects in question as arguments, and only gets called... well, never as far as I can tell.

No worries, the Hacker Within shall prevail.  Without modifying 3rd party code if at all possible.

It looks like the collision callback happens every frame if two objects are touching, so I just set up the TriggerVolumeComponent to track if anything was touching it last physics tick, and if anything is touching it this tick.  In the post-physics update, if the flags don't match, the appropriate notification function is called.
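The tick-to-tick edge detection described above is simple enough to sketch directly.  This is a Python illustration of the idea (the real component is C++ and hooks Bullet's contact callbacks; names here are mine):

```python
class TriggerVolumeComponent:
    """Reduce Bullet's every-frame contact callbacks to edge events:
    compare 'touched last tick' against 'touched this tick' after the
    physics step and fire enter/exit notifications on a mismatch."""
    def __init__(self, on_enter, on_exit):
        self.on_enter = on_enter
        self.on_exit = on_exit
        self.touched_last_tick = False
        self.touched_this_tick = False

    def collision_callback(self):
        # Called (possibly many times) each physics tick while something
        # that passes the collision filter is overlapping the volume.
        self.touched_this_tick = True

    def post_physics_update(self):
        if self.touched_this_tick and not self.touched_last_tick:
            self.on_enter()
        elif self.touched_last_tick and not self.touched_this_tick:
            self.on_exit()
        self.touched_last_tick = self.touched_this_tick
        self.touched_this_tick = False
```

Repeated contacts within one tick collapse into a single flag, so the notifications fire exactly once per state change.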

This kind of limits the options for passing around which objects were actually causing the trigger, but ultimately I don't think that's too important.  Mostly I want to know the on/off state of the trigger.  The collision filtering takes care of most of that (only objects of the correct type affect a trigger) and each object can handle the regular collision callback if it cares that it's touching a trigger.

This opens the door to some other fairly useful components for building a level. 

An EnableOnTriggerComponent can mediate between a trigger volume and any other game object to enable or disable it.  If, for instance, a SpawnComponent is created disabled, it can be activated when the player enters a specific area.

For puzzle-type things, it could also be used to detect when the player has placed an object in the right location.
Basically, it enables the actual Game bit to work in the game.  :)

Thursday, October 17, 2013

Express Post

Back to programming again.

I decided to expand the expression parser to allow for if/while/for constructs and statement lists.  Those are actually pretty straightforward, though I also thought it might be a good idea to wedge in local variable declarations, and that is a little less straightforward.

Currently an Eval() call in the parse tree is expecting a templated "result" variable of the appropriate type generated by the sub-tree of the expression at each node.  I'm thinking this should just use a system stack structure of some sort - the return value just gets written above the stack pointer, and all of the local variables can then just sit below it.  That makes it just a short hop to proper function definitions.
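A rough sketch of that stack idea, in Python rather than the engine's C++ (names are mine): each frame's locals sit between the frame base and the stack top, and popping a frame just truncates back to the base, which is what makes proper function definitions a short hop away.

```python
class EvalStack:
    """Flat value stack for the expression evaluator: locals live at
    frame_base..top of the current frame; nested frames (and eventually
    function calls) save and restore the previous frame base."""
    def __init__(self):
        self.values = []
        self.frame_base = 0

    def push_frame(self, local_count):
        saved = self.frame_base
        self.frame_base = len(self.values)
        self.values.extend([0] * local_count)   # locals, default-initialised
        return saved

    def pop_frame(self, saved_base):
        del self.values[self.frame_base:]       # discard this frame's locals
        self.frame_base = saved_base

    def local(self, index):
        return self.values[self.frame_base + index]

    def set_local(self, index, value):
        self.values[self.frame_base + index] = value
```

A sub-expression's result would be written just above the current top, so it never collides with anyone's locals.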

So, there's some progress there.  Kerbal Space Program 0.22 was also released today, which makes it slightly more difficult to get any real work done.  :)

Wednesday, October 16, 2013

Seeing Through Time and Face

This morning I thought to myself, "If you start modeling, you know you're just going to spend all day doing it."

If only I could turn this future-telling prowess into money...

Despite the lack of meaningful level development, I might be getting better at the face modeling thing.  Having no artistic background at all is making for a steep learning curve.  Really, I think I'm nearly face-blind.  But now, after studying facial details for two days, just going out to the store has been weird.  I keep seeing faces as collections of features rather than just generic facial blobs.

I think this is the best part of being an indie developer - learning new stuff that blows your mind.  :)

(I also took a look at 3D-Coat.  Apparently they have taken some lessons from Blender in interface incomprehensibility.  Looks neat, but I'm not sure the 30 day trial is going to be enough to get it figured out...)


Character Modeling, Take Three

OK, this is way better.  :)


Not there yet, but definitely better.


Tuesday, October 15, 2013

Character Modeling, Take Two

I was going to do some more level work, but somehow ended up spending all day trying my hand at character modeling again.



At least this time I got something that might actually pass for a person.  Perhaps my artist friends can take a look and tell me everything I'm doing wrong.  :) 

It's still a work in progress, don't be too harsh.

Friday, October 11, 2013

Cut and Paste

After yesterday's coding fiasco, I got out the Giant Code Axe to fix the problem.  Actually, the Small Code Hatchet probably would have done the job.  Or maybe the Lesser Code Butter Knife.

Ultimately, the problem was that I was trying to use the same piece of data to solve incompatible problems (immediate input response, and inter-frame state).  The code was basically fine, I just needed to track two separate states. 

I also added an UpdateInput() event that gets broadcast for every mouse or keyboard event so they can get responded to properly for UI.  For gameplay, I basically just want to know how far the mouse moved in a frame, and which buttons were pressed.

Once I realized the problem, it was surprisingly easy to fix.

The peril of being a solo developer:  instead of explaining your problem to a co-worker before realizing you're being an idiot, you have to explain how you're being an idiot to the whole internet first.  Or at least to a bunch of loyal followers.  Or a bunch of spam-bots.  Is there anybody out there?  Hello?

The rest of the day was spent working on some level editing stuff.  I ran into a little trouble with my "spawn" component, which randomly fills an area with instances chosen from a set of models.  Editing the component live doesn't work very well, since it doesn't actually track what it spawned, so resetting it just spawns new models without deleting the old ones first.  Since it tries not to overlap models on spawn, it typically just can't find a spawn location and asserts.
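Tracking what was spawned is a small change — something along these lines, sketched in Python with hypothetical names (the real component also does the no-overlap placement, which is omitted here):

```python
class SpawnComponent:
    """Remember what was spawned so a live re-edit can despawn the old
    instances first, instead of piling new models onto the old ones
    until there's nowhere left to place anything."""
    def __init__(self, spawn_fn, despawn_fn):
        self.spawn_fn = spawn_fn        # injected stand-ins for the real
        self.despawn_fn = despawn_fn    # world spawn/despawn calls
        self.spawned = []

    def respawn(self, count):
        for obj in self.spawned:        # clear last run's objects first,
            self.despawn_fn(obj)        # freeing their spawn locations
        self.spawned = [self.spawn_fn(i) for i in range(count)]
```

The same list is exactly what a level reset would need to walk, which is why the two problems go together.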

This might be a good time to get level reset / restart working.

Thursday, October 10, 2013

Mouse Trap

Today started with making some meaningful progress on level editing, and devolved into re-writing most of the low-level mouse input API.  Bah.

I did make some progress rearranging and designing things such that each component can basically handle its own editor functionality (deciding how do draw its selection box, whether it's clicked or not, and what to display when it's the active object).  This looks like it will work better than having some sort of master "editor" class running the show.

Then I started to notice some weird input problems.  After clicking on an object and selecting Rotate, it wouldn't let me rotate for some reason.  Trying a second time would work.  It turns out the "mouse up" event from selecting the object was basically being saved as the last mouse event, and was then being read as the "mouse up" event for committing the edit immediately after entering the edit mode.

Bah.

The thing with handling mouse input is that it's possible to get multiple mouse events in a single frame, or it's possible to get none.  It's also important to know the mouse location at each button event.  During the game update, if you only look at the last event each frame, or even some sort of union of the states on a frame, you miss some meaningful information. 

The keyboard is a little more forgiving - generally, you only want to know if a key is down right now, or if a key was pressed for the first time since the last update.  Mouse input cares a lot more about both button up and button down events, where the cursor was when it happened, and what order it happened in.

So, the current solution is to buffer the mouse input for a frame so I can play it back, basically like a keyboard buffer.  This works OK, except where I have some key-checking and mouse-checking code mixed together, and the key-checking code doesn't execute unless there has been a mouse event on that frame.
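The buffering approach, roughly, as a Python sketch (names mine; the real code is C++ sitting under the input API):

```python
from collections import deque

class MouseBuffer:
    """Buffer every mouse event for the frame so game/editor code can
    replay them in arrival order - button, position, and ordering all
    preserved - instead of seeing only the last event of the frame or a
    mushed-together union of states."""
    def __init__(self):
        self.events = deque()

    def on_event(self, kind, button, x, y):
        # Called from the OS-level mouse callback, any number of times
        # (including zero) per frame.
        self.events.append((kind, button, x, y))

    def drain(self):
        """Replay this frame's events in order, then leave the buffer empty."""
        while self.events:
            yield self.events.popleft()
```

The catch described above remains: any keyboard checks interleaved inside the drain loop only run when there happens to be a mouse event that frame.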


I get the feeling I have caught teh dumb and that there is a simple, clean and obvious solution that I am just not seeing.

The only thing that really comes to mind is to do the input processing from the mouse event callback, so there is only ever one mouse event in the queue... which I was actually doing at one point, but abandoned when I moved to the component architecture.

I hear the Giant Code Axe being sharpened.

Wednesday, October 9, 2013

Real Life Intercedes

Short update today, as real life has decided that "making games" is "optional," while "ferrying people about" is "mandatory."

  • Nearly had a panic attack because CTRL-Z was no longer working in any application.  Turns out the `/~ key was jammed down somehow.
  • Working on editor related things - getting transform tools to work.
  • Reviewing / tweaking character concept art.
  • Did some modeling, trying my hand at character modeling.  Discovered I should stick with spaceships.  :)
  • Blender's sculpting tool is actually kind of nifty, if you can memorize all of the hotkeys.  The same thing is true for the rest of Blender, of course.


That's it, gotta run.

Tuesday, October 8, 2013

The Hacker Within

OK, level editor == monumental && monumentally boring task.  (@pedants: no, that doesn't parse meaningfully as a C++ expression, deal)

In an effort to make some meaningful progress, the Hacker Within has emerged and said those four most dreaded words (when coming from the Hacker Within):  "Why don't you just..."

Sayeth the Hacker Within, "Since everything is specified in text files anyway, why do you actually want to make a UI?  You've been editing attribute files by hand and it's fine.  The file monitor already knows which file was used to define an object.  When the text file changes, it auto-refreshes the object anyway.  Why don't you just do a system call and edit the attribute file of the currently selected object in Notepad++?"

"Well, because that's a stupid idea!  It's inelegant, hackish, brutish, and... oh, it's already done.  I suppose it has that going for it."

"But what about..." I begin to protest.

Sayeth the Hacker Within, "Look, just make some buttons to copy attributes to the Windows clipboard if you want to edit something like, I don't know, transforms in-game."

This has got to be the worst best idea ever.  At least it's progress.

Monday, October 7, 2013

Apparently I Still Remember How To Write Code

Back to coding again.  And just when modeling was getting fun.

Most of the day has been spent doing level-editor type things.  Mostly looking at ray/AABB intersection tests so I can figure out what I'm clicking on... though now that I'm writing this, I realize it probably makes more sense to project the AABB into screen space and do the test there (see, it pays to blog this stuff).
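The screen-space version of the test might look something like this Python sketch — assumptions: a `project` callable standing in for the real world-to-screen transform, boxes given as corner lists, and ties broken by smallest rect so nested objects stay clickable (all names mine):

```python
def screen_space_pick(mouse, boxes, project):
    """Project each AABB's corners to screen space, take the 2D bounding
    rect of the projected points, and test the mouse point against it.
    Among overlapping hits, prefer the smallest rect."""
    best, best_area = None, None
    mx, my = mouse
    for obj, corners in boxes:
        pts = [project(c) for c in corners]
        x0, x1 = min(p[0] for p in pts), max(p[0] for p in pts)
        y0, y1 = min(p[1] for p in pts), max(p[1] for p in pts)
        if x0 <= mx <= x1 and y0 <= my <= y1:
            area = (x1 - x0) * (y1 - y0)
            if best_area is None or area < best_area:
                best, best_area = obj, area
    return best
```

A real version would project all eight corners of each box through the view-projection matrix; the 2D point-in-rect test then replaces the ray/AABB intersection entirely.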

I spent part of the day decluttering my render library API.  I had added a bunch of "draw" functions in the main render class - things for drawing lines, circles, rectangles, boxes, etc.  This also involved loading a couple default shaders, creating the various vertex and index buffers for these objects, and a bunch of default textures.

That's all good and useful stuff, but the base render class really has a single purpose:  translate application render calls to hardware API calls.  That mostly consists of setting render targets, vertex declarations, vertex streams and index buffers, and actually drawing primitives.  Since all of that "draw" stuff was taking up over half of the file, it was time for a split.

The change was actually pretty straightforward.  The draw functions really just use the application API and aren't hardware specific at all.  It was mostly a matter of making them call various functions in the other class instead of "this," and then patch up several places in the game code to use the new class rather than the main render API.  This took a lot less time than expected.

This cleaned up a few things too.  The virtual coordinate system stuff I was doing was kind of getting mingled into the print function.  With this change, the main API keeps a very base level of functionality - it just prints what it's told - and the "draw" API has its own print function that can scale for different resolutions before calling the low-level function.

Sunday, October 6, 2013

Playing With Blender

The weekend was spent playing with Blender.  I'm justifying it by saying that I will probably eventually need to stage proper background and camera angles for dialog and cut scenes, but mostly I'm just having fun. :)

Ship in nebula:



The majority of the time was probably spent doing the cockpit controls (visible at the beginning of the video).  Working with Blender's Cycles render engine is pretty nice.  Being able to set up materials and shaders in a sensible manner is good, especially being able to set up proper emissive materials and textures.

Also, after looking at several pictures of fighter jets, I realized that the visibility from my ship was horrible, so I totally reworked the cockpit canopy to give better rear and side visibility.  This also resulted in putting in a proper seat, and then a second seat (which didn't really have any story purpose, but hey why not), and then a test character for size, and a few other details.

Because Blender is doing proper caustics, I ran into some problems with the whole volume of the canopy essentially being considered one big slab of glass and distorting the light, so I had to give it some thickness.  That's just totally unnecessary for game-side, but, whatever, it's really low poly and most of it will just get backface culled anyway.

I'm almost tempted to bring the whole thing into the game to see what will explode.  :)

And then I delved back into Blender's animation system and decided to render the whole thing at 1080p, and there you have my weekend.  Rendering at 1080p takes forever; fortunately, the middle parts end up being mostly sky and go pretty quickly.


Wednesday, October 2, 2013

Shattered

No update yesterday - there was really nothing to update about.  And today I was still not in a coding mood.  Fortunately, there is a lot of other stuff to do.

Today was mostly getting character reference in place because I finally contracted a concept artist.  It's good to start getting a better picture of characters that have been slowly forming in my head over the last several months, and hopefully getting that together will solidify my focus.

After that, today was mostly spent putting together some generic spaceship debris that I need for the opening scene. That went OK, though kind of slowly.  Most of the time was spent figuring out Blender's cell fracturing tool - and more specifically, why it wasn't doing what I wanted.  The short version is, manifold meshes are fairly important.  

My biggest complaint is that most tutorials seem to have "gone to video" recently.  That's fine to some extent, if people would actually edit their videos down to the important bits.  Rather than being able to skim text and glean what I want in a matter of seconds, I'm forced to watch half an hour of fiddling around, or hopelessly jump around the video trying to find something useful.  The other real problem with video is that by the time something really educational happens, it happens so fast you miss it completely and have to rewind endlessly.  Seriously, "needle in a haystack" is the only way to describe it.

OK, rant done.

Monday, September 30, 2013

Today I... ooh, something shiny on Twitter!

Off to a very slow start today.

I started off with a little dialog cleanup today.  I have never really done very much character dialog and I'm not really sure if what I'm writing is any good, so I keep making revisions.  I'm my own worst critic (at least, I hope I'm my own worst critic, or I'm going to get utterly roasted alive) so it's difficult to leave dialog in place for more than a day before I decide that it's utter crap and needs to be revised.  Some of the revision is certainly good, helping to clarify things that are clearly not clear, but some of it is probably arbitrary.

So, escaping that cycle for a moment, I started working on a tree view control so I can actually see the node graph of all of the components I have loaded, and eventually tie that into the ability to edit data in an attribute file.  To some extent, it's mostly just piecing together bits of other controls that I already have.  The main problem at the moment is that it's fairly easy to specify a hierarchy of controls from an attribute file, but that is static, and I need a "pleasant" way to generate a control set from more dynamic data.

I also realized that having a simple "scroll box" would be useful (basically a control that automatically puts up scroll bars if the size of its contents exceeds the size of the control) and did some work to split that out from the few controls that duplicate this behavior.

That actually sounds like I did a lot more than I think I did...  now, if I could write a Python script to turn blog posts directly into C++ code, maybe I'd be getting somewhere...

Saturday, September 28, 2013

It's A Sign

You know, it's kind of a shame that C by default makes you type out "unsigned <type>" when you want an unsigned variable.

OK, most of the time it doesn't make any difference, but I've found it makes the intention of some code clearer.

I put these in my "top level" header that gets included everywhere:

typedef unsigned long ulong;
typedef unsigned int uint;
typedef unsigned short ushort;
typedef unsigned char uchar;

These too, so they can be redefined if the compiler you're on has different ideas about word sizes and you need to read binary files across platforms:

typedef __int64 int64;
typedef unsigned __int64 uint64;
typedef int int32;
typedef unsigned int uint32;
typedef short int16;
typedef unsigned short uint16;
typedef char int8;
typedef unsigned char uint8;
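If your compiler doesn't have __int64 (it's MSVC-specific), the same aliases can be built on top of the standard <cstdint> header instead - a sketch assuming C++11 is available:

```cpp
#include <cstdint>

// Portable equivalents of the fixed-width typedefs above, built on <cstdint>.
typedef int64_t  int64;
typedef uint64_t uint64;
typedef int32_t  int32;
typedef uint32_t uint32;
typedef int16_t  int16;
typedef uint16_t uint16;
typedef int8_t   int8;
typedef uint8_t  uint8;

// These fire at compile time on any platform where the sizes don't match,
// which is exactly the read-binary-files-across-platforms case.
static_assert(sizeof(int64) == 8, "int64 must be 8 bytes");
static_assert(sizeof(int32) == 4, "int32 must be 4 bytes");
static_assert(sizeof(int16) == 2, "int16 must be 2 bytes");
static_assert(sizeof(int8)  == 1, "int8 must be 1 byte");
```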

I find I use uint a lot.  There are a lot of things, like array indices or container sizes, that can never be negative (or if they are, you have a problem), so uint makes more sense.

On the other hand, I have found myself using iterators a lot more than indices recently.

One caveat:  The conversion from unsigned-to-float is pretty expensive on PC compared to signed-to-float.  You may want to keep that in mind if you're mixing floating point and unsigned values. 

I even had one case where I tried to cast the unsigned to int first, even going so far as to pass it through an inline function, and Visual Studio just said, "oh, you REALLY meant to convert from unsigned to float" and went through the slow-path conversion code. 

Friday, September 27, 2013

Resolution-Independent Coordinate System

Today, I'm thinking about UI issues.

I have dialog displaying, but the basic layout was tied to the screen resolution (currently 1280x720, because it fits nicely on my 1920x1080 monitor in windowed mode).  I began wondering what an ideal "virtual" coordinate system would look like, where you just use one coordinate system and scale it for the actual screen resolution.

This causes some problems with fonts, which should really be rendered at screen resolution, so I'm leaving those out of the equation for now.  You can make some progress playing with point sizes, but in the end it probably won't be exactly the same on different resolutions.

So, what would a good "virtual" coordinate system look like? 

You could have 0..1 in each axis, but then you have to work with a lot of fiddly decimals, and that doesn't take into account aspect ratio.

I think ideally, the coordinate dimensions should be divisible by many small primes, so you can create symmetric divisions without requiring fractions.

You could maybe have something like 1600x900, which is good for a 16x9 aspect ratio.  1600 doesn't divide evenly by 3, though.

Since 1920x1080 is my target resolution, I decided to just go with that.  It's pixel-accurate at the target, and both divide evenly by multiple powers of 2, as well as 3 and 5.

The rest of today consisted of a little bit of modeling, and I picked up my guitar for the first time in a couple of months - I clearly need to get my hands back in shape again.  :)

Thursday, September 26, 2013

Back Into Gameplay Land

Finally back into gameplay land today, which means I'm doing a whole bunch of random stuff.

First up, brief tests at conversation text suggest that I need to differentiate the speakers a bit more.  I don't have character portraits up yet, which would help, but I'm wondering if choosing a unique font for each character would be a good idea.  I could go with a different color for each character, but using a different font for each that helps convey their personality might be interesting.

That led me to Google Fonts, which looks to have a decent variety with reasonable licensing requirements.  If anyone has any other suggestions, let me know.

Of course, this led to the need to modify the game code to support a different font for each speaker.  That's not too big a deal, but causes a bit of trouble because line heights are no longer constant.

The level script is coming along though.  It navigates around the level with appropriate conversation text displaying (and not displaying) at the appropriate times. 

I need to work on object placement a bit more, so next up is probably getting a "bird's eye view" of the level with some sort of transform editing.  We'll see how that goes.  I don't want to write full serialization for every component, so I might see if I can modify the attribute file that spawned the object directly, because that's easy to serialize.

Wednesday, September 25, 2013

Zzzzzzzzzzzzz........

Kind of a slow day today.

The parts of the day that weren't spent staring blankly at the screen consisted primarily of playing around with shader code.

I incorporated spherical harmonics for diffuse lighting and cleaned up the code a little bit, splitting off the Phong lighting code completely.  That whole process actually took way too long and generated way too many bugs.  Most of the problem was getting the diffuse and specular intensities to be something reasonable.

Other than that, I was trying to gather some more reference images.  Why is it so difficult to find decent pictures of Starbuck in the cockpit?  I'm just going to have to go back and watch the entire Battlestar Galactica series again apparently.  (Note to self: do not tempt self.)

That's all I've got for today!

Tuesday, September 24, 2013

The Intersection of Head and Desk

A bit of a slow day today.

I spent much too much time trying to figure out why the normal map on one ship seemed to be working fine, but on a different ship it looked like all of the normals were pointing the wrong direction.

Turns out I had the texture linked to an old (and very wrong) version of the normal map with the same name in a different directory.

Head, meet desk.

Anyway, once that was sorted out, I decided that since I actually have meaningful values coming through the pipeline now, maybe I should use them.  I spent a bit of time getting the Cook-Torrance shader working.  I think it turned out alright:


Actually, I need to get some proper lights in the scene.  It's progress anyway.

I also poked and prodded at the level script, dialog and design.  The first level of the game has a lot of dialog and scripted sequences, so although it will be an excellent test level as far as preparing for the rest of the game, it's going to take a while.

Monday, September 23, 2013

Rendering OD

I spent most of the weekend working on getting the shader graph out of Blender and into the pipeline.

I had expected generating the shader from the graph to be the difficult part, but that was actually surprisingly simple.

A simple emissive shader looks like this:

EngineGlow
{
 Nodes
 {
  Material_Output
  {
   Type = OUTPUT_MATERIAL
   Inputs
   {
    Surface
    {
     Type = SHADER
    }
    Volume
    {
     Type = SHADER
    }
    Displacement
    {
     Type = VALUE
     Default = 0.000000
    }
   }
   Outputs
   {
   }
  }
  Emission
  {
   Type = EMISSION
   Inputs
   {
    Color
    {
     Type = RGBA
     Default = 0.000000, 0.405629, 0.800000, 1.000000
    }
    Strength
    {
     Type = VALUE
     Default = 400.000000
    }
   }
   Outputs
   {
    Emission
    {
     Type = SHADER
    }
   }
  }
 }
 Links
 {
  0
  {
   FromNode = Emission
   FromSocket = Emission
   ToNode = Material_Output
   ToSocket = Surface
  }
 }
}

Because this is just my attribute file format, after loading it, it's trivial to take the Links section and insert them directly into the Inputs section.  Then you just find the MATERIAL_OUTPUT node and recursively work your way back up the nodes, using default values if there is no link.

Each node type typically ends up generating only one or two lines of code, and a handful of variables.  Generating braces at the beginning and end of each node keeps all variable declarations local, and you just pass the name of the variable you want to set to the recursive call.  For BSDF nodes, just emit a function call to a pre-defined lighting function (which all just reduce to Phong at the moment, but that can be dealt with later).
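The recursive walk might look something like this - the types here are hypothetical stand-ins, since the real nodes live in the attribute system:

```cpp
#include <map>
#include <string>

// Hypothetical node-graph types; the real ones come from the attribute file.
struct Node;

struct Input
{
    const Node* fromNode;      // filled in from the Links section, or null
    std::string defaultValue;  // used when there is no link
};

struct Node
{
    std::string name;
    std::string type;          // e.g. "EMISSION", "OUTPUT_MATERIAL"
    std::map<std::string, Input> inputs;
};

// Recursively emit shader code for 'node', assigning its result to 'outVar'.
// Each node gets its own braces so its temporaries stay local; the name of
// the variable the parent wants filled in is passed down the recursion.
void EmitNode(const Node& node, const std::string& outVar, std::string& code)
{
    code += "{\n";
    for (std::map<std::string, Input>::const_iterator it = node.inputs.begin();
         it != node.inputs.end(); ++it)
    {
        std::string inVar = node.name + "_" + it->first;
        code += "float4 " + inVar + ";\n";                // local to this node
        if (it->second.fromNode)
            EmitNode(*it->second.fromNode, inVar, code);  // linked: recurse
        else
            code += inVar + " = " + it->second.defaultValue + ";\n";
    }
    // A node typically reduces to one or two lines of generated code.
    code += outVar + " = Eval_" + node.type + "(/* inputs */);\n";
    code += "}\n";
}
```

Starting the recursion at the OUTPUT_MATERIAL node and concatenating the result gives you the body of the fragment shader.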

So, the hard part was not generating the actual fragment shader. 

Mapping UV channels initially seemed like it was going to be difficult, because they are mapped by name and Assimp doesn't preserve this information.  Then I realized I could do the name lookup when I export the shader graph, and as long as Assimp preserves the channel order I can just embed the UV index in the attribute file.

The only part that really caused trouble was trying to get the textures to map to materials properly.  The Blender COLLADA exporter only looks at the "legacy" shaders, and only exports a single texture (and maybe a normal map if you have things set correctly).  Ultimately this came down to ignoring textures from the COLLADA file and gathering, converting and binding textures directly from the shader graph.  The only other thing was telling Assimp not to merge materials, because it doesn't have an accurate picture of things.

In any case, this all seems to be working.  It's a lot more flexible than what I had before, and I can actually set proper emissive ranges for HDR.

So, since you bothered to read this far, I give you an in-game screenshot:



Friday, September 20, 2013

The Monte Carlo Algorithm Applied To Game Development

One of the interesting things about being the only person on a project is that everything that might need to be done is your responsibility.  There is no one to pick up the slack, and nothing gets done unless you do it yourself.

I occasionally wonder if my shotgun approach to game development is going to work out.  In case you haven't noticed, I tend to work on a lot of different things from day to day.  Sometimes it feels unfocused, but I'm basically working on the assumption that, as long as what I'm doing needed to be done at some point, it doesn't really matter.  Additionally, if I'm interested or excited by what I'm working on, it will turn out better and faster.  And I'll be happier, which is kind of the point of this endeavor.

So, today ended up being a modeling day.  I did a little bit of script/dialog for the first level, but that was going slowly.  After learning quite a bit about modeling on my latest ship, I decided to revisit my old fighter model (see first post ever) and clean it up a bit.

I actually enjoy 3D modeling.  I really can't draw, but I seem to be OK at making models.  Texturing is kind of fun too, though I'm currently kind of hit-and-miss on that front.

Anyway, the point is, I think having a lot of things to work on at once is a good thing, because if you get stuck, or bored, there is always something else productive you can be doing.  I have had jobs before where I needed to work on a very narrow area, and when I ran into a roadblock, nothing got done.

Weekend time - not sure if I'm getting a weekend edition out, I don't have any topic in mind yet.  We'll see, have a good one!

Thursday, September 19, 2013

Rendering Addiction

The first step in dealing with your addiction is admitting that you have a problem.

My problem is that the default material options in Blender are somewhat limited, and the node shader system is much nicer to work with.  My other problem is that I am not aware of any way to export the node graph to a standard file format.

OK, I'm not sure how admitting that is going to help me get over my rendering addiction and get back to gameplay programming.  All that did is make me write an exporter in Python so I can get all of the lovely shader node graph data into the pipeline.  If I can find a way to bind the shader UV channel names to the Assimp/COLLADA imported mesh, I should be able to use that to generate a shader.

Then I can hopefully start to work towards some prettier lighting:




Wednesday, September 18, 2013

The Day Nothing Worked

OK, the title isn't entirely accurate.

I actually did get the normal map texture pipeline straightened out.  Multitexturing seems to work OK now, though there are probably some bugs lurking in the darkness.

Things that didn't work today:

The Blender COLLADA exporter seems to have a problem.  It insists that my normal map appears in all materials, even those materials with no textures at all.  Fortunately I can probably fix it myself, Open Source and all, but that's not helping productivity.

Then I ran into some really strange problems with various tools getting deadlocked during the asset build.  The shader compiler was getting stuck in fflush(), and a system call to mkdir was hanging as well.  My build tool just runs a Python script, which seemed to work from the command line but not from the tool.  I'm a little confused... it was working just fine, must be a problem with capturing the output?  Who knows.

Anyway, I need to stop playing with rendering and do more gameplay.  Let's see how much willpower I have.  :)

Tuesday, September 17, 2013

In the Pipeline

Somehow, today turned into pipeline day.

After putting normal maps on an object, I realized that my texture support in the art pipeline was severely lacking.

Currently, the art pipeline looks like this:

  • Blender is used to create the asset
  • Export to COLLADA format (.dae)
  • Asset compiler (custom tool)
    • Using Assimp to import the data
    • Convert textures for target platform
    • Generate shaders for target platform
    • Write it to a custom game data format
  • Import it into the game

The code for generating shaders with normal maps was somewhat incomplete (translation: not started) and the code for binding textures to shader parameters was pretty dodgy (translation: only supported one texture).

This is much improved, though I'm still working on binding the textures to the shaders.  Hopefully I'll get that working tomorrow and maybe finally get back to, you know, making the game part of the game.

Monday, September 16, 2013

Something different...

OK, I'm slowly getting back to work. I haven't been doing much programming recently though - today I'm pretending to be an artist! :)

I spent a bit of time last week putting together some character concept reference. That also involved a bit of character bio writing, and sifting through months of notes - good ideas, bad ideas, ideas I somehow forgot about but shouldn't have, ideas that should probably be deleted, burned, and launched into the sun...

Then, since I didn't really feel like programming, I decided to turn this:


into this:




Not quite there yet, but a decent start.

I've been learning quite a bit about Blender through the process of making a few models.  I was having some trouble with the X-Mirror mode, because subdividing an edge is not a mirrored operation.  

It was easiest to just put a Mirror modifier on the whole mesh and model just the left half of the ship.  The .dae exporter even supports this without having to generate the whole right side first, which was nice.

I also figured out that selecting two adjacent faces and scaling to zero in normal space is a good way to make two faces coplanar.

I also learned a lot about baking textures and materials, but, being Blender, I have now forgotten it again.  :)

Wednesday, September 11, 2013

Do You Remember?

You remember those weeks when you're hyper-focused, there are no distractions and everything is working out the way you planned?

This is not one of those weeks.

Regularly scheduled updates will resume once there is something to update about.

(I did get the line thing working.  And some tweaks to the expressions.)


Monday, September 9, 2013

Lines and Expressions

Short update today, yet another sick day.  Today was mostly some cleanup work. 

I have been working on a "thick line" class so I can draw lines that are more than a single pixel wide.  I'm basically taking a default 1m square plane and then doing the appropriate math to scale and rotate it so that it falls along the desired line, and then rotating it along that line so that it faces towards the camera.  Some shader tricks and I should be able to get a nice falloff at the edges.
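The corner math works out to something like this - my own little vector helpers, not the engine's, and computing the quad corners directly is equivalent to scaling and rotating the unit plane:

```cpp
#include <cmath>

// Minimal vector helpers for the sketch below.
struct Vec3 { float x, y, z; };

static Vec3 Add(Vec3 a, Vec3 b)  { Vec3 r = {a.x + b.x, a.y + b.y, a.z + b.z}; return r; }
static Vec3 Sub(Vec3 a, Vec3 b)  { Vec3 r = {a.x - b.x, a.y - b.y, a.z - b.z}; return r; }
static Vec3 Mul(Vec3 v, float s) { Vec3 r = {v.x * s, v.y * s, v.z * s}; return r; }
static Vec3 Cross(Vec3 a, Vec3 b)
{
    Vec3 r = {a.y * b.z - a.z * b.y,
              a.z * b.x - a.x * b.z,
              a.x * b.y - a.y * b.x};
    return r;
}
static Vec3 Normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return Mul(v, 1.0f / len);
}

// Corners of a camera-facing quad along the segment a->b, 'width' units wide.
// 'side' is perpendicular to both the line and the direction to the camera,
// so the quad spanned by the line and 'side' faces the camera.
void LineQuad(Vec3 a, Vec3 b, Vec3 camera, float width, Vec3 out[4])
{
    Vec3 dir   = Normalize(Sub(b, a));
    Vec3 mid   = Mul(Add(a, b), 0.5f);
    Vec3 toCam = Sub(camera, mid);
    Vec3 side  = Mul(Normalize(Cross(dir, toCam)), width * 0.5f);
    out[0] = Sub(a, side);
    out[1] = Add(a, side);
    out[2] = Add(b, side);
    out[3] = Sub(b, side);
}
```

From there, the edge falloff is just a matter of shading across the quad's width.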

I had this embedded in a game-side component, but I thought it would be better to move it into a generic Line Component.  While doing that, I thought I would look into binding expressions to the component construction process.  At the moment I'm just using a simple type parser (vector, transform, float, string) to get the values from attribute files.  Using expressions would make it possible to do more complex math, and more importantly, reference variables so I don't have to change the same constant all over the place.


As far as referencing variables, at the moment I only have variables that are bound from C++.  I started on making it possible to declare variables more dynamically - for starters, declaring a bunch of global variables in an attribute file.  This isn't quite ready to go yet.

So, hopefully finish that off tomorrow if I'm feeling up to it.

Saturday, September 7, 2013

Weekend Edition - Update Phases

How do you keep AI, rendering and physics all in sync with each other?

In most game architectures I have seen, you typically have two main "phases" - Update and Render.

During the Update phase, you do all of your game AI, animation, input handling - basically, do anything that is going to change the state of the game over a single frame.  By the end of the update, you want everything to be where it is supposed to be for drawing the frame.

During the Render phase, you want to take all of the objects in the game and draw them where they are supposed to be on that frame.  Ideally, during the Render phase, absolutely no changes are made to the game state.  This is fairly important, because sometimes you want to render exactly the same scene multiple times - if the game is paused, to generate shadow or reflection maps, etc.

So far this seems relatively straightforward, but things are not always that easy.  In particular, to synchronize AI with physics, the order that things happen in during the Update phase can be critically important.  To make sure things happen in the right order, I tend to break the Update phase into several sub-phases:

  • Update - called once per frame
    • Process input
    • Do animation
    • Apply forces to physics objects
    • If the game object transform does not match the physics transform, forcibly move the physics object
  • PhysicsStep - may be called multiple times if frame time is greater than physics step time
    • PhysicsUpdate
      • Called before every physics step
      • Apply forces that vary based on object position / velocity
    • OnCollision
      • You may get collision callbacks during the physics simulation
    • PostPhysicsUpdate
      • Called after every physics step
      • Limit motion, velocities
  • PostUpdate
    • Respond to any collisions you received during the PhysicsStep
    • Update game object positions from the current physics positions
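Put together, a frame under these phases looks roughly like this - the names are illustrative, and the stubs just count calls so the ordering is visible:

```cpp
// Illustrative frame loop following the phases above; not actual engine code.
static const float kPhysicsStep = 1.0f / 60.0f;
static float accumulator  = 0.0f;
static int   updates      = 0;
static int   physicsSteps = 0;
static int   postUpdates  = 0;
static int   renders      = 0;

static void Update(float)            { ++updates; }       // input, animation, apply forces
static void PhysicsUpdate(float)     { }                  // position/velocity-dependent forces
static void StepPhysics(float)       { ++physicsSteps; }  // may fire OnCollision callbacks
static void PostPhysicsUpdate(float) { }                  // limit motion, clamp velocities
static void PostUpdate()             { ++postUpdates; }   // respond to collisions, sync transforms
static void Render()                 { ++renders; }       // reads game state, changes nothing

void Frame(float dt)
{
    Update(dt);
    accumulator += dt;
    while (accumulator >= kPhysicsStep)   // fixed-step physics, possibly several per frame
    {
        PhysicsUpdate(kPhysicsStep);
        StepPhysics(kPhysicsStep);
        PostPhysicsUpdate(kPhysicsStep);
        accumulator -= kPhysicsStep;
    }
    PostUpdate();
    Render();
}
```

A frame lasting 3.5 physics steps runs the inner loop three times, with the remaining half step carried over in the accumulator for the next frame.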

At this point, hopefully everything is where it is supposed to be, the Render phase gets the most up-to-date positions for all of your objects, and you are ready to move on to the next frame's Update.