Thursday, November 28, 2013

Pushing

So many things going on all at once... not sure which of them might be interesting.  :)

The last couple of days have been spent jumping around all over the place:
  • Reviewing concept character art - I now know what my lead characters look like (!)
  • Writing character dialog
  • Working on the "hyperspace" level design, resurrecting some old test code
  • Arranging music
  • Adding "cylinder" primitive drawing to the render library
  • Putting local variables into the script system
  • Music sync and object spawn sync
  • Dealing with physics issues
The physics issues were perhaps the biggest problem.  Currently, Blender doesn't let you set the mass of anything to more than 10000 units (say, kg), or to less than something like 0.01.  This was causing some problems with low-mass bullets pushing high-mass objects around in unpleasing ways, simply because they move fast enough to carry a lot of energy.

I tried a few different methods for dealing with this, but none of them were working very well.  Disabling collision response kind of works, but it isn't selective enough, and it actually lets some objects that I do want to collide pass through each other.

I tried making the object Kinematic, which means it exists in the collision system but doesn't have its motion affected by collisions.  Unfortunately, in Bullet a kinematic body doesn't actually have any built-in motion of any sort; it expects game-side key-frame animation, when all I wanted was to set a velocity and have it go.

So, I ended up just making it a regular dynamic rigid body with a really, really large mass.  This may cause weird numerical instability, but my immediate tests seem to work just fine - it doesn't get pushed around by bullets, and nudges the player ship out of the way during collisions.
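
For the curious, the workaround boils down to something like this against Bullet's C++ API.  The mass and velocity numbers here are placeholders, and the real code goes through the engine's own wrappers:

    // A plain dynamic rigid body with an absurdly large mass.
    #include <btBulletDynamicsCommon.h>

    btRigidBody* createHeavyMover(btCollisionShape* shape, const btTransform& startTransform)
    {
        const btScalar mass = 1.0e6f;              // far beyond the editor's 10000 cap
        btVector3 inertia(0, 0, 0);
        shape->calculateLocalInertia(mass, inertia);

        btDefaultMotionState* motionState = new btDefaultMotionState(startTransform);
        btRigidBody::btRigidBodyConstructionInfo info(mass, motionState, shape, inertia);
        btRigidBody* body = new btRigidBody(info);

        // Bullets barely budge it, but it still shoves the player ship aside on contact.
        body->setLinearVelocity(btVector3(0, 0, -20));
        body->setActivationState(DISABLE_DEACTIVATION);   // keep it "awake" even while coasting
        return body;
    }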

Then, I was trying to get a constraint to work in the physics system, but it was all going horribly wrong... until I finally figured out that some code forcing the ship orientation was causing the constraint to be violated.  Removing that and just letting physics "do its thing" fixed everything nicely.


I would say there has been some real progress on getting the level done.  Getting music, spawning and rendering all in sync still needs some work, but the pieces are really falling into place now.

Monday, November 25, 2013

Sonic Screwed Up Driver

It was a Doctor Who weekend - meaning, I didn't exactly get a great deal done.  That is to say, I got an immense amount of "research" done, but not a lot else.

I did listen to a few of the bits of music I put together last week several times.  It's vitally important to listen to things you've been recording at 2:00am with fresh ears in the cold light of day.  While I like some of them quite a bit, it will be a while before they're a proper track.

I put what I have into the engine today anyway.  As predicted, horns and strings by themselves show up nicely in the spectrum, while guitar, bass and drums are all kind of squashed into the same frequency band, and drown out everything else.  I will need to play with the EQ to see if I can find something that both looks and sounds good.

In any case, I started doing the script to spawn a few asteroids in time with the music.  That's working OK, except for being slightly out of sync.  The big problem is that the texture scroll and the object motion seem to be slightly out of phase.  Motion in games is inherently jittery, but they're not jittering at the same time, which kind of stands out.  This probably has something to do with the texture being updated based on when new audio buffers become available, while the physics is just updated off the system timer.  I've been meaning to run the game update time off of the audio play cursor, which will probably help, but I imagine there will be some wrangling to make that work.
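
Roughly what I have in mind, assuming a hypothetical audio_playedSamples() call that reports how far playback has actually progressed:

    // Game clock derived from the audio play cursor instead of the system timer.
    #include <cstdint>

    extern std::uint64_t audio_playedSamples();   // hypothetical engine call, not a real API

    static const double kSampleRate = 44100.0;

    double gameTimeSeconds()
    {
        // Physics, texture scroll and the spawn script would all read this one
        // clock, so at least they jitter together.
        return static_cast<double>(audio_playedSamples()) / kSampleRate;
    }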

Also, the whole thing could hardly be called "frantic" at this point... which is fine for the first level of trying to ease the player into the game, but will need to be worked out at some point.  I had some old gameplay tests with a different camera angle and objects moving faster, I'll need to give that a try.

Which will happen shortly after I stop finding interesting* stuff on the internet and actually focus for five minutes.

Which makes game development kind of like a Doctor Who episode.  All running about and getting distracted by nonsense, and then five minutes of brilliance saves the day.  Most of the time.



*not actually interesting, so why is it distracting?

Friday, November 22, 2013

Extra Dimensions

Music AND art day today.  Oh my.

What I learned today:  My non-artist brain likes to think of hair as strands.  This is not at all helpful when trying to make a rough pencil sketch, where you really just want to trace the outline and overlaps. 

Most of the day was spent composing "Hyperspace," a high energy little tune for those times when you are hurtling through extra-dimensional space with reckless abandon and trying not to die in a giant fireball from an asteroid collision.  Seriously, we've all been there.

It's coming along.  I have a few of the basic bits worked out, though there's quite a bit to go, and I'm not "feeling the love" yet.  The main problem with trying to compose a specific song for a specific purpose is that if it's not working, you're still kind of committed.  I start a lot of pieces, but only a handful ever get finished because they'll have one cool riff that just doesn't go anywhere.  We'll see how it goes, hopefully I don't have to reboot it.  :)

The other thing of note is that my giant snare from yesterday is probably a bit airy for something that wouldn't sound too out of place coming from Symphony X or Metallica, so that might need some work.  We'll save that for some heavy, plodding, real-space themed song.  :)

Thursday, November 21, 2013

Heavy Metal Games

It's a music day today.

I spent most of the day figuring out how to create a decent heavy metal drum kit.  I think I'm basically happy with the basic sound now.  The toms could maybe use some work, but the kick and snare are sounding decent.

So, things learned today:

The kick drum doesn't really need much of its low frequency range.  EQing up anything below 75Hz just muddies things up, though you do need a bit of low end.  The interesting parts of the kick actually occur closer to 4kHz and 8kHz, where the "click" actually happens and makes the kick punch through the mix.

Making the giant 80s snare sound basically calls for a gated reverb.  That's just an effect chain with a reverb followed by a noise gate.  Getting the noise gate to work is a bit of a trick, because you want it keyed off the initial sound level, not the reverb level.  I'm using Reaper, which has a fairly easy method of doing this:
  1. Send the snare to a separate track for just the snare reverb
  2. Give that track 4 channels
  3. Effect chain:
    • Duplicate the input channels 1/2 to channels 3/4 (Utility/chanmix2)
    • Add the reverb
    • Add the gate (ReaGate)
      • Set the Detector input to Auxiliary Input

The main trick here is that ReaGate lets you use a channel for volume detection that is separate from the main signal.  There are probably other methods in other recording software, but this works nicely here.

Then I did a little playing with parallel compression, which is just mixing the original uncompressed sound with a heavily compressed version of it. I still need to look into this some more.

This took a lot of tweaking, followed by recording a guitar and bass loop so I could hear how everything sounded mixed together.  It's quite interesting how things can sound vastly different (for better or for worse) once you actually mix them all together.

So, tomorrow will hopefully be composing a test piece for a level and maybe actually getting object spawning synced with it.

Wednesday, November 20, 2013

Title Goes Here

Sometimes, figuring out a clever title for the blog post is the hardest part.  Yesterday's drawing failure, power failure and failure to fail made for a fairly obvious choice though.  :)

Then there are days like today that just devolve into a bug hunt and "oh I didn't implement that yet did I" moments.

I can at least sequence object spawning from a script now.

Other things on today's menu included trying to coax a good heavy metal drum kit out of Native Instruments' Studio Drummer.  I had a kit set up with another plugin that I really liked, but it tends to lose its settings and otherwise behave badly.  The drums in Studio Drummer sound really, really nice, but they're a little subdued for what I was going for.  I'm sure it can be fixed with the proper application of EQ, compression and reverb, but it's going to take some work.

So, now that I'm able to spawn an object, I discovered... I really need to figure out my enemy design.  :)  I'm trying to figure out a basic "building block" piece that can be duplicated and assembled into multiple different structures.  I haven't quite figured out exactly what that shape is yet.

Tuesday, November 19, 2013

Fail, Fail, Fail

OK, busy day.  Productive, maybe not so much, but busy.

First, I was playing around in GIMP trying to figure out why I Can't Draw.  Today's discovery:  I can draw horizontal lines relatively well.  Just drawing over the same horizontal line has a very tolerable margin of error.  I can't draw vertical lines at all.  They're not straight, and trying to draw over the same line has like 1000% error.  Minimum.  Tracing circles is interesting too.  There are certain quadrants that I can trace very accurately.  Then I get to, primarily, the lower right quadrant and my pen wobbles all over the place, no matter how slow or careful I'm trying to be.  This requires more study, because I can't yet detect the subtlety that is causing things to go wrong.

Then, the wind knocked the power out.  Not helping.  Then I even briefly lost contact with the cell phone network.  Seriously, I nearly started sharpening sticks to go hunt wooly mammoths during this brush with the stone age.

Fortunately this brush with abject terror only lasted about 90 minutes, but the UPS couldn't handle it and my Linux uptime is ruined.


Next, time to do some actual work.  :)  I started working on some language features to make object spawn scripting work better.  Primarily this involved adding some coroutine support so I can make a timed event wait for that time to arrive.  The system works more or less fine for sequencing things without this, but it's hard to get things to trigger only once.
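
Stripped down to engine-side C++, the core of the "wait until the time arrives" idea isn't much more than this (the names are made up for illustration; the real version lives inside the script system):

    // Tasks park themselves with a wake-up time and run exactly once when it passes.
    #include <functional>
    #include <vector>

    struct WaitingTask
    {
        double wakeTime;                  // level time, in seconds
        std::function<void()> resume;     // what to run when the time arrives
    };

    struct Scheduler
    {
        std::vector<WaitingTask> waiting;

        // A script calls this instead of polling every update.
        void waitUntil(double t, std::function<void()> body)
        {
            waiting.push_back({ t, std::move(body) });
        }

        void update(double levelTime)
        {
            for (size_t i = 0; i < waiting.size(); )
            {
                if (levelTime >= waiting[i].wakeTime)
                {
                    WaitingTask task = std::move(waiting[i]);
                    waiting.erase(waiting.begin() + i);
                    task.resume();        // fires once, then it's gone
                }
                else
                    ++i;
            }
        }
    };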

I'm still debating how I want to do certain things.  On one hand, it would be nice to spawn a specific object at a specific place at a specific time with specific AI.  On the other, pseudo-randomly generating an entire level has a certain appeal.  Unfortunately, the latter has had me stalled for much too long and it's time to press on.

The idea was for this first game to fail quickly while I built up technology, so I could make the second a success.  It's not failing quickly enough... or, something.

Monday, November 18, 2013

Always One More Thing. Or Two.

Last week's daily updates were not exactly daily (and therefore, I suppose, not updates either).  Let's see if this week goes better.  :)

At least you got to miss all of the updates that can be roughly summarized as: "I don't feel well today, YouTube, brain fog, Twitter, I can't figure this out, Facebook, what was I doing again? twitch.tv, this is never going to work, oh I got something working when did that happen?"

I got most of the major overhaul of the scripting system done, so I can basically map any C object directly into the scripting system.  I'm sure that's giving some security expert somewhere palpitations or convulsions, but since it's all bound to specifically named objects and data members, it should be fine.

Overall, this is actually kind of handy, although I'm still not "feeling the love" for a good way to define level scripts.  Somehow, writing out every enemy spawn, by hand, at specific locations, just feels a bit meh.  A quick perusal around the web of methods other people have used pretty much comes down to exactly that, though.

So, I suppose it's time to actually do some real scripting.  But first, I need to... make a couple more tweaks to the scripting system (you didn't see that coming, did you? :-p )

No, really, they're good ones.

First off, I'm kind of trying to sync the levels to music, so I'm going to add something to natively convert measure/beat time codes to seconds.  Easy enough.
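
The conversion itself is trivial, something like this, assuming a fixed tempo and time signature (the real hook would pull those from the track settings):

    // measure/beat -> seconds, measures and beats counted from zero
    double beatTimeToSeconds(int measure, double beat, double bpm, int beatsPerMeasure)
    {
        double totalBeats = measure * beatsPerMeasure + beat;
        return totalBeats * 60.0 / bpm;
    }

    // e.g. measure 8, beat 2 at 140 BPM in 4/4: (8*4 + 2) * 60 / 140 = ~14.57 seconds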

Second, one of the main suggestions I have run across regarding scripting enemy spawns is that hard-coding the numbers is bad because if you want to shift sections around, you have to change a lot of numbers.  So I'm going to build in the concept of time regions.  Basically, there is a global time based on how long the level has been running, and you can recursively open a new local time scope relative to the parent time.  I've used this before, it's pretty slick for sequencing or animating things.  It should make it fairly easy to move around sections anyway.
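
A sketch of the idea, with illustrative names rather than the actual script API:

    // Nested time regions: a stack of offsets, each new region relative to its parent.
    #include <vector>

    struct TimeRegions
    {
        double levelTime = 0.0;           // global time since the level started
        std::vector<double> offsets;      // start times of the currently open regions

        void push(double localStart)      // open a region at 'localStart' in parent time
        {
            double parentBase = offsets.empty() ? 0.0 : offsets.back();
            offsets.push_back(parentBase + localStart);
        }

        void pop() { offsets.pop_back(); }

        // Local time inside the innermost region; shifting a whole section around
        // just means changing one push() value.
        double localTime() const
        {
            double base = offsets.empty() ? 0.0 : offsets.back();
            return levelTime - base;
        }
    };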

With those things and a Spawn() function, hopefully I'm off and running.

Why it took so long to get to this point is one of the great mysteries of the universe.  I suspect it has something to do with the universal programming equation:

t = t * 2
where:
t is how long you estimate something is going to take and
2 is some constant, possibly larger than -INF and smaller than +INF, but quite possibly NaN.

Being recursive, it causes some consternation amongst project managers and people trying to get things done.

Wednesday, November 13, 2013

The Code That Binds

I have been spending the last little while trying to get the data binding stuff worked out.

After several false starts, I have finally settled on using the expression parser / compiler to do the data binding.  Ultimately, this seemed to make the most sense.  Copying data from one arbitrary place to another is basically what a script is supposed to do, plus it gives some additional flexibility.

Quite often it seems that the best approach to solving problems is to ask, "how do I want to interact with this system" rather than, "how do I want to implement it?"  Basically, I decided I wanted to specify data binding something like this:

    ScriptComponent
    {
        Context
        {
            src = BezierFunctionComponent
            dst = SpawnComponent
        }
        Script = dst.Rate = src.Value(t)
    }
That basically lets me map specific components to variable names in the script environment, and then operate on them.  Since this all gets compiled at load time, and the above example only generates a tiny number of nodes in the parse tree (say, 4 or 5), this should execute insanely fast and be very flexible.

Unfortunately, the variable system I had in place was somewhat limited.  So, I'm about half way through getting a proper symbol table system implemented.  This basically lets me memory-map a component into the script system, with the added benefit of putting structure definitions and stack frames (local variables) into the scripting system in general.
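
The gist of it looks something like this, with illustrative names (the real version plugs into the expression compiler):

    // A symbol is just a name mapped to a type and a byte offset inside the
    // bound component, so "dst.Rate" compiles down to a pointer-plus-offset write.
    #include <cstddef>
    #include <string>
    #include <unordered_map>

    enum class VarType { Float, Int };

    struct Symbol
    {
        VarType type;
        size_t  offset;       // byte offset into the bound component
    };

    struct StructDef
    {
        std::unordered_map<std::string, Symbol> members;
    };

    // Example: expose SpawnComponent to the script system.  The Lifetime member
    // is hypothetical, just to show a second entry.
    struct SpawnComponent { float Rate; float Lifetime; };

    StructDef describeSpawnComponent()
    {
        StructDef def;
        def.members["Rate"]     = { VarType::Float, offsetof(SpawnComponent, Rate) };
        def.members["Lifetime"] = { VarType::Float, offsetof(SpawnComponent, Lifetime) };
        return def;
    }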

Still a ways to go, but making progress.

Friday, November 8, 2013

Time to Reflect

It's time to start putting some enemies into the game.

This raises some questions about what exactly is the best way to do that.

The basic components:
  • Enemy spawner - creates enemies from one of several "prototypes"
  • Enemy path - curve enemy should follow
  • Path chooser - binds one of multiple possible paths when an enemy spawns
I've been trying to figure out the best way to implement this in the component system.

Ideally, each of the path components would exist separately from the enemy object, so they only need to be created once.  The Bezier curve component does a lot of pre-tessellation of the curve that I don't want to duplicate every time an enemy is spawned, so it can't live directly inside each enemy instance.

I already have a fairly decent system for binding objects by name, so I can probably use that.  The question is whether to create a unique GameObject for each path and link to the GameObject, or to embed a set of the BezierPathComponents in a single GameObject.  The system for handling multiple components of the same type in a single GameObject is slightly underdeveloped, but this is a pretty obvious way to create "sets" of paths.

The next thing then would probably be to create some sort of "chooser" component that can select a particular component at random and plug it into the GameObject.
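
Something along these lines, using stand-in types for the engine's real ones:

    // Pick one of the shared, pre-built paths at random whenever an enemy spawns.
    #include <cstdlib>
    #include <vector>

    struct BezierPathComponent;           // pre-tessellated curve, built once

    struct GameObject
    {
        BezierPathComponent* path = nullptr;
    };

    struct PathChooserComponent
    {
        std::vector<BezierPathComponent*> paths;    // the shared "set" of paths

        void onSpawn(GameObject& enemy)
        {
            if (!paths.empty())
                enemy.path = paths[std::rand() % paths.size()];
        }
    };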

Along this line, I've also been wondering about the best way to animate values in certain components.


For instance, in the spawner itself, there is a Rate value that specifies the number of GameObjects to spawn per second.  Ideally, I would like to animate this value on a curve so I can essentially turn it on and off at different points in the level.  I would typically have changed the Rate variable from a float (effectively constant once the object is spawned) to a different type that can either bind to a curve or script or something similar and be evaluated each time it is referenced.

I'm thinking this is the wrong way to do things.

What I want is a separate component that animates the value and writes it into the Rate variable directly.  The advantages to this are that any variable can potentially be animated, and there is no overhead for variables that are not animated.

The trick is to do the data binding properly.  I don't really have any sort of reflection available, so I don't have any data-driven way to bind to a specific member variable by name.  I either need to add that or find a good place to add custom code that binds the source and target together. 

I'm not exactly enthused about doing either of those :) but some sort of reflection interface seems like the "correct" choice.  Ideally I could hook that up to the construction code that pulls things out of the attribute files too, but it will have to be introduced piecemeal, because I'm not going back to put it everywhere today.
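
One cheap version could look like this, where every name is hypothetical: each component registers its animatable members in a little name-to-pointer table, and a separate animator component writes through whatever it was bound to.

    // Hand-rolled "reflection": one registration line per member.
    #include <string>
    #include <unordered_map>

    struct SpawnerComponent
    {
        float Rate = 0.0f;
        std::unordered_map<std::string, float*> bindings;

        SpawnerComponent()
        {
            bindings["Rate"] = &Rate;
        }
    };

    // Separate component that animates a bound value; the spawner keeps reading
    // Rate as a plain float and never knows it is being driven.
    struct ValueAnimatorComponent
    {
        float* target = nullptr;              // e.g. spawner.bindings["Rate"]
        float (*curve)(float) = nullptr;      // stand-in for the Bezier curve component

        void update(float levelTime)
        {
            if (target && curve)
                *target = curve(levelTime);
        }
    };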

Thursday, November 7, 2013

Music Collection

Exploring gameplay possibilities today.

Visualizing the frequency graph of the music and finding the peaks was working out pretty well, but it wasn't exactly interactive yet.  In fact, pretty much all of the data about the peaks only ever existed inside the shader, so...

I moved a bunch of the peak finding stuff back to the CPU and stored it so I can use it game-side.

First experiment:  fly around and collect the peaks!  My ship can now carve a swathe through the audio data.  Currently it's just using the ship's bounding box.  It shouldn't be too difficult to give it some sort of masking pattern, but let's see how this works out first.

First problem encountered:  ship acceleration and top speed are way too high for precision maneuvering.  I knew this before, but it wasn't a big concern until I needed to stop at a very specific location.  The biggest problem was that the minimum distance you could move was a little more than a ship's width, while the desired precision is a lot smaller than that.  So, I added acceleration curves to the ship and tweaked some of the values there.  It probably still needs more work, but it's definitely easier to handle now.

Next problem encountered:  this is kind of cool, but I'm not sure it's exactly fun yet.  :)  I think I want to look into bits you do want to collect, and bits you don't want to collect, and maybe the ability to shoot bits away.

I also looked briefly into spawning objects based on the frequency data, but that was resulting in too many objects, or badly spaced objects.  It probably has a place, just needs more control over the spawn rules.


So, next, can we make this fun?  :)

Wednesday, November 6, 2013

Frequently Analytic

Still monkeying around with frequency analysis, trying to brainstorm up interesting ideas of things to do with all of this data I'm generating.

I finally worked out a really good method for getting some very nice, representative data for the "main" frequencies in music.  It was actually pretty easy - basically, taking the difference between the level of a particular frequency and the RMS of all frequencies on a particular frame really makes the peaks jump out (with some logarithms and scales to smooth things out).  I was previously using a rolling average of each frequency band, and while it worked "under laboratory conditions," it was just unreliable.

One slight annoyance is that at 44.1kHz, smallish FFT window sizes (say, 1024 samples) mean that each frequency band is 22Hz in width, which is fine for high frequencies, but is much wider than the spacing between adjacent notes at low frequencies.  Equally unfortunately, you get frequency data all the way out to 22kHz, when most of the musically interesting content stops around 4kHz.

So, I'm using an FFT window size of 4096 and taking the lowest 256 samples, which gets me fairly meaningful data out to 5.5kHz, while expanding out the lower frequencies.
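
In code, the peak measure works out to roughly this (the exact scale factors and smoothing are placeholders for whatever ends up looking right):

    // Compare each bin against the frame's RMS level, in log space.
    #include <cmath>
    #include <vector>

    // bins = FFT magnitudes for one frame (the lowest 256 of them, in my case)
    std::vector<float> peakiness(const std::vector<float>& bins)
    {
        double sumSq = 0.0;
        for (float b : bins)
            sumSq += double(b) * double(b);
        float rms = float(std::sqrt(sumSq / bins.size())) + 1e-9f;

        // How far each bin sticks out above the frame's overall level, in dB-ish units.
        std::vector<float> out(bins.size());
        for (size_t i = 0; i < bins.size(); ++i)
            out[i] = 20.0f * std::log10((bins[i] + 1e-9f) / rms);
        return out;
    }

    // Bins well above zero are the "peaks"; a scale/threshold on top of this
    // controls how many actually show up.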

The next interesting thing is that different styles of music and mixing can apparently give wildly different visual results:

Volbeat - Heaven Nor Hell
  • Kick / Snare / Bass / Guitar all get kind of muddled into the same narrow low frequency bands
  • Drum-only sections really jump out though
  • Singer's voice shows up amazingly well - interesting harmonic ringing as well (why?)
  • Harmonica creates some amazing high frequency sections that wobble back and forth :)
Star Wars - Main Theme
  • Orchestras apparently cover a HUGE frequency range with meaningful music frequencies.
  • There are peaks across the entire spectrum that clearly correspond with the music.
  • Loud and quiet passages are also clearly visible.
  • Interestingly, there is almost nothing below ~80Hz, an area that gets completely clogged up in the rock mixes.
Laura Shigihara - Jump
  • The instruments give amazingly clear individual notes
  • Occasionally her voice jumps out, but most of the time it gets lost in the other instruments
  • Sometimes, though, I can see two parallel frequency bands corresponding with vocal harmony parts, which is cool :)
  • There's actually not much OTHER than peaks here, very clean and quiet frequency map.
Me - Rock7
  •  Kick / Snare / Guitar / Bass are just kind of jumbled into the low bands
  • The artificial harmonics really stand out...
  • But there are actually a huge number of peaks that jump out semi-randomly across the entire spectrum.  They kind of correspond with the lead guitar, and maybe cymbal / hi-hat hits.
  • Unfortunately, the lead guitar kind of gets lost among the noise.
Some of my other rock tracks exhibit the same sort of frequency clustering below 200Hz.  I suspect this is maybe just mixing low frequencies too hot, pushing up the RMS such that only these frequencies get above it.  There are some higher frequency synth sections that manage to jump out of the mess and show up very clearly though.

It will probably take a bit of mixing experimentation to figure out what works well here.  Actually, I could probably consider dropping everything below 60-80Hz from the RMS calculation... will need to investigate that...

Tuesday, November 5, 2013

Visualization

More visualizing music today.

I started off getting the music synced to the rendering.  This wasn't too difficult, but I need to read ahead quite a large amount of the audio file to "fill up" the data buffer before the music starts.

Then, I spent most of the rest of the day playing around with different settings to see if I could find something I like, and that will work for multiple different music files.

Mostly, I want to detect the current peaks in the frequency data, and detect when beats occur.  This is a little bit difficult, as frequency data is kind of messy and noisy, but maintaining a running average for each frequency and seeing when the current level is above the average seems to work reasonably well.  There is a little bit of magic scaling going on to produce more or fewer peaks, and it seems to be difficult to pick a value that works nicely for all files.  I suspect I can probably automatically tweak it to gradually move towards a target number of peaks-per-second.
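
The auto-tweak would probably just be a proportional nudge each frame, something like this sketch with made-up numbers:

    // Push the threshold toward whatever value produces the target peaks-per-second.
    struct PeakTuner
    {
        float threshold  = 3.0f;    // the current "magic scaling" value
        float targetRate = 8.0f;    // desired peaks per second
        float gain       = 0.05f;   // how aggressively to correct

        void update(float peaksThisSecond)
        {
            // Too many peaks -> raise the threshold; too few -> lower it.
            threshold += gain * (peaksThisSecond - targetRate);
            if (threshold < 0.0f)
                threshold = 0.0f;
        }
    };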

Now, to figure out what to do with that.  There is a game in here just waiting to get out, I can feel it.  :) 

Monday, November 4, 2013

Audio From Before Time

Time for a change of pace.

Several months ago I did some prototyping work on using music files to, to some extent, generate levels.  I've been planning on using this for various later parts of the game, but since I'm feeling a loss of momentum at the moment, it seemed like a good time to go back and bring some of that code up to date.

Since this code predates the last major engine architecture change (and before I started blogging regularly), it's fairly out of date.  At the time I was using a horrible, cobbled-together .wav streamer.  Since then, I completely re-wrote the audio streaming system (now supporting .ogg files, yay) and now have a much better system for dealing with streamed audio buffers.  Today has mostly been a bunch of cut-and-paste to move everything into a proper component and hook it up to the streamer callback.

So, that's coming along; it now sends everything through the FFT to get a spectrum out the other end and dumps it into a texture so I can actually visualize the audio.
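
The shape of the hookup is roughly this, where the FFT and texture-upload calls are hypothetical stand-ins for the engine's own API:

    // Streamer callback -> FFT -> one row of the visualization texture per frame.
    #include <functional>
    #include <vector>

    struct AudioVisualizerComponent
    {
        size_t fftSize = 4096;
        std::vector<float> window;    // accumulate samples until we have one FFT's worth

        // Hooks supplied by the engine; stand-ins for the real FFT and texture upload.
        std::function<std::vector<float>(const float*, size_t)> computeFFT;
        std::function<void(const float*, size_t)> uploadSpectrumRow;

        // Called by the streamer whenever a newly decoded buffer is ready.
        void onStreamBuffer(const float* samples, size_t count)
        {
            window.insert(window.end(), samples, samples + count);
            while (window.size() >= fftSize)
            {
                if (computeFFT && uploadSpectrumRow)
                {
                    std::vector<float> spectrum = computeFFT(window.data(), fftSize);
                    uploadSpectrumRow(spectrum.data(), spectrum.size());
                }
                window.erase(window.begin(), window.begin() + fftSize);
            }
        }
    };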

I had a few new ideas for gameplay with this system.... I'm not sure how I'll work it into the fiction of what I want to do for the rest of the game, but I'm sure that can be worked out.  :)  I'm half-inclined to make it a game of its own and maybe get something actually, you know, done and released.  We'll see about that...

Friday, November 1, 2013

Shaders and Pipeline

Yesterday was a bit of a distraction with Halloween, and hardware upgrades gone awry (not to my PC, my PC is still sufficiently awesome).  That's all dealt with, so back to some real work today.

I was still unhappy with the way materials and shaders were coming out of the pipeline, so I spent most of today cleaning all of that up.  It makes it quite a bit easier to modify specific shader parameters on specific materials, which I really need.

The main problem is that when I converted the shaders to the shader node / graph system, I was basically baking the shader constant values directly into the shader.  This basically meant there was one shader compiled for every material on every mesh.  On one hand, this wasn't so bad, because I don't have that many models really.  On the other, it's just awful.  :)  So, I put in proper shader sharing within a single model and added material / shader constant blocks that are set (and potentially modified) at runtime.
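
For illustration, the split looks something like this (the struct layout and field names are placeholders, not the actual renderer API):

    // One shared shader, many materials, each with its own editable constant block.
    struct Shader;                               // compiled once per node graph, shared

    struct MaterialConstants                     // per-material values, set and tweaked at runtime
    {
        float baseColor[4] = { 1, 1, 1, 1 };
        float roughness    = 0.5f;
        float emissive     = 0.0f;
    };

    struct Material
    {
        Shader* shader = nullptr;                // many materials can point at one shader
        MaterialConstants constants;             // uploaded as a constant buffer when drawing
    };

    // Game-side tweak: change one material's parameter without recompiling anything.
    inline void setEmissive(Material& m, float value)
    {
        m.constants.emissive = value;            // picked up next time the block is uploaded
    }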

That's coming along well.  There is maybe a little cleanup required, and a few game-side functions to add to access the material data in a friendlier fashion, but that should all be done by the end of the weekend.