This game has been Greenlit by the Community!

The community has shown their interest in this game. Valve has reached out to this developer to start moving things toward release on Steam.

Astrobase Command
July 24 - Jellyfish Games


Hello everyone!

This week was a quiet one so far on the programming side. I had to attend to other business to do my part in keeping this show running. Luckily, my weekend is wide open for wrapping up connectors once and for all (for a little while at least). So look forward to more of that next week!


Hi Everybody!

I’ve been busy working on the heads this week as well. Finished up the tests I was doing and started in on the procedural textures, making all the masks and details it will need. Fun work but not really much to talk about or to show.


Hey guys,

This week I’ll talk a bit more about missions, as it is the major feature I have left to do before we are ready to launch the early access version.

To reiterate previous posts, the purpose of missions from a game perspective is several-fold:

1) Construct and tell compelling stories about the characters that are sent on a mission
2) Close the core systems game loop. This necessitates a reward system, and balancing that reward against the risk of a mission (which in turn must be derived from the planet parameters). A reward system gives players a reason to send characters on missions in the first place. The simplest “loot” examples are resources: food, medicine, parts, etc. More importantly, there is character progression, and specifically personality trait progression, which in my opinion is the most interesting kind.
3) Give a purpose to the various skills, stats, and traits that comprise a character, and the gear they are assigned, to give them meaning in game terms beyond the on-station gameplay.
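To make the risk-vs-reward idea above concrete, here is a minimal sketch of scaling a mission's loot by a risk score derived from planet parameters. The parameter names (`hazard`, `gravity`, `temperature_extremity`) and the weighting are my own illustrative assumptions, not Astrobase Command's actual data model.

```python
# Hypothetical sketch: reward scaled by planet-derived risk.
# Parameter names and weights are illustrative placeholders.

def mission_risk(planet):
    """Combine planet parameters into a 0..1 risk score."""
    weights = {"hazard": 0.5, "gravity": 0.3, "temperature_extremity": 0.2}
    return sum(weights[k] * planet.get(k, 0.0) for k in weights)

def mission_reward(base_loot, planet):
    """Scale base loot quantities so riskier missions pay out more."""
    risk = mission_risk(planet)
    return {item: round(qty * (1.0 + risk)) for item, qty in base_loot.items()}

planet = {"hazard": 0.8, "gravity": 0.4, "temperature_extremity": 0.6}
loot = mission_reward({"ore": 10, "medicine": 4}, planet)
```

A real version would also factor in character skills and gear, but the core loop shape (risk in, scaled reward out) stays the same.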

Suffice it to say, there are a lot of moving parts in the mission system, and above all it’s the one feature we absolutely need to nail, both in fulfilling its goals and in delivering a fun and interesting game experience.

Up until now, I’ve been working on various prototypes which test and validate individual aspects of missions. These include narrative generation systems that describe what a character does (and the system that actually does it), conversation generation between characters, mission structure and flow, and the various activities characters perform such as combat, mining, and landing. There are also the auto-generated planets and planet features on which a mission takes place; these are the setting and backdrop, and must be interesting and compelling in their own right. It’s not fun to explore a bunch of samey locations where it’s obvious and predictable how game elements are reused!

Now that I am satisfied with the results of these prototypes, I took some time to review the lot of them and start planning the final mission implementation. For example, the generated story system was using data I had hand-written (so as to isolate my variables), but now I need to feed it the auto-generated planet parameters that I tested in an entirely separate code module. Before that, I’m taking a second pass on the planet location data and its organization within the systems, so the storygen system can generate good stories from it. Everything is tied together, and it should be.

In short, what’s left to do is putting it all together into a cohesive feature. The very first step here is defining the entire list of things which need to be (re)designed, (re)implemented, and/or integrated based on how their prototypes played, which I did last week.

Typically with prototypes, you test something to make sure it works, it is fun, and the game dynamics and aesthetics fulfill the vision. But in order to isolate that something, the prototype needs to live in a self-contained box to ensure it is a proper test. Then when you take it out of the box to integrate into the larger game, there are always adjustments that need to be made so the pieces fit with the other toys in the other boxes.

This week I’ve started grinding through the list of systems now that I am confident what the pieces need to look like, how they connect, and how their final put-together form should play.

On a personal note, I took a few weeks and moved to beautiful Nanaimo. There were a lot of reasons for this move, and I’ll tell you the one which occupied my thoughts.

My observation from working in the games industry is that the environment in which games are made at a human level (at least everywhere I have worked) is contrary to good games being made. It is counter-productive to live in a concrete and glass office building from 9-to-6:30 and “be creative.” I could ruminate on the intricacies of this subject for at least an entire post, but I’ll spare you the think piece and simply say that as long as I’m working from home remotely I might as well live in a place I find inspiring. And if one believes games to be creative works, then it is also logical to assume that this matters to the results.

June 22 - Jellyfish Games


This week I jumped in and attacked one of those strange beasts that lie at the intersection between art, design, and code: a GUI (Graphical User Interface). We tend to split these up and each do a bit of the work. However, this one was heavily dependent on work and systems I had designed, and Adam was busy with other super important code stuff, so I have been spending my time implementing the Species Creation UI. Or at least the first pass of it.

It’s a pretty intricate piece of UI. In part because it is tied to (and controls) the massive substance that generates portraits, and in part because it needs to have some UI elements and concepts that are non-standard.

This is one of the more interesting parts of working on a small team: you can’t just hand work over to others once a task leaves the narrowest definition of your discipline, but sometimes end up taking it all the way yourself.


This past week, I’ve been focusing on setting up some simple game settings. I specifically gave myself the ability to disable graphical features so my terribad computer could run it smoothly. I’ll be spending some of the weekend integrating Daniel’s connector graphics to really tie together the station graphically.


This week I wrote the sentence parsing for generic actions that a character can execute on a mission. For example, a pilot avoiding a plasma storm while landing on a planet is structurally the same in code as a character scaling an ice wall while exploring a planet. Both have the form of subject, modifier (quality), operative verb, object. This runs after a successful result has been generated that allows the character to execute the action.
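As a rough sketch of that structure, a generic success sentence can be assembled from those four slots. The template and example words below are my own placeholders, not the actual generator:

```python
# Illustrative sketch: building a success sentence from the
# subject / modifier (quality) / operative verb / object structure.

def success_sentence(subject, modifier, verb, obj):
    """Fill the four generic slots into one sentence."""
    return f"{subject} {modifier} {verb} {obj}."

s1 = success_sentence("The pilot", "deftly", "avoids", "the plasma storm")
s2 = success_sentence("The explorer", "carefully", "scales", "the ice wall")
```

The point is that one code path covers every action category, as long as the result was a success.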

My next step is writing the failure cases (unsuccessful result), which cannot be generic. If a pilot doesn’t manage to avoid the plasma storm, or if a character slips and falls while scaling an ice wall these are vastly different things. But I find failure more interesting than success, and opportunities for recovery or partial recovery (the character uses an ice pick to stop his fall, but now he’s stuck on the side of a cliff) to be where I want to spend my time.

I’ve probably posted before about having a system under which characters can be “creative” based on their skills and personality. But this is why each system needs to be custom per action category — how a character can be creative when trying to keep the shuttle from crashing is different than how a character can be creative when trying to survive the ice storms. I suspect some general patterns might emerge, and we’ll see what I can reuse.
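One plausible way to express "custom per action category" is a dispatch table of failure handlers, one per category. The category names and sentences below are invented for illustration:

```python
# Hypothetical sketch: per-category failure handlers, since failures
# (unlike successes) can't share one generic template.

def pilot_failure(character):
    return f"{character} loses control in the plasma storm"

def climb_failure(character):
    return f"{character} slips on the ice wall, catching a pick mid-fall"

# Each action category maps to its own bespoke failure logic.
FAILURE_HANDLERS = {"piloting": pilot_failure, "climbing": climb_failure}

def resolve_failure(category, character):
    handler = FAILURE_HANDLERS.get(category)
    return handler(character) if handler else f"{character} fails"
```

If general patterns do emerge, shared helpers can be factored out of the handlers without losing the per-category specificity.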


Greetings fellow Living Beings!

Today I come to you with a bag of animations. As I’ve previously mentioned we’re doing indie-style mocap. This means that I, using a Kinect and some super nifty software, record myself doing things (without having to wear a skintight suit with ping-pong balls glued on it, sadly) that then get interpreted, stabilized, cleaned, scrubbed, massaged and generally processed to get something that’s usable in-game.

Some examples of the animations in-engine

Now, this is actually trickier than it first appears, not because of the different skills needed for the various pieces of software involved, but because all that processing to get rid of jittering loses a lot of the finer detail in the movement. Subtle things like head movement, nodding, and shaking become nearly impossible to get through the pipeline without exaggerating them a lot.

Of course, for our purposes, with the camera not necessarily zoomed all the way in on a character, this actually works out pretty nicely: the exaggerated motions read a lot better at a distance than a realistic version would.

Once all the animations are recorded, you bring them into software that interprets what you are doing, or more specifically what position and rotation each of your limbs has, per frame. Once that has finished processing, the data is pretty good but still needs some fixes. For example, where the tracking drops, the bones in question freak out and can end up flipping around like crazy. Another issue, which is actually much more common, is that when you rotate the pelvis, the software tends to rotate the feet to face the same direction as the pelvis while keeping the heels planted. Both of these look pretty silly but are relatively easy to fix, so into the animation software we go to fix them and get rid of any remaining jittering or general wobbliness.

At this point you are ready to get it in-game. Which in Unity, using Mecanim’s retargeting, is pretty easy and straightforward.

And that’s the story of how we now have 200+ additional animations without having to hire an animator for half a year. Hopefully the quality is high enough to be acceptable.

This past week has been spent in tools mode once again. Since Daniel started using Unity 5 for his art integration and tweaked our lighting, we haven’t actually had a chance to integrate the new sections yet. So, I’ve been working to smoothen out the process of integrating everything into the game.

I’m also taking the opportunity to address some workflow issues we’ve had in that department in the past. Our current setup tends to break everything when Daniel reimports his models into Unity. So I needed to modify our approach to storing and loading rooms so this wouldn’t be an issue in the future. Luckily, the solution also comes with a nifty performance gain when loading rooms, which is a welcome bonus!


I dug into Adam’s new debug command console, and it’s fantastic! It allows me to implement features with complicated inputs before we have the GUI. For example, spawning procedurally generated items (with parameters) and equipping them to individual characters with text console commands, which is needed for missions.
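For flavor, a console command like this typically boils down to a name plus key=value parameters. This is a toy parser in that spirit; the command and parameter names are hypothetical, not the actual console syntax:

```python
# Toy sketch of parsing a debug console line of the form:
#   command_name key=value key=value ...
# Command and parameter names are invented for illustration.

def parse_command(line):
    parts = line.split()
    name, args = parts[0], {}
    for token in parts[1:]:
        key, _, value = token.partition("=")
        args[key] = value
    return name, args

name, args = parse_command("spawn_item type=rifle quality=3 owner=Ripley")
```

The parsed dictionary can then drive whatever spawning or equipping code the GUI would eventually call.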

Hey everybody! Adam here. Sorry again for the delay on this update! I had a 48-hour game jam this weekend and it ate into my posting time on Friday and completely fried my brain for Monday. Now, oooon with the show!


Hello fellow humans!

This week I’ve actually been busy with boring behind-the-scenes work that isn’t very interesting, so instead I’ll talk a bit about the character work I did the other week.

Some examples of the updated characters

Now, there are two major parts to this work with a common goal. The goal was to update the materials to use the new Standard Shader (PBR) in Unity 5.

The first step was to expand the Substances (procedural textures) to output not only Metallic/Smoothness maps but also meaningful controls for them. The Substances for the characters take RGB masks that define which area is which, so the Substance knows where and how to apply colors; this work extends the same approach to the specular term (Metallic/Smoothness in this case).

This allows us, for example, to make the trousers out of shiny metal if we wanted to. Most of the time only details and trims are set to metal since, well, walking around in silver and gold clothes looks a bit overpoweringly cheesy.
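The mask idea above can be sketched simply: each channel of the mask marks a clothing region, and a per-region setting drives the metallic value there. The region assignments and values are illustrative assumptions, not the actual Substance graph:

```python
# Sketch of RGB-mask-driven material control: each mask channel marks a
# clothing region; a per-region setting decides how metallic that region is.

def metallic_for_pixel(mask_rgb, region_metallic):
    """mask_rgb: (r, g, b) weights in 0..1 marking region membership.
    region_metallic: metallic value assigned to each channel's region."""
    r, g, b = mask_rgb
    return (r * region_metallic["r"]
            + g * region_metallic["g"]
            + b * region_metallic["b"])

# Say trims live in the red channel and are metal; cloth (green) and
# lining (blue) are not.
settings = {"r": 1.0, "g": 0.0, "b": 0.0}
trim_value = metallic_for_pixel((1.0, 0.0, 0.0), settings)
```

A real Substance does this per texel on the GPU, but the logic per pixel is this same weighted blend.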

The second step is to add Normal maps, and extend the same type of control to them as to the Albedo and Specular terms before them. Now, I know normal maps have been around for a long while and were supported prior to Unity 5. The reason we held back on adding them is that it’s a fair amount of work.

While adding the technical side of it is pretty quick, the main chunk of work is in manually sculpting the clothes. This includes sculpting in seams, details, trims, folds, buttons, pleats, wrinkles of many kinds, and so on.

After the details are sculpted in they are baked out into a Normal map that then passes through the relevant Substance where we can control how much each part of the clothing is impacted by its sculpted details.


This past week was a short one for me. Between other commitments and the game jam, I only really had a chance to address part of the connector placement flow. I should have more to show on that in this week’s update, though!

April 24 - Jellyfish Games


Hi Everybody!

Since we last updated you on what was going on, a lot of work has been done. For me it’s been split into two main chunks: space and characters. For the characters, I’ve been adding detail and updating their procedural textures to fully take advantage of the new possibilities in Unity 5. Space, on the other hand, has been more of an extension than an iteration.

Where previously we only had an earth-like planet, we now have gas planets, ring systems, rocky planets, and moons. Today I’ll talk more about the moons.

Some moons just hanging around

So, the way our moons work is that we feed them several layers of surface features (craters, cracks, etc.); the Substance takes that and, with a ton of variables available for tweaking, outputs the appropriate albedo, metallic, roughness, normal, height, and ambient occlusion maps.

This lets us not only set up several different moons while re-using a lot of the masks (so we avoid storing one unique texture on disk for each moon and composition), but also means that, if we want to, we can interpolate between any number of these exposed variables and generate an almost endless variety of moons or rocky planets.
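The interpolation trick above can be sketched as blending between two hand-tuned parameter sets; the parameter names (`crater_density`, etc.) are invented stand-ins for whatever the real Substance exposes:

```python
# Sketch: generate many moon variants by linearly interpolating between
# two hand-tuned parameter sets. Parameter names are illustrative.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in 0..1."""
    return a + (b - a) * t

def blend_moon(params_a, params_b, t):
    """Interpolate every exposed variable between two moon presets."""
    return {k: lerp(params_a[k], params_b[k], t) for k in params_a}

icy = {"crater_density": 0.2, "crack_intensity": 0.9, "roughness": 0.3}
rocky = {"crater_density": 0.9, "crack_intensity": 0.1, "roughness": 0.8}

# Five variants sliding from icy to rocky.
variants = [blend_moon(icy, rocky, i / 4) for i in range(5)]
```

Feed each variant's parameters back into the Substance and you get a distinct moon without any new textures on disk.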

Of course, the risk you run with a highly reusable procedural system is that it can get repetitive and bland; hopefully our solutions are enough to keep it feeling fresh and high quality.


These past few weeks have been spent hammering away at tools, stabilizing code for playtesting and more Datapad Programs, all in anticipation of the madness that will be Early Access once the first build is out.

Tools work has included basic audio management, a new approach to saving data that will be the groundwork for our save files, solutions for combining character meshes (plus runtime combining for merging things like rooms together), and a tool for placing volumes in a room that can be automatically populated with knickknacks. But my favorite so far (probably because I’m a programmer) is a tool that brings up a debug command-line terminal in the game, letting me execute code at will for testing purposes.

I’m also in the middle of working on module connectors. These are currently our only station piece that allows the player to connect modules on different “floors”. The way I’m approaching it allows me to extend its functionality to things like solar panels, defensive turrets and the like in the future.

Finally, I completely re-worked how we manage our UI in the Datapad. This made the menus more robust and testable, which will come in handy when everyone starts breaking our program menus in earnest :).

See you next week!
