LineWars VR blog posts

Dec 23rd, 2017 - Modeling and Texturing Asteroids

Asteroid references

A week or so ago I began looking into creating asteroids for LineWars VR. In the original LineWars II I had an asteroid mesh with 12 vertices and 20 polygons. I then scaled this randomly and differently in all three dimensions to create asteroids of various sizes and shapes. However, for LineWars VR I want to have something a bit more natural-looking, so I spent a couple of days looking for ideas and tutorials about asteroid generation. I thought that modeling the asteroid mesh by hand would not create suitable variation, so I mainly looked into procedural asteroid generation. I even found a Unity forum thread about that exact subject, so I was certainly not the first one trying to do this. The Unity forum thread did not seem to have exactly what I was after, though. I also found a tutorial about creating asteroids in Cinema 4D, but those asteroids did not look quite like what I had in mind for LineWars VR.

Procedural textures

Finally I found a thread about procedural asteroid material in Blender, which seemed to have results much like what I was after. So, I decided to first look into creating a suitable texture for my asteroid, and only after that look into the actual shape of the asteroid. The example used a procedural texture with Cells Voronoi noise together with some color gradient. At first I tried to emulate that in Cinema 4D, but did not quite succeed. Finally I realized that the Cinema 4D Voronoi 1 noise actually generated crater-like textures when applied to the Bump channel, with no need for a separate color gradient or other type of post-processing! Thus, I mixed several different scales of Voronoi 1 (for different sized craters), and added some Buya noise for small angular-shaped rocks/boulders. For the diffusion channel (the asteroid surface color) I just used some Blistered Turbulence (for some darker patches on the surface) together with Buya noise (again for some rocks/boulders on the surface).

Procedural asteroid mesh

Okay, that took care of the textures, but my asteroid was still just a round sphere. How do I make it look more interesting? For my texturing tests I used the default Cinema 4D sphere object with 24 segments. For that amount of segments, the resulting sphere has 266 vertices. For my first tests to non-spherify this object, I simply randomized all these vertex coordinates in Unity when generating the mesh to display. This sort of worked, but it generated a lot of sharp angles, and the asteroid was not very natural-looking. Many of the online tutorials used the FFD (Free Form Deformation) tool in the modeling software to generate such deformed objects. I could certainly also use the FFD tool in Cinema 4D for this, but I preferred something that I could use within Unity, so that I could generate asteroids that are different during every run of the game, just like they were in the original LineWars II.
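
For reference, that quick vertex randomization test was roughly along these lines. This is just a sketch of the idea rather than my exact code (the offset range is made up), but it shows why the result had so many sharp angles, as every vertex gets its own independent random offset:

    // Sketch: displace every vertex of the sphere mesh by its own random offset.
    // Works, but neighboring vertices move independently, so the surface gets
    // full of sharp angles instead of smooth natural-looking bumps.
    void RandomizeVertices(Mesh mesh)
    {
        Vector3[] vertices = mesh.vertices;
        for (int i = 0; i < vertices.Length; i++)
            vertices[i] += new Vector3(Random.value - 0.5f, Random.value - 0.5f, Random.value - 0.5f) * 0.2f;
        mesh.vertices = vertices;
        mesh.RecalculateNormals();
        mesh.RecalculateBounds();
    }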

I decided to check if Unity would have an FFD tool, and found a reasonably simple-looking FreeFormDeformation.cs C# code for Unity by Jerdak (J. Carson). I modified that code so that, instead of creating the control points as GameObjects for the Unity editor, I created the control points in code with some random adjustments, and then used these control points for deforming the original sphere mesh while instantiating a new asteroid in Unity. After some trial and error with suitable random value ranges I was able to generate quite convincing asteroid forms, at least in my opinion. This is my current random adjustment, which still keeps the asteroids mostly convex, so I don't need to worry about self-shadowing (as I want to have dynamic lighting, but plan to avoid real-time shadows for performance reasons):

    // p0 is the origin of the FFD control lattice; S, T and U are the lattice
    // axis vectors and L, M and N the control point counts along each axis
    // (all fields of the FreeFormDeformation class). The regular control point
    // position is scaled by a random factor between 0.5 and 4.5 on each axis.
    Vector3 CreateControlPoint(Vector3 p0, int i, int j, int k)
    {
        Vector3 p = p0 + (i / (float)L * S) + (j / (float)M * T) + (k / (float)N * U);
        return new Vector3(p.x * (0.5f + 4 * Random.value), p.y * (0.5f + 4 * Random.value), p.z * (0.5f + 4 * Random.value));
    }
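
The control points are then used to deform the actual mesh vertices. As far as I understand, Jerdak's script implements the classic Sederberg-Parry FFD: each mesh vertex is first expressed in the local (s, t, u) coordinates of the control lattice, and its deformed position is then a Bernstein-weighted sum of all the control points. A condensed sketch of that evaluation (not the actual code from FreeFormDeformation.cs) could look like this:

    // Sketch: deform one vertex, given its lattice coordinates (s, t, u) in the
    // 0..1 range and the (L+1) x (M+1) x (N+1) grid of randomized control points.
    Vector3 Deform(float s, float t, float u, Vector3[,,] controlPoints)
    {
        Vector3 result = Vector3.zero;
        for (int i = 0; i <= L; i++)
            for (int j = 0; j <= M; j++)
                for (int k = 0; k <= N; k++)
                    result += Bernstein(L, i, s) * Bernstein(M, j, t) * Bernstein(N, k, u) * controlPoints[i, j, k];
        return result;
    }

    // Bernstein basis polynomial B(n, i, x) = C(n, i) * x^i * (1 - x)^(n - i)
    float Bernstein(int n, int i, float x)
    {
        float binomial = 1;
        for (int c = 0; c < i; c++)
            binomial *= (n - c) / (float)(c + 1);
        return binomial * Mathf.Pow(x, i) * Mathf.Pow(1 - x, n - i);
    }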

Exporting procedural textures from Cinema 4D to Unity

Now I had a nicely textured sphere in Cinema 4D, and a nice looking asteroid mesh in Unity, but I still needed to somehow apply the procedural texture generated in Cinema 4D to the mesh deformed in Unity. I first looked into some YouTube tutorial videos, and then began experimenting. Using the Bake Object command in Cinema 4D I was able to convert the sphere object to a six-sided polygon object with proper UV coordinates, together with baked textures.

To generate a normal texture for Unity from the bump channel in Cinema 4D I had to use the Bake Texture command, which gives me full control over which material channels to export, how the normals should be exported (using the Tangent method, as in the screen shots below), and so on.

When I imported this mesh into Unity, applied my Free Form Deformation to it (which meant I had to call the Unity RecalculateNormals() method afterwards), and applied the texture to the mesh, there were visible seams where the six separate regions met. After some googling I found a blog post that explained the problem, together with code for a better method to recalculate normals in Unity. I implemented this algorithm, and got a seamless asteroid! Here below is an animated GIF captured from the Unity game viewport (and sped up somewhat).
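
The gist of the improved algorithm is that vertices which sit at the same position, but are duplicated because of the UV seams, should end up with the same averaged normal. A stripped-down sketch of that idea (not the code from the blog post, which also handles smoothing angles) could look something like this:

    // Sketch: recalculate normals so that vertices sharing a position (but
    // duplicated because of UV seams) also share the same normal, which removes
    // the visible seams after the mesh has been deformed.
    // (Requires using UnityEngine; and using System.Collections.Generic;)
    void RecalculateNormalsSeamless(Mesh mesh)
    {
        Vector3[] vertices = mesh.vertices;
        int[] triangles = mesh.triangles;
        Vector3[] normals = new Vector3[vertices.Length];

        // Group vertex indices by position, so the seam duplicates end up together.
        var groups = new Dictionary<Vector3, List<int>>();
        for (int i = 0; i < vertices.Length; i++)
        {
            List<int> list;
            if (!groups.TryGetValue(vertices[i], out list))
            {
                list = new List<int>();
                groups[vertices[i]] = list;
            }
            list.Add(i);
        }

        // Accumulate the (area-weighted) face normal onto each vertex of a triangle.
        for (int t = 0; t < triangles.Length; t += 3)
        {
            int i0 = triangles[t], i1 = triangles[t + 1], i2 = triangles[t + 2];
            Vector3 faceNormal = Vector3.Cross(vertices[i1] - vertices[i0], vertices[i2] - vertices[i0]);
            normals[i0] += faceNormal;
            normals[i1] += faceNormal;
            normals[i2] += faceNormal;
        }

        // Average the normals over each position group and normalize the result.
        foreach (var group in groups.Values)
        {
            Vector3 sum = Vector3.zero;
            foreach (int i in group)
                sum += normals[i];
            sum.Normalize();
            foreach (int i in group)
                normals[i] = sum;
        }
        mesh.normals = normals;
    }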

Asteroid shader

After I got the asteroid working with the Standard Shader of Unity, I wanted to experiment with coding my own shader for it. I had several reasons for creating a custom shader for my asteroid object:

  1. I wanted to learn shader programming, and this seemed like a good first object for experimenting with that.
  2. I had an idea of combining both the diffuse texture and the normal texture into a single texture image, as my diffuse color is just shades of gray. I can pack the 24bpp normal map together with the 8bpp color map into a single 32bpp RGBA texture (see the sketch after this list). This should save some memory.
  3. I wanted to follow the "GPU Processing Budget Approach to Game Development" blog post in the ARM Community. I needed to have easy access to the shader source code, and be able to make changes to the shader, for this to be possible.
  4. I am not sure how efficient the Standard Shader is, as it seems to have a lot of options. I might be able to optimize my shader better using the performance results from the Mali Offline Compiler for example, as I know the exact use case of my shader.
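
For point 2, the packing itself could be done offline with a small helper along these lines. This is just a sketch of the idea (PackNormalAndColor is a hypothetical helper of my own, and it assumes both source textures are the same size and imported as readable); the custom shader would then sample the tangent-space normal from the RGB channels and the gray diffuse shade from the alpha channel:

    // Sketch: pack a grayscale diffuse map into the alpha channel of the
    // normal map, producing one 32bpp RGBA texture (normal in RGB, color in A).
    Texture2D PackNormalAndColor(Texture2D normalMap, Texture2D diffuseMap)
    {
        Texture2D packed = new Texture2D(normalMap.width, normalMap.height, TextureFormat.RGBA32, false);
        Color[] pixels = normalMap.GetPixels();
        Color[] diffuse = diffuseMap.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
            pixels[i].a = diffuse[i].grayscale;    // store the gray shade in alpha
        packed.SetPixels(pixels);
        packed.Apply();
        return packed;
    }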

I followed the excellent tutorials by Jasper Flick from CatlikeCoding, especially the First Light and Bumpiness tutorials, when coding my own shader. I got the shader to work without too much trouble, and was able to check the performance values from the Mali Offline Compiler:
C:\Projects\LineWarsVR\Help>malisc -c Mali-T760 Asteroid.vert
  4 work registers used, 15 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   19      10      0       A
  Shortest Path Cycles:   9.5     10      0       L/S
  Longest Path Cycles:    9.5     10      0       L/S

C:\Projects\LineWarsVR\Help>malisc -c Mali-T760 Asteroid.frag
  2 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   9       4       1       A
  Shortest Path Cycles:   4       4       1       A, L/S
  Longest Path Cycles:    4       4       1       A, L/S

So, the vertex shader (which needs to calculate a BiNormal vector for the vertex, based on the existing Normal and Tangent vectors) takes 10 GPU cycles per vertex to execute. For the 266 vertices of the asteroid that means at most 2660 GPU cycles per asteroid, and probably less if the back-facing polygons have been culled in an earlier step of the rendering pipeline. The fragment shader (which needs to calculate the tangent-space normal vector from the normal map and the Normal, Tangent and BiNormal vectors provided by the vertex shader) takes only 4 GPU cycles per fragment (pixel). As my Galaxy S6 (which is close to the low end of the Gear VR-compatible devices) has a GPU processing budget of 28 GPU cycles per pixel, my asteroid is well within this budget.
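
For reference, the 28-cycle figure comes from the simple budget calculation described in the ARM blog post: the GPU clock speed times the number of shader cores gives the cycles available per second, which is then divided by the frame rate and the number of pixels. Assuming the Galaxy S6's Mali-T760 MP8 runs at around 772 MHz and the full 2560x1440 screen is redrawn at 60 fps, the numbers work out roughly as follows:

    772 MHz * 8 cores / 60 fps       = ~103 million GPU cycles per frame
    ~103 million / (2560 * 1440 px)  = ~28 GPU cycles per pixel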

Nov 28th, 2017 - The Beginning

Okay, so I decided to start working on a new game project, after quite a long while. Many times since I coded and released my LineWars II game, I have been thinking about getting back to coding a new space game. However, I hadn't dared to start working on one, as it seems that all games nowadays are built by a large team of developers, artists, and other professionals. I thought that a single person making a game during their free time would probably not have a chance of succeeding in competition against such big game projects. Then I recently ran across End Space for Gear VR, which idea-wise is a rather similar space shooter to what LineWars II was. Reading the developer's blog, I found out that it was actually created by a single person. As there are not all that many cockpit-type space shooter games for Gear VR, and End Space seems to be rather popular, I thought that perhaps there would also be interest in a Virtual Reality port of my old LineWars II game!

As the Gear VR runs on mobile devices, the graphics and other features need to be quite optimized and rather minimalistic to keep the frame rate at the required 60 fps. This nicely limits the complexity of the game, and also gives some challenges, so this should be a good fit for my talents. I am no graphics designer, but I do like to optimize code, so hopefully I can get some reasonably good-looking graphics running fast. No need for a team of artists when you cannot take advantage of graphics complexity. :-)

Music

I heard about End Space at the end of November 2017, and after making the decision to at least look into porting LineWars II to Gear VR, I started looking at what sort of assets I already had or could easily create for this project. Pretty much the first thing I looked into was music. LineWars II used four pieces of music originally composed for the Amiga 500 by u4ia (Jim Young). He gave me permission to use those pieces of music in LineWars II, and I converted the original MOD files to a sort of hybrid MIDI/MOD format, in order to play the same music on a Creative SoundBlaster, Gravis UltraSound or Roland MT-32, which were the main audio devices at that time. By far the best music quality could be achieved by playing the music via a Roland MT-32 sound module. However, the only way to play that hybrid MIDI/MOD song format was within LineWars II itself, and I was not sure if I could somehow record the music from the game, now 24 years later!

After some experiments and a lot of googling, I managed to run the original LineWars II in DOSBox, together with the Munt Roland MT-32 software emulator and Audacity, and record the music into WAV files with a fully digital audio path, so the music actually sounded better than it had ever sounded on my real Roland LAPC-1 (an internal PC audio card version of the MT-32 sound module)! So, the music was sorted. What else might I already have that I could use in this project?

Missä Force Luuraa

That is the title of a Finnish Star Wars fan film from 2002. The film never got finished or released, but I created some 3D animation clips for it, as I was just learning to use Cinema 4D at that time (that was the only 3D modeling package I could understand after experimenting with the demo versions of many such packages). Now, as I was going through my old backup discs of various old projects, I found the scene files for all these animation clips. Looking at them brought back memories, but they also contained some interesting scenes, for example this tropical planet. This would fit nicely into LineWars VR, I would think, as pretty much all the missions happen near a planet.

Snow Fall

Back in 2002 I started working on a 3D animation fan film myself, the setting of my fan film "Snow Fall" being the events of a chapter of the same name in the late Iain M. Banks' novel "Against a Dark Background". This project has also been on hold for quite a while now, but every now and then I do seem to get back to working on it. The last time I worked on it was in 2014, when I added a few scenes to the movie. It is still very far from finished, and it seems that 3D animation technology progresses much faster than I can keep up with my movie, so it does not seem like it will ever get finished.

In any case, I spent a lot of time working on a detailed ship cockpit for this animation project. I believe I can use at least some of the objects and/or textures of the cockpit in my LineWars VR project. This cockpit mesh has around a million polygons, and uses a lot of different materials, most of which use reflections (and everything uses shadows), so I will need to optimize it quite a bit to make it suitable for real-time game engine rendering. Here below is a test render of the cockpit from June 28th, 2003.

Learning Unity

As I haven't worked with Unity before, there are a lot of things to learn before I can become properly productive with it. I am familiar with C#, though, so at least I should have no trouble with the programming language. I have been reading chapters from the Unity manual every evening (as bedtime reading :), and thus have been slowly familiarizing myself with the software.

Explosion animation

One of the first things I experimented with in Unity was a texture animation shader, which I plan to use for the explosions. I found an example implementation by Mattatz on GitHub, and used that for my tests. I also found free explosion footage on Videezy, which I used as the test material. This explosion did not have an alpha mask, but it seems that none of the explosion animations that do have an alpha mask are free, so I think I will just add an alpha mask to this free animation myself.
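
The core of such a texture animation is simply stepping through the cells of a sprite sheet frame by frame. A minimal sketch of the same idea, done from a C# script by animating the material's tiling and offset (so not Mattatz's shader-based implementation, and the sheet layout values are just examples), could look like this:

    using UnityEngine;

    // Sketch: play an explosion sprite sheet by stepping the material's texture
    // tiling and offset one cell per frame.
    public class ExplosionFlipbook : MonoBehaviour
    {
        public int columns = 8, rows = 8;      // layout of the sprite sheet
        public float framesPerSecond = 30f;
        Material material;

        void Start()
        {
            material = GetComponent<Renderer>().material;
            material.mainTextureScale = new Vector2(1f / columns, 1f / rows);
        }

        void Update()
        {
            int frame = (int)(Time.time * framesPerSecond) % (columns * rows);
            int x = frame % columns;
            int y = frame / columns;
            // UV coordinates start from the bottom left, so flip the row index.
            material.mainTextureOffset = new Vector2(x / (float)columns, 1f - (y + 1) / (float)rows);
        }
    }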