LineWars VR Blog Posts

Oct 19th, 2018 - Cruiser Bridge Work

I was a bit sick towards the end of September, so I fell somewhat behind schedule with my project. It does look like my internal goal of getting the game done by the end of this year would not have been met even if I had not gotten sick, as I still have so much work to do. So, the new goal is to get the game released before next summer. Anyways, during the past month I have mostly been working on the Cruiser Bridge object, because in several missions you can pilot a battlecruiser in addition to the fighter ships.

Teleport Between Different Types of Ships

The first step in making it possible to pilot a battlecruiser was to enable teleporting between different types of ships. As in LineWars II, in LineWars VR you can teleport between all the friendly ships, and you also get automatically teleported ("Emergency teleport!") to another ship when the ship you are currently piloting gets destroyed. Teleporting to a different type of ship (for example from a Cobra fighter to the battlecruiser) requires all the cockpit mesh objects and instruments to be switched over, and the camera also needs to move to the correct position in the cockpit of the new ship.

I solved this problem by having all the needed cockpits and their instrument objects in the scene, with the cockpits that are not currently active disabled. I changed my instrument handling code to have an array of InstData classes, each of which contains the necessary data (including pointers to the scene objects, the camera positions, UV coordinate indices and so on) for a certain cockpit type. When the user teleports between different ship types, I switch the index into that InstData array, activating the scene objects of the new InstData item and deactivating those of the old one. This way I am always handling only the instruments of the currently active cockpit type.
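
To illustrate the idea, here is a rough sketch of what such a cockpit-switching setup could look like in Unity C#. The class layout and field names here are my own illustration, not the actual LineWars VR code:

    using UnityEngine;

    // Rough sketch of the cockpit-switching idea; the field names are illustrative.
    [System.Serializable]
    public class InstData
    {
        public GameObject[] cockpitObjects;   // cockpit mesh and instrument objects for this ship type
        public Vector3 cameraPosition;        // local camera position inside this cockpit
        public int[] displayUVIndices;        // UV coordinate indices for the instrument displays
    }

    public class InstrumentManager : MonoBehaviour
    {
        public InstData[] cockpits;           // one entry per ship type (Cobra, Cruiser, ...)
        int active;                           // index of the currently active cockpit type

        public void TeleportTo(int newType, Transform cameraRig)
        {
            foreach (GameObject go in cockpits[active].cockpitObjects) go.SetActive(false);
            foreach (GameObject go in cockpits[newType].cockpitObjects) go.SetActive(true);
            cameraRig.localPosition = cockpits[newType].cameraPosition;
            active = newType;                 // from now on, only these instruments get updated
        }
    }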

Additional Shadows to Cruiser Bridge

After generating a neat procedural shadow system for the Cruiser object (as described in the previous blog post), I began experimenting with adding some additional vertex-specific shadows to the Cruiser Bridge model as well. Back in April I had created a shadow system that handles all the windows and window struts, so that only the areas where the sun shines through the windows are lit. However, this still left all the other parts of the bridge that should cause shadows unhandled. For example, it looked pretty fake when the support leg of the weapons officer instrument console caused no shadows on the floor, even though the floor had shadows from the window struts.

The original fragment shader from April had performance like this:

  4 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   52      6       1       A
  Shortest Path Cycles:   1       1       1       A, L/S, T
  Longest Path Cycles:    8.5     6       1       A
I began optimizing the fragment shader, using the tricks I learned when coding the Cruiser fragment shader, like using the abs() function where possible in the shadow checks. I also moved the dot(v.vertex.xyz, _ShadowsLightDir) calculation from the fragment shader to the vertex shader, as that changes linearly within the polygon. I then added the additional two-plane shadow checks using code from the Cruiser object, like this:
    if ((abs(i.shadowPos.x) <= i.shadowData.x && abs(i.shadowPos.y) <= i.shadowData.y) ||
        (abs(i.shadowPos.z) <= i.shadowData.z && abs(i.shadowPos.w) <= i.shadowData.w))
        return shCol;
I was very happy to notice that the longest path cycle counts of the new code stayed exactly the same as in the original code, even though the shader is now able to handle two extra shadow planes in addition to the window shadows for each fragment!
  4 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   44      6       1       A
  Shortest Path Cycles:   1.5     2       1       L/S
  Longest Path Cycles:    8.5     6       1       A

Okay, so the fragment shader change was a simple one, but the vertex shader needed a lot more work, as I needed to add the shadow texture lookups and the shadow plane calculations. I decided to test a system where the vertex shader itself calculates the sun direction "quadrant", instead of having separate sets of color and UV2 arrays like I had in the Cruiser object. I thought I could get by using much fewer shadow planes here, so I split the 256-column texture into four 64-column blocks, depending on whether the sun is behind, in front, left, or right of the ship. I did not think I needed to worry about the up/down direction, as the windows are mostly on the upper half of the cruiser bridge, so if the sun is below the ship, there are not a lot of objects causing extra shadows anyways.

As the texture only has 256 different values per color channel, I needed to figure out how to map these values to the cruiser bridge coordinates. The bridge is 12 meters wide, 3.75 meters high, and 7 meters deep (or actually even deeper, but I am only interested in the area that the pilot normally sees). After looking at the structure of the bridge, I noticed that the highest items that would cause shadows are only 245 cm above the floor, so I could neatly fit the Y coordinate into 256 values if I used 1 cm granularity. Also, any objects causing shadows in the Z direction are between 0.6 and 5.7 meters along the Z axis, so that 5.1-meter range would fit into 256 values if I used 2 cm granularity for the Z coordinate. This still left the X coordinate, with its difficult 12-meter range.

I decided to split the X coordinate into negative and positive halves, as the bridge is very symmetrical. I also decided not to have any shadow-causing objects on the far edges, so I could use the same 5.1-meter range with 2 cm granularity for the X coordinate. This just meant that I had to use some extra code in the vertex shader to determine whether the X coordinate should be negative or positive. For the other axes I could simply choose a suitable zero position: for the Y coordinate this is naturally the floor (at -2.25 meters in my object), and for the Z coordinate it is the 0.6-meter Z position.
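
For clarity, this is roughly how those mappings could be written in the texture generation code. This is a minimal sketch with my own helper names, just matching the granularities and zero positions described above (the vertex shader further below undoes the scaling with factors like 2.55 and 5.1):

    using UnityEngine;

    public static class ShadowCoordEncoder
    {
        // Map bridge coordinates into 0..255 texture values.
        public static byte EncodeY(float y) { return (byte)Mathf.RoundToInt((y + 2.25f) / 2.55f * 255f); } // 1 cm steps up from the floor
        public static byte EncodeZ(float z) { return (byte)Mathf.RoundToInt((z - 0.6f) / 5.1f * 255f); }   // 2 cm steps from z = 0.6 m
        public static byte EncodeX(float x) { return (byte)Mathf.RoundToInt(Mathf.Abs(x) / 5.1f * 255f); } // 2 cm steps, sign handled in the shader
    }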

In my Cruiser vertex shader, I had used three separate plane configurations, with the planes always aligned with the coordinate axes. However, as I already had angled shadow planes for the side windows of both my Cruiser Bridge object and my Cobra cockpit, I thought I could try to use the full plane equation instead of forcing the shadow planes to be axis-aligned. The problem with this was that the plane equation has a constant d term, which should have full float accuracy, so I could not put that value into the texture. At first, I thought about adding another UV coordinate set to handle this value, until I realized that I could actually calculate the d term in the shader!

The d term of the plane equation is actually the negative value of the dot product of any point in the plane and the plane surface normal. I could use the center point of the plane (which I would need in the shader anyways, to be able to use the abs() method of checking the plane extents) as the "any point", and the surface normal should be a unit vector, so it could be put into the texture, same as any standard Normal Texture that Unity uses.

Next, I spent some time simplifying the equation for the vertex projection onto the shadow plane that I would need in the vertex shader. These are the terms I use in the following equations:

V = vertex (point), N = plane normal vector (unit length), C = plane center (point), L = light vector (unit length)
As described in the algebraic method for the Ray-Plane intersection, the starting point for projecting the vertex onto the shadow plane (in other words, determining the intersection point of the plane and the ray starting at the vertex and following the light vector) is this full equation (where the constant term d is replaced by the full -dot(C,N)):
V + (-(dot(V,N) + -dot(C,N))/dot(L,N)) * L
For my purposes, I still needed to subtract the plane center point from that result (to center the interpolators around the plane in order to use abs() less-than checks for the plane boundaries), so the actual equation I used was the following:
interpolators = V + (-(dot(V,N) + -dot(C,N))/dot(L,N)) * L - C
Writing it out for the X coordinate (as an example) produced the following equation:
X interpolator = Vx - (Vx*Nx+Vy*Ny+Vz*Nz-(Cx*Nx+Cy*Ny+Cz*Nz)) / (Lx*Nx+Ly*Ny+Lz*Nz) * Lx - Cx
Looking at that equation I noticed there were a few duplicated terms, and thus I was able to simplify the equation by subtracting the center point from the vertex separately:
X interpolator = (Vx-Cx) - ((Vx-Cx)*Nx+(Vy-Cy)*Ny+(Vz-Cz)*Nz) / (Lx*Nx+Ly*Ny+Lz*Nz) * Lx
This reduced the number of dot products from three to two, also getting rid of the constant d term in the process. Here below is the actual vertex shader code, where you can see these equations being used. The first line of the code calculates the (horizontal) index into the texture, based on the quadrant of the light vector and the input green channel of the mesh color. I set up these mesh colors in my MeshPostProcessor code that gets run when the mesh gets imported into Unity. I also generate the texture image in this code. The first row of the texture contains the plane boundary extents, the second row contains the first plane normal and center X-coordinate, the third row similarly the second plane normal and center X-coordinate, with the last row containing the Y and Z-coordinates of the plane centers.

    // Calculate polygon-specific shadows
    fixed idx = v.color.g + sign(_ShadowsLightDir.x)/8.0 + 0.125 + sign(_ShadowsLightDir.z)/4.0 + 0.25; // Texture index + light dir quadrant of the texture to use
    // Shadow extents are in the order xzxy (sort of like using Y-plane and Z-plane)
    o.shadowData = tex2Dlod(_ShadowTex, float4(idx, 0.1, 0, 0)) * float4(5.1, 5.1, 5.1, 2.55);	// Y has 1 cm granularity, other axis 2 cm
    half4 n1 = tex2Dlod(_ShadowTex, float4(idx, 0.3, 0, 0)) * half4(2,2,2,5.1) - half4(1,1,1,0); // Plane 1 normal + X center
    half4 n2 = tex2Dlod(_ShadowTex, float4(idx, 0.6, 0, 0)) * half4(2,2,2,5.1) - half4(1,1,1,0); // Plane 2 normal + X center
    half4 c = tex2Dlod(_ShadowTex, float4(idx, 0.8, 0, 0)) * half4(2.55, 5.1, 2.55, 5.1) + half4(-2.25,0.6,-2.25,0.6); // Plane 1 yz and Plane 2 yz
    // We are only interested in the vertex position relative to the shadow plane center
    half2 t = v.vertex.x >= 0.0 ? half2(n1.w, n2.w) : half2(-n1.w,-n2.w);  // Plane X-center signs follow the vertex X-coordinate signs
    half3 a = v.vertex.xyz - half3(t.x, c.x, c.y);
    half3 b = v.vertex.xyz - half3(t.y, c.z, c.w);
    // Project the vertex onto the shadow planes
    a = a - dot(a, n1.xyz) / dot(_ShadowsLightDir, n1.xyz) * _ShadowsLightDir;
    b = b - dot(b, n2.xyz) / dot(_ShadowsLightDir, n2.xyz) * _ShadowsLightDir;
    o.shadowPos = half4(a.x, a.z, b.x, b.y);

The vertex shader performance before adding all these lines of code was like this:

  8 work registers used, 8 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   32      23      0       A
  Shortest Path Cycles:   17.5    23      0       L/S
  Longest Path Cycles:    17.5    23      0       L/S
After adding the new code, the performance changed to this:
  7 work registers used, 7 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   53      21      4       A
  Shortest Path Cycles:   28      21      4       A
  Longest Path Cycles:    28      21      4       A
I was actually able to save on the load/store operations (which are the more critical GPU cycles, as they are subject to possible stalls due to cache misses). The arithmetic operations increased quite a bit, though, so it is now even more important to try and keep the vertex count of my Cruiser Bridge as low as possible.

Weapons Officer Avatar Work

The Cruiser Bridge would be rather empty if it did not have any other people besides the player. Thus, I always wanted to populate the bridge with a couple of human characters, at least the weapons officer and the navigator, sitting below and in front of the captain of the cruiser. The problem with these extra human characters was that I would need to model their heads! Modeling a human head convincingly is notoriously difficult. Luckily my characters were facing away from the player, so at least I did not need to worry about their faces!

Even with their faces hidden, I would still need to model some hair and their ears, which felt both rather difficult and expensive considering the polygon count that would be needed for convincing results. I thought about having them wear helmets, but that felt somewhat silly. Nobody on Star Trek or Star Wars wears a helmet on the bridge of a capital ship, so they should not do that in LineWars VR either! I then figured that they would need to be wearing some communications devices in their ears anyway, so why not have them wear full-blown headphones? I could then easily hide their ears and part of their hair underneath the headphones (which would be considerably simpler to model, just some donuts and cylinders joined together)!

So, I went to work, copying the pilot avatar legs and arms, and then going back to my old animation project character for the head and hair. I reduced the head polygon count considerably, and then began modeling the headphones. The head with the hair ended up using 300 vertices, and the headphones 257 vertices. The headphones have a microphone, which probably uses a bit too many vertices, as it is only a few pixels in size in the game, so I will perhaps still adjust that object a bit. The rest of the character uses 506 vertices, so the total number of vertices is a bit over a thousand. All in all, I think I managed to create a pretty neat looking virtual person sitting at the weapons officer console. Much of the character is hidden behind the seat, so I only bothered to model the parts that will be visible.

Cruiser Bridge Consoles

Until now my weapons officer and navigator consoles had been without proper textures. I grew tired of looking at them, and decided to finally create some textures for them. I reserved a 190x600 area of the common cockpit texture for these consoles, and began looking for some reference images. I made some Google image searches with "star ship bridge consoles" and similar search terms, and found a lot of images, mostly of Star Trek bridges. These were not exactly what I wanted, but I found some display panel images that looked pretty good, so I used those as a starting point, and then began working on my texture.

I wanted the consoles to consist of two parts, an upper display section and a lower keyboard section. For the display part I ended up with two large displays and two small displays, plus a warning light panel in front of the console operator, and just two large panels and the warning lights on the side console. I also decided to reuse the left/right display images, so that the left-hand side console shows the same image as the right-hand forward console, and vice versa. This saved on the texture memory usage and allowed me to have larger display images. For the keyboard part I just took some real keyboard images, shrank them to a suitable size, and then added some side key panels, with red keys for the weapons officer and blue keys for the navigator.

I had also noticed when watching some Star Citizen videos that they seem to have text scrolling on various secondary displays in their larger ships. I thought that was a neat idea, and made one of the smaller displays on either side behave similarly. For the scrolling text, I found out that I could load my LineWars VR project notes into Irfanview, which converted them to an image, and then I could just crop a suitable part of that image! Since I already have to update the UV coordinates of various indicators on the pilot display panels (like the PDC ammo counters), I added these scrolling displays to my instrument object and added simple code to scroll the UV coordinates every now and then. The result looked pretty nice!
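
The scrolling itself is very simple. Here is a minimal sketch of the idea; the field names, the update interval, and the quad setup are my own illustration, not the actual instrument code:

    using UnityEngine;

    // Every now and then, shift the V coordinates of the scrolling display's quad
    // so that a different slice of the notes image is shown.
    public class ScrollingDisplay : MonoBehaviour
    {
        public Mesh instrumentMesh;       // mesh that contains the display quad
        public int[] displayUVIndices;    // indices of the quad's UV entries
        public float step = 0.01f;        // how far to scroll per update, in UV space
        public float interval = 0.5f;     // seconds between scroll steps

        float timer;

        void Update()
        {
            timer += Time.deltaTime;
            if (timer < interval) return;
            timer = 0f;

            Vector2[] uv = instrumentMesh.uv;
            foreach (int i in displayUVIndices)
                uv[i].y = Mathf.Repeat(uv[i].y + step, 1f);  // wrap around the texture strip
            instrumentMesh.uv = uv;
        }
    }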

Here below is what the weapons officer console looks like with the sun shining on it. This image also shows the weapons officer avatar and various extra shadow planes, for example for the yellow side railings and the console leg. There are still many areas that should have additional shadows, but many of those would require more than two shadow planes, or are otherwise difficult to set up, so I am thinking I would rather leave those out than have shadows that look weird.

The bridge walls are still pretty much work-in-progress as far as the textures are concerned. Here I am experimenting with a sort of metal mesh grid texture for the walls (assuming that keeping a space ship as light as possible is still advantageous in the future). Below is what the console looks like when the sun is not shining on it, with the displays and buttons glowing in the dark.

Compute Shader Experiment

After testing the PDC bullet movement code on the actual Gear VR device, it looked like there were some infrequent frame skips whenever there were a lot of bullets flying. So, I decided to revisit the bullet movement code. My original code used up to 400 separate GameObjects, all using the LineRenderer to draw a simple line. I used local coordinates for the line ends, letting Unity then dynamically batch all the separate GameObjects into a single big mesh for rendering. I thought that perhaps it would be smarter to have just a single GameObject, which could even be located at the world origin, and then just use world coordinates for the line ends.

After making that change, I then began to wonder whether a compute shader would be usable when targeting Android devices. It looked like my Samsung Galaxy S6 does support compute shaders, even though the Unity editor (when targeting Android) does not. I found a good YouTube tutorial for Unity Compute Shaders, and decided to experiment with the ideas shown in that tutorial.

After a lot of trial and error (and some bug hunting, and fighting with an issue where the actual display shader does compile but only produces a purple result) it seemed like my Samsung Galaxy S6 does not support StructuredBuffers in the vertex shader (as per this Unity forum thread). That was pretty annoying. My fallback option was to move the vertices (and handle the collision tests) in the compute shader, then use GetData to transfer the vertices from GPU to CPU, and then use mesh.vertices to send them back from CPU to GPU. This is far from optimal, but it seemed to finally allow my compute shader to work.
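
As a rough sketch of that fallback path (the kernel, buffer, and property names here are my own, not the actual project code), the C# side looked something along these lines:

    using UnityEngine;

    public class BulletComputeFallback : MonoBehaviour
    {
        public ComputeShader bulletShader;    // compute shader with a "MoveBullets" kernel (name is illustrative)
        public Mesh bulletMesh;               // combined mesh containing all the bullet line vertices

        ComputeBuffer vertexBuffer;
        Vector3[] vertices;
        int kernel;

        void Start()
        {
            vertices = bulletMesh.vertices;
            vertexBuffer = new ComputeBuffer(vertices.Length, sizeof(float) * 3);
            vertexBuffer.SetData(vertices);
            kernel = bulletShader.FindKernel("MoveBullets");
            bulletShader.SetBuffer(kernel, "Vertices", vertexBuffer);
        }

        void LateUpdate()
        {
            bulletShader.SetFloat("DeltaTime", Time.deltaTime);
            bulletShader.Dispatch(kernel, vertices.Length / 64 + 1, 1, 1);  // assuming 64 threads per group
            vertexBuffer.GetData(vertices);      // this synchronous readback was the expensive part
            bulletMesh.vertices = vertices;      // send the moved vertices back to the GPU
        }

        void OnDestroy() { vertexBuffer.Release(); }
    }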

I then checked the resulting Stopwatch ticks when running the game on my Gear VR (as I had noticed that System.Diagnostics.Stopwatch gives sensible-looking values also on Android, not only when running in the Editor). Originally the code that moved the bullets and tested them for collisions (against the target and viewer ship) took around 2750 ticks to run. My new compute shader actually checked collisions with the 8 closest ships, ordered by priority so that close enemy ships always get included, and neutral objects like asteroids get left out of the collision tests if there are already eight more important close objects. It took only around 500 ticks per frame to prepare the data for the compute shader, but calling the GetData routine to get the new vertex positions and the hit test results from the compute shader to the CPU code took a mind-boggling 110000 ticks! I had read that the GetData call needs to do some synchronization between the GPU and CPU and thus may take a while, but I did not think it would take two orders of magnitude longer than just moving the bullets on the CPU! In addition to this, setting the vertices of the mesh took another 200 ticks.

So, it seemed like my experiments with the compute shader were mostly wasted time. I decided to use the shader code rewritten in C# on the CPU for my bullet movement and collision tests. It takes between 2000 and 6000 ticks when moving 512 bullets that never hit their targets, but in a real situation with only 384 bullets the code mostly takes around 1100 ticks, which I thought was pretty acceptable. I just need to optimize some of my other code if I begin to experience frame drops. I have also considered porting some of my code to a native plugin, but haven't yet had a pressing need for that. I may look into it if I need to make some major code speedups.

AI Ship Movement Improvements

My old LineWars II game had a demo game, where you could just watch the game play against itself, in the form of a group of Cobra fighters attacking an enemy StarBase protected by Pirate ships launching from it. I wanted to have a similar demo game in LineWars VR, although being virtually in a ship that moves and bounces around is a pretty sure recipe for nausea. However, you can easily take control of the ship by pressing the Teleport key, so I thought this could work in the VR environment as well. The problem was that my ships steered around so jerkily that even I got nauseous within just a few seconds of watching the demo game! Something needed to be done about that.

I spent some time looking into various filtering algorithms, but did not find a suitable one for my needs. I thought the ideas in a standard PID Controller were applicable, but my problem was that the set point varies, as the ship is targeting a moving object. After a lot of tinkering with the code, I did finally manage to create a system that sort of follows the PID controller principle, using only the proportional and derivative terms of the control loop. My ships all have a maximum turning rate that they cannot exceed, but I used the PID controller when determining how much to change the current turning rate. This made the ships rotate much more smoothly, so that I did not immediately get nauseous when watching the demo game.
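
The core of that idea looks roughly like the following sketch; the gain values are illustrative, and this is not the actual steering code, just the principle of a proportional-derivative update of the turning rate:

    using UnityEngine;

    public static class TurnRateController
    {
        const float Kp = 2.0f;   // proportional gain (illustrative)
        const float Kd = 0.5f;   // derivative gain (illustrative)

        // error = angle (in degrees) between where the ship points and where it should point.
        public static float UpdateTurnRate(float currentRate, float error, float previousError,
                                           float maxRate, float deltaTime)
        {
            float derivative = (error - previousError) / deltaTime;
            float change = (Kp * error + Kd * derivative) * deltaTime;   // how much to change the current turning rate
            return Mathf.Clamp(currentRate + change, -maxRate, maxRate); // never exceed the maximum turning rate
        }
    }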

Spacescape for Skyboxes with Nebula Backgrounds

A while ago I had already run into a program called Spacescape by Alex Peterson when I was looking for some skybox ideas. This is a neat free tool for creating space skyboxes with stars and nebulas. I just hadn't had time to look into the program further. Here below is a picture of one of the sample scenes in the program.

I wanted to finally look into this program a bit more thoroughly, mainly to see how I could use it together with the environments (the planet, moons and a sun) I had created in Cinema 4D. I took one of the sample files (called "Green Nebula") and played around with it a bit (changed the green color to blue, and changed some star colors as well), and it began to look quite usable for my game. I figured there would probably be two ways to combine Spacescape with Cinema 4D: I could either render my planets and such from Cinema 4D separately and use them as billboard textures in Spacescape, or I could use the Spacescape images as background images in Cinema 4D.

I decided to first experiment with the latter option, as I am still much more familiar with Cinema 4D than with Spacescape. I replaced my background stars in my C4D skybox scene with a Background object, and attached the correct image from the Spacescape export images as the Background texture. I then rendered out the image from Cinema 4D, and compared the result with the Spacescape image. The images were correctly sized and oriented, but the stars in the Cinema 4D output were dimmer. I tried to figure out what caused this. I suspected too heavy antialiasing, and switched that from Best to Geometry, but that did not help. I then added some sharpening, but that did not help either. Finally, I figured out that the smoothing was caused by the texture sampling in Cinema 4D, which by default uses MIP sampling. I switched that to None, so that the background texture gets handled pretty much as-is, and finally got an image where the background stars and nebula were practically identical to the original image from Spacescape, with the planet added to the foreground. Now I just need to get more familiar with Spacescape to be able to add a nice variety of space background skyboxes to my game.

Next Steps

Next, I believe I will need to work on the cruiser damage textures and the collision detection. I would also need to create the female navigator avatar, and create some objects and textures on the rear of the cruiser bridge, and after those I should be able to make the third mission playable. The second mission is actually only waiting for the scoring system, as I do not want to make several missions and then add the scoring system to all of them separately. After that I could have the first five missions running. The sixth mission needs the player to be able to control a pirate fighter, so the next step will be to create the pirate ship cockpit. Then I still need to create the alien mothership, to be able to finally add all the single player missions. Then I can start working on the multiplayer features, and the configuration screen. Still a lot of work to do, in other words!

Thanks again for your interest in my LineWars VR game project!

Sep 15th, 2018 - Cruiser Work

During the past month I have mainly worked on the Cruiser game object. I finished the object mesh, improved and completed the dynamic self-shadowing system, textured the model, and also worked on the PDC (point defense cannon) rotating and firing system. These battlecruisers feature in several of the missions in my game, so along with the Space Station they are the "hero" objects of the game.

Modeling the Cruiser

I wanted to have the cruiser model relatively complex, but because in many missions there will be several cruisers in the scene, it should not be overly complex. Also, as I needed to manually configure all the shadow planes for my dynamic self-shadowing system, I did not want to have to do a huge amount of work. I ended up with a somewhat cigar-shaped design, with a shuttle landing pad in the center, a missile tower on top, some recessed living quarters on the sides, and eight PDC extrusions strategically placed to have maximum coverage. The image below shows only one of the PDC guns, as I decided to actually generate these programmatically while importing the mesh. This way I could exactly determine the normal vectors and UV coordinates, and which vertices are shared between the triangles. It was important to have as few vertices for each PDC as possible, as I plan to have all of them moving and tracking enemy targets.

Shadow System Revamped

As I described in my Apr 21st, 2018 blog post, I had already back in April worked on a self-shadowing system for my Cruiser model. I had manually configured the few shadow areas I had, mainly just to confirm my system worked. I had left many of the needed shadows still unconfigured, as the model itself was not yet finished. Now that my model was much more complex, I thought it was too much work to configure everything manually the way I had done previously. So, I decided to try to do as much of the configuration programmatically as I could.

The first step was to move the shadow configuration from the Awake() routine to my MeshPostProcessor() routine, so that it gets done while the model is imported. At first, I created code that generates a new C# source code file containing a static array. However, as this file was over two megabytes in size and had over 16,000 array items, it took forever to parse when the game was loading! In the editor it only took a few seconds, but on the actual Gear VR I waited a couple of minutes for the scene to start, and when it still had not started, I abandoned this idea. I am more used to working with non-managed languages, where such a large array in a source file simply gets compiled into binary data and is as fast as any binary data. It looks like in C# such a large array initializer gets executed at runtime, which obviously is not what I wanted. Since that did not work, I decided to simply write a JSON file containing the data, and load that as a resource when the game starts. This seemed to work much better.
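
A minimal sketch of the loading side of that approach, using Unity's JsonUtility and the Resources folder (the class and file names here are illustrative, not the actual project code):

    using UnityEngine;

    // Illustrative container for the generated shadow plane data.
    [System.Serializable]
    public class ShadowPlaneData
    {
        public int[] polygonConfigs;    // per-polygon shadow configuration indices
        public Vector4[] planeData;     // shadow plane centers, extents and slopes
    }

    public static class ShadowDataLoader
    {
        public static ShadowPlaneData Load()
        {
            // Reads Assets/Resources/CruiserShadowData.json, written by the shadow data generator at import time.
            TextAsset json = Resources.Load<TextAsset>("CruiserShadowData");
            return JsonUtility.FromJson<ShadowPlaneData>(json.text);
        }
    }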

In my shadow data generator, I manually configure these three pieces of information for every shadow plane:

All the rest is programmatically handled, for example determining which polygons should be affected by which of my shadow planes and combining the shadow planes affecting a certain polygon into one of the allowed combinations (X-plane and Y-plane, X-plane and Z-plane, or Y-plane and Z-plane). This system considers the polygon normal, and whether any part of the shadow plane is "visible" to the front side of the polygon. This saved a lot of work compared to the original system, where I had to manually determine the shadow planes for every polygon separately. There are currently 77 separate shadow planes configured for my Cruiser model.

However, while I was adding these shadow configurations, I realized that my shadow system was still lacking some features. For example, I could only have a slope on the X-coordinate of the shadow plane, but there were several occasions where I would have needed other slopes as well. I tried to add some new code into my vertex shader, but I could not figure out a way to add more features without introducing register spilling (meaning I ran out of available GPU registers). After fighting with this for a while, I decided to refactor the whole system.

At first, I looked into taking advantage of the new UV coordinate sets introduced in later Unity versions. Nowadays you can have up to 8 sets of UV coordinates, while I only had 4 in the Unity version I started coding LineWars VR with. However, adding even more data that I would have to update dynamically did not sound all that efficient. Instead of adding more data, I experimented with various ways to use the existing UV2, UV3, UV4 and Color arrays more efficiently. One idea I got was that instead of comparing the shadow plane interpolators against minimum and maximum values, I could compare just the absolute interpolator value against a single maximum value, if the plane center is at zero. This change was still not enough for all my needs, but it made the fragment shader somewhat more efficient. Originally the fragment shader shadow calculation looked like the following (it could only handle the maximum limit in the Y-direction, there were no checks for the minimum value):

    fixed sh = (i.shadowPos.x <= i.shadowData.x && i.shadowPos.x >= -i.shadowData.x && i.shadowPos.y <= i.shadowData.y) ||
               (i.shadowPos.z <= i.shadowData.z && i.shadowPos.z >= -i.shadowData.z && i.shadowPos.w <= i.shadowData.w) ? i.uv.z : i.uv.w;
with a performance like this:
  3 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   18      5       2       A
  Shortest Path Cycles:   5       4       2       A
  Longest Path Cycles:    6.5     5       2       A
It had 18 arithmetic operations, and spent 6.5 GPU cycles. The new code checks for two shadow planes using abs(), so it can handle limits in all directions:
    fixed sh = (abs(i.shadowPos.x) <= i.shadowData.x && abs(i.shadowPos.y) <= i.shadowData.y) ||
               (abs(i.shadowPos.z) <= i.shadowData.z && abs(i.shadowPos.w) <= i.shadowData.w) ? i.uv.z : i.uv.w;

The performance of the new code looks like this:
  3 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   14      5       2       A
  Shortest Path Cycles:   3       4       2       L/S
  Longest Path Cycles:    4.5     5       2       L/S
The number of arithmetic instructions dropped from 18 to 14, and the cycle counts by 2! That was a nice performance boost combined with better features, so it was a win-win situation! The reason why using abs() is so much faster seems to be that this way the shader can calculate all four comparisons in one go, as all the comparisons are of the same type (less-than-or-equal). The compiled code looks like this:
    u_xlatb2 = greaterThanEqual(vs_TEXCOORD4, abs(vs_TEXCOORD3));
    u_xlatb2.x = u_xlatb2.y && u_xlatb2.x;
    u_xlatb2.y = u_xlatb2.w && u_xlatb2.z;
    u_xlatb21 = u_xlatb2.y || u_xlatb2.x;
    u_xlat21 = (u_xlatb21) ? vs_TEXCOORD0.z : vs_TEXCOORD0.w;

Sadly, this new code only works when the shadow planes are symmetrical, which was the case most of the time but not all the time. Thus, I still needed to get more data into the shader. I read some documentation, and noticed that the vertex shader can also use texture lookups. Could I perhaps use a texture in some way? I only had a limited number of different shadow plane configurations (probably fewer than 256), so if I could just use one of the Mesh.Color components as an index into a texture, I could read all the rest of the data from the texture!

I began working on a system using a helper texture. I did not find good examples of using a texture to send several float values to a shader, so I decided to see what I could do with just the color values of a texture. These are basically integers between 0 and 255 converted to a float value between 0 and 1. So, the immediate problems were that there were no negative values available, and the limit of 256 different values was rather low. However, my Cruiser model is 182.5 meters long (so less than 256), and if I used one color value for positive values and another for negative values, I could easily fit the whole range of the ship into 0 to 255 even using half-a-meter steps. The shadow plane extents are obviously always positive, and I could probably fit all the required shadow plane edge slopes into the 0 to 255 range as well. All that was needed was to make sure all the extrusions and recessed areas in my model were situated at 50 cm steps. This was not much of a problem.
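
As a sketch of that encoding idea (my own helper, but consistent with the (tex2Dlod(...) - tex2Dlod(...)) * 127.5 pattern in the vertex shader code further below), one value holds the positive part and another the negative part of each coordinate, in half-meter steps:

    using UnityEngine;

    public static class ShadowValueEncoder
    {
        // Encode a signed coordinate (in meters) into two bytes with half-meter granularity.
        // The shader reconstructs the value as (positive - negative) * 0.5 meters.
        public static void EncodeSigned(float meters, out byte positive, out byte negative)
        {
            int steps = Mathf.RoundToInt(Mathf.Abs(meters) * 2f);
            positive = (byte)(meters >= 0f ? steps : 0);
            negative = (byte)(meters < 0f ? steps : 0);
        }
    }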

So, I decided to go with a shadow system like this in my vertex shader, using a 256x4 RGBA texture:

As you may have noticed from that list, I managed to drop the uv3 and uv4 lists completely! I also switched from Mesh.Colors to Mesh.Colors32, as that makes it easier to give integer index values to the shader. Thus, I have much less data to update the mesh with whenever the shadow orientation (sun direction relative to the ship) changes.

In the vertex shader I then have code to read these values from the texture, and calculate the needed shadowPos interpolator values. The shadowData values (shadow plane extents) are retrieved directly from the texture; I just add the slope modifier to these values. I have left out the shadow plane selection from this code to make it clearer; this shows just the "X and Y" plane version:

    float4 ip = (tex2Dlod(_ShadowTex, float4(v.color.r, 0.25, 0, 0)) - tex2Dlod(_ShadowTex, float4(v.color.r, 0, 0, 0))) * 127.5;
    // x plane and y plane
    ip = ip + pos.yzxz - _ObjLightDir.yzxz * (pos.xxyy - v.uv2.xxyy) / _ObjLightDir.xxyy; // Project the vertex onto the shadow plane
    o.shadowPos = ip;
    float4 slope = ip.yxwz * (tex2Dlod(_ShadowTex, float4(v.color.r, 0.5, 0, 0)) * 4*255.0/256.0 - 2);
    o.shadowData = tex2Dlod(_ShadowTex, float4(v.color.r, 0.75, 0, 0)) * 127.5 - slope;
I have to do some arithmetic with 127.5, as the values are mapped from the 0 to 255 range to the 0 to 1 range, and I need a resolution of 0.5 (as in value * 255 / 2). In the code the ip variable originally contains the values I need to shift the plane with (when the plane center is not at the origin). I add the vertex position projection along the sun ray to this variable, and send it to the fragment shader in the shadowPos interpolator. Then I multiply the projected coordinates with the requested slope, and subtract that from the shadow plane extents, before sending the extents to the fragment shader in the shadowData interpolator. Whether to add or subtract the slope is just a matter of convention; I decided to go with positive slopes making the shadow plane smaller, as that is the more common direction.

This system takes care of all my shadow plane needs, but how did the vertex shader performance change? The original code (which was lacking features) from April had the following performance characteristics:

  8 work registers used, 9 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   58      27      0       A
  Shortest Path Cycles:   30      27      0       A
  Longest Path Cycles:    30      27      0       A
The new code became slightly slower, but on the plus side it has much better features:
  8 work registers used, 9 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   62      27      4       A
  Shortest Path Cycles:   32      27      4       A
  Longest Path Cycles:    32      27      4       A
This is what the shadow texture looks like. As it contains data instead of an image, it does not look like much of anything, but you can see that I currently use 165 different shadow plane configurations (out of the maximum 255). Thus, I could still make my Cruiser slightly more complex, but not a lot more complex. The image is zoomed to 1024x16 so it is more visible. The reason why I have 77 shadow planes in the generator code and 165 shadow plane configurations in the texture is that the texture contains all the distinct pairs of the shadow planes.

Cruiser texturing

Okay, now I had modeled the Cruiser and had fixed the shadow system, but it still looked like a white box. The next step was to texture it. I wondered what the base color of the ship should be. It would make sense if the ship was painted black, to make it not stand out in the dark of space, but that would not be very good for the playability. Since the Cruisers are on the same side as the Cobra fighters, I decided to take the color of the Cobra fighter and just darken it a bit. This looked pretty good, so I decided to go with that color scheme.

I had put aside a quarter of my texture atlas (which is 2048x2048 texels in total) for the Cruiser, as I already use one quarter of it for the Space Station, and one eighth for the fighter ships. Almost a quarter is used by the animated textures, so that still leaves around one eighth for the upcoming Alien mothership. One quarter is only 1024x1024 texels, and as I wanted to have a similar damage state system as with the Space Station and the fighter ships, each of the four damage states can only use a quarter of that area. So, I had at most 256x1024 texels in total to use for the main texturing. With the Space Station I did not use the area very optimally, so I could only have three damage states instead of four as with the fighter ships. I decided to be a bit smarter with how I use the texture area, and tried to fit everything optimally into this texture block.

I also decided to use some pre-baked lighting using the luminance texture, much like I did with the spotlights shining on the fuel tanks of the Space Station. However, I wanted to handle the damage states better, so that when a spotlight gets shot out, the area it lights also gets dark. This meant having eight damage states for some areas instead of four, which made the available texture area even smaller. Thus, most of the panels can use only 128 texels by something less than that as their textures.

I ended up needing an area of 1024x1048 texels for the Cruiser, so I went a little bit over the budget. This should not be a problem, as my plan for the Alien mothership is to use some kind of weird animated fluid surface texture on it, without many features, so it should not need all that much space from my texture atlas. I still haven't worked on the damage states, so that is something I need to continue working on.

The image below is from within the game itself, and it showcases many of the features of this blog post.

  1. The main color is a darker version of the Cobra fighter color, a darker bluish grey. The main armor panel shapes are created using the normal texture.
  2. The shadows are clearly visible on the landing pad (shadow from the forward missile tower), forward top armor panels (shadow from the antenna), bow side (shadow from the RCS thruster pod), behind the PDC bases, and obviously in the recessed side areas, which are fully in shadow.
  3. The landing pad has its own lighting, so even the area that is in shadow is not completely dark. Similarly, the recessed side areas have lighted windows, which shine some light on the walls even when that side of the ship is in shadow. This is done using a luminance texture.
  4. The bow armor panels and especially the forward side armor panel edges show the tangent-space specular highlights. More about this after the image.
  5. The PDCs are pointing approximately towards the viewer. I describe the PDC system further down in this blog post.

Tangent Space Specular Lighting

In my other objects (the Space Station and the fighter ships) I have used a specular map to determine the areas (mostly windows) in the texture that should have specular highlights from the sunlight. I calculated the specularity amount in the vertex shader (as it requires the use of the pow function, which is rather slow). However, that meant that specular highlights could never occur between the vertices. This was fine for windows and other such small flat areas, and I also used the same system in the fighter ships, as those are pretty small. However, in order to get nice-looking specular highlights for my Cruiser, I had to move the specular highlight calculation to the fragment shader.

In my original vertex shader code, I had calculated the specular color component for each vertex like this:

    worldPos = mul(unity_ObjectToWorld, v.position);
    worldNormal = UnityObjectToWorldDir(v.normal);
    ...
    //------------------------------------
    // Precalculate the specular color (this method only suitable for small and flat polygons!)
    //------------------------------------
    float3 halfVector = normalize(_WorldSpaceLightPos0 + normalize(_WorldSpaceCameraPos - worldPos));	// In world coordinates
    o.specular = _LightColor0 * pow(DotClamped(halfVector, worldNormal), 100);
I performed the calculations in world space, as Unity provides the _WorldSpaceCameraPos uniform automatically. In order to move the calculation into the fragment shader, I would need to provide the input to the pow function to the fragment shader. I did not want to add several new interpolators to send the world-space vertex positions and surface normals to the fragment shader, so could I do the calculations in tangent space, as I already had both the tangent space normal and tangent space light direction in my fragment shader?

To be able to do all the calculations in the fragment shader and in tangent space, I needed to provide it with the tangent space camera direction. Unity only provides the camera position in world space, and even object space coordinates need to be specifically rotated to tangent space. This is too much work to do in the fragment shader, but as I already had the object-to-tangent-space rotation matrix available in the vertex shader, I decided to provide the object space camera position as a uniform vector from the C# code to the vertex shader. I can then use the same matrix to rotate the camera direction as I use to rotate the object space light direction. I then replaced the specular item in my fragment shader input with this tangent space camera direction:

    //------------------------------------
    // Calculate the specularity camera direction in tangent space.
    //------------------------------------
    float3 camVect = _ObjCameraPos - v.position;
    o.specular = float4(mul(rotation, normalize(camVect)), 0);
I then moved the original code from the vertex shader to the fragment shader:
    fixed3 halfVector = normalize(i.lightDirection + i.specular.xyz);	// In tangent space coordinates
    col = sunLight + sh * _LightColor0 * pow(DotClamped(halfVector, tangentSpaceNormal), 100);

This worked pretty well, until I realized that when I moved further away from the ship, weird artifacts began to appear in the specular highlights, as in the following image:

I immediately thought that this might be caused by some texture compression artifacts. After studying the problem for a while, it seemed that the mipmap levels actually cause this problem to appear. It is not even the specular map texture that causes this, but the normal texture! I tested forcing the texture not to use mipmap levels, and that got rid of the problem, but I don't think that is a good solution, as it can cause texture flickering. Perhaps I could somehow interpolate between using the normal texture and just the plain surface normal?

I spent some time thinking about how I could get the plain polygon surface normal into my fragment shader, until I realized what a stupid question that is! I am in tangent space, which by definition means that the polygon surface normal points to (0,0,1)! So, what was left was just to figure out a way to switch between the tangent space normal from the normal map and the plain polygon surface normal, depending on the distance of the ship.

I already had all the data available; I just adjusted my vertex shader code to also send a value indicating the distance to the ship (I determined on the Gear VR that the distance where the artifacts begin to appear is around 100 meters). Thus, I changed the vertex shader code to look like this (I just manually "normalize" the vector, so that I can use the length-based value as the fourth component of the vector):

    //------------------------------------
    // Calculate the specularity camera direction in tangent space.
    // Also determine the distance from the camera, to determine whether to use
    // normal-texture-based specular highlights (when close) or just surface-normal-based highlights (when far away).
    //------------------------------------
    float3 camVect = _ObjCameraPos - v.position;
    float camDist = length(camVect);
    o.specular = float4(mul(rotation, camVect/camDist), saturate((camDist-70)/50));
To take advantage of this distance value, I added just a single line into my fragment shader:
    tangentSpaceNormal = lerp(tangentSpaceNormal, fixed3(0,0,1), i.specular.w);
After all these changes, I realized that I don't actually need the world space vertex position or normal for anything in my vertex shader! I removed those (and changed the remaining code that used them to use the tangent space values), and got a nice performance boost for my vertex shader! If you remember from above, my vertex shader used to have 62 arithmetic and 27 load/store operations. These both dropped down considerably, mostly because I did not need to use the unity_ObjectToWorld matrix at all anymore! The resulting vertex shader only uses 22.5 arithmetic cycles and 22 load/store cycles per vertex, which is pretty good, considering all the shadow plane handling.
  11 work registers used, 8 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   40      22      4       A
  Shortest Path Cycles:   22.5    22      4       A
  Longest Path Cycles:    22.5    22      4       A
The fragment shader sadly got a bit slower again, but on the other hand it is not any slower than it was when I started improving it. However, the original code used 18 arithmetic operations totaling 6.5 GPU cycles, while the new code uses 14 operations to spend the same number of cycles. This shows how much of an effect the pow operation really has on performance.
  5 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   14      5       2       A
  Shortest Path Cycles:   6.5     5       2       A
  Longest Path Cycles:    6.5     5       2       A
One weird thing about those results was that I had thought the shader has only 8 work registers available in the vertex shader and 4 in the fragment shader, as I have usually gotten spilling whenever the work register count exceeded those limits. This time both of the shaders use more work registers than that, but still don't need spilling. Perhaps I was wrong about those limits, or perhaps this time the shader can make smarter use of half-resolution work registers.

PDC work

After I got the Cruiser model itself mostly done, I started figuring out how to handle the PDC guns. I wanted to have all the guns individually targeting enemy ships and shooting streams of bullets, just like a real minigun. I went back and forth between having a separate game object for each gun and having the guns be a part of the main Cruiser mesh, dynamically updating the mesh when the guns move. Both of these have their own advantages and disadvantages, and I was not sure which system would be better. In the end I decided to at least experiment with having the guns be a part of the main Cruiser object, so that I can save on Draw Calls and can easily get the shadows to affect the guns as well. Since my shadow system cannot handle shadows from moving parts, I decided not to even attempt to have the guns cause shadows. Instead, I added spot lights to the back wall of each gun recess, which shine a light towards the PDC base. This should fool the eye, so that the guns not causing shadows is not as noticeable.

The first step was to model the PDC gun. It should have six barrels, which I planned to handle using a texture, so I used just a six-segment cylinder as the barrel. The rest of the gun is just a simple box, and since each corner of a box needs triple vertices in Unity anyway (one for each of the three normal directions), I decided to give my box rounded corners. This creates no additional vertices, so it is a free visual quality improvement. I created a simple model in Cinema 4D, and then added those vertex positions into my MeshPostProcessor code, so that it can append the vertices for all the PDC guns to the original Cruiser mesh vertex list. This way I could easily find the correct vertices on the fly when I needed to orient a gun towards its target.

Each gun turned out to need 44 vertices (in Unity). The box needs 8*3 vertices, the barrel takes 7*2 (two rings of 6 corners plus a duplicate for the UV seam), and the muzzle the remaining 6 vertices. Next, I generated a list of normals and tangents for these vertices. It is a bit annoying that I also need to update the normals and tangents when I update the vertex positions, but that can't be helped when I want to use a normal texture for some surface details, especially on the gun barrels.

Now that I had the vertices (and normals and tangents) set up, I began figuring out how to move them. In my original LineWars II, I used a system where I only rotated the corners of a box, and then calculated the positions of the ship vertices using simple arithmetic, like mirroring or averaging the positions. I decided to try something similar with the PDC vertex moving, so that I can avoid as many of the somewhat expensive rotation operations as possible. The first step was to determine where the PDC should point, with code like this:

    Quaternion rot = Quaternion.LookRotation(canShoot ? pdcDir : (left ? Vector3.left : Vector3.right), up ? Vector3.up : Vector3.down);
    Quaternion want = Quaternion.RotateTowards(dirs[i], rot, 180 * Time.deltaTime);  // 180 degrees per second rotational speed
Everything here uses the local coordinates of the Cruiser; dirs[i] holds the current rotation of this gun, pdcDir points towards its current target, and canShoot tells whether the gun has a valid target. The first line calculates the rot Quaternion, which tells us the wanted rotation for this gun. If we can't shoot at anything, we want to point towards the left or right, depending on the side this PDC is located on. The second line then calculates where we should point this frame, considering the maximum turning rate of the gun, which I decided to be 180 degrees per second.

Now we have the Quaternion telling us where we want to point during this frame. I then calculate the new positions for the 12 vertices at the back of the ammo box, and for the 6 vertices at the back of the barrel, using code like this:

    // Fully rotate only those vertices whose positions we can't figure out from the other vertices.
    // Ammo box rear and rear ring
    for (int i = 0; i < 12; i++)
        verts[vi + i] = rot * PDCVerts[i] + pos;
Here PDCVerts holds the template vertex positions of a gun at the origin, vi is the index of this gun's first vertex in the Cruiser mesh vertex array verts, and pos is the gun's base position on the hull. The rest of the vertices can be derived from these vertices without needing to rotate each of them. I just calculate a rotated unit vector (in the Z direction), after which I can simply set up the remaining vertices by adding a suitable magnitude of this unit vector to an already rotated vertex. Thus, only 18+1 rotations are needed to handle the 44 vertices.

For the normals and tangents I do something similar, with the difference being that there are only 4 distinct normal directions I need to rotate (up, right, and two barrel normal directions). I already have the Z-aligned unit direction, so these can handle all the normals. The tangents are even simpler, as every direction I need for the tangents has already been rotated for the normals, so it is just a matter of storing the correct directions to the tangent array.

So, what sort of performance does my PDC vertex rotating system have? I added my usual System.Diagnostics.Stopwatch-based system to count how many stopwatch ticks my routine takes. Looking at the results, the most common value was 460 ticks (with a single enemy flying around, so usually just two guns are tracking it). This was not terrible, but I thought there might be ways to speed this up further. I checked where most of the time is spent, and it turned out that the slowest operation is actually setting the Mesh vertices, normals and tangents. I first thought that perhaps I could do this only every second frame, but instead it occurred to me that perhaps I don't need to move the guns every single frame, especially if the target direction has not moved much. I experimented with various angle limits, and came up with an angle limit of 4 degrees. If the target is less than 4 degrees away from where the gun is currently pointing, I skip moving the gun. This angle is still small enough to not be noticeable, yet it provided a nice performance boost. After this change, the most common stopwatch ticks value was 130 ticks, so much better than the original 460 ticks. Obviously whenever any of the guns actually needs to rotate, the code takes around that 460 ticks to handle it, but this does not need to happen every frame anymore.
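
The check itself is just a couple of lines inside the per-gun loop. A sketch of how it could look, using the dirs[i] and rot variables from the code above:

    // Skip the expensive vertex/normal/tangent update if the gun is already pointing
    // within 4 degrees of the wanted direction.
    if (Quaternion.Angle(dirs[i], rot) < 4f)
        continue;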

Okay, now the guns rotate and track targets, but I still needed to make them shoot. For the bullets I created a pool of GameObjects, each using a LineRenderer. When a gun wants to shoot, I get a free GameObject from this pool, set its position, and store the bullet direction in an array. I created a new ShootPDC() routine for this, and then added the following code to transmit the correct bullet start position to the routine:

    ShootPDC(new Ray(data.gameObject.transform.TransformPoint(pos + dirs[i] * new Vector3(0, 0.5f, 2f)),
             data.gameObject.transform.TransformDirection(Quaternion.Lerp(rot, want, Random.value) * Vector3.forward)));

Here I use some random linear interpolation between the actual target direction (rot) and the direction the gun is currently pointing in (want) to create some variation in the bullet directions. This way the bullets do not all follow the exact same path, especially when the gun is currently rotating.

I then added code to move the bullets. I decided to start experimenting with a bullet speed of 300 m/s (around the speed of sound, or 1000 km/h), as I want to have these bullets move much slower than my laser blasts (which move at something like 2000 m/s). I also decided to have a range of 1000 meters for the bullets, so each bullet stays alive a bit over 3 seconds. In my first tests I had 200 simultaneous bullets moving, but it looked like the guns ran out of bullets pretty often when running in the editor (with something like 400 fps frame rate), so I upped this to 400 bullets. I thought this was a good amount to test with, as obviously the bullet moving code should be really fast and efficient.

In my first test I had made a mistake with the material handling and accidentally created a new material instance for every bullet, which gave every bullet its own draw call! I quickly fixed this, and it was fun to see the statistics window show things like "Saved by batching: 395" and such. So, it looks like the LineRenderer can batch the lines pretty efficiently. I also used only a single LateUpdate() call that moves all the bullets, as giving every bullet its own LateUpdate() routine would certainly be much less efficient.

I again turned to the Stopwatch to see how much time moving all 400 bullets takes. It turned out to take around 3400 ticks when using local coordinates and transform.Translate() to move the bullets. I next tested without calling Translate, as I had read it is slower than simply adding to the coordinates, and sure enough, the time dropped to 3100 ticks. I then thought that switching to world coordinates (LineRenderer.useWorldSpace = true) might be a smart move, in case that makes it simpler for Unity to calculate all the vertex positions when batching my bullets together. This turned out to increase the time to 3300 ticks, though. I then added code to use arrays to store the positions (instead of setting the start and end positions separately), and this change dropped the time to 2500 ticks. Finally, I converted the data for each bullet into a simple class, so that I can have a single array of these classes to handle in the movement code. This dropped the time to 1750 ticks! That was reasonably good, considering that 400 simultaneous bullets should be pretty rare in the actual gameplay and mostly happen only in the editor. However, I still had no code to check whether the bullets hit anything!
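To give an idea of the structure I ended up with, here is a simplified sketch of this kind of bullet pool, with one class per bullet, a single array of them, and one LateUpdate() that moves every active bullet and writes both line end points with SetPositions(). The names and details are illustrative, not my actual code:

    using UnityEngine;

    // Illustrative sketch of a pooled PDC bullet system moved by a single LateUpdate() call.
    public class PDCBullets : MonoBehaviour
    {
        class Bullet
        {
            public LineRenderer Line;                   // The pooled GameObject's LineRenderer
            public Vector3 Pos;                         // Current bullet position
            public Vector3 Dir;                         // Normalized flight direction
            public float TravelLeft;                    // Remaining range in meters, <= 0 means the slot is free
            public Vector3[] Points = new Vector3[2];   // Start and end points passed to SetPositions()
        }

        const float SPEED = 300f;                       // Bullet speed in m/s
        Bullet[] m_Bullets = new Bullet[400];           // The bullet pool

        void LateUpdate()
        {
            float dist = SPEED * Time.deltaTime;
            for (int i = 0; i < m_Bullets.Length; i++)
            {
                Bullet b = m_Bullets[i];
                if (b == null || b.TravelLeft <= 0f)
                    continue;                           // This pool slot is not currently active
                b.Pos += b.Dir * dist;
                b.TravelLeft -= dist;
                if (b.TravelLeft <= 0f)
                {
                    b.Line.gameObject.SetActive(false); // Bullet reached its maximum range, return it to the pool
                    continue;
                }
                b.Points[0] = b.Pos;
                b.Points[1] = b.Pos + b.Dir * 2f;       // Draw the bullet as a short streak along its direction
                b.Line.SetPositions(b.Points);
            }
        }
    }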

So, to check for the bullet hits, I needed some kind of collision detection for every bullet. This sounded pretty difficult to get working efficiently, so I thought about the simplest collision detection I could get away with. I decided to have each bullet know the target it was aimed at, and only check collisions against that ship and the viewer ship (as it would be quite distracting to have bullets fly through your own ship without collisions). This creates a potential situation where bullets can go through other ships or asteroids, but I will need to test this in actual missions to see how distracting it is. With a collision test (just a simple shield hit sphere) against both the target and the viewer ship, the time spent in the bullet movement routine increased to around 2750 ticks. This was in a situation where every bullet flies the full 1000 meters without hitting anything, so all 400 bullets are moving and testing for collisions every frame. This is not horrible, but it would be nice to get it running faster. I can of course do some trickery like not testing for collisions every frame, or precalculating situations where collisions cannot occur (like my ship being behind the turret and moving slower than the bullets), but I'll leave that for later.
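The shield test itself is just the familiar squared-distance check against a hit sphere, roughly along these lines (an illustrative sketch with assumed names):

    // Hypothetical sketch: a bullet hits when its position ends up inside the shield sphere
    // of either its intended target or the viewer ship.
    private static bool BulletHitsShield(Vector3 bulletPos, ShipData ship)
    {
        float r = ship.ShieldRadius;
        return (ship.gameObject.transform.position - bulletPos).sqrMagnitude <= r * r;
    }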

There was still one issue with the bullets when the guns shot at moving targets: the bullets never seemed to hit the target! This was because the guns aimed directly at the target, not at where the target would be when the bullets reach it. I thought this needed fixing, so I added simple code that attempts to aim at where the ship might be when the bullets reach it. There are probably more accurate methods for calculating this, and I might revisit this in the future. This simple code already solved most of the problem: now most of the bullets hit the ship (as long as it keeps moving in a straight line), which is pretty realistic.

    pos = enemies[j].gameObject.transform.position;                             // Target's current world position
    float dist = (pos - data.gameObject.transform.position).magnitude;          // Distance from my ship to the target
    pos += enemies[j].gameObject.transform.rotation * new Vector3(0f, 0f, enemies[j].CurSpeed * dist / PDCSPEED);  // Lead the aim point by the distance the target travels during the bullet flight time
    enemyLocalPos[j] = data.gameObject.transform.InverseTransformPoint(pos);    // Predicted aim point in my ship's local coordinates
Here the variables are: pos is the target's world position (and after the adjustment, the predicted aim point), dist is the distance from my ship (data) to the target, enemies[j].CurSpeed is the target's current speed, PDCSPEED is the bullet speed of 300 m/s, and enemyLocalPos[j] is the resulting aim point converted into my ship's local coordinates.

After all that code my PDCs tracked and shot at the enemy ship pretty convincingly; the only thing missing was some trick to make the barrel look like it is rotating. Since I already have a section of my ship texture atlas set aside for animations, I thought I'd experiment with an animated texture for the gun barrel. I used a YouTube video of someone shooting a minigun as a reference, and tried to emulate the red-hot glowing barrels and the flickering barrel rotation. I had already created a stationary texture for the barrels, so I just copied that, shrank it down somewhat (as my animated textures use only 64 texels per frame horizontally, and I want to use as few texels vertically as possible), and shifted the texture every second frame. This already looked surprisingly good, so I went a step further and added six different texture states (as each barrel takes 3 texels, and I emulated some half-steps as well). Sadly my 32-frame animation is not divisible by six, so there is a slight jump once per 32 frames. However, I don't think this matters, as it is supposed to look like a flickering rotation. The end result was pretty good, in my opinion, especially considering that this barrel rotation takes no CPU time and only minimal GPU time.

Here on the left is an animated GIF image showing the right side top PDCs of a cruiser tracking and shooting at an enemy fighter. The guns return to their rest position whenever they lose track of the target, and they also start shooting only after they have rotated close enough to the target direction. I might still add a proper muzzle flash, but on the other hand you rarely see the PDC guns this close in the game, so that is probably not needed.

Aug 17th, 2018 - Collision Detection and Other Progress

The main focus of this blog post is my collision detection system. I had been somewhat scared of tackling that, as I assumed I would need to study a lot of difficult algorithms and such. It turned out not to be such a major issue after all, but more about that later on in this blog post. Before I worked on the collision detection, I worked on various other aspects of my game, so I'll start with those.

More Sound Effects

After writing the previous blog post I began adding some sound effects to my game. I first searched for an explosion sound that I could attach to the new explosion animation. I found Boom by Redbulldog98 and Distant explosion by juskiddink on freesound.org, and mixing those created a pretty good ship explosion sound. I had already used the Rumbling Rocky sound by Opsaaaaa as the sound effect for when an asteroid crumbles. Next, I added a sound effect for when a laser hits a ship's hull, which I got from FUN-EXPLOSION02 by newagesoup. I just shortened the effect somewhat so it is more of a quick hit. It took me a while to find a suitable effect for a laser hitting a ship's shield, but I finally decided to use SRSFX_Electric_Hit by StephenSaldanha. For the laser sounds I use JM_NOIZ_Laser 01 and JM_NOIZ_Laser 02 by Julien Matthey, one for the player's ship and one for the other ships. That took care of the most essential sound effects, but I still need quite a few effects for various other things.

Missile Handling

Next, I decided to add missiles into my game. Since the missiles are very fast and rather small, you mostly see them only as a few pixels on the screen, so I wanted to make them very low-poly objects. It only took me a little while to model the mesh in Cinema 4D, using a cylinder primitive and making a few adjustments to it. Texturing it was simple as well, as I did not need any damage states: if a missile gets hit by a laser, it simply explodes immediately. So, it only took me one day to model, texture, and code all the missile-specific stuff into LineWars VR. The missile mesh contains 82 vertices and 64 triangles in Unity, and looks like the following image. I only used a tiny 40 by 40 texel texture area for all the textures of the missile, so the texture looks quite blurry. That does not matter in the actual game, though.

Upgraded to Unity 2018.2

One morning I got a Unity popup recommending that I upgrade from my 2018.1 version to 2018.2. These upgrades are somewhat annoying, as the Unity people seem to always introduce some code-breaking changes in new versions. I decided to upgrade anyways, as I am still far from getting my game close to release, and a one-point upgrade should not cause significant problems.

It took me about a whole day to get my game to compile using the new version of Unity, though. The first major issue was that I needed to upgrade to a newer Android SDK. However, when I tried to let Unity upgrade the Android SDK, I got this:

CommandInvokationFailure: Unable to update the SDK. Please run the SDK Manager manually to make sure you have the latest set of tools and the required platforms installed.
I then tried to upgrade the SDK manually, but this also ended up in an error:
Warning: An error occurred during installation: Failed to move away or delete existing target file: C:\Projects\AndroidSDK\tools
Move it away manually and try again..
This was a bit silly, as the sdkmanager.bat which performs the upgrade sits in that directory, so of course it cannot be moved away or deleted! I found a Stack Overflow question about this specific issue, and using the recommendations there I managed to get the SDK upgraded.

The next problem was that almost all of my shaders failed to compile, with error messages like the following:

Shader error in 'LineWarsVR/CruiserShader': invalid subscript 'instanceID' 'UnitySetupInstanceID': no matching 1 parameter function at line 60 (on gles3)
This was not much of an issue, as I don't actually use or need instancing in any of my shaders; that was just a leftover from some tests I had made earlier. I commented out all the UnitySetupInstanceID() calls from my shaders, and got them to compile.

I then tested building the executable for Android, but that complained about missing passwords. It took me a little while to remember that I had tested uploading my game to the Oculus Store Alpha Channel some time before, and had to add a new key store with passwords for that. It looked like Unity does not remember these passwords, so I need to give them again whenever I build a new binary for my phone. Not a big deal.

The final issue, again a result of that Oculus upload, was that I needed to replace INFO with LAUNCHER in the AndroidManifest.xml to get a standalone executable to work. I need to remember to switch this back to INFO when uploading a new version to the Oculus store.

Purchased InControl

As I want to support various gamepads with LineWars VR, I had been thinking about how best to handle this. I had read from the "Endspace" developer blog that they use InControl for this. I thought it should probably work well for my needs as well, and decided to purchase it. It was only $35 in the Unity Asset Store, even though the main page says it costs $40, so I thought that was a good deal. I had some issues making it work properly with my Snakebyte VR:CONTROLLER, but after testing my controller with Endspace, I noticed it behaved exactly the same. It looks like the problem is in the controller itself: it just does not always seem to recenter the analog sticks properly when you release them.

Anyways, I added quick code to read the InControl controller positions, and it seemed to work fine. It still needs a lot of fine-tuning, and I also need to handle multiple scenes better, add a configuration page, and so on. It does look like I can let InControl worry about the different gamepads and just focus on the game myself, which was exactly the reason I purchased it.

Collision Detection

Okay, now let's get into the main feature of this blog post, my collision detection algorithms. As I have mentioned before, I don't want to use Unity's built-in physics module with its Collider objects, as these are black boxes and I believe I can program a more performant system for the specific needs of my game. In the original LineWars II I had simple bounding spheres that I used for collision detection, but in LineWars VR I wanted to have more accurate collisions. I wanted to use mesh-based collisions, but still keep the performance high. This sounded like a pretty difficult problem, and thus I had assumed it would take me a few weeks to get this done.

The first step of any collision detection is of course trying to preprocess the situation using some fast code to detect if a collision is at all possible. Since all my objects already have bounding spheres, which correspond to the distance of the furthest vertex from the object center, the first step was to detect whether the bounding spheres intersect. This is the simplest collision detection there is: you just calculate the distance between the sphere origins and compare it against the sum of the sphere radii. This is faster to do using the squares of both values, so you can avoid the square root. This way I can quickly determine which objects cannot collide during this frame.

I added a new collision detection routine call into my MoveShips() routine, which handles the movement of all the ships. I use a single routine to handle all the ship movements instead of each game object having its own Update() routine. This is closer to how my old LineWars II was built, and I wanted to use a similar system. Thus, I can also have a single CollisionDetect() routine that handles all the collisions. Into this new CollisionDetect routine I then added two for-loops, so that each ship is tested against all the lower-indexed ships within my ShipData array. If a ship has died, its ShipType enumeration value gets changed to NotExists, so I am using that to determine which ships are still alive. So, the main structure of my collision detection routine became this:

    // Check each ship against all the lower-indexed ships for collisions
    for (int i = 1; i < ShipCount; i++)
    {
        ShipData dataA = ShipParams[i];
        if (dataA.ShipType == ShipTypes.NotExists)
            continue;
        Vector3 posA = dataA.gameObject.transform.position;
        for (int j = 0; j < i; j++)
        {
            if (dataA.ShipType == ShipTypes.NotExists)  // In case shipA already collided with a previous ship and died...
                break;
            ShipData dataB = ShipParams[j];
            if (dataB.ShipType == ShipTypes.NotExists)
                continue;
            Vector3 posB = dataB.gameObject.transform.position;
            // Check a potential collision between dataA and dataB
            float shieldDist = dataA.ShieldRadius + dataB.ShieldRadius;
            shieldDist = shieldDist * shieldDist;   // Use the squared min distance
            Vector3 vectAB = posB - posA;
            // If the encapsulating shields don't intersect, the objects have not collided.
            if (vectAB.sqrMagnitude > shieldDist)
                continue;
        }
    }

If the shields of the objects ("shield" meaning the bounding sphere for objects that do not have shields) intersect, then it is time to check whether the actual object meshes intersect. This mesh versus mesh intersection is a rather complex operation. I studied the code for "Fast Collision Detection of Moving Convex Polyhedra" by Rich Rabbitz from "Graphics Gems IV" for a while, but thought that it was too complex for my needs. I would have needed to build a new structure for each of my objects to use that algorithm, but I would rather not do that, as I already have a K-d tree (as mentioned in my Apr 21st, 2018 blog post) for each of my objects to speed up the laser hit collision detection. Wait a minute, I already have a special structure for a certain type of collision detection, could I perhaps use that also for the ship versus ship collisions? My laser hit test uses the KDtree of an object to quickly determine which triangles of the mesh are within the partitioned bounding box where the laser ray may hit the object, and then does a ray-triangle intersection test for only these triangles. If I could somehow make the ship collisions behave like a ray-triangle collision, I could use the same system for these.

The collisions that I needed to handle were the collisions between the fighter ships, a collision between a fighter ship and an asteroid, and a collision between a fighter ship and the space station. Collisions between an asteroid and a space station are handled specially in the first mission of my game, and in other missions the asteroids do not move. Then I would also need to check collisions against the cruisers and the alien motherships, but I decided to leave those out for now. So, could I somehow make my fighter ships behave like a laser ray?

I noticed that my ray-triangle intersection algorithm does not actually care about whether the ray goes forwards or backwards. So, I could actually have an object partially penetrating a triangle of another object, and cast a ray from the penetrating part of the object backwards. Since my fighter ships all have pointy noses and always move forwards, I thought the nose would be a perfect starting point for a "collision ray". My fighter ships even have simple shapes; for example the Pirate fighter is basically just an elongated tetrahedron, so I could handle its shape pretty well just by sending three rays from the nose towards each of the three rear corners. The Cobra ship is not much more complex: I could handle it using four rays, from the nose to each wing tip and to the top and bottom. Then I just needed to check whether the ray-triangle hit position along any of these rays is less than the length of the ship away from the nose! Here below is a picture showing the three "collision rays" of a Pirate ship. Obviously they do not exactly follow the mesh shape, but they are accurate enough for collision detection of fast-moving ships.

So, I added some static variables to contain the rays of the Pirate and Cobra ships, and then created a routine that tests these rays against the KDtree of the target object. Here below are the parts of the routine that handle the Pirate ship. The Cobra part is similar, it just has four rays instead of three.

    private static Ray[] PIRATERAYS = new Ray[] { new Ray(new Vector3(0f, 0f, 9.14231f), new Vector3(0f, 0.25027f, -0.96818f)),   // nose to vertical stabilizer tip
                                                  new Ray(new Vector3(0f, 0f, 9.14231f), new Vector3(0.21404f, -0.15747f, -0.96405f)),   // nose to left wing tip
                                                  new Ray(new Vector3(0f, 0f, 9.14231f), new Vector3(-0.21404f, -0.15747f, -0.96405f)) };   // nose to right wing tip
    private static float PIRATERAYENDZ = -6.62195f;  // Z coordinate of the rear wall of the ship

    private static bool ShipMeshHit(ShipData ship, ShipData target)
    {
        Vector3 hitPos;
        int hitTri = -1;
        bool hit = false;
        Transform st = ship.gameObject.transform;
        Transform tt = target.gameObject.transform;
        // Translate the ship "collision rays" into target local coordinates, and use KDHit test to determine if they hit.
        Ray localRay = new Ray(tt.InverseTransformPoint(st.TransformPoint(PIRATERAYS[0].origin)), Vector3.zero);
        for (int i = 0; i < PIRATERAYS.Length; i++)
        {
            localRay.direction = tt.InverseTransformDirection(st.TransformDirection(PIRATERAYS[i].direction));
            hitPos = Vector3.zero;
            if (target.KDTree.KDHit(localRay, ref hitPos, ref hitTri))
            {
                // Check where the hit position is along the "collision ray". The "hitPos" is in the target's coordinate system.
                Vector3 localHitPos = st.InverseTransformPoint(tt.TransformPoint(hitPos));
                if (localHitPos.z >= PIRATERAYENDZ)
                {
                    // The target ray hit position is within the ship mesh, so this is a proper collision!
                    Ships.Collision(ship, localHitPos, -1); // Call the ship's collision handler
                    hit = true;
                    break;
                }
            }
        }
        return hit;
    }
The Ships.Collision() routine then tests whether the ship has enough armor health left to survive this frame of the collision, damages the armor and possibly some equipment, reduces the speed of the ship, and explodes the ship if the accumulated damage exceeds the survivable amount. The full routine also does similar handling for the collision target object; I left that part out above for clarity.

There is still one extra step in my collision handling routines, specifically for the Space Station. It has such a complex shape that I thought running the full KDTree hit test whenever a fighter ship is within the bounding sphere of the station would be unnecessary work. Since I plan to always orient my space stations along the world Z axis, I could do some collision prefiltering before calling the KDTree hit testing. I separated the space station into sections along the Z coordinate, so I can easily determine which part of the station a ship could hit. For example, if the ship's Z coordinate is less than the station's lowest Z value minus the ship's shield radius, the ship cannot hit the station even though their spheres may intersect. Similarly, since the station's main drum has a radius of 55 meters, if the ship is sufficiently far in front of the station that it cannot hit the habitation ring or its poles, and the ship's XY-distance from the station center is more than 55 meters plus the ship's shield radius, it cannot collide with the station even when it is inside the space station's bounding sphere.
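In code this prefiltering boils down to a couple of cheap coordinate comparisons before the expensive KDTree test. Here is a rough sketch following the description above (the Z limits are illustrative placeholders, not the actual station measurements):

    // Rough sketch of the Z-section prefiltering; returns false when a collision is impossible.
    private static bool StationCollisionPossible(Vector3 shipPos, Vector3 stationPos, float shieldRadius)
    {
        const float STATION_MIN_Z = -100f;      // Illustrative: rearmost Z of the station relative to its center
        const float RING_MAX_Z = 10f;           // Illustrative: frontmost Z of the habitation ring and its poles
        Vector3 d = shipPos - stationPos;       // Ship position relative to the station center (station is Z-aligned)
        if (d.z < STATION_MIN_Z - shieldRadius)
            return false;                       // Completely behind the station, no collision possible
        if (d.z > RING_MAX_Z + shieldRadius &&
            d.x * d.x + d.y * d.y > (55f + shieldRadius) * (55f + shieldRadius))
            return false;                       // In front of the ring and poles, and outside the 55-meter main drum
        return true;                            // Otherwise run the full KDTree collision test
    }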

I tested the performance of my new code using the "Demo" mission of my game. In this mission six Cobra fighters (plus the player) attack an enemy space station (or "star base" as it is called within my game) which launches several Pirate fighters to defend it. When the mission begins and there are 7 friendly ships, one space station, and 10 asteroids, the collision detection routine runs at about 500 Unity editor stopwatch ticks per frame. When most of the ships are dead, this drops down to around 100 stopwatch ticks, and the worst case seems to be somewhere around 1500 stopwatch ticks. I have learned that everything below about 5000 stopwatch ticks for such a once-per-frame routine is acceptable and does not cause framerate hiccups. So, my routine should be quite performant enough, and if it turns out to be too slow for really massive battles (my goal is to have up to 50 ships in a battle), I can always split the code to only handle half the ships per frame, alternating between which halves are handled. I actually had to do this for my LineWars DS version for Nintendo DS, as that handheld console has quite a weak CPU.

Space Station Collision Damage

After the main collision detection routine was done, I started working on what happens to the Space Station when it gets hit by a moving ship. I had already worked on some laser hit damage states before, and I decided that as a result of a collision, the adjacent panels of the station go directly to the most damaged state (which usually is the fire animation). I just need to check which vertex is closest to the collision point, and make all the panels sharing this vertex location go to the most damaged state (a rough sketch of this check is shown after this paragraph). There are a lot of panels on the space station, and I need to manually set up the UV coordinates for each of these damage states, so this is a lot of work. I had gotten wiser by the time I made the fighter ship damage states and simply used a U offset to determine the damage state, but the space station was my first model with damage states, and I hadn't thought about the amount of work the damage states would require when I originally made it. I just need to work on it for a few days every now and then to get it eventually done. It is too boring a chore to do in one go.
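Here is a rough sketch of that closest-vertex search (illustrative code, not the actual routine); the caller would then switch every panel owning one of the returned vertices to its most damaged state:

    // Illustrative sketch: find all station vertices that share the location of the vertex
    // closest to the collision point (given in the station's local coordinates).
    private static List<int> FindCollisionCornerVertices(Mesh mesh, Vector3 localHitPos)
    {
        Vector3[] verts = mesh.vertices;
        int closest = 0;
        float best = float.MaxValue;
        for (int i = 0; i < verts.Length; i++)
        {
            float d = (verts[i] - localHitPos).sqrMagnitude;
            if (d < best) { best = d; closest = i; }
        }
        // Collect every vertex at (practically) the same location; these belong to the adjacent panels,
        // since the panels have duplicated vertices at their shared corners.
        List<int> corners = new List<int>();
        Vector3 corner = verts[closest];
        for (int i = 0; i < verts.Length; i++)
            if ((verts[i] - corner).sqrMagnitude < 0.0001f)
                corners.Add(i);
        return corners;
    }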

Here on the left is an animated GIF image showing two Cobra fighters chasing a Pirate fighter and forcing it to crash into the habitation ring of the Space Station. The Pirate ship's shields flash red during the initial contact with the habitation ring, and the shield hit tries to bounce the ship away from the station. Due to the high speed of the ship, it only takes a couple of frames until the actual ship (or rather one of the "collision rays") also penetrates the station, causing the ship to explode. The collision handling also causes damage to the space station. Only the lower deck of the habitation ring currently gets damaged, as the upper deck still lacks the damage states.

MeshPostProcessor for Space Station

While I worked on the damage states, I noticed that one panel on the Space Station had slightly wrong UV coordinates. I fixed these, but then noticed that the procedural shadow system stopped working properly: there were some extra shadows in front of the habitation ring poles. This reminded me that I had used such non-continuous UV coordinates on purpose, to force Unity to duplicate the vertices of these panels and thus separate the shadowed parts from the parts of the station that have no self-shadowing. However, for the damage states I needed the UV coordinates to be correct.

When I originally created the Space Station model and started texturing it, I was not aware of Unity's AssetPostprocessor component. Now that I had used it successfully to give all the triangles unique vertices for my fighter ships, I decided to use it also for the Space Station, to create unique vertices where needed. This way I could keep the UV coordinates correct.

I also realized that my MeshPostProcessor is the correct place to add the Colors array to the Mesh. As I mentioned in my Mar 25th, 2018 blog post, I use vertex colors to let the shader know which part of the station (regarding the various shadow areas) a vertex belongs to. I had used the scene Awake() method to call a routine that calculates this whenever a mission begins. However, it is much more efficient to pre-calculate this while importing the Mesh itself, as this data never changes dynamically. Thus, I moved all my code that finds certain polygons and gives their vertices the correct color values into the MeshPostProcessor.

Pre-alpha Gameplay Footage

I meant to record some video about my new collision detection algorithms, but the clip I recorded actually shows quite a bit about the general gameplay of LineWars VR, so I decided to release it as a pre-alpha gameplay footage. Pretty much everything is still unfinished, but it has the most important gameplay elements in place. There are some frame skips in the video, which seem to be caused by the recording itself, as the actual gameplay in Gear VR even on my slow Samsung S6 phone is completely smooth.

That's about it for this blog post, thank you for your interest in my progress with LineWars VR!

July 22nd, 2018 - Mission One Playable and Ship Damage States

After writing the previous blog post, I worked a few days on the Pirate ship texturing and on fixing a couple of bugs in my handling of a laser hitting an asteroid. I had noticed that sometimes the laser seemed not to hit an asteroid even though it should. This turned out to be a simple issue of my forgetting to clear one variable in the code. Another issue was that the NPC (Non-Player Character) controlled Cobra 2 ship continued shooting at something even when there were no more enemies. This was again a simple issue: I did not correctly test whether an enemy ship was still alive. After these fixes I thought it might be interesting to attempt to make the first mission actually playable.

Making Mission One "Asteroid Storm" Playable

The things I needed to do to make the first mission playable were the following:

  1. Have the asteroids around the starbase move towards the starbase.
  2. Detect a collision between an asteroid and the starbase. This determines whether the mission (saving the starbase from the approaching asteroids) has failed.
  3. Restart the level if the mission fails.
  4. Go to the next level (with more asteroids) if all the asteroids were destroyed successfully.
  5. If the most difficult level (level 10) was successfully handled, the mission is completed, and the player is returned to the main menu.
The first step was rather easy: I just calculated a vector towards the starbase for every asteroid when the level starts, and then moved the asteroids along that vector every frame. This is similar to how the asteroids behave in the first mission of the original LineWars II game.

The second part was much more difficult. I did not want to use the Unity built-in physics system with its collision detection, as I am not sure how performant it is, and I believe I can tailor-make a more efficient collision detection system myself. In the most difficult level I have 46 asteroids all moving and about to collide with the starbase, with every asteroid consisting of 528 triangles. Those would need quite complex mesh colliders for accurate collisions, and I would still need to handle collisions between the two ships (the player and the NPC helper) and all the asteroids. In addition to that, I would need collision detection between the asteroids themselves, in case they hit each other before they hit the starbase. All this felt like something that could easily kill the performance of my game.

I began by studying collision detection algorithms on the web, using the great Realtime Rendering reference page Object/Object Intersection. After a little bit of thinking about my specific scenario, I came up with an interesting idea to reduce the required computations by quite a bit. Since my asteroids move towards the starbase at a constant velocity, I can actually calculate beforehand the exact frame when an asteroid hits the starbase. I can even orient the asteroid so that it hits with its farthest-out vertex. Since the asteroid rotates at a constant angular velocity, I can have it be in a known orientation when it hits the starbase, and just calculate backwards the orientation it should have when the level begins.

Thus, the only things I needed were to determine where on the starbase I want each of the asteroids to hit, and in what orientation they should be at that point. As my asteroids are procedurally generated using an FFD deformer, I just added code into the mesh generation routine to remember which vertex was the farthest out from the asteroid center. I stored this into the "MaxShield" field of my common ship structure (as the asteroids don't have shields and I did not want to add a new field just for asteroids). I could of course convert my ship structure to a class and use proper class inheritance to handle the specific needs of different ship types, but I wanted to keep things like that close to what I did in LineWars II (which was coded fully in Assembler language).

Anyways, the code to get the needed rotation of the asteroid given the vertex index is as follows. It uses the "MeshData" structure of the ship data "asteroid", which is simply a cache for the mesh triangles and vertices, so I don't need to access the actual Mesh object in my C# code.

    // Calculate the rotation needed to point the farthest-out vertex of the asteroid toward the world Z axis
    Vector3 maxVector = asteroid.meshData.Vertices[asteroid.MaxShield];
    Quaternion rot = Quaternion.Inverse(Quaternion.LookRotation(maxVector));

I decided to have six different "hit locations" on the starbase where the asteroids could hit it:

  1. Outer rim of the habitation ring, which has an outer radius of 139 meters and width of 19 meters, at Z coordinate 0.
  2. Front wall of the habitation ring, at 134 plus or minus 5 meters radius, with Z coordinate at 8.5 meters.
  3. Back wall of the habitation ring, at 134 plus or minus 5 meters radius, with Z coordinate at -8.5 meters.
  4. Solar panels, of which there are four separate panels, each 44 meters wide and 84 meters long, with their back-side Z coordinate at -92.5 meters.
  5. Main body of the starbase, which has a radius of 55 meters.
  6. Front wall of the starbase, with the Z coordinate at 58 meters.
Since the starbase is located at world coordinates (0,0,-60), I simply needed to add that -60 to the Z coordinates when calculating the asteroid movement vectors. Even though the starbase rotates, it does not change anything regarding the asteroid hit positions, as the solar panels do not rotate, and other parts that can hit are circular.

Here is an example of the calculations for the first hit location, the outer rim of the habitation ring. I can have the asteroids start randomly around the ring, just making sure they start further out than the habitation ring itself. The code that handles this is as follows. Here "level" is the current mission level (1..10) and "asteroid.ShieldRadius" is the size of the asteroid (that is, the magnitude of the farthest-out vertex vector).

    hitSeconds = Random.Range((float)(level + 21), (float)(level + 40 + 20*level)); // When the asteroid will hit the starbase
    Quaternion dirRot = Quaternion.Euler(0f, 0f, Random.value * 360f);              // Asteroid starting position around the world Z axis of the starbase
    // Starbase outer rim, end position along the rim outer edge (x=139 + radius, y=0; z=-60 +/- 8), rotated around z
    hitPos = dirRot * new Vector3(139f + asteroid.ShieldRadius, 0f, -60f + Random.Range(-8f, 8f));
    startPos = dirRot * new Vector3(Random.Range(200f, maxDist) + asteroid.ShieldRadius, 0f, -60f + Random.Range(-200f, 200f));
    hitRot = Quaternion.LookRotation(-hitPos) * rot;
The code above gives me the location where the asteroid hits the starbase, the asteroid's starting position, and the orientation of the asteroid when it hits the starbase. To keep things simple, I decided to use the same orientation both when the asteroid hits the starbase and when the mission starts: I simply make the asteroid rotate a random number of full revolutions on the way to the starbase. This random value is adjusted by the asteroid size, so the smaller asteroids may rotate faster than the larger asteroids.

Okay, now I can handle the asteroids hitting the starbase, but how about asteroids hitting each other? Here I decided to make things easier for myself: instead of letting the asteroids hit each other, I prohibit this completely. That meant I had to generate the starting positions for the asteroids in such a way that they do not hit each other on the way to the starbase. Here I used the Capsule/Capsule collision detection algorithm (a generic sketch of this test is shown below), with the capsules corresponding to the routes the asteroids take when they move towards the base. As the asteroids may hit the base at different times, the capsules should only cover the routes where both asteroids are "alive". I added a loop into the asteroid generation routine to check each newly generated asteroid against all the already generated asteroids, and retry the new asteroid's starting position until the code finds a position where the asteroid does not hit any other asteroid on the way. This turned out to work fine, except that the code may take an extra second or two trying to find suitable starting positions for all the asteroids.
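The capsule/capsule test itself reduces to a segment/segment closest-distance check: two routes collide if the closest distance between the route segments is less than the sum of the two asteroid radii. Here is a generic sketch of such a test (the standard closest-point-between-segments method, not my mission-specific code):

    // Generic capsule/capsule overlap sketch: capsules are segments p1-q1 and p2-q2 with radii r1 and r2.
    private static bool CapsulesOverlap(Vector3 p1, Vector3 q1, float r1, Vector3 p2, Vector3 q2, float r2)
    {
        Vector3 d1 = q1 - p1, d2 = q2 - p2, r = p1 - p2;
        float a = Vector3.Dot(d1, d1), e = Vector3.Dot(d2, d2), f = Vector3.Dot(d2, r);
        float s, t;
        if (a <= 1e-6f && e <= 1e-6f) { s = 0f; t = 0f; }               // Both segments degenerate to points
        else if (a <= 1e-6f) { s = 0f; t = Mathf.Clamp01(f / e); }      // First segment is a point
        else
        {
            float c = Vector3.Dot(d1, r);
            if (e <= 1e-6f) { t = 0f; s = Mathf.Clamp01(-c / a); }      // Second segment is a point
            else
            {
                float b = Vector3.Dot(d1, d2), denom = a * e - b * b;
                s = denom > 1e-6f ? Mathf.Clamp01((b * f - c * e) / denom) : 0f;
                t = (b * s + f) / e;
                if (t < 0f) { t = 0f; s = Mathf.Clamp01(-c / a); }      // Clamp t back onto the segment and recompute s
                else if (t > 1f) { t = 1f; s = Mathf.Clamp01((b - c) / a); }
            }
        }
        Vector3 c1 = p1 + d1 * s;                                       // Closest point on the first segment
        Vector3 c2 = p2 + d2 * t;                                       // Closest point on the second segment
        float minDist = r1 + r2;
        return (c1 - c2).sqrMagnitude <= minDist * minDist;             // Overlap if closer than the summed radii
    }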

Okay, now I could detect the collisions simply by checking if the "hitSeconds" of any asteroid is less than the current time, and then call the "Mission failed!" stuff and restart the scene if so. However, it turned out that the time it took to generate the asteroids for the next level became annoyingly long in the higher levels, and thus I began looking into ways to restart the mission without the need to restart the whole scene. I decided to create all the 46 asteroids when the scene is started (for the first time) and cache the starting positions and orientations of all the asteroids. Then when the level is restarted or cleared, I simply reset the positions and orientations (and whether they are active or not) of all the GameObjects and continue the same scene. This meant that the level change is instantaneous, but the first level takes a little while to start up.

This pretty much made it possible to play the first mission of my game! I am still missing the ship/asteroid and ship/starbase collisions, most of the sound effects, etc., but the basic gameplay is in place, yay! I spent a couple of days adding a version of the asteroid shader which receives shadows from the starbase (for those asteroids that hit the shadow side of the base) and made a very simple deformation algorithm for the starbase to deform as it gets hit. The latter still needs quite a bit of work to look convincing, though.

Fighter Ship Damage States

After I got the first mission practically playable, I decided to work next on the fighter ship damage states. My plan was to have each triangle/polygon of the fighter ships have four damage states:

  1. Not damaged
  2. Laser scorched hull plating
  3. Structural damage. Hit to this level can damage some ship equipment (laser, shield generator, thrusters, etc.) located "underneath" this triangle.
  4. Major structural damage. If a triangle is already at this damage level and it gets hit again, the whole ship will explode.
In order to handle each triangle having a separate damage state, I needed every triangle to have its own vertices. Since I wanted my fighter ships to take advantage of Unity Dynamic Batching, the vertex count still needed to stay below 300. Luckily my fighter ships are very simple models; both the Cobra and the Pirate ships have only 66 triangles (coincidentally), so even with all the vertices triplicated that only generates 198 vertices, comfortably below that limit. I extended my MeshPostProcessor routine (which I wrote about in my previous blog post) to handle also the Cobra and Pirate meshes and triplicate all the vertices. Actually, I decided that the navigation lights do not need separate polygon-based damage states, so I combined those vertices. They don't even need separate normals (as they are emitting light rather than receiving it), so I could easily combine them in my MeshPostProcessor. The end result was 174 vertices for the Cobra mesh and 176 vertices for the Pirate ship.

I am using a shared 2048x2048 texture atlas for all my ships, including the starbase and the cruiser, so fitting several damage states for all of these ships into the same texture atlas meant that the texturing of the ships needed to be rather coarse. I decided to go with 256-texel wide damage state slots, so that I can simply add 1/8 to the U coordinates of the vertices of the triangle that gets hit to go to the next damage state (see the sketch below). I was able to fit the full textures of both the Cobra and the Pirate ships into a 256 by 550 texture area, so they only take a little bit over one eighth of my texture atlas. As part of my texture atlas is reserved for animation, that meant that about 5/6 of the texture atlas can still be used for the starbase, cruiser, and the alien ship. I also need to fit the textures for the missile somewhere, but those should only take a very small part of the texture atlas.
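Advancing a triangle to its next damage state is then just a matter of shifting its U coordinates by one slot, roughly like this (an illustrative sketch, assuming the mesh and the index of the hit triangle are available):

    // Illustrative sketch: move one triangle of a fighter ship mesh to its next damage state
    // by shifting its U coordinates one 256-texel slot (1/8 of the 2048-texel atlas) to the right.
    private static void NextDamageState(Mesh mesh, int hitTri)
    {
        int[] tris = mesh.triangles;
        Vector2[] uv = mesh.uv;
        for (int k = 0; k < 3; k++)
            uv[tris[hitTri * 3 + k]].x += 1f / 8f;      // Works because every triangle has its own (triplicated) vertices
        mesh.uv = uv;
    }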

The images above are taken from Cinema 4D, which does not have all the surface normals quite as I wanted them, so I used my MeshPostProcessor to adjust some of the vertex normals (for example normals around the pirate ship bottom blue flash light, and the pirate ship nose vertex). The insignia of the Cobra ship is for the "Princess Royal Guard". I decided to use the Finnish Air Force blue/white roundel as the basis, and just replaced the inner white circle with a white crown. This is sort of like a "Royal Finnish Air Force" insignia, if Finland was a kingdom. :)

I pretty much suck at texturing in general, and damage texturing in particular, so it took me over two weeks trying to generate the damage model textures. I have not even managed to create the damage state textures for the rear of the ships, which are most likely to get hit, as I am still trying to figure out what the ship main thrusters should look like when they are damaged. Less bright light emissions at least, but I have not been able to decide what else is needed. I am not terribly happy with the other panels either, but at least they now have some semblance of damage.

Here below is a short YouTube video demonstrating the damage states of both the Pirate and the Cobra ships.

Explosion Footage

As I mentioned in my first blog post (at the end of this page), I had already found some nice free explosion footage from Videezy at the end of last year, but the problem with that footage was that it did not have an alpha mask. I had been working on and off for the past several months on adding an alpha mask to that footage myself, but now I finally decided that enough is enough: I would need to purchase some proper explosion footage. I searched the net, trying to find some ArtBeats reel footage, but it looked like ArtBeats does not sell footage directly to end users. The stock image sites that resell their footage did not clearly state whether the footage includes an alpha channel, and a single video seemed to cost between $35 and $80, which I thought was rather expensive.

After some more googling I found the ActionVFX site, which clearly stated that all files include an alpha channel, and the collection of ten aerial explosions cost only $50 at full HD resolution. I thought that was a pretty good deal, especially since the license allows you to use the footage for practically everything you can think of, so I purchased that collection. I downloaded the collection in ProRes format (not knowing what that format was), and then began to study how I could convert that ProRes-format MOV file into something useful for my game.

It turned out that FFmpeg can read ProRes files, so I decided to use my trusty AviSynth plus VirtualDub software combination to create a set of RGB frames and a set of Alpha frames from one of these explosion clips.

LoadPlugin("C:\Program Files (x86)\AviSynth\plugins\ffms2-2.23.1-msvc\x86\ffms2.dll")
v = FFMpegSource2("Aerial_Explosion_5_0783_2K.mov") # 310 frames, fire done at frame 120, slow dissolve
s = v.SelectEven().AssumeFPS(30)                    # Only take every second frame, assume the speed is still 30 FPS
v = s.Trim(0,63)                                    # Take only the first 64 frames
v = v.Crop(576,80,768,768)                          # Make the explosion fill the frame as fully as possible
v = v.Lanczos4Resize(256,256)                       # We want frames of 256 by 256 pixels
v = v.Fadeout(25)                                   # Fade out to black, during the last 25 frames
# Replace the above line with the following for the Alpha channel, otherwise the result is the RGB data.
#v = Overlay(v.ShowAlpha().Fadeout(30), v.ShowGreen().Fadeout(10), mode="add") # Have RGB data cause bumps to the Alpha fadeout
return v.ConvertToRGB32()
After that I just needed to combine the frames into an 8x8 sprite sheet, for which I used my CreateAnimBMP Unity Editor helper script that I had created some time earlier. This script takes the names of the first RGB image and the first Alpha image, together with the output image name, the number of frames, and the output image size as parameters. It then reads the input images (both the RGB image and the Alpha image) for each frame and calls the ProcessFrame method for each of these frames. This method stores each frame into the correct location of the output byte array outbytes, which is then written into the output BMP file. The inoffs parameter is the start offset of the pixel data in the input BMP images, frameSize is the resolution of a single frame (256), and oSize is the resolution of the output file (2048).
    private void ProcessFrame(int frame, byte[] outbytes, byte[] rgbbytes, byte[] alphabytes, int inoffs, int frameSize, int oSize)
    {
        // BMP pixel format = Blue, Green, Red, Alpha
        // Y coordinate goes from bottom towards the top, so swap the y offset
        int xoffs = 4 * (frame % (oSize/frameSize)) * frameSize;
        int yoffs = ((oSize/frameSize) - 1) * 4 * oSize * frameSize - 4 * (frame / (oSize/frameSize)) * (oSize / frameSize) * frameSize * frameSize;
        Debug.Log("frame " + frame.ToString() + ", xoffs=" + xoffs.ToString() + ", yoffs=" + yoffs.ToString());
        for (int y = 0; y < frameSize; y++)
        {
            for (int x = 0; x < frameSize; x++)
            {
                outbytes[54 + yoffs + 4 * oSize * y + xoffs + 4 * x] = rgbbytes[inoffs + 3 * (y * frameSize + x)];
                outbytes[54 + yoffs + 4 * oSize * y + xoffs + 4 * x + 1] = rgbbytes[inoffs + 3 * (y * frameSize + x) + 1];
                outbytes[54 + yoffs + 4 * oSize * y + xoffs + 4 * x + 2] = rgbbytes[inoffs + 3 * (y * frameSize + x) + 2];
                outbytes[54 + yoffs + 4 * oSize * y + xoffs + 4 * x + 3] = alphabytes[inoffs + 3 * (y * frameSize + x) + 1];    // Use green channel of input
            }
        }
    }

This is an animated GIF showing this specific explosion, and I still have nine different variations that I can use for other explosions.

June 19th, 2018 - Progress Report, Voice Acting

Cobra cockpit avatar

After working on the Cruiser bridge pilot avatar, which I wrote about at the end of my previous blog post, I started working on the pilot avatar for the Cobra cockpit. The differences between these avatars are the 18-degree tilt of the Cobra cockpit, and the fact that the Cobra cockpit has the joystick between the pilot's legs, while the Cruiser has the joystick on the right arm rest. It was more difficult than I anticipated to handle these differences.

The first thing I needed to do was to model a separate version of the hand holding the joystick. To optimize the mesh, I had removed from the original mesh all polygons that always face away from the camera; however, the Cobra cockpit has the joystick in a different orientation, so some of these removed polygons were suddenly needed. Luckily, I still had the original object saved away, so I could just copy back the needed polygons. This increased the vertex and polygon count, however, so I spent some time optimizing the mesh to get down to around the same number of vertices as I had in the Cruiser joystick hand. I got down to 1239 vertices for the Cobra joystick hand, compared to 1233 vertices in the Cruiser joystick hand, so almost exactly the same amount.

Next, I added the C# code that updates the _JoystickProjMat matrix for the throttle and joystick hands, these were identical to the code for the Cruiser avatar. However, when adding the code for the lower arms, I ran into some issues. I realized that the code I used in my previous blog post to calculate the positions of the connected vertices was actually buggy, and just happened to work because all the objects shared the same orientation. Now that I had to add the 18 degrees tilt of the Cobra cockpit, my algorithm stopped working.

After some thinking about this problem, I realized that it would actually be smarter to use the same rotation matrix I use in the shader to do the inverse rotation in the C# code; I just need to invert the matrix. Thus, I created the following code, which does pretty much the same as the code in my previous blog post, but here both handRot and armRot have the 18-degree tilt added, and the vertex movement is done using a matrix multiplication.

    //------------
    // Adjust the vertex positions by the current throttle amount
    //------------
    // First calculate how much the whole lower arm should move.
    // It moves as much as the wrist moves because of the throttle rotation,
    // minus how much the arm rotates because of the elbow joint connection.
    Vector3 trans = s_CobraThrottleTrans;
    Quaternion handRot = Quaternion.Euler(myData.CurSpeed * 30f / myData.MaxSpeed - 15f - 18f, 0, 0); // Throttle hand rotation, only around X axis
    Quaternion armRot = Quaternion.Euler((myData.MaxSpeed * 3f / 4f - myData.CurSpeed) * 8f / myData.MaxSpeed - 18f, 5f, 0);   // Arm rotation
    Vector3 wristPos = (handRot * s_LLArmWristPos) /*- s_LLArmWristPos*/ - (armRot * s_LLArmWristPos) /*+ s_LLArmWristPos */;
    Matrix4x4 mat = Matrix4x4.TRS(trans + wristPos, armRot, Vector3.one);
    Matrix4x4 invMat = Matrix4x4.Inverse(mat);
    // Translate to the opposite direction and rotate the wrist connected vertices
    for (int i = 0; i < m_VertexData.Count; i++)
    {
        // First add the handRot rotation, and then remove the effect of the _JoystickProjMat rotation
        m_Verts[m_VertexData[i].idx] = invMat.MultiplyPoint3x4(handRot * m_VertexData[i].vertex + trans);
    }
    m_Mesh.vertices = m_Verts;
    // Lower arm rotation
    m_Mat.SetMatrix("_JoystickProjMat", mat);

I was able to use mostly similar code for all the lower and upper arm movements. The difference is in the armRot calculation, as especially the Cobra right upper arm needs to move in a rather complex way when the joystick moves around all three axes. It was much simpler to make work for the Cruiser bridge, so I decided to keep the joystick on the right arm rest for my third ship (the Pirate ship), which I haven't even started modeling yet. The Cobra cockpit shall be the only one with the joystick between the pilot's legs.

MeshPostProcessor

After adding the pilot avatar vertices and polygons to the CobraCockpitSwitches mesh, the size of the mesh grew to 11688 triangles, which I thought was too much. (I use two separate meshes for the Cobra cockpit: CobraCockpit, which contains all the straight panels and the switch legends and illumination, and CobraCockpitSwitches, which contains all the switches and such. These meshes have different materials, and the mesh with the switches had a more suitable material for the pilot avatar.)

I had already had an idea a while ago to code some sort of a postprocessor for my meshes, which would remove all the polygons that always face away from the camera, and after that also all the vertices that are no longer needed. It would be a lot of work to do all of this by hand (as I currently have over 280 separate switches in the cockpit, all of which I combine into a single mesh when exporting the object from Cinema 4D to Unity). Instead of removing these back faces by hand, I decided to look into writing a small snippet of code to do it automatically.

I found a simple example for using the Unity AssetPostprocessor. This seemed to be the place to add the code to remove the hidden polygons and vertices from my CobraCockpitSwitches object. After some coding and debugging I was able to create an extension class that does what I needed. The code I came up with is as follows:

public class MeshPostProcessor : AssetPostprocessor
{
    void OnPreprocessModel()
    {
    }

    void OnPostprocessModel(GameObject g)
    {
        Apply(g.transform);
    }

    void Apply(Transform t)
    {
        if (t.name == "CobraCockpitSwitches")
        {
            Mesh mesh = t.GetComponent<MeshFilter>().sharedMesh;
            if (mesh == null)
            {
                Debug.LogWarningFormat("Failed to get mesh for object {0}!", t.name);
                return;
            }
            int[] tris = mesh.triangles;
            Vector3[] verts = mesh.vertices;
            int tcnt = tris.Length;
            Debug.LogFormat("verts={0}, tris={1}", verts.Length, tcnt / 3);
            Vector3 cam = Quaternion.Euler(new Vector3(-18f, 0f, 0f)) * new Vector3(-0.6f, 0.5f, -0.65f); // Camera position in local coordinates
            int rcnt = 0;
            List<int> newTris = new List<int>();
            for (int i = 0; i < tcnt; i += 3)
            {
                Vector3 v = verts[tris[i]];
                // Calculate n = normal vector of this triangle
                Vector3 n = Vector3.Cross(verts[tris[i + 1]] - v, verts[tris[i + 2]] - v).normalized;
                v = cam - v;
                float m = v.magnitude;
                // Compare the vertex-to-camera vector with the triangle normal
                if (m > 0.5f && Vector3.Dot(v / m, n) < -0.05f)
                    rcnt++;  // Remove this triangle!
                else
                {
                    // This triangle should remain, so add it to the list of new triangles.
                    newTris.Add(tris[i]);
                    newTris.Add(tris[i + 1]);
                    newTris.Add(tris[i + 2]);
                }
            }
            Debug.LogFormat("Removed {0} reverse triangles", rcnt);
            mesh.triangles = newTris.ToArray();
            MeshUtility.Optimize(mesh); // Remove the extra vertices etc.
        }
    }
}

Running this code removes all hidden triangles that are over 0.5f meters = 50cm away from the camera (this limit is there so I don't remove triangles very close to the camera, which may become visible when the player rotates their head). Using the dot product limit of -0.05f above gets rid of 3728 triangles, thus the resulting mesh contains only 7960 triangles and 7581 vertices instead of the original 11688 triangles and 8616 vertices. The vertex reduction is done in the MeshUtility.Optimize(mesh) method, so I only needed to handle the triangles myself. Hardcoding the camera position (including the 18 degrees tilt angle) is not very elegant, as I need to change these values if I ever move the camera in the cockpit to a different location.

Skybox for the first mission

After working on the Cobra cockpit avatar and the mesh postprocessor, I wanted to start working on the actual missions in the game. Until now I had only had a single skybox, which did not actually fit any of the missions of the game, so I decided to start working on the skybox for the first mission. In the first mission the player's task is to clear the asteroids surrounding a star base located "in an obscure Deneb star system". Since Deneb is a blue-white supergiant, I wanted to have my skybox reflect that, and have my light color be bluish-white.

I thought it would look nice if I had a ringed planet in the scene, with the light of the distant bright blue-white star reflecting from the ice particles in the rings, and thus began working on such a scene in Cinema 4D. Within a few hours of experimenting on this, I came up with a rather nice-looking scene. The rings are just a disc primitive, with a multi-step black/white gradient in both the color and alpha channels and a high specularity for the reflections. I added subtle lights in the ring that only affect the planet surface, to give the effect of the sun illumination reflecting from the rings and illuminating the night side of the planet surface.

Above is a picture of the scene in Cinema4D using the same camera as I use for the skybox, and below is the actual rendering. I think this skybox could still use some additional nebulosity in the background, but I haven't yet gotten around to adding that. I don't want my skyboxes to become too busy, as I think space should be mostly black to give proper contrast between light and shadow. Many space games seem to treat space like daylight on Earth, with a lot of ambient light around. I have never liked that style.

New asteroid explosion system using a shader

I then began testing the first mission, shooting the asteroids around the star base. Running the game on my Samsung Galaxy S6 and at 1.4 render scale (meaning the render resolution is 1434x1434 instead of the default 1024x1024) I noticed some framerate drops whenever there were several asteroids concurrently exploding. I had been using the old C# -based asteroid explosion routine I originally created back in March and described in an old blog post. This code does a lot of work per frame in the C# code running on the CPU, and when it had several asteroids to handle, it obviously caused too much strain for the CPU to keep up the frame rate. So, I needed to come up with a faster way to split the asteroids.

I decided to do two major optimizations to my asteroid splitting routine:

  1. Pre-create an exploded asteroid mesh, so that I can simply swap the meshes when an asteroid explodes.
  2. Perform the asteroid fragment expansion in the shader instead of in the C# code.
Since I am creating my asteroids randomly using an FFD deformer over a simple sphere, creating an exploded asteroid mesh meant that I had to use this FFD-processed mesh as input, and then split it into the six different UV sections (as in my original C# splitting routine), and then have some way to move these sections and also crumble them starting from their edges.

After some experimenting I came up with a system where, using the mesh color and uv2 arrays (plus the tangent.w value), I was able to give each exploding fragment a section movement vector, a fragment movement vector, and a value that determines when the fragment separates from its section. The resulting C# code got pretty complex, as I needed to find the adjacent triangles and determine the sections they belong to (this was the easy part, as the UV coordinate range determines this), add a new vertex and create three new triangles for each existing asteroid shell triangle, and generate new normals, tangents, and the uv2 and color vectors as well. Instead of describing the C# code, it may be easier to understand this new system using the shader as an example. Here is the vertex shader, which does most of the actual work:

    float4 _ShieldSphere;  // Local position and radius of a possible ship shield sphere
    float  _CurrentPos;    // Phase (0.0 .. 1.0) of the explosion

    v2f vert (VertexData v)
    {
        v2f o;
        UNITY_SETUP_INSTANCE_ID(v);
        //UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
        // Adjust the vertex position by the vertex movement vector
        // First calculate the movement as a whole section, using color.xyz
        // Blast radius is 100 meters from the original surface (so smaller asteroids grow bigger)
        float3 pos = v.position.xyz + (v.color.xyz * 2 - 1) * 100 * _CurrentPos;
        if (_CurrentPos - v.color.a > 0)
        {
            // Add movement of the crumbling part
            pos += float3(v.uv2.x, v.uv2.y, v.tangent.w) * (_CurrentPos - v.color.a);
        }
        o.position = UnityObjectToClipPos(pos);
        o.uv = v.uv;
        //TANGENT_SPACE_ROTATION;
        float3x3 rotation = float3x3( v.tangent.xyz, cross(v.tangent.xyz, v.normal), v.normal );
        o.lightDirection = mul(rotation, _ObjLightDir);
        o.localPos = float4(pos - _ShieldSphere.xyz, _ShieldSphere.w * _ShieldSphere.w);
        return o;
    }

First there are two uniform variables, which are set up from the C# code. _ShieldSphere determines the local position and radius of the energy shield of the closest ship. This is so that the asteroid fragments do not penetrate the ship's shields if the ship that shot this asteroid (quite possibly the player's ship) flies through the cloud of the exploded asteroid fragments. The _CurrentPos uniform variable is simply the phase of the explosion, 0 meaning the explosion is just starting and 1 is the fully exploded asteroid with all fragments as far away as they will go (and also as small as they will go, as the fragments shrink as they fly away from the center).

The actual vertex shader begins by calculating the position of the input vertex. This is based on the original vertex position v.position.xyz which is then adjusted by 100 times the current explosion phase times the segment movement vector v.color.xyz. Since the color vector range is 0..1, I multiply it by 2 and shift it down by 1, to get a range of -1 .. 1 for the movement vector. There are six movement vectors for the six separate UV sections of the asteroid, so the asteroid always splits into six major parts.

Next, v.color.a is checked against the current explosion phase, to determine whether it is time for this vertex to crumble away from the main section. If it is, the vertex position is further adjusted by the vector (v.uv2.x, v.uv2.y, v.tangent.w) multiplied by the fraction of time that has passed since this vertex crumbled away from the main section. I could use v.tangent.w for this purpose after I realized that all the tangent vectors in my asteroid had a v.tangent.w value of -1. This meant that instead of using the original cross(v.normal, v.tangent.xyz) * v.tangent.w formula in the TANGENT_SPACE_ROTATION calculations, I could simplify it to just cross(v.tangent.xyz, v.normal), giving the exact same result and leaving v.tangent.w free to be used as the third component of the fragment vertex movement vector. Otherwise I would have had to use something like uv3.x for this, spending additional memory and time during the draw calls.

The rest of the code just does the normal work of a vertex shader, calculating the screen position of the vertex and the tangent-space light direction. Finally, the position relative to the shield sphere origin and the squared shield radius are also calculated and sent to the fragment shader. The fragment shader is rather simple; the only interesting bit is the check for whether the fragment is within the shield sphere, in which case the fragment is removed (clipped):

    fixed4 frag (v2f i) : SV_Target
    {
        //UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
        // Clip fragments within the closest ship shield radius
        if (dot(i.localPos.xyz, i.localPos.xyz) < i.localPos.w)
        {
            clip(-1.0);
            return 1.0;
        }
        fixed4 tex = tex2D(_MainTex, i.uv);
        fixed3 tangentSpaceNormal = tex.rgb * 2 - 1;
        fixed4 col = tex.a * (DotClamped(i.lightDirection, tangentSpaceNormal) * 3 * _LightColor0);
        return col;
    }
The resulting shaders became reasonably performant, with the vertex shader performance as follows:
  7 work registers used, 8 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   23      19      0       A
  Shortest Path Cycles:   11.5    19      0       L/S
  Longest Path Cycles:    11.5    19      0       L/S
And the fragment shader like this:
  2 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   6       3       1       A
  Shortest Path Cycles:   1       1       0       A, L/S
  Longest Path Cycles:    2.5     3       1       L/S
Testing this new system in Gear VR using my Samsung Galaxy S6 caused no framerate drops even with several asteroids in the explosion phase simultaneously. The exploded asteroids have 5860 vertices and 2016 triangles, so they are rather complex meshes and having several visible at one time can still cause slowdowns, but normally in my scenes there should not be very many exploded asteroids visible at any one time.

Story narration and voice acting

At the beginning of June I decided to tackle the big issue where I needed help from other people: story narration and voice acting. In my original LineWars II game I had a story (created by Raymond Bingham, who had done some work for Epic Megagames / Safari Software at the time) which was shown as a text scroller before each mission began. Such text scrollers do not work very well in virtual reality games, so I wanted to have voice narration instead in LineWars VR. But how could I find a native English speaker willing to do the narration?

I had joined the freesound.org site a few years ago, when I was looking for some sounds for one of my other projects. I decided to go back to that site and see if I could find someone who would be willing to help me out. I checked some voice samples from various people, and then found the page by TheScarlettWitch89. She had a pleasant young voice, had only been a member for a little while (which meant she was probably still active on the site), and mentioned being open to requests, so I decided to send her a private message and ask whether she would be interested in lending her voice as the story narrator of my game. She responded quickly and was interested in this project, which was great! I spent a couple of days coming up with the actual story narration (I had to shorten the textual story somewhat, as I believe the players are not interested in hearing long-winded narrations) and sent her the texts. I just recently received the actual narrations from her, which do sound very nice and should fit my game perfectly. My big thanks again to TheScarlettWitch89 for generously helping me with this project!

Okay, now the story narration was handled, but I also had some textual battle chatter (and friendly fire warnings) in my original LineWars II game. These too would be nice to get converted to actual human voices. I again checked freesound.org, and found some nice fighter pilot battle chatter lines by AlienXXX and shadoWisp from back in 2015. These had many usable lines, but as I would like to have the call signs like "Cobra One" included in the messages, I decided to post on the sample request forum of the freesound.org site. I also sent both AlienXXX and shadoWisp my thanks, along with a query about whether they would be interested in doing some voice acting specifically for my game. AlienXXX responded and was willing to help me out (thanks again AlienXXX!), but I haven't heard back from shadoWisp. This is not all that surprising, as she seems to have not been active on the site since 2015.

After a few days with no replies to my forum post, and encouraged by the responses I had gotten to the two private messages I had sent, I decided to start contacting people directly via private messages. I searched for people who had done something similar before, had been active recently, and had mentioned being available for requests on their home pages. I have up to nine friendly fighters (and a few cruisers) in my missions, so I would need around ten different voices for the battle chatter. I can use the shadoWisp samples for one voice if necessary, but I still needed around ten other voices.

Somewhat to my surprise, most of the people I had sent private messages to responded and were willing to donate their voices to my game. At the moment the following people have generously agreed to help me out; some of them have already sent me their samples. There are even some professional voice actors in the mix willing to help me out, which is pretty amazing! Thank you again to all of you!

Some of the people above were even willing to help me out in other ways: AlienXXX (César Rodrigues) offered to provide music for my game, and EFlexTheSoundDesigner (Evan Boyerman) was willing to work as the sound designer and has already provided me with some very nice sounding "walkie talkie" radio effects on his battle chatter samples, in addition to good voice acting.

Energy shield shader

Okay, now the status of the voice stuff began to look good, so I wanted to get back to programming the game. The next major feature I wanted to add was the energy shield around the ships. In my original LineWars II the ships flashed white when they got hit, but I wanted to have a proper spherical energy shield around the ships in LineWars VR. I was not exactly sure what kind of an effect I wanted, so I made a Google image search for "spaceship energy shield". One of the first hits was from a blog post called Screen Space Distortion and a Sci-fi Shield Effect by Kyle Halladay. This looked pretty neat, so I read through his blog post, and noticed that he had generously shared the source code for the effect. His effect used some screen space distortion, which I did not think was very relevant for what I was after, but the actual energy shield sphere was very useful.

His energy shield shader could handle up to 16 simultaneous hits, with four 4x4 matrices handling the required input variables. I decided that four simultaneous hits are plenty for my uses, so I simplified the shader to only use one 4x4 matrix. His shader also used a plasma texture to animate the hit effects; I decided to go with just a simple shock wave instead. He also had a neat Fresnel effect in his shader, which I decided to copy for my shader as well.

Here is the actual shader code I use in my version of the shield shader. The vertex shader does nothing particularly interesting, besides sending the maximum intensity of all the four shield hits to the fragment shader in oNormal.w value. This is used to handle the shield edge Fresnel effect.

    float4x4 _Positions;    // 4 separate shield hit positions in local coordinates
    float4   _Intensities;  // 4 separate shield hit intensity values
    float4   _Radius;       // 4 separate shield hit shock wave radiuses
    half4    _ShieldColor;  // Color = health of the shield (blue, yellow, red)

    v2f vert (appdata v)
    {
        v2f o;
        o.vertex = UnityObjectToClipPos(v.vertex);
        o.oPos = v.vertex.xyz;
        // Put the biggest hit intensity into oNormal.w
        o.oNormal = float4(v.normal, max(_Intensities.x,max(_Intensities.y,max(_Intensities.z,_Intensities.w))));
        o.oViewDir = normalize(_ObjectCameraPos - v.vertex.xyz);
        return o;
    }

    // This subroutine calculates the intensity of the fragment using all the hit positions.
    float calcIntensity(float3 oPos)
    {			
        float3 impact0 = (_Positions[0].xyz - oPos);
        float3 impact1 = (_Positions[1].xyz - oPos);
        float3 impact2 = (_Positions[2].xyz - oPos);
        float3 impact3 = (_Positions[3].xyz - oPos);

        float4 sqrLens = float4(dot(impact0,impact0),	// Values between 0*0 and 2*2 = 0..4
                                dot(impact1,impact1), 
                                dot(impact2,impact2), 
                                dot(impact3,impact3));
				 
        float4 cmpRad = sqrLens < _Radius ? cos(5 * (sqrLens - _Radius)) : 0;
        float4 add = cmpRad * _Intensities;
        return add.x + add.y + add.z + add.w;
    }

    fixed4 frag (v2f i) : SV_Target
    {
        float ang = 1 - (abs(dot(i.oViewDir, i.oNormal.xyz))); // Fresnel effect, shield edges show up brighter than the center
        return (ang * i.oNormal.w + calcIntensity(i.oPos)) * _ShieldColor;
    }
The interesting stuff happens in the calcIntensity subroutine. It first separates the four hit positions from the 4x4 uniform matrix _Positions and calculates the impact positions relative to the current fragment. Then it prepares the sqrLens vector, which contains the four squared distances of these impact positions. Next, the cmpRad vector is calculated; it contains the fragment intensity relative to the squared distance. I am using the cos trigonometric function for this, so that the blast front (or shock wave) has the largest intensity where sqrLens equals _Radius (as cos(0) = 1), the intensity follows a cosine curve when the squared distance to the hit origin is less than _Radius, and it is zero when the distance is greater. The multiplier 5 is just a value I experimentally determined to make the shield hit look nice.

Then cmpRad is multiplied by the _Intensities vector, as each of the four hits has its own intensity decay value. These four intensities are then added up to get the final fragment intensity value. In the fragment shader, the Fresnel effect is calculated first (using the shield hemisphere normal vector and a vector towards the camera) and multiplied by the maximum intensity of the currently active shield hits. The summed-up intensity of the hit shock waves is then added, and finally the result is multiplied by the shield color, which shows the ship's shield health (blue = healthy, yellow = half health, red = about to collapse).
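
On the C# side, the shield color could be driven by something as simple as the following sketch. The health value (1 = full shields, 0 = about to collapse) and the exact color ramp are my own assumptions; the game may well use a different mapping.

    // Illustrative only: map shield health (1 = full, 0 = about to collapse)
    // to the _ShieldColor uniform, fading blue -> yellow -> red.
    void UpdateShieldColor(float health)
    {
        Color shieldColor = health > 0.5f
            ? Color.Lerp(Color.yellow, Color.blue, (health - 0.5f) * 2f)
            : Color.Lerp(Color.red, Color.yellow, health * 2f);
        m_Material.SetColor("_ShieldColor", shieldColor);
    }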

Of course, I was interested to see what the Mali Offline Compiler thinks the performance of these shaders is, so I ran them through the compiler and got the following results (the vertex shader first, then the fragment shader). Not too bad, considering the shield can handle four separate hits, and that the shields are not visible all the time.

  8 work registers used, 8 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   17      15      0       A
  Shortest Path Cycles:   10.5    15      0       L/S
  Longest Path Cycles:    10.5    15      0       L/S
  4 work registers used, 6 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   15      3       0       A
  Shortest Path Cycles:   7       3       0       A
  Longest Path Cycles:    7       3       0       A

Making the effect work obviously needs some C# code to actually send those uniform values to the shader. This is handled by a couple of routines: first a static ShieldHit routine, which checks the current intensity vector values and chooses the slot with the lowest intensity (that is, the oldest hit). It then sets up the new uniform values like this:

    script.matrix.SetRow(slot, localHitPos);
    script.intensity[slot] = 1f;
    script.radius[slot] = 0f;
    mat.SetMatrix("_Positions", script.matrix);
That is, the new local shield hit position is set to the correct row of the matrix, the intensity of that slot is set to one, and the radius to zero. The position matrix is sent to the shader in this routine, as it does not change every frame. Then, in the Update routine of the actual shield object, I handle the radius increase and the intensity decrease using a frameCount variable, whose value is currently 90 (so the shield flash lasts 1.5 seconds). First, I check if all the intensities are zero, in which case I can hide the whole shield.
    void Update () {
        if (intensity == Vector4.zero)
            gameObject.SetActive(false);
        else
        {
            intensity = new Vector4(Mathf.Clamp(intensity.x - 1f / frameCount, 0f, 1f),
                                    Mathf.Clamp(intensity.y - 1f / frameCount, 0f, 1f),
                                    Mathf.Clamp(intensity.z - 1f / frameCount, 0f, 1f),
                                    Mathf.Clamp(intensity.w - 1f / frameCount, 0f, 1f));
            radius = new Vector4(radius.x + 1f / frameCount, radius.y + 1f / frameCount, radius.z + 1f / frameCount, radius.w + 1f / frameCount);
            m_Material.SetVector("_Intensities", intensity);
            m_Material.SetVector("_Radius", radius);
        }
    }
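
The slot selection at the start of ShieldHit is not shown above; a rough sketch of it could look like the following. The method signature and the ShieldScript type are my own assumptions, but the field names match the snippet earlier.

    public static void ShieldHit(ShieldScript script, Material mat, Vector3 localHitPos)
    {
        // Reuse the slot with the lowest remaining intensity, i.e. the oldest hit
        int slot = 0;
        for (int i = 1; i < 4; i++)
            if (script.intensity[i] < script.intensity[slot])
                slot = i;
        script.matrix.SetRow(slot, localHitPos);
        script.intensity[slot] = 1f;
        script.radius[slot] = 0f;
        mat.SetMatrix("_Positions", script.matrix);
        script.gameObject.SetActive(true);  // presumably the shield object is also re-activated here
    }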

Here below is a quick animated GIF showing a single laser blast hitting the rear shield of a ship. The Fresnel effect shows the hemisphere shape of the shield, while the hit itself generates a slowly dissipating shock wave.

That's about as far as I have gotten during the last month or so. I am currently working on the texturing of the Cobra and Pirate ships (the current Cobra texture can be seen in the above GIF; it has some nice features already, but could use some more work). I am keeping all the Cobra and Pirate ships very low-poly (the absolute top limit is 300 vertices, as I want to take advantage of Unity's dynamic batching with these), so I can't make them very complex shape-wise. I do use normal mapping in the textures, though, so I can add features using surface normals.

Thank you for your interest in LineWars VR!

May 20th, 2018 - Progress Report

Cobra Cockpit shadows switched from Cube Map to procedural shadows

By the end of the last blog post I had figured out a way to combine my cockpit texture shadow maps to a single Cube Map. However, just the next day, when trying to improve the Cobra cockpit shadow maps, I realized that the Cube Map just isn't very suitable for my non-rectangular cockpit shape. I decided to check whether I could use procedural shadow planes instead of a Cube Map to handle the sun shining through cockpit windows. I had already implemented something similar for the cruiser shadows, but this time I would need a rather complex window shape, which was not even aligned with any coordinate axis.

I spent some time thinking about the problem, did some experiments, and noticed that the Cobra cockpit windows could actually be handled by three planes whose borders would align with the object coordinates, if I pitched the whole cockpit object 18 degrees up. This would make the instrument panel vertical, and the side window bottom edges nearly horizontal. Since only the instrument panel edges were located where I did not want to move them, I could move the other window borders freely to make the side window top and bottom edges exactly horizontal, and also make the rear edges of the windows vertical. This would make it possible to check whether a fragment is in shadow or light with simple checks for the intersection point Y or Z coordinate being above or below a given limit. In the image below is the CobraCockpit mesh in Cinema4D, with the windows (and the center bar of the front window) shown as yellow polygons. The Left view shows nicely how I was able to have the side windows aligned to the local coordinate system, even though in the Perspective view they do not look to be aligned with any axis.

Next, I just needed a suitable algorithm for a ray-plane intersection. I had only used coordinate-aligned planes before this, so I was not sure how much more difficult handling such an arbitrarily-oriented plane would be. Luckily, it turned out that an algebraic method for Ray-Plane Intersection is pretty simple. I just need to calculate the t term for the plane, after which the ray-plane intersection for a ray starting at point P and going towards vector V is simply P + tV. The term is calculated as t = -(P·N + d) / (V·N), where N is the plane normal; both N and V stay constant throughout the frame, so only the point P varies. The value d is constant for the plane (it is -P·N for any point P on the plane) and can be precalculated. I found a nice Vector Dot Product Calculator on the net, which allowed me to just input the coordinates from my Cinema 4D cockpit object and get the d term for my shadow planes as output.
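
In code form the whole thing is only a few lines. Here is a small C# sketch of the idea (my own illustration, not the actual shader or C# code in the game):

    // Ray-plane intersection: the ray starts at P and goes towards direction V.
    // N is the plane normal and d the precalculated plane constant (-P0.N for any point P0 on the plane).
    // A negative t means the plane is behind the ray start point.
    static Vector3 RayPlaneIntersection(Vector3 P, Vector3 V, Vector3 N, float d, out float t)
    {
        t = -(Vector3.Dot(P, N) + d) / Vector3.Dot(V, N);
        return P + t * V;
    }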

So, now it was a matter of implementing the code into my Cobra cockpit shader, and into my C# code that previously calculated the needed sun ray parameters for the Cube Map algorithm. I decided to use four shadow planes: Front window, left side window, right side window, and the window strut in the middle of the front window. I originally had the center bar/strut horizontal, but after some tests I realized that I could get nicer shadows if I were to use two slightly slanted shadow planes, depending on which side of the cockpit the sun shines from. So, in my C# code I had the plane normals set up in the Start method:

    m_N1 = new Vector3(0, -0.95f, -0.3125f);              // Front Plane normal
    m_N2 = new Vector3(-0.51307f, -0.83868f, -0.18270f);  // Left Plane normal
    m_N3 = new Vector3(0.51307f, -0.83868f, -0.18270f);   // Right Plane normal
    m_N4m = new Vector3(-0.44702f, -0.87392f, -0.19088f); // Center bar normal
    m_N4p = new Vector3(0.44702f, -0.87392f, -0.19088f);  // Center bar normal

In the Update method I then needed to pass the current light direction (in object coordinate system), along with the denominators for the plane t terms (meaning the result of V.N for each of the shadow planes). I have a uniform float4 variable _ShadowPlaneMultipliers in my shader, and pass these denominators inverted so I can just multiply instead of divide them in the shader. I'm not sure if converting divisions to multiply operations in the shader makes the shader run any faster, but at least it should not be any slower. Thus, this is what the Update part of my C# code does per each frame (the Movement.SunDirection stores the static sun direction of the scene in world coordinates):

    Vector3 lightDir = Vector3.Normalize(transform.InverseTransformDirection(Movement.SunDirection));
    m_Material.SetVector("_ShadowsLightDir", lightDir);
    // Setup the shadow plane inverse multipliers
    float d1 = Vector3.Dot(lightDir, m_N1);
    float d2 = Vector3.Dot(lightDir, m_N2);
    float d3 = Vector3.Dot(lightDir, m_N3);
    Vector4 mult = new Vector4(1f / d1,  1f / d2,  1f / d3, 1f / Vector3.Dot(lightDir, lightDir.x < 0 ? m_N4p : m_N4m));
    m_Material.SetVector("_ShadowPlaneMultipliers", mult);

Now then, what do the shaders look like? Taking the vertex shader first, it is responsible for calculating the ray-plane intersections with all four planes. The ray starts from the vertex position in object local coordinates (v.vertex.xyz) and goes towards the light direction (also in object local coordinates) _ShadowsLightDir. The calculations again need the plane normals, the constant d terms of the ray-plane intersection equations, and the denominators, which we get from the uniform _ShadowPlaneMultipliers sent by the C# code. I could have used uniforms to store data like the normal vectors, but I noticed using the Mali Offline Compiler that giving the values "inline" within the code is faster. The compiler is smart enough to only use the dimensions of the vectors that I actually need (for example, it does not calculate the y component of the P1 vector at all, because I don't use it in the shadowData1 or shadowData2 output interpolators, and for P4 it only calculates the x component).

    // ----- Shadow plane calculations -----
    half3 N1 = half3(0,-0.95,-0.3125);              // Front Plane normal
    half3 N2 = half3(-0.51307, -0.83868, -0.18270); // Left Plane normal
    half3 N3 = half3(0.51307, -0.83868, -0.18270);  // Right Plane normal
    half3 N4 = half3(_ShadowsLightDir.x < 0 ? 0.44702 : -0.44702, -0.87392, -0.19088); // Center bar normal
    float t1 = -(dot(v.vertex.xyz, N1) + 0.4302) * _ShadowPlaneMultipliers.x;
    float t2 = -(dot(v.vertex.xyz, N2) + 0.8023) * _ShadowPlaneMultipliers.y;
    float t3 = -(dot(v.vertex.xyz, N3) + 0.8023) * _ShadowPlaneMultipliers.z;
    float t4 = -abs(dot(v.vertex.xyz, N4) + 0.4689) * _ShadowPlaneMultipliers.w; // Handle vertices on the "wrong" side of the plane using abs()
    half3 P1 = v.vertex.xyz + t1 * _ShadowsLightDir;
    half3 P2 = v.vertex.xyz + t2 * _ShadowsLightDir;
    half3 P3 = v.vertex.xyz + t3 * _ShadowsLightDir;
    half3 P4 = v.vertex.xyz + t4 * _ShadowsLightDir;
    o.shadowData1 = half4(t1 < 0 ? 100 : P1.x, t2 < 0 ? 100 : P2.y, t1 < 0 ? 100 : P1.z, t2 < 0 ? 100 : P2.z);
    o.shadowData2 = half3(t4 < 0 ? 100 : P4.x, t3 < 0 ? 100 : P3.y, t3 < 0 ? 100 : P3.z);

If the t term of the equation is negative, it means the sun is shining from the same side of the plane as where the vertex is. This means the vertex will be in shadow, and thus I give a large value of 100 to the output interpolator in this situation. This works fine for polygons whose vertices are always on the same side of the plane. However, the slanted center bar has some polygons with vertices on opposite sides of the plane, so I needed to use the absolute value of the dot product to mirror the vertices to the same side of the plane. If I didn't do that, a polygon may have one vertex with the interpolator value of 100 and another vertex with some negative value, which would cause an invalid shadow strip to appear within the polygon. To make sure there are no cockpit vertices that are located in the actual shadow plane, I adjusted the d terms (0.4302, 0.8023 and 0.4689) slightly from their actual values, to put the shadow plane slightly outside of the window holes in the mesh.

Then, in the fragment shader, it is rather easy to check whether the interpolated ray-plane intersection position is within the window area. As I had rotated my cockpit so that all the windows had their edges axis-aligned, I could just check whether the interpolated intersection location falls within the window coordinates. For example, the center bar is 10 cm wide and located in the middle of the cockpit, so I can simply check whether the x coordinate of the corresponding interpolator (P4.x in the vertex shader, sent as shadowData2.x to the fragment shader) falls between -0.05 and 0.05. If it does, this fragment is in a shadow caused by the center bar, so I can return the shadow color value from the fragment shader. The instrument panel is located at z coordinate 14.581 cm (or 0.14581 meters) and is used as one edge for both the front window and the side windows. The intersection between the side windows and the front window is somewhat irrelevant, as the fragment gets sunlight whether the sun shines from the front window or the side window. Thus, I just used a width of -0.85 to 0.85 meters for the front window, even though this width overlaps the side windows somewhat.

    // Handle light shining in from windows
    if (i.shadowData2.x > -0.05 && i.shadowData2.x < 0.05) // Center bar shadow
        return shadowColor;
    if (i.shadowData1.x > -0.85 && i.shadowData1.x < 0.85 && i.shadowData1.z > -0.51307 && i.shadowData1.z < 0.14581)
        return lightColor; 
    if (i.shadowData1.y > 0.1838 && i.shadowData1.y < 0.62579 && i.shadowData1.w > -1 && i.shadowData1.w < 0.14581)
        return lightColor;
    if (i.shadowData2.y > 0.1838 && i.shadowData2.y < 0.62579 && i.shadowData2.z > -1 && i.shadowData2.z < 0.14581)
        return lightColor;
    // We are in shadow
    return shadowColor;

So, the interesting question now is: how did the performance of the shaders change after switching from the (not properly working) Cube Map shadows to this procedural shadow system? Let's run them through the Mali Offline Compiler and check what it says. The original vertex shader took 10 arithmetic cycles and 22 load/store cycles, so its performance is bound by the load/store operations:

  7 work registers used, 7 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   20      22      0       L/S
  Shortest Path Cycles:   10      22      0       L/S
  Longest Path Cycles:    10      22      0       L/S
After adding all the code to handle the shadow planes, the arithmetic instructions nearly doubled! However, since the shader is bound by the load/store performance (which only increased by two cycles), the actual performance degradation is not very significant.
  8 work registers used, 8 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   35      24      0       A
  Shortest Path Cycles:   17.5    24      0       L/S
  Longest Path Cycles:    17.5    24      0       L/S
The original fragment shader (using the Cube Map shadow system) had this kind of performance:
  3 work registers used, 3 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   23      5       2       A
  Shortest Path Cycles:   3       2       1       A
  Longest Path Cycles:    9       5       2       A
The longest path had 9 arithmetic operations, 5 load/store operations and two texture lookups. After switching to the procedural shadow planes, the performance characteristics changed to the following:
  4 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   23      5       1       A
  Shortest Path Cycles:   3       4       1       L/S
  Longest Path Cycles:    7       5       1       A
The instruction counts actually remained the same, but the arithmetic cycles decreased by two, and I got rid of one of the texture lookups (the Cube Map itself). This was very nice, I was able to fix the shadows to be correct, and make the code run faster at the same time!

Instruments combined to a single mesh

Next, I continued my work on the Cruiser Bridge model. Pretty soon I realized that it would make more sense to have common code for the cockpit instruments of all three flyable ship types. At that point the Cobra cockpit instruments were partly embedded in the Cobra cockpit mesh and partly in a separate object, and both of these had their own code to handle the instrument updating. So, I started moving all the instruments into the separate object for the Cobra cockpit and created a new instruments object for the Cruiser bridge. These objects work kind of like skins for the instruments, so each cockpit has its own skin, but the underlying C# code is the same. After a day of working on this I had the system up and running for both the Cobra and the Cruiser cockpits.

Cruiser Bridge shadows switched from Cube Map to procedural

After I got the Cobra cockpit shadows improved using procedural shadows instead of the Cube Map system, I wanted to make the same change to the Cruiser bridge object as well. However, the problem here was that the cruiser bridge actually has 19 separate rectangular windows! The front and ceiling windows are nicely oriented along the coordinate axes, so those would be easy to handle using the Cube Map; only the slanted side windows were a problem with the Cube Map system. At first, I tried to create a sort of hybrid system, where the front and ceiling windows were handled by the Cube Map and the side windows procedurally, but it soon became evident that this hybrid system would just combine the worst aspects of both systems. Since the Cube Map could not handle all the windows, I had to switch completely to the procedural system.

I used much the same system as with the Cobra cockpit shadow planes, except that the front and ceiling shadow planes were somewhat simpler, as they are axis-aligned. Thus, the vertex shader for the Cruiser bridge turned out slightly simpler. Here shadowData1 contains the interpolators for the front and ceiling windows, and shadowData2 the interpolators for the left and right windows.

    // Shadow planes
    float2 dist = (v.vertex.yz - float2(1.601, 7.101)) / _ShadowsLightDir.yz;	// Y and Z plane distances
    float4 ip = v.vertex.xzxy - _ShadowsLightDir.xzxy * dist.xxyy;
    o.shadowData1 = float4(dist.x > 0 ? 100 : ip.x, ip.y, dist.y > 0 ? 100 : ip.z, ip.w);
    half3 Nr = half3(-0.86824, 0, -0.49614);
    float tr = -(dot(v.vertex.xyz, Nr) + 6.1778) / dot(_ShadowsLightDir, Nr);
    half3 Pr = v.vertex.xyz + tr * _ShadowsLightDir;
    half3 Nl = half3(0.86824, 0, -0.49614);
    float tl = -(dot(v.vertex.xyz, Nl) + 6.1778) / dot(_ShadowsLightDir, Nl);
    half3 Pl = v.vertex.xyz + tl * _ShadowsLightDir;
    o.shadowData2 = float4(tl < 0 ? 100 : Pl.z, Pl.y, tr < 0 ? 100 : Pr.z, Pr.y);

The fragment shader however became pretty complex, as there are so many separate windows. I tried to order the if clauses in a way that the total number of clauses for the longest path would stay as low as possible. The main if clauses check the overall window areas, and the sub-clauses then exclude the window struts within these areas.

    if (i.shadowData1.x > -2.94 && i.shadowData1.x < 2.94 && i.shadowData1.y > 2.56 && i.shadowData1.y < 6.94)
    {
        // Sun shines through the ceiling windows.
        if (i.shadowData1.y < 3.94 && i.shadowData1.x < 1.44 && i.shadowData1.x > -1.44)
            return sunColor;
        if (i.shadowData1.y < 4.06 || (i.shadowData1.y > 5.44 && i.shadowData1.y < 5.56))
            return shadowColor;
        if ((i.shadowData1.x > 1.44 && i.shadowData1.x < 1.56) || (i.shadowData1.x > -1.56 && i.shadowData1.x < -1.44))
            return shadowColor;
        return sunColor;
    }
    if (i.shadowData1.z > -2.94 && i.shadowData1.z < 2.94 && i.shadowData1.w > -1.44 && i.shadowData1.w < 1.44)
    {
        // Sun shines through the front windows.
        half fX = abs(i.shadowData1.z);
        if (abs(i.shadowData1.w) < 0.06 || ( fX > 1.44 && fX < 1.56))
            return shadowColor;
        return sunColor;
    }
    if (i.shadowData2.x > 2.60171 && i.shadowData2.x < 6.99752 && i.shadowData2.y > 0.06 && i.shadowData2.y < 1.44)
    {
        // Sun shines through the left windows.
        if ((i.shadowData2.x > 5.49752 && i.shadowData2.x < 5.60171) || (i.shadowData2.x > 3.99752 && i.shadowData2.x < 4.10171))
            return shadowColor;
        return sunColor;
    }
    if (i.shadowData2.z > 2.60171 && i.shadowData2.z < 6.99752 && i.shadowData2.w > 0.06 && i.shadowData2.w < 1.44)
    {
        // Sun shines through the right windows.
        if ((i.shadowData2.z > 5.49752 && i.shadowData2.z < 5.60171) || (i.shadowData2.z > 3.99752 && i.shadowData2.z < 4.10171))
            return shadowColor;
        return sunColor;
    }
    return shadowColor;

The resulting performance by the Mali Offline Compiler turned out to be as follows, first the vertex shader:

  8 work registers used, 8 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   32      23      0       A
  Shortest Path Cycles:   17.5    23      0       L/S
  Longest Path Cycles:    17.5    23      0       L/S
And then the fragment shader:
  4 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   52      6       1       A
  Shortest Path Cycles:   1       1       1       A, L/S, T
  Longest Path Cycles:    8.5     6       1       A
The vertex shader performance is pretty similar to that of the Cobra cockpit vertex shader, but the fragment shader is somewhat slower due to all those if clauses. Even this performance is still slightly better than what the performance was using the Cube Map, so I was pretty happy with the end result.

Cobra cockpit Android texture problem

After I had changed both of these shadow systems, and everything worked fine within the Unity editor, I finally tested the result on the actual Gear VR hardware using my Android phone. The Cruiser bridge worked fine, but the Cobra cockpit had a strange issue where some of the cockpit panels were either black or white depending on the orientation, instead of using the shadowColor or lightColor as they should have! I again had no idea what could cause this, but as I had already encountered something similar before, I at least had some ideas as to how to debug the problem.

I began debugging this problem by changing the order of the interpolators in the structure and got various other weird effects with the textures. With a certain order of the interpolators the textures were "swimming", with another order I got the black and white problem, but I could not find an order that would be without issues. After some more experimenting I finally noticed that if I made all the interpolators have the same number of dimensions, the problem vanished. Originally, I had declared my data structure like this:

struct v2f
{
    float4 vertex : SV_POSITION;
    float2 uv : TEXCOORD0;
    float3 normal: TEXCOORD1;
    float4 channel: TEXCOORD2;
    float3 cameraDir: TEXCOORD3;
    float4 shadowData1: TEXCOORD4;
    float3 shadowData2: TEXCOORD5;
};
When I then remapped all the shadowData2 interpolators into the extra dimensions of the other structure members, I got rid of the weird texture problem. All the interpolators now have the same 4 dimensions, and the structure looks like this (with uv.z, uv.w and normal.w containing the old shadowData2 interpolators):
struct v2f
{
    float4 vertex : SV_POSITION;
    float4 uv : TEXCOORD0;
    float4 normal: TEXCOORD1;
    float4 channel: TEXCOORD2;
    float4 cameraDir: TEXCOORD3;
    float4 shadowData1: TEXCOORD4;
};

Pilot avatar and Cruiser bridge modeling and texturing

Now that I had the shadows working properly, it was time to continue modeling the Cruiser bridge, and at the same time the pilot avatar. There was quite a bit of work involved in both the modeling and texturing, so even after working on this for a couple of weeks, it is still far from finished. Again, texturing is the most time-consuming part of the process.

When working on the instrumentation of the Cruiser bridge, I decided that the captain should have MFD display panels at the ends of the arm rests. These will contain the ship status and damage displays on the left panel, and the communications stuff on the right panel. They correspond to the leftmost and the third display panel in the Cobra cockpit. The Cobra cockpit shows the 3D radar display on the second display panel, but in the Cruiser bridge I decided to have a large holo projector instead. This holo projector will be on the floor level in front of the captain, while the captain's chair is on an elevated level with unrestricted views through the windows.

Below and in front of the captain are the stations for the weapons and communications officers (or some such), which should also have some human characters sitting at them. Both of those stations are still only at a placeholder level; no actual work has been done yet on properly modeling or texturing them.

Vertex color as light intensity

While working on the preliminary texturing of the Cruiser bridge model, I realized that in several locations I would like to have some additional lights shining on the surface. The Cruiser bridge (same as the Cobra cockpit) should be rather dark when the sun is not shining through the windows, but there should still be some modest lighting. As I am trying to fit all the textures into a rather limited texture atlas, I could not bake all the lighting into the textures either. How to solve this issue?

I remembered reading from the Unity documentation that if a mesh does not have Vertex Color information, Unity will automatically generate an array of full white vertex colors. I thought that rather than wasting memory on this default array, I could actually use the vertex colors for something useful, and store information about how much light each vertex of the object is receiving. Then it was just a matter of writing C# code that calculates the received light from whatever light emitters I decide to have in my objects. Here is an example image of a stairwell in the Cruiser bridge, where a light in a wall is illuminating each step.
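
The baking step itself can be quite simple. Below is a rough C# sketch of the idea, my own illustration rather than the actual code: it loops over the vertices, accumulates a Lambert term with a linear distance falloff for each light emitter, and writes the result into the vertex colors. The light positions and ranges are assumed to be given in the object's local coordinates.

    using UnityEngine;

    public static class VertexLightBaker
    {
        // Bake a per-vertex light intensity into the mesh vertex colors.
        public static void Bake(Mesh mesh, Vector3[] lightPositions, float[] lightRanges)
        {
            Vector3[] verts = mesh.vertices;
            Vector3[] normals = mesh.normals;
            Color[] colors = new Color[verts.Length];
            for (int i = 0; i < verts.Length; i++)
            {
                float intensity = 0f;
                for (int j = 0; j < lightPositions.Length; j++)
                {
                    Vector3 toLight = lightPositions[j] - verts[i];
                    float dist = toLight.magnitude;
                    if (dist < lightRanges[j])
                    {
                        // Simple Lambert term with a linear distance falloff
                        float ndotl = Mathf.Max(0f, Vector3.Dot(normals[i], toLight / dist));
                        intensity += ndotl * (1f - dist / lightRanges[j]);
                    }
                }
                intensity = Mathf.Clamp01(intensity);
                colors[i] = new Color(intensity, intensity, intensity, 1f);
            }
            mesh.colors = colors;
        }
    }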

Pilot avatar arm movements

After some time working on the bridge and the pilot avatar, I decided it was time to tackle the difficult issue, making the pilot arms move. I wanted to have the left throttle hand follow the actual throttle control input, and similarly the right hand to follow the joystick input. I thought about having the pilot's legs move on the pedals for the yaw input (similarly to how airplane controls work), but decided to have yaw control also move the joystick hand, to keep the number of moving objects as low as possible.

I started with the throttle hand, and simply made it rotate around its pivot point, which was inside the arm rest. The rotation worked, but I noticed that the procedural shadows were not correct. I realized that simply moving the object would not work, as I used the same procedural shadows for the throttle hand object as I used for the bridge object. The procedural shadows use hardcoded distances from the object origin to the shadow planes, so the object origin cannot move or rotate, or the shadows will be incorrect. But I want to move and rotate the pilot's arms! How can I solve this problem?

I thought about having separate dynamic procedural shadow values in the throttle and joystick arm shaders, but it soon became evident that this would make the shader much more complex and slower. So, the remaining option was to keep the object stationary, and instead move the vertices of the object. I wrote a quick test that used some C# code to rotate the vertices of the throttle hand object. This worked after I moved the throttle hand to the bridge coordinate system origin and also remembered to adjust the bounding box of the object; otherwise the object would not be visible when it should be!

However, moving all the vertices in the C# code for every frame did not seem like a good solution, as the joystick object has 1233 vertices in Unity, and even the simpler throttle hand has 362 vertices. I would also need to move the lower and upper arms, so it felt like moving approximately two thousand vertices in C# code for each frame would be an unnecessary burden for the CPU. How about moving the vertices using the GPU?

Since the vertex shader needs to transform the vertices from the object coordinate system to world coordinates and then to screen coordinates anyway, I thought that perhaps adding one more transformation to this operation would be the best way to solve this issue. After all, performing such transformations is what the GPU is meant for. However, this meant I needed to delve into those scary rotation matrices, which I feel I do not fully understand. Luckily, it turned out that Unity has simplified the rotation matrix generation, so that I could generate the required matrix simply using the Matrix4x4.TRS method. This takes a translation vector, a rotation quaternion, and a scale vector as parameters, and returns a matrix that can be used directly in the shader. Thus, I just needed to add a float4x4 uniform variable to my vertex shader, and multiply both the object vertex and normal by this matrix inside the vertex shader:

    float4x4    _JoystickProjMat;
    float3 vertex = mul(_JoystickProjMat, v.vertex);  // Translate and rotate the joystick/throttle/arm
    float3 normal = mul(_JoystickProjMat, v.normal);  // Only xyz input, so translation is not applied
This change added three arithmetic instructions to the vertex shader, raising the cycle count from 17.5 to 19.5 compared to the original Cruiser Bridge vertex shader. Since the vertex shader is still load/store-bound, this should not actually affect the performance all that much. The fragment shader needed no changes because of this system.
  8 work registers used, 12 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   35      23      0       A
  Shortest Path Cycles:   19.5    23      0       L/S
  Longest Path Cycles:    19.5    23      0       L/S
What remained was the generation of the matrix, which I do in the Update method of a C# script attached to the throttle hand object. I rotate the throttle hand object up to plus or minus 15 degrees around the X axis, based on the throttle range between zero and MaxSpeed:
    Vector3 trans = new Vector3(-0.44f, -0.12f, 2.0f);  // Translate the object to where we want it
    Quaternion rot = Quaternion.Euler(myData.CurSpeed * 30f / myData.MaxSpeed - 15f, 0, 0); // Throttle hand rotation, only around X axis
    m_Mat.SetMatrix("_JoystickProjMat", Matrix4x4.TRS(trans, rot, new Vector3(1, 1, 1)));

After I got the throttle hand working, it was time to attach the lower arm to it. This was something I had been partly looking forward to and partly dreading, as I would need to invent some way to attach some vertices of one object to another. I needed to have the wrist vertices of the lower arm stay stationary relative to the wrist vertices of the throttle hand, even though the lower arm should move separately from the hand. I looked into Unity's skinned mesh system, but it felt like overkill, as I only needed to move a few vertices of the lower arm object. In the end I decided to move these few vertices using C# code. But this again forced me to look into the required rotation matrices.

Since I use rotation matrices in the vertex shader to rotate the object vertices, and now I needed to keep some vertices from rotating (or rather, have them rotate differently), it seemed like similar rotation matrices would be the solution here as well. However, since I needed to combine two different rotations into one, I thought that trying to build the rotation matrix from these two quaternions might be less efficient than just using those two quaternions directly. I decided to attempt to solve the problem using just quaternions. Here is first the code that handles the left lower arm vertex movement, followed by some explanations regarding the code.

    //------------
    // Adjust the vertex positions by the current throttle amount
    //------------
    Quaternion handRot = Quaternion.Euler(myData.CurSpeed * 30f / myData.MaxSpeed - 15f, 0, 0);  // Throttle hand rotation, only around X axis
    Quaternion armRot = Quaternion.Euler((myData.MaxSpeed * 3f / 4f - myData.CurSpeed) * 8f / myData.MaxSpeed, 0, 0);  // Arm rotation
    Quaternion armInv = Quaternion.Inverse(armRot);
    // First calculate how much the whole lower arm should move.
    // It moves as much as the wrist moves because of the throttle rotation,
    // minus how much the arm rotates because of the elbow joint connection.
    Vector3 llArmWristPos = new Vector3(-0.00325f, 0.19955f, -0.11151f);  // Wrist position of avatar LeftLowerArm
    Vector3 wristPos = (handRot * llArmWristPos) /*- llArmWristPos*/ - (armRot * llArmWristPos) /*+ llArmWristPos */;
    Vector3 trans = new Vector3(-0.44f, -0.12f, 2.0f) + wristPos;
    // Translate to the opposite direction and rotate the wrist connected vertices
    for (int i = 0; i < m_VertexData.Count; i++)
    {
        Vector3 v = m_VertexData[i].vertex;
        m_Verts[m_VertexData[i].idx] = handRot * v      // Rotate the wrist vertices by the throttle hand rotation
                                     + armInv * v - v   // Remove the effect of the _JoystickProjMat rotation
                                     - wristPos;        // Remove the effect of the _JoystickProjMat wrist-relative translation
    }
    m_Mesh.vertices = m_Verts;
    // Lower arm rotation
    m_Mat.SetMatrix("_JoystickProjMat", Matrix4x4.TRS(trans, armRot, new Vector3(1, 1, 1)));
First, I generated the same handRot quaternion as with the throttle hand, but I then also added a small rotation of the lower arm using the armRot quaternion. This makes the arm move slightly more naturally. For the wrist vertices I need to remove this arm rotation, so I generated an inverted rotation armInv from the armRot as well. After that, I calculate how the wrist should move. The "wrist" in this case is part of the lower arm, so its position is based on the rotation of the llArmWristPos around the origin (the lower arm and the throttle hand share the same origin position), plus the inverted rotation of the arm. However, instead of using the armInv quaternion to rotate the arm, I use the original armRot negated. This way I can avoid adding the llArmWristPos to the formula, as it gets both added and subtracted within the formula. This is probably not quite a correct way to do this, but with a rotation only around a single axis I can get away with it. The resulting translation is then the position where we want the lower arm, plus the wrist position.

I then go through the wrist vertices, which I had stored in the Start method into an m_VertexData array containing the vertex index in the stored m_Verts array and the original vertex position. The new vertex position for these wrist vertices is based only on the hand rotation, but since the rotation matrix in the shader also rotates and translates these vertices by the armRot and wristPos values, I need to remove the effect of both of these from the resulting vertex position. Then I just update the vertices of the mesh and send the _JoystickProjMat matrix to the shader.

There is similar code for the upper arm, and for the joystick hand and the right lower and upper arms. The differences between these routines are the positions, the way the lower arm moves, and the fact that the upper arm has two sets of specially moving vertices, one set at the elbow and one at the shoulder. Luckily the upper arms are the lowest-poly objects, so updating their vertices every frame should not be an especially time-consuming operation. Here are the current vertex and polygon counts of the pilot avatar arm objects, as counted by Unity:

Finally, here below is a short YouTube video demonstrating the pilot avatar arm movements, and also showing the procedural shadows on the Cruiser bridge. Note that everything is still very much work in progress. Thanks for your interest in my LineWars VR project!

Apr 21st, 2018 - Progress on Multiple Fronts

For the past month I have been working on various core technologies (and some models) I will need for my finished game. I have mainly focused on making sure the ideas I had in mind are workable, and I've been jumping to the next new thing after I have confirmed an idea works. Thus, I have practically not finished anything; I have just started a lot of new development fronts. This also means I don't have a new demo video this time, as none of the new techniques are quite in presentable form yet.

Space Station Laser Hit Damage Textures

As I mentioned at the end of my previous blog post, I continued my work on the Space Station textures by implementing some damaged textures. My idea is to switch the texture coordinates of a body panel when it gets hit, so after the first hit the panel shows a black scorch mark from the laser, the next hit to the same panel creates more damage, and eventually the whole panel gets blown off and the underlying structure gets exposed. This is why I created my original Space Station model with many rectangular areas, whose texture UV coordinates I can then replace. However, the main body of the station is so big that I needed to have each rectangular area consist of four body panels, and thus the texture needed to have all combinations of between zero and four of the armor panels damaged. This ended up taking so much of my texture atlas area that I decided to limit the damage stages to only three: Non-damaged, Once Hit, and Structure Visible. Here below is an example of those three body panels:

The code that handles a laser hitting the station first uses a KDTree implementation to determine the polygon (triangle) that got hit, then finds the adjacent triangle (the fourth corner of the rectangular panel) using the fact that the hit triangle shares two vertices with another triangle when the texture UV coordinates are continuous between those two triangles, and then uses a C# Dictionary to look up the next UV coordinates given the current UV coordinates of the hit triangle. I precalculate the KDTree of the station, and also a lookup list of shared vertices for each vertex, to make the runtime hit tests as fast as possible. The UV coordinate dictionary also needs to be generated beforehand, and that is a slow and error-prone manual process, which is why I still have not completely finished it for the station model.
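
The final UV swap itself is conceptually just a dictionary lookup and an array update. Here is a simplified C# sketch of that last step; the class, method, and dictionary names are illustrative, and it assumes each damage stage occupies an identically sized region of the texture atlas, so that the whole panel can be shifted by a single UV offset.

    using System.Collections.Generic;
    using UnityEngine;

    public static class PanelDamage
    {
        // Maps the current UV coordinate of a panel corner to the UV coordinate of the
        // next damage stage (Non-damaged -> Once Hit -> Structure Visible).
        // The contents are generated beforehand from the texture atlas layout.
        static readonly Dictionary<Vector2, Vector2> NextDamageUV = new Dictionary<Vector2, Vector2>();

        // Advance the damage stage of one panel, given the indices of its vertices.
        public static void ApplyPanelDamage(Mesh mesh, int[] panelVertexIndices)
        {
            Vector2[] uv = mesh.uv;
            Vector2 current = uv[panelVertexIndices[0]];
            Vector2 next;
            if (!NextDamageUV.TryGetValue(current, out next))
                return; // already at the final damage stage
            Vector2 offset = next - current;
            foreach (int idx in panelVertexIndices)
                uv[idx] += offset;  // shift every vertex of the panel by the same UV offset
            mesh.uv = uv;
        }
    }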

Fire and Smoke Animation Textures

With those damage panels I was able to get down to a visible structure behind the armor panels, but when you hit that structure with more laser fire, you would expect something further to happen. I thought that perhaps it would be a good idea to have a fire start inside the station when you keep hitting the damaged area. I searched the net for some fire animations, and a picture search for "fire sprite sheet" turned up many options. I chose one such sheet, modified it for my needs, and got a rather nice-looking fire animation running. I decided to use the bottom part of my 2048x2048 texture atlas for the animations, so I added code to my shader to change the UV texture coordinates if they are in this area. This worked so well that I decided to see what other animations I could do.

First, I decided to replace the blinking lights animation (which I had done in the vertex shader by switching the UV coordinates) with a proper animation, using the same system as the fire animation. However, as I only used a 16-frame animation sequence (16*128 = 2048), I noticed that I was not able to make a suitably fast blink using just 16 frames in a loop. Thus, I changed my fire animation to 32 frames (which meant I had to drop the horizontal resolution from 128 pixels down to 64 pixels), and with that I was able to get a properly fast blinking speed. I even have room to add different colors of blinking lights, like red and green running lights for my space ships.

Next, I thought it would be nice to get some steam coming out of ruptured pipes, and perhaps even some kind of pressure leak animation when the habitation ring of the station is hit. For these I again searched the net for some white smoke animations and found a couple of promising ones. However, I had a problem figuring out how to combine the smoke animation with the base texture. For this, I decided to see if my old AviSynth skills could still be useful. After some tinkering with the scripts, I was able to create a ruptured pipe producing white smoke or steam, and a hole that looks sort of like it is leaking atmosphere.

Here below is the Avisynth script I used to generate the leaking atmosphere effect. I am using some white smoke animation footage I found on the net, which I first adjust to half brightness with "Tweak(bright=0.5)" and then convert to RGB32 (which some of the later functions require). I also load a couple of BMP images, a round mask that makes the smoke fade towards the borders of the image, and the base window panel image on top of which the smoke is shown. I then crop a part of the smoke animation and turn it 90 degrees clockwise, and then stack two of these horizontal animations, one going to the opposite direction and one starting at a bit later in the original animation. Then I use the "Dissolve" operation to make the animation loop seamlessly, meaning that the last 8 frames of the 32-frame sequence slowly dissolve into the 8 frames before the start of the sequence (I concatenate two of these 32-frame sequences just to be able to confirm there is no sudden jump when looping back to start). Then I use some "Mask" and "Layer" operations to fade the smoke animation towards the frame edges, resize the result to a temporary size (I do the final resizing when adding the images to my texture atlas), and then just use the "Layer" operation again to position the smoke animation over the background image. Finally, I take the 32 frames of the sequence and convert them to YV12 (for some reason I have since forgotten, perhaps this would not be needed).

LoadPlugin("C:\Program Files (x86)\AviSynth\plugins\ffms2-2.23.1-msvc\x86\ffms2.dll")
v = FFMpegSource2("c:\Projects\LineWarsVR\References\smoke_anim_preview.mp4").Tweak(bright=0.5).ConvertToRGB32()
m = ImageSource("c:\Projects\LineWarsVR\References\Smoke\RoundMask.bmp").ConvertToRGB32()
p = ImageSource("c:\Projects\LineWarsVR\References\Smoke\WindowPanel.bmp").ConvertToRGB32()
v = v.Crop(169, 136, 64, 64).TurnRight()
v = StackHorizontal(v.Turn180(), v.Trim(5,200))
i = 130
a = v.Trim(i,i+31)
b = v.Trim(i-8,i+23)
v = Dissolve(a, b, 8)
v = v.Trim(0,31) + v.Trim(0,31)
v = Mask(v, m)
c = BlankClip(v)
v = Layer(c,v)
v = v.LanczosResize(100,96)
v = Layer(p,v, "lighten", x=-25, y=-38)
v = v.Trim(0,31)
return v.ConvertToYV12()

Space Station Shader Optimizations

As I mentioned at the end of my previous blog post, I was able to get the space station fragment shader down to a reasonable 8.5 GPU cycles, but the vertex shader still used 30 GPU cycles to run. What was worse, it used spilling, which meant that the GPU did not have enough registers to hold all the intermediate values of the needed calculations, so it had to store some intermediate values into memory and then load them back to continue the calculations. I wanted to at least get rid of the spilling, and to optimize the shader code overall if possible.

The first optimization was removing the separate blinking code, as I could now use the animation system for the blinking. The animation is handled in the vertex shader with code like this:

	//------------------------------------
	// Handle animations (blinks, fire, etc..)
	//------------------------------------
	o.uv = (v.uv.y < 260.0/2048.0) ? float2(v.uv.x + _AnimOffset, v.uv.y) : v.uv; // TRANSFORM_TEX(v.uv, _NormalTex);
I am using the _AnimOffset uniform variable, set from the C# script, to select the current frame of the animation to play. I also noticed that I can get rid of the TRANSFORM_TEX function, as I use neither tiling nor offsets with my UV coordinates. This change already got rid of a couple of GPU cycles.
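On the C# side, stepping the animation is then just a matter of updating that one float every frame. A minimal sketch (assuming 32 frames of 64-pixel cells in the 2048-pixel-wide atlas, and a hypothetical helper component name) could look like this:

    using UnityEngine;

    // Hypothetical helper: steps the _AnimOffset uniform once per frame, assuming a
    // 32-frame animation strip of 64-pixel cells in the 2048-pixel-wide texture atlas.
    public class AnimOffsetUpdater : MonoBehaviour
    {
        public Material animatedMaterial;       // material using the animated shader
        public float framesPerSecond = 30f;     // playback speed of the animation

        const int frameCount = 32;
        const float frameWidthUV = 64f / 2048f; // width of one animation cell in UV space

        void Update()
        {
            int frame = (int)(Time.time * framesPerSecond) % frameCount;
            animatedMaterial.SetFloat("_AnimOffset", frame * frameWidthUV);
        }
    }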

I also noticed that the Unity built-in TANGENT_SPACE_ROTATION macro normalizes both the vertex normal vector and the vertex tangent vector before it calculates the binormal (using the cross-product operation). I thought both of these normalizations were unnecessary, as both of those vectors are already normalized in my input object mesh. Thus, I replaced the macro with this code:

	//TANGENT_SPACE_ROTATION;
	float3x3 rotation = float3x3( v.tangent.xyz, cross( v.normal, v.tangent.xyz ) * v.tangent.w, v.normal );

The last optimization at this time was replacing the object-space light direction calculation that was performed in the shader with a uniform vector that gets calculated in the C# script, as it only changes once per frame. All these changes resulted in the vertex shader now taking only 25.5 GPU cycles, and not needing spilling any more.

  8 work registers used, 13 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   49      25      0       A
  Shortest Path Cycles:   25.5    25      0       A
  Longest Path Cycles:    25.5    25      0       A
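For reference, the per-frame object-space light direction mentioned above can be calculated on the C# side along these lines (a rough sketch only, assuming a directional sun light; the _ObjLightDir property name matches the one used by the cruiser shader later in this post):

    using UnityEngine;

    // Hypothetical sketch: calculate the light direction in this object's local space
    // once per frame on the CPU and pass it to the shader as a uniform vector.
    public class ObjectLightDirection : MonoBehaviour
    {
        public Light sun;            // the directional sun light of the scene
        public Material material;    // material of this object's renderer

        void Update()
        {
            // Direction towards the sun in world space (a directional light shines along its forward axis).
            Vector3 worldDir = -sun.transform.forward;
            // The same direction expressed in this object's local coordinate system.
            Vector3 objDir = transform.InverseTransformDirection(worldDir).normalized;
            material.SetVector("_ObjLightDir", objDir);
        }
    }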

Cruiser 3D Model and Procedural Self-Shadowing

After spending many days with setting up the damage textures for the space station model, I began to get bored with that work, and decided to start working on the Cruiser model and continue the station damage stuff later. There are several missions in the original LineWars II where there are large cruiser ships in the fight in addition to the Cobra and Pirate fighters. The cruiser model was something that I wanted to create from scratch, and not use the very simple model from LineWars II even as a basis.

I had spent some time looking for various space ship designs in sci-fi movies and art, and had decided on something vaguely similar to the Rodger Young ship from Starship Troopers. However, now that I had nice-looking procedural shadows in my complex space station, it would be quite silly if my cruiser was either a very simple convex object (with no self-shadowing needed) or a complex object lacking proper shadows. The problem with my space station procedural shadows is that they work only in the Z direction of the object, meaning that the station needs to face towards the sun. This is not a problem for the space station, but the cruiser must be able to move freely and have the sun shining from any direction.

I first created a rather low-poly cruiser model with some recessed slots in the sides, and began figuring out how to handle shadows within these slots. I could use much the same algorithm as in the "mail slot" of the space station. However, in the cruiser model I did not want to have anything hard-coded in the shader, as I would need to have several shadowed regions with different light directions. I started experimenting with using the UV2, UV3 and UV4 coordinates of the Unity Mesh object for parameters of the shadow areas. Each vertex can have these additional UV coordinates for which I did not have any other use at the moment.

After some experimenting, I managed to create an algorithm that worked pretty well for the recessed areas. I used the UV2 input X coordinate as a flag that tells whether the shadow plane is the XZ plane (horizontal, when uv2.x > 0) or the YZ plane (vertical, when uv2.x < 0), or whether the vertex belongs to a triangle that needs no self-shadowing (uv2.x == 0). Then uv2.y tells the distance of the plane from the object coordinate system origin, and uv3 and uv4 give the four corner points of the rectangle that passes light on this plane. The vertex shader part of the algorithm looked like this:

	float dist;
	float2 ip;
	o.shadowData = float4(v.uv3, v.uv4);
	if (v.uv2.x > 0)
	{
	    dist = (pos.y - v.uv2.y) / _ObjLightDir.y;
	    ip = pos.xz - _ObjLightDir.xz * dist;
	    o.shadowPos = float4(ip, 1, 0);
	}
	else if (v.uv2.x < 0)
	{
	    dist = (pos.x - v.uv2.y) / _ObjLightDir.x;
	    ip = pos.zy - _ObjLightDir.zy * dist;
	    o.shadowPos = float4(ip, 1, 0);
	}
	else
	    o.shadowPos = float4(0,0,1,1);
The vertex shader passes two float4 interpolators to the fragment shader: shadowData, which is based on the uv3 and uv4 vertex input and contains the corner x,y coordinates (which stay constant throughout the polygon), and shadowPos, which contains the projection of the fragment position onto the shadow plane (in the X and Y coordinates) and the shadow/light multipliers (in the Z and W coordinates). Thus, by swapping the Z and W values I could have a rectangular area either cast a shadow or pass light, while the part of the plane outside this area behaves the opposite way.

The fragment shader part of the algorithm is pretty simple: it just compares the shadowPos interpolator with the shadowData interpolator to determine whether the fragment is in shadow or in light:

	fixed sh = i.shadowPos.x <= i.shadowData.x && i.shadowPos.y <= i.shadowData.y && i.shadowPos.x >= i.shadowData.z && i.shadowPos.y >= i.shadowData.w ? i.shadowPos.z : i.shadowPos.w;

Okay, so this was a good example algorithm for simple rectangular shadows in recessed areas; however, I also need shadows generated by the control tower and other tower-like structures on the cruiser hull. This seemed a lot more complex, so I decided to again start from hard-coded vertex and fragment shaders and see how far I could get. I created a simple cube with an extruded tower in the middle of one face and began working on the shadow algorithm. Having the tower ceiling cause shadows on the cube face worked well with the existing algorithm, but it was not sufficient, as the tower walls also need to cause shadows. However, I realized that I don't need the ceiling to cause shadows at all, if I just have two adjacent walls creating shadows. After some more testing I was able to confirm that two planes at right angles are indeed enough for convincing shadows for a rectangular tower, but the planes need to be different depending on the sun direction.

I then used different if clauses for different sun directions in my algorithm, and had a rotating cube with an extruded tower showing nice shadows in the Unity editor! The next step was to have a tower that is not just a simple cube but has some angled walls. I was able to handle this as well by adding a slope multiplier to the check of whether the fragment is in shadow. With this system I thought I had enough features to handle the cruiser structure self-shadowing. However, everything was still hard-coded, and used many more if clauses and variables than the available 6 float variables in the UV2, UV3 and UV4 vertex data vectors. In fact, I counted that I needed two sets of plane z-distance, x-min, x-max, y-min, y-max, x-slope and y-slope, plus a way to determine which plane orientation to use for each of those sets, so in total 2*8 = 16 variables, while what I had was only 6 float variables plus the Vertex Color, which is just four fixed (0..255, or 0..1.0) values. How could I fit 16 variables into 6 (plus some change) variables?

I then had an idea of using completely different sets of UV2, UV3 and UV4 coordinates depending on the sun direction relative to the object. The object orientation and the sun orientation are known in the C# script and stay constant throughout the frame, so the C# script can provide the shaders with the correct set of these extra UV coordinates. This did not actually help much with the needed variables, but made it possible to have only two shadow planes in the code, if the vertex input can tell the code the orientation of the planes. Moreover, since there are only three possible two-plane orientations, one of the Vertex Color fields would have sufficient resolution to handle this data. So, now I was at 2x7 = 14 variables needed, with 6 floats and 3 fixed variables available.

Next, I decided to limit the model so that all structures will be symmetrical on the X axis (so instead of x-min and x-max, I can just use -x and +x), which got me down to 12 variables needed and 9 available. Then I realized that with the sun direction handling, I only need to know the shadow plane limit towards the sun, as the plane can continue to infinity in the other direction. So now I was at 10 needed variables (two sets of z-distance, x-offset, y-max, x-slope and y-slope) with 9 variables available. I decided to only have one of the two planes have a slope, so I ended up needing 8 variables plus the plane selector, mapped into 6 floats and 4 fixed values. The slopes were the only values that could reasonably fit into the 0..1 range, so I mapped the variables like this (this is the same layout the vertex shader below reads back):

  1. uv2.x and uv2.y: the distances of the two shadow planes from the object origin.
  2. uv3.x and uv3.y: the x-limit and y-limit of the first plane.
  3. uv4.x and uv4.y: the x-limit and y-limit of the second plane.
  4. Vertex Color red and green: the slope multipliers, stored in the 0..1 range and remapped to -1..1 in the vertex shader.
  5. Vertex Color alpha: the selector for which pair of plane orientations is used.

I still have the Vertex Color blue channel free for some potential future use, perhaps adjusting whether the "y" limit would be a min or max.
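Filling these channels happens on the C# side when the mesh data is set up. A rough sketch of what that could look like (the helper and parameter names here are illustrative, not my actual code):

    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical sketch: write the per-vertex shadow parameters into the extra
    // UV channels and vertex colors of the cruiser Mesh.
    public static class CruiserShadowData
    {
        public static void Apply(Mesh mesh, List<Vector2> planeDistances,
                                 List<Vector2> plane1Limits, List<Vector2> plane2Limits,
                                 Color[] slopesAndSelector)
        {
            mesh.SetUVs(1, planeDistances);   // uv2: distances of the two shadow planes
            mesh.SetUVs(2, plane1Limits);     // uv3: x-limit and y-limit of the first plane
            mesh.SetUVs(3, plane2Limits);     // uv4: x-limit and y-limit of the second plane
            // Vertex color: R and G hold the slopes (0..1, remapped to -1..1 in the
            // vertex shader), A selects the plane orientation pair, B is still unused.
            mesh.colors = slopesAndSelector;
        }
    }

A different set of these arrays can then be swapped in from the C# script whenever the sun direction relative to the cruiser calls for a different pair of shadow planes.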

The resulting vertex shader code looks like this:

	//------------------------------------
	// Prepare for the shadow calculations
	//------------------------------------
	float2 dist;
	float4 ip;
	// We have three possible plane directions, with two planes active in each direction.
	if (v.color.a == 0)	// x plane and y plane
	{
		dist = (pos.xy - v.uv2.xy) / _ObjLightDir.xy;
		ip = pos.yzxz - _ObjLightDir.yzxz * dist.xxyy;
	}
	else if (v.color.a == 1) // x plane and z plane
	{
		dist = (pos.xz - v.uv2.xy) / _ObjLightDir.xz;
		ip = pos.yzxy - _ObjLightDir.yzxy * dist.xxyy;
	}
	else // y plane and z plane
	{
		dist = (pos.yz - v.uv2.xy) / _ObjLightDir.yz;
		ip = pos.xzxy - _ObjLightDir.xzxy * dist.xxyy;
	}
	o.shadowData = float4(v.uv3.x+(v.color.r*2-1)*(ip.y-v.uv3.y), v.uv3.y, v.uv4.x+(v.color.g*2-1)*(ip.w-v.uv4.y), v.uv4.y);
	o.shadowPos = ip;
The performance of the vertex shader is as follows. It is pretty close to what the space station vertex shader originally was, but as the cruiser will have far fewer vertices than the space station, this should not be a problem:
  8 work registers used, 9 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   58      27      0       A
  Shortest Path Cycles:   30      27      0       A
  Longest Path Cycles:    30      27      0       A
The fragment shader in turn is still reasonably simple, as all the data is still within the two interpolators shadowData and shadowPos. I used the Z and W coordinates of the standard UV texture coordinates to send the shadow/light color multipliers from the vertex shader to the fragment shader:
        // Handle shadow
        fixed sh = (i.shadowPos.x <= i.shadowData.x && i.shadowPos.x >= -i.shadowData.x && i.shadowPos.y <= i.shadowData.y) ||
                   (i.shadowPos.z <= i.shadowData.z && i.shadowPos.z >= -i.shadowData.z && i.shadowPos.w <= i.shadowData.w) ? i.uv.z : i.uv.w;

The fragment shader is still pretty efficient even with these shadow calculations:
  3 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   18      5       2       A
  Shortest Path Cycles:   5       4       2       A
  Longest Path Cycles:    6.5     5       2       A

Cruiser Bridge Model

After I got this shadow algorithm working, instead of manually adding all the shadow configuration data (which is again a boring and error-prone manual operation), I switched to working on the cruiser command bridge model. In LineWars II you could control a Cobra, a Pirate ship and/or a Cruiser, depending on the mission, but the simple cockpit image was the same. In LineWars VR I want to have properly different cockpits for all of those, and I have been looking forward to being able to start working on the cruiser command bridge. I knew I wanted to have an elevated captain's chair, and then some pilot/weapons officer chairs below and in front of the captain. I again looked at various images from movies and some sci-fi art for inspiration, and began working on the bridge model. I was able to use my cruiser object for the outer dimensions of the bridge walls, but after that I just began modeling chairs and stuff. Pretty soon I noticed that I would need the pilot avatar for scale reference, so I again switched to working on something else after I got the cruiser bridge started.

Pilot Avatar and Joystick Hand Object

I started work on the pilot avatar by using the male pilot from my old Snow Fall animation project, from which I had already taken much of the Cobra cockpit structure. The problem with this model is that it is very high-poly (over 426000 polygons in the original animation project, not even counting the head, which is another 10237 polygons). I took separate parts of the original object (like the thighs, legs, feet, etc.) and reduced their polygon counts as much as possible, while still trying to maintain the smooth look of the objects.

One important part of this pilot avatar is the hand that holds the joystick. In many cockpit-based VR games the pilot avatar hand moves the joystick as the player moves the controller, so I wanted to have a similar feature in my LineWars VR as well. I took the joystick object (14266 points) and the hand object (13076 polygons) of the original Snow Fall animation project and began working on the polygon reduction. I had used the Subdivision Surface modeling deformer in Cinema 4D to create the smooth surfaces, and I could get the polygon counts much lower by using a lower subdivision amount. For the joystick object I was able to get down to 1 subdivision, which generated an object of just 616 points (without any of the buttons). With the hand itself the subdivision count of 1 was not enough, and with the count at 2 I got an object with 3369 points. I set a goal of less than 1000 points total for these two objects combined and set to work.

After a lot of remodeling and removing all those parts that will always be hidden I finally got down to 970 points total. I still need to add the buttons, which will increase the count to around 1000 or a bit over, but perhaps I still have some optimization possibilities with the model. I was still able to keep the quality reasonably high, especially with the hand model, which is the more important one of those two objects.

Cobra Cockpit Switches Cube Map Shadow Problem on Android

After spending several days optimizing the Joystick Hand object point and polygon count, I wanted to do some programming again for a change. I decided to finally look into the weird problem I have been having with the shadow mapping of the Cobra cockpit switches. I have two shaders for the Cobra cockpit: one that handles the large illuminated panels and other big rectangular polygons, and another for the small switches and such. The main difference between these shaders is that the first uses only a single color plane of the texture atlas for all (greyscale) color info, while the second uses the texture as standard RGB color info. The shadow algorithm in both of these shaders was similar, but for some reason the second shader calculated the shadow locations wrong, and only on Android; both worked fine in the Unity editor!

I first noticed this problem back in February but did not want to bother with solving it at that time. Now that I took the problematic shader as a basis for my Cruiser Bridge shader, I noticed that the problem happened there as well. Debugging the problem was rather difficult, as I could not reproduce it in the Unity editor; I had to keep uploading the game to my Android phone and testing it there.

It took quite a lot of debugging and trial and error to finally close in on the problem, but I still don't actually understand what exactly causes this weird behavior. I began by making the Cruiser Bridge shader a duplicate of the working CobraCockpit shader, and then began modifying it to look more and more like the misbehaving shader. I found out that leaving out this code in the fragment shader makes the problem appear:

	col = col.r > 0.5f ? col : col * dot(i.normal, i.cameraDir);
However, that did not make much sense, as that code has absolutely nothing to do with the shadows; it just adjusts the brightness of the surface slightly based on the angle between the surface and the camera, as a poor man's Global Illumination type of effect. How could leaving some unrelated code out cause any problems? I even checked the resulting compiled code, and the only difference between the two versions was that the temporary variables used were slightly different.

After a lot of further debugging I then finally found the root cause and the root difference between the shaders. In the working vertex shader I had this:

	o.uv = float3(2 * v.uv, 1);
However, in the vertex shader that caused the fragment shader to calculate the shadows wrong I had this:
	o.uv = 2 * v.uv;
My understanding of the shader language is not good enough to see what is so horribly wrong in the second vertex shader code that it causes havoc with some other code in the fragment shader. In any case, I moved the UV coordinate multiplication from the vertex shader to the fragment shader, and after that the shadows began to work properly! I then realized that it is actually silly to multiply the UV coordinates in the shaders at all; why not just use the correct UV coordinates in the object mesh in the first place? Thus, I changed all the UV coordinates of my CockpitSwitches object and removed the multiplication completely.
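If the same scaling ever needs to be baked in on the Unity side rather than in the modeling tool, a throwaway helper along these lines would do it (a hypothetical sketch, not the code I actually used):

    using UnityEngine;

    // Hypothetical one-off helper: bake the former "2 * uv" scaling into the mesh
    // itself, so the shaders can use the UV coordinates as-is.
    public static class UvScaler
    {
        public static void DoubleUVs(Mesh mesh)
        {
            Vector2[] uv = mesh.uv;
            for (int i = 0; i < uv.Length; i++)
                uv[i] *= 2f;
            mesh.uv = uv;
        }
    }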

Cockpit Texture Reorganization

As I had begun working on the Cruiser Bridge and had decided to use the same texture atlas for all my cockpits, I noticed that the current texture locations were far from optimal. I had put the original Cobra cockpit illuminated panels here and there on the texture atlas, so that there were only small separate areas where all the RGB planes could be used for the other cockpits besides the Cobra cockpit. Thus, I spent one day optimizing the texture and moving the UV coordinates around in the Cobra cockpit mesh. I still have about two thirds of the texture atlas reserved for the Cobra cockpit, so both the Cruiser bridge and the Pirate cockpit will need to get by with the remaining one third of the texture. But both of those other cockpits will be much more modern looking, so they should not need as many switches and illuminated panels.

This is what the combined cockpit texture currently looks like (in the Unity editor). All the different color planes (and also the Alpha plane which is not visible) contain the illumination maps of the different switch panels, so as a single RGBA image it looks quite messy. It is kept uncompressed to avoid compression artifacts, which would be very visible on such highly detailed texture maps. This makes it annoyingly large at 21.3 megabytes, but I don't think I can help it if I want to have such detailed instrumentation in the Cobra cockpit.

Combined Cockpit Shadow Cube Map

The most recent thing I have been working on was combining the cockpit shadow cube maps into a single cubemap. I have had the Cobra cockpit shadow cubemap as six separate 512x512 resolution images (basically containing a black and white image, but still using all the RGB color planes), using the Unity legacy CubeMap asset, which also took a lot of memory, as the legacy cubemap asset does not seem to compress the textures it uses. Now that I needed another such cubemap for the Cruiser bridge shadows, it occurred to me that I could easily use just a single cubemap, with each of the three cockpits using its own color plane. Thus, I decided to go down to 256x256 resolution (actually, the original cruiser bridge shadow cubemap I made was only 128x128 resolution) and use the current Unity CubeMap texture type, to be able to get the cubemap compressed. I decided to use the red color component for the Cobra cockpit shadow map and green for the Cruiser, which leaves blue for the upcoming Pirate ship cockpit shadow map. I wrote a short editor helper script to handle the creation of the combined cubemap texture from the six separate faces of each cockpit, and got a 256x256 cubemap which only takes 192 kilobytes of memory.
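A sketch of the kind of editor helper I mean is below (the asset paths and file names are illustrative, and the face textures need to be imported as readable):

    using UnityEngine;
    using UnityEditor;

    // Hypothetical editor helper: packs the Cobra and Cruiser shadow map faces into
    // the red and green channels of one 256x256 cubemap asset (blue stays free for
    // the Pirate cockpit).
    public static class CombineShadowCubemaps
    {
        [MenuItem("LineWarsVR/Combine Shadow Cubemaps")]
        static void Combine()
        {
            const int size = 256;
            var cubemap = new Cubemap(size, TextureFormat.RGBA32, false);
            for (int f = 0; f < 6; f++)
            {
                var cobra = AssetDatabase.LoadAssetAtPath<Texture2D>("Assets/Shadows/Cobra_" + f + ".png");
                var cruiser = AssetDatabase.LoadAssetAtPath<Texture2D>("Assets/Shadows/Cruiser_" + f + ".png");
                for (int y = 0; y < size; y++)
                    for (int x = 0; x < size; x++)
                    {
                        float u = (x + 0.5f) / size, v = (y + 0.5f) / size;
                        float r = cobra.GetPixelBilinear(u, v).r;    // Cobra shadows -> red
                        float g = cruiser.GetPixelBilinear(u, v).r;  // Cruiser shadows -> green
                        cubemap.SetPixel((CubemapFace)f, x, y, new Color(r, g, 0f, 1f));
                    }
            }
            cubemap.Apply();
            AssetDatabase.CreateAsset(cubemap, "Assets/Shadows/CombinedShadows.cubemap");
        }
    }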

The next step is to adjust the shadow cubemaps both for the Cruiser bridge and for the Cobra cockpit to more closely follow the cockpit shape. My cockpits are obviously not exactly cube-shaped, so this creates some issues when using a cubemap for the shadows. I am currently trying to figure out a better algorithm to calculate the cubemap light ray hit position for such non-rectangular cockpits.

That's all for now, thanks again for your interest in my project!

Mar 25th, 2018 - Creating a Space Station

Modeling

Immediately after I finished the previous blog post, I began working on the new space station model for LineWars VR. If you remember the original LineWars II, you may remember it having a space station that was basically just a cube, with a "mail slot" for the fighter ships to fly through. That model only had 20 vertices (or points), so it was very cheap to process on the old DOS PC machines. However, for LineWars VR I wanted to have a better and more complex space station. I thought the space station design in the movie "2001: A Space Odyssey" looked good and made sense scientifically, and as it has been copied by many games and movies since, I thought I'd do something similar for LineWars VR as well.

Modeling the new space station took only a day or so, as I could just use a collection of primitive shapes (cylinders, toruses, capsules, and some cubes) in Cinema 4D. After I got these looking like a simple space station, I made the object editable and split it into a single quarter of the station, using two symmetry deformers to then generate the full station. That way I only needed to hand-model one quarter of the station. I also wanted to keep the object as low-poly as possible, as the recommended maximum number of vertices per scene for a Gear VR game is a hundred thousand. As I will have many other objects in addition to this space station in a scene, it should only have a fraction of that amount of vertices. On the other hand, there is never more than one space station in a scene, so it can have more polygons and vertices than the other game objects.

My resulting space station model has 2465 vertices in Cinema 4D. Since all the sharp edges and also all vertices where the texture UV coordinates are not continuous generate extra vertices, the vertex count when the object got imported into Unity went up to 6310. That is pretty high, but still acceptable if I can keep the fighter ships very low-poly.

Texturing

After I got the modeling done, I began texturing the station. The outer rim should obviously have windows, as those are the living quarters. Since I did not have enough polygons to model the windows, I knew I needed this object to use normal mapping for all the small details. In addition to normal mapping, I knew I also needed luminance, as many of the windows should have light in them even when that side of the station is in shadow. Also, the windows should have specular reflections (same as the solar panels), so that when the sunlight hits them at the correct angle, they look bright even when there is no light coming from behind the window.

I added all those four 2048x2048 textures (diffuse color, normal mapping, luminance and specular strength) into Cinema 4D, with the plan of using just a 1024x1024 corner area of the texture maps for my station. I plan to have these same textures as a texture atlas for all ships in my game, as there are only four types of ships: Cobra, Pirate, Cruiser and the space station. There will also be alien ships, so if I can fit those into the same texture atlas that would be good, but if needed I can use a different texture for those.

I wanted to be able to shoot at the various parts of the station and have them take damage, so I tried to re-use the same texture panels everywhere I could, to leave room for various damaged panels in the texture atlas. This also meant trying to keep all the panels rectangular, and also not using continuous UV coordinates, so that I can then just change the UV coordinates of a single panel when it gets hit. The solar panels and fuel/water tanks would probably take damage differently. The tanks could simply explode, leaving nothing much behind, and the solar panels could just get torn away when they get hit.

I also planned to have the "mail slot" side of the station always facing the sun, so that I could keep the tanks always in shadow. This meant that I had to have some other way to make the fuel tanks visible, and I decided to add some spot lights shining on them. I modeled these lights in Cinema 4D, and then baked the lighting into a texture, and then copied the relevant parts of the texture into my texture atlas. I had to make some adjustments to the generated texture coordinates to make the texture fit nicely within my texture atlas. I did similar work for the landing pads that are inside the space station.

Finally, as I did not want to load all four different texture maps in the shader, I tried to figure out a way to pack the textures into fewer actual texture maps. With the asteroid I had used the RGB texture planes as the normal directions, and the alpha channel as the grayscale color. This would not work all that well with my station, as I needed to have full RGB color available. It then occurred to me that perhaps I could get by with just two texture maps. The luminance and specularity were practically on/off toggles, or at most grayscale values, which left two full RGB and XYZ planes. That totals 8 different channels, which would nicely fit into two RGBA textures. With ETC2 texture compression the RGB colors of a pixel are compressed into 4 bits and the Alpha channel into another 4 bits, which means that the alpha channel has far fewer compression artifacts than the RGB channels. Thus, I decided to use the alpha channels of both textures for the normal vector (as compression artifacts are most noticeable in the normal map). My resulting texture packing uses the first texture as the RGB diffuse color plus the X coordinate of the normal vector, and the second texture as the luminance toggle in the Red channel, the normal vector Z coordinate in the Green channel, the specular strength in the Blue channel, and the normal vector Y coordinate in the Alpha channel.
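Packing itself can be done as a simple editor-time pass over the source textures. A sketch of the idea, following the channel layout described above (the class and parameter names are just illustrative):

    using UnityEngine;

    // Hypothetical sketch of the channel packing as an editor-time step.
    // Output 1 = diffuse RGB + normal X in alpha; output 2 = luminance (R),
    // normal Z (G), specular strength (B), normal Y (A).
    public static class StationTexturePacker
    {
        public static void Pack(Texture2D diffuse, Texture2D normalMap,
                                Texture2D luminance, Texture2D specular,
                                Texture2D outColor, Texture2D outNormal)
        {
            for (int y = 0; y < diffuse.height; y++)
                for (int x = 0; x < diffuse.width; x++)
                {
                    Color d = diffuse.GetPixel(x, y);
                    Color n = normalMap.GetPixel(x, y);          // normal XYZ in RGB
                    float lum = luminance.GetPixel(x, y).r;
                    float spec = specular.GetPixel(x, y).r;
                    outColor.SetPixel(x, y, new Color(d.r, d.g, d.b, n.r));
                    outNormal.SetPixel(x, y, new Color(lum, n.b, spec, n.g));
                }
            outColor.Apply();
            outNormal.Apply();
        }
    }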

Shadows

The space station would look pretty bad if it didn't have shadows on the solar panels when the sun is shining from the front of the station. My plan is to avoid using proper shadow maps in my game, as those would require rendering the scene separately from the viewpoint of the sun, and then using this shadow map to determine which pixels are in shadow, and all of this would have to be done every frame. I don't think the mobile devices running Gear VR have the performance to handle this with sufficient quality (meaning large enough shadow maps). So, what are the alternatives?

One thing I could have done would have been to point the station directly towards the sun, and then just bake the shadow information into the texture maps. However, as I wanted to have the solar panels stay stationary while the rest of the station rotates, this would not work. Next, I tried using a static shadow map texture, which would rotate as the main part of the station rotates. Since I use the Dynamic Soft Shadows Based on Local Cubemap method for the cockpit shadows, and that basically just calculates the correct shadow map position from the fragment position in 3D, I thought I could perhaps use something similar but with just a simple texture, since I know the sun always shines from the same direction. I got this working fine, but the problem was the uneven shadow edge around the circular main body of the station. Straight lines looked pretty good, but the circular section had very obvious jagged edges.

I then got the idea of using code instead of a texture map to calculate whether a pixel is in shadow. Since my station only has simple shapes from the shadow perspective (a ring, a central circle, and four poles), I thought that the required formula should not be overly complex. And I was right; I was able to check the shadow areas with just a couple of if clauses. This resulted in a very clean and sharp shadow edge, which was just what I was after.

The Shader

I created a new custom shader to handle all the afore mentioned ideas. I used the asteroid shader as the basis, as it already handled the normal mapping. I had found a slightly more efficient method of handling the tangent space lighting calculations for the normal mapping since my original asteroid blog post, though. Instead of converting the tangent space normal into world space in the fragment shader, it is more efficient to convert the light vector into tangent space in the vertex shader. Unity provides a TANGENT_SPACE_ROTATION macro for this purpose, so the vertex shader calculations can be done simply by the following code, with no need to calculate the binormal vector:

	TANGENT_SPACE_ROTATION;
	o.lightDirection = mul(rotation, mul(unity_WorldToObject, _WorldSpaceLightPos0).xyz);
Then in the fragment shader, this can be handled simply by taking the dot product of the normal vector (taken from the texture) and this light vector:
	fixed4 tex = tex2D(_MainTex, i.uv);
	fixed3 tangentSpaceNormal = tex.rgb * 2 - 1; // Convert the normal vector values from 0..1 to -1..1 range
	fixed4 col = tex.a * DotClamped(normalize(i.lightDirection), tangentSpaceNormal) * _LightColor0;

The space station vertex shader has four different sections to handle the special requirements of the station model and textures:

  1. The non-rotating solar panels are handled by using unity_WorldToObject matrix for those vertices to get their coordinates in object space, while the rotating vertices already have their coordinates in object space. This same handling needs to be done also to the normals and tangents of those vertices, which adds so many GPU cycles that I am thinking of eventually abandoning this idea of using a single mesh for the whole station.
  2. Next, the blinking polygons (or more accurately their vertices) are handled by checking the vertex color Green value (which I use in the C# script to mark the blinking polygons), and if it is set and the _SinTime.w variable is > 0.99, I move the vertex UV coordinates to a blinking area of a texture map. This generates a short flash once every two seconds or so.
  3. The next step is to prepare the shadow calculation values. The shadow calculation in the fragment shader needs to know which areas of the space station cause a shadow on the polygon, for example polygons in front of the ring poles are not shadowed by the ring poles. Here again I use the vertex colors (this time the Red channel) to select the correct shadow area. This step also prepares the object space vertex coordinate and the object space light direction (which is not the same as the tangent space light direction) for the fragment shader.

    Since the tangent space surface normal can point towards the sun even when the polygon itself is in shadow, this can create unrealistic lit pixels on otherwise shadowed polygons. To avoid this, I also calculate a shadow multiplier at this stage, like this:

    	saturate(50 * dot(_WorldSpaceLightPos0.xyz, worldNormal))
    

  4. Finally, I calculate the specular color, based on the world coordinates of the camera, vertex and the light. For better quality specular reflection (especially for curved surfaces) this should be calculated per pixel in the fragment shader, but since my specular surfaces are flat, I thought I could use this optimization.

Then in the fragment shader I first read the two texture maps, and get the tangent space surface normal for this fragment (pixel). This looks rather similar to the asteroid fragment shader above, except I have two textures that get combined:

	fixed4 col = tex2D(_ColorTex, i.uv);
	fixed4 texn = tex2D(_NormalTex, i.uv);
	fixed3 tangentSpaceNormal = fixed3(col.a, texn.a, texn.g) * 2 - 1;

The shadow is then calculated by projecting the fragment position onto the plane that generates the shadow (which the vertex shader has given us), and then checking whether this projected point is inside the radius of a circular section or whether its X and Y coordinates fall within rectangular sections (the poles, for example). These coordinate areas are currently hard-coded into the shader, but as I would like to use the same shader also for the other ships, I may need to figure out a better system for this. In the fragment shader I call my subroutine CheckShadow to handle the shadow calculation; it returns a value between 0 (in shadow) and 1 (not in shadow), with the not-in-shadow value taken from the shadow multiplier calculated in the vertex shader.

	// Handle shadow
	fixed sh = CheckShadow(i.objectPosition, i.objectLightDir, i.shadowData);
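To give an idea of what CheckShadow does, here is the same logic written out as plain C#, with completely made-up station dimensions (the real version runs in the fragment shader and uses the hard-coded areas selected via shadowData):

    using UnityEngine;

    // A plain C# rendition of the idea behind CheckShadow, with hypothetical dimensions.
    public static class ShadowSketch
    {
        public static float CheckShadow(Vector3 pos, Vector3 lightDir, float notInShadow)
        {
            const float planeZ = 0f;          // hypothetical Z of the shadow-casting plane
            const float discRadius = 50f;     // hypothetical radius of the central section
            const float poleHalfWidth = 2f;   // hypothetical half-width of the ring poles

            // Project the fragment position along the light direction onto the plane.
            float dist = (pos.z - planeZ) / lightDir.z;
            Vector2 p = new Vector2(pos.x - lightDir.x * dist, pos.y - lightDir.y * dist);

            // In shadow if the projected point falls inside the central disc or a pole.
            if (p.sqrMagnitude < discRadius * discRadius ||
                Mathf.Abs(p.x) < poleHalfWidth || Mathf.Abs(p.y) < poleHalfWidth)
                return 0f;
            return notInShadow;               // lit, scaled by the vertex shader multiplier
        }
    }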

Then it is just a matter of checking for luminance (which is not affected by the shadow) and specularity (which is affected by the shadow) to get the final color of the pixel. The luminance uses the second texture Blue channel, and the specularity the second texture Red channel multiplied by the specularity value pre-calculated in the vertex shader.

	// Handle luminance
	if (texn.b > 0.5)
	    return i.specular * texn.r * sh + col;
	// Handle specular
	col = sh * (i.specular * texn.r + 
	// Handle bumpiness
	col * DotClamped(normalize(i.lightDirection), tangentSpaceNormal) * _LightColor0);
	return col;

The resulting fragment shader takes only 8.5 GPU cycles worst case, and only 2.5 GPU cycles best case, according to the Mali Offline Compiler. These are pretty good values in my opinion, considering all the stuff the shader needs to handle. The vertex shader however takes 30 GPU cycles, most of which is caused by the rotating/non-rotating part handling, which I could get rid of completely if I had the station in two parts. However, if I split it into two parts, I would have to come up with some different way of handling the rotated shadows on the stationary solar panels, and as even the solar panel part of the station has more than 300 vertices, it could not be batched into the same draw call as the rest of the station. So, I would get rid of one problem and generate two new problems, so I am not yet sure if that change would be worth it. This is the Mali Offline Compiler result for the fragment shader:

  3 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   24      5       2       A
  Shortest Path Cycles:   2.5     2       2       A
  Longest Path Cycles:    8.5     5       2       A

The Result

Here below is a small video of me running the game in the Unity editor (using my Oculus Rift), and recording the editor window. It shows me flying around the space station, so you can see the shadows, luminance and specular handling in action. The specular reflections are visible on the solar panels and on the various windows, while the luminance shows on the windows of the shadow side of the station, and also on the fuel/water tanks.

The next step is to start working on the damaged textures, to handle the effects of ships shooting at the station. This will probably keep me busy for the next couple of weeks, and after that I can hopefully move on to creating the cruiser. I keep learning new tricks every step of the way, so after I have done the cruiser and the fighter ships, I will probably have learned a lot of new tricks I can use to improve my space ship cockpit textures and shader. As always, thank you for your interest in my project!

Mar 8th, 2018 - Splitting Asteroids

Game code from LineWars II to LineWars VR

Most of my time during the last month and a half has been spent working on code that handles laser rays hitting an asteroid, but before I started working on that, I ported some of the game logic code from my old LineWars II over to LineWars VR. I started with the Demo scene, where a group of Cobra fighters attack a pirate starbase defended by some Pirate fighters. It took about a week to get the code working. The main issue was changing the original Euler angles for ship directions to the Quaternions that Unity uses. This needed quite a bit of trial and error to get working, with me trying to understand how quaternions actually work.

After I got the game code mostly working, I also added a HUD display, which shows the type, distance and speed of a ship that is directly in front of the player's ship. This HUD information is displayed around the crosshairs, just like it was in the original LineWars II.

Asteroid hit work

Then I began working on the main feature of this blog post, the laser hitting an asteroid. I had an idea of doing this in three phases:

  1. Determine where (on which polygon) the laser hits the asteroid, and play a hit animation at that position.
  2. If the asteroid is sufficiently large, generate a crater around this hit position.
  3. Explode the asteroid into fragments, when a big asteroid has been hit several times, or straight after the first hit if the asteroid is very small.

Determining the hit position

Unity does have a physics system that could handle most of this stuff pretty much automatically, but I decided not to use that, as I am not sure about the performance of the system on mobile devices, and the system is pretty much a black box. I like to know exactly what the game code is doing, so I decided to port the collision code from my original LineWars II over, and then start enhancing that with more features.

The first step was to determine whether the laser ray actually hits the asteroid, and if so, where. For a rough collision test I use a simple bounding sphere (as my asteroids are somewhat round in shape). If it looks like the laser ray is close enough to the asteroid center point, I then use a more exact collision detection. I found a good algorithm for a ray-triangle intersection test on the Unity Answers pages. I could use this algorithm pretty much as-is; I just added a test that the triangle and the laser do not face the same way (as that would mean the laser hits the back side of the asteroid, which is not what I want). This test removes about half of the triangles and thus saves some CPU time. I used the System.Diagnostics.Stopwatch to check the number of ticks these tests take (when run in the editor): the full intersection test for all 528 triangles of the asteroid takes between 1218 and 1291 ticks, while the intersection test leaving out the triangles facing the wrong way takes between 765 and 915 ticks.
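The test itself is a standard Möller-Trumbore style ray/triangle intersection. A C# sketch of that kind of routine, including the back-face rejection, might look like this (not necessarily the exact code from Unity Answers):

    using UnityEngine;

    public static class RayTriangle
    {
        // Möller-Trumbore style one-sided ray/triangle test: triangles facing away from
        // the ray are rejected up front (the back-side cull mentioned above). Assumes the
        // winding order is such that Cross(e1, e2) points outwards; flip the det test otherwise.
        public static bool Intersect(Vector3 origin, Vector3 dir,
                                     Vector3 v0, Vector3 v1, Vector3 v2, out Vector3 hit)
        {
            hit = Vector3.zero;
            Vector3 e1 = v1 - v0;
            Vector3 e2 = v2 - v0;
            Vector3 p = Vector3.Cross(dir, e2);
            float det = Vector3.Dot(e1, p);
            if (det < 1e-6f)
                return false;                 // parallel to, or facing away from, the ray
            float invDet = 1f / det;
            Vector3 t = origin - v0;
            float u = Vector3.Dot(t, p) * invDet;
            if (u < 0f || u > 1f)
                return false;
            Vector3 q = Vector3.Cross(t, e1);
            float v = Vector3.Dot(dir, q) * invDet;
            if (v < 0f || u + v > 1f)
                return false;
            float dist = Vector3.Dot(e2, q) * invDet;
            if (dist < 0f)
                return false;                 // intersection is behind the laser origin
            hit = origin + dir * dist;        // exact hit position in the same space as the inputs
            return true;
        }
    }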

Using this ray-triangle intersection test I was able to determine which triangle of the asteroid got hit, and I can even get the exact hit position in world coordinates quite easily. I then used a scene from Star Wars Episode 2 to check the timing of the hit flash and the speed of the explosion fragments, and tried to generate something similar in Cinema 4D using the Explosion FX deformer on my asteroid mesh, together with some flash footage. Below is an animated gif of the hit animation I came up with. This will be played on a quad facing the camera whenever a laser ray hits the asteroid. (Note that the speed of this animated gif is not necessarily the same as what the speed of the animation is inside the game. The animation should last one second, but your browser may run it faster or slower.)

I even added code to my shader rendering the animation, so that the color of the fragments varies depending on how much sunlight falls on the surface of the asteroid that got hit. So, if the laser ray hits a shadow side of the asteroid, you see a flash, but the ejected fragments are almost black. However, hitting a brightly lit side of the asteroid shows bright fragments ejecting from the hit position.

Creating craters

Next, I started working on the code that would dynamically generate craters into the asteroid mesh around the hit position. I decided to aim for a crater with a radius of 5 meters (or Unity units), which meant that I had to have a way of finding the vertices, triangles and edges that fall within this radius from the hit position.

Since I only knew the one triangle that got hit, and Unity meshes do not have a way of easily finding adjacent triangles, I added a list called V2t (for Vertex-To-Triangles) into my asteroid GameObjects, which I fill when creating the asteroids. This list contains, for each vertex in the mesh, the triangles that the vertex is a part of. This way I could easily find the adjacent triangles of my hit triangle. However, I soon realized that this was not enough, as my asteroids consist of several texture UV sections, which meant that Unity has duplicated some of the vertices. Thus, I needed to add still another list, keeping track of all the duplicates of each vertex, to be able to locate an adjacent triangle even if it does not share vertices with the current triangle. These two additional lists began to make things rather more complex than I would have liked.
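Building the V2t lookup itself is straightforward; here is a sketch of the idea (a hypothetical helper, not my exact code):

    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical sketch of building the vertex-to-triangles ("V2t") lookup,
    // so the triangles sharing any given vertex can be found without scanning.
    public static class MeshAdjacency
    {
        public static List<int>[] BuildVertexToTriangles(Mesh mesh)
        {
            int[] tris = mesh.triangles;
            var v2t = new List<int>[mesh.vertexCount];
            for (int i = 0; i < v2t.Length; i++)
                v2t[i] = new List<int>();
            for (int t = 0; t < tris.Length; t += 3)
            {
                int triIndex = t / 3;
                v2t[tris[t]].Add(triIndex);      // each corner vertex references this triangle
                v2t[tris[t + 1]].Add(triIndex);
                v2t[tris[t + 2]].Add(triIndex);
            }
            return v2t;
        }
    }

A similar pass that groups vertices by their position would produce the duplicate-vertex list mentioned above.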

Well, now that I could find the adjacent triangles, the next step was to find the edges of the triangles that get intersected by the crater rim, so that I could then split the triangles along the crater rim. I obviously wanted to have separate triangles for inside and outside the crater rim. For this intersection test I found a good ray-sphere intersection test algorithm, which I could modify to test for intersections along the edges. Thus, my algorithm basically consists of checking whether each corner vertex (p1, p2 and p3) of a triangle is inside or outside of the crater (with midpoint at p0), like this:

    // Check how many vertices are inside the crater.
    int tst = ((p1 - p0).sqrMagnitude < mhd.craterSqrRadius ? 1 : 0) +
              ((p2 - p0).sqrMagnitude < mhd.craterSqrRadius ? 2 : 0) +
              ((p3 - p0).sqrMagnitude < mhd.craterSqrRadius ? 4 : 0);

This gave me a number between 0 (no vertices are inside the crater) and 7 (all vertices are inside the crater), with the bits of the tst value determining which edges are intersected by the crater. This I could then use in a switch statement to try to handle each of the separate cases. Here below is an image from my quad grid notebook where I had doodled some examples of these different intersections, in an attempt to figure out how to handle them, and to help me to keep track of which vertex is which when implementing the code.

As you can see from the above image, even if no vertices of the triangle are inside the crater, it is still possible that the crater rim intersects one or more of the triangle edges. Thus, I added the code below, using the Ray-Sphere intersection test, to calculate another variable tst2, which keeps track of how many intersections there are on each of the triangle edges.

    // Check for edge intersections.
    // When tst == 0, usual tst2 values are 9 (100 100), 18 (010 010), 27 (110 110), 36 (001 001), 45 (101 101), 54 (011 011) and 63 (111 111).
    t12a = RaySphereIntersect(p1, (p2 - p1), p0, mhd.craterSqrRadius, out t12b);
    t13a = RaySphereIntersect(p1, (p3 - p1), p0, mhd.craterSqrRadius, out t13b);
    t23a = RaySphereIntersect(p2, (p3 - p2), p0, mhd.craterSqrRadius, out t23b);
    int tst2 = (t12a > 0.0f && t12a < 1.0f ? 1 : 0) +
               (t13a > 0.0f && t13a < 1.0f ? 2 : 0) +
               (t23a > 0.0f && t23a < 1.0f ? 4 : 0) +
               (t12b > t12a && t12b < 1.0f ? 8 : 0) +
               (t13b > t13a && t13b < 1.0f ? 16 : 0) +
               (t23b > t23a && t23b < 1.0f ? 32 : 0);

So, now things began to get quite complex, as I had to handle all these different cases, and not just for the triangle that got hit, but for all the adjacent triangles, as long as there are triangles with any edge intersections around the original triangle. I spent a couple of weeks working on this code, and got it to work reasonably well on the original asteroid mesh, but when trying to generate a new crater that overlaps an existing crater, I ran into such severe problems (stack overflows and other hard-to-trace occasional bugs in my code) that I eventually decided to abandon this code for now. That was pretty frustrating, as I would really have liked to have craters appear on the asteroids when shooting them.

Exploding the asteroid

Instead of fighting with the crater creation for weeks and weeks, I decided to start working on the code that would eventually split and explode the asteroid. I wanted to have a sort of crumbling effect, so that the asteroid does not simply blast into small polygons, but instead crumbles in a convincing way for a large asteroid. This meant that I had to spread the changes over several frames instead of doing everything at once. I decided to do this also in three parts:

  1. Since my asteroid has six separate texture UV sections, I decided to split the asteroid initially into six fragments along the UV sections, as those section rims already had duplicated vertices.
  2. During the next step, I build proper asteroid fragments from these six sections. This basically means connecting all the rim vertices to a new fragment-specific vertex at the center of the asteroid.
  3. For every frame after that, I move the six sections away from each other, and start splitting triangles away from the rims.

The first part was pretty easy, as I could just check each vertex and determine the section it belongs to using its UV coordinates. I created six separate lists for the vertices of each section, and since the sections were aligned along the local axes of the asteroid, it was easy to determine the direction where each section should move.

During the second frame after the explosion has started, I then generate the new center vertex, and generate new triangles to join all the rim vertices to this new center vertex, for all the six parts. For determining the rim vertices I could use my vertex duplicate lists, since if a vertex has a duplicate, it must be a rim vertex. My algorithm first looks for any duplicated vertex, and then starts traversing the rim (that is, looking for an adjacent duplicated vertex) until we get back to the original vertex. Here I had to handle one special case, since in a corner triangle all three vertices are on the rim, so I had to make sure I follow the correct edge (and do not accidentally cut the corner). I then add new vertex duplicates for each of these rim vertices (to get a sharp angle with different normal directions), and create the new triangles. The normal and tangent directions of the center vertex were a bit problematic, until I decided to just point the normal away from the sun, which has the effect of making the center of the asteroid look black from all directions, which in my opinion looks fine.

During all the following frames (until I determine the explosion has run sufficiently long) I randomly select a rim triangle of the section, and remove it from the main fragment body, generate new vertices for it, and start moving it away from the main fragment body. I also make all these separated small fragments smaller every frame, so that they eventually vanish. All this work is done using the single mesh, so even though it looks like many separate parts, it actually still is just a single GameObject in Unity.

Since the asteroid originally has 528 triangles, and eventually all of these triangles may get separated into a four-triangle fragment, the triangle count can increase up to 528*4 = 2112. Similarly, the original vertex count of 342 can get up to 5280 vertices (as every original triangle becomes a fragment with 10 vertices). Both of these numbers are still within sensible limits though, especially considering that only a few asteroids should be both visible and in the explosion phase at any given time in the game.

Here below is a YouTube video illustration of my asteroid explosion routine in action:

Jan 26th, 2018 - Cobra cockpit work

Cockpit model from my Snow Fall project

For the past month or so I have been mainly working on creating the Cobra cockpit mesh and texturing it. I started with the main components of my Snow Fall project ship cockpit (which in turn is loosely based on the real Space Shuttle glass cockpit). I think this sort of a retro ship cockpit, without any fancy holographic instruments, suits the feel of my game the best. The first problem I had was with the correct scale of the cockpit. After many tests and trials I ended up with an instrument panel that is about 3 metres wide (as the ship is a two-seater) in Cinema 4D, but as that felt a bit too big in Gear VR, I scaled it by 0.9 when importing the mesh to Unity. That size seems to be at least close to correct.

I redid almost all parts of the model, trying to get by with as few vertices as possible. I also decided to use flat shading for the cockpit, based on the excellent Flat and Wireframe Shading tutorial from Catlike Coding (Jasper Flick). That way I don't get duplicated vertices for sharp edges in the mesh when Unity imports it, rather I can disable normals in the mesh completely, and then calculate them as needed in the fragment shader.

Dynamic Soft Shadows Based on Local Cubemap

I had found this interesting blog post on the Arm Mali community about Dynamic Soft Shadows Based on Local Cubemap. This is a trick for getting proper dynamic shadows that emulate light shining into a room through some windows, using a baked cube map instead of any real-time shadow calculations. I thought it might fit my use case pretty well, as I wanted to have the sunlight coming through the cockpit windows hit the instruments and walls of my cockpit. The problem I have is that my cockpit is not exactly rectangular, and the original algorithm expects a rectangular room, for which it calculates the correct shadow position using a Bounding Box of the room size. I do have some ideas about how to solve this issue, but haven't yet had time to fully implement them. I do have the basic system working already, though, and it looks pretty neat in my opinion!

The blog post (and the corresponding Unity sample project) also gives code for calculating dynamic shadows for moving objects, which I think I might need for getting proper shadows from all the switches, the joystick, the pilot's body parts, and such. To be ready for this, I decided to split my cockpit into two meshes, one containing the base cockpit structure, using the flat shading, and another containing all the separate switches and other (possibly even moving) objects which should generate shadows on the various cockpit panels. I decided to use a different shader for this object, with normals, as most of these objects should not be flat shaded. This of course adds one Draw Call, but I don't think having an extra Draw Call for the cockpit is that much of an issue, considering the cockpit is the closest object to your eyes, and thus should be the most detailed.

I have already tested these dynamic shadows as well, but the code has a lot of issues (for nicer results I should up the shadow texture resolution to 2048x2048 pixels, but that would cause rather significant extra work for the GPU, and even then the shadows are sometimes not at exactly the correct position), so I am not yet sure if I will actually implement this part of the code at all. I think that with these issues and the slowdown, the trouble is perhaps not worth the effort. Besides, even John Carmack has said "Don't try to do accurate dynamic shadows on GearVR. Dynamic shadows are rarely aliasing free and high quality even on AAA PC titles, cutting the resolution by a factor of 16 and using a single sample so it runs reasonably performant on GearVR makes it hopeless."

By the way, there was one issue with the dynamic shadows that I fought with before I managed to solve it: the shadow texture was upside down on my Oculus Rift (which I use for quick tests)! I spent a bit too long googling for this, considering it is a known issue: the V texture coordinate is reversed in the Direct3D system compared to OpenGL, for which the original algorithm and shaders were coded. I managed to fix this issue by replacing this code (in the original RoomShadows.shader):

	// ------------ Runtime shadows texture ----------------
	// ApplyMVP transformation from shadow camera to the vertex
	float4 vertexShadows = mul(_ShadowsViewProjMat, output.vertexInWorld);

	output.shadowsVertexInScreenCoords = ComputeScreenPos(vertexShadows);

	return output;
}
with this code (I needed to base my change on ComputeNonStereoScreenPos() instead of the original ComputeScreenPos(), which uses separate coordinates for each eye when in VR, and thus displayed the shadows in one eye only!):
	// ------------ Runtime shadows texture ----------------
	// ApplyMVP transformation from shadow camera to the vertex
	float4 vertexShadows = mul(_ShadowsViewProjMat, o.vertexInWorld);

	o.shadowsVertexInScreenCoords = ComputeNonStereoScreenPosNew(vertexShadows);
	
	return o;
}

inline float4 ComputeNonStereoScreenPosNew (float4 pos) {
	float4 o = pos * 0.5f;
	#if defined(UNITY_HALF_TEXEL_OFFSET)
		o.xy = float2(o.x, o.y /** _ProjectionParams.x*/) + o.w * _ScreenParams.zw;
	#else
		o.xy = float2(o.x, o.y /** _ProjectionParams.x*/) + o.w;
	#endif
	o.zw = pos.zw;
	return o;
}
That is, I commented out the _ProjectionParams.x multiply, so the shadow texture is always read the correct way up.

Cockpit texture packing

Same as with my Snow Fall project, I want to have the cockpit of my LineWars VR game as detailed as I can make it (without sacrificing performance, obviously). Even as a kid I built all sorts of plane cockpits (using cardboard boxes) with detailed instruments, so my interest in detailed cockpits must be trying to fulfill some sort of childhood dream of sitting in the cockpit of an aeroplane. :-) Anyways, for my Snow Fall project I had purchased a book called The Space Shuttle Operators Manual, which has detailed schematics for all the instrument panels of the Space Shuttle. I had scanned these pages and converted them to emissive textures for my Snow Fall project, but in LineWars VR I needed them to also have some other details, so I decided to re-scan the schematics. (By the way, it seems that the same schematics can be found in this PDF from NASA, which even has the new glass cockpit instrumentation that my original book did not have: Space Shuttle Crew Operations Manual.)

After scanning all the schematics of the panels I wanted to have in my Cobra cockpit, I tried to fit them into a single rectangular texture (in Snow Fall all the textures were separate, with various sizes, most over 2048 pixels per side, and there were dozens of these textures!). I noticed I could just about fit them with still readable texts and symbols if I used a 4096x4096 texture. However, a texture of this size would take 48 megabytes uncompressed, and as all the recommendations for Gear VR state that textures should be kept at 2048x2048 or below, I began looking into ways to make the texture atlas smaller.

I decided to go with "gray packing", as most of the information in my textures has to do with the instrument panel switches illumination, and all the panels themselves are pretty much just gray. Thus, I created a C# script for Unity, which reads my 4096x4096 uncompressed BMP texture, and generates a 2048x2048 32-bit image from it, with each 2048x2048 area of the original image in one of the Red, Green, Blue and Alpha channels. Using ETC2 compression, I was able to get practically all the information from the original 48MB BMP file into a 4MB texture! The actual packing routine is pretty simple, it just gets the input bytes array, offset into the BMP file where the actual data starts, and width and height of the original file, and it packs the data into the four pixel planes into outbytes array (with room for the 54-byte BMP header):

    private void Convert(byte[] outbytes, byte[] inbytes, int inoffs, int w, int h)
    {
        // BMP pixel format = Blue, Green, Red, Alpha
        for (int y = 0; y < h; y++)
        {
            for (int x = 0; x < w; x++)
            {
                outbytes[54 +                                    // skip the 54-byte BMP header
                    (4 * (w >> 1) * (y & ((h >> 1) - 1))) +      // output row = y modulo h/2, 4 bytes per output pixel
                    (4 * (x & ((w >> 1) - 1))) +                 // output column = x modulo w/2
                    (y * 2 < h ? 2 : 0) +                        // lower input half goes to the Red/Alpha bytes, upper half to Blue/Green
                    (x * 2 < w ? 0 : 1)                          // left input half takes the first byte of that pair, right half the second
                    ] = inbytes[inoffs + 3 * (y*w + x)];         // first (Blue) byte of the 24-bit grayscale input pixel
            }
        }
    }
That code is obviously not the most efficient way to do this, but since I only run it in the Unity editor whenever the texture BMP changes (which does happen often, now that I am working on the textures), it does not matter whether it takes 100ms or 500ms to run.

Of course this packing of the texture also needed some changes to the vertex and fragment shaders, to look up the correct texture coordinates and select the correct color plane, and also to convert the grayscale texture value to the yellowish instrument panel illumination color. In my CobraCockpitShader.shader code I use a vertex-to-fragment structure that looks something like this:

	struct v2f
	{
		float4 vertex : SV_POSITION;
		float2 uv : TEXCOORD0;
		fixed4 channel: TEXCOORD1;
	};
The other items are pretty much standard, but the channel element is the one that handles the color plane masking. It is set up in the vertex shader like this:
	o.uv = 2 * TRANSFORM_TEX(v.uv, _MainTex);
	o.channel = max(fixed4(1 - floor(o.uv.x) - floor(o.uv.y), floor(o.uv.x) * floor(o.uv.y), floor(o.uv.y) - floor(o.uv.x), floor(o.uv.x) - floor(o.uv.y)), 0);
That is, all the texture coordinates are multiplied by two (so they get the range of 0..2 instead of 0..1, to map from 0..4096 to 0..2048 texels). Since the texture parameters use wrapping, the coordinates that are over 1 simply get mapped back to the range 0..1, but I can use these 0..2 coordinate ranges to determine the correct "quadrant" of the texture. The floor function converts the coordinate to integer, so it can only get a value of 0 or 1, and thus the UV coordinates map to one of the four "quadrants" (with the V coordinate reversed for OpenGL texture orientation): (0,1) = Red, (1,1) = Green, (0,0) = Blue, and (1,0) = Alpha. The channel setting uses some arithmetic to get only one of the four color components set, based on which of the UV coordinates were over 1, without using any conditional operations.

Then, in the fragment shader, I take only the wanted color channel from the texture, and switch to yellowish color if the resulting color is above a threshold, like this:

	// sample the texture
	fixed4 col = tex2D(_MainTex, i.uv) * i.channel;
	// Only one of the channels has data, so sum them all up to avoid conditionals
	fixed tmp = col.r + col.g + col.b + col.a;
	// Clamp the colors so we get yellow illumination with gray base color.
	col = min(tmp, fixed4(1, 0.7f, 0.4f, 1));

Cinema 4D C.O.F.F.E.E. UV Plugin

I find modeling pretty easy, but texturing in Cinema 4D is something I constantly struggle with. Perhaps my workflow especially with this project is not very well suited to the way the UV tools in Cinema 4D work. I have a BMP image containing a texture atlas, and I want to map certain polygons in my model to certain exact UV coordinates in this already existing texture atlas. At first I simply used the Structure view of Cinema 4D to input the coordinates by hand, but that got boring and error-prone pretty quickly. I then decided to look into creating my own plugin to make this job easier.

I managed to create a plugin that finds a point (vertex) that is currently selected in the mesh, and then looks for all the selected polygons sharing this point, and gets the UV coordinates from the UVW tag for this point in the polygon. These coordinates (which are floating point numbers between 0 and 1) are then converted to 0..4096, to match with my texture image, and displayed in a popup window.

Then when I input new coordinates, it sets all the selected polygons' UVW coordinates for this point to the given value (converted back from the 0..4096 range to 0..1). Thus, using this plugin makes it easier for me to map locations in the texture atlas to UV coordinates, and since I can select which polygons should be affected, I can avoid (or create, when necessary) discontinuities in the UV coordinates, which would make Unity duplicate the vertex when importing the mesh. Even though the plugin is a bit buggy and quite rudimentary, it has been a big help in my peculiar texturing workflow.

RenderScale, AntiAliasing and MipMapping

I mostly use my Oculus Rift when working on my project, and only occasionally actually build and run the project on my Gear VR device. I began wondering why my cockpit does not look nearly as nice on Gear VR as it looks on Oculus Rift. The textures were flickering and did not look to be as detailed as on the Rift, even though I used the same textures, and the display resolution should be about the same. Even the skybox showing the background planet had clear aliasing problems, while it was very clean-looking on the Rift.

I first tried to increase the antialiasing (MSAA) level, but that did not seem to have much of an effect. After searching the net for answers, I finally found the RenderScale setting, and noticed that with the default RenderScale of 1.0 the eye buffer size was only 1024x1024 on the Gear VR, while on the Oculus Rift it was 1536x1776 per eye! This obviously caused a big difference in the apparent quality. I experimented with increasing the RenderScale to 1.5, which made the eye texture 1536x1536 on the Gear VR (and something huge, like 2304x2664, on the Rift). That got rid of the aliasing problem with the skybox, and the textures looked much more detailed, but there was still some texture crawl and star field flickering, on both Gear VR and Oculus Rift. On my Galaxy S6 the RenderScale of 1.5 also caused an occasional FPS drop, so that would not be a real solution for the texture problems.
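
In case it helps someone else, this is roughly how the setting can be changed from a script. This is a minimal sketch assuming a Unity version with the UnityEngine.XR.XRSettings API (older versions exposed the same thing as VRSettings.renderScale); the class name is just a placeholder:

    using UnityEngine;
    using UnityEngine.XR;

    public class EyeBufferSetup : MonoBehaviour
    {
        void Start()
        {
            // Scale the per-eye render target up from the device default
            // (which was only 1024x1024 on my Gear VR).
            XRSettings.eyeTextureResolutionScale = 1.5f;
            Debug.Log("Eye buffer size: " + XRSettings.eyeTextureWidth + "x" + XRSettings.eyeTextureHeight);
        }
    }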

I then ran across the article by John Carmack where he states that Mip Maps should always be enabled on Gear VR. Well, I did not have them enabled, as I thought that since the cockpit is so close to the eyes, there is no need to blur any of the cockpit textures. Just to test this, I enabled Mip Mapping, and contrary to my expectations, the textures got a lot calmer and the flickering was almost completely gone. The bad thing was that the texture compression artifacts (caused by my gray packing) became quite visible. At first I thought about some clever reordering of the texture atlas that could lessen the artifacts, but in the end I decided to go with an uncompressed texture for the cockpit instrument panels. Sadly, with Mip Mapping, this bloated the original 4MB texture to a whopping 21.3MB! However, I think I can have all my other textures compressed, so perhaps I can get away with one such huge texture in my game.
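
The import settings can of course be changed by hand in the Inspector, but for reference, an editor script along these lines would do the same thing using Unity's TextureImporter API. This is just a sketch; the menu item name and asset path are placeholders:

    using UnityEditor;
    using UnityEngine;

    // Needs to live in an Editor folder.
    public static class CockpitTextureSettings
    {
        [MenuItem("Tools/Fix Cockpit Atlas Import Settings")]   // placeholder menu entry
        static void Apply()
        {
            const string path = "Assets/Textures/CockpitAtlas2048.bmp";   // placeholder path
            var importer = (TextureImporter)AssetImporter.GetAtPath(path);
            importer.mipmapEnabled = true;                                 // calms down the texture crawl
            importer.textureCompression = TextureImporterCompression.Uncompressed; // avoids the ETC2 artifacts
            importer.SaveAndReimport();
        }
    }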

Cockpit instruments, clock and radar

Occasionally I get bored with working on the textures, and get sidetracked with some other feature that my game needs. One of the things I think every Virtual Reality game should have is a visible real-time clock when you are in VR. I don't know if it is just me, but usually when I am playing a VR game I only have a certain amount of time before I need to do something else, and it is pretty annoying trying to check the time by peeking out of the VR glasses. Thus, I began experimenting with ways to get a clock display into my cockpit. I had already implemented a simple UI panel in the center MFD (multi-function display), which I certainly could have used for a clock, but I wanted to check if there was a way to add instruments without adding any Draw Calls to my project.

The center panel (based on the Space Shuttle center panel) happened to have a slot for a timer, which I had some trouble deciding on how to model or texture. I decided to change this into a digital clock, so I could kill two birds with one stone, so to speak: Have a clock visible, and have the timer area actually do something useful in the center panel. I had an idea of adding the number (and letter) glyphs into my cockpit texture atlas, and then just switching the UV coordinates in my cockpit mesh whenever the clock changes (once per minute). This would neatly avoid any extra draw calls, and I thought that updating the UV coordinates of my base cockpit mesh (which at the moment has 737 vertices and 1036 triangles inside Unity, and 566 points/688 polygons in Cinema 4D) once a minute should not cause much of a slowdown. However, to be able to update just the UV coordinates of certain polygons in the cockpit mesh, I needed a way to find those polygons!

I couldn't use anything like the index of a point or polygon from Cinema 4D to find the clock face polygons, as Unity rearranges the vertices and triangles when it imports the mesh. I needed to find the correct UV coordinate array indices within Unity, but to do that I needed to have something set up in Cinema 4D to flag the polygons I wanted to find. I decided to simply flag the left edge of the clock face polygons with a UV coordinate U value of 0.5, as nothing else in my mesh uses that exact value. Of course I could also have used for example 0 or 1, but as Cinema 4D gives those coordinates to newly created polygons, I did not want that to cause problems. This is how the polygons are organized in the object inside Cinema 4D (don't mind the Min/Sec headers; the clock will show Hour/Min, I was just too lazy to change the texture, as that text is so small it will not be readable in the game):

So, now I only needed to find the corresponding triangles in Unity, find their UV indices (in the correct order), store these, and then use them to display a number glyph whenever the current time changes. Sounds simple, but it took a bit of trial and error to find the simplest algorithm to handle this. In my day job I have used C# and Linq quite extensively, so I reverted to my Linq toolbox for these algorithms, as performance is not critical during this setup phase. Here is the routine I came up with, hopefully sufficiently commented, so that you can see what it does:

    using System;                        // DateTime
    using System.Collections.Generic;    // List<T>
    using System.Linq;
    using UnityEngine;

    int[,] m_clockUVIndices = new int[4, 4];

    void PrepareClock()
    {
        // Find the clock vertices
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector2[] uvs = mesh.uv;
        Vector3[] verts = mesh.vertices;
        // Find all the vertices flagged with "uv.x == 0.5f"
        List<int> vidxs = new List<int>();
        for (int i = 0; i < verts.Length; i++)
            if (uvs[i].x == 0.5f)
                vidxs.Add(i);
        // Find the polygons that use these vertices, these are the digital clock face polygons.
        List<int> tidxs = new List<int>();
        int[] tris = mesh.triangles;
        for (int i = 0; i < tris.Length; i++)
            if (vidxs.Contains(tris[i]))
                tidxs.Add(i / 3);
        // Now tidxs contains all the triangles (including duplicates) that belong to the digital instrument faces.
        // We need to find the correct order of the triangles, based on the sum of the X and Y
        // coordinates of their vertices.
        tidxs = tidxs.Distinct()
                     .OrderBy(a => verts[tris[a * 3]].x + verts[tris[a * 3 + 1]].x + verts[tris[a * 3 + 2]].x)
                     .ThenBy(a => verts[tris[a * 3]].y + verts[tris[a * 3 + 1]].y + verts[tris[a * 3 + 2]].y).ToList();
        // Next, reorder the vertices of each pair of triangles for our final UV coordinate array.
        for (int i = 0; i < 4; i++)
        {
            List<int> tmp = new List<int>
            {
                tris[tidxs[i*2] * 3],
                tris[tidxs[i*2] * 3 + 1],
                tris[tidxs[i*2] * 3 + 2],
                tris[tidxs[i*2+1] * 3],
                tris[tidxs[i*2+1] * 3 + 1],
                tris[tidxs[i*2+1] * 3 + 2],
            };
            tmp = tmp.Distinct().OrderBy(a => verts[a].x).ThenByDescending(a => verts[a].y).ToList();

            m_clockUVIndices[i, 0] = tmp[0];
            m_clockUVIndices[i, 1] = tmp[1];
            m_clockUVIndices[i, 2] = tmp[2];
            m_clockUVIndices[i, 3] = tmp[3];
        }
    }

Now that I had the UV indices that need changing stored, it was a simple matter to change them whenever the current minute changes. Below is the code that does that, by checking the current minute against the last updated minute. Don't get confused by the const values having X and Y in their names; they mean the texture U and V coordinates, I just prefer the X and Y terminology over U and V:

    const float X_START = 2048f / 4096f;	// Start of the number glyph U coordinate
    const float Y_START = 1f - (2418f / 4096f);    // Start of the letter 0 in the texture atlas
    const float X_END = 2060f / 4096f;	// End texture U coordinate of the number glyph
    const float Y_SIZE = -((2435f - 2418f) / 4096f); // Height of the number glyph we want to display
    const float Y_STRIDE = -23f / 4096f;	// How much to travel to find the next number glyph

    int m_currentMinute = -1;

    void UpdateClock()
    {
        DateTime curTime = DateTime.Now;
        if (curTime.Minute != m_currentMinute)
        {
            // Update the clock when the current minute changes.
            m_currentMinute = curTime.Minute;
            Mesh mesh = GetComponent<MeshFilter>().mesh;
            Vector2[] uvs = mesh.uv;
            // Set the lower digit of the minute
            float y = Y_START + Y_STRIDE * (m_currentMinute % 10);
            uvs[m_clockUVIndices[3, 0]] = new Vector2(X_START, y);
            uvs[m_clockUVIndices[3, 1]] = new Vector2(X_START, y + Y_SIZE);
            uvs[m_clockUVIndices[3, 2]] = new Vector2(X_END, y);
            uvs[m_clockUVIndices[3, 3]] = new Vector2(X_END, y + Y_SIZE);
            // Set the higher digit of the minute
            y = Y_START + Y_STRIDE * (m_currentMinute / 10);
            uvs[m_clockUVIndices[2, 0]] = new Vector2(X_START, y);
            uvs[m_clockUVIndices[2, 1]] = new Vector2(X_START, y + Y_SIZE);
            uvs[m_clockUVIndices[2, 2]] = new Vector2(X_END, y);
            uvs[m_clockUVIndices[2, 3]] = new Vector2(X_END, y + Y_SIZE);
            // Set the lower digit of the hour
            y = Y_START + Y_STRIDE * (curTime.Hour % 10);
            uvs[m_clockUVIndices[1, 0]] = new Vector2(X_START, y);
            uvs[m_clockUVIndices[1, 1]] = new Vector2(X_START, y + Y_SIZE);
            uvs[m_clockUVIndices[1, 2]] = new Vector2(X_END, y);
            uvs[m_clockUVIndices[1, 3]] = new Vector2(X_END, y + Y_SIZE);
            // Set the higher digit of the hour (24-hour clock)
            y = Y_START + Y_STRIDE * (curTime.Hour / 10);
            uvs[m_clockUVIndices[0, 0]] = new Vector2(X_START, y);
            uvs[m_clockUVIndices[0, 1]] = new Vector2(X_START, y + Y_SIZE);
            uvs[m_clockUVIndices[0, 2]] = new Vector2(X_END, y);
            uvs[m_clockUVIndices[0, 3]] = new Vector2(X_END, y + Y_SIZE);
            mesh.uv = uvs;
        }
    }

The end result, with the game running on Oculus Rift, and this image captured from the editor, looks like the following (with the clock showing 12:42):

As you can see from the image above (well, besides the badly unfinished texturing of the center console), I also worked on some radar code. The radar display uses much of the same techniques: the legend showing the number of friendly, enemy, and other objects uses the exact same texture UV coordinate changing, and the radar blips actually change the triangle mesh coordinates similarly. That creates a 3D radar view (not a hologram, mind you, just a simple 3D display, which even current-day technology is capable of, like in the Nintendo 3DS) that shows the direction and distance of the other ships and asteroids. I decided to use a non-linear scale in the radar, so that it adapts to the furthest object: the distances are first square-rooted and then scaled based on this furthest distance. This way even objects that are relatively close still clearly show their direction relative to your own ship.
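
The actual radar code is tangled up with my instrument handling, but the distance compression idea is simple enough to show as a standalone sketch. This is my own illustration with made-up names, not the code from the game:

    using UnityEngine;

    public static class RadarMath
    {
        // ownPos/ownRot: the player ship; targets: everything the radar should show;
        // radarRadius: the radius of the 3D radar display in cockpit units.
        public static Vector3[] BlipPositions(Vector3 ownPos, Quaternion ownRot,
                                              Vector3[] targets, float radarRadius)
        {
            // Find the furthest object, so the radar scale adapts to it.
            float maxDist = 1f;
            foreach (Vector3 t in targets)
                maxDist = Mathf.Max(maxDist, Vector3.Distance(ownPos, t));
            float maxSqrt = Mathf.Sqrt(maxDist);

            Vector3[] blips = new Vector3[targets.Length];
            for (int i = 0; i < targets.Length; i++)
            {
                // Direction in the player ship's local frame, so the radar turns with the ship.
                Vector3 rel = Quaternion.Inverse(ownRot) * (targets[i] - ownPos);
                float dist = rel.magnitude;
                // Square-root compression: even nearby objects get a clearly visible offset.
                float scaled = Mathf.Sqrt(dist) / maxSqrt * radarRadius;
                blips[i] = (dist > 0f ? rel / dist : Vector3.zero) * scaled;
            }
            return blips;
        }
    }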

Next steps

Well, I will continue working on texturing the cockpit, while occasionally testing some other interesting algorithms so that I don't get bored with the texturing work (which I really do not enjoy much). I just got an Android-compatible gamepad, and I have already done some basic work on reading the Gear VR Controller, so adding proper input handling is something I need to do pretty soon. I have also imported the original Pirate and Station meshes from LineWars II into my project as placeholders, so I could perhaps start working on the actual game mechanics in the near future.

Lots of work remaining, but at least the project does progress slowly but surely!

Dec 23rd, 2017 - Modeling and Texturing Asteroids

Asteroid references

A week or so ago I began looking into creating asteroids for LineWars VR. In the original LineWars II I had an asteroid mesh with 12 vertices and 20 polygons, which I scaled randomly and differently in all three dimensions to create asteroids of various sizes and shapes. For LineWars VR, however, I want something a bit more natural looking, so I spent a couple of days looking for ideas and tutorials about asteroid generation. I thought that modeling the asteroid mesh by hand would not create suitable variation, so I mainly looked into procedural asteroid generation. I even found a Unity forum thread about that exact subject, so I was certainly not the first one trying to do this. The Unity forum thread did not seem to have exactly what I was after, though. I also found a tutorial about creating asteroids in Cinema 4D, but those asteroids did not look quite like what I had in mind for LineWars VR.

Procedural textures

Finally I found a thread about procedural asteroid material in Blender, which seemed to have results much like what I was after. So, I decided to first look into creating a suitable texture for my asteroid, and only after that look into the actual shape of the asteroid. The example used a procedural texture with Cells Voronoi noise together with some color gradient. At first I tried to emulate that in Cinema 4D, but did not quite succeed. Finally I realized that the Cinema 4D Voronoi 1 noise actually generated crater-like textures when applied to the Bump channel, with no need for a separate color gradient or other type of post-processing! Thus, I mixed several different scales of Voronoi 1 (for different sized craters), and added some Buya noise for small angular-shaped rocks/boulders. For the diffusion channel (the asteroid surface color) I just used some Blistered Turbulence (for some darker patches on the surface) together with Buya noise (again for some rocks/boulders on the surface).

Procedural asteroid mesh

Okay, that took care of the textures, but my asteroid was still just a round sphere. How do I make it look more interesting? For my texturing tests I used the default Cinema 4D sphere object with 24 segments, which results in a sphere with 266 vertices. For my first tests to non-spherify this object, I simply randomized all these vertex coordinates in Unity when generating the mesh to display. This sort of worked, but it generated a lot of sharp angles, and the asteroid was not very natural-looking. Many of the online tutorials used the FFD (Free Form Deformation) tool of the modeling software to generate such deformed objects. I could certainly also use the FFD tool in Cinema 4D for this, but I preferred something that I could use within Unity, so that I could generate asteroids that are different during every run of the game, just like they were in the original LineWars II.

I decided to check if Unity had an FFD tool, and found a reasonably simple-looking FreeFormDeformation.cs C# script for Unity by Jerdak (J. Carson). I modified that code so that, instead of creating the control points as GameObjects for the Unity editor, I create the control points in code with some random adjustments, and then use these control points to deform the original sphere mesh while instantiating a new asteroid in Unity. After some trial and error with suitable random value ranges I was able to generate quite convincing asteroid shapes, at least in my opinion. This is my current random adjustment, which still keeps the asteroids mostly convex, so I don't need to worry about self-shadowing (as I want to have dynamic lighting, but plan to avoid real-time shadows for performance reasons):

    Vector3 CreateControlPoint(Vector3 p0, int i, int j, int k)
    {
        Vector3 p = p0 + (i / (float)L * S) + (j / (float)M * T) + (k / (float)N * U);
        return new Vector3(p.x * (0.5f + 4 * Random.value), p.y * (0.5f + 4 * Random.value), p.z * (0.5f + 4 * Random.value));
    }
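
For completeness, the loop that fills the control point lattice when a new asteroid is instantiated looks roughly like this. The field and method names here are my own illustration (only CreateControlPoint() above is the actual code); the deformation of the sphere vertices itself is then done by the modified FreeFormDeformation code:

    Vector3[,,] m_controlPoints;

    void RandomizeControlPoints(Vector3 p0)
    {
        // (L+1) x (M+1) x (N+1) control points spanning the S, T, U axes of the FFD volume.
        m_controlPoints = new Vector3[L + 1, M + 1, N + 1];
        for (int i = 0; i <= L; i++)
            for (int j = 0; j <= M; j++)
                for (int k = 0; k <= N; k++)
                    m_controlPoints[i, j, k] = CreateControlPoint(p0, i, j, k);
    }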

Exporting procedural textures from Cinema 4D to Unity

Now I had a nicely textured sphere in Cinema 4D, and a nice looking asteroid mesh in Unity, but I still needed to somehow apply the procedural texture generated in Cinema 4D to the mesh deformed in Unity. I first looked into some YouTube tutorial videos, and then began experimenting. Using the Bake Object command in Cinema 4D I was able to convert the sphere object into a six-sided polygon object with proper UV coordinates, together with baked textures.

To generate a normal texture for Unity from the bump channel in Cinema 4D I had to use the Bake Texture command, which gives me full control over which material channels to export, how the normals should be exported (using the Tangent method, as in the screen shots below), and so on.

When I imported this mesh into Unity, applied my Free Form Deformation to it (which meant I had to call the Unity RecalculateNormals() method afterwards), and applied the texture to the mesh, there were visible seams where the six separate regions met. After some googling I found a blog post that explained the problem, together with code for a better method to recalculate normals in Unity. I implemented this algorithm, and got a seamless asteroid! Here below is an animated GIF captured from the Unity game viewport (and speeded up somewhat).
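
The linked blog post has the complete solution, but the core idea that removes the seams can be sketched roughly like this: vertices that Unity has split because of differing UVs, but which share the same position, get one averaged normal. This is a simplified illustration of my own, not the code I actually use:

    using System.Collections.Generic;
    using UnityEngine;

    public static class SeamlessNormals
    {
        public static void Recalculate(Mesh mesh)
        {
            mesh.RecalculateNormals();              // start from Unity's per-vertex normals
            Vector3[] verts = mesh.vertices;
            Vector3[] normals = mesh.normals;

            // Group the vertex indices by position, so split vertices end up in the same group.
            var groups = new Dictionary<Vector3, List<int>>();
            for (int i = 0; i < verts.Length; i++)
            {
                List<int> list;
                if (!groups.TryGetValue(verts[i], out list))
                    groups[verts[i]] = list = new List<int>();
                list.Add(i);
            }

            // Average the normals inside each group and write the result back.
            foreach (List<int> list in groups.Values)
            {
                if (list.Count < 2)
                    continue;
                Vector3 sum = Vector3.zero;
                foreach (int i in list)
                    sum += normals[i];
                sum.Normalize();
                foreach (int i in list)
                    normals[i] = sum;
            }
            mesh.normals = normals;
        }
    }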

Asteroid shader

After I got the asteroid working with the Standard Shader of Unity, I wanted to experiment with coding my own shader for it. I had several reasons for creating a custom shader for my asteroid object:

  1. I wanted to learn shader programming, and this seemed like a good first object for experimenting with that.
  2. I had an idea of combining both the diffuse texture and the normal texture into a single texture image, as my diffuse color is just shades of gray. I can pack the 24bpp normal map with the 8bpp color map to a single 32bpp RGBA texture. This should save some memory.
  3. I wanted to follow the "GPU Processing Budget Approach to Game Development" blog post in the ARM Community. I needed to have easy access to the shader source code, and be able to make changes to the shader, for this to be possible.
  4. I am not sure how efficient the Standard Shader is, as it seems to have a lot of options. I might be able to optimize my shader better using the performance results from the Mali Offline Compiler for example, as I know the exact use case of my shader.
I followed the excellent tutorials by Jasper Flick from CatlikeCoding, especially the First Light and Bumpiness tutorials, when coding my own shader. I got the shader to work without too much trouble, and was able to check the performance values from the MOC:
C:\Projects\LineWarsVR\Help>malisc -c Mali-T760 Asteroid.vert
  4 work registers used, 15 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   19      10      0       A
  Shortest Path Cycles:   9.5     10      0       L/S
  Longest Path Cycles:    9.5     10      0       L/S

C:\Projects\LineWarsVR\Help>malisc -c Mali-T760 Asteroid.frag
  2 work registers used, 1 uniform registers used, spilling not used.

                          A       L/S     T       Bound
  Instructions Emitted:   9       4       1       A
  Shortest Path Cycles:   4       4       1       A, L/S
  Longest Path Cycles:    4       4       1       A, L/S
So, the vertex shader (which needs to calculate a BiNormal vector for the vertex, based on the existing Normal and Tangent vectors) takes 10 GPU cycles per vertex to execute. For the 266 vertices of the asteroid this means at most 2660 GPU cycles per asteroid, probably less if the back-facing polygons have been culled in an earlier step of the rendering pipeline. The fragment shader (which needs to calculate the TangentSpace normal vector from the normal map and the Normal, Tangent and BiNormal vectors provided by the vertex shader) takes only 4 GPU cycles per fragment (pixel). As my Galaxy S6 (which is close to the low end of the Gear VR-compatible devices) has a GPU processing budget of 28 GPU cycles per pixel, my asteroid is well within this budget.

Nov 28th, 2017 - The Beginning

Okay, so I decided to start working on a new game project, after quite a long while. Many times since I coded and released my LineWars II game, I have been thinking about getting back to coding a new space game. However, I hadn't dared to start working on one, as it seems that all games nowadays are built by large teams of developers, artists, and other professionals. I thought that a single person making a game in their free time would probably not have a chance of succeeding in competition against such big game projects. However, I recently ran across End Space for Gear VR, which idea-wise is a rather similar space shooter to what LineWars II was. Reading the developer's blog, I found out that it was actually created by a single person. As there are not all that many cockpit-type space shooter games for Gear VR, and End Space seems to be rather popular, I thought that perhaps there would also be interest in a Virtual Reality port of my old LineWars II game!

As the Gear VR runs on mobile devices, the graphics and other features need to be quite optimized and rather minimalistic to keep the frame rate at the required 60 fps. This nicely limits the complexity of the game, and also provides some challenges, so it should be a good fit for my talents. I am no graphics designer, but I do like to optimize code, so hopefully I can get some reasonably good looking graphics running fast. No need for a team of artists when you cannot take advantage of graphics complexity. :-)

Music

I heard about End Space at the end of November 2017, and after making the decision to at least look into porting LineWars II to Gear VR, I started looking at what sort of assets I already had or could easily create for this project. Pretty much the first thing I looked into was music. LineWars II used four pieces originally composed for the Amiga 500 by u4ia (Jim Young). He gave me permission to use those pieces in LineWars II, and I converted the original MOD files to a sort of hybrid MIDI/MOD format, in order to play the same music on a Creative SoundBlaster, Gravis UltraSound or Roland MT-32, which were the main audio devices at that time. By far the best music quality could be achieved by playing the music through a Roland MT-32 sound module. However, the only way to play that hybrid MIDI/MOD song format was within LineWars II itself, and I was not sure if I could somehow record the music from the game, now 24 years later!

After some experiments and a lot of googling, I managed to run the original Linewars II in DOSBox, together with the Munt Roland MT-32 software emulator and Audacity, and record the music into WAV files with a fully digital audio path, so the music actually sounded better than it had ever sounded on my real Roland LAPC-1 (an internal PC audio card version of the MT-32 sound module)! So, the music was sorted, what else might I already have that I could use in this project?

Missä Force Luuraa

That is the title of a Finnish Star Wars fan film from 2002. The film never got finished or released, but I created some 3D animation clips for it, as I was just learning to use Cinema 4D at the time (it was the only 3D modeling package I could understand after experimenting with the demo versions of many such packages). Now, as I was going through my old backup discs of various old projects, I found the scene files for all these animation clips. Looking at them brought back memories, but they also contained some interesting scenes, for example this tropical planet. This should fit nicely into LineWars VR, I would think, as pretty much all the missions happen near a planet.

Snow Fall

Back in 2002 I started working on a 3D animation fan film myself; the setting of my fan film "Snow Fall" is the events of a chapter of the same name in the late Iain M. Banks' novel "Against a Dark Background". This project has also been on hold for quite a while now, but every now and then I do seem to get back to working on it. The last time I worked on it was in 2014, when I added a few scenes to the movie. It is still very far from finished, and it seems the 3D animation technology progresses much faster than I can keep up with my movie, so it does not seem like it will ever get finished.

In any case, I spent a lot of time working on a detailed ship cockpit for this animation project. I believe I can use at least some of the objects and/or textures of the cockpit in my LineWars VR project. This cockpit mesh has around a million polygons, and uses a lot of different materials, most of which use reflections (and everything uses shadows), so I will need to optimize it quite a bit to make it suitable for real-time game engine rendering. Here below is a test render of the cockpit from June 28th, 2003.

Learning Unity

As I haven't worked with Unity before, there are a lot of things to learn before I can become properly productive with it. I am familiar with C#, though, so at least I should have no trouble with the programming language. I have been reading chapters from the Unity manual every evening (as bedtime reading :), and have thus been slowly familiarizing myself with the software.

Explosion animation

One of the first things I experimented with in Unity was a texture animation shader, which I plan to use for the explosions. I found an example implementation by Mattatz on GitHub, and used that for my tests. I also found free explosion footage from Videezy, which I used as test material. This explosion did not have an alpha mask, and it seems that none of the explosion animations that do have an alpha mask are free, so I think I will just add an alpha mask to this free animation myself.
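
The basic idea behind such a texture animation is a flipbook: the explosion footage is packed into a grid of frames in a single texture, and the UVs are offset to show one frame at a time. Here is a minimal sketch of that idea as a script animating the material offsets (my own illustration; the Mattatz implementation does the frame selection in the shader instead):

    using UnityEngine;

    public class ExplosionFlipbook : MonoBehaviour
    {
        public int columns = 8;               // frames per row in the sprite sheet
        public int rows = 8;                  // rows of frames in the sprite sheet
        public float framesPerSecond = 30f;

        Material m_material;

        void Start()
        {
            m_material = GetComponent<Renderer>().material;
            // Each frame covers 1/columns x 1/rows of the texture.
            m_material.mainTextureScale = new Vector2(1f / columns, 1f / rows);
        }

        void Update()
        {
            int frame = (int)(Time.time * framesPerSecond) % (columns * rows);
            int x = frame % columns;
            int y = frame / columns;
            // Frame 0 is at the top-left of the sheet, so flip the V offset.
            m_material.mainTextureOffset = new Vector2((float)x / columns, 1f - (float)(y + 1) / rows);
        }
    }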