DevLog Update – The City, Optimization, and Trees

Started working on a level which I’m calling “The City” for now. Got a few architectural elements that I’m pretty happy with, heavily inspired by brutalism. You can see some trees that I’ve added. I’m quite happy with the results so far, but I think they still need plenty of tweaking. I’m not sure if it would be worth it to write a tree generation script, or just make 10 or so different ones by hand and place them randomly around the scene.
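If I do go the hand-made route, the random placement itself is simple to script. Below is a minimal sketch of what scattering pre-made tree prefabs could look like; the prefab array, count, and area size are hypothetical placeholders, not actual values from the game.

using UnityEngine;

//Hypothetical sketch: scatter hand-made tree prefabs at random positions inside an area
public class TreeScatter : MonoBehaviour {

   public GameObject[] treePrefabs; //the ~10 hand-made trees
   public int treeCount = 50;
   public Vector3 areaSize = new Vector3(100, 0, 100);

   void Start () {
      for (int i = 0; i < treeCount; i++){
         //pick a random prefab and a random point inside the area
         GameObject prefab = treePrefabs[Random.Range(0, treePrefabs.Length)];
         Vector3 pos = transform.position + new Vector3(
            Random.Range(-areaSize.x, areaSize.x) * 0.5f,
            0,
            Random.Range(-areaSize.z, areaSize.z) * 0.5f);
         //random rotation around the up axis so the copies don't look identical
         Quaternion rot = Quaternion.Euler(0, Random.Range(0f, 360f), 0);
         Instantiate(prefab, pos, rot);
      }
   }
}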

Also started optimizing the geometry in the scene. I’m using a combination of techniques: merging groups of objects into single meshes to reduce draw calls, marking certain objects as occluders for occlusion culling, and removing polygon faces that I know for sure will never be seen.
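For the merging step, Unity’s Mesh.CombineMeshes does most of the work. Here’s a rough sketch of the kind of thing I mean, assuming the simple case where all the child objects share a single material:

using UnityEngine;

//Rough sketch: merge all child meshes into one mesh to reduce draw calls.
//Assumes all children share a single material (the simple case).
public class MergeChildren : MonoBehaviour {

   void Start () {
      MeshFilter[] filters = GetComponentsInChildren<MeshFilter>();
      CombineInstance[] combine = new CombineInstance[filters.Length];

      for (int i = 0; i < filters.Length; i++){
         combine[i].mesh = filters[i].sharedMesh;
         combine[i].transform = filters[i].transform.localToWorldMatrix;
         filters[i].gameObject.SetActive(false); //hide the original pieces
      }

      Mesh merged = new Mesh();
      merged.CombineMeshes(combine);

      //give this object the merged mesh; the shared material still needs
      //to be assigned on the MeshRenderer in the inspector
      gameObject.AddComponent<MeshFilter>().mesh = merged;
      gameObject.AddComponent<MeshRenderer>();
   }
}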

Finally, I just got an updated version of ProBuilder, which I’m super excited to try out.

Relativity_Game_Willy_Chyr_screenshot_2013-11-30_001

Relativity_Game_Willy_Chyr_screenshot_2013-11-30_002

DevLog Update – Replacing Color Change Beams, finetuning puzzles

Spent today and yesterday fine-tuning various puzzles that use color-change beams and dispenser cubes. Mostly this involved replacing the old version of the beams with the new ones, and adjusting solutions that were either too obvious or too tedious to execute.

Tomorrow I will start designing different architectural elements and spaces to populate the game’s world.

Anyway, got some new screenshots to share:

Relativity_Game_Screenshot-2013-11-25_21-07-26

Relativity_Game_Screenshot-2013-11-25_21-08-37

Relativity_Game_Screenshot-2013-11-26_00-28-33

Relativity_Game_Screenshot-2013-11-26_12-51-31

Cube Dispensers and Color Change Beams

Today was not as productive as I hoped it would be.

I spent the first part of the day fixing up “Cube Dispensers” – these machines that release new cubes (while simultaneously destroying old ones) when you press a button. Because there are six different gravity fields in the game, I ended up having to do everything six times. At this point, I’ve abstracted the code enough so that level elements are pretty modular and can switch gravities pretty easily, but it’s still not super optimal…
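To give a rough idea of what that abstraction looks like (a hypothetical sketch, not the actual game code): each of the six gravity fields maps to a world-space direction, and level elements like the dispensers just look up the direction of the field they belong to instead of duplicating the same logic six times.

using UnityEngine;

//Hypothetical sketch: six gravity fields, each mapped to a world-space direction
//that level elements (dispensers, cubes, etc.) can query.
public enum GravityField { Down, Up, North, South, East, West }

public static class GravityFields {

   //look up the gravity direction for a given field
   public static Vector3 DirectionOf (GravityField field){
      switch (field){
         case GravityField.Down:  return Vector3.down;
         case GravityField.Up:    return Vector3.up;
         case GravityField.North: return Vector3.forward;
         case GravityField.South: return Vector3.back;
         case GravityField.East:  return Vector3.right;
         case GravityField.West:  return Vector3.left;
         default:                 return Vector3.down;
      }
   }
}

A dispenser or cube then only stores which GravityField it belongs to and applies force along GravityFields.DirectionOf(field), rather than carrying six copies of the same code.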

RelativityGame_Cube_dispensers

Another thing I worked on is “Color Change Beams” – these are beams of light that allow you to change the color of a cube, and therefore the specific gravity field it belongs to. I wanted to create a sweet transition effect where the new material would slowly fade in over the old material. Alas, writing the shader proved a little too complicated, and I settled for a simple instant-change color effect. I will come back to it in a few days.
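In the meantime, the non-shader fallback is simple enough. Here’s a hypothetical sketch of fading a cube’s material color toward a target color over time from a script (the target color and duration fields are placeholders), which gives a rough version of the transition without touching the shader:

using UnityEngine;
using System.Collections;

//Hypothetical sketch: fade a cube's material color toward a target color over time,
//as a script-side stand-in for the shader-based material transition.
public class ColorFade : MonoBehaviour {

   public Color targetColor = Color.red;
   public float fadeDuration = 1.0f;

   //the beam would call StartCoroutine(FadeToTarget()) when the cube enters it
   public IEnumerator FadeToTarget (){
      Color startColor = renderer.material.color;
      float t = 0f;
      while (t < 1f){
         t += Time.deltaTime / fadeDuration;
         renderer.material.color = Color.Lerp(startColor, targetColor, t);
         yield return null;
      }
   }
}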

Relativity_Game_Screenshot_color_change_beam_01


Unity Shaders – Depth and Normal Textures (Part 3)

This is a continuation of a series of posts on shaders: Part 1, Part 2

In the previous two parts, I talked about using depth texture in Unity. Here, I will discuss using depth+normal textures through DepthTextureMode.DepthNormals, which is basically depth and view space normals packed into one.

Below is the effect we will create. What you’re seeing is the scene rendered first with the view space normals as colors, and then with the depth values as colors.

DepthNormals

Depth+Normal Texture

If you remember from Part 1, we can tell the camera in Unity to generate a depth texture using the Camera.depthTextureMode variable. According to the docs, there are actually two modes you can set this variable to:

  • DepthTextureMode.Depth: a depth texture.
  • DepthTextureMode.DepthNormals: depth and view space normals packed into one texture.

We are already familiar with DepthTextureMode.Depth, so the question is: how exactly do we get the values of depth and normals from DepthTextureMode.DepthNormals? 

It turns out, you need to use the function DecodeDepthNormal. This function is defined in the UnityCG.cginc include file, which, by the way, can be found on Windows at this path: <program_files>/Unity/Editor/Data/CGIncludes/

Below is the definition:

inline void DecodeDepthNormal( float4 enc, out float depth, out float3 normal )
{
   depth = DecodeFloatRG (enc.zw);
   normal = DecodeViewNormalStereo (enc);
}

So what is going on here? We can see that the function takes three parameters: float4 enc, out float depth, and out float3 normal. Basically, it takes the encoded data in enc, decodes the depth value from enc.zw with DecodeFloatRG, decodes the view space normal with DecodeViewNormalStereo, and writes the results into the out parameters depth and normal.

This is what it will look like in our code:

DecodeDepthNormal(tex2D(_CameraDepthNormalsTexture, i.scrPos.xy), depthValue, normalValues);

depthValue is a float that will contain the depth value of the scene, and normalValues is a float3 that will contain the view space normals. As for the first argument, tex2D(_CameraDepthNormalsTexture, i.scrPos.xy), what’s going on there? Well, _CameraDepthNormalsTexture‘s variable type is sampler2D, but what DecodeDepthNormal requires is a float4. So we apply tex2D, a function which performs a texture lookup in a given sampler.

The first input that tex2D takes is the sampler, in our case _CameraDepthNormalsTexture, and the second input is the coordinates at which to perform the lookup, which in our case is the screen position, i.scrPos. However, i.scrPos is a float4 and the input needs to be a float2, so we take only the xy coordinates.

The Shader

Here’s the code for the shader. Let’s call it “DepthNormals.shader”.

Shader "Custom/DepthNormals" {
Properties {
   _MainTex ("", 2D) = "white" {}
   _HighlightDirection ("Highlight Direction", Vector) = (1, 0,0)
}

SubShader {
Tags { "RenderType"="Opaque" }

Pass{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"

sampler2D _CameraDepthNormalsTexture;
float _StartingTime;
float _showNormalColors = 1; //when this is 1, show normal values as colors. when 0, show depth values as colors.

struct v2f {
   float4 pos : SV_POSITION;
   float4 scrPos: TEXCOORD1;
};

//Our Vertex Shader
v2f vert (appdata_base v){
   v2f o;
   o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
   o.scrPos=ComputeScreenPos(o.pos);
   o.scrPos.y = 1 - o.scrPos.y;
   return o;
}

sampler2D _MainTex;
float4 _HighlightDirection;

//Our Fragment Shader
half4 frag (v2f i) : COLOR {
   float3 normalValues;
   float depthValue;

   //extract the depth value and the normal values from the depth+normals texture
   DecodeDepthNormal(tex2D(_CameraDepthNormalsTexture, i.scrPos.xy), depthValue, normalValues);

   if (_showNormalColors == 1){
      //show the view space normals as colors
      float4 normalColor = float4(normalValues, 1);
      return normalColor;
   } else {
      //show the depth value as a grayscale color
      float4 depth = float4(depthValue, depthValue, depthValue, 1);
      return depth;
   }
}
ENDCG
}
}
FallBack "Diffuse"
}

Remember that the normal values are in view space, so when you move the camera, the normals, and thus the colors, change.

DepthNormalsCamera

The script to attach to the camera

Let’s call it “DepthNormals.cs” just to keep things consistent. What the script does is, every time the user presses the E key, it switches the shader between showing the depth values and the normal values.

using UnityEngine;
using System.Collections;

public class DepthNormals : MonoBehaviour {

public Material mat;
bool showNormalColors = true;

void Start () {
   //tell the camera to generate the depth+normals texture
   camera.depthTextureMode = DepthTextureMode.DepthNormals;
}

// Update is called once per frame
void Update () {
   if (Input.GetKeyDown (KeyCode.E)){
      showNormalColors = !showNormalColors;
   }

   if (showNormalColors){
      mat.SetFloat("_showNormalColors", 1.0f);
   } else {
      mat.SetFloat("_showNormalColors", 0.0f);
   }
}

// Called by the camera to apply the image effect
void OnRenderImage (RenderTexture source, RenderTexture destination){
   //mat is the material containing your shader
   Graphics.Blit(source,destination,mat);
}
}

Conclusion

Now you know how to get the depth and normal values from the depth+normals texture. Please remember that this is not meant to be a definitive guide on how to work with shaders. It is simply a summary of my experience working with the depth texture and vertex/fragment shaders in Unity over the past few days. Hopefully you found some of the information useful for your own development projects.