Categories
Uncategorized

Aurora Borealis: A Breakdown

A short step-by-step overview of how to create a convincing real-time aurora borealis effect in Unity.

Chances are you’ve seen Miskatonic Studio’s great aurora borealis effect:

As soon as I saw it, I immediately felt inspired to create a similar effect in Unity. At the time, they had not released the source code yet, so I had to figure out for myself exactly how they went about it, but that was part of the fun. I started hacking away and pretty quickly got to something I liked the look of.

Overview

As soon as they mentioned “ray marching” it all clicked together for me. The effect is basically a 2D noise map “extruded” downwards volumetrically using ray marching. I know how to animate nice noise patterns and I’ve done raymarching before, so I figured that I should be able to crack this one! I will try to make this breakdown accessible, but I will assume that you are familiar with the basics of writing shaders. With that out of the way, let’s get right to it.

Let’s Make Some Noise

“How do you make a wispy aurora borealis-like pattern” you might be wondering. Avid Photoshop users might look at Miskatonic Studios’ tweet and be reminded of “Difference Clouds”. It’s a filter that generates a Perlin noise map like Render > Clouds does, but it does it based on the difference with the colour values that are already there. If you proceed to repeat the effect several times (hotkey ALT+CTRL+F) you will end up with an elaborate marble-like texture with ‘veins’.

So that tells us something: there’s a way to combine Perlin noise maps to get this kind of marble-like effect. Let’s figure out how to do it in Unity. First, if you don’t have one already, create a master Perlin noise texture. The most useful texture you will ever generate. Create four unique layers of Perlin noise using Render > Clouds and store them in the R, G, B and A channels using the Channels window. You might want to use Image > Adjustments > Levels to make sure they are more or less normalized. We want the Perlin noise to go all the way from pure black to pure white instead of, say, a brightness of 30% to 70%. Let’s make use of the full range so we have to do less filtering in Unity. You’ll end up with something like this:

A master Perlin texture with a different Perlin noise map in every channel

Now in your shader, try sampling one of the channels using the world-space XZ co-ordinates as the UVs, with some offset multiplied by time. You’ll end up with a basic scrolling Perlin noise like this:

Now do it again but with unique values for the UV scale, offset and scroll speed. Bonus points if you make it scroll in the opposite direction; I always find that creates a more dynamic animation. Don’t forget to sample a different channel too for maximum variety! You’ll end up with something like this, very similar but unique:

Now comes the tricky part. How do we combine these? You can multiply them together to get a nice animated Perlin noise map. Perfect for something like clouds, but that’s not what we’re after. What’s the secret to the Difference Clouds effect? It’s obvious in hindsight: you take the difference of the two Perlin noise maps, so you subtract them from each other, and take the absolute value of that so it’s positive on both sides of the “center”. You will start to see this veiny marbly effect we’re after:

Notice how there are now curly black “veins” forming and disappearing? If we play with the contrast and invert the image we end up with the mask we’re after:

Note that because I cranked up the contrast so hard and the resolution is not super huge, the end result is kind of ‘grainy’. Instead of smooth thick lines like in Miskatonic Studios’ version we get something more wispy and with holes in it. I like that, personally. It reminds me of the streaks and gaps you see a lot in real aurora borealis:

How to See the Northern Lights in the Summer | Condé Nast Traveler
Source: cntraveler.com
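To make the recipe concrete, here is the same combination written out as a small Python sketch rather than shader code. The noise function, UV scales, scroll speeds and contrast value are all made-up stand-ins for the texture channels described above:

```python
import math


def fake_noise(x, y, seed):
    # Stand-in for sampling one channel of the master Perlin texture;
    # any smooth noise in the 0..1 range works here.
    return 0.5 + 0.5 * math.sin(x * 1.7 + seed) * math.cos(y * 2.3 + seed)


def aurora_mask(x, z, time, contrast=2.0):
    # Two layers with unique UV scales, scrolling in opposite directions.
    a = fake_noise(x * 0.10 + time * 0.05, z * 0.10, seed=1.0)
    b = fake_noise(x * 0.23 - time * 0.07, z * 0.23, seed=2.0)
    # "Difference Clouds": the absolute difference forms the marble veins.
    veins = abs(a - b)
    # Invert and boost contrast so only the area around the veins stays bright.
    return max(0.0, min(1.0, 1.0 - veins * contrast))
```

In the actual shader this is just two texture samples, a subtraction, an `abs` and a `saturate`.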

Let’s March Some Rays

Now, this next part about raymarching might be a bit intimidating for newcomers. I’ll be going through it in a hurry because there are far better resources out there for learning how to do raymarching. The TL;DR version is this: you place a basic shape that represents the volume, like a cube. A normal vertex/fragment shader is tasked with producing one colour value for every pixel on the surface of that shape. To have that pixel instead represent every “pixel” behind it in the volume, we need to “march” further into the volume, calculate a colour value for each of those positions too, and then combine them all together. In the end this amounts to little more than:

A) Have a C# script pass along some material properties so the shader can reconstruct where a pixel’s ray is in world space and where the bounds of the box are

B) Have the surface shader do a little for-loop where it marches forward from the world-space position of the pixel on the front of the box until it reaches the world-space position of the pixel on the back of the box.

In my case we calculate the depth where the back of the box would be with maths, but you could also do some wizardry and render the object with front-face culling to get that depth. That is useful if you want bounding objects with more complicated shapes. For our purposes, an axis-aligned box is perfectly fine.

In my case the bounds mesh is a big axis-aligned cube in the sky.

At this time of the year?
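Steps A and B amount to a loop like the one below. This is an illustrative CPU-side Python sketch of the idea, not the actual shader; the slab-method intersection stands in for computing where the ray enters and exits the axis-aligned box:

```python
def ray_box(origin, direction, box_min, box_max):
    # Slab-method intersection with an axis-aligned box.
    # Returns (t_near, t_far); there is no hit when t_near >= t_far.
    t_near, t_far = -float("inf"), float("inf")
    for axis in range(3):
        if abs(direction[axis]) < 1e-8:
            if not (box_min[axis] <= origin[axis] <= box_max[axis]):
                return 1.0, 0.0  # parallel to this slab and outside it: miss
            continue
        t0 = (box_min[axis] - origin[axis]) / direction[axis]
        t1 = (box_max[axis] - origin[axis]) / direction[axis]
        t_near = max(t_near, min(t0, t1))
        t_far = min(t_far, max(t0, t1))
    return t_near, t_far


def march(origin, direction, box_min, box_max, density_at, steps=64):
    # March from the front of the box to the back, accumulating density.
    t_near, t_far = ray_box(origin, direction, box_min, box_max)
    t_near = max(t_near, 0.0)  # don't march behind the camera
    if t_near >= t_far:
        return 0.0
    step = (t_far - t_near) / steps
    total = 0.0
    for i in range(steps):
        t = t_near + (i + 0.5) * step
        pos = [origin[k] + direction[k] * t for k in range(3)]
        total += density_at(pos) * step
    return total
```

The real shader does the same thing per pixel, with `density_at` being the extruded noise pattern from the previous section.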

With a basic ray-marching structure set up (refer to the source code below if you don’t have one yet) we can get to the fun part: making our noise pattern volumetric. For every world-space position you sample, use the XZ coordinates to sample the noise function we created earlier. This way we get “pipes” where the noise pattern is extruded downwards volumetrically.

It’s starting to resemble something, although it’s not very pleasing to look at yet.

That’s a start! It doesn’t look great yet, however. The problem here is that we’re using a solid colour. Ideally we want some change in colour based on the Y-position, but more importantly we want some change in the opacity of the effect so we can have some smooth, gradual fall-off. Luckily I know just the thing! I made a Curves & Gradients Texture utility for Unity so you can quickly and intuitively create such a texture from a gradient.

The gradient tool in action.

A sharp start and a long tail seems to be the perfect fall-off for an aurora borealis. With that applied, we end up with something like this:

That looks very aurora borealis-like. Now as a finishing touch, consider adding a little bit of a colour fall-off as well to add interest. Lastly: consider using Perlin noise to add some variation to the length and Y-position of the streaks. With all of that combined you will end up with something like this:

May I see it?
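For reference, a fall-off with a sharp start and a long tail can also be written analytically instead of being baked from a gradient. A small Python sketch; the ramp and tail constants are invented for illustration:

```python
import math


def vertical_falloff(y01, ramp=0.05, tail=3.0):
    # y01 is 1.0 at the top edge of the aurora and 0.0 at the bottom of the volume.
    d = 1.0 - y01  # distance below the top edge
    if d < 0.0:
        return 0.0  # above the aurora: fully transparent
    # Sharp ramp-in just below the top edge, then a long exponential tail.
    return min(d / ramp, 1.0) * math.exp(-tail * d)
```

A baked gradient texture is still nicer to tweak; this is just the shape in formula form.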

And there you have it: a relatively straightforward implementation of a volumetric aurora borealis effect in real-time in Unity. I hope you found this breakdown useful. The source code for this project is available for free on GitHub. If you make anything cool with it, don’t forget to @ me on Twitter at @Roy_Theunissen. Happy trails!

Categories
Shaders Unity

Curves & Gradients To Texture in Unity’s Built-in Render Pipeline

Do you want to be able to tweak a curve or a gradient in real-time and pass it along to a shader? Maybe re-use the same one across different effects? Look no further!

Many times when working on a complicated effect you want a colour fall-off, or you need some kind of specific value fall-off, and a simple mathematical function or lerp between colours is not cutting it.

The Traditional Solution

What do you do? Maybe start a new Photoshop document. Convert the background layer to a normal layer. Add a layer effect with a gradient overlay. Start moving around some keys. Hope it makes sense. CTRL+S. Back to Unity, wait for it to import. Hmm, not quite right, back to Photoshop. That workflow sucks. The extra waiting time between iterations slows the process down and makes it hard to see how small changes affect the overall results. Especially if you’re talking about a curve of float values. Having to express that as colours is not at all intuitive. The ideal workflow would be that you just specify an animation curve or a gradient and have that automatically generate a texture, and pass that on to your shaders.

The New Solution

That’s what I made! Declare an AnimationCurveTexture field or a GradientTexture field and you are good to go. You can edit it “locally” for this specific instance to get real-time feedback while tweaking, or you can create a re-usable AnimationCurveAsset/GradientAsset. If you want you can even export it to a texture asset and use that. It is easy to switch back-and-forth between different kinds of sources. The point is that there is now an abstraction layer in-between. You just define the values that you want, and a utility is responsible for generating or loading the corresponding texture. This gives you full control over how the data is formatted, while letting you focus on the impact of minute changes in values instead.
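The underlying idea is tiny. Here is a hypothetical Python sketch of the bake step, where `evaluate` stands in for Unity’s `AnimationCurve.Evaluate` or `Gradient.Evaluate` and the result would be written into a lookup texture:

```python
def bake_gradient(evaluate, width=256):
    # evaluate: any callable t -> value for t in [0, 1], standing in for
    # Unity's AnimationCurve.Evaluate or Gradient.Evaluate.
    # Returns one row of pixel values, ready to upload as a lookup texture.
    return [evaluate(x / (width - 1)) for x in range(width)]
```

The utility’s job is mostly plumbing around this: re-baking when the curve changes and passing the texture on to materials.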

In Conclusion

This is ideal for creating an intuitive shader workflow that encourages frequent iteration, especially in the Built-In Render Pipeline / code-based shaders. The extra control it gives you also allows you to create new workflows for real-time creation of complex project-specific texture requirements that would be complicated to create in Photoshop (think 2D gradients with lots of colours on both axes). Whatever your project needs, it is now in your hands to extend this tool and facilitate a productive workflow for it.

Curves & Gradients To Texture is now available for free on OpenUPM and on GitHub.

If you make any improvements that you’d like to share with the rest of us, feel free to open a Pull Request.

Categories
Unity

Introducing: The Asset Palette

Optimize your workflows with shortcuts to frequently-used assets.

Imagine this: you’re making a handful of castle-themed levels, and the majority of the work entails placing drawbridges, chandeliers, doors and keys. It’s a very small group of assets, but some of them might be gameplay, some of them might be decorative, and frankly the specific files are scattered across different subfolders of the project. For the duration of your work on these levels though, these assets belong together. You want to be able to have them all in one place and quickly drag them into the scene. This is where the Asset Palette comes in.

The Solution

This kind of scenario actually happens a lot in game development. It’s not just level design: as a programmer you might be working on a feature requiring you to constantly tweak a handful of Scriptable Objects. I believe the solution is to identify frequently-used asset-based workflows, create a new group for that workflow and add all the relevant assets to it. From then on you can use it as a panel of shortcuts to greatly speed up your day-to-day work. The Asset Palette does just that.

Tutorial on how the Asset Palette works

Extensibility

The Asset Palette is set up using [SerializeReference] and PropertyDrawers, meaning that you can add any kind of custom PaletteEntry to a collection. Do you have a feature that requires you to constantly execute one of a few macro functions? Instead of cluttering the global toolbar menu with your hyper-specific macros you could add buttons for them to your own personal palette.


In Conclusion

Every game is different. The workflows required to make your game in an efficient manner are unique to your game. Consider using this free, plug-and-play tool to create the kind of tailored workflow optimization that your game needs.

Asset Palette is available for free on OpenUPM and on GitHub.

If you make any improvements that you’d like to share with the rest of us, feel free to open a Pull Request.

Categories
Unity

Introducing: The Scene View Picker

Assigning references via the scene in Unity

Are you an organized worker? Have you ever organized your scene hierarchy so hard that everything makes a lot of sense, but you can’t easily find that one object? That little bastard. It’s right there in the scene view, I’m looking at it.

Has this ever happened to you?

The Solution

Introducing: the Scene View Picker. It’s a PropertyDrawer for UnityEngine.Object that adds a handy button next to all references in your script that allows you to assign a reference from the scene view.

It’s particularly useful for UI but it works for references of all types

Extra Features

Sometimes the object you’re looking for is surrounded by other objects that would be equally valid. In that case you can Middle-Click to select nearby objects from a dropdown.

One of its more obscure features but it’s proven very useful in some of our projects.

In Conclusion

It’s free, very convenient and best of all: it’s plug-and-play. All your custom scripts (save perhaps for the ones with dodgy editor scripts that don’t use serialized properties) will now have this handy little button.

It’s on the Open Unity Package Manager (OpenUPM) and also available on GitHub.

I highly recommend it, and if you have any ideas on how to improve it feel free to open a Pull Request.

Categories
Shaders Unity

GPU Spline Deformation in Unity

How to deform any mesh along a spline by leveraging textures

Textures are just data grids and they don’t need to describe colours nor correspond to existing geometry in any way. With a little creativity they can mean anything.

Many 3D objects and effects could be described as a simple shape repeating across a spline. Most modelling packages offer this kind of functionality, yet it’s not quite as intuitive to do something like this in a game engine. Most implementations fall back to the brute-force approach of generating a new mesh via code. I’m here to offer an interesting alternative.

Inspiration

There is an uptick of new effects and optimizations that stems from a simple idea: textures don’t have to describe colours along polygons.

Inspired by that idea I set out to try a new workflow: render deformation along a spline to a texture, then apply that deformation to any mesh using a shader. Let’s walk through how it works and what you can do with it.

A novel approach to deforming any mesh along a spline

Deformation Matrices

Transformation matrices are used in 3D rendering to translate co-ordinates from one space to another, such as from local space to world space. Without going into too much detail on the maths (there are plenty of good articles for that), it’s worth knowing how transformation matrices are stored. In shader functions, most matrices are stored as a float4x4, which is essentially a two-dimensional array of floats, like so:

m00 m01 m02 m03
m10 m11 m12 m13
m20 m21 m22 m23
m30 m31 m32 m33

The idea started when I wondered if there was an easy way to store a matrix in a texture. There is, actually. It can be written like this:

pixel 0 RGBA = (m00, m01, m02, m03)
pixel 1 RGBA = (m10, m11, m12, m13)
pixel 2 RGBA = (m20, m21, m22, m23)
pixel 3 RGBA = (m30, m31, m32, m33)

Given that each pixel in a texture can hold RGBA values, each pixel can hold one row of a matrix. A matrix usually has four rows, so a whole matrix can be stored in as little as four pixels. So if we just make a script that samples the position, rotation and scale along a spline, then we can construct a matrix and store that in a texture using the layout described above, like this:

A 64 KB 1024×8 RGBA16 texture generated by a script. The X-axis represents progress along the spline.
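A sketch of that bake step in Python rather than the actual C# script; `sample_matrix` is a hypothetical callable that evaluates the spline’s transformation matrix at a normalized progress value:

```python
def pack_matrix(matrix):
    # matrix: four rows of four floats (a row-major float4x4).
    # Each row becomes one RGBA pixel, so one matrix fits in a 4-pixel column.
    return [tuple(row) for row in matrix]


def bake_spline(sample_matrix, width=1024):
    # sample_matrix(t) -> 4x4 matrix at normalized spline progress t.
    # Returns a width x 4 pixel grid: X = progress along the spline,
    # Y = which row of the matrix the pixel stores.
    columns = [pack_matrix(sample_matrix(x / (width - 1))) for x in range(width)]
    return [[columns[x][row] for x in range(width)] for row in range(4)]
```

In Unity this grid would be written into a Texture2D; here it is just a list of pixel rows.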

The Shader


Given a base mesh that is aligned with the Z axis and has plenty of vertices, we can very easily transform one ‘slice’ of the mesh along the spline. The process is as follows:

  1. The normalized z-coordinate of the vertex determines what ‘slice’ it belongs to, or how far along the spline it should end up. Let’s call this P.
  2. Sample four vertical points in the deformation texture. The X co-ordinate is P and the Y co-ordinates of the four samples are evenly spread.
  3. Combine all four samples into a float4x4 matrix.
  4. Move the vertex back to Z co-ordinate 0, the starting point, and multiply it by the matrix.
It’s that easy!
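The four steps can be sketched like this in Python rather than actual shader code; the texture layout matches the bake described earlier, and nearest-neighbour sampling is used to keep the rows distinct:

```python
def sample_matrix(texture, p):
    # texture[row][x] holds one RGBA pixel = one matrix row, as baked earlier.
    # p is the vertex's normalized Z co-ordinate, i.e. progress along the spline.
    width = len(texture[0])
    x = min(int(p * width), width - 1)  # nearest-neighbour, clamped
    return [list(texture[row][x]) for row in range(4)]


def deform_vertex(vertex, p, texture):
    # Move the vertex back to Z = 0, then multiply it by the spline matrix.
    local = [vertex[0], vertex[1], 0.0, 1.0]
    m = sample_matrix(texture, p)
    return [sum(m[r][c] * local[c] for c in range(4)) for r in range(4)][:3]
```

The actual shader does this per vertex with `tex2Dlod`-style samples instead of array indexing.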

Artifacts

There’s a number of artifacts you’ll see if you’re building something like this from scratch and they’re worth knowing about.

  1. Using the alpha channel to store non-alpha data surprisingly does not work well out-of-the-box. Most image formats will change the RGB values when the A value is 0, corrupting the data. I’ve found the EXR file format to be the least intrusive when it comes to this. Any other format, especially PSD, will interfere with the data.
  2. Interpolation may work against you. Especially for the rows in the texture, which are distinct values with no relation to one another, interpolation will generate incorrect values. A quick solution to interpolation problems is just to increase the resolution. Despite theoretically only needing four vertical pixels I’ve opted to use eight for this particular reason.
  3. It’s important to think about clamping. For looping splines the X-axis can be repeated, but the Y-axis remains distinct and should be clamped to avoid interpolation artifacts.
Early version of the displacement algorithm
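Artifact 2 is easy to demonstrate: linearly filtering halfway between two unrelated matrix rows yields a pixel that is a valid row of no matrix at all. A tiny Python illustration with made-up row values:

```python
def lerp(a, b, t):
    # Linear interpolation, which is what bilinear texture filtering
    # does between two vertically adjacent pixels.
    return a + (b - a) * t


# Two adjacent "rows" of a baked matrix: a rotation row and a translation row.
row_rotation = (0.0, 1.0, 0.0, 0.0)
row_translation = (0.0, 0.0, 0.0, 7.0)

# Filtering halfway between the rows produces a blend of unrelated data.
blended = tuple(lerp(a, b, 0.5) for a, b in zip(row_rotation, row_translation))
```

Hence the extra vertical resolution and clamping: the rows must never bleed into each other.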

Strengths and Weaknesses

There are certain things this technique does well and certain things it doesn’t. What it does well:

  • Simple generic workflow, any deformation is easily baked and applied.
  • Updated in real-time for immediate feedback while working.
  • Can be animated to achieve creative new effects.
  • Pre-rendered texture assets make this approach extremely performant.

However, there are also certain weaknesses to this approach:

  • As it stands, rendering new textures is done via a script and is not very efficient. Certainly good enough for editor usage, but not optimal for splines that change at runtime.
  • No new geometry is created, so if you want to deform a mesh along an arbitrarily long spline you might have to instantiate more meshes.
  • The bounds of the mesh are not automatically updated and have to be updated via a script to prevent incorrect culling.
Looping splines and animated offsets can create looping meshes.

Closing Thoughts

It’s an interesting technique, and hopefully it helps people let go of their preconceptions about textures, make new optimizations, and maybe even create surprising new real-time effects that were not possible before.

The repository for this article can be found here. Good luck, and have fun.

Categories
Uncategorized

Debug.Log("Hello World!");

Hello. When I make something cool or have acquired some piece of ancient wisdom I’ll share it here.

As my great-great-grandfather Ronald Dietrich Theunissen said: “why bother with microblogging when you can do macroblogging”