My primary responsibilities throughout the 15 weeks of production centred on systems programming in Unreal C++ and Blueprints for the game's Light Detection, along with considerable Technical Art work on shaders for the Toxin System and on the game's lighting.
For the first 3 weeks of production (as well as most of the 4-week pre-production phase), I took on the job of creating the Light Detection Manager used in the game. This system not only determines whether the player is currently standing safely within any light source's influence, but also calculates the total amount of light (or rather, illuminance) falling on the player from all contributing lights around them, using physically-based and fairly accurate calculations. As the system is heavily mathematical, I opted to build it in pure Unreal C++, which not only let me gain experience with Unreal's unique implementation of C++ (it was my first time using the engine), but also allowed me to write most of the maths far more succinctly than I could in Blueprints.
The system can calculate the lighting contribution from directional lights, point lights, spot lights, and rect lights, and will trigger damage to the player over time if their illuminance level drops below a set threshold. Writing the Light Detection Manager was a challenging experience: each type of light needs its own unique detection logic, and much of the actual maths behind Unreal's spot and rect lights (especially the rect lights) is not well documented, so it took a considerable amount of research to come up with general solutions that worked for all lighting shapes. My process for rect light detection was a solution I arrived at independently (which I plan on documenting fully in a later blog post and perhaps a YouTube tutorial): I find and construct the four bounding planes of a rect light's frustum shape, then perform four simple point-to-plane distance calculations to determine whether the player is "above" all four bounding planes.
After I had gotten the Light Detection to an acceptable and playable Alpha state, I moved on to creating the Toxin System, which would visually represent the photosensitive toxin within the game and express to players which parts of the level were deadly and which were safe. As the toxin is, as mentioned, photosensitive, and we knew early on that players would have dynamic control over positioning and rotating certain lights to solve puzzles, I knew during pre-production that I'd have to come up with a dynamic visual implementation that could react to light sources shining on it and be "carved away" and nullified by those lights at runtime. Before I started anything, I spent about two weeks straight just researching different ideas and approaches that had popped into my mind. The most promising of these was to sample the light attenuation buffer data to create a light mask, which would then mask out the toxin's volume material in areas where light was shining. This would have been an effective and relatively easy implementation, but that lighting data is unfortunately inaccessible as of Unreal 4.27, and the approach also requires the engine to use forward rendering, which would have performed considerably worse in our game given how many active, non-static lights we knew we would need.
The first approach I successfully implemented used Unreal's old particle system (I hadn't yet been introduced to Niagara) to spawn a settable number of particles within a settable volume. Each particle used a simple volume material for its visuals, and the toxin interacted with light sources at runtime via collision. Using trigger volumes, I wrote logic for both point and spot lights that semi-accurately mapped trigger volumes (either conical or spherical) to a light source's area of influence. These trigger volumes would then collide with the toxin system's particles on a custom Light Volume collision layer, killing those particles on contact. As this was my first attempt at a dynamic, light-interactive fog, it was in all honesty quite rudimentary and primitive, but importantly it was enough to get the idea across during our initial playtesting sessions (weeks 4 and 5 of production).
Once I had implemented the first attempt at the Toxin System and knew it was in a workable state for playtesting throughout the rest of Alpha, I moved on to concepting and constructing a different, better approach that I had become aware of. Because I was unsure whether I would be able to get this Toxin 2.0 working dynamically with lights, my focus was initially on completing the Toxin 1.0 implementation to an acceptable state, so that if I couldn't implement the new system, we still had a sufficient back-up.
I should mention that I sourced most of the method for the visuals, as well as the initial setup for collision capturing, primarily from an amazing tutorial series on YouTube by Digital Mind, so a huge amount of credit goes to him for the system I was able to make for Planet Nine; I simply had to adapt his implementation to work dynamically with the lights in our game.
The Toxin 2.0 system was implemented as follows: a Niagara particle system spawns a single, persistent particle that uses a volume material to visualise a simple volumetric fog. This fog then uses two scene captures: one to capture the height of the floor the fog is mapped to (allowing the toxin to have height falloff based on the terrain beneath it), and one to capture the height of any actors tagged as eligible for toxin collision. Using render targets, the height map of the floor is stored first, then compared with the height map of the collision actors to create a collision mask: any areas with overlapping actors are stored as black (0, 0, 0, 0), and everywhere else is stored as white (1, 1, 1, 1). This collision mask (at the corresponding UV samples) is then simply multiplied with the final extinction value calculated at each point in the volumetric fog, effectively nullifying the toxin in areas with colliding actors while leaving the rest of the toxin unchanged.
To integrate this so that lights could mask the toxin material dynamically, I wrote some type-specific logic that generates and shapes a procedural mesh to a light's area of influence, depending on the type of light. For point lights this is simply a sphere the size of the light's attenuation radius, and for spot lights I used a conical mesh mapped to the base height of the cone (see the GIF below). Then, whenever needed, the light's procedural interaction mesh is sliced along its plane of intersection with the floor. This means the image taken by the toxin's scene capture from below will capture the effective area of intersection the light has with the fog along the floor.
This implementation of light-interactive volumetric fog is not only much prettier and cleaner than my original implementation, but is also immensely more performant, as it is only drawing one quad per instance of fog in the game, whereas the original system would be drawing (and overdrawing) around 500 quads per toxin instance.
Towards the end of Alpha, as my work on the Light Detection and Toxin System started to taper off, I took on extra roles of general scene work: implementing gameplay sequences, basic set dressing, and sole responsibility for lighting within the main level. For the lighting (since I had never done any lighting work before), I decided to set up and schedule six structured Lighting Passes, in which I would go through the level in its current state and adjust all lights where needed based on Clarity, Efficiency, Visibility and Visual Appeal. I aligned each lighting pass with the completion of significant milestone builds or sprints, as I knew these would be the times when the level's layout, geo and set would change the most. With help from the Artists on the team, I created a colour palette that exemplified the cold heartlessness of the Nova Corporation in the deserted factory and sterile facility areas, broken up by pockets of humanity where signs of the workers' struggle were clear, expressed through the warmer, more calming lighting colours in those areas.
If you're interested in reading more specifically about my week-by-week workflow, you can find my devlogs here on my website.