Beta MAX
{ Made in Unity }
Use four-dimensional thinking to control time and navigate through the world of Beta MAX. This atmospheric first-person puzzle platformer takes you on a journey with Max, who has been thrust into a retro-futuristic, '80s-inspired world full of increasingly complex puzzles.
The Player Must
Pause and rewind time
Manoeuvre objects onto floor switches and remotely activate wall switches
Utilise time chambers and velocity accelerators
Dodge lasers and avoid crushers
Travel on moving platforms and traverse fragmenting bridges
On Each Level
Find the hidden collectible
Overcome all obstacles to complete the level without errors
Move with speed and precision to finish under the par time
The core mechanic of the game is pausing and rewinding time. My original approach for implementing this was a single system that handled recording and rewinding of every game object in a scene. This soon became a horrible mess as the number of recordable game objects grew. I changed tactics and created reusable, generic state and frame recording classes. This made the recorded data independent of the recording implementation, so each component handles its own recording and rewinding.
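As a rough illustration of that idea, here is a minimal sketch of a generic recorder. The class and member names are mine for the example, not the actual project code; each recordable component would own one of these for its own state type and decide what "changed" means for that state.

```csharp
using System;
using System.Collections.Generic;

// Minimal generic recorder sketch: frames are stored against VCR time,
// and only when the state actually changed, to save memory.
public class StateRecorder<T> where T : struct
{
    readonly List<float> times = new List<float>();
    readonly List<T> states = new List<T>();
    readonly Func<T, T, bool> hasChanged;

    public StateRecorder(Func<T, T, bool> hasChanged) => this.hasChanged = hasChanged;

    public void Record(float vcrTime, T state)
    {
        // Skip identical frames.
        if (states.Count > 0 && !hasChanged(states[states.Count - 1], state))
            return;
        times.Add(vcrTime);
        states.Add(state);
    }

    // During a rewind, fetch the last state recorded at or before the VCR time.
    public bool TryGetStateAt(float vcrTime, out T state)
    {
        for (int i = states.Count - 1; i >= 0; i--)
        {
            if (times[i] <= vcrTime) { state = states[i]; return true; }
        }
        state = default;
        return false;
    }

    // When play resumes after a rewind, drop the frames that were rewound past.
    public void DiscardAfter(float vcrTime)
    {
        while (times.Count > 0 && times[times.Count - 1] > vcrTime)
        {
            times.RemoveAt(times.Count - 1);
            states.RemoveAt(states.Count - 1);
        }
    }
}
```

A player component, for example, might record a struct holding position, velocity and animation state, while a simple box only needs its transform.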
Everything is synced to the VCR time (the time shown at the bottom left while playing). The VCR time advances by the frame delta each step, and runs backwards during a rewind, with the delta negated and multiplied by the rewind speed. Dynamic game objects like the box or the player have their time, position, velocity, etc. stored each fixed time step, but only if the values actually change, to keep memory usage down.
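A hedged sketch of how such a clock could be driven; the class and field names here are assumptions, not the game's actual implementation:

```csharp
using UnityEngine;

// Sketch of a VCR-style clock: time moves forward by the fixed delta while
// playing, and backwards (scaled by a rewind speed) while rewinding.
public class VcrClock : MonoBehaviour
{
    public float rewindSpeed = 4f;   // how much faster rewinding runs than playback
    public bool IsRewinding { get; set; }
    public float VcrTime { get; private set; }

    void FixedUpdate()
    {
        float delta = Time.fixedDeltaTime;
        VcrTime += IsRewinding ? -delta * rewindSpeed : delta;
        VcrTime = Mathf.Max(0f, VcrTime); // can't rewind past the start
    }
}
```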
Objects that follow a looping path, such as moving platforms or crushers that simply go back and forth, don't need the frame and state recording classes mentioned above. Instead, their position is calculated each fixed frame from the VCR time and the duration of the loop. Game objects that follow a path but can be switched on and off do record their state changes, and calculate their positions from the last state change.
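For example, a back-and-forth platform can be written as a pure function of the VCR time, so rewinding the clock rewinds the platform for free. A sketch under those assumptions, reusing the VcrClock name from above (field names are illustrative):

```csharp
using UnityEngine;

// Looping platform whose position is derived from the VCR time,
// so it needs no per-frame recording at all.
public class LoopingPlatform : MonoBehaviour
{
    public Transform pointA;
    public Transform pointB;
    public float loopDuration = 6f; // full A -> B -> A cycle in seconds
    public VcrClock clock;

    void FixedUpdate()
    {
        // PingPong maps the VCR time onto 0..half and back; normalising gives 0..1.
        float half = loopDuration * 0.5f;
        float t = Mathf.PingPong(clock.VcrTime, half) / half;
        transform.position = Vector3.Lerp(pointA.position, pointB.position, t);
    }
}
```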
Another core aspect of the game is the switches. A component I called the Bread Crumb shows the connection between a switch and its target, and if the switch is recordable then the Bread Crumb must be too. A Bread Crumb that connects a Floor Switch to something (e.g. a Light Bridge) animates from the start of its path to the end.
To aid with the level design, I created the Path Tool. This is a Unity Editor tool that builds a list of points (positions and normals) as I click along surfaces. It has features such as deleting a point with CTRL+click, moving existing points independently, and reversing all points. From this list of points I then procedurally generate the Bread Crumb mesh, with width and height (above the surface) options.
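At its heart the tool is just a raycast from the mouse in the Scene view. A stripped-down sketch follows; the real Path Tool does much more, and all the names below are mine for the example:

```csharp
#if UNITY_EDITOR
using System.Collections.Generic;
using UnityEditor;
using UnityEditor.EditorTools;
using UnityEngine;

// Click a surface to add a point, CTRL+click to remove the nearest one.
// The real tool also moves existing points and reverses the list.
[EditorTool("Path Tool (sketch)")]
public class PathToolSketch : EditorTool
{
    public List<Vector3> points = new List<Vector3>();
    public List<Vector3> normals = new List<Vector3>();

    public override void OnToolGUI(EditorWindow window)
    {
        Event e = Event.current;
        if (e.type != EventType.MouseDown || e.button != 0) return;

        Ray ray = HandleUtility.GUIPointToWorldRay(e.mousePosition);
        if (!Physics.Raycast(ray, out RaycastHit hit)) return;

        if (e.control)
        {
            // CTRL+click: delete the point closest to the clicked surface position.
            int nearest = -1;
            float best = float.MaxValue;
            for (int i = 0; i < points.Count; i++)
            {
                float d = Vector3.Distance(points[i], hit.point);
                if (d < best) { best = d; nearest = i; }
            }
            if (nearest >= 0) { points.RemoveAt(nearest); normals.RemoveAt(nearest); }
        }
        else
        {
            points.Add(hit.point);
            normals.Add(hit.normal);
        }
        e.Use();
    }
}
#endif
```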
I need to generate the mesh myself because its vertex colour data carries information the shader needs to animate the path. The red channel is set to the length of the path up to that vertex, and the green channel is later set to the normalised value, i.e. that length divided by the total path length. This lets me use a Step node in Shader Graph to produce a mask for the emission output.
The path can then be illuminated progressively by sweeping the Edge shader parameter between 0 and 1.
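Sketched below is how that vertex colour data could be baked while building the mesh; for simplicity this assumes two vertices per path point, and the helper name is mine:

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class BreadCrumbColours
{
    // Builds the vertex colour array for a path: R = length so far,
    // G = that length normalised to 0..1 (used by the Step node in the shader).
    public static Color[] Build(IList<Vector3> points, int vertsPerPoint = 2)
    {
        if (points.Count == 0) return new Color[0];

        // First pass: cumulative length of the path up to each point.
        float[] lengths = new float[points.Count];
        for (int i = 1; i < points.Count; i++)
            lengths[i] = lengths[i - 1] + Vector3.Distance(points[i - 1], points[i]);

        float total = lengths[points.Count - 1];
        var colours = new Color[points.Count * vertsPerPoint];
        for (int i = 0; i < points.Count; i++)
        {
            float unit = total > 0f ? lengths[i] / total : 0f;
            var c = new Color(lengths[i], unit, 0f, 1f);
            for (int v = 0; v < vertsPerPoint; v++)
                colours[i * vertsPerPoint + v] = c;
        }
        return colours; // assign to mesh.colors after the vertices are built
    }
}
```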
A cool feature that helps the player see where they have been is what I called the Lux Rug. Eh, naming things is hard, don't judge! 😛 This is a floor that lights up when the player walks on it, then fades out when the player leaves a square. Naturally, this is also rewindable.
Like the Bread Crumb, this rug is procedurally generated to aid in developing levels. And like the Tile King utility that's part of the Nexcide Tools project on the Unity page, the generated tiles can be flipped, mirrored, rotated and colour adjusted to break up repeating patterns.
The Lux Rug is built from four components:
Canvas (For rendering)
Collider (So the player doesn't fall through the floor)
Trigger (For calculating which tile the player is in, if any)
Audio Source (A sound plays from the tile when illuminated)
For the shader to know how the floor should be lit, a texture is created with the same dimensions as the rug: if a rug is created with 8x6 tiles, the Canvas texture is 8x6 pixels. This texture is set on the shader, and its colours are updated as tiles light up and fade. When the mesh is generated, the UV3 vertex data stores each tile's coordinate for sampling.
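A minimal sketch of that tile-texture update, assuming a shader property named _TileCanvas (the actual property and component names may differ); the shader then samples this texture at the tile coordinate stored in UV3 to drive each tile's emission:

```csharp
using UnityEngine;

// One pixel per tile; lighting a tile is just painting a pixel and re-uploading.
public class LuxRugCanvas : MonoBehaviour
{
    public int tilesX = 8;
    public int tilesY = 6;
    public Color litColour = Color.cyan;

    Texture2D canvas;
    Material material;

    void Awake()
    {
        canvas = new Texture2D(tilesX, tilesY, TextureFormat.RGBA32, false);
        canvas.filterMode = FilterMode.Point; // keep tile edges crisp
        material = GetComponent<Renderer>().material;
        material.SetTexture("_TileCanvas", canvas); // property name is an assumption
        Clear();
    }

    public void SetTile(int x, int y, float intensity)
    {
        canvas.SetPixel(x, y, litColour * intensity);
        canvas.Apply();
    }

    void Clear()
    {
        canvas.SetPixels32(new Color32[tilesX * tilesY]); // all black/transparent
        canvas.Apply();
    }
}
```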
Animations for the game were created using a homebrew mocap (motion capture) solution. I used the Meta Quest Pro headset, which includes hand tracking, and created an editor utility to record movements into an AnimationClip. The recordings were then cleaned up and processed manually. All of the cutscenes and most of the in-game animations were created using this method.
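As an illustration only: the recording side boils down to sampling the tracked transforms each frame into AnimationCurves and writing them into an AnimationClip. The bone path and names below are assumptions, and the real utility records rotation and multiple joints as well:

```csharp
using UnityEngine;

// Records the local position of one tracked transform into an AnimationClip.
public class MocapRecorderSketch : MonoBehaviour
{
    public Transform trackedHand;                 // driven by the XR rig at runtime
    public string bonePath = "Armature/Hand_R";   // hypothetical path inside the rig

    AnimationCurve posX, posY, posZ;
    float startTime;

    public void BeginRecording()
    {
        posX = new AnimationCurve();
        posY = new AnimationCurve();
        posZ = new AnimationCurve();
        startTime = Time.time;
    }

    void LateUpdate()
    {
        if (posX == null) return; // not recording
        float t = Time.time - startTime;
        Vector3 p = trackedHand.localPosition;
        posX.AddKey(t, p.x);
        posY.AddKey(t, p.y);
        posZ.AddKey(t, p.z);
    }

    public AnimationClip EndRecording()
    {
        var clip = new AnimationClip();
        clip.SetCurve(bonePath, typeof(Transform), "localPosition.x", posX);
        clip.SetCurve(bonePath, typeof(Transform), "localPosition.y", posY);
        clip.SetCurve(bonePath, typeof(Transform), "localPosition.z", posZ);
        posX = posY = posZ = null;
        return clip; // the editor utility then saves this as an asset
    }
}
```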
Portals are cool, so I implemented them in the game. They're purely visual, and therefore non-traversable; the game is complicated enough without traversable portals! They serve as a window into another world, and can be seen at the start and end of the game, as well as in some secret areas at various points throughout.
The concept of portal rendering is quite simple: a second camera updates its position based on the main camera. The second camera's position is calculated in a few steps (see the sketch after this list):
Transform the main camera's position into the local space of the source portal
Transform that local position back into world space using the destination portal
Repeat with the rotation
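In code, those steps look roughly like this. This is a sketch, not the game's exact implementation; many portal setups also add a 180° flip around the destination portal's up axis, depending on how the two portals are oriented relative to each other.

```csharp
using UnityEngine;

// Maps the main camera through the source portal's local space
// into the destination portal's space, for position and rotation.
public class PortalCamera : MonoBehaviour
{
    public Transform mainCamera;
    public Transform sourcePortal;
    public Transform destinationPortal;
    public Camera portalCamera; // renders the view shown inside the portal

    void LateUpdate()
    {
        // Position: world -> source portal local -> destination portal world.
        Vector3 localPos = sourcePortal.InverseTransformPoint(mainCamera.position);
        portalCamera.transform.position = destinationPortal.TransformPoint(localPos);

        // Rotation: same idea using quaternions.
        Quaternion localRot = Quaternion.Inverse(sourcePortal.rotation) * mainCamera.rotation;
        portalCamera.transform.rotation = destinationPortal.rotation * localRot;
    }
}
```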