2 client project demos for Michel Sainisch, ex-Ubisoft producer, for a tactical RPG prototype with elements from Front Mission (1995) and XCOM: Enemy Unknown (2012).
“In a distant future on a faraway planet, lead giant humanoid machines equipped with an arsenal of destruction. Customize pilots and machines to your liking, and fight on procedurally generated maps with environmental hazards, weather effects and more in an extensive single-player campaign, or in online and offline PVP!”
-Designed, built and led development for 5 procedural level generator prototypes.
-Led design & production for a team of over 20 people
-Designed 12 interchangeable level pieces
-Animation & VFX implementation
-Design & lead development for a procedural level generator
-Building procedural generator prototypes
-2D procedural level design
-Theme & concept design for levels
-System design templates
-Defining design pillars
-Upholding the design vision
-Task distribution for the design team
-Managing client interactions
-Core gameplay design
-Class concept design
-Writing technical design documents
-Animation system creation and implementation
-Particle system creation and implementation
-World interaction & encounter implementation
-Ragdoll physics implementation
-Task management for 20+ people
-Managing client expectations
Procedural generator principles
1. Researching procedural development & concept creation
To better understand the process of procedural level design, I researched various generator types to see which would best suit our needs. I then prototyped several of them during our concepting phase and reviewed the results on the basis of quantifiable level design parameters distilled from my level-design research. Consequently, we settled on a Voronoi / plot-and-parcel hybrid: taking a pre-set-up, tileable Voronoi pattern and point-sampling a combination of its chunks to create a new level.
[Picture] This presentation slide was used to pitch the procedural generation system to the rest of the team. The first generator was based on an infinitely tiling Voronoi pattern of which we could turn chunks on / off to change the size, shape and playstyle of the level. On top of this, most cover in each chunk could be turned on or off by a set algorithm or hand-placed by the level designer, creating a balance between designer influence and procedural generation.
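The chunk on / off idea can be sketched in a few lines. This is a minimal illustration of the principle, not the actual Unreal implementation: each grid tile is assigned to its nearest Voronoi seed, and whole chunks are then randomly enabled or disabled to reshape the level.

```python
import random

def generate_level(width, height, seeds, active_ratio=0.7, rng=None):
    """Assign each tile to its nearest Voronoi seed, then toggle whole
    chunks (all tiles sharing a seed) on or off to reshape the level."""
    rng = rng or random.Random()
    chunk_of = {
        (x, y): min(range(len(seeds)),
                    key=lambda i: (seeds[i][0] - x) ** 2 + (seeds[i][1] - y) ** 2)
        for x in range(width)
        for y in range(height)
    }
    # Randomly enable a subset of chunks; disabled chunks become void (None).
    active = {i for i in range(len(seeds)) if rng.random() < active_ratio}
    return {pos: (cid if cid in active else None) for pos, cid in chunk_of.items()}

seeds = [(2, 2), (7, 3), (4, 8), (9, 9)]
level = generate_level(12, 12, seeds, rng=random.Random(42))
```

Because the seed pattern tiles infinitely, varying which chunks are active changes the size and silhouette of the playable space without authoring new geometry.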
2. Proof of concept
To find the best way to generate the level, we made several prototypes based on different approaches to procedural generation. This surfaced generation problems early and gave us early prototypes to test with.
[GIF left] This GIF shows a prototype that the main procedural programmer made to prove the Voronoi generation concept. It shows the indirect generation of a path through the Voronoi shape and the change in level shape that follows.
[GIF right] This GIF shows the different iterations of the plot-and-parcel prototype that I made to test whether self-created paths would result in a bigger expressive space faster than a predetermined map.
3. Building the first generator
When building the first generator based on the Voronoi prototype, we quickly ran into a couple of problems: the artists couldn't work out a theming when every wall or level edge could potentially be there or not, the lack of height difference reduced visual impact and readability, and the level design felt very 'random' and didn't promote different styles of gameplay. Lastly, the concept shifted from a single-player game to multiplayer, which the generator was not catered towards, so we had to go back to the drawing board.
[Picture] This screenshot was taken just before we ditched this prototype. On the left it shows some of the map generations we could already make with the setup at the time; we had 4 out of 6 stages designed at this point.
4. Redesigning the generator
Before going back to the drawing board, I first did some more research into existing procedural level generators and opted for a much simpler system than our last one, while reusing a lot of the logic we had already built: a persistent map divided into premade chunks which we switch out. This catered towards 'balanced' multiplayer gameplay while giving the artists more control over the environment. Furthermore, we set strict visual guidelines to ensure readability and shifted the theming to a more industrial, blocky environment.
[Picture left] This slide shows some of the research I did into XCOM's plot-and-parcel system: watching some of their talks, reading post-mortems and distilling the findings into takeaways for our own procedural system.
[Picture right] This image shows the basics of our new procedural system; being far more intuitive and keeping to very basic randomization made the system a lot more manageable from a design perspective.
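The core of the new system reduces to a very small sketch (the slot and variant names below are hypothetical, not our actual content): the persistent map is divided into fixed slots, and generation simply picks one premade chunk variant per slot.

```python
import random

# Hypothetical chunk library: each slot on the persistent map has a few
# designer-authored variants; generation just picks one per slot.
CHUNK_VARIANTS = {
    "north_flank": ["n_open", "n_cover_heavy", "n_elevated"],
    "center":      ["c_plaza", "c_ruins"],
    "south_flank": ["s_open", "s_cover_heavy", "s_elevated"],
}

def generate_map(rng=None):
    """Pick one premade variant per slot; the overall map layout stays fixed."""
    rng = rng or random.Random()
    return {slot: rng.choice(variants) for slot, variants in CHUNK_VARIANTS.items()}

layout = generate_map(random.Random(7))
```

Because every variant is hand-authored, the artists can theme each chunk, while the per-slot randomization still yields a different map every match.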
5. Level design
One of the trade-offs of the new procedural system was more manual labor for me as a level designer: I had to make 12 unique gameplay setups with randomized cover and semi-random area connections. During this process I focused on making each area stand out more in its type of gameplay and on playing more with the permanent pieces of cover.
[Picture] This overview shows the type of map that we fed into the procedural level generator and the color codes that we used to describe the different tile types. Furthermore, it shows some unique chunk iterations and some of the different possible map generations that the procedural machine could make. Lastly, we played more with height in this controlled environment, allowing the artists to make a more interesting gameplay location.
6. Iterations and microversions
After making the initial few subversions for the chunks, I made some more distinct microversions to increase the expressive space of the generator. I also iterated on some of the existing designs by increasing the space between pieces of cover and balancing the level generation algorithm.
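The cover-spacing iteration can be sketched as a simple rejection rule (a minimal illustration under assumed parameters, not the shipped algorithm): optional cover points are toggled on at some density, but a point is rejected if it would land too close to cover that was already placed.

```python
import random

def place_cover(candidates, min_dist, density, rng=None):
    """Randomly enable optional cover points, rejecting any point closer
    than min_dist to already-placed cover (keeps lanes and sightlines open)."""
    rng = rng or random.Random()
    placed = []
    for pt in rng.sample(candidates, k=len(candidates)):  # random visit order
        if rng.random() > density:
            continue  # this cover point stays off in this generation
        if all((pt[0] - q[0]) ** 2 + (pt[1] - q[1]) ** 2 >= min_dist ** 2
               for q in placed):
            placed.append(pt)
    return placed

grid = [(x, y) for x in range(10) for y in range(10)]
cover = place_cover(grid, min_dist=2, density=0.5, rng=random.Random(1))
```

Raising `min_dist` is the knob that "increases the space between pieces of cover" mentioned above: every accepted point is guaranteed to keep that distance from all earlier ones.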
[Picture] This screen capture is one of the possible generations of the level on delivery.
Lead design tasks
During my time as a lead designer I had several responsibilities including:
Technical design documents
To explain to the programming team how to build our core mechanics, we decided to set up multiple technical design documents. Consequently, I made a template for the system designers based on a collection of their existing documents, defining variables, core systems, exceptions, calculations, etc.
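As an example of the kind of calculation such a document pins down, here is a hypothetical additive hit-chance model in the XCOM style; the modifier names and numbers are illustrative, not the prototype's actual values.

```python
def hit_chance(base_aim, modifiers):
    """Additive hit-chance model: the attacker's base aim plus all
    situational modifiers (cover, height, range...), clamped to 0-100."""
    return max(0, min(100, base_aim + sum(modifiers.values())))

# 65 base aim, -40 high cover, +20 height advantage, +10 close range -> 55
chance = hit_chance(65, {"high_cover": -40, "height_advantage": 20, "close_range": 10})
```

Writing the formula out like this, with every variable named and the clamping made explicit, is exactly what the template asked designers to do before handing a system to the programmers.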
[Picture] This technical design document excerpt explains some of the core combat systems seen in the SYMBios prototype. Here I explain the difference between the hit chance modifiers and define all the variables used in the calculations.
System / class design
Because of the tactical nature of the brief, we spent a lot of time on system design and paper prototyping, trying to find a USP while ensuring balanced strategic play at the same time. We therefore analysed multiple existing tactical RPGs and based most of our systems on rock-paper-scissors dynamics. As a lead I had to oversee this process and ensure consistency across the designs.
[Picture] This slide presents my process of designing a pitch for the first Ragnarok prototype: starting from guerrilla-based combat, we came up with a trinity based on movement, attack range and crowd control.
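A trinity like this behaves as rock-paper-scissors. A minimal sketch of such a counter system follows; the archetype names and the 1.5x multiplier are assumptions for illustration, not the prototype's tuning.

```python
# Each archetype beats the next in the cycle; multipliers are illustrative.
COUNTERS = {
    ("skirmisher", "sniper"):     1.5,  # mobility closes the gap on long range
    ("sniper", "controller"):     1.5,  # range outreaches crowd control
    ("controller", "skirmisher"): 1.5,  # crowd control locks down mobility
}

def damage_multiplier(attacker, defender):
    """1.5x when favored, 1/1.5 when countered, 1.0 in a mirror match."""
    if attacker == defender:
        return 1.0
    return COUNTERS.get((attacker, defender), 1.0 / 1.5)
```

The appeal of the cycle for balance work is that no archetype dominates: every pick has exactly one favorable and one unfavorable matchup.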
As design lead during the second iteration of Ragnarok, I had to advocate the design vision, ensuring that the core design was in line with client expectations and coherent throughout development. I therefore researched several game design theories, like Quantic Foundry's gamer motivation profile and cognitive threshold map, to support the reasoning behind our target demographic and core designs, then presented the findings to the design team and set up guidelines for future designs.
[Picture] This image shows part of the breakdown I made for Ragnarok, listing what our potential demographic would be looking for in a tactical RPG and which parts of the game are made for that purpose.
Game motivation model overview (Yee, 2015)
Cognitive threshold in strategy games (Yee, 2016)
For both Ragnarok projects, I was tasked with the project planning: setting up project milestones, sprint goals and deliverables for different showcases. We set up several systems to hold ourselves to those milestones, and I attended weekly meetings to discuss our progress with the client.
[Picture] For the weekly project planning, I sat down with the designers at the start of every sprint and set their deliverables for the week. These had to be updated through Jira and kept up to date. This example shows one of our backlogs.
Throughout the project we held weekly meetings with our client, updating him on the latest progress and our plans for future milestones. During this process I learned about the importance of managing client expectations and how clients influence the design process.
[Picture] This image shows one of the presentation slides I made for the client to explain our core class system during one of the weekly updates. I would talk over these and show the differences and considerations made for each class.
In both iterations of Ragnarok I worked on the animation systems: setting up the animation blueprint, connecting the AnimGraph, linking up animations and creating the events that play them. Through this process I learned how Unreal's animation system works and how to connect animations.
[GIF] This animation loop is a simple example of some work I did on the first Ragnarok: our animator didn't have the time to make a wind-up animation between idle and running, so I used Unreal's animation blueprint to lerp between the animations, creating the illusion of the robot changing its running speed based on its movement speed.
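The idea behind that blueprint can be expressed outside Unreal as a clamped lerp driven by movement speed; the speed range below is a placeholder, not the project's actual tuning.

```python
def lerp(a, b, t):
    """Standard linear interpolation between a and b."""
    return a + (b - a) * t

def run_blend_alpha(speed, idle_speed=0.0, run_speed=600.0):
    """Normalized blend weight between the idle pose (0.0) and the run
    pose (1.0), driven by the character's current movement speed."""
    t = (speed - idle_speed) / (run_speed - idle_speed)
    return max(0.0, min(1.0, t))

alpha = run_blend_alpha(300.0)  # halfway between idle and run
```

Feeding this alpha into a pose blend each frame gives a smooth ramp between the two animations even without an authored wind-up clip.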
For both Ragnarok demos I worked on the particle systems, primarily focusing on the shooting VFX. I was responsible for creating the particle systems and implementing them on the animations. This experience taught me about Unreal's Cascade particle editor, the socket system and animation notifies.
[GIF] This showcase presents the different weapons for the SYMBios demo: to create the effects I first made a particle system for each weapon, linked them up to the animations and implemented the sound FX.
For the first Ragnarok demo I implemented most of the camera VFX to give the attack animations a more cinematic feel. This consisted of applying camera shakes and time dilation, and changing camera angles while playing with depth of field.
[GIF] For the sword slash animation I first changed the camera angles and added a black-bar transition. Afterwards I added some particle FX to the sword slash and timed the time dilation and camera shake to add weight to the attack.
Originally conceived as an environmental hazard, the gas bag ended up only being implemented as a dynamic background element. I therefore quickly animated the gas bag in code: changing its size while rotating it and moving it back and forth. Furthermore, I made a simple pathing system so the artists could quickly place them in the level.
[GIF left] The pathing system that I made for the artists: it works by placing nodes in the level that automatically connect, which the gas bag then follows.
[GIF right] The gas bag's scripted animation up close; to animate it I made it rotate, move and change size using timelines. This means the gas bag increases in size at the same moment it flies up, while randomized starting values mix up the pattern.
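The timeline setup can be approximated as a function of time (a re-creation of the idea with placeholder amplitudes, not the project's code): size and height oscillate in phase so the bag inflates as it rises, and a per-instance phase offset keeps the bags from moving in lockstep.

```python
import math

def gas_bag_pose(t, phase=0.0, base_scale=1.0, bob_height=50.0):
    """Return (scale, z_offset, yaw) at time t; phase is randomized per
    instance so each gas bag's bobbing pattern is offset from the others."""
    s = math.sin(t + phase)
    scale = base_scale * (1.0 + 0.2 * max(s, 0.0))  # inflate only while rising
    z_offset = bob_height * s                        # bob up and down
    yaw = (t * 10.0) % 360.0                         # slow constant rotation
    return scale, z_offset, yaw
```

Evaluating this per frame reproduces the timeline behavior: at the top of the bob (`sin` at its peak) the bag is both highest and at maximum size.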
To make the combat animations of the SYMBios demo more dynamic, we wanted to implement a version of physics-body animations based on the impact damage and the type of projectile the mech is hit with. The rigger / animator and I set up physics bodies on the mechs that respond to collision; due to time constraints, however, we couldn't implement the full version in the final demo.
[GIF left] This GIF shows the collision boxes that the rigger / animator set up for the mech. Afterwards I had to figure out how to interpolate them with the animation.
[GIF right] Here I tested how the mechs respond to 'forces' being applied to them using the actual game units, proving that the setup works on the in-game models as well.
Role: Procedural level designer
Dev. time: May 2017 – Jun. 2018
Time frame: 7 weeks – 10 months
Team size: 20+
Project status: Both demos delivered