After planning the idea out roughly, I spent too long messing around with and testing mechanics that I wasn’t even sure I would use in the project, and that certainly came back to punish me. I also spent too long trying to figure out the combat system: I would look at the scripts and the animators, then run away to do something else rather than slog through them trying to understand and make everything work.
In the future I need to plan more thoroughly. At the very minimum, I need to broadly list everything I will need at the start, such as sound. That way I have a crude to-do list for the entire project that can be fleshed out with more detail later. I should also at least write tidbits in the blogs as I go, instead of just keeping versions of everything I’ve done and then combing through them later, trying to explain once I’ve forgotten most of the details. And most importantly, do more work earlier in the project. Even if it’s 20 minutes a day, that’s still better than nothing.
I wanted to have a save system for the game when I started, and I got a very basic one working using a binary formatter, which can save simple data such as int, string, and float variables. I wanted more complex save data though, and that obviously required a more complex setup, which I didn’t end up doing, leaving save data partially developed.
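A minimal sketch of the kind of basic setup I mean, assuming a serializable container class with only simple fields (the class and field names here are illustrative, not my actual scripts):

```csharp
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using UnityEngine;

// Illustrative container -- only simple types (int, float, string),
// which is what a binary formatter can handle without extra work.
[System.Serializable]
public class SaveData
{
    public int level;
    public float health;
    public string playerName;
}

public static class SaveSystem
{
    static string SavePath => Application.persistentDataPath + "/save.dat";

    public static void Save(SaveData data)
    {
        var formatter = new BinaryFormatter();
        using (var stream = new FileStream(SavePath, FileMode.Create))
        {
            formatter.Serialize(stream, data);
        }
    }

    public static SaveData Load()
    {
        if (!File.Exists(SavePath)) return null;
        var formatter = new BinaryFormatter();
        using (var stream = new FileStream(SavePath, FileMode.Open))
        {
            return (SaveData)formatter.Deserialize(stream);
        }
    }
}
```

More complex save data (references to scene objects, nested state, and so on) needs extra serialization work on top of this, which is the part I never got to.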
Magic
Another combat mechanic I wanted to implement was magic. I didn’t figure out how I wanted it to work exactly, but I did create some custom particle effects that would be triggered when casting.
Procedural Room Generation
The random generation of the game was meant to include the spawning of random rooms, but I didn’t spend the time to figure out how to check where the doorways were in a room prefab. The base random generation is already in place thanks to the random enemy generation, but random rooms require a lot more complexity than I could take on at the time.
Potions
Another combat mechanic I originally wanted was potions and grenades. They would be simple enough to implement, and would use the same generation as the weapon spawning, but I didn’t have the time to complete this mechanic.
I wanted to have an AI for the enemies to make combat that much more interesting on top of the sword mechanics. I followed the playlist below to create an AI, but quickly found out the videos leave out a lot of scripts, as well as lines in the scripts that are shown, making the rest of the code useless without them.
I got very frustrated with this and spent countless hours trying to find any clues in every video, even the ones I wasn’t planning on using in the first place. I spent way too long on this and gave up many times before coming back a few days later to try again. I eventually started making progress and filled in some blanks, but they weren’t enough to get everything working properly. So, time for the not-so-good-looking option.
The videos show capsule colliders on most of the bones, but I simplified it so there are only two box colliders, left and right, to detect when you hit the enemy. The detection triggers an animation depending on where the enemy got hit. The video version would blend the animations based on how close to the centre of the character the collision occurred, whereas my version doesn’t blend at all.
Through a lot of trial and error, I figured out a way to make sure an instantiated prefab could detect a collision on itself and not on other copies, and activate animations in its own animator without triggering those of another copy. I ended up using four scripts for this: one on each collider, and two on the parent GameObject. One script on the parent looks for, detects, and chases the player. The other script on the parent stores variables to be accessed by the scripts on the collider GameObjects. Each collider script detects when its respective collider has been hit, plays its respective hit animation, and disables both colliders until they are ready to be hit again. The videos below show player detection, hit reaction, and enemy behaviours.
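Roughly, each collider script looks something like this (class names, the "Weapon" tag, and the recovery timing are placeholders for illustration, not my exact scripts):

```csharp
using UnityEngine;

// Sits on one of the two hit colliders (left or right) of an enemy instance.
public class EnemyHitZone : MonoBehaviour
{
    public string hitAnimationTrigger = "HitLeft"; // "HitRight" on the other collider
    public EnemyHitZone otherZone;                 // assigned in the inspector
    public float recoveryTime = 1f;

    Animator animator;
    Collider myCollider;

    void Awake()
    {
        // Look up components on THIS instantiated enemy only, so copies of
        // the prefab never trigger each other's animators.
        animator = GetComponentInParent<Animator>();
        myCollider = GetComponent<Collider>();
    }

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Weapon")) return;

        // Play this side's hit animation.
        animator.SetTrigger(hitAnimationTrigger);

        // Disable both colliders until the enemy is ready to be hit again.
        myCollider.enabled = false;
        if (otherZone != null) otherZone.myCollider.enabled = false;
        Invoke(nameof(Recover), recoveryTime);
    }

    void Recover()
    {
        myCollider.enabled = true;
        if (otherZone != null) otherZone.myCollider.enabled = true;
    }
}
```

The key point is that every lookup goes through `GetComponentInParent` on the instance itself, never through a scene-wide search, which is what keeps copies of the prefab independent.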
The white circle and yellow lines are drawn by an editor script in the Editor folder, so they won’t appear in a built version. This is useful when you need to see something in the scene view but not in the built game, such as a detection radius.
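For anyone curious, a sketch of how that kind of editor-only drawing can be done with a custom editor; `EnemyController` and `detectionRadius` are assumed names for illustration, and the exact Handles calls are my guess at how the circle and lines were drawn:

```csharp
// Lives in an Editor folder, so it is excluded from builds entirely.
using UnityEditor;
using UnityEngine;

[CustomEditor(typeof(EnemyController))]
public class EnemyControllerEditor : Editor
{
    void OnSceneGUI()
    {
        var enemy = (EnemyController)target;

        // White circle: the enemy's detection radius, scene view only.
        Handles.color = Color.white;
        Handles.DrawWireDisc(enemy.transform.position, Vector3.up, enemy.detectionRadius);

        // Yellow line: the direction the enemy is facing.
        Handles.color = Color.yellow;
        Handles.DrawLine(
            enemy.transform.position,
            enemy.transform.position + enemy.transform.forward * enemy.detectionRadius);
    }
}
```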
Before I actually got this to work, the player detection and enemy behaviour were quite different. In one version, the player would always be detected and looked at, but would only be chased when within a certain range. In the other, the player would only be detected within a certain range, but there was no way to hide whilst in that range, and you would always be chased. Below are videos to show these.
As you can see in the first video, for some reason the enemy would lean over as the player got closer, leading to some interesting results.
For me to want to repeatedly play a game, I need different experiences each time. You can have the same basic story with different encounters/mechanics, or the same encounters/mechanics with different stories. Customisation in games really helps with this, as you can change your play style and enjoy a game that way too. One way to change the experience every time is procedural/random generation – terrain, enemies, items, or just about anything you can think of. Computer generation isn’t exactly random – everything is determined by calculations – but it’s pretty close and gives the feel of randomness, which is more important. I used the video below to create a script that generates random coordinates within an area I designate, and spawns prefabs at those coordinates.
You can see it working in the video below. The white cubes mark the corners of the spawn area. The script is set up so each element in the Enemy Types list corresponds to the element with the same number in the Enemy Spawn Counts list, and the prefab in the types list is spawned that many times. The spawn coordinates are all random within the parameters set, and even the rotation can be varied.
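In outline, the spawner works something like this (a simplified sketch of the idea, with placeholder names; my actual script has more options):

```csharp
using UnityEngine;

// Spawns each enemy prefab a set number of times at random points in a box.
public class RandomSpawner : MonoBehaviour
{
    public GameObject[] enemyTypes;   // element i pairs with enemySpawnCounts[i]
    public int[] enemySpawnCounts;
    public Vector3 areaMin;           // opposite corners of the spawn area
    public Vector3 areaMax;

    void Start()
    {
        for (int i = 0; i < enemyTypes.Length; i++)
        {
            for (int n = 0; n < enemySpawnCounts[i]; n++)
            {
                // Random coordinates within the area, random facing direction.
                var pos = new Vector3(
                    Random.Range(areaMin.x, areaMax.x),
                    areaMin.y,
                    Random.Range(areaMin.z, areaMax.z));
                var rot = Quaternion.Euler(0f, Random.Range(0f, 360f), 0f);
                Instantiate(enemyTypes[i], pos, rot);
            }
        }
    }
}
```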
I made a modified version of this so you can choose the points you want an object to spawn at, and it will choose an object randomly from the list given to it. I used this to spawn weapons for the player to choose from, to vary their experience more.
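The modified version inverts the logic: fixed positions, random objects. Roughly (again a placeholder sketch, not the exact script):

```csharp
using UnityEngine;

// Fixed spawn points, with a random prefab chosen for each point.
public class WeaponSpawner : MonoBehaviour
{
    public Transform[] spawnPoints;   // hand-placed points in the scene
    public GameObject[] weaponPool;   // weapons to pick from at random

    void Start()
    {
        foreach (var point in spawnPoints)
        {
            var weapon = weaponPool[Random.Range(0, weaponPool.Length)];
            Instantiate(weapon, point.position, point.rotation);
        }
    }
}
```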
I really dislike the currently popular method of melee combat/interactions; most objects don’t feel like they have any weight, and they can move as easily and as quickly as the controller can. They also pass through most objects in an environment instead of receiving force based on the object’s size and weight. Force feedback would solve this if it were widely available and safe, as the force needed to stop a sword could easily break an arm if done wrong.
Evan Fletcher devised a way to create a more realistic feeling of feedback from an object within the current limitations of VR. He doesn’t show any of the setup he did except for the settings on a configurable joint; the rest he only vaguely mentions as bullet points of what needs to be done, leaving me to figure out what he means and recreate it to the best of my ability. After ignoring this for weeks while trying to find an alternative, I finally sat down and had a think. First I tried to interpret the references to his Unity scene setup, before creating a C# script to crudely run through the bullet points presented by Mr. Fletcher. These are what I came up with:
The wrist prefab has the configurable joint on it and is instantiated by the script when you pick up a weapon. The wrist object becomes a parent to the weapon object and a child of your XR rig hand/controller. When you drop the weapon, the wrist prefab is destroyed, but the weapon stays.
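The pickup/drop flow described above looks roughly like this; the class, field names, and the joint hookup are my own placeholders for illustration, not Mr. Fletcher’s setup or my exact script:

```csharp
using UnityEngine;

// Rough outline of the pickup/drop flow: hand -> wrist (ConfigurableJoint) -> weapon.
public class WeaponGrab : MonoBehaviour
{
    public GameObject wristPrefab;   // carries the ConfigurableJoint
    public Transform handTransform;  // the XR rig hand/controller

    GameObject wristInstance;

    public void PickUp(GameObject weapon)
    {
        // Wrist is spawned as a child of the hand, and the weapon
        // becomes a child of the wrist.
        wristInstance = Instantiate(wristPrefab, handTransform);
        weapon.transform.SetParent(wristInstance.transform);

        // Assumed hookup: the joint on the wrist constrains the weapon's rigidbody.
        wristInstance.GetComponent<ConfigurableJoint>().connectedBody =
            weapon.GetComponent<Rigidbody>();
    }

    public void Drop(GameObject weapon)
    {
        weapon.transform.SetParent(null); // the weapon stays in the world
        Destroy(wristInstance);           // the wrist prefab is destroyed
    }
}
```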
This worked, somewhat surprisingly, but I know there are things I haven’t thought of when deciphering Mr. Fletcher’s notes.