Personal reflection - Jan Lishak 294322

Mazerama


The initial idea behind Mazerama was to make a maze with a ball that could be controlled by moving the platform. However, after implementing this simple approach we found out that the update rate of the tracked platform wasn't fast enough for smooth gameplay. This posed a challenge that is probably common to many other mobile AR projects: the lack of controllers or any good input source from the player. After some thought we decided to use the device's gyroscope, and I think that was the right choice. With the gyroscope the game is very intuitive to play: you move the device and it moves the ball. Even though the game logic is only a few lines of code, they were probably some of the most time-consuming lines I have ever written. Since we used the gyroscope on the device, we could no longer run the code on the computer with an external camera. It had to be compiled and run on the device, which took way too long for quick experimenting.
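As a rough illustration of how little code this took, here is a minimal, hypothetical sketch of a gyroscope-driven ball (the class name, field names, and force value are assumptions for this post, not our exact script). It reads the gravity vector reported by the gyroscope and pushes the ball in the maze plane; the signs may need flipping depending on device and scene orientation.

```csharp
using UnityEngine;

// Hypothetical sketch of gyroscope-driven ball control.
public class GyroBallController : MonoBehaviour
{
    [SerializeField] private Rigidbody ball;       // the maze ball
    [SerializeField] private float tiltForce = 5f; // tuning value, assumed

    private void Start()
    {
        // The gyroscope has to be enabled explicitly on mobile devices.
        Input.gyro.enabled = true;
    }

    private void FixedUpdate()
    {
        // Gravity direction in device space: tilting the phone changes
        // the x/y components, which we map onto the maze plane as a force.
        Vector3 gravity = Input.gyro.gravity;
        ball.AddForce(new Vector3(gravity.x, 0f, gravity.y) * tiltForce);
    }
}
```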


My experience working with Vuforia is mixed. On one hand, the setup was very easy and the integration with Unity is quite good. On the other hand, AR Foundation seems to give better performance by making better use of the ARKit and ARCore APIs. We had a few problems where Vuforia was unable to locate the image of the maze when the lighting was not ideal, and even when it found the marker there were times when tracking was lost, which occasionally interrupted the gameplay.


My main contributions to this project were the gyro script, testing, and setting up the project to work with the iOS and macOS versions of Unity. I also helped set up Git LFS on GitHub to make our collaboration easier.



Trash Recycling 


We wanted to make something that was both fun to play and educational. While brainstorming, I thought about an old PS3 game called Trash Panic and proposed the idea of sorting trash into different bins. From there we arrived at the final idea behind the game.


Since this educational application was much more complex than the previous one, it would have been very hard to build without the help of Unity's AR simulation in the editor. It took more than an hour to set it up with all the preview Unity packages, but it definitely saved us many more hours that we would otherwise have wasted waiting for the app to compile every time we made a small change. I remember having to make some adjustments so the app would work both in the simulator and on the device, such as accounting for mouse clicks instead of touch input, but it was still definitely worth the extra setup time. For placing the bins we made a script that raycasts from the camera to the ground. It first places a semi-transparent indicator of the item, and after a click it places the real object. It uses an array of prefabs and a matching array of indicator prefabs. All the scripts an object required were placed on the object itself; for example, the spawner already carried its spawn logic, so the idea of placing items was generalized and works with any prefab.
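A hedged sketch of that placement flow, assuming AR Foundation's ARRaycastManager and illustrative field names (not our actual script), could look like this:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch of the indicator-then-place flow described above.
public class BinPlacer : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private GameObject[] prefabs;           // real objects
    [SerializeField] private GameObject[] indicatorPrefabs;  // semi-transparent previews
    [SerializeField] private int selected;                   // which prefab to place

    private GameObject indicator;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Update()
    {
        // Raycast from the centre of the screen onto detected planes (the ground).
        Vector2 screenCentre = new Vector2(Screen.width / 2f, Screen.height / 2f);
        if (!raycastManager.Raycast(screenCentre, hits, TrackableType.PlaneWithinPolygon))
            return;
        Pose pose = hits[0].pose;

        // Keep the semi-transparent indicator on the hit point.
        if (indicator == null)
            indicator = Instantiate(indicatorPrefabs[selected]);
        indicator.transform.SetPositionAndRotation(pose.position, pose.rotation);

        // Mouse click in the editor/simulator, touch on the device.
        bool pressed = Input.GetMouseButtonDown(0) ||
                       (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began);
        if (pressed)
            Instantiate(prefabs[selected], pose.position, pose.rotation);
    }
}
```

Because any spawn logic lives on the prefab itself, the same placer works for bins, spawners, or any other object.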


The keystone of this whole AR application was plane detection. At first we had problems with detecting only the ground, but after a few searches we found the right settings and restricted detection to horizontal planes, which solved most issues. For the bins, we had to add colliders so that the trash could bounce off the sides, plus one collider at the bottom to recognize which piece of trash was put inside the bin. We also had to use a raycast to figure out which object was clicked on the screen. While a piece of trash was being carried, we used a lerp to smoothly simulate the object being held.
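To make those pieces concrete, here is a simplified sketch under assumed names and tags (not our actual code): plane detection restricted to horizontal planes, a physics raycast to pick up the tapped trash, a lerp to carry it in front of the camera, and a trigger collider at the bottom of a bin that recognizes what was dropped in.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch of trash pick-up and carry.
public class TrashInteraction : MonoBehaviour
{
    [SerializeField] private ARPlaneManager planeManager;
    [SerializeField] private Camera arCamera;
    [SerializeField] private float carrySpeed = 10f;

    private Transform carried; // the piece of trash currently held

    private void Start()
    {
        // Only detect horizontal planes, i.e. the ground.
        planeManager.requestedDetectionMode = PlaneDetectionMode.Horizontal;
    }

    private void Update()
    {
        // Physics raycast to figure out which trash object was tapped.
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = arCamera.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit) && hit.transform.CompareTag("Trash"))
                carried = hit.transform;
        }

        // Lerp the carried trash towards a point in front of the camera
        // so it looks like it is being held.
        if (carried != null)
        {
            Vector3 target = arCamera.transform.position + arCamera.transform.forward * 0.5f;
            carried.position = Vector3.Lerp(carried.position, target, carrySpeed * Time.deltaTime);
        }
    }
}

// Placed on the trigger collider at the bottom of each bin.
public class BinBottom : MonoBehaviour
{
    [SerializeField] private string binType = "Plastic"; // assumed label

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Trash"))
            Debug.Log($"'{other.name}' was dropped into the {binType} bin");
    }
}
```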


My contributions to this project were writing the scripts, setting up the simulator, testing, and deploying to iOS.



Knife throwing


We were all excited when we started this project because it was our first VR project. One of the main challenges was making hands with realistic physics, and we had to make a lot of adjustments before the throw felt natural. The hands are rigidbodies that follow the controllers' positions. We also had to figure out how to make the knife stick to the target, which we achieved by adding a little collider at the tip of the knife. This way the knife sticks to the target plane only when it is hit with the tip of the blade.
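A minimal sketch of both ideas, with assumed names and tags rather than our actual scripts: a velocity-tracked rigidbody hand that follows the controller, and a knife that only sticks when the first contact comes from the small tip collider.

```csharp
using UnityEngine;

// Rigidbody hand driven towards the tracked controller with velocities,
// so it still collides with the world instead of clipping through it.
public class PhysicsHand : MonoBehaviour
{
    [SerializeField] private Transform controller; // tracked controller transform
    [SerializeField] private Rigidbody handBody;

    private void FixedUpdate()
    {
        handBody.velocity = (controller.position - handBody.position) / Time.fixedDeltaTime;

        Quaternion delta = controller.rotation * Quaternion.Inverse(handBody.rotation);
        delta.ToAngleAxis(out float angle, out Vector3 axis);
        if (angle > 180f) angle -= 360f;
        handBody.angularVelocity = axis * (angle * Mathf.Deg2Rad / Time.fixedDeltaTime);
    }
}

// On the knife root (which has the Rigidbody); the tip is a small child collider.
public class KnifeStick : MonoBehaviour
{
    [SerializeField] private Collider tipCollider; // the little collider on the tip
    [SerializeField] private Rigidbody body;

    private void OnCollisionEnter(Collision collision)
    {
        if (!collision.collider.CompareTag("Target"))
            return;

        // Only stick if the contact was made with the tip, not the handle or side.
        if (collision.GetContact(0).thisCollider != tipCollider)
            return;

        body.isKinematic = true;                         // freeze the knife in place
        transform.SetParent(collision.transform, true);  // keep it attached to the target
    }
}
```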


For this project we had to meet in the VR room, which was fun. I mainly contributed with testing, since I had written a lot of the code in the previous project.


Wizard world


I really wanted to make a Harry Potter themed game, and the whole team was on board with that. We were able to find a whole map of Hogwarts. It took some time to fix all the shaders and buildings, but I think it looks really good. We reused a lot of knowledge (and also some scripts) from the previous game. One of the hardest parts was making the rig work while the player was flying; at first the hands would fly out of the body, which wasn't a very realistic experience. The minigame where the player uses the wand to slay Dementors was also quite challenging.
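As a rough sketch of the flying fix (the class, field names, and the "Thrust" input axis are invented for illustration): moving the whole XR rig root carries the camera and both tracked hands together, so the hands can no longer drift away from the body while flying.

```csharp
using UnityEngine;

// Hypothetical broom-flight locomotion that translates the XR rig root.
public class BroomFlight : MonoBehaviour
{
    [SerializeField] private Transform xrRig;      // root of the XR rig / XR Origin
    [SerializeField] private Transform headCamera; // HMD camera, used as flight direction
    [SerializeField] private float flySpeed = 6f;

    private void Update()
    {
        // Assumed input: a single "Thrust" axis (e.g. mapped to a trigger).
        float thrust = Input.GetAxis("Thrust");

        // Moving the rig root keeps the camera and hand controllers together.
        xrRig.position += headCamera.forward * (thrust * flySpeed * Time.deltaTime);
    }
}
```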


Main contributions: coding the rig, fixing the map, adding locomotion, fixing audio issues, and adding hold positions.


Honestly, this was collaboration at its finest. I think we all put in our best effort, and I am really glad for the team we put together. It was a really nice journey of exploring different AR and VR concepts. I learned a lot and also had fun in the process.
