
VRZoo. Project 04
Platform. VR
Engine. Unity
Animal models. Unity Asset Store
Development period. March 2021 to July 2021
Major assignment for an undergraduate AR/VR design course
I have watched many of the BBC's classic wildlife documentaries, such as Planet Earth and The Blue Planet, and they left me deeply moved by the survival rules of nature. When the opportunity to build a VR scene came up, I decided to make a wildlife park in which players build a private animal world from scratch. However, if players could only watch the animals without interacting with them, the scene would feel rather dull. This reminded me of the play mode of Planet Zoo, where animals also suffer from physiological problems such as hunger and poor health, which restores the authenticity of the animal world as much as possible. Players need to take good care of every kind of animal. The more animals living in the virtual world, the more gold coins the player earns, and these coins can be spent on rarer beasts. Players who accumulate enough gold coins within the time limit clear the game. To win, therefore, players must weigh several factors and decide whether to buy feed, buy animals, or save their coins.



VR's advantageous application: VR provides an unparalleled sense of immersion. Compared with ordinary models in a PC scene, the all-round visual experience a VR scene offers is beyond imagination. Immersion is essential to a VR zoo, because the sense of oppression created by the height of giant animals such as giraffes and elephants cannot be reproduced on other platforms.
VR movement control: With VR tracking, the user's movement in the real world is fed back into the VR scene; the range of movement depends on the range of the locators set up at the start. Beyond that, ray teleportation or trackpad sliding can be used. With the trackpad, the speed is hard to control: users feel dizzy and may collide with animals or even clip through their models. Since this was not what we wanted, we used ray teleportation to control movement.
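Ray teleportation can be sketched as a raycast from the controller followed by moving the camera rig to the hit point. This is a minimal illustration using generic Unity APIs; the field names and the `Fire1` input check are placeholders, not the project's actual Vive bindings.

```csharp
using UnityEngine;

public class RayTeleport : MonoBehaviour
{
    public Transform controller;   // tracked VR controller (assumed reference)
    public Transform playerRig;    // root of the camera rig to move (assumed reference)
    public LayerMask teleportMask; // layers the player may teleport onto

    void Update()
    {
        // Placeholder input; the real project would read the Vive trigger instead.
        if (Input.GetButtonDown("Fire1"))
        {
            Ray ray = new Ray(controller.position, controller.forward);
            if (Physics.Raycast(ray, out RaycastHit hit, 20f, teleportMask))
            {
                // Move the whole rig to the hit point in one step,
                // avoiding the motion sickness of continuous sliding.
                playerRig.position = hit.point;
            }
        }
    }
}
```

Teleporting in one discrete step is what avoids the dizziness of trackpad sliding: the camera never interpolates through intermediate positions.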
VR grabbing control: VR grabbing comes in two forms, Basic Grab and Sticky Grab, which differ in their interaction mode. With Basic Grab, the user must hold the trigger down for as long as the grab lasts; with Sticky Grab, the user clicks the trigger once to grab and once more to drop. Since players may hold an animal for a long time while adjusting its position, we adopted Sticky Grab. We added the Sticky Grab script to every controllable object in the scene and made sure the VR camera had the child object Vive Collide, which is what makes grabbing work.
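The toggle behaviour of Sticky Grab can be sketched as follows, again with generic Unity input standing in for the Vive-specific bindings; the `grabRadius` field and the nearest-collider lookup are assumptions for illustration, not the toolkit's actual implementation.

```csharp
using UnityEngine;

public class StickyGrabSketch : MonoBehaviour
{
    Transform heldObject;           // currently grabbed object, if any
    public float grabRadius = 0.1f; // assumed grab range around the controller

    void Update()
    {
        if (Input.GetButtonDown("Fire1")) // placeholder for the trigger click
        {
            if (heldObject == null)
            {
                // First click: grab the nearest collider and parent it to the controller.
                Collider[] hits = Physics.OverlapSphere(transform.position, grabRadius);
                if (hits.Length > 0)
                {
                    heldObject = hits[0].transform;
                    heldObject.SetParent(transform);
                }
            }
            else
            {
                // Second click: release the object where it is.
                heldObject.SetParent(null);
                heldObject = null;
            }
        }
    }
}
```

The key difference from Basic Grab is that the grab state persists between clicks instead of being tied to the trigger being held down.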



Animal AI: Predation: Predators and prey were distinguished and graded. All animals were divided into six levels, 0 to 5, based on the idea of big fish eating small fish: high-level carnivores prey on lower-level herbivores. When a carnivore joins the scene, its hunger value starts full and is consumed as time passes. Once the value drops below a threshold, the carnivore traverses the list of existing animals looking for targets with a lower prey level. If such a target exists, the carnivore searches for and tracks it; if not, it keeps starving. Take a tiger preying on a porcupine as an example. Once the porcupine notices that the tiger has locked onto it and is tracking it, it flees. During the chase, if the porcupine is slower than the tiger it will eventually be caught; meanwhile, the tiger's hunger drains twice as fast while running, so a tiger that cannot catch its prey starves quickly. If the porcupine is caught, it gets a chance of a counter kill based on the difference between the two predation levels. The equation is simple: survivePer = 10% / Math.Pow(preyLevel - this.preyLevel, 10). The tenth power guarantees that no rabbit will ever defeat a tiger. If the tiger's luck holds, the porcupine makes a good meal and restores 100 points of the tiger's hunger.
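The counter-kill check can be sketched as plain C#. Only the 10% / (level difference)^10 formula comes from the description above; the class, method names, and the assumption that the level difference is positive (predators only hunt lower levels) are illustrative.

```csharp
using System;

public static class PredationCheck
{
    static readonly Random rng = new Random();

    // Probability that the prey kills the attacker instead of being eaten.
    public static double SurviveChance(int predatorLevel, int preyLevel)
    {
        int diff = predatorLevel - preyLevel; // assumed positive by the hunt rules
        return 0.10 / Math.Pow(diff, 10);
    }

    public static bool PreyCounterKills(int predatorLevel, int preyLevel)
    {
        return rng.NextDouble() < SurviveChance(predatorLevel, preyLevel);
    }
}
```

A one-level gap gives a 10% survival chance, but a two-level gap already collapses to 0.1/1024, which is why a rabbit effectively never defeats a tiger.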
Movement: The target points for the animals' irregular wandering were generated with the Random function, and movement was driven by the Rigidbody. We used Vector3.Distance to check the distance to the current target and generated the next target point on arrival. In general, as long as the x and z rotation of the Rigidbody are locked, the animal moves correctly. During testing, however, slopes and other terrain caused animals to lose balance in areas with too steep a pitch, and a rigid body with a box collider alone could not prevent this. To solve it, we used Unity's built-in navigation system (NavMesh). We marked the ground in the scene as Navigation Static, selected the walkable and non-walkable sections, set the maximum allowed slope angle, and clicked Bake. We then attached a NavMeshAgent to every character that moves in the scene, or added the [RequireComponent] attribute to the script so the component is mounted automatically. After adjusting parameters such as movement speed and angular speed, it was ready to use. The advantage of this approach is that it removes all the trouble of tuning movement speed and smoothing turns: a simple call does the job.
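The random-wander behaviour on a baked NavMesh can be sketched as below. The NavMeshAgent calls follow the standard Unity API; the radius and arrival-distance values are assumptions, not the project's tuned parameters.

```csharp
using UnityEngine;
using UnityEngine.AI;

[RequireComponent(typeof(NavMeshAgent))] // auto-mounts the agent, as described above
public class RandomWander : MonoBehaviour
{
    public float wanderRadius = 10f;    // how far away a new target may be (assumed)
    public float arriveDistance = 0.5f; // how close counts as "arrived" (assumed)
    NavMeshAgent agent;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
        PickNewTarget();
    }

    void Update()
    {
        // When the current target is reached, choose the next one.
        if (!agent.pathPending && agent.remainingDistance < arriveDistance)
            PickNewTarget();
    }

    void PickNewTarget()
    {
        Vector3 random = transform.position + Random.insideUnitSphere * wanderRadius;
        // Snap the random point onto the baked NavMesh so the target is walkable.
        if (NavMesh.SamplePosition(random, out NavMeshHit hit, wanderRadius, NavMesh.AllAreas))
            agent.SetDestination(hit.position);
    }
}
```

NavMesh.SamplePosition is what replaces the manual Vector3.Distance check against arbitrary random points: it guarantees every target lies on walkable ground within the baked slope limit.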