Met the initial goal
In general, I think we met our expectations for the time schedule.
Technically, we achieved the basic functions: feeding, cleaning up poop, the portal, and dancing. The rendering of the virtual scene behind the portal also looks good. However, the development process was not smooth sailing. Some functions I had assumed were very basic, such as image recognition, could not be handled by the stock AR Foundation scripts in the way our design required, so I had to supplement and extend the original code.
At the same time, this project taught me some fundamentals of AR development, such as how to set the position of the virtual scene relative to the real world, and how the virtual camera relates to the phone's physical camera. Because I could not find a suitable tutorial, I also had to read the sample files and scripts bundled with the SDKs (such as ManoMotion) to work out what functions were available.
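As an illustration of placing virtual content relative to the real world, here is a minimal Unity AR Foundation sketch, not the project's actual code: the class and field names are my own, and it assumes an ARRaycastManager and plane detection are set up in the scene. The virtual camera is driven by the device pose, so anything instantiated at a raycast hit pose stays fixed to that real-world spot.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical example: tap a detected plane to place the virtual scene there.
public class TapToPlace : MonoBehaviour
{
    public GameObject scenePrefab;          // the virtual content to place
    public ARRaycastManager raycastManager; // assigned in the Inspector

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the screen touch into the detected planes.
        if (raycastManager.Raycast(touch.position, hits,
                                   TrackableType.PlaneWithinPolygon))
        {
            // Spawn the virtual scene at the real-world hit pose;
            // it will keep this pose as the phone camera moves.
            Pose pose = hits[0].pose;
            Instantiate(scenePrefab, pose.position, pose.rotation);
        }
    }
}
```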
The most challenging parts
For me, the most challenging part of this AR project was feeding the pet with the cards, because unexpected bugs kept appearing during development. For example: the camera recognized multiple results from a single card; the model spawned from a card stayed in place after the card left the camera's view; the camera could not recognize several cards at the same time; and when the spawned food model collided with the pet, the pet would suddenly fly away or float in the air.
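One common way to deal with the "model stays after the card leaves the view" problem is to react to the tracked image's tracking state. The sketch below is my own illustrative reconstruction, assuming AR Foundation's ARTrackedImageManager and a hypothetical food prefab; it is not the project's actual script.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch: spawn one model per recognized card and hide it
// when the card is no longer actively tracked (e.g. it left the lens).
public class CardSpawner : MonoBehaviour
{
    public ARTrackedImageManager imageManager; // assigned in the Inspector
    public GameObject foodPrefab;              // model spawned for a card

    readonly Dictionary<TrackableId, GameObject> spawned =
        new Dictionary<TrackableId, GameObject>();

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
            // One model per card, parented to the tracked image.
            spawned[image.trackableId] =
                Instantiate(foodPrefab, image.transform);

        foreach (var image in args.updated)
            // Hide the model unless the card is actively tracked, so it
            // does not linger in place after the card leaves the view.
            spawned[image.trackableId].SetActive(
                image.trackingState == TrackingState.Tracking);

        foreach (var image in args.removed)
        {
            Destroy(spawned[image.trackableId]);
            spawned.Remove(image.trackableId);
        }
    }
}
```

As for recognizing several cards at once, in AR Foundation this is typically bounded by the image manager's maximum number of moving images setting, which defaults to a small value on some platforms.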
At first, I was completely clueless about how to solve these problems, because no tutorial or other source explained the reasons for these errors. After a step-by-step investigation, however, I found that apart from the parts that genuinely needed code changes, many bugs had very simple causes. For example, the pet flying around, which troubled me for a long time, happened because I was using an AR Anchor; after turning it off, the error disappeared.