
Evaluating the process of production (Augmented Reality Prototype)

  • allazarr
  • Jan 20, 2021
  • 3 min read

Updated: Mar 19, 2021


Augmented Reality is an exciting technology that many companies have already embraced. The concept itself is suited to creating a bond between the digital and the real world. Currently, the bridge between these worlds is our daily companion, the smartphone. Using the phone's camera and sensors, developers can teleport elements from the digital world into ours in just a couple of lines of code.


Technically speaking, my prototype has two parts: the creation and the coding. The creation of the elements was done in Blender. Starting from simple shapes, I successfully created two pendulums; modeling spheres and cylinders allowed me to create the balls and the strings. Between the balls, at the center of the scene, sits a rotating circle. I cut a small gap in the upper part of the circle to leave space for the strings of the balls. As I specified in the Concept and practical consideration of Augmented Reality blog post, the goal of this creation is to be satisfying to the viewer. To achieve this effect, I needed to be careful with my angles and with the position of the elements: each cut needs to be at exactly 90 degrees, and the balls needed to be the same diameter to fit the hole in the spinning circle. To add realism, the whole structure is attached to a static leg which sits still on the table. Later in the process, this leg serves as a trigger for the surface it will be projected onto. The whole mesh has been textured with realistic materials to match the real world: wood for the leg and the circle, and polished stone for the balls, strings, and the circle pad. These textures give the piece a much more solid and heavy feel.
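As a rough illustration, the sizing constraints described above can be checked with a few lines of arithmetic. The dimensions here are hypothetical placeholders, not values taken from my Blender file:

```python
# Hypothetical dimensions (in Blender units); the real model uses its own values.
STRING_THICKNESS = 0.01
GAP_WIDTH = 0.06          # gap cut into the upper part of the circle
CLEARANCE = 0.01          # margin so the strings never scrape the circle

def strings_fit_gap(string_thickness, gap_width, clearance):
    """Two strings must pass through the gap with clearance on each side."""
    return 2 * string_thickness + 2 * clearance <= gap_width

def balls_match(diameters, tolerance=1e-6):
    """All balls must share the same diameter to swing symmetrically."""
    return max(diameters) - min(diameters) <= tolerance

print(strings_fit_gap(STRING_THICKNESS, GAP_WIDTH, CLEARANCE))  # True
print(balls_match([0.20, 0.20]))                                # True
```

Checks like these are trivial, but doing the arithmetic up front avoids re-modeling the circle after discovering the strings clip through it in the engine.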



The programming was done in Unreal Engine 4. Inside Unreal, I first experimented with the built-in Augmented Reality preset and tried implementing my meshes to match it. Using Android Studio, the Augmented Reality projection on an Android OS device was linked directly with Unreal. However, the versions of Unreal and Android Studio were incompatible at that moment, so I could not use the given preset. Following YouTube tutorials from both Dev Enabled and an Unreal Engine Live Session, I restarted the project from scratch. I started by setting up Google's ARCore to be compatible with the version of Unreal I was using. Once the software was linked, I was ready to set up the camera and the Event BeginPlay code. The camera needs to sit a bit behind the object to simulate the real smartphone camera. The camera-detection code is quite simple, as it is a detection tool linked with Google's ARCore module. Linking both the source and the geometry parameters gives the player the possibility of using the real geometry around them, such as the floor or, in my case, the desk. Once the detection tool is working, the AR application is already detecting the floor. The next step was defining the projection tool. After inserting my meshes into the program, I could add a trigger to each of them which allows the player to tap the screen wherever a flat surface is detected. Determining which element in my scene is the base was crucial for the tracking of the objects.
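In the project itself this tap-to-place flow is built from Blueprint nodes, but the logic can be sketched in plain, engine-agnostic Python. Everything below (the `Plane` class, `hit_test`, `on_tap`) is hypothetical and only mirrors the idea: on a tap, test whether the screen point lands on a detected plane, and if so, spawn the mesh at the hit location:

```python
# Engine-agnostic sketch of tap-to-place; names are hypothetical,
# not the actual Unreal Blueprint or ARCore API.
from dataclasses import dataclass

@dataclass
class Plane:
    """A flat surface detected by the AR framework (e.g. floor or desk)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    height: float  # vertical position of the plane

def hit_test(planes, x, y):
    """Return the first detected plane containing the tapped point, else None."""
    for plane in planes:
        if plane.x_min <= x <= plane.x_max and plane.y_min <= y <= plane.y_max:
            return plane
    return None

def on_tap(planes, x, y, spawned):
    """On a screen tap, place the pendulum mesh where a flat surface was hit."""
    plane = hit_test(planes, x, y)
    if plane is not None:
        spawned.append((x, y, plane.height))  # spawn mesh at the hit location
    return spawned

desk = Plane(0.0, 1.0, 0.0, 0.6, 0.75)
print(on_tap([desk], 0.5, 0.3, []))  # [(0.5, 0.3, 0.75)]
```

Taps that miss every detected plane simply do nothing, which matches the in-app behavior of only spawning on recognized flat surfaces.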

Applying a physics parameter to the base of the mesh allows me to simulate the tracking on that specific object.
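To give a feel for the kind of motion being simulated on the pendulums while the base stays static, here is a minimal semi-implicit Euler integration of a simple pendulum. This is only an illustrative sketch of the physics involved, not Unreal's physics engine, and the numbers are placeholders:

```python
# Minimal pendulum sketch: the base is anchored (it carries the tracking),
# while each ball swings. Semi-implicit (symplectic) Euler keeps the
# oscillation bounded. Parameters are hypothetical, not from the project.
import math

def simulate_pendulum(theta0, length=0.3, g=9.81, dt=0.001, steps=2000):
    """Integrate a simple pendulum; returns the list of angles over time."""
    theta, omega = theta0, 0.0
    angles = []
    for _ in range(steps):
        alpha = -(g / length) * math.sin(theta)  # angular acceleration
        omega += alpha * dt                       # update velocity first
        theta += omega * dt                       # then position (symplectic)
        angles.append(theta)
    return angles

angles = simulate_pendulum(theta0=0.2)
# The swing stays within its starting amplitude (up to small numerical error).
print(max(abs(a) for a in angles) <= 0.2 + 1e-2)  # True
```

In-engine, this boils down to choosing which component is simulated and which is kinematic: the base is kinematic so the tracker has a stable anchor, and the balls inherit simulated motion relative to it.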

The audio core of the AR prototype allows viewers to immerse themselves in different environmental soundscapes, boosting relaxation and enhancing the experience. The audio code is built from a play selector linked to three predetermined audio tracks, an additional event node that triggers the sound once the object is placed, and a speaker module that plays through the selected output device (phone speaker or earphones).
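The selector-plus-trigger structure can be sketched as follows. The track names and the `SoundscapePlayer` class are hypothetical stand-ins, not the actual Unreal audio nodes or the tracks used in the prototype:

```python
# Sketch of the audio flow: a selector picks one of three soundscape
# tracks, and placing the object triggers playback.
TRACKS = ["forest_ambience", "rainfall", "ocean_waves"]  # hypothetical names

class SoundscapePlayer:
    def __init__(self, tracks):
        self.tracks = tracks
        self.selected = tracks[0]
        self.now_playing = None

    def select(self, index):
        """The play selector: choose one of the predetermined tracks."""
        self.selected = self.tracks[index % len(self.tracks)]

    def on_object_placed(self):
        """Event fired once the mesh is placed; starts the selected track."""
        self.now_playing = self.selected
        return self.now_playing

player = SoundscapePlayer(TRACKS)
player.select(1)
print(player.on_object_placed())  # rainfall
```

Keeping selection and playback as separate steps mirrors the Blueprint layout: the selector node can change tracks at any time, but sound only starts when the placement event fires.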



This AR application currently works on any device running Android 7 or above. Being an early-stage program, it is not suited for stand-alone use: the application needs a constant feed from Unreal Engine, as the code is not baked into a final product. To properly deliver the project, a mobile application needs to be designed that carries all the core metadata of ARCore and the projected mesh.





References:


Tutorials:


Documentation:




© Alexandru Lazar
