Eutopia Project

Project Overview

Eutopia is a project that aims to create a digital environment that humans can perceive during their daily activities. Unlike virtual reality, which isolates the user's perception inside the virtual world, augmented reality (AR) lets users carry on their daily activities in the real world while enjoying an AR-enhanced environment. An ideal example of such a product is depicted in the PlayStation 3 video game Heavy Rain: one of its characters, Norman Jayden, can work in his small old office while perceiving that he is working in a forest.


Illustration of Norman Jayden in his AR enhanced environment


Technical Information

So far, this project covers only the AR content creation. The AR content is built with the Vuforia SDK on Unity 5.4.2f2 (Daydream technical preview).

Vuforia is a marker-based SDK. The main idea for creating the digital environment is to place markers on five sides of a room, namely: front, back, bottom, right, and left. These direction names do not necessarily correspond to where the user is facing. Because in practice the user will not always keep the markers in view, Vuforia's Extended Tracking feature is used to create a stable experience. With this feature, the device remembers where the markers are and keeps projecting the imagery even when the markers are out of focus.
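As a rough illustration, persistent Extended Tracking can be switched on from a Unity script. The sketch below assumes the Vuforia 6-era Unity API (`TrackerManager`, `ObjectTracker.PersistExtendedTracking`); the class name and callback registration are illustrative and may differ between SDK versions, so treat this as a starting point rather than a definitive implementation:

```csharp
using UnityEngine;
using Vuforia;

// Attached to any scene object; once Vuforia has started, it asks the
// object tracker to persist Extended Tracking, so the room-side imagery
// keeps its pose even when the markers leave the camera's view.
public class ExtendedTrackingSetup : MonoBehaviour
{
    void Start()
    {
        VuforiaBehaviour.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
    }

    void OnVuforiaStarted()
    {
        var tracker = TrackerManager.Instance.GetTracker<ObjectTracker>();
        if (tracker != null)
        {
            // Cache each target's pose between detections instead of
            // discarding it when the marker is lost.
            tracker.PersistExtendedTracking(true);
        }
    }
}
```

Each image target must also have Extended Tracking enabled on its own `ImageTargetBehaviour` (normally via the Inspector) for this to take effect.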

The user is expected to experience the AR content by standing in the room and wearing a head-mounted device as the AR machine. The AR machine is simulated with an Android phone inside a virtual reality box; therefore, the AR content is ported to an Android phone using the Google Cardboard API.



This project was initially created for an Entrepreneurship course project of the same name. That project was about establishing a restaurant where people could enjoy dining in custom environments rendered with AR technology. The entrepreneurship project has been officially discontinued, and so has the development of the AR technology. However, modification and improvement of this project are encouraged. This is an open-source project that can be found in this git repository:


Problems, Evaluations, and Future Development

To fully manipulate a human's perception of the environment, several vision-related factors need to be considered, for example, the colour and shadows of objects in accordance with the lighting and distance, and the interactivity between the human and the objects. These are still very limited in this project because the AR content is made with minimal recognition of the environment.

The Extended Tracking feature is not yet perfect. Some objects appear shaky, and the AR content resets all object positions after a certain time. This may be fixable by customising the Unity scripts.
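One customisation worth trying is replacing Vuforia's default trackable event handler so that content is never hidden or reset when tracking is lost. The sketch below follows the Vuforia 6-era `ITrackableEventHandler` pattern; the class name `KeepVisibleTrackableHandler` is hypothetical, and the exact `Status` values may vary by SDK version:

```csharp
using UnityEngine;
using Vuforia;

// A sketch of a custom trackable event handler (standing in for Vuforia's
// DefaultTrackableEventHandler) attached to an image target. On tracking
// loss it deliberately does nothing, so objects keep their last known pose
// instead of disappearing or snapping back.
public class KeepVisibleTrackableHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour != null)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED
                  || newStatus == TrackableBehaviour.Status.TRACKED
                  || newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        if (found)
        {
            SetVisible(true);   // marker (re)acquired: show the content
        }
        // On tracking loss: no SetVisible(false) call, so the imagery
        // stays where it was last seen.
    }

    void SetVisible(bool visible)
    {
        foreach (var r in GetComponentsInChildren<Renderer>(true))
            r.enabled = visible;
    }
}
```

This does not remove the underlying pose jitter, but it avoids the visible "reset" when the tracker briefly loses the markers.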

Sometimes the AR machine is unable to detect some image targets, especially when the target is not directly lit or not close enough to the camera. It may be better to redesign the image targets, or even the whole room-marking system.

Human interactivity with the AR-produced objects may be achieved using external motion-detecting devices such as the Leap Motion controller or a motion-detecting camera, like those used in some modern gaming consoles. Another alternative is porting the AR content to a head-mounted device with built-in motion sensors, such as the Microsoft HoloLens or the Oculus Rift.

About the author: Admin