
EyeQuestion and Augmented Reality

Over the last few years, progress has been made in the use of Virtual Reality in the field of consumer and sensory research. Within our own company, this has been evident in the presentation of an early version of EyeQuestion VR at Pangborn in 2015, our visit to the VR symposium last year and, of course, the launch of EyeQuestion VR. Research on this subject has also grown, with the Universities of Utrecht and Singapore both having practical applications for VR in the form of VR Dining and the Vocktail, which we have discussed in our blog.

The viability of using this technology in sensory research has been given a boost by a study at the University of Zurich, which compared a virtual food buffet with a fake food buffet in order to determine the validity of this new method. Thirty-four participants performed the same task in both the real and the virtual world: the real-world task used a Fake Food Buffet (FFB), whilst the virtual task used a Virtual Food Buffet (VFB). Participants were instructed to serve themselves a meal similar to what they would normally have during lunchtime. The results were very promising, as the amounts served from the VFB were highly correlated with the amounts served from the FFB. Their paper, Innovations in consumer research: The virtual food buffet (Ung, Menozzi, Hartmann & Siegrist, 2017), thus concluded that the VFB is a useful research method.
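To make the validation approach concrete, here is a minimal sketch of how such a correlation between the two buffets could be computed. The food items and gram amounts below are purely hypothetical illustrations, not data from the study:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical mean served amounts in grams for five food items
# (illustrative numbers only, NOT taken from the Zurich study)
ffb_grams = [120.0, 80.0, 150.0, 60.0, 200.0]
vfb_grams = [115.0, 85.0, 140.0, 70.0, 190.0]

r = pearson_r(ffb_grams, vfb_grams)
print(f"Pearson r between FFB and VFB servings: {r:.3f}")
```

A high coefficient like this is what "highly correlated" means in practice: panelists who served themselves more from the fake buffet also served themselves more from the virtual one.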


Both AR (Augmented Reality) and VR (Virtual Reality) are earning a lot of media attention and promise tremendous growth. But what exactly is the difference between VR and AR?

Virtual Reality (VR) is an artificial, computer-generated simulation or recreation of a real-life environment or situation. Users are immersed by stimulating their vision and hearing, making them feel as if they are experiencing the simulated reality firsthand. This is typically achieved with VR goggles, which shut out both the sights and sounds of the real world.


Augmented Reality (AR) enhances existing reality by placing computer-generated objects in the real world. Users can interact with these objects through apps; AR is often used in combination with mobile devices, as the on-board camera provides an ideal viewpoint for blending digital components into the real world.

Internally, we have also been expanding EyeQuestion VR to include AR (Augmented Reality). We hope to make full XR (Cross Reality) available with the EyeQuestion Suite in the future. XR combines digital and biological realities in the form of a technology-mediated experience. Both hardware (for example VR goggles) and software (for example EyeQuestion) are needed to enable content creation for VR, AR and MR (Mixed Reality). With these tools, users can generate new forms of reality by placing digital objects into the physical world and vice versa.


There are different techniques for generating these new forms of reality; one of them uses a marker on which digital content is placed. A great example of marker-based AR in sensory research was demonstrated by the University of Tokyo in Japan back in 2010, which you can read more about in our VR Dining blog. With the new techniques available, and this example in mind, we started to think about what would make a good example project for Augmented Reality.
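The marker-based idea can be sketched in a few lines. Once a tracking marker's four corner points have been detected in the camera image (by whatever detection library is in use), the digital object is anchored to the marker's centre and sized by the marker's apparent size. The coordinates below are hypothetical, and this is a simplified 2D illustration of the principle rather than any particular AR library's API:

```python
def anchor_to_marker(corners):
    """Given 4 marker corners as (x, y) pixels, return (centre, scale).

    Scale is the mean edge length in pixels, a simple proxy for how
    large the overlaid digital content should be drawn.
    """
    cx = sum(x for x, _ in corners) / 4
    cy = sum(y for _, y in corners) / 4

    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    edges = [dist(corners[i], corners[(i + 1) % 4]) for i in range(4)]
    scale = sum(edges) / 4
    return (cx, cy), scale

# Hypothetical detection: a 100-pixel square marker centred at (320, 240)
centre, scale = anchor_to_marker([(270, 190), (370, 190), (370, 290), (270, 290)])
print(centre, scale)  # prints (320.0, 240.0) 100.0
```

A full AR pipeline would of course estimate the marker's 3D pose rather than just its 2D centre, but the anchoring principle is the same.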


One idea we had was to see whether AR could let panelists choose the perfect portion of a certain dish whilst being in their own natural environment.
We were able to do this using EyeQuestion and our own EQ marker, on which a 3D model is plotted that can be resized until the panelist has reached their ideal portion size.
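One detail worth spelling out is how a resized 3D model maps back to an actual portion. A hedged sketch, not the EyeQuestion implementation: for a food of roughly uniform density, volume (and hence weight) grows with the cube of the linear scale factor the panelist applies, so the chosen scale can be converted into an estimated portion weight. The reference portion below is a hypothetical example:

```python
def portion_grams(base_grams, linear_scale):
    """Estimated portion weight after scaling a reference 3D model.

    Doubling a model's linear size multiplies its volume, and so its
    estimated weight, by 2 ** 3 = 8.
    """
    return base_grams * linear_scale ** 3

# Hypothetical 250 g reference portion, enlarged 1.2x by the panelist
print(round(portion_grams(250.0, 1.2), 1))  # 432.0
```

This cube relationship is why even a modest-looking enlargement of the model corresponds to a substantially bigger portion.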

Our initial projects with virtual reality were a success, and we are currently improving EyeQuestion VR and expanding our capabilities into AR. If you are interested in doing a project with our software suite, please do not hesitate to contact us at [email protected].