EyeQuestion and Virtual Reality

When we started our company some 16 years ago and were in the process of naming our software, we wanted a name that would immediately cover the market we were entering: sensory and research. The research part was easy, as it would be covered by the word ‘question’. Inspired by Apple in the early 2000s, the first name that came up was ‘iQuestion’. We liked the concept, but it was a bit too similar to the name of one of the major software companies. To better link our new company name to the field we hoped to work in for a long time, we decided on ‘Eye’, which would cover the sensory half of our goal.

The decision to use this particular organ in our name has come back to haunt us over the last 16 years, as we were asked again and again whether our software suite supported eye tracking, or whether consumers could actually give answers with their eyes. Until now, we always had to disappoint people and say that this was not the case. But with our new Virtual Reality module, users can finally answer questions with their eyes by using EyeQuestion VR.

Some time ago we visited Utrecht University for a symposium on the use of VR in sensory science; you can read all about it in our blog. We had already been working on VR, having presented an early version during last year's development cycle, but we have since improved the VR module considerably, making it easy to use and very accessible by integrating it into the existing EyeQuestion interface.

EyeQuestion VR will engage not only the panelist’s eyes but also their ears, adding voice recognition and text-to-speech integration. Both features will make their debut in EyeQuestion VR, where we can create a truly immersive environment with a natural means of interaction. By removing the traditional user interface, we believe we can connect directly to the panelist’s senses and bypass possible sources of bias.

A demo of EyeQuestion VR will be available at our booth during the Pangborn conference, and we will keep you updated on our latest software developments via our website and newsletter. Please come and visit the EyeQuestion and QI Statistics booths in Hall B, booths 9 & 10, during the 12th Pangborn Sensory Science Symposium in Providence, Rhode Island.