With a name like EyeQuestion, it might seem that we also provide tools for eye-tracking studies. Perhaps we foresaw (pardon the pun) the importance of eye tracking when choosing this name; in any case, we now want to investigate adding it to our software, as it can provide valuable insights. Earlier this year, we performed a pilot study to test the OpenEye eye-tracking glasses.
What is eye tracking?
Eye tracking is a way to measure gaze and eye movement in response to visual stimuli. A huge amount of the information processed by the brain comes from visual stimuli. While not all of it comes from deliberate focus, it is still a highly important factor when considering information processing (Boernstein, 1955).
Due to the unique structure of the eye, at any given time, only a small fraction of the image viewed can be in focus. Eye tracking technology can find and record this focus. This way, a clear estimation of cognitive processing and attention can be made. The movement of the eye is mostly subconscious and, in most cases, difficult to control. This makes it free from most biases (Zamani, Abas & Amin, 2016).
Eye tracking can be used to create a heatmap, a visual representation of where attention is concentrated. It can identify the most noticeable object and the order in which objects are noticed, and by measuring how long an object is looked at, preference can be assessed. Furthermore, interest can be measured from pupil reaction, and cognitive processing from blinking rate. With these outcomes you can, for example, examine new label designs and new product packaging, or how well products stand out among competing brands.
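To make the heatmap idea concrete, here is a minimal sketch (not part of the OpenEye pipeline) of how fixation data could be accumulated into a heatmap grid. The fixation coordinates and durations below are made-up values purely for illustration:

```python
import numpy as np

def gaze_heatmap(fixations, width, height, bins=20):
    """Accumulate fixation durations into a 2D grid.

    fixations: list of (x, y, duration_seconds) tuples in pixel coordinates.
    Returns a bins x bins array; brighter cells mean longer total gaze time.
    """
    xs = [f[0] for f in fixations]
    ys = [f[1] for f in fixations]
    durations = [f[2] for f in fixations]
    # Weight each fixation by its duration so long looks dominate the map.
    heat, _, _ = np.histogram2d(
        ys, xs,
        bins=bins,
        range=[[0, height], [0, width]],
        weights=durations,
    )
    return heat

# Example: three hypothetical fixations on a 640x480 scene,
# two of them clustered in the top-left corner.
fixations = [(50, 40, 1.2), (60, 45, 0.8), (600, 400, 0.3)]
heat = gaze_heatmap(fixations, 640, 480)
```

In practice the grid would be smoothed and overlaid on the scene video, but the core idea is simply duration-weighted binning of gaze points.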
We performed a pilot study in which we wanted to test the OpenEye glasses. These glasses (see Image 1) have a 3D-printed base to which a camera is attached that records the right eye. The position of the pupil is detected using artificial intelligence (AI). A second camera records the view of the person wearing the glasses.
The videos can be uploaded to the cloud, from which the post-processing can start. During post-processing, you can identify the points of interest by labelling the videos. These labels are saved to a dataset on which an AI model can be trained and thereafter applied to any other video. For example, in Image 2 several points of interest are labelled.
The data can be exported into an Excel file that shows the number of seconds that a specific point was in view and in focus. This data can also be viewed in graphs (see Image 3).
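The aggregation behind such an export can be sketched in a few lines. This is an illustrative assumption about the computation, not OpenEye's actual implementation: given one label per video frame and a known frame rate, dwell time per point of interest is just a frame count divided by the frame rate. The labels and 30 fps frame rate below are hypothetical:

```python
from collections import Counter

def seconds_in_view(frame_labels, fps=30):
    """Convert frame-by-frame labels into seconds each label was in view.

    frame_labels: one label string (or None for unlabelled) per video frame.
    Returns a dict mapping each label to its total dwell time in seconds.
    """
    counts = Counter(label for label in frame_labels if label is not None)
    return {label: n / fps for label, n in counts.items()}

# 90 frames at 30 fps: 60 frames on the cereal shelf, 30 on a price tag.
labels = ["cereal shelf"] * 60 + ["price tag"] * 30
dwell = seconds_in_view(labels)
# → {'cereal shelf': 2.0, 'price tag': 1.0}
```

A table like this, one row per labelled point of interest, is exactly the shape of data that graphs and Excel exports are built from.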
During the pilot test, 16 panellists were asked to go shopping in a supermarket and find 4 products that were mentioned on a shopping list. The following products were included on the shopping list (in randomised order):
- A pandan rice of the best quality
- The healthiest breakfast cereal
- A product from a 100% vegan shelf
- An eco-friendly laundry detergent
The pilot test was a great opportunity to test the first version of the OpenEye glasses. We learned a lot and also identified what still needs to be improved. In the coming period we will continue improving and testing the eye-tracking glasses. Furthermore, we are also looking into the possibility of doing eye tracking through a webcam.
Boernstein, W. S. (1955). Classification of the human senses. The Yale Journal of Biology and Medicine, 28(3-4), 208.
Zamani, H., Abas, A., & Amin, M. K. M. (2016). Eye tracking application on emotion analysis for marketing strategy. Journal of Telecommunication, Electronic and Computer Engineering (JTEC), 8(11), 87-91.