
Video vs. Written Reviews: What’s Better?

In a world where opinions and reviews can make or break a product, understanding customer feedback is crucial. With the rise of online shopping and social media, video reviews have become increasingly popular, providing visual and auditory feedback about products.

However, the big question remains: how effective are video reviews compared to traditional written ones? In a study we conducted in December 2023, we aimed to find out whether video comments and written comments tell us different things.

Members of our BlindGetest panel were invited to take part in a Home Use Test. Participants were randomly divided into two groups: the Video Group (VG) and the Written Group (WR).

The Video Group (VG) was asked to record videos at various stages of product evaluation, using the Video Upload feature of EyeQuestion (to learn more about this feature, check out this article), while the Written Group (WR) documented their thoughts in written remarks. All participants used the EQ App to fill out the questionnaires. (Do you want to know more about this feature? Check out this article.)

Every participant received a cardboard box containing two products. Through the EQ App, participants were invited to complete two questionnaires, each on a different day, and the app notified them when each questionnaire was due. The questionnaire guided participants to evaluate the packaging of the product first and the taste afterwards. Participants were also asked to rate how much they liked the product and to answer follow-up questions about how easy the task was.

Overview of Data Analysis

The acquired videos were downloaded and transcribed so that word counts could be performed. Transcription was handled by the AI EyeQuestion Assistance Tool, which automatically uploads the videos and generates a transcription file for each one. The video transcriptions were then merged with the corresponding typed remarks, and word counts were computed for both. To compare the word counts from the two sources, a Welch two-sample t-test was conducted.
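As a rough sketch of this comparison, the snippet below counts words per response and computes Welch's t-statistic (which, unlike the standard t-test, does not assume equal variances between groups). The word counts here are purely illustrative placeholders, not the study's data:

```python
import math
from statistics import mean, variance

def word_count(text):
    """Number of whitespace-separated words in a transcript or remark."""
    return len(text.split())

def welch_t(a, b):
    """Welch's two-sample t-statistic and approximate degrees of freedom."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (va + vb) ** 2 / (va**2 / (len(a) - 1) + vb**2 / (len(b) - 1))
    return t, df

# Illustrative per-participant word counts (not the actual study data)
video_counts = [120, 95, 150, 110, 130, 98, 142, 125]
written_counts = [60, 45, 80, 55, 70, 52, 66, 48]

t, df = welch_t(video_counts, written_counts)
print(f"t = {t:.2f}, df = {df:.1f}")
```

In practice the t-statistic and its p-value would be obtained from a statistics package; the manual formula is shown here only to make the test's logic explicit.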


Participants who conveyed feedback through video used a greater number of words in their assessments of both packaging and taste than those who exclusively typed their comments. Furthermore, an examination of participant perceptions of the overall task indicated that individuals in the written feedback group reported greater comfort and lower stress than those who recorded videos.

These findings highlight important considerations for future research on how people provide feedback on products. The linguistic nuances observed in video feedback may offer richer insights into individuals’ thoughts and emotions regarding products, prompting a need for further investigation. Additionally, it’s crucial to explore the levels of comfort and stress individuals experience when giving feedback, especially with the array of feedback methods available.

It is noteworthy that factors such as panellists' age and experience with technology may contribute to the observed differences, as older people may be less accustomed to video recording and could find it more challenging. Deeper research into the factors driving these differences could help identify the most effective approaches for collecting and interpreting consumer feedback across different scenarios.