The original post covered specific eye tracking and facial recognition solutions that have sprung up in recent years. Here we’ll see how eye tracking technology can be useful for boosting conversion rates.
One of the primary things that sets iMotions apart from other human behaviour research solutions is its ability to integrate across multiple technology platforms and data collection methodologies.
In our previous post, one of the main points made was that while the ability to track where an individual's eyes are moving is very valuable, it is not enough on its own to provide the full picture.
Even when combined with facial recognition technologies that code for emotional response, eye tracking is not a perfect predictor of sales or of other behavioural outcomes of interest.
One of its most attractive features is a software platform that allows simultaneous recording from a variety of different biometric sensors.
What Does it Cover?
The iMotions platform covers eye tracking, facial expression analysis, EEG, ECG, EMG, and galvanic skin response (GSR) as primary data collection sources. It also allows for cross-platform integration with over 50 additional sensors.
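To make the cross-sensor integration concrete, here is a minimal sketch of aligning two independently sampled biometric streams on a shared clock. The column names, sampling rates and the use of pandas' merge_asof are my own illustration, not iMotions' actual export format or API.

```python
import numpy as np
import pandas as pd

# Hypothetical exports: an eye tracker at ~60 Hz and a GSR sensor at ~32 Hz.
rng = np.random.default_rng(0)
gaze = pd.DataFrame({
    "t_ms": np.arange(0, 1000, 16),              # ~60 Hz timestamps
    "gaze_x": rng.uniform(0, 1920, 63),          # horizontal gaze position (px)
})
gsr = pd.DataFrame({
    "t_ms": np.arange(0, 1000, 31),              # ~32 Hz timestamps
    "conductance_uS": rng.uniform(2, 10, 33),    # skin conductance
})

# Pair each gaze sample with the nearest GSR reading within 20 ms,
# so fixations can be analysed alongside arousal at the same moment.
merged = pd.merge_asof(gaze, gsr, on="t_ms", direction="nearest", tolerance=20)
print(merged.head())
```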
Additionally, iMotions allows for both mobile and remote use of several of these biometric sensors. In the image below, you'll see a research participant equipped with multiple biosensors performing a live, in-store test.
While the use of EEG and eye tracking are specifically mentioned, the test could just as easily include skin conductance or other biosensors.
Uses
With both mobile and remote tracking, as well as the ability to integrate easily across a range of biometric sensors, iMotions gives researchers the opportunity to perform human behaviour research across many platforms and for many purposes.
On its website, iMotions lists applications including human behaviour research, neuromarketing, psychology, human-computer interaction, medicine and health, virtual reality, neuroscience, and engineering. iMotions has headquarters in both Copenhagen, Denmark, and Boston, MA, United States, and the technology has been used in research laboratories across the world.
The usefulness of iMotions stems primarily from its ability to provide information that goes above and beyond eye movements.
One of the main themes at Conversion-Uplift is the idea that the consumer is not always aware of what they are feeling. iMotions simplifies the process of collecting data not only on where an individual is looking and when, but also from sources like facial coding, EEG and skin conductance that indicate the individual's level of arousal and emotional state.
Conclusion
In conclusion, iMotions is one of only a handful of solutions with the ability to integrate across multiple biometric sensors and give a more holistic picture of consumer decision making.
In the ever-expanding toolbox of the digital marketer, iMotions can be a useful tool for answering a variety of in-house human behaviour research questions, including conversion rate optimization.
For additional information on iMotions' eye tracking capabilities, check out the infographic below, which covers the specifics of the human eye, specs for contemporary eye tracking technologies, and tips for performing effective eye tracking research.
Eye-Tracking that Reads Emotions!
Feelings drive much of our behaviour, so emotional engagement with digital content can be a strong indicator of its effectiveness at getting our attention. This is especially critical for mobile gaming apps: unless users are engaged, they may lose interest and delete the app from their phone. Imagine, then, the power of emotion-enabled gaming apps that recognise emotional responses in real time and use those same emotions to modify gameplay dynamically, creating a more engaging and exciting user experience.
Source: Affectiva.com
This is now becoming a reality with eye-tracking solutions that use webcams and device cameras to monitor emotional engagement. These solutions use vision algorithms to identify key landmarks on the face, such as the tip of the nose or the corners of the eyes and the mouth. Machine learning algorithms then analyse pixels in those regions to classify facial expressions based upon pixel colour, texture and gradient.
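To make that pipeline concrete, here is a rough sketch in Python. The landmark coordinates, patch descriptors and training data are crude stand-ins invented for illustration; real products use proprietary detectors and far richer colour, texture and gradient features.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical landmark positions (row, col) on a 100x100 grayscale face crop.
LANDMARKS = {"nose_tip": (55, 50), "left_eye": (40, 35),
             "right_eye": (40, 65), "mouth_left": (70, 40)}

def region_features(face, half=8):
    """Mean, spread and gradient magnitude of the patch around each landmark,
    standing in for the colour/texture/gradient descriptors described above."""
    feats = []
    for r, c in LANDMARKS.values():
        patch = face[r - half:r + half, c - half:c + half].astype(float)
        gy, gx = np.gradient(patch)
        feats += [patch.mean(), patch.std(), np.hypot(gx, gy).mean()]
    return feats

# Train on labelled face crops (random stand-in data), then classify a frame.
rng = np.random.default_rng(1)
faces = rng.integers(0, 256, size=(200, 100, 100))
labels = rng.choice(["smile", "neutral"], size=200)
clf = RandomForestClassifier(n_estimators=50).fit(
    [region_features(f) for f in faces], labels)
print(clf.predict([region_features(faces[0])]))
```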
Combinations
Combinations of facial expressions are then mapped to emotions, and the data is processed in real time through SDKs. This data can usually be accessed via APIs or visualised in a dashboard. Affectiva's SDK, for instance, integrates into user interfaces to let developers emotion-enable their apps and digital experiences so they adapt to people's emotions in real time.
Source: Affectiva.com
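As an illustration of what "adapting to emotions in real time" might look like from the developer's side, here is a hypothetical callback sketch. The EmotionFrame class and the adaptation rule are invented for this example; they do not reflect Affectiva's actual SDK classes or method names.

```python
from dataclasses import dataclass

@dataclass
class EmotionFrame:
    timestamp_ms: int
    joy: float          # 0..1 confidence scores per detected emotion
    frustration: float

def on_emotion(frame: EmotionFrame) -> None:
    # Toy adaptation rule: ease the game when frustration spikes,
    # ramp it up when the player is clearly enjoying themselves.
    if frame.frustration > 0.7:
        print(f"[{frame.timestamp_ms} ms] lowering difficulty")
    elif frame.joy > 0.8:
        print(f"[{frame.timestamp_ms} ms] raising difficulty")

# A real SDK would invoke the callback once per analysed camera frame;
# here we simulate two frames.
on_emotion(EmotionFrame(33, joy=0.10, frustration=0.85))
on_emotion(EmotionFrame(66, joy=0.90, frustration=0.05))
```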
This creates new opportunities to optimize digital content in a more sensitive and engaging way. Apart from game design, the technology is also especially valuable for usability and market testing, video and ad evaluation, and website design optimization in general.
Interestingly, though, research by Nielsen indicates that facial coding used in isolation is a poor predictor of sales; EEG research that tracks brain waves is a more powerful indicator. This is why it's important to use emotional engagement measures alongside eye gaze and other research methods, as together they give a better understanding of the impact of visual content.
An Advantage
One advantage of eye-tracking, though, is that the eye doesn't lie. We can't control what we initially look at when we see an image. Indeed, research by eye-tracking company Sticky.ad has discovered a gap between what people say they were looking at when they first glance at an image and where their gaze is actually focused. Self-reporting is an unreliable method of identifying what people look at, partly because our conscious brain filters out much of the information that our visual cortex processes.
Psychologists have also discovered that we don’t have full access to what motivates a lot of our behaviour. Our attention is largely determined by our subconscious brain as it scans for solutions to implicit or psychological goals.
Source: Sticky.ad
At the same time, usability testing and remote Voice of Customer solutions can add to our understanding of how people interact with a user interface and provide insights into the issues they encounter. Each method of research offers a different perspective on how people respond to visual content.
Is There A Correlation Between Eye Gaze and Mouse Movements?
Some user experience solutions claim there is a high correlation between mouse movement and eye gaze, but the evidence does not support this. The highest published correlation is 69%, research by Google (see page 29) indicates it is more in the region of 42%, and according to an anecdotal study it is only 32%. SimpleUsability claims it may be even lower, if not non-existent, once visitors begin to scroll. This is because people tend to scroll and scan content with their mouse either static (because they use the scroll wheel) or following the right-hand scroll bar. You can almost guarantee that users won't be looking at where their mouse is when they scroll down a page.
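For readers wondering what a figure like "42% correlation" means in practice, here is a toy calculation of Pearson's r between gaze and mouse positions sampled at matching timestamps. The data is simulated; real studies compute this from recorded sessions.

```python
import numpy as np

rng = np.random.default_rng(2)
gaze_y = rng.uniform(0, 1080, 500)                  # where the eye looked (px)
mouse_y = 0.4 * gaze_y + rng.normal(0, 300, 500)    # loosely coupled cursor (px)

r = np.corrcoef(gaze_y, mouse_y)[0, 1]
print(f"gaze/mouse correlation: r = {r:.2f}")       # weak-to-moderate, as above
```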
Many eye tracking studies have shown that users tend to scan web pages in an "F"-shaped pattern. This is partly because, in the West at least, we read from left to right. As a result we mainly focus on the top left of the screen and scan across horizontally. Our eyes then scan down the page, again mainly down the left-hand side, to find what we are looking for. Of course, this process can be disrupted by content that grabs our attention or design elements that separate content. Web pages that appear to end at the bottom of the screen, for instance, won't encourage further exploration, so engagement often falls dramatically around the fold.
Google Golden Triangle & Beyond:
In 2005, an eye tracking study by Google and the marketing services companies Enquiro and Didit.com found that users tend to view Google search results in a triangular pattern: the majority of a user's eye gaze is focused in a triangle at the top of the search results page. This became known as the "golden triangle".
Source: Mediative
However, Google conducted a further eye tracking study in 2015, published by Mediative, which indicated that more visitors are moving outside the "golden triangle" and that the rise of the mobile user is also changing browsing behaviour.
Source: Mediative
The latest Google study shows:
Mobile users scan vertically more than horizontally.
Users are viewing more search results within a single session but spending less time browsing each one (on average 1.17 seconds, compared with 2 seconds in 2005).
Results that are lower down the SERP (particularly positions 2 to 4) are receiving more clicks than they did a few years ago.
What this tells us is that we can't rely on generic eye tracking studies to tell us where visitor gaze is concentrated. We need to undertake our own studies where we can; otherwise we will be making decisions based upon assumptions that may well not be valid.
Observer Effect:
The Hawthorne effect refers to the fact that people often change their behaviour when they are aware they are being observed. This is why TV viewing research panels ignore what new participants watch in the first few weeks after joining the study: viewing habits have been shown to change immediately after someone becomes involved in such research. It is therefore important to allow users to acclimatize to the study and, if possible, use equipment that is not too intrusive.
Another way of reducing potential bias is not to disclose which website or page you are interested in, and to let the user start the study on another website. You can then guide the user to the relevant website through a search result or link and allow them to browse the page before moving on to another.
1. Affectiva:
A cloud-based eye tracking and emotional engagement research solution that uses a standard webcam or device camera. Drawing on the world's largest emotion database, with more than 4 million faces across 75 countries, Affectiva has established norms to benchmark ad effectiveness by geography and product category. Unilever, for instance, uses Affectiva to measure emotion analytics for over 600 products in 70 countries. The solution also integrates with voice of customer tools and works with any panel provider.
The Affdex SDK allows developers to create interactive, emotion-aware apps and digital experiences that adapt to the emotional response of a user. The SDK monitors and reports on emotional responses from facial expressions by analysing the user's face via the device camera, a video or even a single image in real time.
Pricing: Quotations are available on request.
2. EyeSee:
An online eye-tracking and facial coding platform that uses the respondent's own webcam to deliver fast, cost-effective and global eye tracking insights. Just before and during the eye tracking and facial coding research, the platform monitors environmental factors and behaviour (e.g. head pose and head movement). If necessary, it recalibrates to ensure the accuracy of the research.
The platform is able to display a range of stimuli, from static images and video to live websites. It can also make static images scrollable and zoomable by clicking on certain areas. For live websites the solution can activate or deactivate links and create overlays.
The facial coding functionality uses an algorithm to process webcam images and automatically identify 7 emotional states: Happiness, Surprise, Confusion, Disgust, Fear, Sadness and Neutral. The output from the algorithm is double-checked by humans to validate its accuracy.
Pricing: No prices on the website. Please contact EyeSee for a quote.
This is the World’s first cloud-based self-serve biometric eye-tracking and facial coding solution. Sticky utilises the user’s webcam and as a result the solution brings the cost of eye tracking and emotional engagement research within the reach of most medium to large companies. Further, results can be obtained within hours rather than weeks that it might take for a traditional eye tracking study.
The webcam eye-tracking software uses multiple tracking algorithms to identify the face, key features, eyes, iris and movement in 3D. This generates highly accurate eye tracking data and can be used to optimise all forms of digital content, including images, ads, websites, videos and emails. Sticky's Eye Portal allows users to identify areas of interest (AOIs) and produces data on three key metrics (illustrated in the sketch after the list below).
The Key Metrics
The Seen Metric measures the percentage of the audience that saw the AOI.
Stickiness measures how long, on average, viewers spent on it.
Salience shows the average time it took a viewer to first see the AOI.
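Here is a toy calculation of the three metrics from per-participant fixation logs. The log format is invented, and the formulas simply follow the plain-English definitions above rather than Sticky's internal implementation.

```python
# Each fixation: (participant, start_ms, duration_ms, hit_aoi)
fixations = [
    ("p1", 200, 300, True), ("p1", 600, 250, True),
    ("p2", 150, 400, False), ("p2", 900, 500, True),
    ("p3", 100, 350, False),
]
participants = {p for p, *_ in fixations}
hits = [f for f in fixations if f[3]]
viewers = {f[0] for f in hits}

seen = len(viewers) / len(participants)                # % of audience who saw the AOI
stickiness = sum(f[2] for f in hits) / len(viewers)    # avg dwell per viewer (ms)
salience = sum(min(f[1] for f in hits if f[0] == p)    # avg time to first fixation (ms)
               for p in viewers) / len(viewers)
print(f"seen={seen:.0%}, stickiness={stickiness:.0f} ms, salience={salience:.0f} ms")
```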
The solution is used for a range of purposes by Sticky's clients, including website design optimization, ad placement valuation, article layout readability and more. Sticky also integrates with online surveys and includes a feature to embed survey questions within the portal.
Pricing: The cost depends on the plan you sign up for:
Each credit represents an image or video used in an experiment. Facial coding requires an additional credit per 30 seconds of video. Using the Sticky panel is extra, starting from USD $5 per participant. Using your own panel is currently free.
4. Eyegaze System:
The Eyegaze System uses the pupil-centre/corneal-reflection method to track where the user is looking on the screen. An infrared-sensitive video camera, mounted beneath the monitor, takes 60 pictures per second of the user's eye.
An LED mounted in the centre of the camera's lens illuminates the eye. It reflects a small amount of light off the surface of the user's cornea, while light passing through the pupil reflects off the retina, causing the pupil to appear white. This enhances the camera's image of the pupil and makes it easier for the image processing functions to locate its centre. The computer then calculates where the user is looking on the screen from the relative positions of the pupil centre and the corneal reflection within the video image of the eye. This provides an average accuracy of a quarter of an inch or better.
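A minimal sketch of the underlying geometry, assuming a simple linear calibration that maps the pupil-glint offset to screen coordinates. Real systems fit richer models; the numbers below are toy data.

```python
import numpy as np

# Calibration: known on-screen targets and the pupil-glint vectors measured
# while the user looked at each one.
screen_pts = np.array([[100, 100], [960, 100], [1820, 100],
                       [100, 980], [960, 980], [1820, 980]], float)
pg_vectors = np.array([[-3.1, -2.0], [0.0, -2.1], [3.0, -1.9],
                       [-3.0, 1.9], [0.1, 2.0], [3.1, 2.1]], float)

# Fit screen = [vx, vy, 1] @ A by least squares (one column per screen axis).
X = np.hstack([pg_vectors, np.ones((len(pg_vectors), 1))])
A, *_ = np.linalg.lstsq(X, screen_pts, rcond=None)

# At runtime, each new pupil-glint vector maps to an estimated gaze point.
print(np.array([1.5, 0.0, 1.0]) @ A)   # approx. mid-right of the screen
```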
Before operating the application, the system must go through a calibration process to learn several physiological properties of the user's eye so that it can accurately determine where they are gazing on the screen. Calibration takes about 15 seconds and does not need to be repeated if the user leaves the testing environment and returns later.
This desktop eye-tracking solution uses a camera mounted beneath the screen to calculate where the user is looking. It is ideal for UX researchers and developers conducting user testing to understand where on a page attention is focussed and how people navigate content. The tool offers eye gaze recording, think-aloud voice and webcam recording, heat maps, dynamic areas of interest, mouse clicks, pupil diameter data and detection of where users stop scrolling.
A non-intrusive, portable eye-tracking system that sits just below the computer screen. It calibrates almost instantly and runs on Windows OS. It looks like an Xbox Kinect and can be used wherever you need to conduct user testing.
This provider offers a comprehensive package of biometric and eye-tracking research hardware (including remote and mobile kits), software solutions and training. It uses remote eye-tracking hardware and sells eye tracking glasses, biometric sensors and PC specs.
Algorithm-based tools that predict user focus and attention:
EyeQuant uses algorithms and machine learning to predict how visitors will see and perceive web pages. The software generates instant heatmaps covering attention, hot spots, engaged visitor analysis, perception maps, regions of interest, visual clarity and new visitor analysis. You can either enter a URL or upload an image of a new design.
Analysis is based upon eye tracking research, user studies and online experiments to understand how people perceive web designs. Statistical analysis is employed to identify correlations between specific design characteristics and user behaviours.
Design characteristics that are identified as influencing user behaviour are then used as variables in predictive modelling. Machine learning is employed to determine the optimal weight and combination of these characteristics. Predictions are regularly benchmarked against eye tracking and other user studies to measure the accuracy of the modelling.
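As a rough illustration of that modelling step: design characteristics become features, observed attention the target, and a fitted model supplies the weights. The feature names and data below are invented for the example.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
# Columns: contrast, element size, distance from centre, contains a face (all scaled 0..1)
X = rng.uniform(0, 1, size=(300, 4))
true_w = np.array([0.5, 0.3, -0.4, 0.6])          # hidden "ground truth" weights
y = X @ true_w + rng.normal(0, 0.05, 300)         # share of users who fixated

model = Ridge(alpha=1.0).fit(X, y)
print(dict(zip(["contrast", "size", "centre_dist", "has_face"],
               model.coef_.round(2))))
```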
AttentionWizard from SiteTuners has partnered with Feng GUI to provide an evaluation of the visual effectiveness of landing pages by predicting what a person would be most likely to look at.
Usability research is essential for checking whether a site or app is intuitive and easy to navigate, which is central to creating a great customer experience. It helps inform our decisions about the choice architecture. Remote usability research solutions and face-to-face user interviews identify the main usability problems. But do these methods of research reflect real behaviour?
How many usability research proposals acknowledge that the process of undertaking usability research can influence the behaviour we observe? We may have taken users out of their natural environment and set them objectives that lead them to behave in a certain way.
Behavioural scientists have found that many of our decisions are made automatically by our unconscious brain. The context and our underlying psychological goals heavily influence the choices we make. We also behave differently when we are aware that we are being observed.
Asking respondents direct questions is especially problematic, as people over-think issues and switch to their slow, rational brain when encountering a mentally demanding task. Unfortunately, most of the time when we are browsing a website we rely on our fast, intuitive, unconscious brain to make decisions without really engaging our conscious thought process. The implication is that we cannot even access the rationale behind much of our own behaviour when interacting with a website.
As Daniel Kahneman puts it in Thinking, Fast and Slow: "People don't have reliable insight into their mental processes, so there is no point asking them what they want."
Avoid taking people away from their natural environment if at all possible. Certainly don't use focus groups, as this is about as far away from normal browsing behaviour as you can get. How often do you search the web with a group of people you have never met and discuss your likes and dislikes of a site?
This is why remote user testing methods have an advantage over some face-to-face methods. Participants can be in their normal environment, with their normal distractions, so their behaviour is less likely to be influenced by the testing process. Don't get me wrong, there will still be some bias as a result of the testing method, but it may be substantially less than with techniques that take the user out of their normal browsing environment.
Observe and listen rather than ask:
You will get more meaningful insights from simply observing and listening to your users during a usability test as past behaviour is a more reliable indicator of future behaviour. Try to avoid verbal interventions as much as possible. People don’t like to admit when they do something wrong and you are likely to influence how they then behave in any future tasks. If you do want some verbal feedback, just ask your testers to say what they are doing as they go through the task.
But always keep in the back of your mind that usability testing is about informing your judgement, not proving or disproving someone's opinions. It is also an iterative process that should begin early in the development of a design.
Most of our daily choices are made by our fast, intuitive brain, which means we don't take time to rationalise why we are making those decisions. New implicit research techniques such as functional MRI, EEG, biometrics, eye tracking, facial coding and implicit reaction time studies (IRTs) are allowing marketers to access the subconscious part of the brain to better understand how we respond to communications and designs.
Eye tracking research helps identify not only which specific elements of a page or message attract our attention, but also the communication hierarchy of messages. Heatmaps allow us to display this data to reveal the proportion of visitors who noticed each of the key elements on a page, plus the frequency and duration of gaze on each element.
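A minimal sketch of how such a heatmap is built: fixation points are binned into a 2D histogram, optionally weighted by dwell time. The fixations below are simulated.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
x = rng.normal(640, 200, 1000).clip(0, 1280)        # fixation x (px)
y = rng.normal(300, 150, 1000).clip(0, 800)         # fixation y (px)
dwell = rng.exponential(250, 1000)                  # fixation duration (ms)

heat, _, _ = np.histogram2d(x, y, bins=[64, 40],
                            range=[[0, 1280], [0, 800]], weights=dwell)
plt.imshow(heat.T, origin="upper", extent=[0, 1280, 800, 0], cmap="hot")
plt.colorbar(label="dwell time (ms)")
plt.title("Gaze heatmap (simulated fixations)")
plt.show()
```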
Click and mouse movement heatmaps from visual analytics solutions such as Hotjar and Decibel Insight can provide similar insights for existing pages. For true eye tracking research, though, solutions from Affectiva and Sticky allow you to evaluate both new and existing web page designs.
In the final analysis, the only way you will know whether a change identified through usability research improved your agreed success metrics is to conduct an online experiment in the form of an A/B test. It is only when visitors are acting on their own impulses and with their own money that you will see how they really behave.
Prioritise the insights you get from usability testing to decide which are worthy of A/B testing. A/B testing will give you the evidence to show exactly how much difference your usability insights have made to your conversion success metrics.
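When you run that experiment, a standard way to check whether an uplift is real is a two-proportion z-test on the conversion counts. The figures below are illustrative.

```python
from statsmodels.stats.proportion import proportions_ztest

conversions = [230, 275]                 # converted visitors in A and B
visitors = [5000, 5000]                  # traffic per variant

z, p = proportions_ztest(conversions, visitors)
print(f"z = {z:.2f}, p = {p:.4f}")       # p < 0.05 suggests the uplift isn't chance
```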