Papoutsaki, Laskey, Huang Advance Eye Tracking Through Webcam-Based, In-Browser Democratization
- Posted by Jesse Polhemus
- on May 26, 2016

WebGazer.js has already received significant attention on Reddit and Hacker News.
In an age where high levels of responsiveness, interactivity, and customization are nearly ubiquitous, eye tracking (using a device to measure eye position and movement) has emerged as a powerful tool. It can benefit areas of study ranging from user experience design and cognitive science to assistance for the impaired and the optimization of search engine results, but until recently it has been hampered by the cost of highly specialized equipment, integration difficulties, and other practical barriers.
Today, researchers from Brown University’s Department of Computer Science (Brown CS) and their collaborators have addressed these challenges with a new eye tracking library. Alexandra Papoutsaki (Brown CS PhD candidate), James Laskey (Brown CS undergraduate student), Jeff Huang (Brown CS Assistant Professor), Patsorn Sangkloy (Georgia Institute of Technology), Nediyana Daskalova (Brown CS PhD candidate), and James Hays (Georgia Institute of Technology, former Brown CS Associate Professor) have just released WebGazer.js, which democratizes eye tracking by using common webcams to infer the eye-gaze locations of web visitors on a page in real time.
Their website, http://webgazer.cs.brown.edu, features a video, demos, source code, and documentation, and their research ("WebGazer: Scalable Eye Tracking Using User Interactions") is due to be published at the International Joint Conference on Artificial Intelligence (IJCAI) in July.
WebGazer.js offers several innovations, including self-calibration from a user’s clicks and cursor movements, easy integration with any website (the API requires only a few lines of JavaScript), browser-side implementation with no need for video data to be sent to a server, real-time gaze prediction on most major browsers, swappable components for eye detection, and multiple gaze prediction models. As a security measure, WebGazer.js runs only if the user consents to giving access to their webcam.
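To give a sense of how small that integration is, here is a minimal sketch of a page-side setup, assuming webgazer.js has already been loaded via a script tag. The setGazeListener and begin calls follow the usage documented on the project site; the setTracker and setRegression calls that select the swappable components, and the module names passed to them, are assumptions that may differ between releases.

```javascript
// Minimal sketch, assuming webgazer.js has already been included on the page.
// setGazeListener() and begin() follow the project's documented usage; the
// tracker/regression module names below are assumptions and may vary by release.
webgazer
  .setTracker('clmtrackr')      // swappable eye-detection component (assumed module name)
  .setRegression('ridge')       // swappable gaze-prediction model (assumed module name)
  .setGazeListener(function (data, elapsedTime) {
    if (data == null) return;   // no prediction yet (e.g., face not detected)
    console.log('predicted gaze:', data.x, data.y, 'after', elapsedTime, 'ms');
  })
  .begin();                     // asks for webcam consent, then predicts entirely in the browser
```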
“With our few lines of code,” says Jeff, “a website can figure out where a visitor is looking. Imagine that the New York Times, with your permission, could learn what articles you read on a page, how long you read them, and in what order. This enables easier usability studies, new video games, help for people with motor control problems, and many other improvements in how all of us use computers and smart devices worldwide. There’s no extra software to install, it works with your webcam, and by watching how you interact with a site, it trains a machine learning predictive model that gets better the more you use it. Compare this to a $28,000 infrared eye tracker! With only small sacrifices in accuracy, this is scalable, faster, cheaper eye tracking. It’s democratic, and we’d like to see it used by anyone who has a webcam.”
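As a hypothetical illustration of the reading-behavior measurement Jeff describes, a page could bucket gaze predictions by the article element under the predicted point and accumulate dwell time. The data-article-id markup below is an invented convention for this sketch, and document.elementFromPoint and Element.closest are standard DOM APIs rather than part of WebGazer.js.

```javascript
// Hypothetical sketch: accumulate per-article gaze dwell time from WebGazer.js
// predictions. The [data-article-id] attribute is an invented convention for this
// example; elementFromPoint() and closest() are standard DOM APIs, not WebGazer.js.
var dwellMs = {};    // article id -> total milliseconds the gaze rested on it
var lastTime = null;

webgazer.setGazeListener(function (data, elapsedTime) {
  if (data == null) return;
  var delta = (lastTime === null) ? 0 : elapsedTime - lastTime;
  lastTime = elapsedTime;

  // Find the article (if any) under the predicted gaze point; this assumes the
  // prediction is in viewport coordinates; adjust for scroll if it is not.
  var el = document.elementFromPoint(data.x, data.y);
  var article = el && el.closest('[data-article-id]');
  if (article) {
    var id = article.getAttribute('data-article-id');
    dwellMs[id] = (dwellMs[id] || 0) + delta;
  }
}).begin();
```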
For more information, contact Brown CS Communication Outreach Specialist Jesse C. Polhemus.