Research Projects

Interest detection from eye gaze

Information extracted from our eye gaze can indicate whether we are interested in particular objects we look at. In this project, we modeled users' interest in particular photos while they browsed a photo collection.
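
A minimal sketch of one way such interest modeling could work, assuming fixations arrive as (x, y, duration) tuples and each photo occupies a known rectangle on screen; the names and the dwell-time heuristic are illustrative assumptions, not the project's published model:

# Hypothetical sketch: accumulate gaze dwell time per photo as a
# simple interest signal, then normalize into scores.
from collections import defaultdict

def interest_scores(fixations, photo_regions):
    """fixations: list of (x, y, duration_ms);
    photo_regions: photo_id -> (x0, y0, x1, y1) screen rectangle."""
    dwell = defaultdict(float)
    for x, y, duration in fixations:
        for photo_id, (x0, y0, x1, y1) in photo_regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[photo_id] += duration
    total = sum(dwell.values()) or 1.0
    return {pid: t / total for pid, t in dwell.items()}  # normalized interest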

GazeNote

Head-mounted displays do not provide an easy way to point things out in an image. We instrumented Google Glass with an eye tracker and used the gaze position to add annotations to particular parts of a captured image.
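
As a rough illustration of the core step, the sketch below maps a gaze point into the pixel space of the captured image and attaches a note there; the coordinate conventions (normalized gaze, fixed capture resolution) and function names are assumptions:

# Hypothetical sketch: anchor an annotation at the gaze position
# inside a captured image by rescaling display coordinates.
def annotate_at_gaze(gaze_xy, display_size, image_size, text):
    gx, gy = gaze_xy
    dw, dh = display_size
    iw, ih = image_size
    px = int(gx / dw * iw)  # scale x into image space
    py = int(gy / dh * ih)  # scale y into image space
    return {"x": px, "y": py, "note": text}

annotation = annotate_at_gaze((0.62, 0.40), (1.0, 1.0), (1280, 720), "check this")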

Keeping track of search results over time

How can we support search activities over extended time periods? We have explored various visualization methods for keeping track of already-seen search results, such as the query preview widget in Querium and SearchPanel.
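
The basic bookkeeping behind such a preview might look like the following sketch, which flags results already encountered in earlier queries; the data shapes and the use of URLs as identifiers are assumptions:

# Hypothetical sketch: split each new result list into unseen and
# previously seen items, so a widget can render them differently.
seen = set()

def preview(results):
    """results: list of dicts with a 'url' key."""
    new = [r for r in results if r["url"] not in seen]
    old = [r for r in results if r["url"] in seen]
    seen.update(r["url"] for r in results)
    return new, old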

myUnity

To better support workplace communication, we designed myUnity, a system that provides presence information in a workplace. myUnity used multiple presence sensors to detect various presence states, such as "in office," "in a call," and "at computer at home."
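
One simple way to fuse such sensors is a rule-based precedence order, sketched below; the sensor names and the specific rules are illustrative assumptions, not the published myUnity design:

# Hypothetical sketch: combine boolean sensor readings into a
# single presence state, most specific state first.
def presence_state(sensors):
    """sensors: dict of boolean readings from independent detectors."""
    if sensors.get("on_call"):
        return "in a call"
    if sensors.get("office_motion") and sensors.get("office_keyboard"):
        return "in office"
    if sensors.get("home_keyboard"):
        return "at computer at home"
    return "away"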

Assisted visual search

Visual search is an error-prone task, particularly in cluttered images such as satellite images or medical images. This project explored whether eye-tracking data can be used to direct the user's attention toward uninspected areas of an image.
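
A minimal sketch of the underlying coverage computation, assuming fixations arrive as (x, y) pixel coordinates and the image is divided into a fixed grid; the grid size and names are assumptions:

# Hypothetical sketch: return the grid cells no fixation has covered,
# so the interface can highlight them for the user.
def uninspected_cells(fixations, image_size, grid=(8, 8)):
    iw, ih = image_size
    gx, gy = grid
    covered = set()
    for x, y in fixations:
        covered.add((min(int(x / iw * gx), gx - 1),
                     min(int(y / ih * gy), gy - 1)))
    return [(cx, cy) for cx in range(gx) for cy in range(gy)
            if (cx, cy) not in covered]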

Collaborative search

The focus of this project was to design user interfaces that support search tasks where multiple users collaborate around a shared search need. We designed and evaluated a video search system in which two searchers took on different roles in order to cover as much of the search space as possible.

DICE

When focused on communicating ideas and meeting new people, users are often confronted by conference rooms with a bewildering tangle of cables, remote controls, and switches. In the DICE project, we designed an easy-to-use conference room.
