EXCITE

EXCITE is a tool that eases the burden of video and tracking-data analysis in multi-surface environments. It combines proxemic information (such as the distance and orientation between people and devices, captured with the Proximity Toolkit) with synchronised video data and event data from the applications under study, and lets video analysts query this combined data stream as a whole. Query matches are annotated on a timeline, so the analyst can scrub through the synchronised video capture feeds to validate them and further annotate study sessions. This allows the analyst to compare the incidence of different events against one another, and to inspect each incident in captured video of behaviour from the study sessions.
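
To illustrate, below is a minimal sketch in Python of the kind of query an analyst might express. The names, thresholds, and event labels are illustrative assumptions, not EXCITE's actual syntax; the sketch matches moments where a person is within one metre of a display and roughly facing it while a particular application event fires.

    from dataclasses import dataclass

    @dataclass
    class Frame:
        """One synchronised sample of tracking and application event data."""
        time: float          # seconds into the session
        distance: float      # metres between person and display (hypothetical field)
        orientation: float   # degrees between person's facing and the display
        app_event: str       # application event logged at this frame, if any

    def matches(frame: Frame) -> bool:
        # Proxemic criteria: close to, and roughly facing, the display...
        near_and_facing = frame.distance < 1.0 and frame.orientation < 30.0
        # ...combined with an application-level event of interest.
        return near_and_facing and frame.app_event == "item_selected"

    def query(session: list[Frame]) -> list[float]:
        # Evaluating the query over a session yields the timestamps the
        # analyst can jump to in the synchronised video feeds.
        return [f.time for f in session if matches(f)]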

Abstract

A central issue in designing collaborative multi-surface environments is evaluating the interaction techniques, tools, and applications that we design. We often analyze data from studies using inductive video analysis, but the volume of data makes this a time-consuming process. We designed EXCITE, which gives analysts the ability to analyze studies by quickly querying aspects of people’s interactions with applications and devices around them using a declarative programmatic syntax. These queries provide simple, immediate visual access to matching incidents in the interaction stream, video data, and motion-capture data. The query language filters the volume of data that needs to be reviewed based on criteria such as application events and proxemic events, such as distance or orientation between people and devices. This general approach allows analysts to provisionally develop theories about the use of multi-surface environments, and to evaluate them rapidly through video-based evidence.
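
The abstract notes that query matches surface as incidents that the analyst reviews on a timeline. One plausible grouping step, sketched below as our assumption rather than EXCITE's implementation, is to merge per-frame matches whose timestamps fall close together, so each incident appears once on the annotation timeline instead of as thousands of individual samples.

    def to_intervals(times: list[float], gap: float = 0.5) -> list[tuple[float, float]]:
        """Merge matching timestamps separated by less than `gap` seconds
        into (start, end) intervals suitable for timeline annotation."""
        intervals: list[tuple[float, float]] = []
        for t in sorted(times):
            if intervals and t - intervals[-1][1] <= gap:
                intervals[-1] = (intervals[-1][0], t)   # extend the current incident
            else:
                intervals.append((t, t))                # start a new incident
        return intervals

    # e.g. to_intervals([3.0, 3.1, 3.2, 10.0]) -> [(3.0, 3.2), (10.0, 10.0)]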

Publication

Marquardt, N., Schardong, F., Tang, A. (2015)
EXCITE: EXploring Collaborative Interaction in Tracked Environments.
In INTERACT 2015. Springer.

Downloads