EXCITE

Toolkits.EXCITE History

Changed line 13 from:
Marquardt, N., Schardong, F., Tang, A.\\
to:
Marquardt, N., Schardong, F., Tang, A. (2015)\\
Changed line 15 from:
''In INTERACT 2015.''
to:
''In INTERACT 2015. Springer.''
Changed line 19 from:
* '''[[Attach:EXCITE_V1.0.zip | EXCITE (Version 1.0)]]''' (will be available at INTERACT in September 2015)
to:
* '''[[Attach:EXCITE_V1.0.zip | EXCITE (Version 1.0)]]''' (June 2, 2015)
Added line 20:
* GitHub repository: [[https://github.com/fredericoschardong/excite]]
Changed line 19 from:
* '''[[Attach:EXCITE_V1.0.zip | EXCITE (Version 1.0)]]''' (2015-06-02)
to:
* '''[[Attach:EXCITE_V1.0.zip | EXCITE (Version 1.0)]]''' (will be available at INTERACT in September 2015)
Changed line 11 from:
!!ProjectorKit Publication
to:
!!Publication
Added lines 1-20:
%define=box padding-left=1em padding-right=1em margin='3px 3px 0'%
%define=greenbox box bgcolor=#e6f3e5 border='1px solid #8fd586'%

EXCITE is a tool that can ease the burden of video and tracking-data analysis for multi-surface environments. EXCITE leverages proxemic information (such as people's and devices' distance and orientation, captured with the Proximity Toolkit) alongside video data and event data from the applications in the environment, and lets video analysts generate queries over the data as a whole. These queries are annotated on a timeline, allowing the analyst to scrub through synchronised video capture feeds to validate and further annotate study sessions. Our tool lets the analyst compare the incidence of various events with one another, and see these incidents in video captured during study sessions.
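
As a rough illustration of the timeline idea, the Python sketch below maps one annotated moment onto frame numbers in several synchronised video feeds. The feed names, start offsets, and 30 fps rate are made-up assumptions, not EXCITE's actual implementation.

[@
# Hypothetical sketch: map an annotated event time onto synchronised
# video feeds so the analyst can jump straight to the matching frames.
# Feed names, start offsets, and the 30 fps rate are illustrative only.
FEEDS = {
    "overhead_cam": 0.0,    # seconds each feed started after the session clock
    "side_cam": 1.2,
    "screen_capture": 0.4,
}
FPS = 30.0

def frames_at(annotation_time: float) -> dict:
    """Frame index in every feed for one annotated moment on the timeline."""
    return {name: round((annotation_time - offset) * FPS)
            for name, offset in FEEDS.items()}

print(frames_at(12.5))  # {'overhead_cam': 375, 'side_cam': 339, 'screen_capture': 363}
@]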

Attach:excite.png

!!Abstract
A central issue in designing collaborative multi-surface environments is evaluating the interaction techniques, tools, and applications that we design. We often analyze data from studies using inductive video analysis, but the volume of data makes this a time-consuming process. We designed EXCITE, which gives analysts the ability to analyze studies by quickly querying aspects of people's interactions with applications and devices around them using a declarative programmatic syntax. These queries provide simple, immediate visual access to matching incidents in the interaction stream, video data, and motion-capture data. The query language filters the volume of data that needs to be reviewed based on criteria such as application events and proxemic events, such as distance or orientation between people and devices. This general approach allows analysts to provisionally develop theories about the use of multi-surface environments, and to evaluate them rapidly through video-based evidence.
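
To make the query idea concrete, below is a minimal Python sketch of a declarative filter over merged proxemics and application-event data. The names (Frame, query, distance, orientation, app_event) and the sample values are hypothetical illustrations, not EXCITE's actual syntax.

[@
# Hypothetical sketch of a declarative proxemics query (not EXCITE's real API).
# Each Frame is one sample of merged motion-capture and application event data.
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Frame:
    time: float          # seconds into the study session
    distance: float      # metres between a person and a device
    orientation: float   # degrees between the person's facing and the device
    app_event: str       # application event logged at this sample, or ""

def query(frames: Iterable[Frame],
          *predicates: Callable[[Frame], bool]) -> List[Frame]:
    """Return every frame matching all predicates, for timeline annotation."""
    return [f for f in frames if all(p(f) for p in predicates)]

# Example: moments when someone is within 1 m of a display, roughly
# facing it, at the instant a 'flick' application event fired.
session = [
    Frame(0.0, 2.4, 95.0, ""),
    Frame(0.5, 0.8, 10.0, "flick"),
    Frame(1.0, 0.7, 12.0, ""),
]
matches = query(session,
                lambda f: f.distance < 1.0,
                lambda f: f.orientation < 30.0,
                lambda f: f.app_event == "flick")
for f in matches:
    print(f"annotate timeline at t={f.time:.1f}s")  # -> t=0.5s
@]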

!!ProjectorKit Publication

Marquardt, N., Schardong, F., Tang, A.\\
'''EXCITE: EXploring Collaborative Interaction in Tracked Environments.'''\\
''In INTERACT 2015.''

>>greenbox<<
!!! Downloads
* '''[[Attach:EXCITE_V1.0.zip | EXCITE (Version 1.0)]]''' (2015-06-02)
>><<