EyeWear 2016

First Workshop on Eye Wear Computing, co-located with ISWC/UbiComp.
12-13 September 2016, Heidelberg, Germany.
Paper deadline: June 6, 2016

Call for Participation

Intelligent glasses, head-mounted displays, egocentric vision devices, and similar "smart eyewear" have recently emerged as an interesting research platform for a range of fields, including ubiquitous computing, computer vision, and the social sciences. As most of the human senses are situated on the head, we believe that these types of devices have significant potential as a research and product platform for a wide range of wearable assistive systems in human-computer interaction. While early prototypes were too bulky to be worn regularly in daily life, newer devices, such as Google Glass and J!NS Meme, look more and more like normal glasses, are lightweight, and allow for long-term use, enabling new interaction paradigms.

The proposed workshop will bring together researchers from a wide range of computing disciplines, such as mobile and ubiquitous computing, eye tracking, optics, computer vision, human vision and perception, privacy and security, usability, as well as systems research.

Important Dates

Workshop submission deadline: June 6, 2016
Feedback to authors: June 14, 2016
Camera-ready version: June 25, 2016


Workshop candidates are requested to submit a position paper about their research: 4 pages in the ACM SIGCHI non-archival Extended Abstracts template, landscape format. All submissions should be sent as PDF to eyewear2016@geist.pro with "EyeWear 2016 Submission" as the email subject.


Program

14:00 - 14:10 Welcome and Intro
14:10 - 14:40 Position Paper Session 1
Reading Interventions - Tracking Reading State and Designing Interventions Shoya Ishimaru, Tilman Dingler, Kai Kunze, Koichi Kise, Andreas Dengel
Smart Glasses with Peripheral Vision Display Takuro Nakuo, Kai Kunze
"What’s My Line? Glass Versus Paper for Cold Reading in Duologues" Jamie A. Ward, Paul Lukowicz
14:50 - 15:20 Position Paper Session 2
Smart Experimental Platform for Collecting Various Sensing Data from Various Things Yugo Nakamura, Yutaka Arakawa, Keiichi Yasumoto
Exploring a Multi-Sensor Picking Process in the Future Warehouse Alexander Diete, Timo Sztyler, Lydia Weiland, Heiner Stuckenschmidt
Estimation of English skill with a mobile eye tracker Olivier Augereau, Hiroki Fujiyoshi, Koichi Kise
15:30 - 16:00 Collaboration Discussion and Brainstorming
16:00 - 16:15 Wrap Up


Organizers

Andreas Bulling, Max Planck Institute for Informatics
Kai Kunze, Keio Media Design
Ozan Cakmakci, Google Inc.
James M. Rehg, Georgia Institute of Technology