PERIOD 3, 2013/2014, timeslot A, 7.5 ECTS
INFOMMMI: Multimodal Interaction

Lecturers: Wolfgang Hürst, Peter Werkhoven

LECTURES (PART 2) given by Wolfgang Hürst

This part mostly covers mobile interaction as one particular example of multimodal interaction. It applies the theoretical background of multimodal perception and multisensory input to concrete state-of-the-art examples, such as mobile gaming on smartphones and tablets. It is accompanied by a practical part in which students implement a multimodal interaction system in a related project, as specified under Practicals (part 2).

We build on the content of the first part by looking into a concrete example of multimodal interaction: mobile devices such as smartphones and tablets. Mobile devices offer various sensors for multimodal input, for example microphones (for audio and speech input), cameras (for visual input), accelerometers and gyroscopes (to sense a device's motion and orientation), a compass (to get its heading), GPS (to get its location), and touchscreens (for tactile input). How can these be used for multimodal interaction? Which modality should be used in which context? What new possibilities do these sensors offer to create better interaction experiences (think of Siri on the iPhone or Kinect-style gesture interaction) and to build new applications (think of augmented reality on mobiles)? We will address these questions by considering the basics of multisensory human perception (covered in part 1 of the course), current state-of-the-art technology (the sensors integrated in today's mobile devices), and concrete application cases. In particular, we will look into 3D, augmented, and virtual reality on mobiles.
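To make this concrete, below is a minimal sketch (not part of the official course material) of how two of these sensors can be combined into a single interaction signal on Android, using the standard SensorManager API: accelerometer and magnetometer readings are fused into the device's orientation, which an application could map to, e.g., tilt-based scrolling. The class name and the handleTilt() callback are hypothetical names used for illustration only.

    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    // Minimal sketch: fuse accelerometer and magnetometer readings into the
    // device orientation (azimuth, pitch, roll) and treat tilt as an input
    // modality. Class and handleTilt() are hypothetical illustration names.
    public class TiltInputActivity extends Activity implements SensorEventListener {
        private SensorManager sensorManager;
        private final float[] gravity = new float[3];      // last accelerometer reading
        private final float[] geomagnetic = new float[3];  // last magnetometer reading

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            sensorManager.registerListener(this,
                    sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                    SensorManager.SENSOR_DELAY_UI);
            sensorManager.registerListener(this,
                    sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                    SensorManager.SENSOR_DELAY_UI);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Keep the most recent reading of each sensor type.
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                System.arraycopy(event.values, 0, gravity, 0, 3);
            } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                System.arraycopy(event.values, 0, geomagnetic, 0, 3);
            }
            float[] rotationMatrix = new float[9];
            if (SensorManager.getRotationMatrix(rotationMatrix, null, gravity, geomagnetic)) {
                float[] orientation = new float[3];  // azimuth, pitch, roll in radians
                SensorManager.getOrientation(rotationMatrix, orientation);
                handleTilt(orientation[1], orientation[2]);
            }
        }

        // Application-specific mapping, e.g., scroll speed proportional to tilt.
        private void handleTilt(float pitch, float roll) { }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        @Override
        protected void onPause() {
            super.onPause();
            sensorManager.unregisterListener(this);  // stop sensing when not visible
        }
    }

Even this simple case raises the questions above: the raw sensor data is noisy and needs filtering, and whether tilt is the right modality depends entirely on the application context.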


PAPERS TO READ (mandatory)

  • Seungyon Lee and Shumin Zhai.
The Performance of Touch Screen Soft Buttons,
    Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '09).
    (PDF)

  • Michael Rohs and Georg Essl.
    Sensing-Based Interaction for Information Navigation on Handheld Displays,
    Advances in Human-Computer Interaction, Volume 2008, Article ID 450385, 11 pages.
    (PDF)

  • Jakob Nielsen.
    Kinect Gestural UI: First Impressions,
    Jakob Nielsen's Alertbox.
    (Link)

  • Donald A. Norman and Jakob Nielsen.
    Gestural Interfaces: A Step Backward in Usability,
    ACM Interactions, 17(5), 2010.
    (Link)