- May 15, 2018
- 11:00 AM
- APB 1004 (Ratssaal)
- English
Summary
Data and information are everywhere, and humans need to interact with them to make sense of them and to create new meaningful content. With our desktop computers, we interact with dozens of windows (and tabs) containing various contents; our mobile devices are powerful and connected to "everything"; other types of devices, such as smartwatches, tabletops, wall displays, and head-mounted virtual reality displays, provide promising technology for human information processing; finally, in some situations, several users need to work and interact together. However, current interaction techniques are not adapted to this new technological world.
I will present several research projects whose goal is to give users more power in the above contexts. I will mainly focus my talk on: (i) multi-user interaction on wall displays, with an emphasis on shared interaction and interaction for control rooms; and (ii) large command vocabularies, using gestures on a touchpad, on a smartwatch, or on the body in a VR context.
Vita
Olivier Chapuis is a CNRS research scientist at the Computer Science Laboratory of University Paris-Sud (LRI), where he is the deputy director. He received a Ph.D. in Mathematics in 1994 from University Paris Diderot and spent 8 years at University Lyon I, working on model theory, group and field theory, and algebraic complexity. He moved to Human-Computer Interaction and joined the LRI in 2004. He has worked on window management, desktop interaction techniques, pointing, and multi-scale navigation. His current research, conducted in the ILDA research team (Inria-LRI), focuses on multi-user interaction and interaction techniques in various contexts (mainly wall displays, but also mobile devices, tabletops, Virtual Reality ...).