Dr. William Delamare
After earning my engineering degree at ENSIMAG, I completed my PhD in the EHCI team of the Grenoble Informatics Laboratory. During my PhD, I focused on human-computer interaction with augmented physical objects. I am particularly interested in how new technologies can grant users new abilities and thus free them from physical constraints.
I am interested in distant interaction, that is, in how new technologies can give humans new abilities. More specifically, I enjoy exploring solutions for interacting with augmented physical objects. This augmentation lets me tackle problems that did not exist in the physical world before the spread of pervasive technologies.
This is the opposite of the direction explored in the early stages of Virtual and Augmented Reality: instead of reducing the gap between the digital and physical worlds by mimicking the physical world, I would like to explore how new technologies can free us from physical constraints within the physical world itself. For instance, a hundred years ago, one had to move close to a physical object (reducing the distance) in order to move it (applying a force). Today, we can design interaction techniques that let users move that object without approaching it (e.g., by looking at it) and without applying the force themselves (e.g., by performing a small finger gesture).
William Delamare, Teng Han, and Pourang Irani. 2017. Designing a gaze gesture guiding system. In Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '17). ACM, New York, NY, USA, Article 26, 13 pages.