Guest talk: Daniel Buschek, LMU Munich "ProbUI – Building Probabilistic User Interfaces"
Hosts: Antti Oulasvirta, Jaakko Lehtinen, Perttu Hämäläinen
ProbUI is a GUI concept and implemented framework that helps developers of mobile touch GUIs handle uncertain input and implement feedback and GUI adaptations. It replaces static target representations (bounding boxes) with probabilistic gesture models ("bounding behaviours"). As a key conceptual insight, ProbUI seeks to merge the ease of use of declarative gesture definitions with the benefits of probabilistic reasoning. To this end, it automatically maps developers' gesture declarations to simple probabilistic models, utilising GUI properties to inform its assumptions. These models are then employed to infer user intention "live" during interaction. This talk gives an overview of ProbUI, details its core idea of mapping declarations to probabilistic models, and concludes with examples of novel adaptive GUI widgets as well as a broader reflection on adaptive and probabilistic user interfaces.
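To give a flavour of the idea (this is a toy sketch, not ProbUI's actual API; the behaviour names, state placements, and the simplified per-touch scoring are illustrative assumptions), one can think of each declared "bounding behaviour" as a sequence of 2D Gaussian states derived from a widget's geometry, with observed touches scored against each behaviour to infer the intended target:

```python
import math

def gauss2d_logpdf(x, y, mx, my, s):
    """Log-density of an isotropic 2D Gaussian with mean (mx, my) and std s."""
    return -math.log(2 * math.pi * s * s) - ((x - mx) ** 2 + (y - my) ** 2) / (2 * s * s)

def behaviour_loglik(touches, states, s=20.0):
    """Score a touch sequence against a behaviour's Gaussian states.
    Each touch is assigned to its best-matching state -- a deliberate
    simplification of full sequence-model (e.g. HMM) inference."""
    return sum(max(gauss2d_logpdf(x, y, mx, my, s) for (mx, my) in states)
               for (x, y) in touches)

# Hypothetical declarations: a tap on a button vs. a swipe across a slider,
# each mapped to Gaussian state means placed from the widget geometry.
behaviours = {
    "tap_button": [(50, 50)],                          # one state at the button centre
    "swipe_slider": [(20, 150), (80, 150), (140, 150)],  # states along the slider track
}

def infer_intent(touches):
    """Return the behaviour with the highest likelihood for the observed touches."""
    return max(behaviours, key=lambda b: behaviour_loglik(touches, behaviours[b]))
```

A touch near the button centre, e.g. `infer_intent([(48, 55)])`, resolves to `"tap_button"`, while a stroke along the slider track resolves to `"swipe_slider"` -- the same scores could also drive live feedback while the gesture is still in progress.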
Daniel Buschek is a postdoctoral researcher at the University of Munich (LMU). He is interested in innovating interaction at the intersection of Human-Computer Interaction and Machine Learning/AI. This includes in particular leveraging user- and context-specific human behaviour patterns to create secure, personalised, and expressive interactions with understandable intelligent systems. He received his PhD from the University of Munich in 2018. His studies included research projects at Siemens AG Munich, the University of Glasgow, and the Helsinki Institute for Information Technology. His work has been published at venues such as CHI, MobileHCI, IUI, TOCHI, and IMWUT/Ubicomp, including three Honourable Mention Awards.