- Department of Computer and Information Science
|(2010): Interactive design of multimodal user interfaces : reducing technical and visual complexity Journal on Multimodal User Interfaces ; 3 (2010), 3. - S. 197-213. - ISSN 1783-7677|
In contrast to the pioneers of multimodal interaction, e.g. Richard Bolt in the late seventies, today’s researchers can benefit from various existing hardware devices and software toolkits. Although these development tools are available, using them is still a great challenge, particularly in terms of their usability and their appropriateness to the actual design and research process. We present a three-part approach to supporting interaction designers and researchers in designing, developing, and evaluating novel interaction modalities including multimodal interfaces. First, we present a software architecture that enables the unification of a great variety of very heterogeneous device drivers and special-purpose toolkits in a common interaction library named “Squidy”. Second, we introduce a visual design environment that minimizes the threshold for its usage (ease-of-use) but scales well with increasing complexity (ceiling) by combining the concepts of semantic zooming with visual dataflow programming. Third, we not only support the interactive design and rapid prototyping of multimodal interfaces but also provide advanced development and debugging techniques to improve technical and conceptual solutions. In addition, we offer a test platform for controlled comparative evaluation studies as well as standard logging and analysis techniques for informing the subsequent design iteration. Squidy therefore supports the entire development lifecycle of multimodal interaction design, in both industry and research.
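The visual dataflow idea behind Squidy can be sketched in miniature. The following Python toy is purely illustrative (Squidy itself is a Java-based library; all class and method names here are invented) and shows the core concept of chaining device, filter, and output nodes into a pipeline:

```python
class Node:
    """A pipeline stage; subclasses override process()."""
    def __init__(self):
        self.next = None

    def connect(self, node):
        """Chain nodes: a.connect(b).connect(c); returns the new tail."""
        self.next = node
        return node

    def publish(self, data):
        """Pass data to the downstream node, if any."""
        if self.next is not None:
            self.next.receive(data)

    def receive(self, data):
        out = self.process(data)
        if out is not None:          # a filter may drop a data object
            self.publish(out)

    def process(self, data):
        return data                  # identity by default (e.g. a device node)


class Smoother(Node):
    """Toy smoothing filter (exponential moving average)."""
    def __init__(self, alpha=0.5):
        super().__init__()
        self.alpha = alpha
        self.state = None

    def process(self, pos):
        if self.state is None:
            self.state = pos
        else:
            self.state = tuple(self.alpha * p + (1 - self.alpha) * s
                               for p, s in zip(pos, self.state))
        return self.state


class Recorder(Node):
    """Sink node that stores every data object it receives."""
    def __init__(self):
        super().__init__()
        self.log = []

    def process(self, data):
        self.log.append(data)
        return None                  # nothing to forward


# Wire device -> filter -> sink, then push a few 2-D positions through.
device = Node()
sink = Recorder()
device.connect(Smoother(alpha=0.5)).connect(sink)
for point in [(0.0, 0.0), (10.0, 10.0), (10.0, 10.0)]:
    device.receive(point)
# sink.log now holds the smoothed positions
```

Swapping a filter or adding a new device then amounts to rewiring nodes rather than touching driver code, which is the property the visual design environment builds on.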
|(2009): Tactile feedback enhanced hand gesture interaction at large, high-resolution displays Journal of Visual Languages and Computing ; 20 (2009), 5. - S. 341-351|
Human beings perceive their surroundings based on sensory information from diverse channels. However, for human-computer interaction we mostly restrict the user to visual perception. In this paper, we contribute to the investigation of tactile feedback as an additional perception modality. To this end, we first discuss existing user studies and provide a classification scheme for tactile feedback techniques. We then present and discuss a comparative evaluation study based on ISO 9241-9 [Ergonomic requirements for office work with visual display terminals (VDTs) - Part 9: requirements for non-keyboard input devices, 2000]. The 20 participants performed horizontal and vertical one-directional tapping tasks with hand gesture input with and without tactile feedback in front of a large, high-resolution display. In contrast to previous research, we cannot confirm a benefit of tactile feedback on user performance. Our results show no significant effect in terms of throughput (effective index of performance (IPe)) and even a significantly higher error rate for horizontal target alignment when using tactile feedback. Based on these results, we suggest that tactile feedback can interfere with other senses in a negative way, resulting in the observed higher error rate for horizontal targets. Therefore, more systematic research is needed to clarify the influencing factors on the usefulness of tactile feedback. Beyond these results, we found a significant difference in favor of the horizontal target alignment compared with the vertical one in terms of the effective index of performance (IPe), confirming the work by Dennerlein et al.
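The throughput metric used in this study, the effective index of performance (IPe), follows the ISO 9241-9 / Fitts'-law convention: the effective index of difficulty IDe = log2(D/We + 1), with effective width We = 4.133 · SD of the endpoint offsets, divided by mean movement time. A minimal sketch in Python (the function name and sample data are our own, not from the paper):

```python
import math
from statistics import mean, stdev

def effective_throughput(endpoint_offsets, movement_times, mean_distance):
    """ISO 9241-9 style throughput IPe = IDe / MT, in bits per second.

    endpoint_offsets: signed deviations of selection points from the target
    centre (same unit as mean_distance); movement_times: per-trial seconds.
    """
    w_e = 4.133 * stdev(endpoint_offsets)        # effective target width
    id_e = math.log2(mean_distance / w_e + 1)    # effective index of difficulty
    return id_e / mean(movement_times)

# Illustrative trial data (not from the study):
tp = effective_throughput([-5, 3, 0, 4, -2], [0.6, 0.7, 0.65, 0.7, 0.6], 256)
```

Because IPe folds endpoint accuracy into the width term, a feedback technique that raises the error rate can depress throughput even when raw movement times look unchanged, which is why the study reports both measures.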
|(2009): Adaptive Pointing Design and Evaluation of a Precision Enhancing Technique for Absolute Pointing Devices Human-Computer Interaction – INTERACT 2009 / Gross, Tom; Gulliksen, Jan; Kotzé, Paula; Oestreicher, Lars; Palanque, Philippe; Prates, Raquel Oliveira; Winckler, Marco (Hrsg.). - Berlin : Springer, 2009. - (Lecture Notes in Computer Science ; 5726). - S. 658-671. - ISBN 978-3-642-03654-5|
We present Adaptive Pointing, a novel approach to addressing the common problem of accuracy when using absolute pointing devices for distant interaction. First, we discuss extensively some related work concerning the problem domain of pointing accuracy when using absolute or relative pointing devices. As a result, we introduce a novel classification scheme to more clearly discriminate between different approaches. Second, the Adaptive Pointing technique is presented and described in detail. The intention behind this approach is to improve pointing performance for absolute input devices by implicitly adapting the Control-Display gain to the current user's needs without violating users' mental model of absolute-device operation. Third, we present an experiment comparing Adaptive Pointing with pure absolute pointing using a laser-pointer as an example of an absolute device. The results show that Adaptive Pointing results in a significant improvement compared with absolute pointing in terms of movement time (19%), error rate (63%), and user satisfaction.
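The core mechanism described above is an implicit, velocity-dependent adaptation of the Control-Display gain. A hedged sketch of that general idea (the published transfer function and its constants differ; `v_min`, `v_max`, and `g_min` below are invented tuning values): at slow hand speeds the gain drops below 1 so fine adjustments become easier, while at fast speeds the mapping returns towards absolute 1:1 behavior so the cursor never drifts far from where the device actually points.

```python
def adaptive_gain(speed, v_min=50.0, v_max=500.0, g_min=0.3):
    """Map pointing speed (px/s) to a CD gain in [g_min, 1.0].

    Illustrative linear interpolation, not the paper's actual function.
    """
    if speed <= v_min:
        return g_min                       # slow: high precision, low gain
    if speed >= v_max:
        return 1.0                         # fast: absolute 1:1 mapping
    t = (speed - v_min) / (v_max - v_min)  # normalize to 0..1
    return g_min + t * (1.0 - g_min)

def step(cursor, absolute, prev_absolute, dt):
    """Advance the cursor one frame using the speed-adapted gain."""
    dx = absolute - prev_absolute
    return cursor + adaptive_gain(abs(dx) / dt) * dx
```

The smooth interpolation between the two regimes is what preserves the user's mental model: large, fast movements behave exactly like an ordinary absolute device.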
|(2009): Understanding and Designing Surface Computing with ZOIL and Squidy CHI 2009 Workshop : Multitouch and Surface Computing, April 2009, Boston, USA|
In this paper we provide a threefold contribution to the Surface Computing (SC) community. Firstly, we will discuss frameworks such as Reality-based Interaction which provide a deeper theoretical understanding of SC. Secondly, we will introduce our ZOIL user interface paradigm for SC on mobile, tabletop or wall-sized devices. Thirdly, we will describe our two software tools Squidy and ZOIL UI Framework which have supported and facilitated our iterative design of SC prototypes.
|(2009): Adaptive pointing : implicit gain adaptation for absolute pointing devices Proceedings of the 27th international conference extended abstracts on Human factors in computing systems - CHI EA '09. - New York, New York, USA : ACM Press, 2009. - S. 4171-4176. - ISBN 978-1-60558-247-4|
We present Adaptive Pointing, a novel approach to addressing the common problem of accuracy when using absolute pointing devices for distant interaction. The intention behind this approach is to improve pointing performance for absolute input devices by implicitly adapting the Control-Display gain to the current user's needs without violating users' mental model of absolute-device operation. First evaluation results show that Adaptive Pointing leads to a significant improvement compared with absolute pointing in terms of movement time (19%), error rate (63%), and user satisfaction.
|(2009): Visual Design of Multimodal Interaction : bridging the Gap between Interaction Designers and Developers Proceedings of the Workshop on the Challenges of Engineering Multimodal Interaction : Methods, Tools, Evaluation 2009|
In contrast to the pioneers of multimodal interaction, e.g. Richard Bolt in the late seventies, today's researchers can benefit from a wide variety of existing interaction techniques, devices and frameworks. Although these tools are available, using them is still a great challenge, particularly in terms of usability. A major issue results from the trade-off between the functionality of the system and the simplicity of use. We introduce a novel visual user interface concept which is especially designed to ease the design and development of post-WIMP user interfaces including multimodal interaction. It provides an integrated design environment for our interaction library Squidy based on high-level visual data flow programming combined with zoomable user interface concepts. The user interface offers a simple visual language and a collection of ready-to-use devices, filters and interaction techniques. We specifically address the trade-off between functionality and simplicity by utilizing the concept of semantic zooming, which enables dynamic access to more advanced functionality on demand. Thus, developers as well as interaction designers are able to adjust the complexity of the Squidy user interface to their current need and knowledge.
|(2009): Text Input on Multitouch Tabletop Displays EuroVis'09 : Eurographics/IEEE Symposium on Visualization, Poster Session|
|(2009): Temporal-Spatial Visualization of Interaction Data for Visual Debugging EuroVis'09 : Eurographics/IEEE Symposium on Visualization, Poster Session|
|(2009): Squidy : a Zoomable Design Environment for Natural User Interfaces Proceedings of the 27th international conference extended abstracts on Human factors in computing systems - CHI EA '09. - New York, New York, USA : ACM Press, 2009. - S. 4561-4566. - ISBN 978-1-60558-247-4|
We introduce the interaction library Squidy, which eases the design of natural user interfaces by unifying relevant frameworks and toolkits in a common library. Squidy provides a central design environment based on high-level visual data flow programming combined with zoomable user interface concepts. The user interface offers a simple visual language and a collection of ready-to-use devices, filters and interaction techniques. Nevertheless, the concept of semantic zooming enables access to more advanced functionality on demand. Thus, users are able to adjust the complexity of the user interface to their current need and knowledge.
|(2008): Visual and Physical Interaction Design for Information Services Invited Talk at the International Conference on Human-Computer Interaction and Information Services, Prague, 2008|
With the growing quantity and complexity of available information, novel user interface concepts are needed which support users' natural seeking behavior and provide natural ways of interaction. We discuss general user requirements and present design principles which we identified as crucial for the successful design of a user-centered visual information-seeking system. We also illustrate novel input modalities and display techniques such as multi-touch tables, gesture tracking systems and physical tokens which lead towards a more reality-based and thus natural interaction.
|(2008): Natural Interaction with Hand Gestures and Tactile Feedback for large, high-res Displays MITH'08 : Workshop on Multimodal Interaction Through Haptic Feedback, held in conjunction with AVI'08 : International Working Conference on Advanced Visual Interfaces|
Human beings perceive their surroundings based on sensory information from diverse channels. However, for human-computer interaction we mostly restrict the user to visual perception. To investigate the effect of additional tactile feedback on pointing and selection tasks, we conducted a comparative evaluation study based on ISO 9241-9. The 20 participants performed horizontal and vertical one-directional tapping tasks with hand gesture input with and without tactile feedback on a large, high-resolution display. In contrast to previous research, we cannot confirm a benefit of tactile feedback on user performance. Our results show no significant effect in terms of the effective index of performance and even a significantly higher error rate for horizontal target alignment when using tactile feedback. Furthermore, we found a significant difference in favor of the horizontal target alignment compared to the vertical one in terms of the effective index of performance.
|(2008): Laserpointer-Interaction between Art and Science Proceedings of the 13th international conference on Intelligent user interfaces. - New York, NY : ACM, 2008. - S. 423-424. - ISBN 978-1-59593-987-6|
We employ Laserpointer-Interaction as an intuitive, direct and flexible interaction concept, particularly for large, high-resolution displays which enforce physical navigation. We therefore demonstrate general applicability and suitability by applying Laserpointer-Interaction to a wide range of domains, from scientific applications and formal experiments to artistic installations for the broad public.
|(2007): Laserpointer-Interaktion für große, hochauflösende Displays Mensch & Computer 2007 : 7. Konferenz für interaktive und kooperative Medien ; Interaktion im Plural / Gross, Tom (Hrsg.). - München : Oldenbourg, 2007. - S. 69-78. - ISBN 978-3-486-58496-7|
Due to the limited visual acuity of the human eye, a user must be able to move freely in front of large, high-resolution displays in order to perceive the smallest details with pixel accuracy or to gain an overview of the entire display surface of several square meters. In contrast to conventional input devices such as mouse and keyboard, laser pointers do not restrict the user's freedom of movement, but instead enable a very intuitive and direct mode of interaction regardless of the user's position relative to the display. This paper presents an interaction library which, with respect to precision and lag-free control, for the first time also enables the use of laser-pointer tracking with large, high-resolution displays. In a comparative experiment, the interaction library in combination with an infrared laser pointer was evaluated against a classical mouse as the standard input device at varying distances. The significant performance advantage of the mouse (12.5%) appears to carry little weight in view of the freedom of movement gained and the direct manner of interaction with the laser pointer. Compared with previous systems, the laser pointer's performance lag was reduced by more than 50%, which is largely attributable to the low motion latency, the precise tracking, and the effective compensation of hand tremor. Furthermore, the study found a significant distance effect for the laser pointer with respect to performance and error rate.
|(2007): Position-Independent Interaction for Large High-Resolution Displays Proceedings of the IADIS International Conference on Interfaces and Human Computer Interaction (2007). - IADIS Press, 2007. - S. 117-125. - ISBN 978-972-8924-39-3|
Since large, high-resolution displays (LHRD) are capable of visualizing a large amount of very detailed information, users have to move around in front of these displays to gain either in-depth knowledge or an overview. However, conventional input devices such as mouse and keyboard restrict users mobility by requiring a stable surface on which to operate. We present a flexible and intuitive interaction technique based on an infrared laserpointer, a technique that allows identical use from any point and distance. In particular, our laserpointer interaction satisfies the demands of LHRD in the areas of mobility, accuracy, and interaction speed. The solution presented is technically designed as a generic interaction library whose flexibility and general applicability was verified by using it on two very different systems a planar 221" Powerwall and a curved 360° panoramic display. Furthermore, a comparative evaluation study with 16 participants was conducted on the Powerwall to compare the performances of a conventional mouse and our laserpointer by means of a unidirectional tapping test at varying distances (ISO 9241-9). The statistically significant performance advantage of the mouse (13%) appears marginal considering the intuitive and direct mode of interaction in using the laserpointer and the flexibility gained by its use, both of which are fundamental requirements for LHRDs. In comparison to previous systems and evaluations, we were able to reduce the laserpointer s performance lag by over 50%. This result is achieved mainly by our precise tracking method, the minimized delay, and the effective jitter compensation.