inteHRDis - Interaktionstechniken für hochauflösende Displays (Interaction Techniques for High-Resolution Displays)
The research project "Interaction Techniques for High Resolution Displays", funded by the Förderprogramm Informationstechnik Baden-Württemberg (BW-FIT) as project part 5 of "information at your fingertips - interactive visualization for gigapixel displays", analyzes existing interaction and visualization techniques for high-resolution displays (HRDs) and develops new ones, taking human capabilities and limitations into account. In particular, zoomable user interface approaches with semantic zooming are investigated with regard to their applicability to HRDs within a consistent, user-oriented concept. The project also compares recently proposed input mechanisms based on tracking systems with unconventional input devices for HRDs such as PDAs, UMPCs (Ultra Mobile PCs) and game controllers. PDAs are considered particularly promising, since their built-in display enables complex and parallel user interaction with additional visual feedback and orientation support.
- AG Reiterer (Human-Computer Interaction)
(2010): Squidy : a zoomable design environment for natural user interfaces |
Today's interaction between humans and computers does not differ significantly from the interaction offered by the early "personal computers" of the 1980s. Keyboard and mouse therefore still dominate the standard computing products currently available on the consumer market. Nevertheless, Nintendo's popular Wii game console shows that whole-body input techniques (e.g. gestures) are feasible and have been adopted worldwide by dedicated as well as casual gamers. This example demonstrates the feasibility of new, post-WIMP interaction techniques that are easy to use, easy to remember and, not least, enjoyable to operate. |
|
(2010): Interactive design of multimodal user interfaces : reducing technical and visual complexity Journal on Multimodal User Interfaces ; 3 (2010), 3. - S. 197-213. - ISSN 1783-7677 |
In contrast to the pioneers of multimodal interaction, e.g. Richard Bolt in the late seventies, today’s researchers can benefit from various existing hardware devices and software toolkits. Although these development tools are available, using them is still a great challenge, particularly in terms of their usability and their appropriateness to the actual design and research process. We present a three-part approach to supporting interaction designers and researchers in designing, developing, and evaluating novel interaction modalities including multimodal interfaces. First, we present a software architecture that enables the unification of a great variety of very heterogeneous device drivers and special-purpose toolkits in a common interaction library named “Squidy”. Second, we introduce a visual design environment that minimizes the threshold for its usage (ease-of-use) but scales well with increasing complexity (ceiling) by combining the concepts of semantic zooming with visual dataflow programming. Third, we not only support the interactive design and rapid prototyping of multimodal interfaces but also provide advanced development and debugging techniques to improve technical and conceptual solutions. In addition, we offer a test platform for controlled comparative evaluation studies as well as standard logging and analysis techniques for informing the subsequent design iteration. Squidy therefore supports the entire development lifecycle of multimodal interaction design, in both industry and research. |
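The dataflow idea behind such a unified interaction library can be illustrated with a minimal Python sketch. The classes below are hypothetical and are not the actual Squidy API; they only show the principle of chaining device, filter and output nodes through "pipes".

    # Minimal dataflow-pipeline sketch (hypothetical classes, not the Squidy API):
    # nodes are chained via "pipes"; each node processes incoming data objects
    # (here: 2D positions) and forwards the result to its successor.
    class Node:
        def __init__(self):
            self.successor = None

        def pipe(self, successor):      # connect two nodes, return the successor
            self.successor = successor
            return successor

        def publish(self, data):
            if self.successor:
                self.successor.process(data)

        def process(self, data):        # default behaviour: pass data through
            self.publish(data)

    class ScaleToDisplay(Node):         # example filter: normalised input -> pixels
        def __init__(self, width, height):
            super().__init__()
            self.width, self.height = width, height

        def process(self, pos):
            self.publish((pos[0] * self.width, pos[1] * self.height))

    class CursorOutput(Node):           # sink node: here we simply print
        def process(self, pos):
            print("cursor ->", pos)

    source = Node()                     # stands in for a device driver node
    source.pipe(ScaleToDisplay(4096, 1536)).pipe(CursorOutput())
    for sample in [(0.10, 0.20), (0.55, 0.40)]:
        source.process(sample)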
|
(2010): Visualisierung auf Großbildschirmen : Herausforderung eines neuen Ausgabegeräts Informatik-Spektrum ; 33 (2010), 6. - S. 551-558. - ISSN 0170-6012 |
Developments in display technology have produced a multitude of high-resolution displays in recent years. The research network "Information at your finger tips – interaktive Visualisierung für Gigapixel Displays" addressed the challenges this technology poses for many areas of computer science. Both new graphics systems and interaction methods and forms of presentation were examined, as well as their application in visualization and art. |
|
(2009): Understanding and Designing Surface Computing with ZOIL and Squidy CHI 2009 Workshop : Multitouch and Surface Computing, April 2009, Boston, USA |
In this paper we provide a threefold contribution to the Surface Computing (SC) community. Firstly, we will discuss frameworks such as Reality-based Interaction which provide a deeper theoretical understanding of SC. Secondly, we will introduce our ZOIL user interface paradigm for SC on mobile, tabletop or wall-sized devices. Thirdly, we will describe our two software tools Squidy and ZOIL UI Framework which have supported and facilitated our iterative design of SC prototypes. |
|
(2009): Gaze-Assisted Pointing for Wall-Sized Displays Human-Computer Interaction – INTERACT 2009 / Gross, Tom; Gulliksen, Jan; Kotzé, Paula; Oestreicher, Lars; Palanque, Philippe; Prates, Raquel Oliveira; Winckler, Marco (Hrsg.). - Berlin, Heidelberg : Springer Berlin Heidelberg, 2009. - (Lecture Notes in Computer Science ; 5727). - S. 9-12. - ISBN 978-3-642-03657-6 |
Previous studies have argued for the use of gaze-assisted pointing techniques (MAGIC) in improving human-computer interaction. Here, we present experimental findings that were drawn from human performance of two tasks on a wall-sized display. Our results show that a crude adoption of MAGIC across a range of complex tasks does not increase pointing performance. More importantly, a detailed analysis of user behavior revealed several issues that were previously ignored (such as interference of corrective saccades, increased decision time due to variability of precision, errors due to eye-hand asynchrony, and interference with search behavior) which should influence the development of gaze-assisted technology. |
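For context, the "liberal" MAGIC variant from the literature warps the cursor to the vicinity of the gaze point whenever the gaze lands sufficiently far from the current cursor position and leaves fine positioning to the manual device; a minimal sketch of that rule (illustrative only, with an assumed warp threshold, and not the exact setup evaluated here) is given below.

    # Liberal MAGIC-style pointing sketch (illustrative; the threshold is assumed).
    import math

    WARP_THRESHOLD = 120.0   # px: gaze further away than this warps the cursor

    def magic_update(cursor, gaze, hand_delta):
        """One update step: coarse positioning by gaze, fine positioning by hand."""
        if math.dist(cursor, gaze) > WARP_THRESHOLD:
            return gaze                             # warp cursor to the gaze point
        return (cursor[0] + hand_delta[0],          # refine manually near the gaze
                cursor[1] + hand_delta[1])

    cursor = (100.0, 100.0)
    for gaze, delta in [((900.0, 500.0), (0.0, 0.0)), ((905.0, 498.0), (-3.0, 2.0))]:
        cursor = magic_update(cursor, gaze, delta)
        print(cursor)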
|
(2009): Adaptive Pointing : Design and Evaluation of a Precision Enhancing Technique for Absolute Pointing Devices Human-Computer Interaction – INTERACT 2009 / Gross, Tom; Gulliksen, Jan; Kotzé, Paula; Oestreicher, Lars; Palanque, Philippe; Prates, Raquel Oliveira; Winckler, Marco (Hrsg.). - Berlin : Springer, 2009. - (Lecture Notes in Computer Science ; 5726). - S. 658-671. - ISBN 978-3-642-03654-5 |
We present Adaptive Pointing, a novel approach to addressing the common problem of accuracy when using absolute pointing devices for distant interaction. First, we extensively discuss related work concerning the problem domain of pointing accuracy when using absolute or relative pointing devices. As a result, we introduce a novel classification scheme to more clearly discriminate between different approaches. Second, the Adaptive Pointing technique is presented and described in detail. The intention behind this approach is to improve pointing performance for absolute input devices by implicitly adapting the Control-Display gain to the current user's needs without violating users' mental model of absolute-device operation. Third, we present an experiment comparing Adaptive Pointing with pure absolute pointing, using a laserpointer as an example of an absolute device. The results show that Adaptive Pointing results in a significant improvement compared with absolute pointing in terms of movement time (19%), error rate (63%), and user satisfaction. |
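The core idea, adapting the control-display gain without breaking the user's absolute mapping, can be illustrated with the following Python sketch. It is an approximation under assumed parameters (V_SLOW, V_FAST and GAIN_MIN are hypothetical), not the exact formulation from the paper: slow movements receive a reduced gain for precision, while fast movements pull the cursor back towards the absolute pointing position so that the accumulated offset stays bounded.

    # Illustrative adaptive-gain sketch (assumed parameters, not the exact
    # Adaptive Pointing formulation from the paper).
    V_SLOW, V_FAST = 50.0, 500.0   # speed thresholds in px/s (assumed)
    GAIN_MIN = 0.3                 # damped gain during slow, precise motion

    def adaptive_step(cursor, absolute, prev_absolute, dt):
        """Return the new cursor position given the current and previous
        absolute pointing positions (e.g. the laserpointer ray hit point)."""
        dx, dy = absolute[0] - prev_absolute[0], absolute[1] - prev_absolute[1]
        speed = (dx * dx + dy * dy) ** 0.5 / dt
        # blend factor: 0 = fully damped/relative, 1 = fully absolute
        blend = min(max((speed - V_SLOW) / (V_FAST - V_SLOW), 0.0), 1.0)
        gain = GAIN_MIN + (1.0 - GAIN_MIN) * blend
        x = cursor[0] + gain * dx          # damped relative movement
        y = cursor[1] + gain * dy
        x += blend * (absolute[0] - x)     # snap back towards the absolute
        y += blend * (absolute[1] - y)     # position when moving fast
        return (x, y)

    cursor = adaptive_step(cursor=(500.0, 500.0), absolute=(520.0, 500.0),
                           prev_absolute=(500.0, 500.0), dt=0.016)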
|
(2009): Visual Design of Multimodal Interaction : bridging the Gap between Interaction Designers and Developers Proceedings of the Workshop on the Challenges of Engineering Multimodal Interaction : Methods, Tools, Evaluation 2009 |
In contrast to the pioneers of multimodal interaction, e.g. Richard Bolt in the late seventies, today's researchers can benefit from a wide variety of existing interaction techniques, devices and frameworks. Although these tools are available, using them is still a great challenge, particularly in terms of usability. A major issue results from the trade-off between the functionality of the system and the simplicity of use. We introduce a novel visual user interface concept which is especially designed to ease the design and development of post-WIMP user interfaces including multimodal interaction. It provides an integrated design environment for our interaction library Squidy based on high-level visual dataflow programming combined with zoomable user interface concepts. The user interface offers a simple visual language and a collection of ready-to-use devices, filters and interaction techniques. We specifically address the trade-off between functionality and simplicity by utilizing the concept of semantic zooming, which enables dynamic access to more advanced functionality on demand. Thus, developers as well as interaction designers are able to adjust the complexity of the Squidy user interface to their current needs and knowledge. |
|
(2009): Tactile feedback enhanced hand gesture interaction at large, high-resolution displays Journal of Visual Languages and Computing ; 20 (2009), 5. - S. 341-351 |
Human beings perceive their surroundings based on sensory information from diverse channels. However, for human-computer interaction we mostly restrict the user to visual perception. In this paper, we contribute to the investigation of tactile feedback as an additional perception modality. We first discuss existing user studies and provide a classification scheme for tactile feedback techniques. We then present and discuss a comparative evaluation study based on ISO 9241-9 [Ergonomic requirements for office work with visual display terminals (VDTs) - Part 9: requirements for non-keyboard input devices, 2000]. The 20 participants performed horizontal and vertical one-directional tapping tasks with hand gesture input, with and without tactile feedback, in front of a large, high-resolution display. In contrast to previous research, we cannot confirm a benefit of tactile feedback on user performance. Our results show no significant effect in terms of throughput (effective index of performance (IPe)) and even a significantly higher error rate for horizontal target alignment when using tactile feedback. Based on these results, we suggest that tactile feedback can interfere with other senses in a negative way, resulting in the observed higher error rate for horizontal targets. Therefore, more systematic research is needed to clarify the influencing factors on the usefulness of tactile feedback. Beyond these results, we found a significant difference in favor of the horizontal target alignment compared with the vertical one in terms of the effective index of performance (IPe), confirming the work by Dennerlein et al. |
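For reference, the effective index of performance (IPe) reported in such ISO 9241-9 evaluations is commonly computed from the effective index of difficulty and the movement time; the standard formulation from the Fitts'-law literature (not quoted from the paper itself) is

    IP_e = \frac{ID_e}{MT}, \qquad ID_e = \log_2\!\left(\frac{D_e}{W_e} + 1\right), \qquad W_e = 4.133 \cdot SD_x

where D_e is the mean effective movement distance, SD_x the standard deviation of the selection coordinates along the task axis, and MT the mean movement time per trial.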
|
(2009): Temporal-Spatial Visualization of Interaction Data for Visual Debugging EuroVis'09 : Eurographics/IEEE Symposium on Visualization, Poster Session |
Rädle, Roman; König, Werner A.; Reiterer, Harald
|
(2009): Gaze-augmented manual interaction Proceedings of the 27th international conference : extended abstracts on Human factors in computing systems - CHI EA '09. - New York, New York, USA : ACM Press, 2009. - S. 3121-3124. - ISBN 978-1-60558-247-4 |
This project will demonstrate a new approach to employing users' gaze in the context of human-computer interaction. This new approach uses gaze passively in order to improve the speed and precision of manually controlled pointing techniques. Designing such gaze-augmented manual techniques requires an understanding of the principles that govern the coordination of hand and eye. This coordination is influenced by situational parameters (task complexity, input device used, etc.), which this project will explore in controlled experiments. |
|
(2009): Enhancing input device evaluation : longitudinal approaches Proceedings of the 27th international conference : extended abstracts on Human factors in computing systems - CHI EA '09. - New York, New York, USA : ACM Press, 2009. - S. 4351-4356. - ISBN 978-1-60558-247-4 |
In this paper we present our experiences with longitudinal study designs for input device evaluation. In this domain, analyzing learning is currently the main reason for applying longitudinal designs. We briefly discuss related research questions and outline two case studies in which we used different approaches to address this issue. Finally, we point out future research tasks in the context of longitudinal evaluation methods. |
|
(2009): Adaptive pointing : implicit gain adaptation for absolute pointing devices Proceedings of the 27th international conference extended abstracts on Human factors in computing systems - CHI EA '09. - New York, New York, USA : ACM Press, 2009. - S. 4171-4176. - ISBN 978-1-60558-247-4 |
We present Adaptive Pointing, a novel approach to addressing the common problem of accuracy when using absolute pointing devices for distant interaction. The intention behind this approach is to improve pointing performance for absolute input devices by implicitly adapting the Control-Display gain to the current user's needs without violating users' mental model of absolute-device operation. First evaluation results show that Adaptive Pointing leads to a significant improvement compared with absolute pointing in terms of movement time (19%), error rate (63%), and user satisfaction. |
|
(2009): Text Input on Multitouch Tabletop Displays EuroVis'09 : Eurographics/IEEE Symposium on Visualization, Poster Session |
Schmidt, Toni; König, Werner A.; Reiterer, Harald
|
(2008): Laserpointer-Interaction between Art and Science Proceedings of the 13th international conference on Intelligent user interfaces. - New York, NY : ACM, 2008. - S. 423-424. - ISBN 978-1-59593-987-6 |
We employ Laserpointer-Interaction as an intuitive, direct and flexible interaction concept, particularly for large, high-resolution displays which enforce physical navigation. We therefore demonstrate general applicability and suitability by applying Laserpointer-Interaction to a wide range of domains, from scientific applications and formal experiments to artistic installations for the broad public. |
|
(2008): Visual and Physical Interaction Design for Information Services Invited Talk at the International Conference on Human-Computer Interaction and Information Services, Prague, 2008 |
With the growing quantity and complexity of available information, novel user interface concepts are needed which support users' natural seeking behavior and provide natural ways of interaction. We discuss general user requirements and present design principles which we identified as crucial for the successful design of a user-centered visual information-seeking system. We also illustrate novel input modalities and display techniques such as multi-touch tables, gesture tracking systems and physical tokens which lead towards a more reality-based and thus natural interaction. |
|
(2008): ZOIL : A Cross-Platform User Interface Paradigm for Personal Information Management Personal Information Management 2008 (PIM 2008), CHI 2008 Workshop, April 5 - 6, 2008, Florence, Italy |
In this paper we introduce the novel user interface paradigm ZOIL (Zoomable Object-Oriented Information Landscape). ZOIL is aimed at unifying all types of local and remote information items with their connected functionality and with their mutual relations in a single visual workspace as a replacement for today's desktop metaphor. This workspace can serve as an integrated work environment for traditional personal information management (PIM), but can also be used for PIM tasks in a wider sense. By formulating ZOIL's fundamental design principles we describe the interaction style, visualization techniques and interface physics of a ZOIL user interface. Furthermore we discuss ZOIL's ability to provide nomadic PIM environments for mobile and stationary use. |
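The semantic zooming behaviour at the heart of such an information landscape can be illustrated with a minimal Python sketch (a hypothetical object model, not the ZOIL framework API): the representation of an information item changes with the scale at which it is rendered instead of merely being magnified.

    # Semantic zooming sketch (hypothetical object model, not the ZOIL API):
    # the same information object yields increasingly rich representations
    # as more screen space becomes available.
    class DocumentItem:
        def __init__(self, title, abstract, full_text):
            self.title, self.abstract, self.full_text = title, abstract, full_text

        def render(self, scale):
            """Return the representation appropriate for the current zoom scale."""
            if scale < 0.2:
                return "[icon]"                           # far away: icon only
            if scale < 1.0:
                return self.title                         # closer: title
            if scale < 4.0:
                return self.title + "\n" + self.abstract  # metadata view
            return self.full_text                         # zoomed in: full content

    item = DocumentItem("ZOIL paper", "A zoomable UI paradigm ...", "Full text ...")
    for scale in (0.1, 0.5, 2.0, 8.0):
        print("scale", scale, "->", repr(item.render(scale)))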
|
(2008): Natural Interaction with Hand Gestures and Tactile Feedback for large, high-res Displays MITH'08 : Workshop on Multimodal Interaction Through Haptic Feedback, held in conjunction with AVI'08 : International Working Conference on Advanced Visual Interfaces |
Human beings perceive their surroundings based on sensory information from diverse channels. However, for human-computer interaction we mostly restrict the user to visual perception. To investigate the effect of additional tactile feedback on pointing and selection tasks we conducted a comparative evaluation study based on ISO 9241-9. The 20 participants performed horizontal and vertical one-directional tapping tasks with hand gesture input, with and without tactile feedback, on a large, high-resolution display. In contrast to previous research we cannot confirm a benefit of tactile feedback on user performance. Our results show no significant effect in terms of the effective index of performance and even a significantly higher error rate for horizontal target alignment when using tactile feedback. Furthermore, we found a significant difference in favor of the horizontal target alignment compared to the vertical one in terms of the effective index of performance. |
|
(2008): Finteraction : Finger Interaction with Mobile Phone |
Touch interaction with mobile phones enables a more natural interaction with the device, since touch is a natural way of directly accessing an object of interest. One disadvantage, however, is occlusion: the user's finger hides a large portion of the information presented on the phone's small screen during interaction. Accelerometer-based interaction, on the other hand, can react to device movements; e.g., when the user tilts the mobile phone, the UI is rotated. A drawback is that the user needs to move the whole device while interacting and consequently loses eye contact with the phone's screen (screen-absence problem). Finteraction (Finger Interaction) is a new interaction concept that solves the occlusion and screen-absence problems: users interact with a large public display via their mobile phone. By moving the index finger in front of the camera on the back of the phone, the user can interact with the large public display even while on the move. |
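The finger tracking behind such a concept could, for example, be approximated with dense optical flow on consecutive rear-camera frames; the sketch below is only one assumed realisation (not the published Finteraction implementation) and maps the dominant motion between frames to a cursor displacement on the remote display.

    # Assumed realisation of camera-based finger interaction (not the published
    # Finteraction implementation): dense optical flow on rear-camera frames.
    import cv2

    def finger_displacement(prev_gray, gray, gain=4.0):
        """Estimate a cursor displacement from the dominant motion between
        two consecutive grayscale camera frames."""
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        return gain * flow[..., 0].mean(), gain * flow[..., 1].mean()

    cap = cv2.VideoCapture(0)        # stand-in for the phone's rear camera
    ok, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cursor = [640.0, 360.0]          # cursor position on the public display
    for _ in range(300):             # process a bounded number of frames
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        dx, dy = finger_displacement(prev, gray)
        cursor[0] += dx; cursor[1] += dy   # would be sent to the display here
        prev = gray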
|
(2007): Position-Independent Interaction for Large High-Resolution Displays Proceedings of the IADIS International Conference on Interfaces and Human Computer Interaction (2007). - IADIS Press, 2007. - S. 117-125. - ISBN 978-972-8924-39-3 |
Since large, high-resolution displays (LHRD) are capable of visualizing a large amount of very detailed information, users have to move around in front of these displays to gain either in-depth knowledge or an overview. However, conventional input devices such as mouse and keyboard restrict users' mobility by requiring a stable surface on which to operate. We present a flexible and intuitive interaction technique based on an infrared laserpointer, a technique that allows identical use from any point and distance. In particular, our laserpointer interaction satisfies the demands of LHRD in the areas of mobility, accuracy, and interaction speed. The solution presented is technically designed as a generic interaction library whose flexibility and general applicability was verified by using it on two very different systems: a planar 221" Powerwall and a curved 360° panoramic display. Furthermore, a comparative evaluation study with 16 participants was conducted on the Powerwall to compare the performance of a conventional mouse and our laserpointer by means of a unidirectional tapping test at varying distances (ISO 9241-9). The statistically significant performance advantage of the mouse (13%) appears marginal considering the intuitive and direct mode of interaction in using the laserpointer and the flexibility gained by its use, both of which are fundamental requirements for LHRDs. In comparison to previous systems and evaluations, we were able to reduce the laserpointer's performance lag by over 50%. This result is achieved mainly by our precise tracking method, the minimized delay, and the effective jitter compensation. |
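As an illustration of the jitter compensation mentioned above, hand tremor in tracked pointer samples can be reduced by temporal smoothing; the snippet below uses simple exponential smoothing as an assumed stand-in, since the exact filter used in the interaction library is not described here.

    # Assumed stand-in for jitter compensation: exponential smoothing of the
    # tracked laserpointer coordinates (smaller alpha = stronger smoothing,
    # but more added latency).
    class JitterFilter:
        def __init__(self, alpha=0.25):
            self.alpha = alpha
            self.state = None

        def update(self, x, y):
            if self.state is None:
                self.state = (x, y)
            else:
                sx, sy = self.state
                self.state = (sx + self.alpha * (x - sx),
                              sy + self.alpha * (y - sy))
            return self.state

    f = JitterFilter(alpha=0.2)
    for sample in [(100.0, 100.0), (103.0, 98.0), (101.0, 102.0)]:
        print(f.update(*sample))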
|
(2007): Zoomable User Interfaces : Intuitive Navigation in komplexen Informationsräumen |
Although today's users are confronted with a steadily increasing quantity, dimensionality and heterogeneity of accessible information, current user interfaces are still dominated by concepts from the 1960s and 1970s. The author presents an alternative overall concept in the form of a new zoomable user interface paradigm named ZOIL, which, with a view to presenting complex information spaces in a user-oriented way, combines the concept of zoomable user interfaces with the direct-manipulation concepts of object-oriented user interfaces and spatial data organization, using an arbitrarily scalable information landscape and semantic zooming. The ZOIL paradigm does not specify a concrete user interface; rather, it should be understood as an internally consistent combination of visualization and interaction techniques that can serve as a flexible base concept for a wide range of application domains. Its conceptual and technical feasibility is examined with a prototype implementation for the application domain of document management. As an introduction to the topic, related concepts, physiological and psychological aspects of human perception, fundamental navigation concepts and potential input devices are discussed. |
|
(2007): Laserpointer-Interaktion für große, hochauflösende Displays Mensch & Computer 2007 : 7. Konferenz für interaktive und kooperative Medien ; Interaktion im Plural / Gross, Tom (Hrsg.). - München : Oldenbourg, 2007. - S. 69-78. - ISBN 978-3-486-58496-7 |
Due to the limits of human vision, a user in front of a large, high-resolution display must be able to move around freely in order to perceive the smallest details with pixel accuracy or to gain an overview of a presentation area of several square meters. In contrast to conventional input devices such as mouse and keyboard, laserpointers do not restrict the user's freedom of movement but enable a very intuitive and direct mode of interaction regardless of the user's position relative to the display. This paper presents an interaction library that, with respect to precision and low-latency control, makes laserpointer tracking usable for large, high-resolution displays for the first time. In a comparative experiment, the interaction library combined with an infrared laserpointer was evaluated against a conventional mouse as the standard input device at different distances. The significant performance advantage of the mouse of 12.5% appears rather small considering the freedom of movement gained and the direct mode of interaction with the laserpointer. Compared to previous systems, the laserpointer's performance lag was reduced by more than 50%, which is largely attributable to the low motion latency, the precise tracking and the effective compensation of hand tremor. Furthermore, the study revealed a significant distance effect for the laserpointer with regard to performance and error rate. |
Period: | 01.06.2006 – 31.12.2011 |
Link: | Project homepage |