From Pegs to Pixels: A Comparative Analysis of the Nine Hole Peg Test and a Digital Copy Drawing Test for Fine Motor Control Assessment

User interaction with digital systems requires Fine Motor Control (FMC), especially if the interfaces are complex or require high-fidelity, fine-grained interactions. Despite its importance, Fine Motor Control is often overlooked in interactive system design, partly because of its complex assessment. Measuring changes in fine motor abilities due to prolonged use or fatigue currently requires repeated manual testing. This paper analyzes the concept of using the input behavior of digital mobile devices to assess the user's Fine Motor Control. To this end, we show that Fine Motor Control can be assessed for touch- and stylus-based interaction with a digital mobile system. We conducted a user study in which participants performed a Nine Hole Peg Test and a predefined Copy Drawing Test before and after exercises that affect fine motor skills. Based on this data, we investigated how metrics such as pressure, velocity, and entropy of touch and stylus input can be used to predict Fine Motor Control.

2025 · Dominik Schön et al. · Motor Impairment, Assistive Input Technologies, Prototyping & User Testing · MobileHCI
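The abstract names pressure, velocity, and entropy as candidate input metrics. As an illustration only — the paper's exact metric definitions are not given in the abstract — here is a minimal sketch of per-stroke velocity and speed-entropy features computed from timestamped touch samples; the function name and binning parameters are hypothetical:

```python
import math

def stroke_metrics(samples, bins=8):
    """Compute the mean velocity and the Shannon entropy (bits) of the
    speed distribution for one touch/stylus stroke.
    samples: list of (x, y, t) tuples in screen units and seconds."""
    speeds = []
    for (x0, y0, t0), (x1, y1, t1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt > 0:
            speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    if not speeds:
        return 0.0, 0.0
    mean_v = sum(speeds) / len(speeds)
    # Bin the speeds into a histogram, then compute Shannon entropy:
    # erratic strokes spread probability mass over more bins.
    top = max(speeds) or 1.0
    hist = [0] * bins
    for v in speeds:
        hist[min(int(v / top * bins), bins - 1)] += 1
    n = len(speeds)
    entropy = -sum((c / n) * math.log2(c / n) for c in hist if c)
    return mean_v, entropy
```

A perfectly uniform stroke yields zero entropy; jittery, fatigued strokes spread speeds across bins and raise it, which is the intuition behind using entropy as an FMC indicator.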
ThermalPen: Investigating the Influence of Thermal Haptic Feedback for Creativity in 3D Sketching

This paper presents ThermalPen, a novel device for 3D sketching that utilizes thermal feedback to allow users to feel the materiality of their sketches. The pen lets users draw using six colours and three textures mapped to different temperatures. Our goal is to investigate the influence of thermal feedback on user creativity in 3D sketching. In a user study with 24 participants, we asked them to draw with and without thermal feedback. Our results show that thermal feedback improved user creativity for specific tasks. Qualitative results also indicate an effect on the user experience. Our work contributes to understanding how thermal feedback can increase user satisfaction with 3D sketching and provides insights and directions for future work.

2024 · Philipp Pascal Hoffmann et al. · Vibrotactile Feedback & Skin Stimulation, Shape-Changing Interfaces & Soft Robotic Materials · C&C
Smooth as Steel Wool: Effects of Visual Stimuli on the Haptic Perception of Roughness in Virtual Reality

Haptic feedback is essential for lifelike Virtual Reality (VR) experiences. To provide a wide range of matching sensations of being touched or stroked, current approaches typically need large numbers of different physical textures. However, even advanced devices can only accommodate a limited number of textures to remain wearable. Therefore, a better understanding is necessary of how expectations elicited by different visualizations affect haptic perception, to strike a balance between physical constraints and a great variety of matching physical textures. In this work, we conducted an experiment (N=31) assessing how the perception of roughness is affected within VR. We designed a prototype for arm stroking and compared the effects of different visualizations on the perception of physical textures with distinct roughnesses. Additionally, we used the visualizations' real-world materials, no haptics, and vibrotactile feedback as baselines. As one result, we found that two levels of roughness can be sufficient to convey a realistic illusion.

2022 · Sebastian Günther et al. · Technical University of Darmstadt · Vibrotactile Feedback & Skin Stimulation, Immersion & Presence Research · CHI
BikeAR: Understanding Cyclists' Crossing Decision-Making at Uncontrolled Intersections using Augmented Reality

Cycling has become increasingly popular as a means of transportation. However, cyclists remain a highly vulnerable group of road users. According to accident reports, some of the most dangerous situations for cyclists are uncontrolled intersections, where cars approach from both directions. To address this issue and assist cyclists in crossing decision-making at uncontrolled intersections, we designed two visualizations that: (1) highlight occluded cars through X-ray vision and (2) depict the remaining time the intersection is safe to cross via a Countdown. To investigate the efficiency of these visualizations, we proposed an Augmented Reality simulation as a novel evaluation method, in which the above visualizations are represented in AR, and conducted a controlled indoor experiment with 24 participants. We found that the X-ray ensures a fast selection of shorter gaps between cars, while the Countdown facilitates a feeling of safety and provides a better intersection overview.

2022 · Andrii Matviienko et al. · Technical University of Darmstadt · External HMI (eHMI) — Communication with Pedestrians & Cyclists, AR Navigation & Context Awareness · CHI
SkyPort: Investigating 3D Teleportation Methods in Virtual Environments

Teleportation has become the de facto standard of locomotion in Virtual Reality (VR) environments. However, teleportation with parabolic and linear target aiming methods is restricted to horizontal 2D planes, and it is unknown how these methods transfer to 3D space. In this paper, we propose six 3D teleportation methods for virtual environments based on the combination of two existing aiming methods (linear and parabolic) and three types of transitioning to a target (instant, interpolated, and continuous). To investigate the performance of the proposed teleportation methods, we conducted a controlled lab experiment (N = 24) with a mid-air coin collection task to assess accuracy, efficiency, and VR sickness. We discovered that the linear aiming method leads to faster and more accurate target selection. Moreover, a combination of linear aiming and instant transitioning leads to the highest efficiency and accuracy without increasing VR sickness.

2022 · Andrii Matviienko et al. · Technical University of Darmstadt · Social & Collaborative VR, Immersion & Presence Research · CHI
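To make the difference between the two aiming methods concrete: linear aiming intersects a straight ray with the scene, while parabolic aiming follows a ballistic arc from the controller. A minimal sketch of parabolic target selection against a ground plane at y = 0, under assumed launch speed, gravity, and step-size parameters (none of which come from the paper):

```python
import math

def parabolic_target(origin, direction, speed=8.0, gravity=9.81,
                     dt=0.02, max_t=5.0):
    """Step a ballistic arc from a controller ray until it crosses the
    ground plane y = 0; returns the landing point or None on timeout.
    origin/direction: 3-tuples; direction need not be normalized."""
    dx, dy, dz = direction
    norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
    vx, vy, vz = (dx / norm * speed, dy / norm * speed, dz / norm * speed)
    x, y, z = origin
    t = 0.0
    while t < max_t:
        nx, ny, nz = x + vx * dt, y + vy * dt, z + vz * dt
        if ny <= 0.0:
            # Interpolate between the last two samples to the exact
            # ground crossing instead of snapping to a step boundary.
            f = y / (y - ny)
            return (x + (nx - x) * f, 0.0, z + (nz - z) * f)
        x, y, z = nx, ny, nz
        vy -= gravity * dt  # gravity only bends the vertical component
        t += dt
    return None
```

For mid-air targets, as studied in the paper, the arc would be intersected with target volumes rather than a ground plane; linear aiming simply omits the gravity term.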
CameraReady: Assessing the Influence of Display Types and Visualizations on Posture Guidance

Computer-supported posture guidance is used in sports, dance training, expression of art through movement, and learning gestures for interaction. At present, the influence of display types and visualizations has not been investigated in the literature. These factors are important as they directly impact perception and cognitive load, and hence influence the performance of participants. In this paper, we conducted a controlled experiment with 20 participants to compare the use of five display types with different screen sizes: smartphones, tablets, desktop monitors, TVs, and large displays. On each device, we compared three common visualizations for posture guidance: skeletons, silhouettes, and 3D body models. To conduct our assessment, we developed a mobile, cross-platform system that requires only a single camera. Our results show that, compared to a smartphone display, larger displays yield a lower error (12%). Regarding the choice of visualization, participants rated 3D body models as significantly more usable than a skeleton visualization.

2021 · Hesham Elsayed et al. · Human Pose & Activity Recognition, Dance & Body Movement Computing · DIS
Itsy-Bits: Fabrication and Recognition of 3D-Printed Tangibles with Small Footprints on Capacitive Touchscreens

Tangibles on capacitive touchscreens are a promising approach to overcome the limited expressiveness of touch input. While research has suggested many approaches to detect tangibles, the corresponding tangibles are either costly or have a considerable minimum size. This makes them bulky and unattractive for many applications. At the same time, they obscure valuable display space for interaction. To address these shortcomings, we contribute Itsy-Bits: a fabrication pipeline for 3D printing and recognition of tangibles on capacitive touchscreens with a footprint as small as a fingertip. Each Itsy-Bit consists of an enclosing 3D object and a unique conductive 2D shape on its bottom. Using only raw data from commodity capacitive touchscreens, Itsy-Bits reliably identifies and locates a variety of shapes in different sizes and estimates their orientation. Through example applications and a technical evaluation, we demonstrate the feasibility and applicability of Itsy-Bits for tangibles with small footprints.

2021 · Martin Schmitz et al. · Technical University of Darmstadt · Circuit Making & Hardware Prototyping, Customizable & Personalized Objects · CHI
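The abstract states that recognition works on the raw capacitance matrix rather than on OS-reported touch points. The paper's actual pipeline is not described here; as a hedged illustration of the general idea, this sketch thresholds a raw matrix and estimates a blob's centroid and principal-axis orientation from image moments (function name and threshold are hypothetical):

```python
import math

def locate_marker(raw, threshold=40):
    """Threshold a raw capacitance matrix (list of rows) and estimate
    the active blob's centroid and orientation from image moments.
    Returns ((cx, cy), angle_radians) or None if nothing is active."""
    pts = [(x, y) for y, row in enumerate(raw)
           for x, v in enumerate(row) if v >= threshold]
    if not pts:
        return None
    n = len(pts)
    cx = sum(x for x, _ in pts) / n
    cy = sum(y for _, y in pts) / n
    # Central second-order moments; their ratio gives the angle of the
    # blob's principal axis, i.e. the tangible's orientation estimate.
    mu20 = sum((x - cx) ** 2 for x, _ in pts) / n
    mu02 = sum((y - cy) ** 2 for _, y in pts) / n
    mu11 = sum((x - cx) * (y - cy) for x, y in pts) / n
    angle = 0.5 * math.atan2(2 * mu11, mu20 - mu02)
    return (cx, cy), angle
```

Identifying *which* shape is present would additionally require matching the thresholded blob against the known set of conductive 2D footprints, which this sketch omits.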
3D-Auth: Two-Factor Authentication with Personalized 3D-Printed Items

Two-factor authentication is a widely recommended security mechanism and is already offered for many services. However, known methods and physical realizations exhibit considerable usability and customization issues. In this paper, we propose 3D-Auth, a new concept for two-factor authentication. 3D-Auth is based on customizable 3D-printed items that combine two authentication factors in one object. The object's bottom contains a uniform grid of conductive dots that are connected to a unique embedded structure inside the item. Based on the interaction with the item, different dots turn into touch points and form an authentication pattern. This pattern can be recognized by a capacitive touchscreen. Based on an expert design study, we present an interaction space with six categories of possible authentication interactions. In a user study, we demonstrate the feasibility of 3D-Auth items and show that the items are easy to use and the interactions are easy to remember.

2020 · Karola Marky et al. · Technische Universität Darmstadt & Keio University · Passwords & Authentication, Customizable & Personalized Objects · CHI
Podoportation: Foot-Based Locomotion in Virtual Reality

Virtual Reality (VR) allows for infinitely large environments. However, the physically traversable space is always limited by real-world boundaries. This discrepancy between physical and virtual dimensions renders traditional locomotion methods used in the real world infeasible. To alleviate these limitations, research has proposed various artificial locomotion concepts such as teleportation, treadmills, and redirected walking. However, these concepts occupy the user's hands, require complex hardware, or need large physical spaces. In this paper, we contribute nine VR locomotion concepts for foot-based locomotion, relying on the 3D position of the user's feet and the pressure applied to the sole as input modalities. We evaluate our concepts and compare them to the state-of-the-art point & teleport technique in a controlled experiment with 20 participants. The results confirm the viability of our approaches for foot-based and engaging locomotion. Further, based on the findings, we contribute a wireless hardware prototype implementation.

2020 · Julius von Willich et al. · Technische Universität Darmstadt · Full-Body Interaction & Embodied Input, Foot & Wrist Interaction · CHI
Improving the Usability and UX of the Swiss Internet Voting Interface

Up to 20% of residential votes and up to 70% of absentee votes in Switzerland are cast online. The Swiss system aims to provide individual verifiability through different verification codes. The voters have to carry out the verification on their own, making the usability and UX of the interface of great importance. To improve the usability, we first performed an evaluation with 12 human-computer interaction experts to uncover usability weaknesses of the Swiss Internet voting interface. Based on the experts' findings, related work, and an exploratory user study with 36 participants, we propose a redesign that we evaluated in a user study with 49 participants. Our study confirmed that the redesign indeed improves the detection of incorrect votes by 33% and increases voters' trust and understanding. Our studies furthermore contribute important lessons for designing verifiable e-voting systems in general.

2020 · Karola Marky et al. · Technische Universität Darmstadt & Keio University · Privacy by Design & User Control, Participatory Design · CHI
Walk The Line: Leveraging Lateral Shifts of the Walking Path as an Input Modality for Head-Mounted Displays

Recent technological advances have made head-mounted displays (HMDs) smaller and untethered, fostering the vision of ubiquitous interaction in a digitally augmented physical world. Consequently, a major part of the interaction with such devices will happen on the go, calling for interaction techniques that allow users to interact while walking. In this paper, we explore lateral shifts of the walking path as a hands-free input modality. The available input options are visualized as lanes on the ground parallel to the user's walking path. Users can select options by shifting the walking path sideways to the respective lane. We contribute the results of a controlled experiment with 18 participants, confirming the viability of our approach for fast, accurate, and joyful interactions. Further, based on the findings of the controlled experiment, we present three example applications.

2020 · Florian Müller et al. · Technische Universität Darmstadt · Full-Body Interaction & Embodied Input, Eye Tracking & Gaze Interaction · CHI
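The lane-selection mechanism described above can be reduced to mapping a signed lateral deviation from the user's baseline path to a lane index. The following sketch is illustrative only; the lane width, dead zone, and baseline placement are assumptions, not values from the paper:

```python
def select_lane(lateral_offset, n_lanes, lane_width=0.4, dead_zone=0.15):
    """Map a lateral deviation from the user's baseline walking path
    (meters, signed, positive = right) to a lane index, or None while
    the user stays inside the dead zone around the original path.
    Lanes are numbered 0..n_lanes-1 from left to right, with the
    baseline assumed to sit between the two middle lanes."""
    if abs(lateral_offset) < dead_zone:
        return None  # normal gait wobble should not trigger a selection
    # Translate the offset into lane units, then clamp to valid indices
    # so walking far sideways still selects the outermost lane.
    idx = int(lateral_offset // lane_width) + n_lanes // 2
    return max(0, min(n_lanes - 1, idx))
```

The dead zone is the interesting design choice: without it, the natural side-to-side sway of walking would constantly produce spurious selections.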
Assessing the Accuracy of Point & Teleport Locomotion with Orientation Indication for Virtual Reality using Curved Trajectories

Room-scale Virtual Reality (VR) systems have arrived in users' homes, where tracked environments are set up in limited physical spaces. As most Virtual Environments (VEs) are larger than the tracked physical space, locomotion techniques are used to navigate in VEs. Currently, in recent VR games, point & teleport is the most popular locomotion technique. However, it only allows users to select the position of the teleportation, not the orientation that the user is facing after the teleport. This results in users having to manually correct their orientation after teleporting and possibly getting entangled in the headset cable. In this paper, we introduce and evaluate three different point & teleport techniques that enable users to specify the target orientation while teleporting. The results show that, although the three teleportation techniques with orientation indication increase the average teleportation time, they lead to a decreased need for correcting the orientation after teleportation.

2019 · Markus Funk et al. · Technische Universität Darmstadt · Head-Up Display (HUD) & Advanced Driver Assistance Systems (ADAS), Eye Tracking & Gaze Interaction, Immersion & Presence Research · CHI
Mind the Tap: Assessing Foot-Taps for Interacting with Head-Mounted Displays

From voice commands and air taps to touch gestures on frames: various techniques for interacting with head-mounted displays (HMDs) have been proposed. While these techniques have both benefits and drawbacks depending on the user's current situation, research on interacting with HMDs has not yet concluded. In this paper, we add to the body of research on interacting with HMDs by exploring foot-tapping as an input modality. Through two controlled experiments with a total of 36 participants, we first explore direct interaction with interfaces that are displayed on the floor and require the user to look down to interact. Second, we investigate indirect interaction with interfaces that, although operated by the user's feet, are always visible as they float in front of the user. Based on the results of the two experiments, we provide design recommendations for direct and indirect foot-based user interfaces.

2019 · Florian Müller et al. · Technische Universität Darmstadt · Foot & Wrist Interaction · CHI
Off-Line Sensing: Memorizing Interactions in Passive 3D-Printed Objects

Embedding sensors into objects allows them to recognize various interactions. However, sensing usually requires active electronics that are often costly, need time to be assembled, and constantly draw power. Thus, we propose off-line sensing: passive 3D-printed sensors that detect one-time interactions, such as accelerating or flipping, but require neither active electronics nor power at the time of the interaction. They memorize a pre-defined interaction via an embedded structure filled with a conductive medium (e.g., a liquid). Whether a sensor was exposed to the interaction can be read out via a capacitive touchscreen. Sensors are printed in a single pass on a consumer-level 3D printer. Through a series of experiments, we show the feasibility of off-line sensing.

2018 · Martin Schmitz et al. · Technische Universität Darmstadt · Desktop 3D Printing & Personal Fabrication, Circuit Making & Hardware Prototyping · CHI
SmartObjects: Sixth Workshop on Interacting with Smart Objects

The emergence of smart objects has the potential to radically change the way we interact with technology. Through embedded means for input and output, such objects allow for more natural and immediate interaction. The SmartObjects workshop will focus on how such embedded intelligence in objects situated in the user's physical environment can be used to provide more efficient and enjoyable interactions. We discuss the design from both the technology and the user experience perspective.

2018 · Florian Müller et al. · TU Darmstadt · Context-Aware Computing, Ubiquitous Computing · CHI