From Pegs to Pixels: A Comparative Analysis of the Nine Hole Peg Test and a Digital Copy Drawing Test for Fine Motor Control Assessment

User interaction with digital systems requires Fine Motor Control (FMC), especially when interfaces are complex or demand high-fidelity, fine-grained interactions. Despite its importance, Fine Motor Control is often overlooked in interactive system design, partly because it is complex to assess. Measuring changes in fine motor abilities due to prolonged use or fatigue currently requires repeated manual testing. This paper analyzes the concept of using a mobile device's input behavior to assess the user's Fine Motor Control. We show that Fine Motor Control can be assessed for touch- and stylus-based interaction with a digital mobile system. We conducted a user study in which participants performed a Nine Hole Peg Test and a predefined Copy Drawing Test before and after exercises that affect fine motor skills. Based on this data, we investigated how metrics such as pressure, velocity, and entropy of touch and stylus input can be used to predict Fine Motor Control.

2025 · Dominik Schön et al. · Motor Impairment · Assistive Input Technologies · Prototyping & User Testing · MobileHCI
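As a rough illustration of the kind of input metrics named in the abstract above, the sketch below computes per-sample speeds from touch samples and the Shannon entropy of their distribution. The (x, y, t) sample format, bin count, and function names are assumptions chosen for illustration, not the paper's actual implementation:

```python
import math
from collections import Counter

def stroke_velocities(points):
    """Per-sample speeds from a list of (x, y, t) touch samples
    (hypothetical format: pixels and seconds)."""
    speeds = []
    for (x0, y0, t0), (x1, y1, t1) in zip(points, points[1:]):
        dt = t1 - t0
        if dt > 0:
            speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    return speeds

def shannon_entropy(values, bins=16):
    """Shannon entropy (in bits) of values discretized into
    equal-width bins; 0 means a perfectly regular signal."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0  # avoid division by zero
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

A perfectly steady stroke yields zero entropy, while jittery, fatigued input spreads speeds across many bins and raises the entropy, which is one plausible way such a metric could track fine motor changes.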
A Qualitative Investigation of User Transitions and Frictions in Cross-Reality Applications

Research in Augmented Reality (AR) and Virtual Reality (VR) has mostly considered the two in isolation. Yet, when used together in practical settings, AR and VR each offer unique strengths, necessitating multiple transitions to harness their advantages. This paper investigates potential challenges in Cross-Reality (CR) transitions to inform future application design. We implemented a CR system featuring a 3D modeling task that requires users to switch between PC, AR, and VR. Using a talk-aloud study (n=12) and thematic analysis, we revealed that frictions primarily arose when transitions conflicted with users' Spatial Mental Model (SMM). Furthermore, we found five transition archetypes employed to enhance productivity once an SMM was established. Our findings show that transitions must focus on establishing and upholding users' SMM across realities by communicating the differences between them.

2025 · Julius von Willich et al. · TU Darmstadt, Telecooperation Lab · Mixed Reality Workspaces · Immersion & Presence Research · CHI
ThermalPen: Investigating the Influence of Thermal Haptic Feedback for Creativity in 3D Sketching

This paper presents ThermalPen, a novel device for 3D sketching that utilizes thermal feedback to allow users to feel the materiality of their sketches. The pen lets users draw with six colours and three textures mapped to different temperatures. Our goal is to investigate the influence of thermal feedback on user creativity in 3D sketching. In a user study with 24 participants, we asked them to draw with and without thermal feedback. Our results show that thermal feedback improved user creativity for specific tasks. Qualitative results also indicate an effect on the user experience. Our work contributes to understanding how thermal feedback can increase user satisfaction with 3D sketching and provides insights and directions for future work.

2024 · Philipp Pascal Hoffmann et al. · Vibrotactile Feedback & Skin Stimulation · Shape-Changing Interfaces & Soft Robotic Materials · C&C
Decide Yourself or Delegate - User Preferences Regarding the Autonomy of Personal Privacy Assistants in Private IoT-Equipped Environments

Personalized privacy assistants (PPAs) communicate privacy-related decisions of their users to Internet of Things (IoT) devices. There are different ways to implement PPAs by varying the degree of autonomy or the decision model. This paper investigates user perceptions of PPA autonomy models and privacy profiles - archetypes of individual privacy needs - as a basis for PPA decisions in private environments (e.g., a friend's home). We first explore how privacy profiles can be assigned to users and propose an assignment method. Next, we investigate user perceptions in 18 usage scenarios with varying contexts, data types, and numbers of decisions in a study with 1126 participants. We found considerable differences between the profiles in settings with few decisions. When the number of decisions was high (> 1/h), participants exclusively preferred fully autonomous PPAs. Finally, we discuss implications and recommendations for designing scalable PPAs that serve as privacy interfaces for future IoT devices.

2024 · Karola Marky et al. · Ruhr-University Bochum · Privacy by Design & User Control · Privacy Perception & Decision-Making · IoT Device Privacy · CHI
Tailor Twist: Assessing Rotational Mid-Air Interactions for Augmented Reality

Mid-air gestures, widely used in today's Augmented Reality (AR) applications, are prone to the "gorilla arm" effect, leading to discomfort during prolonged interactions. While prior work has proposed metrics to quantify this effect and means to improve comfort and ergonomics, these works usually consider only simplistic, one-dimensional AR interactions, such as reaching for a point or pushing a button. However, interacting with AR environments also involves far more complex tasks, such as turning rotational knobs, potentially impacting ergonomics. This paper advances the understanding of the ergonomics of rotational mid-air interactions in AR. For this, we contribute the results of a controlled experiment exposing participants to a rotational task in the interaction space defined by their arms' reach. Based on the results, we discuss how novel future mid-air gesture modalities can benefit from our findings on ergonomics-aware rotational interaction.

2023 · Dominik Schön et al. · Technical University of Darmstadt · Full-Body Interaction & Embodied Input · AR Navigation & Context Awareness · CHI
Smooth as Steel Wool: Effects of Visual Stimuli on the Haptic Perception of Roughness in Virtual Reality

Haptic feedback is essential for lifelike Virtual Reality (VR) experiences. To provide a wide range of matching sensations of being touched or stroked, current approaches typically require large numbers of different physical textures. However, even advanced devices can only accommodate a limited number of textures to remain wearable. Therefore, a better understanding of how expectations elicited by different visualizations affect haptic perception is necessary to balance physical constraints against a great variety of matching textures. In this work, we conducted an experiment (N=31) assessing how the perception of roughness is affected within VR. We designed a prototype for arm stroking and compared the effects of different visualizations on the perception of physical textures with distinct roughnesses. Additionally, we used the visualizations' real-world materials, no haptic feedback, and vibrotactile feedback as baselines. As one result, we found that two levels of roughness can be sufficient to convey a realistic illusion.

2022 · Sebastian Günther et al. · Technical University of Darmstadt · Vibrotactile Feedback & Skin Stimulation · Immersion & Presence Research · CHI
BikeAR: Understanding Cyclists' Crossing Decision-Making at Uncontrolled Intersections using Augmented Reality

Cycling has become increasingly popular as a means of transportation. However, cyclists remain a highly vulnerable group of road users. According to accident reports, one of the most dangerous situations for cyclists is crossing uncontrolled intersections, where cars approach from both directions. To address this issue and assist cyclists' crossing decision-making at uncontrolled intersections, we designed two visualizations that (1) highlight occluded cars through X-ray vision and (2) depict the remaining time the intersection is safe to cross via a Countdown. To investigate the efficiency of these visualizations, we proposed an Augmented Reality simulation as a novel evaluation method, in which the above visualizations are rendered in AR, and conducted a controlled indoor experiment with 24 participants. We found that the X-ray ensures fast selection of shorter gaps between cars, while the Countdown facilitates a feeling of safety and provides a better overview of the intersection.

2022 · Andrii Matviienko et al. · Technical University of Darmstadt · External HMI (eHMI) — Communication with Pedestrians & Cyclists · AR Navigation & Context Awareness · CHI
SkyPort: Investigating 3D Teleportation Methods in Virtual Environments

Teleportation has become the de facto standard of locomotion in Virtual Reality (VR) environments. However, teleportation with parabolic and linear target-aiming methods is restricted to horizontal 2D planes, and it is unknown how these methods transfer to 3D space. In this paper, we propose six 3D teleportation methods for virtual environments based on the combination of two existing aiming methods (linear and parabolic) and three types of transitioning to a target (instant, interpolated, and continuous). To investigate the performance of the proposed teleportation methods, we conducted a controlled lab experiment (N = 24) with a mid-air coin collection task to assess accuracy, efficiency, and VR sickness. We discovered that the linear aiming method leads to faster and more accurate target selection. Moreover, the combination of linear aiming and instant transitioning leads to the highest efficiency and accuracy without increasing VR sickness.

2022 · Andrii Matviienko et al. · Technical University of Darmstadt · Social & Collaborative VR · Immersion & Presence Research · CHI
Reducing Virtual Reality Sickness for Cyclists in VR Bicycle Simulators

Virtual Reality (VR) bicycle simulations aim to recreate the feeling of riding a bicycle and are commonly used in many application areas. However, current solutions still create mismatches between the visuals and physical movement, which cause VR sickness and diminish the cycling experience. To reduce VR sickness in bicycle simulators, we conducted two controlled lab experiments addressing two main causes of VR sickness: (1) steering methods and (2) cycling trajectory. In the first experiment (N = 18) we compared handlebar, HMD, and upper-body steering methods. In the second experiment (N = 24) we explored three types of movement in VR (1D, 2D, and 3D trajectories) and three countermeasures (airflow, vibration, and dynamic Field-of-View) to reduce VR sickness. We found that handlebar steering leads to the lowest VR sickness without decreasing cycling performance, and airflow appears to be the most promising method to reduce VR sickness for all three types of trajectories.

2022 · Andrii Matviienko et al. · Technical University of Darmstadt · Motion Sickness & Passenger Experience · Micromobility (E-bike, E-scooter) Interaction · Immersion & Presence Research · CHI
CameraReady: Assessing the Influence of Display Types and Visualizations on Posture Guidance

Computer-supported posture guidance is used in sports, dance training, artistic expression through movement, and learning gestures for interaction. At present, the influence of display types and visualizations has not been investigated in the literature. These factors are important as they directly impact perception and cognitive load, and hence influence participant performance. In this paper, we report a controlled experiment with 20 participants comparing five display types with different screen sizes: smartphones, tablets, desktop monitors, TVs, and large displays. On each device, we compared three common visualizations for posture guidance: skeletons, silhouettes, and 3D body models. To conduct our assessment, we developed a mobile, cross-platform system that requires only a single camera. Our results show that, compared to a smartphone display, larger displays yield a lower error (12%). Regarding the choice of visualization, participants rated 3D body models as significantly more usable than a skeleton visualization.

2021 · Hesham Elsayed et al. · Human Pose & Activity Recognition · Dance & Body Movement Computing · DIS
Let’s Frets! Assisting Guitar Students during Practice via Capacitive Sensing

Learning a musical instrument requires regular exercise. However, students are often on their own during practice sessions due to the limited time with their teachers, which increases the likelihood of mislearning playing techniques. To address this issue, we present Let's Frets - a modular guitar learning system that provides visual indicators and captures finger positions on a 3D-printed capacitive guitar fretboard. We based the design of Let's Frets on requirements collected through in-depth interviews with professional guitarists and teachers. In a user study (N=24), we evaluated the feedback modules of Let's Frets against fretboard charts. Our results show that visual indicators require the least time to realize new finger positions, while a combination of visual indicators and position capturing yielded the highest playing accuracy. We conclude by discussing how Let's Frets enables independent practice sessions and how it can be translated to other musical instruments.

2021 · Karola Marky et al. · Technische Universität Darmstadt · Electrical Muscle Stimulation (EMS) · Haptic Wearables · Programming Education & Computational Thinking · CHI
Oh, Snap! A Fabrication Pipeline to Magnetically Connect Conventional and 3D-Printed Electronics

3D printing has revolutionized rapid prototyping by speeding up the creation of custom-shaped objects. With the rise of multi-material 3D printers, these custom-shaped objects can now be made interactive in a single pass through passive conductive structures. However, connecting conventional electronics to these conductive structures often still requires time-consuming manual assembly involving many wires, soldering, or gluing. To alleviate these shortcomings, we propose Oh, Snap!: a fabrication pipeline and interfacing concept to magnetically connect a 3D-printed object equipped with passive sensing structures to conventional sensing electronics. To this end, Oh, Snap! utilizes ferromagnetic and conductive 3D-printed structures, printable in a single pass on standard printers. We further present a proof-of-concept capacitive sensing board that enables easy and robust magnetic assembly to quickly create interactive 3D-printed objects. We evaluate Oh, Snap! by assessing the robustness and quality of the connection and demonstrate its broad applicability through a series of example applications.

2021 · Martin Schmitz et al. · Technical University of Darmstadt · Desktop 3D Printing & Personal Fabrication · Circuit Making & Hardware Prototyping · CHI
Itsy-Bits: Fabrication and Recognition of 3D-Printed Tangibles with Small Footprints on Capacitive Touchscreens

Tangibles on capacitive touchscreens are a promising approach to overcome the limited expressiveness of touch input. While research has suggested many approaches to detect tangibles, the corresponding tangibles are either costly or have a considerable minimum size. This makes them bulky and unattractive for many applications. At the same time, they obscure valuable display space for interaction. To address these shortcomings, we contribute Itsy-Bits: a fabrication pipeline for 3D printing and recognition of tangibles on capacitive touchscreens with a footprint as small as a fingertip. Each Itsy-Bit consists of an enclosing 3D object and a unique conductive 2D shape on its bottom. Using only raw data from commodity capacitive touchscreens, Itsy-Bits reliably identifies and locates a variety of shapes in different sizes and estimates their orientation. Through example applications and a technical evaluation, we demonstrate the feasibility and applicability of Itsy-Bits for tangibles with small footprints.

2021 · Martin Schmitz et al. · Technical University of Darmstadt · Circuit Making & Hardware Prototyping · Customizable & Personalized Objects · CHI
3D-Auth: Two-Factor Authentication with Personalized 3D-Printed Items

Two-factor authentication is a widely recommended security mechanism and is already offered for many services. However, known methods and physical realizations exhibit considerable usability and customization issues. In this paper, we propose 3D-Auth, a new concept for two-factor authentication. 3D-Auth is based on customizable 3D-printed items that combine two authentication factors in one object. The bottom of the object contains a uniform grid of conductive dots that are connected to a unique embedded structure inside the item. Based on the interaction with the item, different dots turn into touch-points and form an authentication pattern that can be recognized by a capacitive touchscreen. Based on an expert design study, we present an interaction space with six categories of possible authentication interactions. In a user study, we demonstrate the feasibility of 3D-Auth items and show that the items are easy to use and the interactions are easy to remember.

2020 · Karola Marky et al. · Technische Universität Darmstadt & Keio University · Passwords & Authentication · Customizable & Personalized Objects · CHI
Podoportation: Foot-Based Locomotion in Virtual Reality

Virtual Reality (VR) allows for infinitely large environments. However, the physically traversable space is always limited by real-world boundaries. This discrepancy between physical and virtual dimensions renders traditional real-world locomotion methods infeasible. To alleviate these limitations, research has proposed various artificial locomotion concepts such as teleportation, treadmills, and redirected walking. However, these concepts occupy the user's hands, require complex hardware, or demand large physical spaces. In this paper, we contribute nine foot-based VR locomotion concepts, relying on the 3D position of the user's feet and the pressure applied to the soles as input modalities. We evaluate our concepts and compare them to the state-of-the-art point & teleport technique in a controlled experiment with 20 participants. The results confirm the viability of our approaches for foot-based and engaging locomotion. Further, based on the findings, we contribute a wireless hardware prototype implementation.

2020 · Julius von Willich et al. · Technische Universität Darmstadt · Full-Body Interaction & Embodied Input · Foot & Wrist Interaction · CHI
Improving the Usability and UX of the Swiss Internet Voting Interface

Up to 20% of residential votes and up to 70% of absentee votes in Switzerland are cast online. The Swiss system aims to provide individual verifiability via different verification codes. Voters have to carry out verification on their own, making the usability and UX of the interface highly important. To improve usability, we first performed an evaluation with 12 human-computer interaction experts to uncover usability weaknesses of the Swiss Internet voting interface. Based on the experts' findings, related work, and an exploratory user study with 36 participants, we propose a redesign that we evaluated in a user study with 49 participants. Our study confirmed that the redesign indeed improves the detection of incorrect votes by 33% and increases voters' trust and understanding. Our studies furthermore contribute important lessons for designing verifiable e-voting systems in general.

2020 · Karola Marky et al. · Technische Universität Darmstadt & Keio University · Privacy by Design & User Control · Participatory Design · CHI
Walk The Line: Leveraging Lateral Shifts of the Walking Path as an Input Modality for Head-Mounted Displays

Recent technological advances have made head-mounted displays (HMDs) smaller and untethered, fostering the vision of ubiquitous interaction in a digitally augmented physical world. Consequently, a major part of the interaction with such devices will happen on the go, calling for interaction techniques that allow users to interact while walking. In this paper, we explore lateral shifts of the walking path as a hands-free input modality. The available input options are visualized as lanes on the ground parallel to the user's walking path. Users can select options by shifting the walking path sideways to the respective lane. We contribute the results of a controlled experiment with 18 participants, confirming the viability of our approach for fast, accurate, and joyful interactions. Further, based on the findings of the controlled experiment, we present three example applications.

2020 · Florian Müller et al. · Technische Universität Darmstadt · Full-Body Interaction & Embodied Input · Eye Tracking & Gaze Interaction · CHI
./trilaterate: A Fabrication Pipeline to Design and 3D Print Hover-, Touch-, and Force-Sensitive Objects

Hover, touch, and force are promising input modalities that are increasingly integrated into screens and everyday objects. However, these interactions are often limited to flat surfaces, and the integration of suitable sensors is time-consuming and costly. To alleviate these limitations, we contribute Trilaterate: a fabrication pipeline to 3D print custom objects that detect the 3D position of a finger hovering over, touching, or applying force to them by combining multiple capacitance measurements via capacitive trilateration. Trilaterate places and routes actively shielded sensors inside the object and operates on consumer-level 3D printers. We present technical evaluations and example applications that validate and demonstrate the wide applicability of Trilaterate.

2019 · Martin Schmitz et al. · Technische Universität Darmstadt · Shape-Changing Interfaces & Soft Robotic Materials · Circuit Making & Hardware Prototyping · CHI
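The trilateration idea behind the entry above, recovering a point's 3D position from its distances to several known sensor positions, can be sketched as a small least-squares problem. This is a minimal illustration that assumes capacitance readings have already been converted to distances; it is not the paper's actual pipeline:

```python
import numpy as np

def trilaterate(sensors, distances):
    """Estimate a 3D point from its distances to >= 4 known sensor
    positions. Subtracting the first sphere equation
    |x - p_i|^2 = d_i^2 from the others yields a linear system
    A x = b, solved here by least squares."""
    sensors = np.asarray(sensors, dtype=float)
    distances = np.asarray(distances, dtype=float)
    p0, d0 = sensors[0], distances[0]
    A = 2.0 * (sensors[1:] - p0)
    b = (np.sum(sensors[1:] ** 2, axis=1) - np.sum(p0 ** 2)
         - distances[1:] ** 2 + d0 ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Hypothetical layout: four sensors at the corners of a unit tetrahedron
sensors = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
target = np.array([0.3, 0.4, 0.2])
dists = [np.linalg.norm(np.array(s) - target) for s in sensors]
print(np.round(trilaterate(sensors, dists), 3))  # → [0.3 0.4 0.2]
```

With exact distances and four non-coplanar sensors the linear system has a unique solution; with noisy capacitance-derived distances and more sensors, the same least-squares formulation averages the error out.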
Assessing the Accuracy of Point & Teleport Locomotion with Orientation Indication for Virtual Reality using Curved Trajectories

Room-scale Virtual Reality (VR) systems have arrived in users' homes, where tracked environments are set up in limited physical spaces. As most Virtual Environments (VEs) are larger than the tracked physical space, locomotion techniques are used to navigate VEs. In recent VR games, point & teleport is currently the most popular locomotion technique. However, it only allows users to select the position of the teleportation, not the orientation the user faces after the teleport. As a result, users have to manually correct their orientation after teleporting and may get entangled in the headset cable. In this paper, we introduce and evaluate three point & teleport techniques that enable users to specify the target orientation while teleporting. The results show that, although the three teleportation techniques with orientation indication increase the average teleportation time, they decrease the need to correct the orientation after teleportation.

2019 · Markus Funk et al. · Technische Universität Darmstadt · Head-Up Display (HUD) & Advanced Driver Assistance Systems (ADAS) · Eye Tracking & Gaze Interaction · Immersion & Presence Research · CHI
Mind the Tap: Assessing Foot-Taps for Interacting with Head-Mounted Displays

From voice commands and air taps to touch gestures on frames: various techniques for interacting with head-mounted displays (HMDs) have been proposed. While these techniques have both benefits and drawbacks depending on the user's current situation, research on interacting with HMDs has not concluded yet. In this paper, we add to the body of research on interacting with HMDs by exploring foot-tapping as an input modality. Through two controlled experiments with a total of 36 participants, we first explore direct interaction with interfaces that are displayed on the floor and require the user to look down to interact. Second, we investigate indirect interaction with interfaces that, although operated by the user's feet, are always visible, floating in front of the user. Based on the results of the two experiments, we provide design recommendations for direct and indirect foot-based user interfaces.

2019 · Florian Müller et al. · Technische Universität Darmstadt · Foot & Wrist Interaction · CHI