IrOnTex: Using Ironable 3D Printed Objects to Fabricate and Prototype Customizable Interactive Textiles
Yu et al. propose IrOnTex, which uses ironable 3D printed objects to fabricate customizable interactive textiles, enabling rapid prototyping.
2024 · Jiakun Yu et al. · Desktop 3D Printing & Personal Fabrication; Textile Art & Craft Digitization · UbiComp

Classroom Dandelions: Visualising Participants' Position, Trajectories and Body Orientation Augments Teachers' Sensemaking
Despite the digital revolution, physical space remains the site for teaching and learning embodied knowledge and skills. Both teachers and students must develop spatial competencies to use classroom spaces effectively, enabling fluid verbal and non-verbal interaction. While video permits rich activity capture, it provides no support for quickly seeing activity patterns that can assist learning. In contrast, position tracking systems permit the automated modelling of spatial behaviour, opening new possibilities for feedback. This paper introduces the design rationale for "Dandelion Diagrams", which integrate participant location, trajectory and body orientation over a variable period. Applied in two authentic teaching contexts (a science laboratory and a nursing simulation), we show how heatmaps showing only teacher/student location led to misinterpretations that were resolved by overlaying Dandelion Diagrams. Teachers also identified a variety of ways the diagrams could aid professional development. We conclude that Dandelion Diagrams assisted sensemaking, but discuss the ethical risks of over-interpretation.
2022 · Gloria Fernandez-Nieto et al. · University of Technology Sydney · Visualization Perception & Cognition; Collaborative Learning & Peer Teaching; User Research Methods (Interviews, Surveys, Observation) · CHI

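For illustration, a minimal matplotlib sketch of the idea behind a Dandelion-style glyph: a trajectory line with one orientation "petal" per tracked sample. The coordinates and values are hypothetical, not the paper's data format or rendering.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical tracking samples for one participant: x, y in metres and a
# body-orientation angle in radians (invented for illustration).
xs = np.array([1.0, 1.4, 2.1, 2.6, 3.0])
ys = np.array([2.0, 2.3, 2.2, 2.6, 3.1])
theta = np.array([0.1, 0.4, 0.2, 0.8, 1.1])

# Trajectory line plus one orientation "petal" per sample; overlaying many
# such petals around visited positions yields the dandelion look.
plt.plot(xs, ys, color="grey", linewidth=1)
plt.quiver(xs, ys, np.cos(theta), np.sin(theta),
           angles="xy", scale=20, width=0.004, color="tab:green")
plt.gca().set_aspect("equal")
plt.title("Dandelion-style glyph: position, trajectory, orientation")
plt.savefig("dandelion.png")
```
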
HapBead: On-Skin Microfluidic Haptic Interface using Tunable Bead
On-skin haptic interfaces built from thin, flexible soft elastomers have improved significantly in recent years. Many focus on vibrotactile feedback, which requires complicated parameter tuning. Another approach uses mechanical forces, created via piezoelectric devices and other methods, for non-vibratory haptic sensations such as stretching and twisting; these are often bulky, and their electronic components and drivers are complicated, with limited control of timing and precision. This paper proposes HapBead, a new on-skin haptic interface that renders vibration-like tactile feedback using microfluidics. HapBead leverages a microfluidic channel to precisely and agilely oscillate a small bead via liquid flow, generating various motion patterns in the channel that create highly tunable haptic sensations on the skin. We developed a proof-of-concept design to implement the thin, flexible and affordable HapBead platform, and verified its haptic rendering capabilities by attaching it to users' fingertips. A study confirmed that participants could accurately distinguish six different haptic patterns rendered by HapBead. HapBead enables new wearable display applications with multiple integrated functionalities, such as on-skin haptic doodles, visuo-haptic displays and haptic illusions.
2020 · Teng Han et al. · Institute of Software, Chinese Academy of Sciences & University of Chinese Academy of Sciences · Vibrotactile Feedback & Skin Stimulation; Haptic Wearables; Biosensors & Physiological Monitoring · CHI

G-ID: Identifying 3D Prints Using Slicing Parameters
We present G-ID, a method that utilizes the subtle patterns left by the 3D printing process to distinguish and identify objects that otherwise look similar to the human eye. The key idea is to mark different instances of a 3D model by varying slicing parameters that do not change the model geometry but can be detected as machine-readable differences in the print. As a result, G-ID does not add anything to the object but exploits the patterns appearing as a by-product of slicing, an essential step of the 3D printing pipeline. We introduce the G-ID slicing and labeling interface that varies the settings for each instance, and the G-ID mobile app, which uses image processing techniques to retrieve the parameters and their associated labels from a photo of the 3D printed object. Finally, we evaluate our method's accuracy under different lighting conditions, when objects were printed with different filaments and printers, and with pictures taken from various positions and angles.
2020 · Mustafa Doga Dogan et al. · Massachusetts Institute of Technology · Circuit Making & Hardware Prototyping · CHI

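To make the encoding idea concrete, here is a minimal sketch of marking instances by enumerating combinations of slicing parameters. The parameter names and ranges are hypothetical stand-ins, not G-ID's actual parameter set, and the image-processing detection side is omitted.

```python
from itertools import product

# Hypothetical slicing parameters that alter surface texture but not geometry.
INFILL_ANGLES = [0, 30, 60, 90]          # degrees
LINE_WIDTHS = [0.35, 0.40, 0.45]         # mm
INITIAL_LAYER_PATTERNS = ["lines", "concentric", "zigzag"]

# Enumerate every distinguishable combination and assign it an integer ID.
CODEBOOK = list(product(INFILL_ANGLES, LINE_WIDTHS, INITIAL_LAYER_PATTERNS))

def encode(instance_id: int) -> dict:
    """Map an instance ID to the slicer settings that mark it."""
    angle, width, pattern = CODEBOOK[instance_id]
    return {"infill_angle": angle, "line_width": width,
            "initial_layer_pattern": pattern}

def decode(angle: int, width: float, pattern: str) -> int:
    """Recover the instance ID from parameters measured off a photo."""
    return CODEBOOK.index((angle, width, pattern))

print(len(CODEBOOK), "distinguishable instances")  # 4 * 3 * 3 = 36
print(encode(7))
```
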
Drift-Correction Techniques for Scale-Adaptive VR Navigation
Scale-adaptive techniques for VR navigation enable users to navigate spaces larger than the real space available, while allowing precise interaction when required. However, because these techniques gradually scale displacements as the user moves, they introduce a Drift effect: a user returning to the same point in VR will not return to the same point in the real space. This mismatch between the real and virtual spaces can grow over time and render the techniques unusable (i.e., users cannot reach their target locations). In this paper, we characterise and analyse the effects of Drift, highlighting its potential detrimental effects. We then propose two techniques to correct Drift and use a data-driven approach (using navigation data from real users with a specific scale-adaptive technique) to tune them, compare their performance and choose an optimum correction technique and configuration. Our user study, applying our technique in a different environment and with two different scale-adaptive navigation techniques, shows that our correction technique can significantly reduce Drift effects and extend the lifespan of the navigation techniques (i.e., the time they can be used before Drift renders targets unreachable), while not hindering users' experience.
2019 · Roberto A. Montano-Murillo et al. · AR Navigation & Context Awareness; Immersion & Presence Research · UIST

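A minimal simulation of how Drift accumulates, assuming a hypothetical scale-adaptive policy (the paper's actual techniques, data, and correction methods differ). It only illustrates how gradually scaled displacements make the real and virtual positions diverge over time.

```python
import numpy as np

def scale_factor(pos_virtual):
    # Hypothetical policy: amplify motion far from the workspace centre.
    return 1.0 + 0.5 * min(np.linalg.norm(pos_virtual) / 5.0, 1.0)

real = np.zeros(2)     # tracked position in the room (m)
virtual = np.zeros(2)  # position in VR (m)

rng = np.random.default_rng(0)
for _ in range(1000):
    step = rng.normal(0, 0.01, 2)             # real-world displacement
    real += step
    virtual += scale_factor(virtual) * step   # amplified displacement in VR

# Because displacements were scaled non-uniformly along the way, the real
# and virtual positions no longer correspond: this mismatch is the Drift.
print(f"real/virtual mismatch: {np.linalg.norm(virtual - real):.2f} m")
```
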
VARI-SOUND: A Varifocal Lens for Sound
Centuries of development in optics have given us passive devices (i.e., lenses, mirrors and filters) to enrich audience immersion with light effects, but there is nothing similar for sound. Beam-forming in concert halls and outdoor gigs still requires a large number of speakers, while headphones remain the state of the art for personalized audio immersion in VR. In this work, we show how 3D printed acoustic meta-surfaces, assembled into the equivalent of optical systems, may offer a different solution. We demonstrate how to build them and how to use simple design tools, like the thin-lens equation, also for sound. We present some key acoustic devices, like a "collimator", to transform a standard computer speaker into an acoustic "spotlight", and a "magnifying glass", to create sound sources that appear to come from locations distinct from the speaker itself. Finally, we demonstrate an acoustic varifocal lens, discussing applications equivalent to auto-focus cameras and VR headsets, and the limitations of the technology.
2019 · Gianluca Memoli et al. · University of Sussex · Shape-Changing Interfaces & Soft Robotic Materials; Music Composition & Sound Design Tools · CHI

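For reference, the thin-lens equation the authors carry over from optics is the standard one; applied to an acoustic lens, s_o is the source-to-lens distance (speaker to meta-surface), s_i the lens-to-image distance (where the focused sound appears), and f the focal length.

```latex
% Thin-lens equation, used identically for an acoustic lens:
%   s_o = source-to-lens distance, s_i = lens-to-image distance,
%   f   = focal length of the lens
\[ \frac{1}{s_o} + \frac{1}{s_i} = \frac{1}{f} \]
```
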
LeviProps: Animating Levitated Optimized Fabric Structures using Holographic Acoustic Tweezers
LeviProps are tangible structures used to create interactive mid-air experiences. They are composed of an acoustically transparent, lightweight piece of fabric and attached beads that act as levitated anchors. This combination enables real-time six-degrees-of-freedom control of levitated structures that are larger and more diverse than those possible with previous acoustic approaches. LeviProps can be used as free-form interactive elements and also as projection surfaces. We developed an authoring tool to support the creation of LeviProps. Our tool uses the outline of the prop and the user's constraints to compute the optimum locations for the anchors (i.e., maximizing trapping forces), increasing prop stability and maximum size. The tool produces a final LeviProp design, which can be fabricated following a simple procedure. This paper evaluates our approach and showcases example applications such as interactive storytelling, games and mid-air displays.
2019 · Rafael Morales González et al. · Mid-Air Haptics (Ultrasonic); Shape-Changing Interfaces & Soft Robotic Materials; Digital Art Installations & Interactive Performance · UIST

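As a structural illustration of the anchor-placement search, a sketch that picks anchor sites from candidate points on a prop outline. The objective used here (maximise the minimum pairwise anchor distance) is a stand-in chosen for simplicity, not the paper's trapping-force optimisation.

```python
import itertools
import math

# Candidate anchor sites sampled along a hypothetical rectangular prop
# outline (units arbitrary).
candidates = [(x, y) for x in range(0, 11, 2) for y in (0, 6)]

def min_pairwise_dist(points):
    """Stand-in stability score: how spread out the chosen anchors are."""
    return min(math.dist(a, b) for a, b in itertools.combinations(points, 2))

# Exhaustively search all placements of 4 anchors and keep the best one.
best = max(itertools.combinations(candidates, 4), key=min_pairwise_dist)
print("chosen anchor sites:", best)
```
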
PickCells: A Physically Reconfigurable Cell-composed Touchscreen
Touchscreens are the predominant medium for interaction with digital services; however, their current fixed form factor narrows the scope for rich physical interaction by limiting interaction possibilities to a single, planar surface. In this paper we introduce PickCells, a fully reconfigurable device concept composed of cells that breaks the mould of rigid screens and explores a modular system affording rich sets of tangible interactions and novel across-device relationships. Through a series of co-design activities -- involving HCI experts and potential end-users of such systems -- we synthesised a design space aimed at inspiring future research, giving researchers and designers a framework in which to explore modular screen interactions. The design space we propose unifies existing work on modular touch surfaces under a general framework and broadens horizons by opening up unexplored spaces that provide new interaction possibilities. In this paper, we present the PickCells concept and a design space of modular touch surfaces, and propose a toolkit for quick scenario prototyping.
2019 · Alix Goguey et al. · Swansea University · Shape-Changing Interfaces & Soft Robotic Materials; Customizable & Personalized Objects · CHI

Sampling Strategy for Ultrasonic Mid-Air Haptics
Mid-air tactile stimulation using ultrasound has been used in a variety of human-computer interfaces, in prototypes as well as products. When generating tactile patterns with mid-air ultrasonic displays, the common approach has been to sample the patterns at the full update rate the hardware supports. In this study we show that the hardware update rate can impact perception, but, unexpectedly, we find that higher update rates do not improve pattern perception. In a first user study, we highlight the effect of update rate on the perceived strength of a pattern, especially for patterns rendered at slow rates of less than 10 Hz. In a second user study, we identify how the optimal update rate evolves with variations in pattern size. Our main results show that update rate should be treated as an additional parameter of tactile patterns. We also discuss how the relationships defined in this study could be built into design tools, so that designers are shielded from this additional complexity.
2019 · William Frier et al. · University of Sussex · Mid-Air Haptics (Ultrasonic) · CHI

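A small sketch of what "sampling a pattern at an update rate" means in practice: the same circular tactile pattern yields very different focal-point sequences at different update rates. The helper name and numbers are illustrative, not the study's apparatus.

```python
import numpy as np

def sample_circle(radius_m, draw_rate_hz, update_rate_hz, duration_s=1.0):
    """Focal-point positions for a circle traced draw_rate_hz times per
    second, sampled at the device update rate."""
    t = np.arange(0, duration_s, 1.0 / update_rate_hz)
    phase = 2 * np.pi * draw_rate_hz * t
    return np.stack([radius_m * np.cos(phase),
                     radius_m * np.sin(phase)], axis=1)

# The same 5 Hz circle sampled at two different update rates:
coarse = sample_circle(0.02, draw_rate_hz=5, update_rate_hz=200)
fine = sample_circle(0.02, draw_rate_hz=5, update_rate_hz=40_000)
print(coarse.shape, fine.shape)  # (200, 2) vs (40000, 2)
```
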
Collaborating Around Digital Tabletops: Children's Physical Strategies From India, The UK And Finland
We present a study of children collaborating around interactive tabletops in three different countries: India, the United Kingdom and Finland. Our data highlights the key distinctive physical strategies used by children when performing collaborative tasks during this study. Children in India employ dynamic positioning with frequent physical contact and simultaneous object movement. Children in the UK tend to prefer static positioning with minimal physical contact and simultaneous object movement. Children in Finland use a mixture of dynamic and static positioning with minimal physical contact and object movement. Our findings indicate the importance of understanding collaboration strategies and behaviours when designing and deploying interactive tabletops in heterogeneous educational environments. We conclude with a discussion of how designers of tabletops for schools can provide opportunities for children in different countries to define and shape their own collaboration strategies for small-group learning, taking into account their different classroom practices.
2018 · Izdihar Jamil et al. · University of Bristol · K-12 Digital Education Tools; Collaborative Learning & Peer Teaching · CHI

Beyond the Libet Clock: Modality Variants for Agency Measurements
The Sense of Agency (SoA) refers to our capability to control our own actions and, through them, influence the world around us. Recent research in HCI has investigated SoA to provide users with an instinctive sense of "I did that" as opposed to "the system did that". However, current agency measurements are limited. The Intentional Binding (IB) paradigm provides an implicit measure of the SoA; however, it is constrained by requiring high visual attention to a "Libet clock" on-screen. In this paper, we extend the timing stimuli through auditory and tactile cues. Our results demonstrate that audio timing through voice commands and haptic timing through tactile cues on the hand are effective alternative measures of the SoA using the IB paradigm. They both address limitations of the traditional method, such as visual attention overload and lack of engagement. We discuss how our results can be applied to measure SoA in tasks involving different interactive scenarios, such as in Mixed/Virtual Reality.
2018 · Patricia I. Cornelio Martinez et al. · University of Sussex · In-Vehicle Haptic, Audio & Multimodal Feedback; Haptic Wearables; Brain-Computer Interface (BCI) & Neurofeedback · CHI

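To clarify how the IB paradigm yields an implicit agency measure, a toy calculation of the binding effect from timing judgements: in voluntary-action trials, the perceived action-outcome interval compresses relative to a baseline. All numbers here are invented for illustration.

```python
# True interval between the action (e.g., keypress) and its outcome (tone).
TRUE_INTERVAL_MS = 250.0

def perceived_interval(action_shift_ms, tone_shift_ms):
    """Judged interval given each event's judgement error: a late-judged
    action and an early-judged tone both compress the perceived interval."""
    return (TRUE_INTERVAL_MS + tone_shift_ms) - action_shift_ms

# Hypothetical judgement errors (ms), invented for illustration:
voluntary = perceived_interval(action_shift_ms=15.0, tone_shift_ms=-40.0)
baseline = perceived_interval(action_shift_ms=-5.0, tone_shift_ms=10.0)

binding = baseline - voluntary  # positive => binding (compression) occurred
print(f"binding effect: {binding:.0f} ms")
```
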
Point-and-Shake: Selecting from Levitating Object Displays
Acoustic levitation enables a radical new type of human-computer interface composed of small levitating objects. For the first time, we investigate the selection of such objects, an important part of interaction with a levitating object display. We present Point-and-Shake, a mid-air pointing interaction for selecting levitating objects, with feedback given through object movement. We describe the implementation of this technique and present two user studies that evaluate it. The first study found that users could accurately (96%) and quickly (4.1 s) select objects by pointing at them. The second study found that users were able to accurately (95%) and quickly (3 s) select occluded objects. These results show that Point-and-Shake is an effective way of initiating interaction with levitating object displays.
2018 · Euan Freeman et al. · University of Glasgow · Mid-Air Haptics (Ultrasonic) · CHI

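A sketch of the pointing half of the interaction, assuming a simple selection cone around the pointing ray; the paper's exact selection method, thresholds, and the shake feedback rendering may differ.

```python
import numpy as np

def pointed_object(origin, direction, objects, cone_deg=5.0):
    """Return the index of the levitated object the user points at, or None.
    An object is a candidate if it lies within a cone around the pointing ray."""
    direction = direction / np.linalg.norm(direction)
    best, best_angle = None, np.radians(cone_deg)
    for i, pos in enumerate(objects):
        to_obj = pos - origin
        cos_angle = np.dot(direction, to_obj / np.linalg.norm(to_obj))
        angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))
        if angle < best_angle:          # keep the most central candidate
            best, best_angle = i, angle
    return best

objects = [np.array([0.0, 0.1, 0.3]), np.array([0.05, 0.12, 0.3])]
idx = pointed_object(np.zeros(3), np.array([0.0, 0.3, 0.9]), objects)
print("selected:", idx)  # the selected bead would then be "shaken" as feedback
```
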
Mid-Air Haptics for Control Interfaces
Control interfaces and interactions based on touchless gesture tracking devices have become a prevalent research topic in both industry and academia. Touchless devices offer a unique interaction immediateness that makes them ideal for applications where direct contact with a physical controller is not desirable. On the other hand, these controllers inherently lack active or passive haptic feedback to inform users about the results of their interaction. Mid-air haptic interfaces, such as those using focused ultrasound waves, can close the feedback loop and provide new tools for the design of touchless, un-instrumented control interactions. The goal of this workshop is to bring together the growing mid-air haptic research community to identify and discuss future challenges in control interfaces and their application in AR/VR, automotive, music, robotics and teleoperation.
2018 · Marcello Giordano et al. · Ultrahaptics · Mid-Air Haptics (Ultrasonic) · CHI

SoundBender: Dynamic Acoustic Control Behind Obstacles
Ultrasound manipulation is growing in popularity in the HCI community, with applications in haptics, on-body interaction, and levitation-based displays. Most of these applications share two key limitations: (a) the complexity of the sound fields that can be produced is limited by the physical size of the transducers, and (b) no obstacles can be present between the transducers and the control point. We present SoundBender, a hybrid system that overcomes these limitations by combining the versatility of phased arrays of transducers (PATs) with the precision of acoustic metamaterials. In this paper, we explain our approach to designing and implementing such hybrid modulators (i.e., to create complex sound fields) and methods to manipulate the field dynamically (i.e., to stretch and steer it). We demonstrate our concept using self-bending beams that enable both levitation and tactile feedback around an obstacle, and present example applications enabled by SoundBender.
2018 · Mohd Adili Norasikin et al. · In-Vehicle Haptic, Audio & Multimodal Feedback; Mid-Air Haptics (Ultrasonic); Vibrotactile Feedback & Skin Stimulation · UIST

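For context, the standard single-focus phase solution for a PAT: each transducer is phase-delayed so its wave arrives in phase at the focal point. This is the well-known baseline the paper builds on; SoundBender's contribution is routing such a field through a metamaterial layer, which this sketch does not model.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # Hz, typical ultrasound transducer frequency
K = 2 * np.pi * FREQ / SPEED_OF_SOUND  # wavenumber

def focus_phases(transducer_positions, focal_point):
    """Per-transducer phase offsets so all emissions arrive in phase at
    the focal point (standard single-focus PAT solution)."""
    d = np.linalg.norm(transducer_positions - focal_point, axis=1)
    return (-K * d) % (2 * np.pi)

# 16x16 array on a 10 cm x 10 cm grid at z = 0, focusing 20 cm above centre.
xs = np.linspace(-0.05, 0.05, 16)
grid = np.array([(x, y, 0.0) for x in xs for y in xs])
phases = focus_phases(grid, np.array([0.0, 0.0, 0.2]))
print(phases.shape)  # (256,)
```
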
Tangible Drops: A Visio-Tactile Display Using Actuated Liquid-Metal Droplets
We present Tangible Drops, a visio-tactile display that, for the first time, provides physical visualization and tactile feedback using a planar liquid interface. It presents digital information interactively by tracing dynamic patterns on horizontal flat surfaces using liquid-metal drops on a programmable electrode array. It provides tactile feedback with directional information in the 2D vector plane using linear locomotion and/or vibration of the liquid-metal drops. We demonstrate move, oscillate, merge, split and dispense-from-reservoir functions of the liquid-metal drops at low power (450 mW per electrode) and low voltage (8-15 V). We report on the results of our empirical study with 12 participants on tactile feedback using 8 mm diameter drops, which indicate that Tangible Drops can convey tactile sensations such as changing speed, varying direction and controlled oscillation with no visual feedback. We present the design space, demonstrate applications of Tangible Drops, and conclude by suggesting potential future applications of the technique.
2018 · Deepak Ranjan Sahoo et al. · Swansea University · Vibrotactile Feedback & Skin Stimulation; Data Physicalization; Shape-Changing Materials & 4D Printing · CHI

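A minimal sketch of electrode sequencing for drop locomotion on a hypothetical 1-D electrode strip: energising the neighbouring electrode pulls the drop one step over, then the previous pad is released. set_electrode is a hypothetical driver call; the real system's electrode layout, drive electronics, and timing differ.

```python
import time

def set_electrode(index: int, on: bool):
    """Stand-in for a hardware driver call energising one electrode pad."""
    print(f"electrode {index}: {'ON' if on else 'off'}")

def move_drop(start: int, end: int, dwell_s: float = 0.05):
    """Step a drop along the strip by walking the energised pad."""
    step = 1 if end > start else -1
    pos = start
    while pos != end:
        set_electrode(pos + step, True)   # attract the drop to the next pad
        time.sleep(dwell_s)               # let the drop settle
        set_electrode(pos, False)         # release the previous pad
        pos += step

move_drop(0, 5)
```
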