SoilSense: Appropriating Soil-based Microbial Fuel Cells to Create Tangible Interfaces
Soil-based Microbial Fuel Cells (SMFCs) offer a sustainable method for powering low-energy computing devices by harnessing electricity from microbial activity in soil. In this paper, we introduce SoilSense, a novel approach that repurposes SMFCs as tangible interfaces rather than energy sources, transforming soil into an interactive, computationally responsive medium. We explore the voltage variations that occur when pressure is applied to the cathode and systematically characterize this mechanism across different electrode configurations and soil moisture levels. To demonstrate the feasibility of SMFC-based interfaces, we present a series of modular, proof-of-concept prototypes that support diverse interaction modalities. We further illustrate how SoilSense enables interaction through example applications, and we outline implications and a vision for future studies that employ soil as an ecologically compatible material in interactive system design.
2025 · Shuto Takashita et al. · UIST
Topics: Shape-Changing Materials & 4D Printing; Ecological Design & Green Computing; Energy Conservation Behavior & Interfaces

eTactileKit: A Toolkit for Design Exploration and Rapid Prototyping of Electro-Tactile Interfaces
Electro-tactile interfaces are becoming increasingly popular due to their unique advantages, such as delivering fast and localised tactile response, thin and flexible form factors, and the potential to create novel tactile experiences. However, insights from a formative study with typical designers highlighted the lack of resources, limited access to information, and the complexity of software and hardware tools. This establishes a high barrier to entry and limits the ability to rapidly prototype and experiment with electro-tactile interfaces. To address these challenges, we propose eTactileKit, a scalable and accessible toolkit providing end-to-end support for designing and prototyping electro-tactile interfaces. eTactileKit comprises a hardware platform and a software framework for designing, simulating, and exploring electro-tactile stimuli. We evaluated the impact and usability of eTactileKit through a three-week-long take-home study, which demonstrated increased accessibility, ease of use, and the toolkit's positive impact on design workflow. Additionally, we implemented a set of use cases to demonstrate the toolkit's practicality and effectiveness across various applications.
2025 · Praneeth Bimsara Perera et al. · UIST
Topics: Electrical Muscle Stimulation (EMS); Prototyping & User Testing

Weight-Induced Consumed Endurance (WICE): A Model to Quantify Shoulder Fatigue with Weighted Objects
Fatigue is a major challenge in mid-air interactions, often resulting in a sensation of heaviness, particularly when users carry weighted objects on their arms. Existing models for characterising shoulder fatigue were primarily developed for bare-hand scenarios, limiting their applicability in situations involving encumbrance. In this paper, we introduce Weight-Induced Consumed Endurance (WICE), a novel model that accurately estimates shoulder fatigue when additional weight is attached at various locations on the arm. WICE enhances the calculation of instantaneous shoulder torque by incorporating information about the attached weight, integrates individual arm mass for more personalised fatigue estimation, and uses a Bayesian framework to simulate the distribution of shoulder fatigue. Our evaluation shows that WICE strongly correlates with both experimentally measured endurance time and subjective Borg CR10 ratings, demonstrating its reliability as an objective fatigue metric in both encumbered and no-weight conditions. We further demonstrate how WICE can be applied to examine the effects of controllers and haptic devices on user fatigue. WICE provides a foundation for developing fatigue-aware systems that can sense and adapt to encumbrance, allowing for more tailored ergonomic MR interactions.
2025 · Tinghui Li et al. · UIST
Topics: Force Feedback & Pseudo-Haptic Weight; Full-Body Interaction & Embodied Input; Biosensors & Physiological Monitoring

Estimating the Effects of Encumbrance and Walking on Mixed Reality Interaction
This paper investigates the effects of two situational impairments, encumbrance (i.e., carrying a heavy object) and walking, on interaction performance in canonical mixed reality tasks. We built Bayesian regression models of movement time, pointing offset, error rate, and throughput for a target-acquisition task, and of throughput, uncorrected error rate (UER), and corrected error rate (CER) for a text-entry task, to estimate these effects. Our results indicate that 1.0 kg of encumbrance increases selection movement time by 28%, decreases text-entry throughput by 17%, and increases UER by 50%, but does not affect pointing offset. Walking led to a 63% increase in ray-cast movement time and a 51% reduction in text-entry throughput. It also increased selection pointing offset by 16%, ray-cast pointing offset by 17%, and error rate by 8.4%. The interaction effect of 1.0 kg encumbrance and walking resulted in a 112% increase in ray-cast movement time. Our findings enhance the understanding of the effects of encumbrance and walking on mixed reality interaction, and contribute to the accumulating knowledge of situational-impairments research in mixed reality.
2025 · Tinghui Li et al. (University of Sydney, School of Computer Science) · CHI
Topics: Full-Body Interaction & Embodied Input; Mixed Reality Workspaces

Juggling Extra Limbs: Identifying Control Strategies for Supernumerary Multi-Arms in Virtual Reality
Using supernumerary multi-limbs for complex tasks is a growing research focus in Virtual Reality (VR) and robotics. Understanding how users integrate extra limbs with their own to achieve shared goals is crucial for developing efficient supernumeraries. This paper presents an exploratory user study (N=14) investigating strategies for controlling virtual supernumerary limbs with varying autonomy levels in VR object manipulation tasks. Using a Wizard-of-Oz approach to simulate semi-autonomous limbs, we collected both qualitative and quantitative data. Results show participants adapted control strategies based on task complexity and system autonomy, affecting task delegation, coordination, and body ownership. Based on these findings, we propose guidelines (commands, demonstration, delegation, and labeling instructions) to improve multi-limb interaction design by adapting autonomy to user needs and fostering better context-aware experiences.
2025 · Hongyu Zhou et al. (The University of Sydney, School of Computer Science) · CHI
Topics: Shape-Changing Interfaces & Soft Robotic Materials; Full-Body Interaction & Embodied Input

Integrating Force Sensing with Electro-Tactile Feedback in 3D Printed Haptic Interfaces
Tactile feedback mechanisms enhance the user experience of modern wearables by stimulating the sense of touch and enabling intuitive interactions. Electro-tactile stimulation-based tactile interfaces stand out due to their compact form factor and ability to deliver localized tactile sensations. Integrating force sensing with electro-tactile stimulation creates more responsive bidirectional systems that are beneficial in applications requiring precise control and feedback. However, current research often relies on separate sensors for force sensing, increasing system complexity and raising challenges for system scalability. We propose a novel approach that utilizes 3D-printed modified surfaces as the electro-tactile electrode interface to sense applied force and deliver feedback simultaneously without additional sensors. This method simplifies the system, maintains flexibility, and leverages the rapid prototyping capabilities of 3D printing. The functionality of this approach is validated through a user study (N=10), and two practical applications are proposed, both incorporating simultaneous sensing and tactile feedback.
2024 · Praneeth Bimsara Perera et al. · UbiComp
Topics: Force Feedback & Pseudo-Haptic Weight; Electrical Muscle Stimulation (EMS); Desktop 3D Printing & Personal Fabrication

Efficient and Robust Heart Rate Estimation Approach for Noisy Wearable PPG Sensors Using Ideal Representation Learning
Photoplethysmography (PPG) is a non-invasive wearable sensing method used in millions of devices for heart rate monitoring. However, PPG signals are highly susceptible to a variety of noise sources, including motion artifacts, sensor noise, and biological factors, especially in real-world wearable settings. These make designing generalizable models to accurately interpret cardiac activity challenging. This paper proposes a focus shift from learning with noisy signals to utilizing the characteristics of a mathematically modelled PPG waveform in an adversarial setting to increase the signal-to-noise ratio. The results show the proposed approach is robust against noisy data. We evaluated the model in a user study (N=22), where it was tested against unseen PPG data collected from a new sensor and new users under three different activity levels. Results showed the generalisability of the approach compared to the state of the art, with consistent performance improvements across diverse user activities. We successfully implemented our model on a commonly used Android mobile device, confirming its ability to provide fast inference in a resource-constrained setting.
2024 · Amashi Niwarthana et al. · UbiComp
Topics: Sleep & Stress Monitoring; Biosensors & Physiological Monitoring

Appropriate Incongruity Driven Human-AI Collaborative Tool to Assist Novices in Humorous Content Generation
Creating humorous content has been shown to improve an individual's emotional well-being by decreasing stress, overcoming anxiety, and enhancing interpersonal relationships. However, it is common knowledge that a good sense of humor is not common. In this paper, we propose a natural language processing (NLP) driven collaborative tool based on appropriate-incongruity theory to assist novices in writing humorous content. We use cartoon-caption writing as the use case, since it is a popular activity in which people engage in creating humorous content. The paper describes the design of our co-authoring tool and findings from a two-part user study where (1) 20 participants used our tool to co-author cartoon captions and (2) 66 participants evaluated those captions. Our findings show that the tool helped participants identify incongruous visual elements in the cartoon, supported ideation, and expanded the narrative, resulting in co-authored captions frequently rated funnier than those written without the tool. This approach can be adapted to other humor-generation applications, including creative writing, meme creation, sketch comedy, and advertising.
2024 · Hasindu Kariyawasam et al. · IUI
Topics: Generative AI (Text, Image, Music, Video); AI-Assisted Creative Writing

Double-Sided Tactile Interactions for Grasping in Virtual Reality
For grasping, tactile stimuli to multiple fingertips are crucial for realistic shape rendering and precise manipulation. Pinching is particularly important in virtual reality, since it is frequently used to grasp virtual objects. However, the interaction space of tactile feedback around pinching is underexplored due to a lack of means to provide co-located but different stimulation to finger pads. We propose a double-sided electrotactile device with a thin and flexible form factor to fit within pinched finger pads, comprising two overlapping 3 × 3 electrode arrays. Using this new tactile interface, we define a new concept of double-sided tactile interactions with three feedback modes: (1) single-sided stimulation, (2) simultaneous double-sided stimulation, and (3) spatiotemporal double-sided stimulation. Through two user studies, we (1) demonstrate that participants can accurately discriminate between single-sided and double-sided stimulation and find a qualitative difference in tactile sensation; and (2) confirm the occurrence of apparent tactile motion between fingers and present optimal parameters for continuous or discrete movements. Based on these findings, we demonstrate five VR applications to exemplify how double-sided tactile interactions can produce spatiotemporal movement of a virtual object between fingers and enrich touch feedback for UI operation.
2023 · Arata Jingu et al. · UIST
Topics: Mid-Air Haptics (Ultrasonic); Vibrotactile Feedback & Skin Stimulation; Immersion & Presence Research

Design, Mould, Grow!: A Fabrication Pipeline for Growing 3D Designs Using Myco-Materials
There is a growing interest in sustainable fabrication approaches, including the exploration of material conservation and the utilisation of waste materials. In particular, recent work has applied organic myco-materials, made from fungi, to develop tangible, interactive devices. However, a systematic approach for 3D fabrication using myco-materials is under-explored. In this paper, we present a parametric design tool and a fabrication pipeline to grow 3D designs using the mycelia of edible fungi species, such as Reishi or Oyster mushrooms. The proposed tool is designed based on empirical results from a series of technical evaluations of the geometric and material qualities of 3D-grown myco-objects. Furthermore, the paper introduces an easy-to-replicate fabrication process that can recycle different organic waste material combinations, such as sawdust and coffee grounds, to grow mycelia. Through a series of demonstration applications, we identify the challenges and opportunities for working with myco-materials in the HCI context.
2023 · Phillip Gough et al. (The University of Sydney) · CHI
Topics: Shape-Changing Materials & 4D Printing; Ecological Design & Green Computing

So Predictable! Continuous 3D Hand Trajectory Prediction in Virtual Reality
We contribute a novel user- and activity-independent kinematics-based regressive model for continuously predicting ballistic hand movements in virtual reality (VR). Compared to prior work on end-point prediction, continuous hand trajectory prediction in VR enables an early estimation of future events, such as collisions between the user's hand and virtual objects such as UI widgets. We developed and validated our prediction model through a user study with 20 participants. The study collected hand motion data with a 3D pointing task and a gaming task with three popular VR games. Results show that our model achieves a low Root Mean Square Error (RMSE) of 0.80 cm, 0.85 cm, and 3.15 cm for hand positions 100 ms, 200 ms, and 300 ms in the future, respectively, across all users and activities. In pointing tasks, our predictive model achieves an average angular error of 4.0° and 1.5° from the true landing position when the hand is 50% and 70% of the way through the movement, respectively. A follow-up study showed that the model can be applied to new users and new activities without further training.
2021 · Nisal Menuka Gamage et al. · UIST
Topics: Hand Gesture Recognition; Full-Body Interaction & Embodied Input; Immersion & Presence Research

Multi-Touch Kit: A Do-It-Yourself Technique for Capacitive Multi-Touch Sensing Using a Commodity Microcontroller
Mutual capacitance-based multi-touch sensing is now a ubiquitous and high-fidelity input technology. However, due to the complexity of electrical and signal-processing requirements, it remains very challenging to create interface prototypes with custom-designed multi-touch input surfaces. In this paper, we introduce Multi-Touch Kit, a technique enabling electronics novices to rapidly prototype customized capacitive multi-touch sensors. In contrast to existing techniques, it works with a commodity microcontroller and open-source software, and does not require any specialized hardware. Evaluation results show that our approach enables multi-touch sensors with high spatial and temporal resolution that can accurately detect multiple simultaneous touches. A set of application examples demonstrates the versatile uses of our approach for sensors of different scales, curvatures, and materials.
2019 · Narjes Pourjafarian et al. · UIST
Topics: Circuit Making & Hardware Prototyping; Makerspace Culture; Computational Methods in HCI

Tactlets: Adding Tactile Feedback to 3D Objects Using Custom Printed Controls
Rapid prototyping of tactile output on 3D objects promises to enable a more widespread use of the tactile channel for ubiquitous, tangible, and wearable computing. Existing prototyping approaches, however, have limited tactile output capabilities, require advanced skills for design and fabrication, or are incompatible with curved object geometries. In this paper, we present a novel digital fabrication approach for printing custom, high-resolution controls for electro-tactile output with integrated touch sensing on interactive objects. It supports curved geometries of everyday objects. We contribute a design tool for modeling, testing, and refining tactile input and output at a high level of abstraction, based on parameterized tactile controls. We further contribute an inventory of 10 parametric Tactlet controls that integrate sensing of user input with real-time tactile feedback. We present two approaches for printing Tactlets on 3D objects, using conductive inkjet printing or FDM 3D printing. Empirical results from a psychophysical study and findings from two practical application cases confirm the functionality and practical feasibility of the Tactlets approach.
2019 · Daniel Groeger et al. · UIST
Topics: Vibrotactile Feedback & Skin Stimulation; Shape-Changing Interfaces & Soft Robotic Materials; Circuit Making & Hardware Prototyping