Automatic Tuning of Haptic Motion Effects to Evoke Specific Feelings in Multisensory Content
Automating the authoring of haptic motion effects, while enabling designers to carefully shape the feelings they evoke, is crucial for high-quality multisensory content. We present a motion effect-tuning method that elicits desired perceptual or affective attributes from users watching a video. To this end, we test three modulation methods: (1) altering the extent of low-frequency motion fluctuations, (2) changing the motion amplitude in a high-frequency band, and (3) sampling and interpolating significant motion peaks. Our tuning method transforms an input draft waveform using these modulation techniques to obtain an output motion effect that elicits the goal adjective scores. The method requires two regression models accounting for the effects of motion modulation and audiovisual stimuli, respectively, which we obtain through perceptual experiments. Lastly, we confirm the method's effectiveness through another user study and explore potential users' feedback and suggestions for future applications through open-ended survey questions.
2025 · Jiwan Lee et al. · Pohang University of Science and Technology (POSTECH), Computer Science and Engineering · Vibrotactile Feedback & Skin Stimulation · Music Composition & Sound Design Tools · Video Production & Editing · CHI
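The second modulation method above, changing motion amplitude in a high-frequency band, can be illustrated with a simple FFT-based band gain. This is a minimal sketch, not the paper's implementation; the band edges, sampling rate, and function name are illustrative assumptions.

```python
import numpy as np

def scale_band_amplitude(signal, fs, f_lo, f_hi, gain):
    """Scale the amplitude of `signal` within the band [f_lo, f_hi] Hz.

    A toy FFT-based sketch of band-limited amplitude modulation;
    the paper's actual tuning pipeline is more involved.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    spectrum[band] *= gain
    return np.fft.irfft(spectrum, n=len(signal))

# Example: attenuate the 20-50 Hz content of a draft motion waveform by half,
# leaving the slow 2 Hz sway untouched.
fs = 200.0
t = np.arange(0, 1.0, 1.0 / fs)
draft = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 30 * t)
tuned = scale_band_amplitude(draft, fs, 20.0, 50.0, 0.5)
```

Pairing such a gain parameter with the paper's regression models is what lets a target adjective score be translated back into a concrete modulation setting.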
Real-time Semantic Full-Body Haptic Feedback Converted from Sound for Virtual Reality Gameplay
We present a multisensory virtual reality (VR) system that enables users to experience concurrent visual, auditory, and haptic feedback, featuring semantic classification of events from sound, sound-to-haptic conversion, and full-body haptic effects. This concept is applied to enhance the user experience of VR gameplay. The system utilizes a Long Short-Term Memory (LSTM) model to classify game sounds and detect key events such as gunfire, explosions, and hits. These events are translated into full-body haptic patterns through a haptic suit, providing users with realistic and immersive haptic experiences. The system operates with low latency, ensuring seamless synchrony between sound and haptic feedback. Evaluations through user studies demonstrate significant improvements in user experience compared to traditional sound-to-haptic methods, emphasizing the importance of accurate sound classification and well-designed haptic effects.
2025 · Gyeore Yun et al. · Kyungpook National University, Computer Science & Engineering / College of IT Engineering · Full-Body Interaction & Embodied Input · Gamification Design · CHI
SkinHaptics: Exploring Skin Softness Perception and Virtual Body Embodiment Techniques to Enhance Self-Haptic Interactions
Providing haptic feedback for soft, deformable objects is challenging, requiring complex mechanical hardware combined with modeling and rendering software. As an alternative, we advance the concept of self-haptics, where the user's own body delivers physical feedback, to convey dynamically varying softness in VR. Skin can exhibit different levels of contact softness by altering the biomechanical state of the body. We propose SkinHaptics, a device-free approach that changes the states of musculoskeletal structures and virtual hand-object representations. In this study, we conduct three experiments to demonstrate SkinHaptics. Using the same scale, we measure skin softness across various hand poses and contact points and evaluate the just noticeable difference in skin softness. We investigate the effect of hand-object representations on self-haptic interactions. Our findings indicate that the visual representations have a significant influence on the embodiment of a self-haptic hand, and the degree of the hand embodiment strongly affects the haptic experience.
2025 · Jungeun Lee et al. · Pohang University of Science and Technology (POSTECH), Convergence IT Engineering / Interaction Laboratory · Haptic Wearables · Eye Tracking & Gaze Interaction · CHI
Augmenting Perceived Length of Handheld Controllers: Effects of Object Handle Properties
In the realm of virtual reality (VR), shape-changing controllers have emerged as a means to enhance visuo-haptic congruence during user interactions. The major emphasis has been placed on manipulating the inertia tensor of a shape-changing controller to control the perceived shape. This paper delves deeper by exploring how the material properties of the controller's handle, distinct from the inertial information, affect the perceived shape, focusing on the perceived length. We conducted three perceptual experiments to examine the effects of the handle's softness, thermal conductivity, and texture, respectively. Results demonstrated that a softer handle increases the perceived length, whereas a handle with higher thermal conductivity reduces it. Texture, in the form of varying bumps, also alters the length perception. These results provide more comprehensive knowledge of the intricate relationship between perceived length and controller handle properties, expanding the design alternatives for shape-changing controllers for immersive VR experiences.
2024 · Chaeyong Park et al. · Pohang University of Science and Technology (POSTECH) · Force Feedback & Pseudo-Haptic Weight · Shape-Changing Interfaces & Soft Robotic Materials · CHI
Generating Real-Time, Selective, and Multimodal Haptic Effects from Sound for Gaming Experience Enhancement
We propose an algorithm that generates a vibration, an impact, or a vibration+impact haptic effect by processing a sound signal in real time. Our algorithm is selective in that it matches the most appropriate type of haptic effect to the sound using a machine-learning classifier (random forest) built on expert-labeled datasets. Our algorithm is tailored to enhance user experiences for video game play, and we present two examples for the RPG (role-playing game) and FPS (first-person shooter) genres. We demonstrate the effectiveness of our algorithm through a user study in comparison to other state-of-the-art (SOTA) methods for the same crossmodal conversion. Our system elicits better multisensory user experiences than the SOTA algorithms for both game genres.
2023 · Gyeore Yun et al. · POSTECH · Vibrotactile Feedback & Skin Stimulation · Game UX & Player Behavior · CHI
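A real-time selective pipeline like the one above needs per-frame sound features feeding a classifier. The sketch below uses RMS energy and spectral centroid with a hard-coded rule as a toy stand-in for the paper's random forest; the feature set, thresholds, and function names are illustrative assumptions, not the published model.

```python
import numpy as np

def frame_features(frame, fs):
    """Compute per-frame sound features (RMS energy, spectral centroid).

    Illustrative stand-ins for the expert-labeled feature set that would
    feed a trained random-forest classifier.
    """
    rms = np.sqrt(np.mean(frame ** 2))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return rms, centroid

def pick_effect(rms, centroid, rms_th=0.1, centroid_th=1000.0):
    """Toy decision rule standing in for the classifier's output:
    quiet frames get no effect, bright transients map to impacts,
    low-frequency rumbles map to vibrations."""
    if rms < rms_th:
        return "none"
    return "impact" if centroid > centroid_th else "vibration"
```

In the actual system the decision comes from the trained forest rather than fixed thresholds, but the frame-in, effect-label-out interface is the same.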
Generating Haptic Motion Effects for Multiple Articulated Bodies for Improved 4D Experiences: A Camera Space Approach
Motion effects are indispensable for improving 4D experiences in highly interactive applications, such as amusement parks, 4D theaters, and virtual reality games. Their recent emergence calls for effective algorithms generating motion effects synchronized with audiovisual content. This paper presents an automatic algorithm for synthesizing the object-based motion effects that express the movements of multiple articulated bodies inclusively when the objects' motion trajectories are available in the 3D camera space. By taking the visual velocities and sizes of all object parts, our method computes a motion proxy that represents the objects' movements by one point and converts the motion proxy to a motion command through a motion cueing algorithm. The motion proxy is determined by linearly combining the velocities, and its best combination was selected from several candidates by user studies. The results of user studies indicate that our algorithm can produce compelling object-based motion effects that enhance the multisensory experience.
2023 · Sangyoon Han et al. · Pohang University of Science and Technology (POSTECH) · Shape-Changing Interfaces & Soft Robotic Materials · Shape-Changing Materials & 4D Printing · CHI
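The motion-proxy step above, a linear combination of per-part velocities, can be sketched in a few lines. Size weighting is one plausible candidate among the several the paper compares in user studies; the weighting and names here are illustrative assumptions.

```python
import numpy as np

def motion_proxy(velocities, sizes):
    """Combine per-part 3D visual velocities into one proxy velocity.

    A size-weighted linear combination: larger visible parts contribute
    more. This is one candidate weighting, not the paper's final choice.
    """
    v = np.asarray(velocities, dtype=float)   # shape (n_parts, 3)
    w = np.asarray(sizes, dtype=float)
    w = w / w.sum()                           # normalize weights
    return (w[:, None] * v).sum(axis=0)       # weighted average, shape (3,)
```

The resulting single 3D velocity is what a standard motion cueing algorithm can then turn into chair commands.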
Visuo-haptic Crossmodal Shape Perception Model for Shape-Changing Handheld Controllers Bridged by Inertial Tensor
We present a visuo-haptic crossmodal model of shape perception designed for shape-changing handheld controllers. The model uses the inertia tensor of an object to bridge the two senses. The model was constructed from the results of three perceptual experiments. In the first two experiments, we validate that the primary moment and product of inertia (MOI and POI) in the inertia tensor have critical effects on the haptic perception of object length and asymmetry. Then, we estimate a haptic-to-visual shape matching model using MOI and POI as two link variables from the results of the third experiment for crossmodal magnitude production. Finally, we validate in a summative user study that the inverse of the shape matching model is effective for pairing a perceptually-congruent haptic object with a virtual object: the functionality we need for shape-changing handheld interfaces to afford perceptually-fulfilling sensory experiences in virtual reality.
2023 · Chaeyong Park et al. · Pohang University of Science and Technology · Haptic Wearables · Shape-Changing Interfaces & Soft Robotic Materials · CHI
Vibration-Augmented Buttons: Information Transmission Capacity and Application to Interaction Design
One can embed a vibration actuator in a physical button and augment the button's original kinesthetic response with a programmable vibration generated by the actuator. Such vibration-augmented buttons inherit the advantages of both physical and virtual buttons. This paper reports the information transmission capacity of vibration-augmented buttons, obtained by conducting a series of absolute identification experiments while increasing the number of augmented buttons. The information transmission capacity found was 2.6 bits, and vibration-augmented and physical buttons showed similar abilities in rendering easily recognizable haptic responses. In addition, we showcase a VR text entry application that utilizes vibration-augmented buttons. Our method provides several error messages to the user during text entry using a VR controller that includes an augmented button. We validate that the variable haptic feedback improves task performance, cognitive workload, and user experience for a transcription task.
2022 · Chaeyong Park et al. · Pohang University of Science and Technology · In-Vehicle Haptic, Audio & Multimodal Feedback · Vibrotactile Feedback & Skin Stimulation · CHI
Identifying Contact Fingers on Touch Sensitive Surfaces by Ring-Based Vibratory Communication
As computing paradigms shift toward mobile and ubiquitous interaction, there is an increasing demand for wearable interfaces supporting multifaceted input in smart living environments. In this regard, we introduce a system that identifies contact fingers using vibration as a modality of communication. We investigate the vibration characteristics of the communication channels involved and simulate the transmission of vibration sequences. In the simulation, we test and refine modulation and demodulation methods to design vibratory communication protocols that are robust to environmental noises and can detect multiple simultaneous contact fingers. As a result, we encode an on-off keying sequence with a unique carrier frequency for each finger and demodulate the sequences by applying cross-correlation. We verify the communication protocols in two environments, laboratory and cafe, where the resulting highest accuracy was 93% and 90.5%, respectively. Our system achieves over 91% accuracy in identifying seven contact states from three fingers while wearing only two actuator rings with the aid of a touch screen. Our findings shed light on diversifying touch interactions on rigid surfaces by means of vibratory communication.
2021 · Seungjae Oh et al. · Vibrotactile Feedback & Skin Stimulation · Haptic Wearables · UIST
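The encoding scheme above, on-off keying on a per-finger carrier with cross-correlation demodulation, can be sketched as follows. The sampling rate, carrier frequencies, bit duration, and decision threshold are illustrative assumptions, not the system's actual parameters; choosing carriers that fit an integer number of cycles per symbol keeps them orthogonal over each window.

```python
import numpy as np

def ook_encode(bits, carrier_hz, fs=8000, bit_dur=0.01):
    """On-off key a bit sequence onto a sinusoidal carrier (one per finger)."""
    n = int(fs * bit_dur)
    t = np.arange(n) / fs
    symbol = np.sin(2 * np.pi * carrier_hz * t)
    return np.concatenate([bit * symbol for bit in bits])

def ook_decode(signal, carrier_hz, n_bits, fs=8000, bit_dur=0.01):
    """Recover one finger's bits by correlating each symbol window
    with that finger's carrier template."""
    n = int(fs * bit_dur)
    t = np.arange(n) / fs
    template = np.sin(2 * np.pi * carrier_hz * t)
    bits = []
    for i in range(n_bits):
        window = signal[i * n:(i + 1) * n]
        corr = np.dot(window, template) / np.dot(template, template)
        bits.append(1 if corr > 0.5 else 0)
    return bits
```

Because the carriers are mutually orthogonal over a symbol window, two fingers' sequences can be summed on the shared skin-and-surface channel and still be separated at the receiver, which is what enables detecting simultaneous contacts.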
Improving Viewing Experiences of First-Person Shooter Gameplays with Automatically-Generated Motion Effects
In recent times, millions of people enjoy watching video gameplays at an eSports stadium or home. We seek a method that improves gameplay spectator or viewer experiences by presenting multisensory stimuli. Using a motion chair, we provide the motion effects automatically generated from the audiovisual stream to the viewers watching a first-person shooter (FPS) gameplay. The motion effects express the game character's movement and gunfire action. We describe algorithms for the computation of such motion effects developed using computer vision techniques and deep learning. Through a user study, we demonstrate that our method of providing motion effects significantly improves the viewing experiences of FPS gameplay. The contributions of this paper are the motion synthesis algorithms integrated for FPS games and the empirical evidence for the benefits of experiencing multisensory gameplays.
2021 · Gyeore Yun et al. · POSTECH · Force Feedback & Pseudo-Haptic Weight · Game UX & Player Behavior · Live Streaming & Spectating Experience · CHI
Augmenting Physical Buttons with Vibrotactile Feedback for Programmable Feels
Physical buttons provide clear haptic feedback when pressed and released, but their responses are unvarying. Physical buttons can be powered by force actuators to produce unlimited click sensations, but the cost is substantial. An alternative is augmenting physical buttons with simple and inexpensive vibration actuators. When pushed, an augmented button generates a vibration overlaid on the button's original kinesthetic response, under the general framework of haptic augmented reality. We explore the design space of augmented buttons while changing vibration frequency, amplitude, duration, and envelope. We then visualize the perceptual structure of augmented buttons by estimating a perceptual space for 7 physical buttons and 40 augmented buttons. Their sensations are also assessed against adjectives, and results are mapped into the perceptual space to identify meaningful perceptual dimensions. Our results contribute to understanding the benefits and limitations of programmable vibration-augmented physical buttons with emphasis on their feels.
2020 · Chaeyong Park et al. · Vibrotactile Feedback & Skin Stimulation · Force Feedback & Pseudo-Haptic Weight · UIST
Body-Penetrating Tactile Phantom Sensations
In tactile interaction, a phantom sensation refers to an illusion felt on the skin between two distant points at which vibrations are applied. It can improve the perceptual spatial resolution of tactile stimulation with a few tactors. All phantom sensations reported in the literature act on the skin or out of the body, but no such reports exist for those eliciting sensations penetrating the body. This paper addresses tactile phantom sensations in which two vibration actuators on the dorsal and palmar sides of the hand present an illusion of vibration passing through the hand. We also demonstrate similar tactile illusions for the torso. For optimal design, we conducted user studies while varying vibration frequency, envelope function, stimulus duration, and penetrating direction. Based on the results, we present design guidelines on penetrating phantom sensations for their use in immersive virtual reality applications.
2020 · Jinsoo Kim et al. · Pohang University of Science and Technology · Vibrotactile Feedback & Skin Stimulation · Full-Body Interaction & Embodied Input · CHI
VibEye: Vibration-Mediated Object Recognition for Tangible Interactive Applications
We present VibEye: a vibration-mediated recognition system of objects for tangible interaction. A user holds an object between two fingers wearing VibEye. VibEye triggers a vibration from one finger, and the vibration that has propagated through the object is sensed at the other finger. This vibration includes information about the object's identity, and we represent it using a spectrogram. Collecting the spectrograms of many objects, we formulate object recognition as a classical image classification problem. This simple method, when tested with 20 users, shows 92.5% accuracy for 16 objects of the same shape with various materials. This material-based classifier is also extended to the recognition of everyday objects. Lastly, we demonstrate several tangible applications where VibEye provides the needed functionality while enhancing user experiences. VibEye is particularly effective for recognizing objects made of different materials, which are difficult to distinguish by other means such as light and sound.
2019 · Seungjae Oh et al. · Pohang University of Science and Technology (POSTECH) · Vibrotactile Feedback & Skin Stimulation · Haptic Wearables · CHI
Substituting Motion Effects with Vibrotactile Effects for 4D Experiences
In this paper, we present two methods to substitute motion effects with vibrotactile effects in order to improve the 4D experiences of viewers. This work was motivated by the need for more affordable 4D systems for individual users. Our sensory substitution algorithms convert motion commands to vibrotactile commands for a grid display that uses multiple actuators. While one method is based on the fundamental principle of vestibular feedback, the other makes use of intuitive visually-based mapping from motion to vibrotactile stimulation. We carried out a user study and confirmed the effectiveness of our substitution methods in improving 4D experiences. To our knowledge, this is the first study to investigate the feasibility of replacing motion effects with much simpler and less expensive vibrotactile effects.
2018 · Jongman Seo et al. · POSTECH · Vibrotactile Feedback & Skin Stimulation · Force Feedback & Pseudo-Haptic Weight · CHI
Tactile Information Transmission by 2D Stationary Phantom Sensations
A phantom sensation refers to an illusory tactile sensation perceived midway between multiple distant stimulations on the skin. Phantom sensations have been used intensively in tactile interfaces owing to their simplicity and effectiveness. Despite that, the perceptual performance of phantom sensations is not completely understood, especially for 2D cases. This work is concerned with 2D stationary phantom sensations and their fundamental value as a means for information display. In User Study 1, we quantified the information transmission capacity using an absolute identification task of 2D phantom sensations. In User Study 2, we probed the distributions of the actual perceived positions of 2D phantom sensations. The investigations included both types of phantom sensations, within and out of the body. Our results provide general guidelines for leveraging 2D phantom sensations in the design of spatial tactile displays.
2018 · Gunhyuk Park et al. · Pohang University of Science and Technology, Max-Planck-Institute · Vibrotactile Feedback & Skin Stimulation · Visualization Perception & Cognition · CHI
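For the 1D two-actuator case underlying the phantom sensations studied above, a common textbook rendering model keeps total vibration energy constant while moving the illusory point between the actuators. The sketch below shows that energy-summation model as an assumption for illustration; it is not the perceived-position distribution the paper measures, and the 2D case generalizes it across actuator pairs.

```python
import math

def phantom_amplitudes(p, a_v=1.0):
    """Two-actuator amplitudes placing a phantom sensation at p in [0, 1].

    Energy-summation model: a1^2 + a2^2 stays equal to a_v^2, so the
    perceived intensity is roughly constant as the illusory point moves.
    A standard simplification, not this paper's fitted model.
    """
    a1 = math.sqrt(1.0 - p) * a_v  # actuator at position 0
    a2 = math.sqrt(p) * a_v        # actuator at position 1
    return a1, a2
```

For example, p = 0.5 drives both actuators at about 0.707 of the reference amplitude, placing the illusory vibration midway between them.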