GenTune: Toward Traceable Prompts to Improve Controllability of Image Refinement in Environment Design | Environment designers in the entertainment industry create imaginative 2D and 3D scenes for games, films, and television, requiring both fine-grained control of specific details and consistent global coherence. Designers have increasingly integrated generative AI into their workflows, often relying on large language models (LLMs) to expand user prompts for text-to-image generation, then iteratively refining those prompts and applying inpainting. However, our formative study with 10 designers surfaced two key challenges: (1) the lengthy LLM-generated prompts make it difficult to understand and isolate the keywords that must be revised for specific visual elements; and (2) while inpainting supports localized edits, it can struggle with global consistency and correctness. Based on these insights, we present GenTune, an approach that enhances human–AI collaboration by clarifying how AI-generated prompts map to image content. Our GenTune system lets designers select any element in a generated image, trace it back to the corresponding prompt labels, and revise those labels to guide precise yet globally consistent image refinement. In a summative study with 20 designers, GenTune significantly improved prompt-image comprehension, refinement quality and efficiency, and overall satisfaction (all p < .01) compared to current practice. A follow-up field study with two studios further demonstrated its effectiveness in real-world settings. | 2025 | Wen-Fan Wang et al. | Generative AI (Text, Image, Music, Video); Human-LLM Collaboration; Prototyping & User Testing | UIST
HeadTurner: Enhancing Viewing Range and Comfort of using Virtual and Mixed-Reality Headsets while Lying Down via Assisted Shoulder and Head Actuation | Virtual and mixed reality headsets, such as the Apple Vision Pro and Meta Quest, began supporting use in reclined postures in 2024, accommodating users who prefer or require this position. However, the surfaces on which users rest restrict shoulder and head rotation, reducing viewing range and comfort. A formative study (n=16) comparing usage while standing vs. lying down showed that head rotation range decreased from 261° to 130° horizontally and from 172° to 94.9° vertically. To improve viewing range and comfort, we present HeadTurner, a novel approach that assists user-initiated head rotations by actuating the resting surface to yield in the pitch and yaw axes. In a user study (n=16), HeadTurner significantly expanded the field of view and improved comfort compared to a fixed surface. Although VR sickness was slightly reduced with HeadTurner, the difference was not statistically significant. Overall, HeadTurner was preferred by 75% of participants. Although our proof-of-concept device was prototyped as a bed, the approach can be extended to more compact and affordable form factors, such as motorized reclining chairs, offering the potential for comfortable use of VR and MR headsets over extended periods, and inspired participants to suggest further applications for back-rested scenarios. | 2025 | En-Huei Wu et al. | National Taiwan University, HCI Lab | Mixed Reality Workspaces; Immersion & Presence Research | CHI
MR.Drum: Designing Mixed Reality Interfaces to Support Structured Learning Micro-Progression in Drumming | Learning drumming is challenging because multiple rhythms must be performed independently and simultaneously using both hands and feet. We conducted two formative studies to understand: 1) professional drumming instructors' teaching methods, and 2) drummers' current self-learning practices and pain points. All instructors deconstructed complex rhythms and limb movements and then used structured progression to teach drumming, which has not been explored by HCI research to date. Based on these findings, we developed a novel micro-progression learning framework for novice drummers that divides and structures comprehension progression (drum sequence and rhythm) and limb coordination progression into 16 stages. We also designed MR.Drum, a mixed-reality system that provides a first-person view of virtual limbs to demonstrate rhythm, limb, and drum surface dynamics, with adjustable tempo and automatic error detection. A summative user study vs. instructional videos showed that MR.Drum significantly reduced error rate and improved timing accuracy, was significantly preferred for comprehension, skill development, and user experience, and was preferred overall by all participants. | 2025 | Che Wei Wang et al. | National Taiwan University | Full-Body Interaction & Embodied Input; Brain-Computer Interface (BCI) & Neurofeedback; Mixed Reality Workspaces | CHI
AIdeation: Designing a Human-AI Collaborative Ideation System for Concept Designers | Concept designers in the entertainment industry create highly detailed, often imaginary environments for movies, games, and TV shows. Their early ideation phase requires intensive research, brainstorming, visual exploration, and combination of various design elements to form cohesive designs. However, existing AI tools focus on image generation from user specifications, lacking support for the unique needs and complexity of concept designers' workflows. Through a formative study with 12 professional designers, we captured their workflows and identified key requirements for AI-assisted ideation tools. Leveraging these insights, we developed AIdeation to support early ideation by brainstorming design concepts with flexible searching and recombination of reference images. A user study with 16 professional designers showed that AIdeation significantly enhanced creativity, ideation efficiency, and satisfaction (all p < .01) compared to current tools and workflows. A field study with 4 studios for 1 week provided insights into AIdeation's benefits and limitations in real-world projects. After the completion of the field study, two studios, covering films, television, and games, have continued to use AIdeation in their commercial projects to date, further validating AIdeation's improvement in ideation quality and efficiency. | 2025 | Wen-Fan Wang et al. | National Taiwan University, Computer Science and Information Engineering | Human-LLM Collaboration; AI-Assisted Creative Writing | CHI
SpinShot: Optimizing Both Physical and Perceived Force Feedback of Flywheel-Based, Directional Impact Handheld Devices | Real-world impacts, such as hitting a tennis ball or a baseball, generate instantaneous, directional impact forces. However, current ungrounded force feedback technologies, such as air jets and propellers, can only generate directional impulses that are 10x-10,000x weaker. We present SpinShot, a flywheel-based device with a solenoid-actuated stopper capable of generating a directional impulse of 22 Nm in 1 ms, which is more than 10x stronger than prior ungrounded directional technologies. Furthermore, we present a novel force design that reverses the flywheel immediately after the initial impact to significantly increase the perceived magnitude. We conducted a series of two formative, perceptual studies (n=16, 18), followed by a summative user experience study (n=16) that compared SpinShot vs. moving mass (solenoid) and vs. air jets in a VR baseball hitting game. Results showed that SpinShot significantly improved realism, immersion, and magnitude (p < .01) compared to both baselines, but significantly reduced comfort vs. air jets, primarily due to the 2.9x device weight. Overall, SpinShot was preferred by 63-75% of the participants. | 2024 | Chia-An Fan et al. | Force Feedback & Pseudo-Haptic Weight; Immersion & Presence Research | UIST
Exploring Augmented Reality Interface Designs for Virtual Meetings in Real-world Walking Contexts | Research has shown that walking during meetings improves creativity, memory, attention, health, and happiness. While mobile technologies have freed users from having to be stationary during virtual meetings, mobile phones pose several usability challenges, such as reduced productivity due to small screens, as well as safety concerns. In this paper, we present the first exploration of augmented reality (AR) interface design for virtual meetings in real-world walking conditions. We conducted design sessions in-situ with 16 user interface and AR designers using a 2x2 experimental design: meeting while walking in two levels of traffic conditions and with two types of meeting formats. Results show that the designed AR windows averaged 14.5 times the viewing size of the most popular smartphone screens. Also, Traffic Level significantly affected the size, opacity, and placement of windows, as well as the preference of anchoring modes, while Meeting Format significantly affected size and opacity. Furthermore, clustering analysis identified two groups of designs that can serve as initial reference designs for further customization and research. | 2024 | Chiao-Ju Chang et al. | Head-Up Display (HUD) & Advanced Driver Assistance Systems (ADAS); AR Navigation & Context Awareness | DIS
VeeR: Exploring the Feasibility of Deliberately Designing VR Motion that Diverges from Mundane, Everyday Physical Motion to Create More Entertaining VR Experiences | This paper explores the feasibility of deliberately designing VR motion that diverges from users' physical movements to turn mundane, everyday transportation motion (e.g., metros, trains, and cars) into more entertaining VR motion experiences, in contrast to prior car-based VR approaches that synchronize VR motion to physical car movement exactly. To gain insight into users' preferences for veering rate and veering direction for turning (left/right) and pitching (up/down) during the three phases of acceleration (accelerating, cruising, and decelerating), we conducted a formative, perceptual study (n=24) followed by a VR experience evaluation (n=18), all conducted on metro trains moving in a mundane, straight-line motion. Results showed that participants preferred relatively high veering rates, and preferred pitching upward during acceleration and downward during deceleration. Furthermore, while veering decreased comfort as expected, it significantly enhanced immersion (p<.01) and entertainment (p<.001), and, with comfort taken into account, the overall experience was preferred by 89% of participants. | 2024 | Pin Chun Lu et al. | National Taiwan University | Motion Sickness & Passenger Experience; Immersion & Presence Research | CHI
RoomDreaming: Generative-AI Approach to Facilitating Iterative, Preliminary Interior Design Exploration | Interior design aims to create aesthetically pleasing and functional environments within an architectural space. For a simple room, the preliminary design exploration currently takes multiple meetings and days of work for interior designers to incorporate homeowners' personal preferences through layout, furnishings, form, colors, and materials. We present RoomDreaming, a generative AI-based approach designed to facilitate preliminary interior design exploration. It empowers owners and designers to rapidly and efficiently iterate through a broad range of AI-generated, photo-realistic design alternatives, each uniquely tailored to fit actual space layouts and individual design preferences. We conducted a series of formative and summative studies with a total of 18 homeowners and 20 interior designers to help design, improve, and evaluate RoomDreaming. Owners reported that RoomDreaming effectively increased the breadth and depth of design exploration with higher efficiency and satisfaction. Designers reported that one hour of collaborative designing with RoomDreaming yielded results comparable to several days of traditional owner-designer meetings, plus days' to weeks' worth of designer work to develop and refine designs. | 2024 | Shun-Yu Wang et al. | National Taiwan University | Generative AI (Text, Image, Music, Video); Customizable & Personalized Objects | CHI
Paired-EMS: Enhancing Electrical Muscle Stimulation (EMS)-based Force Feedback Experience by Stimulating Both Muscles in Antagonistic Pairs | Electrical Muscle Stimulation (EMS) has emerged as a key wearable haptic feedback technology capable of simulating a wide range of force feedback, such as the impact force of boxing punches, the weight of virtual objects, and the reaction force from pushing on a wall. To simulate these external forces, EMS stimulates the muscles that oppose (i.e., are antagonistic to) the actual muscles that users activate, causing involuntary muscle contraction and haptic sensations that differ from real-world experiences. In this work, we propose Paired-EMS, which simultaneously stimulates both the muscles that users activate and those that prior EMS approaches stimulate (i.e., antagonistic muscle pairs) to enhance the external force feedback experience. We first conducted a small formative study (n=8) to help design the stimulation intensity of muscle pairs, then conducted a user experience study to evaluate Paired-EMS vs. prior EMS approaches for both isometric and isotonic user actions. Study results (n=32) showed that Paired-EMS significantly improved realism, harmony, and entertainment (p<.05) with similar comfort (p>.36), and was overall preferred by 78% of participants (p<.01). | 2024 | Chia-Yu Cheng et al. | National Taiwan University | Force Feedback & Pseudo-Haptic Weight; Electrical Muscle Stimulation (EMS) | CHI
AirCharge: Amplifying Ungrounded Impact Force by Accumulating Air Propulsion Momentum | Impact events, which generate directional forces with extremely short impulse durations and large force magnitudes, are prevalent in both virtual reality (VR) games and real-world experiences. However, despite recent advances in ungrounded force feedback technologies, such as air jet propulsion and propellers, these technologies remain 5-100x weaker and 10-500x slower compared to real-world impact events. For instance, they can only achieve 4 N with a minimal duration of 50-500 ms, compared to the 20-400 N forces generated within 1-5 ms for baseball, ping-pong, drumming, and tennis. To overcome these limitations, we present AirCharge, a novel haptic device that accumulates air propulsion momentum to generate instantaneous, directional impact forces. By mounting compressed air jets on rotating swingarms, AirCharge can amplify impact force magnitude by more than 10x while matching the real-world impulse duration of 3 ms. To support high-frequency impacts, we explored and evaluated a series of device designs, culminating in a novel reciprocating dual-swingarm design that leverages a reversing bevel gearbox to eliminate gyro effects and to achieve impact feedback of up to 10 Hz. User experience evaluation (n=16) showed that AirCharge significantly enhanced realism and was preferred by participants compared to air jets without the charging mechanism. | 2023 | Po-Yu Chen et al. | Force Feedback & Pseudo-Haptic Weight | UIST
DrivingVibe: Enhancing VR Driving Experience using Inertia-based Vibrotactile Feedback around the Head | We present DrivingVibe, which explores vibrotactile feedback designs around the head to enhance VR driving motion experiences. We propose two approaches that use a 360-degree vibrotactile headband: 1) mirroring and 2) 3D inertia-based. The mirroring approach extends the vibrotactile patterns of handheld controllers to actuate the entire headband uniformly. The 3D inertia-based approach uses the acceleration telemetry data that driving games/simulators export to motion platforms to generate directional vibration patterns, including: i) centrifugal forces, ii) horizontal acceleration/deceleration, and iii) vertical motion due to rough terrain. The two approaches are complementary: the mirroring approach supports all driving games because it does not require telemetry data, while the 3D inertia-based approach provides higher feedback fidelity for games that provide such data. We conducted a 24-person user experience evaluation in both passive passenger mode and active driving mode. Study results showed that both DrivingVibe designs significantly improved realism, immersion, and enjoyment (p<.01) with large effect sizes for the VR driving experiences. For overall preference, 88% (21/24) of participants preferred DrivingVibe, with a 2:1 preference for 3D inertia-based vs. mirroring designs (14 vs. 7 participants). For immersion and enjoyment, 96% (23/24) of participants preferred DrivingVibe, with nearly a 3:1 preference (17 vs. 6 participants) for the 3D inertia-based design. | 2023 | Neng-Hao Yu et al. | Head-Up Display (HUD) & Advanced Driver Assistance Systems (ADAS); In-Vehicle Haptic, Audio & Multimodal Feedback | MobileHCI
TurnAhead: Designing 3-DoF Rotational Haptic Cues to Improve First-person Viewing (FPV) Experiences | First-person view (FPV) drones are a recently developed category of drones designed for precision flying and for capturing exhilarating experiences that could not be captured before, such as navigating through tight indoor spaces and flying extremely close to subjects of interest. FPV viewing experiences, while exhilarating, typically have frequent rotations that can lead to visually induced discomfort. We present TurnAhead, which uses 3-DoF rotational haptic cues that correspond to camera rotations to improve the comfort, immersion, and enjoyment of FPV experiences. It uses headset-mounted air jets to provide ungrounded rotational forces and is the first device to support rotation around all 3 axes: yaw, pitch, and roll. We conducted a series of perception and formative studies to explore the design space of timing and intensity of haptic cues, followed by a user experience evaluation, for a combined total of 44 participants (n=12, 8, 6, 18). Results showed that TurnAhead significantly improved overall comfort, immersion, and enjoyment, and was preferred by 89% of participants. | 2023 | Bo-Cheng Ke et al. | National Taiwan University | Force Feedback & Pseudo-Haptic Weight; Drone Interaction & Control | CHI
RealityLens: A User Interface for Blending Customized Physical World View into Virtual Reality | Research has enabled virtual reality (VR) users to interact with the physical world by blending the physical world view into the virtual environment. However, current solutions are designed for specific use cases and hence are not capable of covering users' varying needs for accessing information about the physical world. This work presents RealityLens, a user interface that allows users to peep into the physical world in VR through the reality lenses they deploy for their needs. For this purpose, we first conducted a preliminary study with experienced VR users to identify users' needs for interacting with the physical world, which led to a set of features for customizing the scale, placement, and activation method of a reality lens. We evaluated the design in a user study (n=12) and collected feedback from participants engaged in two VR applications while encountering a range of interventions from the physical world. The results show that users' VR presence tends to be better preserved when interacting with the physical world with the support of the RealityLens interface. | 2022 | Peng-Jui Wang et al. | Mixed Reality Workspaces | UIST
AirRacket: Perceptual Design of Ungrounded, Directional Force Feedback to Improve Virtual Racket Sports Experiences | We present AirRacket, perceptual modeling and design of ungrounded, directional force feedback for virtual racket sports. Using compressed air propulsion jets to provide directional impact forces, we iteratively designed for three popular sports that span a wide range of force magnitudes: ping-pong, badminton, and tennis. To address the limited force magnitude of ungrounded force feedback technologies, we conducted a perception study, which discovered the novel illusion that users perceive larger impact force magnitudes with longer impact durations, by an average factor of 2.57x. Through a series of formative, perceptual, and user experience studies with a combined total of 72 unique participants, we explored several perceptual designs using force magnitude scaling and duration scaling methods to expand the dynamic range of perceived force magnitude. Our user experience evaluation showed that perceptual designs can significantly improve realism and preference vs. physics-based designs for ungrounded force feedback systems. | 2022 | Ching-Yi Tsai et al. | National Taiwan University | Force Feedback & Pseudo-Haptic Weight | CHI
HeadWind: Enhancing Teleportation Experience in VR by Simulating Air Drag during Rapid Motion | Teleportation, which instantly moves users from their current location to the target location, has become the most popular locomotion technique in VR games. It enables fast navigation with reduced VR sickness but results in significantly reduced immersion. We present HeadWind, a novel approach to improve the experience of teleportation by simulating the haptic sensation of air drag when rapidly moving through the air in real life. Specifically, HeadWind modulates bursts of compressed air to the face and uses multiple nozzles to provide directional cues. To design the wearable device and to model airflow speed and duration for teleportation, we conducted three formative studies and a design session. User experience evaluation with 24 participants showed that HeadWind significantly improved realism, immersion, and enjoyment of teleportation in VR (p<.01) with large effect sizes (r>0.5), and was preferred by 96% of participants. | 2022 | Chun-Miao Tseng et al. | National Taiwan University | Mid-Air Haptics (Ultrasonic); Full-Body Interaction & Embodied Input | CHI
MotionRing: Creating Illusory Tactile Motion around the Head using 360° Vibrotactile Headbands | We present MotionRing, a vibrotactile headband that creates illusory tactile motion around the head by controlling the timing of a 1-D, 360° sparse array of vibration motors. Its unique ring shape enables symmetric and asymmetric haptic motion experiences, such as when users pass through a medium and when an object passes nearby in any direction. We first conducted a perception study to understand how factors such as vibration motor timing, spacing, duration, intensity, and head region affect the perception of apparent tactile motion. Results showed that illusory tactile motion around the head can be achieved with 12 and 16 vibration motors with angular speed between 0.5-4.9 revolutions per second. We developed a symmetric and an asymmetric tactile motion pattern to enhance the experience of teleportation in VR and dodging footballs, respectively. We conducted a user study to compare the experience of MotionRing vs. static vibration patterns and visual-only feedback. Results showed that illusory tactile motion significantly improved users' perception of directionality and enjoyment of motion events, and was most preferred by users. | 2021 | Shao-Yu Chu et al. | Vibrotactile Feedback & Skin Stimulation; Social & Collaborative VR; Immersion & Presence Research | UIST
JetController: High-speed Ungrounded 3-DoF Force Feedback Controllers using Air Propulsion Jets | JetController is a novel haptic technology capable of supporting high-speed and persistent 3-DoF ungrounded force feedback. It uses high-speed pneumatic solenoid valves to modulate compressed air to achieve 20-50 Hz of full impulses at 4.0-1.0 N, and combines multiple air propulsion jets to generate 3-DoF force feedback. Compared to propeller-based approaches, JetController supports 10-30 times faster impulse frequency, and its handheld device is significantly lighter and more compact. JetController supports a wide range of haptic events in games and VR experiences, from firing automatic weapons in games like Halo (15 Hz) to slicing fruits in Fruit Ninja (up to 45 Hz). To evaluate JetController, we integrated our prototype with two popular VR games, Half-Life: Alyx and Beat Saber, to support a variety of 3D interactions. Study results showed that JetController significantly improved realism, enjoyment, and overall experience compared to commercial vibrating controllers, and was preferred by most participants. | 2021 | Yu-Wei Wang et al. | National Taiwan University | Force Feedback & Pseudo-Haptic Weight; 360° Video & Panoramic Content | CHI
HapticSeer: A Multi-channel, Black-box, Platform-agnostic Approach to Detecting Video Game Events for Real-time Haptic Feedback | Haptic feedback significantly enhances virtual experiences. However, supporting haptics currently requires modifying the codebase, making it impractical to add haptics to popular, high-quality experiences such as best-selling games, which are typically closed-source. We present HapticSeer, a multi-channel, black-box, platform-agnostic approach to detecting game events for real-time haptic feedback. The approach is based on two key insights: 1) all games have 3 types of data streams: video, audio, and controller I/O, that can be analyzed in real-time to detect game events, and 2) a small number of user interface design patterns are reused across most games, so that event detectors can be reused effectively. We developed an open-source HapticSeer framework and implemented several real-time event detectors for commercial PC and VR games. We validated system correctness and real-time performance, and discuss feedback from several haptics developers who used the HapticSeer framework to integrate research and commercial haptic devices. | 2021 | Yu-Hsin Lin et al. | National Taiwan University | Vibrotactile Feedback & Skin Stimulation; Voice User Interface (VUI) Design; Game UX & Player Behavior | CHI
ElastiLinks: Force Feedback between VR Controllers with Dynamic Points of Application of Force | Force feedback is commonly used to enhance realism in virtual reality (VR). However, current work mainly focuses on providing different force types or patterns, and does not investigate how a proper point of application of force (PAF), i.e., where the resultant force is applied, affects users' experience. For example, users perceive resistive force without torque when pulling a virtual bow, but with torque when pulling a virtual slingshot. Therefore, we propose a set of handheld controllers, ElastiLinks, to provide force feedback between controllers with dynamic PAFs. A rotatable track on each controller provides a dynamic PAF, and two common types of force feedback, resistive force and impact, are produced by two links, respectively. We performed a force perception study to ascertain how well users can distinguish resistive and impact force levels between controllers. Based on the results, we conducted another perception study to understand users' ability to distinguish PAF offset and rotation differences. Finally, we performed a VR experience study showing that force feedback with dynamic PAFs enhances the VR experience. | 2020 | Tzu-Yun Wei et al. | Force Feedback & Pseudo-Haptic Weight; Full-Body Interaction & Embodied Input; Immersion & Presence Research | UIST
WalkingVibe: Reducing Virtual Reality Sickness and Improving Realism while Walking in VR using Unobtrusive Head-mounted Vibrotactile Feedback | Virtual Reality (VR) sickness is common with symptoms such as headaches, nausea, and disorientation, and is a major barrier to using VR. We propose WalkingVibe, which applies unobtrusive vibrotactile feedback to VR walking experiences, reducing VR sickness and discomfort while improving realism. Feedback is delivered through two small vibration motors behind the ears at a frequency that strikes a balance between inducing vestibular response and minimizing annoyance. We conducted a 240-person study to explore how visual, audio, and various tactile feedback designs affect the locomotion experience of users walking passively in VR while seated statically in reality. Results showed that timing and location of tactile feedback have significant effects on VR sickness and realism. With WalkingVibe, the 2-sided, step-synchronized design significantly reduces VR sickness and discomfort while significantly improving realism. Furthermore, its unobtrusiveness and ease of integration make WalkingVibe a practical approach for improving VR experiences with new and existing VR headsets. | 2020 | Yi-Hao Peng et al. | National Taiwan University | Vibrotactile Feedback & Skin Stimulation; Immersion & Presence Research | CHI