The sense of touch plays a crucial role in interactive behavior within virtual spaces, particularly when visual attention is unavailable. Although haptic feedback has been widely used to compensate for missing visual cues, the use of tactile information as a predictive feedforward cue to guide hand movements remains underexplored and lacks theoretical grounding. This study introduces a fingertip aero-haptic rendering method and investigates its effectiveness in directing hand movements during eyes-free spatial interactions. The wearable device incorporates a multichannel micro-airflow chamber that delivers adjustable tactile effects on the fingertips. A first study verified that tactile directional feedforward cues significantly improve users' ability to acquire targets eyes-free, and that users rely heavily on haptic indications rather than spatial memory to control their hands. A follow-up study examined how enriched tactile feedforward cues help users determine precise target positions during eyes-free interactions, and assessed the learning effort required. The haptic feedforward effect holds great practical promise for eyes-free design in virtual reality. In future work, we aim to integrate cognitive models with tactile feedforward cues and apply richer tactile feedforward information to alleviate users' perceptual deficiencies.
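The core idea of a directional feedforward cue can be illustrated with a minimal sketch: map the hand-to-target direction onto a ring of airflow channels around the fingertip and scale intensity with the remaining distance. This is not the authors' implementation; the channel count `N_CHANNELS` and the saturation distance `MAX_DIST` are assumptions for illustration (the paper says only "multichannel").

```python
import math

N_CHANNELS = 8   # assumed number of airflow channels around the fingertip
MAX_DIST = 0.5   # assumed distance (m) at which airflow intensity saturates

def feedforward_cue(hand, target):
    """Return (channel index, intensity in 0..1) guiding the hand toward target.

    hand, target: (x, y) positions in the interaction plane.
    The channel whose orientation best matches the hand-to-target
    bearing is activated; intensity grows with distance to the target.
    """
    dx, dy = target[0] - hand[0], target[1] - hand[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return None, 0.0  # on target: no directional cue needed
    bearing = math.atan2(dy, dx) % (2 * math.pi)
    channel = round(bearing / (2 * math.pi / N_CHANNELS)) % N_CHANNELS
    intensity = min(dist / MAX_DIST, 1.0)
    return channel, intensity
```

For example, a target directly "above" the hand at 0.25 m activates the channel a quarter-turn around the ring at half intensity; how intensity and channel resolution trade off against perceptual discriminability is exactly the kind of question the user studies address.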
Xiaofei Ren
Jian He
Teng Han
Virtual Reality & Intelligent Hardware
Beijing University of Technology
Institute of Software, Chinese Academy of Sciences
DOI: https://doi.org/10.1016/j.vrih.2023.12.001