391 results for "haptic rendering"
Search Results
2. Kernel Principal Components Analysis of Lateral Force Data for Quantitative Evaluation of 3D Features Rendering Fidelity
- Author
-
Xiaoying Sun, Guohong Liu, and Zhu Shuangyun
- Subjects
Computer science, Significant difference, Fidelity, Pattern recognition, Haptic rendering, Computer Science Applications, Rendering (computer graphics), Spatial similarity, Kernel (image processing), Control and Systems Engineering, Principal component analysis, Artificial intelligence, Electrical and Electronic Engineering - Abstract
Haptic rendering technology is increasingly oriented toward representing realistic interaction with the physical world. A parallel challenge is how to achieve an objective, quantitative evaluation of haptic rendering fidelity. This article addresses this issue by employing kernel principal components analysis (KPCA) to derive a quantitative indicator for assessing the rendering fidelity of 3D features. The KPCA approach explores the eigen-subspace properties of two lateral force sets, one for a real 3D object and one for its virtual counterpart, and obtains the quantitative evaluation indicator (spatial similarity) by computing the inner product of the two kernel principal features. Finally, the proposed KPCA approach is applied to assess the electrostatic tactile rendering of 3D sinusoidal humps on touchscreens and is compared with the linear principal components analysis (PCA) method and with perceptual ratings. Statistical analysis shows that the spatial similarity obtained with the KPCA approach is significantly higher than that obtained with linear PCA, and that there is no significant difference in averaged similarity between the KPCA approach and the perceptual ratings.
- Published
- 2021
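The spatial-similarity indicator described in the entry above can be sketched roughly as follows. This is a hypothetical illustration, not the paper's implementation: the RBF kernel, `gamma`, the number of components, and the cosine-style normalization are all assumptions, and it assumes the two lateral-force sets contain the same number of samples.

```python
# Sketch: KPCA-based spatial similarity between two lateral-force sets.
import numpy as np
from numpy.linalg import eigh

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel matrix between row-vector sample sets X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca_features(K, n_components=2):
    # Center the kernel matrix and project onto its leading eigenvectors.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    w, v = eigh(Kc)                          # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:n_components]
    return Kc @ v[:, idx]                    # kernel principal features

def spatial_similarity(F_real, F_virtual, gamma=1.0, n_components=2):
    # Normalized inner product of the two sets' kernel principal features;
    # 1.0 means identical eigen-subspace structure.
    a = kpca_features(rbf_kernel(F_real, F_real, gamma), n_components).ravel()
    b = kpca_features(rbf_kernel(F_virtual, F_virtual, gamma), n_components).ravel()
    return abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
```

Identical force sets yield a similarity of 1.0 by construction; dissimilar sets score lower.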
3. Crowd Navigation in VR: exploring haptic rendering of collisions
- Author
-
Florian Berton, Marco Aggravi, Fabien Grzeskowiak, Alexandre Bonneau, Anne-Hélène Olivier, Ludovic Hoyet, Julien Pettré, Alberto Jovane, and Claudio Pacchierotti (Inria Rennes – Bretagne Atlantique; IRISA: Université de Rennes, INSA Rennes, ENS Rennes, Université de Bretagne Sud, CentraleSupélec, CNRS, IMT Atlantique; RAINBOW and MIMETIC teams). This project received funding from the European Union's Horizon 2020 research and innovation programme under grant agreements No 779942 (CROWDBOT) and No 856879 (PRESENT), as well as from the ANR OPMoPS project (ANR-16-SEBM-0004).
- Subjects
Computer science, Virtual reality, Haptic rendering, Rendering (computer graphics), Feedback, Crowds, Human–computer interaction, Human interaction, Computer Graphics, Humans, Computer Simulation, Haptic technology, Software engineering, Computer Graphics and Computer-Aided Design, Signal Processing, Crowd, Artificial intelligence & image processing, Computer Vision and Pattern Recognition, Crowd simulation, Software - Abstract
Virtual reality (VR) is a valuable experimental tool for studying human movement, including the analysis of interactions during locomotion tasks for developing crowd simulation algorithms. However, these studies are generally limited to distant interactions in crowds, owing to the difficulty of rendering realistic sensations of collision in VR. In this work, we explore the use of wearable haptics to render contacts during virtual crowd navigation. We focus on the behavioural changes that occur with or without haptic rendering during a navigation task in a dense crowd, as well as on potential after-effects introduced by the use of haptic rendering. Our objective is to provide recommendations for designing VR setups to study crowd navigation behaviour. To this end, we designed an experiment (N=23) in which participants navigated a crowded virtual train station without, then with, and then again without haptic feedback of their collisions with virtual characters. Results show that providing haptic feedback improved the overall realism of the interaction, as participants more actively avoided collisions. We also noticed a significant after-effect in users' behaviour when haptic rendering was disabled again in the third part of the experiment. Nonetheless, haptic feedback did not have any significant impact on users' sense of presence and embodiment.
- Published
- 2022
4. Real-Time Teleoperation of Magnetic Force-Driven Microrobots With 3D Haptic Force Feedback for Micro-Navigation and Micro-Transportation
- Author
-
Jaeyeon Lee, Min Jun Kim, Chung Hyuk Park, and Xiao Zhang
- Subjects
Control and Optimization, Computer science, Mechanical Engineering, Biomedical Engineering, Nanoscience & nanotechnology, Haptic rendering, Haptic force feedback, Computer Science Applications, Human-Computer Interaction, Artificial Intelligence, Control and Systems Engineering, Teleoperation, Obstacle avoidance, Computer Vision and Pattern Recognition, Simulation, Haptic technology - Abstract
Untethered mobile microrobots controlled by an external magnetic gradient field can be employed in advanced biomedical applications inside the human body, such as cell therapy, micromanipulation, and noninvasive surgery. Haptic technology and telecommunication, in turn, can extend the potential of untethered microrobot applications: users can communicate with the robot operating system remotely to manipulate microrobots with haptic feedback, and haptic sensations reconstructed from the wirelessly communicated information let human operators experience forces while controlling the microrobots. The proposed system is composed of a haptic device and a magnetic tweezer system, integrated through a teleoperation technique based on network communication. Users can control the microrobots remotely and feel haptic interactions with the remote environment in real time. The 3D haptic environment is reconstructed dynamically by a model-free haptic rendering algorithm using the 2D planar image input of the microscope. The interaction between microrobots and environmental objects is haptically rendered as 3D objects to achieve spatial haptic operation with obstacle avoidance. Moreover, path generation and path guidance forces provide virtual interaction that helps human operators manipulate the microrobot along a near-optimal path in path-following tasks. Potential applications of the presented system include remote medical treatment at different sites, remote drug delivery without physically penetrating the skin, remotely controlled cell manipulation, and biopsy without a biopsy needle.
- Published
- 2021
5. AI-based Automatic Spine CT Image Segmentation and Haptic Rendering for Spinal Needle Insertion Simulator
- Author
-
Gun Choi, Ikjong Park, Wan Kyun Chung, and Keehoon Kim
- Subjects
Medical robotics, Computer science, Computer vision, Needle insertion, Image segmentation, Artificial intelligence, Haptic rendering, Spine CT - Published
- 2020
6. Development of a haptic communication system for fashion image experience in a virtual environment
- Author
-
Sang-Youn Kim, Dong-Soo Choi, Jisoo Ha, and Jongsun Kim
- Subjects
Computer science, Haptic rendering, Image (mathematics), Key factors, Elasticity, Human–computer interaction, Virtual machine, Tactile sense, Haptic communication, Haptic technology - Abstract
The goal of this study was to develop a haptic communication system that can convey the tactile sensation of fashion materials in a virtual environment, and to verify how effectively and realistically the system's virtual fabric images deliver the tactile sensation of actual fabric. First, a literature review was conducted to define the tactile attributes of fashion materials to be implemented in the haptic communication system. A questionnaire for evaluating the tactile attributes of fashion materials was then developed. Next, a haptic communication system was designed to convey fashion image experiences in a virtual environment, and a haptic rendering model was proposed for it. The effectiveness of the haptic communication system was evaluated through a user evaluation experiment, which verified both the validity of the evaluation questions pertaining to the tactile attributes and the effects of the system. Factor analysis of the tactile-attribute evaluations identified density, thickness, and elasticity of the material as the key factors. Comparing tactile perception through the haptic system with perception through direct touch showed that, for density and thickness, the haptically rendered experience led to greater perceived reality, whereas this was not the case for elasticity.
- Published
- 2020
7. Force quantification and simulation of pedicle screw tract palpation using direct visuo-haptic volume rendering
- Author
-
Georg Rauter, Esther I. Zoller, Gregory F. Jost, Philippe C. Cattin, Nicolas Gerig, and Balázs Faludi
- Subjects
Haptic rendering, Virtual reality, CT, Medical simulation, Human–robot interaction, Male, Swine, Computer science, Biomedical Engineering, Health Informatics, Motion capture, Palpation, Motion, User–Computer Interface, Pedicle Screws, Medicine, Animals, Torque, Torque sensor, Computer Simulation, Radiology, Nuclear Medicine and Imaging, Simulation Training, Haptic technology, Volume rendering, General Medicine, Computer Graphics and Computer-Aided Design, Computer Science Applications, Data set, Spinal Fusion, Feasibility Studies, Surgery, Computer Vision and Pattern Recognition, Neurology & neurosurgery - Abstract
Purpose: We present a feasibility study for the visuo-haptic simulation of pedicle screw tract palpation in virtual reality, using an approach that requires no manual processing or segmentation of the volumetric medical data set. Methods: In a first experiment, we quantified the forces and torques present during the palpation of a pedicle screw tract in a real boar vertebra. We equipped a ball-tipped pedicle probe with a 6-axis force/torque sensor and a motion capture marker cluster, simultaneously recording the pose of the probe relative to the vertebra and the forces and torques generated during palpation. This allowed us to replay the recorded palpation movements in our simulator and to fine-tune the haptic rendering to approximate the measured forces and torques. In a second experiment, we asked two neurosurgeons to palpate a virtual version of the same vertebra in our simulator while we logged the forces and torques sent to the haptic device. Results: In the experiments with the real vertebra, the maximum measured force along the longitudinal axis of the probe was 7.78 N and the maximum measured bending torque was 0.13 Nm. In an offline simulation of the motion of the pedicle probe recorded during the palpation of a real pedicle screw tract, our approach generated forces and torques similar in magnitude and progression to the measured ones. When surgeons tested our simulator, the distributions of the computed forces and torques were similar to the measured ones; however, higher forces and torques occurred more frequently. Conclusions: We demonstrated the suitability of direct visual and haptic volume rendering for simulating a specific surgical procedure. Our approach of fine-tuning the simulation by measuring the forces and torques prevalent while palpating a real vertebra produced promising results. (International Journal of Computer Assisted Radiology and Surgery, 15(11), ISSN 1861-6410, 1861-6429)
- Published
- 2020
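The segmentation-free idea in the entry above, deriving probe resistance directly from volumetric intensities, can be illustrated with a minimal sketch. The transfer function, `stiffness`, and `threshold` are invented placeholders (not the authors' tuned values); only the 7.78 N clamp reflects the maximum force they measured on the real vertebra.

```python
# Sketch: direct haptic volume rendering of axial probe resistance,
# with no prior segmentation of the CT volume.
import numpy as np

def probe_force(volume, tip_ijk, stiffness=0.004, threshold=300.0, f_max=7.78):
    # Map the voxel intensity at the probe tip (e.g. Hounsfield units)
    # directly to a resisting force: denser tissue -> larger force,
    # clamped to the maximum force measured on the real vertebra.
    idx = np.clip(np.round(tip_ijk).astype(int), 0, np.array(volume.shape) - 1)
    intensity = float(volume[tuple(idx)])
    return min(f_max, stiffness * max(0.0, intensity - threshold))
```

Soft tissue below the threshold produces no resistance; a dense (bone-like) voxel produces a force proportional to its intensity, capped at 7.78 N.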
8. Realistic haptic rendering of hyper-elastic material via measurement-based FEM model identification and real-time simulation
- Author
-
Ibragim R. Atadjanov, Seokhee Jeon, Seungkyu Lee, and Arsen Abdulali
- Subjects
Data collection, Computer science, General Engineering, System identification, Software engineering, Solver, Haptic rendering, Computer Graphics and Computer-Aided Design, Finite element method, Rendering (computer graphics), Human-Computer Interaction, Real-time simulation, Artificial intelligence & image processing, Simulation, Haptic technology - Abstract
This paper presents a measurement-based FEM (finite element method) modeling and haptic rendering framework for objects with hyper-elastic deformation properties. A complete set of methods covering the whole measurement-based modeling/rendering paradigm is newly designed and implemented, with special emphasis on haptic feedback realism. To this end, we first build a data collection setup that accurately captures shape deformation and response forces during compressive deformation of cylindrical material samples. With this setup, training and testing data sets are collected from four silicone objects with various material profiles. Then, an objective function incorporating both shape deformation and reactive forces is designed and used to identify material parameters from the training data via a genetic algorithm. For real-time haptic rendering, an optimization-based FEM solver is adopted, ensuring an update rate of around 500 Hz. The whole procedure is evaluated through numerical and psychophysical experiments. The numerical rendering error is calculated from the difference between simulated and actually measured deformation forces; the errors are also compared to the human perceptual threshold and found to be perceptually negligible. Overall realism of the feedback is also assessed through a psychophysical experiment: twelve participants rated the similarity between real and modeled objects, and the results show the rendering quality to be at a reasonable level of realism, with positive user feedback.
- Published
- 2020
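The parameter-identification step in the entry above (fitting material parameters by minimizing a combined shape/force objective) can be sketched as below. Here `sim` stands in for the FEM forward simulation, the simple elitist mutation loop is a stand-in for the paper's genetic algorithm, and all weights and hyperparameters are illustrative assumptions.

```python
# Sketch: evolutionary identification of material parameters from
# measured forces and deformed shapes.
import numpy as np

def objective(params, sim, forces_meas, shapes_meas, w_f=0.7, w_s=0.3):
    # Weighted relative error in reactive forces and shape deformation.
    f_sim, s_sim = sim(params)
    ef = np.linalg.norm(f_sim - forces_meas) / (np.linalg.norm(forces_meas) + 1e-12)
    es = np.linalg.norm(s_sim - shapes_meas) / (np.linalg.norm(shapes_meas) + 1e-12)
    return w_f * ef + w_s * es

def identify(sim, forces_meas, shapes_meas, bounds, pop=40, gens=60, seed=0):
    # Elitist mutation loop: keep the best half, mutate copies of it.
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    P = rng.uniform(lo, hi, size=(pop, len(lo)))
    for _ in range(gens):
        cost = np.array([objective(p, sim, forces_meas, shapes_meas) for p in P])
        elite = P[np.argsort(cost)[: pop // 2]]                  # selection
        kids = elite[rng.integers(len(elite), size=pop - len(elite))]
        kids = np.clip(kids + rng.normal(0, 0.05 * (hi - lo), kids.shape), lo, hi)
        P = np.vstack([elite, kids])                             # mutation
    cost = np.array([objective(p, sim, forces_meas, shapes_meas) for p in P])
    return P[np.argmin(cost)]
```

With a toy linear "simulator" in place of the FEM, the loop recovers known parameters to within mutation noise.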
9. Ten Little Fingers, Ten Little Toes: Can Toes Match Fingers for Haptic Discrimination?
- Author
-
Antoine Weill-Duflos, Jeremy R. Cooperstock, Feras Al Taha, Jeffrey R. Blum, and Preeti Vyas
- Subjects
Adult, Male, Adolescent, Computer science, Sensory system, Haptic rendering, Experimental psychology, Fingers, Young Adult, Discrimination (Psychological), Sensory feedback, Physical Stimulation, Perception, Humans, Psychology and cognitive sciences, Computer vision, Human factors, Haptic technology, Toes, Equipment Design, Computer Science Applications, Human-Computer Interaction, Vibrotactile stimulus, Touch Perception, Touch, Female, Artificial intelligence - Abstract
In comparison with fingers, toes are relatively unexplored candidates for multi-site haptic rendering. This is likely due to their reported susceptibility to erroneous perception of haptic stimuli, owing to their anatomical structure. We hypothesize that this shortcoming can be mitigated by careful design of the tactile encoding to account for the idiosyncrasies of toe perception. Our efforts to design such an encoding achieved an improvement in perceptual accuracy of 18% for poking and 16% for vibrotactile stimuli. As we demonstrate in this article, the perceptual accuracy achieved by the proposed tactile encoding approaches that of the fingers, allowing the toes to be considered a practical location for rendering multi-site haptic stimuli.
- Published
- 2020
10. Perceptually Correct Haptic Rendering in Mid-Air Using Ultrasound Phased Array
- Author
-
Ahsan Raza, Inwook Hwang, Seokhee Jeon, Waseem Hassan, and Tatyana Ogay
- Subjects
Computer science, Phased array, Electrical & electronic engineering, Haptic rendering, Rendering (computer graphics), Control and Systems Engineering, Computer vision, Ultrasonic sensor, Artificial intelligence, Electrical and Electronic Engineering, Haptic technology - Abstract
This paper provides a perceptually transparent rendering algorithm for an ultrasound-based mid-air haptic device. In a series of experiments, we derive a systematic mapping function relating the device command value to the user's perceived magnitude of mid-air vibration feedback. The algorithm is designed for an ultrasonic mid-air haptic interface capable of displaying vibrotactile feedback at a focal point in mid-air through the ultrasound phased array technique. The perceived magnitude at the focal point depends on input parameters such as input command intensity, modulation frequency, and the position of the focal point in the workspace. The algorithm automatically tunes these parameters to ensure that the desired perceived output at the user's hand is precisely controlled. Through a series of experiments, the effect of the aforementioned parameters on the physical output pressure is mapped, and the effect of this output pressure on the final perceived magnitude is formulated, yielding the mapping from the different parameters to the perceived magnitude. Finally, the overall transparent rendering algorithm was evaluated, showing better perceptual quality than rendering with a simple intensity command.
- Published
- 2020
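The two-stage mapping in the entry above (command parameters → output pressure → perceived magnitude) and its inversion for transparent rendering can be sketched as follows. The power-law forms and every constant here are made-up placeholders, not the paper's fitted psychophysical functions.

```python
# Sketch: forward psychophysical chain and its analytic inverse
# for perceptually transparent mid-air rendering.

def output_pressure(command, gain=1.0, a=0.8):
    # Forward device model: focal-point pressure vs. command intensity
    # (sublinear; `gain` stands in for frequency/position dependence).
    return gain * command ** a

def perceived_magnitude(pressure, k=2.0, b=0.6):
    # Stevens-style power law from output pressure to perceived magnitude.
    return k * pressure ** b

def command_for(target, gain=1.0, a=0.8, k=2.0, b=0.6):
    # Invert perceived_magnitude(output_pressure(command)) so that a
    # desired perceived magnitude yields the command value to send.
    p = (target / k) ** (1.0 / b)
    return (p / gain) ** (1.0 / a)
```

The round trip is exact by construction: rendering `command_for(t)` produces a perceived magnitude of `t` under this model.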
11. Examining the Effect of Haptic Factors for Vascular Palpation Skill Assessment Using an Affordable Simulator
- Author
-
Deepak Vadivalagan, Ravikiran B. Singapogu, Jared Wells, Joe Bible, and Zhanhe Liu
- Subjects
Medical technology, Palpation, Computer science, Medical simulator, Haptic rendering, Education, Computer applications to medicine (medical informatics), Behavioral disciplines and activities, Rendering (computer graphics), Skills training, Healthcare delivery, Medicine, Task analysis, Performance assessment, Skill training, Simulation, Clinical skills, Haptic technology - Abstract
Goal: Simulators that incorporate haptic feedback for clinical skills training are increasingly used in medical education. This study addresses the neglected aspect of rendering simulated feedback for vascular palpation skills training by systematically examining the effect of haptic factors on performance. Methods: A simulator-based approach to examining palpation skill is presented. Novice participants with and without minimal previous palpation training performed a palpation task on a simulator that rendered controlled vibratory feedback under various conditions. Results: Five objective metrics were employed to analyze participants' performance, yielding key findings for quantifying palpation skill. Participants' palpation accuracy was influenced by all three haptic factors, with effects ranging from moderate to statistically significant. Duration, Total Path Length, and Ratio of Correct Movement also demonstrated utility for quantifying performance. Conclusions: We demonstrate that our affordable simulator is capable of rendering controlled haptic feedback suitable for skills training. Further, the metrics presented in this study can be used for structured palpation skills assessment and training, potentially improving healthcare delivery.
- Published
- 2020
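Two of the objective metrics named in the entry above (Duration and Total Path Length) can be computed from a logged probe trajectory. The `(t, x, y)` sample format below is an assumption for illustration, not the paper's data format.

```python
# Sketch: simple trajectory-based palpation performance metrics.
import math

def duration(samples):
    # Elapsed time (s) between the first and last logged sample.
    return samples[-1][0] - samples[0][0]

def total_path_length(samples):
    # Sum of Euclidean distances between consecutive probe positions.
    return sum(
        math.hypot(x2 - x1, y2 - y1)
        for (_, x1, y1), (_, x2, y2) in zip(samples, samples[1:])
    )
```

Shorter durations and shorter path lengths would indicate more efficient palpation search behavior.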
12. Effect of 2.5D haptic feedback on virtual object perception via a stylus
- Author
-
Jaeyoung Park, Gyuwon Kim, and Donghyun Hwang
- Subjects
Multidisciplinary, Computer science, Science, Smart device, Haptic rendering, Engineering, Virtual image, Perception, Medicine, Psychology, Computer vision, Artificial intelligence, Stylus, Haptic technology - Abstract
As touch screen technologies have advanced, the digital stylus has become one of the essential accessories for a smart device. However, most digital styluses so far provide only limited tactile feedback to the user. We focused on this limitation and noted the potential for a digital stylus to offer the sensation of realistic interaction with virtual environments on a touch screen through a 2.5D haptic system. We therefore developed a haptic stylus with SMA (shape memory alloy) actuation and a 2.5D haptic rendering algorithm that provides lateral skin-stretch feedback to mimic the interaction force between the fingertip and a stylus probing over a bumpy surface. We conducted two psychophysical experiments to evaluate the effect of 2.5D haptic feedback on the perception of virtual object geometry. Experiment 1 investigated the perception of virtual bump size felt via the proposed lateral skin-stretch stylus and via a vibrotactile stylus as reference. Experiment 2 tested participants' ability to count the number of virtual bumps rendered via the two types of haptic stylus. The results of Experiment 1 indicate that participants felt the size of virtual bumps rendered with the lateral skin-stretch stylus significantly more sensitively than with the vibrotactile stylus. Similarly, participants counted the number of virtual bumps rendered with the lateral skin-stretch stylus significantly more accurately than with the vibrotactile stylus. A common result of the two experiments is a significantly longer mean trial time for the skin-stretch stylus than for the vibrotactile stylus.
- Published
- 2021
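A plausible sketch of rendering a virtual bump through lateral force, in the spirit of the 2.5D skin-stretch rendering described above: the classic gradient-based approach drives the lateral actuator against the local surface slope. The bump profile, gain `k`, and the force law f = -k·dh/dx are assumptions; the paper's actual rendering algorithm may differ.

```python
# Sketch: gradient-based lateral-force rendering of a virtual bump.
import math

def bump_height(x, amp=1.0, width=10.0):
    # Smooth Gaussian bump (mm) centered at x = 0 along the stroke axis.
    return amp * math.exp(-(x / width) ** 2)

def skin_stretch_force(x, k=0.8, amp=1.0, width=10.0):
    # Lateral skin-stretch command opposing the surface slope:
    # f = -k * dh/dx, so the stylus "resists" climbing the bump.
    dhdx = bump_height(x, amp, width) * (-2.0 * x / width ** 2)
    return -k * dhdx
```

The force vanishes at the apex and reverses sign across it, which is what lets a user feel a bump rather than a step.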
13. Haptic force rendering of rigid-body interactions: A systematic review
- Author
-
Kjell Andersson, Lei Feng, and Yang Wang
- Subjects
Computer science, Mechanical Engineering, Computer graphics (images), Mechanical engineering and machinery, Haptic rendering, Rigid body, Haptic technology, Rendering (computer graphics) - Abstract
Haptic rendering has been developing for decades; different rendering approaches have been proposed, and many factors that affect stability when rendering rigid-body interactions have been investigated. To obtain an overall understanding of the challenges in haptic rendering, we conducted a systematic review examining different haptic rendering approaches and how instability factors in rendering are handled. A total of 25 papers are reviewed to answer the following questions: (1) what are the most common haptic rendering approaches for rigid-body interaction? and (2) what are the most important factors behind instability of haptic rendering, and how can they be addressed? Through investigating these questions, we find that transparency merits further exploration and that the technical terms used to describe haptic rendering could be more standardized to push the topic forward.
- Published
- 2021
14. Towards More Effective Data Visualization Methods Using Haptics
- Author
-
Karthikan Theivendran, Katayoun Sepehri, Rayan Isran, and Ahmed Anwar
- Subjects
Data visualization, Point of interest, Human–computer interaction, Computer science, Haptic rendering, Cognitive load, Haptic technology - Abstract
Cognitive overload in complex multi-dimensional data visualization can make it challenging for users to comprehend, navigate, and determine statistical measures. To address this problem, we tested the feasibility of integrating vibrotactile and force-feedback haptics with bubble charts, evaluating several haptic rendering techniques for visualizing data. The preliminary results suggest that force feedback was effective for quickly navigating to points of interest in the data, while vibrotactile feedback was effective for representing higher dimensions and locating statistical measures of the data.
- Published
- 2021
15. Towards Functional Robotic Rehabilitation: Clinical-Driven Development of a Novel Device for Sensorimotor Hand Training
- Author
-
René M. Müri, Laura Marchal-Crespo, and Raphael Ratz
- Subjects
Rehabilitation, Computer science, Usability, Robotic rehabilitation, Haptic rendering, Somatosensory system, Sensory function, Human–computer interaction, Medicine, Haptic technology - Abstract
Currently, there is a lack of easy-to-use hand rehabilitation devices that not only retrain motor functions but also include somatosensory information from interaction with tangible virtual objects to help regain sensory function. To overcome these shortcomings, we are developing a novel haptic rehabilitation device that focuses on usability, enforces physiological movements, and provides haptic rendering capabilities.
- Published
- 2021
16. Characterizing the Effects of Haptic Rendering Parameter Variations on Perceived Kinesthetic Rendering Accuracy
- Author
-
Bolun Zhang, Michael Hagenow, Michael Gleicher, Michael R. Zinn, and Bilge Mutlu
- Subjects
Perceived realism, Computer science, Kinesthetic learning, Model parameters, Haptic rendering, Rendering (computer graphics), Selection (linguistics), Computer vision, Artificial intelligence - Abstract
To understand how the realism of a kinesthetic haptic rendering is affected by the accurate selection of the rendering model parameters, we conducted a preliminary user study in which subjects compared three real-world objects to their equivalent haptic renderings. The subjects rated rendering realism as the model parameters were varied about their nominal values. The results suggest that, for perceived realism, accuracy requirements are not equally strict across the various haptic rendering parameters.
- Published
- 2021
17. A high-performance haptic rendering system for virtual reality molecular modeling
- Author
-
Arif Pramudwiatmoko, Yutaka Ueno, Akihiko Konagaya, Gregory Gutmann, and Satoru Tsutoh
- Subjects
Computer science, Software development, DirectX, Virtual reality, Haptic rendering, Biomedical engineering, General Biochemistry, Genetics and Molecular Biology, Rendering (computer graphics), Leap Motion, Artificial Intelligence, Computer graphics (images), Software system, Neurology & neurosurgery, 3D computer graphics - Abstract
To provide a virtual reality 3D user interface for comprehensive molecular modeling, we have developed a novel haptic rendering system combining a fingertip haptic rendering device with a hand-tracking Leap Motion controller. The system manipulates virtual molecular objects with real hand motion captured by the Leap Motion controller in a virtual reality environment. A fingertip haptic rendering device attached to each finger and the wrist provides haptic display when the virtual hands manipulate virtual molecular objects. Based on preliminary software development studies using existing 3D graphics toolkits such as CHAI3D and Unity, the fingertip haptic rendering device performs reasonably for a polygon surface model and a ribbon model, but not for an atomic model, owing to low rendering performance. On the other hand, the device provides a grasping feeling for a large molecule represented by an atomic model when used with the particle simulation system running on the DirectX 12 graphics library. The haptic rendering performance of the three software systems is discussed.
- Published
- 2019
18. Development and assessment of a haptic-enabled holographic surgical simulator for renal biopsy training
- Author
-
Zhaoxiang Guo, Jun Peng, Zhibao Qin, Qiong Li, Yonghang Tai, Junsheng Shi, and Xiaoqiao Huang
- Subjects
0209 industrial biotechnology ,medicine.medical_specialty ,Computer science ,education ,Training system ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Holography ,02 engineering and technology ,Haptic rendering ,Stereo display ,Theoretical Computer Science ,law.invention ,020901 industrial engineering & automation ,law ,0202 electrical engineering, electronic engineering, information engineering ,medicine ,Immersion (virtual reality) ,Medical physics ,Surgical simulator ,ComputingMethodologies_COMPUTERGRAPHICS ,Haptic technology ,Graphics pipeline ,Pipeline (software) ,020201 artificial intelligence & image processing ,Geometry and Topology ,Software - Abstract
In this paper, a highly immersive surgical training system for renal biopsy, using holographic demonstration and haptic feedback, is presented to push the limits of virtual medical training. The proposed system comprises a holographic visual rendering pipeline reconstructed from patient-specific CT images and a haptic rendering pipeline implemented with dual-hand 6-DOF haptic devices connected to surgical instruments, together with image guidance and an immersive 3D display of the operating room environment. Twenty-four medical students and eight experienced thoracic surgeons from the Yunnan First People's Hospital were invited to evaluate our holography-based training system through subjective and objective assessment tests, respectively. Experimental results from the face, content, improvement, and construct evaluations demonstrated higher performance than the existing VR-based trainer; in particular, the puncture accuracy of the medical students' group improved by 30.8% after training.
- Published
- 2019
19. Voxel-based Haptic Rendering using Adaptive Sampling of a Local Distance Map
- Author
-
Jinah Park and Kimin Kim
- Subjects
Adaptive sampling ,Computer science ,Voxel ,business.industry ,General Engineering ,Computer vision ,Artificial intelligence ,computer.software_genre ,Haptic rendering ,business ,Distance transform ,computer ,Computer Science Applications
- Published
- 2019
20. Tactile sensitivity in ultrasonic haptics: Do different parts of hand and different rendering methods have an impact on perceptual threshold?
- Author
-
Sun Xiaoying, Weizhi Nai, and Chongyang Sun
- Subjects
lcsh:Computer engineering. Computer hardware ,Computer science ,Acoustics ,media_common.quotation_subject ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,lcsh:TK7885-7895 ,Haptic rendering ,Focused ultrasound ,Rendering (computer graphics) ,InformationSystems_MODELSANDPRINCIPLES ,Perception ,Ultrasonic sensor ,ComputingMethodologies_COMPUTERGRAPHICS ,Step method ,Haptic technology ,DC bias ,media_common - Abstract
Background Ultrasonic tactile representation utilizes focused ultrasound to create tactile sensations on the bare skin of a user’s hand without contact with a device. This study is a preliminary investigation into whether different ultrasonic haptic rendering methods affect the perceptual threshold. Methods This study conducted experiments with the adaptive step method to obtain participants’ perceptual thresholds. We examine (1) whether different parts of the palm of the hand have different perceptual thresholds; (2) whether the perceptual threshold differs when the ultrasonic focus point is stationary versus moving along different trajectories; (3) whether different moving speeds of the ultrasonic focus point influence the perceptual threshold; and (4) whether adding a DC offset to the modulating wave affects the perceptual threshold. Results The results show that the center of the palm is more sensitive to ultrasonic haptics than the fingertip; compared with a fast-moving focus point, the palm is more sensitive to a stationary or slow-moving focus point. When the modulating wave has a DC offset, the palm is sensitive to a much smaller modulation amplitude. Conclusion Future ultrasonic tactile representation systems should dynamically adjust intensity to compensate for the differences in perceptual thresholds under different rendering methods, so as to achieve more realistic ultrasonic haptics. Keywords: Ultrasonic tactile, Rendering methods, Amplitude modulation, Perceptual threshold, Human-computer interaction
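The adaptive step (staircase) procedure used to obtain the thresholds can be illustrated with a minimal sketch, assuming a simple 1-up/1-down rule with a fixed step size and the threshold taken as the mean of the reversal levels; the function `staircase_threshold` and its parameters are hypothetical, not the study's actual protocol.

```python
def staircase_threshold(respond, start, step, n_reversals=8):
    """1-up/1-down adaptive staircase: lower the level after each detection,
    raise it after each miss, and average the levels at direction reversals.
    `respond(level) -> bool` reports whether the stimulus was detected."""
    level = start
    direction = None          # -1 while descending, +1 while ascending
    reversals = []
    while len(reversals) < n_reversals:
        new_dir = -1 if respond(level) else +1
        if direction is not None and new_dir != direction:
            reversals.append(level)    # direction changed: record a reversal
        direction = new_dir
        level = max(level + new_dir * step, 0.0)
    return sum(reversals) / len(reversals)
```

For a deterministic observer that detects any level at or above the true threshold, the staircase settles into an oscillation one step below and at the threshold, so the reversal mean lands just under it.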
- Published
- 2019
21. Delay-Dependent Stability Analysis in Haptic Rendering
- Author
-
Ahmad Mashayekhi, Saeed Behbahani, Bruno Siciliano, Fanny Ficuciello, Mashayekhi, Ahmad, Behbahani, Saeed, Ficuciello, Fanny, and Siciliano, Bruno
- Subjects
0209 industrial biotechnology ,Computer science ,Mechanical Engineering ,Stability (learning theory) ,Boundary (topology) ,02 engineering and technology ,Virtual reality ,Haptic rendering ,Industrial and Manufacturing Engineering ,Delay dependent ,020901 industrial engineering & automation ,Operator (computer programming) ,Artificial Intelligence ,Control and Systems Engineering ,Control theory ,Robot ,Electrical and Electronic Engineering ,Software ,Haptic technology - Abstract
Haptic devices now have many applications in virtual reality systems. While using a haptic device, one of the main requirements is stable behavior of the system: an unstable haptic device may damage itself and even hurt its operator. This paper studies the stability of haptic devices in the presence of inevitable time delay, together with a zero-order hold, using two different methods. Both methods are based on a Lyapunov-Krasovskii functional. The first method performs a model transformation to determine the stability boundary, while the second is based on Free Weighting Matrices (FWMs). Delay-dependent stability criteria are determined by solving Linear Matrix Inequalities (LMIs). The results of the two methods are compared with each other and verified by simulations as well as experiments on a KUKA Light Weight Robot 4 (LWR4). It is concluded that using free weighting matrices introduces more unknown parameters and requires more computation, but yields less conservative results.
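As background, delay-dependent analyses of this kind are typically built on a Lyapunov-Krasovskii functional of the standard template below, for a system with delay $h$ (a generic form, not necessarily the exact functional used in the paper):

```latex
V(x_t) = x^{\top}(t)\, P\, x(t)
       + \int_{t-h}^{t} x^{\top}(s)\, Q\, x(s)\, \mathrm{d}s
       + \int_{-h}^{0}\!\int_{t+\theta}^{t} \dot{x}^{\top}(s)\, R\, \dot{x}(s)\, \mathrm{d}s\, \mathrm{d}\theta,
\qquad P, Q, R \succ 0.
```

Requiring $\dot V < 0$ along system trajectories yields delay-dependent LMI conditions in $P$, $Q$, and $R$; the free-weighting-matrix approach adds slack matrix variables to this derivation, which is what introduces the extra unknowns while reducing conservatism.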
- Published
- 2019
22. Unilateral and Bilateral Virtual Springs: Contact Transitions Unmask Device Dynamics
- Author
-
Emma Treadway and R. Brent Gillespie
- Subjects
0209 industrial biotechnology ,Computer science ,05 social sciences ,02 engineering and technology ,Human motion ,Haptic rendering ,Load cell ,050105 experimental psychology ,Computer Science Applications ,Rendering (computer graphics) ,Human-Computer Interaction ,User-Computer Interface ,020901 industrial engineering & automation ,Touch Perception ,Electric Impedance ,Humans ,0501 psychology and cognitive sciences ,Mechanical advantage ,Parasitic extraction ,Haptic perception ,Electrical impedance ,Simulation - Abstract
The study of haptic perception often makes use of haptic rendering to display the variety of impedances needed to run an experiment. Unacknowledged in many cases is the influence of the selected device hardware on what the user will feel, particularly in interactions featuring frequencies above the control bandwidth. While human motion is generally limited to 10 Hz, virtual environments with unilateral constraints are subject to excitation of a wider frequency spectrum through contact transitions. We employ the effective impedance decomposition to discuss the effects of parasitics outside the rendering bandwidth. We also introduce an analysis of the admittance and impedance controllers with respect to sensitivity to load cell noise. We explore these effects using a single degree-of-freedom device that can be configured for either a low or high mechanical advantage in a perceptual experiment, with experimental conditions designed through application of the effective impedance decomposition. We find that the excitation of high frequencies through contact transitions negatively impacts humans' ability to distinguish between stiffnesses.
- Published
- 2019
23. Capacitive Sensing for Improving Contact Rendering with Tangible Objects in VR
- Author
-
Claudio Pacchierotti, Maud Marchal, Anatole Lécuyer, Xavier de Tinguy, 3D interaction with virtual environments using body and mind (Hybrid), Inria Rennes – Bretagne Atlantique, Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria)-MEDIA ET INTERACTIONS (IRISA-D6), Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Université de Bretagne Sud (UBS)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National de Recherche en Informatique et en Automatique (Inria)-École normale supérieure - Rennes (ENS Rennes)-Centre National de la Recherche Scientifique (CNRS)-Université de Rennes 1 (UR1), Université de Rennes (UNIV-RENNES)-CentraleSupélec-IMT Atlantique Bretagne-Pays de la Loire (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Université de Bretagne Sud (UBS)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-École normale supérieure - Rennes (ENS Rennes)-Centre National de la Recherche Scientifique (CNRS)-Université de Rennes 1 (UR1), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT), Sensor-based and interactive robotics (RAINBOW), Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria)-SIGNAUX ET IMAGES NUMÉRIQUES, ROBOTIQUE (IRISA-D5), Université de Rennes (UR)-Institut National des Sciences 
Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-Institut National de Recherche en Informatique et en Automatique (Inria)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Université de Rennes (UR)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), and Institut National des Sciences Appliquées (INSA)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique (IMT Atlantique)
- Subjects
Computer science ,business.industry ,Sensors ,Capacitive sensing ,Haptic rendering ,020207 software engineering ,02 engineering and technology ,Computer Graphics and Computer-Aided Design ,Human-centered computing ,Rendering (computer graphics) ,Signal Processing ,0202 electrical engineering, electronic engineering, information engineering ,Immersion (virtual reality) ,Computer vision ,Computer Vision and Pattern Recognition ,Artificial intelligence ,Human computer interaction ,[INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC] ,business ,Software - Abstract
International audience; We combine tracking information from a tangible object instrumented with capacitive sensors and an optical tracking system, to improve contact rendering when interacting with tangibles in VR. A human-subject study shows that combining capacitive sensing with optical tracking significantly improves the visuohaptic synchronization and immersion of the VR experience.
- Published
- 2021
24. Promoting Motor Variability During Robotic Assistance Enhances Motor Learning of Dynamic Tasks
- Author
-
Karin A. Buetler, Laura Marchal-Crespo, and Ozhan Ozen
- Subjects
model predictive controllers ,Computer science ,media_common.quotation_subject ,Control (management) ,610 Medicine & health ,haptic rendering ,effort ,Task (project management) ,lcsh:RC321-571 ,robotic assistance ,Human–computer interaction ,Perception ,lcsh:Neurosciences. Biological psychiatry. Neuropsychiatry ,Original Research ,media_common ,Haptic technology ,neurorehabilitation ,Sense of agency ,variability ,General Neuroscience ,Pendulum ,620 Engineering ,Motor learning ,motor learning ,Delta robot ,Neuroscience - Abstract
Despite recent advances in robot-assisted training, the benefits of haptic guidance on motor (re)learning are still limited. While haptic guidance may increase task performance during training, it may also decrease participants' effort and interfere with the perception of the environment dynamics, hindering somatosensory information crucial for motor learning. Importantly, haptic guidance limits motor variability, a factor considered essential for learning. We propose that Model Predictive Controllers (MPC) might be good alternatives to haptic guidance since they minimize the assisting forces and promote motor variability during training. We conducted a study with 40 healthy participants to investigate the effectiveness of MPCs on learning a dynamic task. The task consisted of swinging a virtual pendulum to hit incoming targets with the pendulum ball. The environment was haptically rendered using a Delta robot. We designed two MPCs: the first MPC—end-effector MPC—applied the optimal assisting forces on the end-effector. A second MPC—ball MPC—applied its forces on the virtual pendulum ball to further reduce the assisting forces. The participants' performance during training and learning at short- and long-term retention tests were compared to a control group who trained without assistance, and a group that trained with conventional haptic guidance. We hypothesized that the end-effector MPC would promote motor variability and minimize the assisting forces during training, and thus, promote learning. Moreover, we hypothesized that the ball MPC would enhance the performance and motivation during training but limit the motor variability and sense of agency (i.e., the feeling of having control over their movements), and therefore, limit learning. We found that the MPCs reduce the assisting forces compared to haptic guidance. 
Training with the end-effector MPC increases movement variability and does not hinder pendulum swing variability during training, ultimately enhancing learning of the task dynamics compared to the other groups. Finally, we observed that increases in the sense of agency seemed to be associated with learning when training with the end-effector MPC. In conclusion, training with MPCs enhances motor learning of tasks with complex dynamics and is a promising strategy to improve robotic training outcomes in neurological patients.
- Published
- 2021
25. Press'Em: Simulating Varying Button Tactility via FDVV Models
- Author
-
Sunjun Kim, Yi-Chi Liao, Antti Oulasvirta, and Byungjoo Lee
- Subjects
FOS: Computer and information sciences ,InformationSystems_MODELSANDPRINCIPLES ,InformationSystems_INFORMATIONINTERFACESANDPRESENTATION(e.g.,HCI) ,Computer science ,Computer Science - Human-Computer Interaction ,Input device ,Haptic rendering ,Pipeline (software) ,GeneralLiterature_MISCELLANEOUS ,Simulation ,Human-Computer Interaction (cs.HC) ,Haptic technology - Abstract
Push-buttons provide rich haptic feedback during a press via mechanical structures. While different buttons have varying haptic qualities, few works have attempted to dynamically render such tactility, which prevents designers from freely exploring buttons' haptic design. We extend the typical force-displacement (FD) model with vibration (V) and velocity-dependence (V) characteristics to form a novel FDVV model. We then introduce Press'Em, a 3D-printed prototype capable of simulating button tactility based on FDVV models. To drive Press'Em, an end-to-end simulation pipeline is presented that covers (1) capturing any physical button, (2) controlling the actuation signals, and (3) simulating the tactility. Our system goes beyond replicating existing buttons, enabling designers to emulate and test non-existent ones with desired haptic properties. Press'Em aims to be a tool for future research to better understand and iterate over button designs., 4 pages, CHI'20 EA. arXiv admin note: text overlap with arXiv:2001.04352
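A minimal sketch of how an FDVV-style model might be evaluated at render time, assuming the captured button is stored as a sampled force-displacement curve plus a linear velocity-dependent term and a sinusoidal vibration transient; the function `fdvv_force` and its parameters are illustrative, not Press'Em's actual pipeline.

```python
import math

def fdvv_force(x, v, t, fd_curve, k_v=0.0, vib_amp=0.0, vib_freq=0.0):
    """Rendered force at displacement x, velocity v, time t:
    static FD term + velocity-dependent term + vibration transient."""
    pts = sorted(fd_curve)                     # [(displacement, force), ...]
    if x <= pts[0][0]:
        f = pts[0][1]
    elif x >= pts[-1][0]:
        f = pts[-1][1]
    else:
        for (x0, f0), (x1, f1) in zip(pts, pts[1:]):
            if x0 <= x <= x1:                  # piecewise-linear interpolation
                f = f0 + (f1 - f0) * (x - x0) / (x1 - x0)
                break
    f += k_v * v                                          # velocity dependence (V)
    f += vib_amp * math.sin(2 * math.pi * vib_freq * t)   # vibration (V)
    return f
```

The static term interpolates the measured FD curve, while the two added terms supply the velocity-dependent and transient components that distinguish FDVV from a plain FD model.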
- Published
- 2020
26. Explorations of Wrist Haptic Feedback for AR/VR Interactions with Tasbi
- Author
-
Hrvoje Benko, Shea Robinson, Priyanshu Agarwal, Marcia K. O'Malley, Evan Pezent, Ali Israr, Nicholas Colonnese, and Majed Samad
- Subjects
InformationSystems_INFORMATIONINTERFACESANDPRESENTATION(e.g.,HCI) ,Computer science ,05 social sciences ,Wearable computer ,Optical head-mounted display ,020207 software engineering ,Sensory system ,02 engineering and technology ,Wrist ,Virtual reality ,Haptic rendering ,Rigid body ,Rendering (computer graphics) ,medicine.anatomical_structure ,Human–computer interaction ,0202 electrical engineering, electronic engineering, information engineering ,medicine ,0501 psychology and cognitive sciences ,Actuator ,050107 human factors ,ComputingMethodologies_COMPUTERGRAPHICS ,Haptic technology - Abstract
Most widespread haptic feedback devices for augmented and virtual reality (AR/VR) fall into one of two categories: simple hand-held controllers with a single vibration actuator, or complex glove systems with several embedded actuators. In this work, we explore haptic feedback on the wrist for interacting with virtual objects. We use Tasbi, a compact bracelet device capable of rendering complex multisensory squeeze and vibrotactile feedback. Leveraging Tasbi's haptic rendering, and using standard visual and audio rendering of a head mounted display, we present several interactions that tightly integrate sensory substitutive haptics with visual and audio cues. Interactions include push/pull buttons, rotary knobs, textures, rigid body weight and inertia, and several custom bimanual manipulations such as shooting an arrow from a bow. These demonstrations suggest that wrist-based haptic feedback substantially improves virtual hand-based interactions in AR/VR compared to no haptic feedback.
- Published
- 2020
27. PoCoPo: Handheld Pin-based Shape Display for Haptic Rendering in Virtual Reality
- Author
-
Shigeo Yoshida, Hideaki Kuzuoka, and Yuqian Sun
- Subjects
Computer science ,media_common.quotation_subject ,05 social sciences ,020207 software engineering ,02 engineering and technology ,Virtual reality ,Haptic rendering ,Object (computer science) ,Human–computer interaction ,Virtual image ,Perception ,Sensation ,0202 electrical engineering, electronic engineering, information engineering ,0501 psychology and cognitive sciences ,Mobile device ,050107 human factors ,Haptic technology ,media_common - Abstract
We introduce PoCoPo, the first handheld pin-based shape display that can render various 2.5D shapes in hand in realtime. We designed the display small enough for a user to hold it in hand and carry it around, thereby enhancing the haptic experiences in a virtual environment. PoCoPo has 18 motor-driven pins on both sides of a cuboid, providing the sensation of skin contact on the user's palm and fingers. We conducted two user studies to understand the capability of PoCoPo. The first study showed that the participants were generally successful in distinguishing the shapes rendered by PoCoPo with an average success rate of 88.5%. In the second study, we investigated the acceptable visual size of a virtual object when PoCoPo rendered a physical object of a certain size. The result led to a better understanding of the acceptable differences between the perceptions of visual size and haptic size.
- Published
- 2020
28. Rendering of Constraints With Underactuated Haptic Devices
- Author
-
Daniel Lobo and Miguel A. Otaduy
- Subjects
Informática ,0209 industrial biotechnology ,Computer science ,Underactuation ,Lift (data mining) ,Haptic rendering ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,02 engineering and technology ,Rendering algorithms ,Models, Theoretical ,Computer Science Applications ,Rendering (computer graphics) ,Human-Computer Interaction ,User-Computer Interface ,020303 mechanical engineering & transports ,020901 industrial engineering & automation ,0203 mechanical engineering ,underactuated haptics ,1203.04 Inteligencia Artificial ,Humans ,Simulation ,Algorithms ,Haptic technology ,ComputingMethodologies_COMPUTERGRAPHICS - Abstract
Several previous works have studied the application of proxy-based rendering algorithms to underactuated haptic devices. However, all these works make oversimplifying assumptions about the configuration of the haptic device, and they ignore the user’s intent. In this work, we lift those assumptions, and we carry out a theoretical study that unveils the existence of unnatural ghost forces under typical proxy-based rendering. We characterize and quantify those ghost forces. In addition, we design a novel rendering strategy with anisotropic coupling between the device and the proxy. With this strategy, the forces rendered by an underactuated device best match those rendered by a fully actuated device. We have demonstrated our findings on synthetic experiments and a simple real-world experiment.
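The anisotropic device-proxy coupling can be pictured with a minimal proxy-based rendering sketch for a single plane constraint: the proxy is kept on the constraint surface, and the coupling spring uses a per-axis gain vector, with zero gain assigned to unactuated axes. All names, the plane constraint, and the gain choice are illustrative assumptions, not the paper's algorithm.

```python
def update_proxy(device_pos, floor_z=0.0):
    """God-object style proxy: clamp the proxy onto/above a horizontal
    virtual wall at z = floor_z while following the device laterally."""
    x, y, z = device_pos
    return (x, y, max(z, floor_z))

def proxy_force(device_pos, proxy_pos, k):
    """Anisotropic coupling: per-axis spring between device and proxy.
    For an underactuated device, gains on unactuated axes can be zero,
    so no force is commanded along directions the device cannot actuate."""
    return [ki * (pi - di) for ki, pi, di in zip(k, proxy_pos, device_pos)]
```

With an isotropic gain vector this reduces to the classic fully actuated proxy coupling; zeroing the gains of unactuated axes is one simple way to avoid commanding forces the hardware cannot produce.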
- Published
- 2020
29. Haptic Rendering of Diverse Tool-Tissue Contact Constraints During Dental Implantation Procedures
- Author
-
Xiaohan Zhao, Zhuoli Zhu, Yu Cong, Yongtao Zhao, Yuru Zhang, and Dangxiao Wang
- Subjects
Computer science ,lcsh:Mechanical engineering and machinery ,Physics::Medical Physics ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,haptic rendering ,Virtual reality ,Haptic rendering ,state switching ,lcsh:QA75.5-76.95 ,Rendering (computer graphics) ,Computer Science::Robotics ,surgery simulation ,dental implantation procedures ,Artificial Intelligence ,Surgical skills ,lcsh:TJ1-1570 ,Movement control ,Motor skill ,Simulation ,ComputingMethodologies_COMPUTERGRAPHICS ,Original Research ,contact constraints ,Robotics and AI ,Computer Science Applications ,State switching ,lcsh:Electronic computers. Computer science ,Bone surface - Abstract
Motor skill learning for dental implantation surgery is difficult for novices because it involves fine manipulation of different dental tools to follow a strictly pre-defined procedure. Haptics-enabled virtual reality training systems provide a promising tool for surgical skill learning. In this paper, we introduce a haptic rendering algorithm for simulating the diverse tool-tissue contact constraints that arise during dental implantation. The motion of an implant tool can be classified into high degree-of-freedom (H-DoF) motion and low degree-of-freedom (L-DoF) motion. In the H-DoF state, the tool moves freely on the bone surface and in free space with 6 DoF; in the L-DoF state, the motion degrees are restrained by the constraints imposed by the implant bed. We propose a state-switching framework that simplifies the simulation workload by rendering the H-DoF and L-DoF motion states separately and switching seamlessly between them, using an implant criterion as the switching judgment. We also propose the virtual constraint method to render the L-DoF motion, which differs from ordinary drilling procedures because the tools must obey different axial constraint forms, including sliding, drilling, screwing, and perforating. The virtual constraint method adapts efficiently and accurately to these different constraint forms and consists of three core steps: defining the movement axis, projecting the configuration difference, and deriving the movement control ratio. The H-DoF motion on the bone surface and in free space is simulated with the previously proposed virtual coupling method. Experimental results showed that the proposed method could simulate the 16 phases of the complete implant procedure of the Straumann® Bone Level (BL) Implant Φ4.8–L12 mm.
According to the output force curve, the different contact constraints could be rendered with a steady and continuous output force throughout the operation procedures.
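The three core steps of the virtual constraint method (defining the movement axis, projecting the configuration difference, and applying the movement control ratio) admit a compact sketch; `constrain_to_axis` and its signature are hypothetical, illustrating only the projection-and-scaling idea.

```python
def constrain_to_axis(delta, axis, ratio=1.0):
    """Project the tool's configuration difference `delta` onto the permitted
    movement axis, then scale it by the movement control ratio so that only
    axial motion (e.g. drilling depth) is passed through."""
    norm2 = sum(a * a for a in axis)           # assumes a non-zero axis
    s = sum(d * a for d, a in zip(delta, axis)) / norm2
    return [ratio * s * a for a in axis]
```

Different axial constraint forms (sliding, drilling, screwing, perforating) would then differ in which axis is chosen and how the control ratio is derived.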
- Published
- 2020
30. Compensating for Fingertip Size to Render Tactile Cues More Accurately
- Author
-
Claudio Pacchierotti, Katherine J. Kuchenbecker, David Gueorguiev, Eric M. Young, University of Pennsylvania [Philadelphia], Max Planck Institute for Intelligent Systems, Max-Planck-Gesellschaft, Sensor-based and interactive robotics (RAINBOW), Inria Rennes – Bretagne Atlantique, Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria)-SIGNAUX ET IMAGES NUMÉRIQUES, ROBOTIQUE (IRISA-D5), Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Université de Rennes 1 (UR1), Université de Rennes (UNIV-RENNES)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-Institut National de Recherche en Informatique et en Automatique (Inria)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique Bretagne-Pays de la Loire (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Université de Rennes 1 (UR1), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique Bretagne-Pays de la Loire (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT), DGE-1321851, NSF Graduate Research Fellowship Program, University of Pennsylvania, Max Planck Institute for Intelligent Systems [Tübingen], Université de Rennes (UR)-Institut National des Sciences Appliquées 
- Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-Institut National de Recherche en Informatique et en Automatique (Inria)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Université de Rennes (UR)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), and Institut National des Sciences Appliquées (INSA)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique (IMT Atlantique)
- Subjects
Male ,Computer science ,0206 medical engineering ,02 engineering and technology ,Haptic rendering ,Rendering (computer graphics) ,Fingers ,Software ,Feedback, Sensory ,Software Design ,Physical Stimulation ,0202 electrical engineering, electronic engineering, information engineering ,[INFO.INFO-RB]Computer Science [cs]/Robotics [cs.RO] ,Humans ,Computer vision ,[INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC] ,Sensory cue ,ComputingMilieux_MISCELLANEOUS ,Wearable haptics ,Haptic technology ,business.industry ,020207 software engineering ,Equipment Design ,020601 biomedical engineering ,Computer Science Applications ,Human-Computer Interaction ,Touch Perception ,Touch ,Teleoperation ,Female ,Artificial intelligence ,Cues ,business ,Algorithms - Abstract
International audience; Fingertip haptic feedback offers advantages in many applications, including robotic teleoperation, gaming, and training. However, fingertip size and shape vary significantly across humans, making it difficult to design fingertip interfaces and rendering techniques suitable for everyone. This paper starts with an existing data-driven haptic rendering algorithm that ignores fingertip size, and it then develops two software-based approaches to personalize this algorithm for fingertips of different sizes using either additional data or geometry. We evaluate our algorithms in the rendering of pre-recorded tactile sensations onto rubber casts of six different fingertips as well as onto the real fingertips of 13 human participants. Results on the casts show that both approaches significantly improve performance, reducing force error magnitudes by an average of 78% with respect to the standard non-personalized rendering technique. Congruent results were obtained for real fingertips, with subjects rating each of the two personalized rendering techniques significantly better than the standard non-personalized method.
- Published
- 2020
31. HapBead: On-Skin Microfluidic Haptic Interface using Tunable Bead
- Author
-
Xiaochen Shi, Baogang Quan, Feng Tian, Teng Han, Hongan Wang, Sriram Subramanian, Shubhi Bansal, and Yanjun Chen
- Subjects
Channel (digital image) ,InformationSystems_INFORMATIONINTERFACESANDPRESENTATION(e.g.,HCI) ,business.industry ,Computer science ,media_common.quotation_subject ,05 social sciences ,Microfluidics ,Illusion ,Wearable computer ,020207 software engineering ,02 engineering and technology ,Haptic rendering ,Rendering (computer graphics) ,0202 electrical engineering, electronic engineering, information engineering ,0501 psychology and cognitive sciences ,business ,050107 human factors ,Wearable technology ,Simulation ,ComputingMethodologies_COMPUTERGRAPHICS ,media_common ,Haptic technology - Abstract
On-skin haptic interfaces using thin, flexible soft elastomers have improved significantly in recent years. Many focus on vibrotactile feedback, which requires complicated parameter tuning. Another approach relies on mechanical forces created via piezoelectric devices and other mechanisms for non-vibratory haptic sensations such as stretching and twisting; these are often bulky, and their electronic components and associated drivers are complicated, with limited control of timing and precision. This paper proposes HapBead, a new on-skin haptic interface capable of rendering vibration-like tactile feedback using microfluidics. HapBead leverages a microfluidic channel to precisely and agilely oscillate a small bead via liquid flow, generating various motion patterns in the channel that create highly tunable haptic sensations on the skin. We developed a proof-of-concept design to implement the thin, flexible, and affordable HapBead platform, and verified its haptic rendering capabilities by attaching it to users' fingertips. A study confirmed that participants could accurately distinguish six different haptic patterns rendered by HapBead. HapBead enables new wearable display applications with multiple integrated functionalities, such as on-skin haptic doodles, visuo-haptic displays, and haptic illusions.
- Published
- 2020
32. Physical Human-Robot Interaction with Real Active Surfaces using Haptic Rendering on Point Clouds
- Author
-
Marco Hutter, Yves Zimmermann, Michael Sommerhalder, Robert Riener, and Burak Cizmeci
- Subjects
Human Robot Interaction ,Haptic rendering ,Point cloud ,Dynamic constraints ,Haptic interaction ,0209 industrial biotechnology ,Computer science ,Controller (computing) ,02 engineering and technology ,Workspace ,Human–robot interaction ,020901 industrial engineering & automation ,0202 electrical engineering, electronic engineering, information engineering ,Computer vision ,Collision avoidance ,ComputingMethodologies_COMPUTERGRAPHICS ,business.industry ,Usability ,Collision ,Exoskeleton ,Robot ,020201 artificial intelligence & image processing ,Artificial intelligence ,business - Abstract
During robot-assisted therapy of hemiplegic patients, interaction with the patient must be intrinsically safe. Straightforward collision-avoidance solutions can meet this safety requirement with conservative margins, but these margins heavily reduce the robot's workspace and make interaction with the patient's unguided body parts impossible. However, interaction with one's own body is highly beneficial from a therapeutic point of view. We tackle this problem by combining haptic rendering techniques with classical computer vision methods. Our proposed solution consists of a pipeline that builds collision objects from point clouds in real time and a controller that renders haptic interaction. The raw sensor data are processed to overcome noise and occlusion problems. The approach is validated on the 6-DoF exoskeleton ANYexo for direct impacts, sliding scenarios, and dynamic collision surfaces. The results show that this method has the potential to successfully prevent collisions and allow haptic interaction in highly dynamic environments. We believe that this work significantly adds to the usability of current exoskeletons by enabling virtual haptic interaction with the patient's body parts in human-robot therapy., 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), ISBN:978-1-7281-6212-6, ISBN:978-1-7281-6213-3
- Published
- 2020
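The entry above renders haptic contact directly against point clouds. A minimal sketch of the general idea (not the ANYexo pipeline): find the nearest cloud point within an interaction radius and return a penalty force proportional to penetration; radius and stiffness values are illustrative placeholders.

```python
import math

def penalty_force(device_pos, cloud, radius=0.05, stiffness=400.0):
    """Penalty force from the nearest cloud point within an interaction radius.
    A toy sketch with invented parameters, not the paper's controller."""
    best, best_d = None, float("inf")
    for p in cloud:                       # brute-force nearest neighbor
        d = math.dist(device_pos, p)
        if d < best_d:
            best, best_d = p, d
    if best is None or best_d >= radius or best_d == 0.0:
        return (0.0, 0.0, 0.0)            # no contact
    pen = radius - best_d                 # penetration depth into the shell
    n = tuple((a - b) / best_d for a, b in zip(device_pos, best))
    return tuple(stiffness * pen * c for c in n)
```

A real system would replace the linear scan with a spatial acceleration structure (e.g. a k-d tree) to reach haptic rates.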
33. Haptic Rendering of Soft-Tissue for Training Surgical Procedures at the Larynx
- Author
-
Marc Stamminger, Rolf Janka, Thomas Wittenberg, Thomas Eixelberger, Jonas Parchent, and Michael Döllinger
- Subjects
Larynx ,medicine.medical_specialty ,Computer science ,medicine.medical_treatment ,Training (meteorology) ,Soft tissue ,Surgical procedures ,Haptic rendering ,Flight simulator ,Surgical training ,medicine.anatomical_structure ,Tracheotomy ,medicine ,Medical physics - Abstract
Assistant physicians typically learn surgical techniques by observation and by supervised practice on patients or biophantoms. Alternatively, surgical simulators offering new training possibilities can be used. A number of simulators are already commercially available and might in the future become as important for surgical training as flight simulators are for pilots. Since no simulator so far addresses the training of tracheotomies, we developed a soft-tissue model for simulating tracheotomy, which is integrated into our ENT surgical simulator. To model the soft tissues of the neck (skin and fat), a computed tomography (CT) scan was interactively segmented. The interaction of a scalpel with the soft tissue is simulated using position-based dynamics (PBD), originally developed for the gaming industry. Initial results indicate that the proposed approach can model soft tissues for virtual surgical training.
- Published
- 2020
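The entry above builds on position-based dynamics. A minimal 1D sketch of the PBD scheme it cites (predict positions, iteratively project distance constraints, recover velocities); the chain setup and step size are illustrative, not the paper's tissue model.

```python
def pbd_step(x, v, constraints, dt=0.01, iters=8):
    """One position-based dynamics step for a chain of unit-mass particles on a line.
    `constraints` is a list of (i, j, rest_length) tuples."""
    # 1) explicit prediction (external forces omitted for brevity)
    p = [xi + vi * dt for xi, vi in zip(x, v)]
    # 2) Gauss-Seidel projection of the distance constraints
    for _ in range(iters):
        for i, j, rest in constraints:
            d = p[j] - p[i]
            dist = abs(d)
            if dist == 0.0:
                continue
            corr = 0.5 * (dist - rest) * (d / dist)  # split correction equally
            p[i] += corr
            p[j] -= corr
    # 3) velocities follow from the corrected positions
    v = [(pi - xi) / dt for pi, xi in zip(p, x)]
    return p, v
```

The same predict/project/update loop generalizes to 3D vectors and to other constraint types (volume, bending), which is what makes PBD attractive for interactive cutting.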
34. Towards Haptic Images: a Survey on Touchscreen-Based Surface Haptics
- Author
-
Antoine Costes, Anatole Lécuyer, Philippe Guillotel, Fabien Danieau, Ferran Argelaguet, 3D interaction with virtual environments using body and mind (Hybrid), Inria Rennes – Bretagne Atlantique, Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria)-MEDIA ET INTERACTIONS (IRISA-D6), Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Université de Rennes (UR)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-Institut National de Recherche en Informatique et en Automatique (Inria)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Université de Rennes (UR)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Institut National des Sciences Appliquées (INSA)-Institut National des Sciences Appliquées (INSA)-Université de Bretagne Sud (UBS)-École normale supérieure - Rennes (ENS Rennes)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-IMT Atlantique (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT), InterDigital R&D France, Université de Bretagne Sud (UBS)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National de Recherche en Informatique et en Automatique (Inria)-École normale supérieure - Rennes (ENS Rennes)-Centre National de la Recherche 
Scientifique (CNRS)-Université de Rennes 1 (UR1), Université de Rennes (UNIV-RENNES)-CentraleSupélec-IMT Atlantique Bretagne-Pays de la Loire (IMT Atlantique), Institut Mines-Télécom [Paris] (IMT)-Institut Mines-Télécom [Paris] (IMT)-Université de Bretagne Sud (UBS)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), and Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-Institut National des Sciences Appliquées (INSA)-Université de Rennes (UNIV-RENNES)-École normale supérieure - Rennes (ENS Rennes)-Centre National de la Recherche Scientifique (CNRS)-Université de Rennes 1 (UR1)
- Subjects
Computer science ,media_common.quotation_subject ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,02 engineering and technology ,Haptic rendering ,Vibration ,law.invention ,Rendering (computer graphics) ,User-Computer Interface ,Touchscreen ,haptics ,law ,Human–computer interaction ,Feedback, Sensory ,Perception ,0202 electrical engineering, electronic engineering, information engineering ,Computer Graphics ,Humans ,surface ,0501 psychology and cognitive sciences ,survey ,image ,[INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC] ,050107 human factors ,media_common ,Haptic technology ,ComputingMethodologies_COMPUTERGRAPHICS ,business.industry ,Data Visualization ,05 social sciences ,020207 software engineering ,Usability ,Computer Science Applications ,Visualization ,Human-Computer Interaction ,Touch ,touchscreen ,Haptic perception ,business ,texture - Abstract
International audience; The development of tactile screens opens new perspectives for co-located image and haptic rendering, leading to the concept of "haptic images". These emerge from the combination of image data, rendering hardware, and haptic perception, and enable one to perceive haptic feedback while manually exploring an image. This nevertheless raises two scientific challenges, which serve as the thematic axes of the state of the art in this survey. First, the choice of appropriate haptic data raises a number of issues concerning human perception, measurement, modeling, and distribution. Second, the choice of appropriate rendering technology implies a difficult trade-off between expressiveness and usability.
- Published
- 2020
35. Combining Wristband Display and Wearable Haptics for Augmented Reality
- Author
-
Tommaso Lisini Baldi, Davide Barcelli, Domenico Prattichizzo, and Gianluca Paolocci
- Subjects
business.industry ,Computer science ,media_common.quotation_subject ,Human-centered computing ,Mixed / augmented reality ,Usability ,Haptic rendering ,Virtual interaction ,Human–computer interaction ,Perception ,Augmented reality ,business ,Wearable haptics ,media_common - Abstract
Taking advantage of widely distributed hardware such as smartphones and tablets, the Mobile Augmented Reality (MAR) market is growing rapidly. Major improvements can be envisioned in increasing the realism of virtual interaction and providing multimodal experiences. We propose a novel system prototype that mounts the display on the forearm with a rigid support, avoiding the constraints of hand-holding, and that is equipped with hand tracking and cutaneous feedback. Hand tracking enables the manipulation of virtual objects, while haptic rendering enhances the user's perception of the virtual entities. The experimental setup was tested by ten participants, who reported their impressions of the usability and functionality of the wrist-mounted system compared with the traditional hand-held condition. The participants' evaluations suggest that the AR experience provided by the wrist-based approach is more engaging and immersive.
- Published
- 2020
36. Collaborative Haptic Exploration of Dynamic Remote Environments
- Author
-
Jinah Park and Seokyeol Kim
- Subjects
Modality (human–computer interaction) ,Haptic interaction ,InformationSystems_INFORMATIONINTERFACESANDPRESENTATION(e.g.,HCI) ,Computer science ,020207 software engineering ,02 engineering and technology ,Haptic rendering ,Computer Graphics and Computer-Aided Design ,Task (project management) ,Computer graphics ,Human–computer interaction ,0202 electrical engineering, electronic engineering, information engineering ,Task analysis ,020201 artificial intelligence & image processing ,Augmented reality ,Haptic perception ,Software ,ComputingMethodologies_COMPUTERGRAPHICS ,Haptic technology - Abstract
Haptic perception is an important modality for reinforcing the presence of virtual or remote targets, but providing haptic feedback of dynamic environments remains a challenging task. We address the issue of haptic telepresence systems by improving and integrating two independent approaches: real-time haptic rendering of unstructured spatial data, and collaborative interaction with a remote partner. Contact with physical constraints is estimated directly from streaming point-cloud data without surface reconstruction, and haptic guidance cues that restrict or promote the user's motion can be provided both by predefined triggers and by gesture-based input from a helper. Through a user study with a proof-of-concept prototype, we show that the proposed approach significantly improves the performance of remote exploration tasks while enabling stable haptic interaction with real-world spatial data.
- Published
- 2018
37. Remote Environment Exploration with Drone Agent and Haptic Force Feedback
- Author
-
Sheng Jia, Tinglin Duan, Konstantinos N. Plataniotis, Kosuke Sato, Parinya Punpongsanon, and Daisuke Iwai
- Subjects
0209 industrial biotechnology ,Computer science ,business.industry ,3D reconstruction ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Point cloud ,Usability ,02 engineering and technology ,Haptic rendering ,Drone ,Rendering (computer graphics) ,020901 industrial engineering & automation ,User experience design ,Human–computer interaction ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,business ,ComputingMethodologies_COMPUTERGRAPHICS ,Haptic technology - Abstract
Camera drones allow exploring remote scenes that are inaccessible or inappropriate to visit in person. However, these exploration experiences are often limited by the insufficient scene information provided by front-facing cameras, which supply only 2D images or videos. Combining a camera drone's vision with haptic feedback would augment the user's spatial understanding of the remote environment, but such designs are usually difficult to learn and apply due to the complexity of the system and unfluent UAV control. In this paper, we present a new telepresence system for remote environment exploration, with a drone agent controlled through a VR mid-air panel. The drone generates real-time location and landmark details using integrated Simultaneous Localization and Mapping (SLAM). Point clouds are generated by SLAM from RGB input, and the results are passed to a Generative Adversarial Network (GAN) that reconstructs the 3D remote scene in real time. The reconstructed objects are then used by haptic devices to improve the user experience through haptic rendering. By providing both visual and haptic feedback, our system allows users to examine and exploit remote areas without being physically present. An experiment was conducted to verify the usability of the 3D reconstruction results for haptic feedback rendering.
- Published
- 2019
38. Haptic rendering for the coupling between fluid and deformable object
- Author
-
Shiguang Liu, Gang Feng, and Chao Ma
- Subjects
Coupling ,Buoyancy ,Computer science ,business.industry ,Virtual reality ,engineering.material ,Haptic rendering ,Object (computer science) ,Computer Graphics and Computer-Aided Design ,Physics::Fluid Dynamics ,Human-Computer Interaction ,Computer graphics ,Position (vector) ,engineering ,Computer vision ,Artificial intelligence ,business ,Software ,ComputingMethodologies_COMPUTERGRAPHICS ,Haptic technology - Abstract
Simulating the haptic interaction between a fluid and deformable objects remains challenging due to the inhomogeneity of the deformable object. In this paper, we present a novel position-based haptic interaction method to tackle this problem. On the one hand, according to the inhomogeneity of the deformable object, we calculate the properties of its different regions and then evaluate its haptic force, making the results more realistic. On the other hand, to preserve more details of the haptic feedback forces, we incorporate the calculation of buoyancy, pressure, viscous force, and elastic force into our framework and design a novel integration scheme for assembling these forces. Moreover, by accounting for the influence of fluid viscosity, our method obtains different haptic force feedback for fluids of different viscosities. Various experiments validated the new method.
- Published
- 2018
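The entry above assembles the feedback force from several per-effect terms (buoyancy, viscous force, elastic force). A scalar sketch of that summation for a partially submerged, deformed probe; all coefficients are illustrative guesses, not the paper's integration scheme.

```python
def haptic_force(depth, volume, velocity, stiffness, penetration,
                 rho=1000.0, g=9.81, drag=6.0):
    """Sum of per-effect force terms along one axis.
    depth: submersion depth [m]; volume: displaced volume [m^3];
    velocity: probe speed [m/s]; penetration: into the solid [m]."""
    buoyancy = rho * g * volume if depth > 0 else 0.0  # Archimedes term
    viscous = -drag * velocity                         # Stokes-like drag
    elastic = stiffness * penetration                  # spring response of the solid
    return buoyancy + viscous + elastic
```

Scaling `drag` with the fluid's viscosity reproduces, in miniature, the paper's observation that fluids of different viscosities yield different feedback forces.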
39. Stable haptic rendering in interactive virtual control laboratory
- Author
-
Ali Nahvi, Behnoosh Bahadorian, Ali Chaibakhsh, and Saeed Amirkhani
- Subjects
0209 industrial biotechnology ,Computer science ,Mechanical Engineering ,Control (management) ,Computational Mechanics ,Stability (learning theory) ,02 engineering and technology ,Haptic rendering ,Fuzzy logic ,03 medical and health sciences ,020901 industrial engineering & automation ,0302 clinical medicine ,Impedance control ,Artificial Intelligence ,Control theory ,030220 oncology & carcinogenesis ,Virtual Laboratory ,Engineering (miscellaneous) ,Simulation ,Haptic technology - Abstract
Stable control of haptic interfaces is one of the most important challenges in haptic simulation, because any instability of a haptic interface destroys the sense of realism. In this paper, the control strategies employed for stable haptic rendering in an interactive virtual control laboratory are presented. This virtual laboratory offers different scenarios for teaching control concepts, in which a haptic interface is used in two cases: force control and position control. Two control strategies are employed to avoid instability: an energy-compensating controller removes energy leakage, and a fuzzy impedance controller is used alongside the energy-compensating controller in the position-control scenarios. The results indicate that the proposed approaches practically guarantee the stability of the haptic interface in an educational application.
- Published
- 2018
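The energy-compensating controller above monitors and removes generated energy. A textbook sketch of that idea in the style of a time-domain passivity observer/controller (Hannaford-Ryu), which may differ from the paper's exact formulation: accumulate the energy output at the haptic port and inject just enough damping to cancel any deficit.

```python
def po_pc_step(f, v, dt, state):
    """One sample of a passivity observer/controller.
    f: commanded force, v: device velocity, state['E']: accumulated energy."""
    state["E"] += f * v * dt                    # passivity observer
    if state["E"] < 0.0 and v != 0.0:           # port generated energy: not passive
        alpha = -state["E"] / (dt * v * v)      # damping that dissipates the deficit
        f_out = f + alpha * v                   # passivity controller correction
        state["E"] = 0.0                        # deficit is exactly cancelled
    else:
        f_out = f
    return f_out
```

The correction activates only on non-passive samples, so it leaves well-behaved rendering untouched.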
40. A staged haptic rendering approach for virtual assembly of bolted joints in mechanical assembly
- Author
-
Liu Jiawu, Qing-Hui Wang, Guang-Hua Hu, and Jing-Rong Li
- Subjects
0209 industrial biotechnology ,Computer science ,Mechanical Engineering ,02 engineering and technology ,computer.software_genre ,Haptic rendering ,Industrial and Manufacturing Engineering ,Computer Science Applications ,Rendering (computer graphics) ,Transition stage ,020303 mechanical engineering & transports ,020901 industrial engineering & automation ,0203 mechanical engineering ,Control and Systems Engineering ,Virtual machine ,Bolted joint ,Immersion (virtual reality) ,computer ,Software ,Simulation ,ComputingMethodologies_COMPUTERGRAPHICS - Abstract
Haptic rendering in a virtual environment provides a powerful training and validation tool for the assembly of bolted joints, which requires accurate assembly forces. This work proposes a staged haptic rendering approach for virtual assembly (VA) of bolted joints. First, by analyzing the stress conditions during the actual assembly process, four consecutive stages are identified: navigation, transition, linearity, and yield. Then, the force rendering model is set up. Moreover, a prototype VA system is developed to implement and test the approach. Two groups of experiments on a two-stage gear reducer were conducted to verify the feasibility of the approach and evaluate the prototype's performance. The results show that the force calculated by the proposed approach is consistent with the actual assembly, and the evaluators were highly positive about the immersion and guiding ability of the VA process with the haptic rendering provided.
- Published
- 2018
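The four stages named above suggest a piecewise torque model over the tightening angle. A sketch of such a staged model; the breakpoints, stiffness, and yield torque are invented placeholders, not the paper's calibrated values.

```python
def bolt_torque(theta, t0=2.0, t1=6.0, k=0.8, tau_yield=10.0):
    """Piecewise tightening torque vs. rotation angle theta [rad].
    Stages: navigation (free), transition (ramp), linearity (elastic), yield (cap)."""
    if theta < t0:                                   # navigation: free run-down
        return 0.0
    if theta < t1:                                   # transition: quadratic ramp,
        return k * (theta - t0) ** 2 / (2.0 * (t1 - t0))  # C1-continuous into linear stage
    tau = k * (t1 - t0) / 2.0 + k * (theta - t1)     # linearity: elastic clamping
    return min(tau, tau_yield)                       # yield: torque saturates
```

The quadratic transition keeps the slope continuous at `t1`, avoiding a force jump at the stage boundary, which matters for haptic stability.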
41. A small displacement torsor (SDT)-based haptic rendering model for assembly of bolted joints
- Author
-
Yu Yong-Peng and Xu De-Jian
- Subjects
History ,Computer science ,business.industry ,Bolted joint ,Torsor ,Displacement (orthopedic surgery) ,Structural engineering ,Haptic rendering ,business ,GeneralLiterature_MISCELLANEOUS ,ComputingMethodologies_COMPUTERGRAPHICS ,Computer Science Applications ,Education - Abstract
Haptic interaction technology is increasingly used in virtual assembly training, assembly path planning, teleoperated assembly, and other fields. The key issue in haptic interaction technology is a realistic haptic rendering model. This paper proposes a haptic rendering model for bolt assembly based on the small displacement torsor (SDT). For the first time, the geometrical errors of parts are considered in the haptic rendering model of bolt assembly, which improves its fidelity. A comparative experiment shows that the proposed haptic rendering model is closer to the actual assembly than the existing model.
- Published
- 2021
42. Haptic rendering and interactive simulation using passive midpoint integration
- Author
-
Yongjun Lee, Myungsin Kim, Dongjun Lee, and Yong-Seok Lee
- Subjects
0209 industrial biotechnology ,Property (programming) ,Computer science ,business.industry ,Applied Mathematics ,Mechanical Engineering ,Passivity ,020207 software engineering ,02 engineering and technology ,Haptic rendering ,Midpoint ,Linear complementarity problem ,020901 industrial engineering & automation ,Interactive simulation ,Artificial Intelligence ,Modeling and Simulation ,Computer graphics (images) ,0202 electrical engineering, electronic engineering, information engineering ,Computer vision ,Artificial intelligence ,Electrical and Electronic Engineering ,business ,Software - Abstract
We propose a novel framework for haptic rendering and interactive simulation that exploits midpoint time integration, which is known for its superior energy-conserving property but had not previously been adopted for haptics and interactive simulation. The framework effectively enforces the discrete-time passivity of the simulation in practice while retaining real-time interactivity, since it is non-iterative. We derive this passive midpoint integration (PMI) simulation for mechanical systems both in maximal coordinates (i.e. in SE(3)) and in generalized coordinates (i.e. in [Formula: see text]), with some potential actions as well to implement joint articulation, constraints, compliance, and so on. We also fully incorporate multi-point Coulomb frictional contact via the PMI-LCP (linear complementarity problem) formulation. The proposed PMI-based simulation framework is applied to illustrative examples that demonstrate its advantages: (1) haptic rendering of a peg-in-hole task, where very light/stiff articulated objects can be simulated with multi-point contact; (2) haptic interaction with a flexible beam, where marginally stable/lossless behavior (i.e. vibration) can be stably emulated; and (3) under-actuated tendon-driven hand grasping, where mixed maximal-generalized coordinates are used with very light/stiff fingers.
- Published
- 2017
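The energy-conserving property of midpoint integration, which the framework above builds on, can be seen on the smallest possible example: an undamped harmonic oscillator. For this linear system the implicit midpoint equations can be solved in closed form, and the discrete energy is conserved exactly; the parameters below are illustrative, not from the paper.

```python
def midpoint_step(q, p, dt, k=1.0, m=1.0):
    """Implicit midpoint step for a harmonic oscillator, solved in closed form.
    Implicit relations: q1 = q + dt/(2m)*(p + p1), p1 = p - dt*k/2*(q + q1)."""
    a = dt * dt * k / (4.0 * m)
    q1 = ((1.0 - a) * q + dt / m * p) / (1.0 + a)  # eliminate p1 from the system
    p1 = p - dt * k * 0.5 * (q + q1)
    return q1, p1
```

An explicit Euler step on the same system gains energy every sample (the classic source of haptic instability); the midpoint step keeps `E = p^2/2m + k q^2/2` constant to round-off.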
43. Image-based haptic display via a novel pen-shaped haptic device on touch screens
- Author
-
Dapeng Chen, Lei Tian, and Aiguo Song
- Subjects
InformationSystems_INFORMATIONINTERFACESANDPRESENTATION(e.g.,HCI) ,Computer Networks and Communications ,Computer science ,media_common.quotation_subject ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Image processing ,02 engineering and technology ,Haptic rendering ,Mode (computer interface) ,Computer graphics (images) ,Perception ,0202 electrical engineering, electronic engineering, information engineering ,Media Technology ,Computer vision ,ComputingMethodologies_COMPUTERGRAPHICS ,Haptic technology ,media_common ,Haptic interaction ,business.industry ,020207 software engineering ,Haptic display ,Photometric stereo ,Hardware and Architecture ,Stereotaxy ,020201 artificial intelligence & image processing ,Artificial intelligence ,Haptic perception ,business ,Software - Abstract
Haptic interaction is a new mode of interaction between humans and touch screens. In this work, we present an image-based haptic interaction system for touch screens. A novel haptic pen is designed for the haptic display of visual images: it generates force feedback through an electromechanical structure and tactile feedback through piezoelectric ceramics. The haptic model of an image is built through image processing and haptic rendering. In image processing, the image is decomposed into geometry and textures by local total variation, distinguishing contour shape from surface details, and contour lines are extracted from the geometry using an adaptive flow-based difference-of-Gaussians algorithm. The height information of the image is then recovered via a shape-from-shading algorithm and expressed by the electromechanical structure, while textures and contour lines are displayed by the piezoelectric ceramics. Finally, a haptic perception experiment is conducted to investigate the effect of different haptic pens and modes of haptic interaction on perception.
- Published
- 2017
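The contour extraction above relies on a difference-of-Gaussians (DoG) operator. A 1D toy version of plain DoG (the paper uses an adaptive, flow-based 2D variant): blur an intensity profile at two scales, subtract, and threshold the response; the sigmas and threshold are illustrative.

```python
import math

def gauss_kernel(sigma, radius):
    """Truncated, normalized 1D Gaussian kernel."""
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    """Convolution with border clamping."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

def dog_edges(signal, s1=1.0, s2=1.6, radius=4, thresh=0.05):
    """Indices where the difference of two Gaussian blurs exceeds a threshold."""
    d = [a - b for a, b in zip(convolve(signal, gauss_kernel(s1, radius)),
                               convolve(signal, gauss_kernel(s2, radius)))]
    return [i for i, v in enumerate(d) if abs(v) > thresh]
```

On a step profile the response peaks at the intensity jump and vanishes on flat regions, which is exactly what makes DoG output usable as contour lines for the piezoelectric display.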
44. Measurement-based Hyper-elastic Material Identification and Real-time FEM Simulation for Haptic Rendering
- Author
-
Seokhee Jeon, Seungkyu Lee, Ibragim R. Atadjanov, and Arsen Abdulali
- Subjects
Data collection ,Numerical error ,Computer science ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Mechanical engineering ,020207 software engineering ,02 engineering and technology ,Haptic rendering ,Finite element method ,Rendering (computer graphics) ,Hyper elastic ,Compressive deformation ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,ComputingMethodologies_COMPUTERGRAPHICS - Abstract
In this paper, we propose a measurement-based modeling framework for hyper-elastic material identification and real-time haptic rendering. We built a custom data-collection setup that captures shape deformation and response forces during compressive deformation of cylindrical material samples, and collected training and testing data from four silicone objects with various material profiles. We design an objective function for material parameter identification that incorporates both shape deformation and reactive forces, and minimize it with a genetic algorithm. For rendering object deformation, we adopt an optimization-based Finite Element Method (FEM). The numerical error of the simulated forces was found to be perceptually negligible.
- Published
- 2019
45. Force Sensorless Haptic Probe Driven by Large Circular Linear Motor for Haptic Rendering
- Author
-
Takuya Matsunaga, Hiroshi Asai, Tomoyuki Shimono, and Kouhei Ohnishi
- Subjects
InformationSystems_INFORMATIONINTERFACESANDPRESENTATION(e.g.,HCI) ,Computer science ,business.industry ,020208 electrical & electronic engineering ,Stiffness ,02 engineering and technology ,Virtual reality ,Linear motor ,Haptic rendering ,Acceleration ,Circular motion ,0202 electrical engineering, electronic engineering, information engineering ,medicine ,Robot ,Table (database) ,Computer vision ,Artificial intelligence ,medicine.symptom ,business ,ComputingMethodologies_COMPUTERGRAPHICS ,Haptic technology - Abstract
Exploiting haptic sensation in robotics has advantages in various fields. Haptic rendering, a technology for contacting virtual objects, is useful for virtual reality and for simulation devices in the medical and industrial fields. To realize these applications, technologies that extract environmental information and reconstruct virtual environments have been studied. In this paper, a force-sensorless probe for acquiring environmental information for a haptic rendering system is presented. Environmental information such as shape and stiffness is automatically recorded by the haptic probe, which is driven by linear motors. By utilizing a large circular linear motor with circular motion, the haptic information of environments with a two-dimensional shape can be extracted by a device with a simple structure. In the experiments, the haptic probe contacts an environment with two different stiffnesses. The recorded environmental information is processed and rendered to an operator by an XY table composed of two linear motors. The experimental results show that the system can capture environmental information and use it for haptic rendering.
- Published
- 2019
46. Neck strap haptics
- Author
-
Hironori Mitake, Akihiko Shirai, Shoichi Hasegawa, and Yusuke Yamazaki
- Subjects
Movement (music) ,Computer science ,Perception ,media_common.quotation_subject ,Haptic perception ,Actuator ,Haptic rendering ,Algorithm ,Haptic technology ,media_common - Abstract
In this poster, we propose a new haptic rendering algorithm that dynamically modulates wave parameters to convey distance, direction, and object type, exploiting neck perception via the Hapbeat-Duo, a haptic device composed of two actuators linked by a neck strap. This method is useful for various VR use cases because it provides feedback without disturbing the user's movement. In our experiment, we presented haptic feedback as sine waves dynamically modulated according to the direction and distance between a player and a target, presented independently to both sides of the user's neck. As a result, players could reach invisible targets and immediately know when they had reached them. The proposed algorithm allows the neck to become as important a receptive part of the body as the eyes, ears, and hands.
- Published
- 2019
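The modulation described above maps target distance and direction onto per-actuator wave parameters. A sketch of one plausible mapping; the pan law, ranges, and frequency scaling are invented for illustration and are not Hapbeat-Duo's actual scheme.

```python
import math

def neck_cue(distance, bearing, max_dist=5.0):
    """Map target distance [m] and bearing [rad, 0 = straight ahead] to
    left/right actuator amplitudes and a shared carrier frequency."""
    strength = max(0.0, 1.0 - distance / max_dist)  # closer target, stronger cue
    pan = 0.5 * (1.0 + math.sin(bearing))           # -pi/2 -> left, +pi/2 -> right
    freq = 80.0 + 120.0 * strength                  # higher pitch when near
    return {"left": strength * (1.0 - pan), "right": strength * pan, "freq_hz": freq}
```

Driving the two actuators with these amplitudes yields an intensity gradient the wearer can follow toward the target without any visual cue.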
47. FeelVR: Haptic Exploration of Virtual Objects
- Author
-
Timo Götzelmann and Julian Kreimeier
- Subjects
Focus (computing) ,Future studies ,Computer science ,05 social sciences ,Wearable computer ,020207 software engineering ,02 engineering and technology ,Virtual reality ,Haptic rendering ,Human–computer interaction ,Virtual image ,0202 electrical engineering, electronic engineering, information engineering ,0501 psychology and cognitive sciences ,Augmented reality ,050107 human factors ,Haptic technology - Abstract
Interest in virtual and augmented reality has grown rapidly in recent years, and haptic interaction and its applications have recently come into focus. In this paper, we suggest exploring virtual objects using off-the-shelf VR game controllers, held like a pen with both hands and used to palpate and identify a virtual object. Our study largely coincides with comparable previous work and shows that a ready-to-use VR system can, in principle, be used for haptic exploration. The results indicate that virtual objects are recognized more effectively with closed eyes than with open eyes; in both cases, objects with larger morphological differences were identified most frequently. The limitations in the quality and quantity of tactile feedback should be tackled in future studies using currently developed wearable haptic devices and haptic rendering involving all fingers or even both hands. Objects could thus be identified more intuitively, and haptic feedback devices for interacting with virtual objects would be further disseminated.
- Published
- 2019
48. A Continuous Material Cutting Model with Haptic Feedback for Medical Simulations
- Author
-
Rene Weller, Maximilian Kaluschke, Mario Lorenz, and Gabriel Zachmann
- Subjects
Scheme (programming language) ,Computer science ,Haptic rendering ,Computer Science::Robotics ,Constraint (information theory) ,Modeling and simulation ,Computer Science::Graphics ,Collision response ,Collision detection ,computer ,Simulation ,ComputingMethodologies_COMPUTERGRAPHICS ,Haptic technology ,computer.programming_language - Abstract
We present a novel haptic rendering approach to simulate material removal in medical simulations at haptic rates. The core of our method is a new massively-parallel continuous collision detection algorithm in combination with a stable and flexible 6-DOF collision response scheme that combines penalty-based and constraint-based force computation.
- Published
- 2019
49. Haptic Force Guided Sound Synthesis in Multisensory Virtual Reality (VR) Simulation for Rigid-Fluid Interaction
- Author
-
Shiguang Liu and Haonan Cheng
- Subjects
Depth sounding ,Modality (human–computer interaction) ,Human–computer interaction ,Computer science ,Real-time simulation ,Synchronization (computer science) ,Immersion (virtual reality) ,Virtual reality ,Haptic rendering ,Motion (physics) ,Haptic technology - Abstract
This paper tackles the challenging problem of interactive rigid-fluid interaction sound synthesis. A core issue of rigid-fluid interaction in a multisensory VR system is how to balance algorithm efficiency, result authenticity, and result synchronization. Since the sampling rate of audio is far greater than that of the visual and haptic modalities, sound synthesis for a multisensory VR system is more difficult than visual simulation or haptic rendering, and remains an open challenge. This paper therefore focuses on developing an efficient sound synthesis method tailored for a multisensory system. To improve result authenticity while ensuring real-time performance and synchronization, we propose a novel haptic-force-guided granular sound synthesis method tailored for multisensory VR systems. To the best of our knowledge, this is the first work that exploits haptic force feedback from the tactile channel to guide sound synthesis in a multisensory VR system. Specifically, we propose a modified spectral granular sound synthesis method that ensures real-time simulation and improves result authenticity. Then, to balance algorithm efficiency and result synchronization, we design a multi-force (MF) granulation algorithm that avoids repeated analysis of fluid particle motion, thereby improving synchronization performance. Various results show that the proposed method effectively overcomes the limitations of existing methods in the audio modality and has great potential to support building more immersive multisensory VR systems.
- Published
- 2019
50. Direct Visual and Haptic Volume Rendering of Medical Data Sets for an Immersive Exploration in Virtual Reality
- Author
- Esther I. Zoller, Georg Rauter, Philippe C. Cattin, Balázs Faludi, Nicolas Gerig, and Azhar Zam
- Subjects
medicine.medical_specialty ,Computer science ,Medical simulation ,Work (physics) ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Volume rendering ,Virtual reality ,Haptic rendering ,computer.software_genre ,Surgical planning ,Human–robot interaction ,030218 nuclear medicine & medical imaging ,03 medical and health sciences ,0302 clinical medicine ,Voxel ,Computer graphics (images) ,Immersion (virtual reality) ,medicine ,computer ,030217 neurology & neurosurgery ,Realism ,ComputingMethodologies_COMPUTERGRAPHICS ,Haptic technology - Abstract
Visual examination of volumetric medical data sets in virtual reality offers an intuitive and immersive experience. To further increase the realism of virtual environments, haptic feedback can be added. Such systems can help students gain anatomical knowledge or help surgeons prepare for specific interventions. In this work, we present a method for direct visual and haptic rendering of volumetric medical data sets in virtual reality. The method guarantees a continuous force field and does not rely on any mesh or surface generation. Using a transfer function, we mapped computed tomography voxel intensities to color and opacity values and then visualized the anatomical structures with a direct volume rendering approach. A continuous haptic force field was generated from a conservative potential field computed from the voxel opacities. In a path-following experiment, we showed that the deviation from a reference path on the surface of the rendered anatomical structure decreased with the added haptic feedback. This system demonstrates an immersive exploration of anatomy and is a step towards patient-specific surgical planning and simulation.
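The opacity-to-force pipeline described in the abstract can be sketched as follows. All function names, the linear opacity ramp, and the nearest-voxel sampling are illustrative assumptions, not the paper's implementation; the key idea shown is that the force is the negative gradient of a potential field derived from voxel opacities, so no mesh is needed.

```python
import numpy as np

def opacity_transfer(intensity, lo=300.0, hi=1500.0):
    """Map voxel intensities to [0, 1] opacity with a linear ramp.
    The ramp endpoints are illustrative, not the paper's values."""
    return np.clip((intensity - lo) / (hi - lo), 0.0, 1.0)

def haptic_force(volume, pos):
    """Repulsive force as the negative gradient of a potential field
    built from voxel opacities. For brevity we sample the gradient at
    the nearest voxel; a real system would interpolate continuously."""
    potential = opacity_transfer(volume)
    gx, gy, gz = np.gradient(potential)
    i, j, k = (int(round(c)) for c in pos)
    return -np.array([gx[i, j, k], gy[i, j, k], gz[i, j, k]])

# example: intensity ramps up along x, so opacity (and the potential)
# grows with x and the force pushes the probe back toward low opacity
vol = np.arange(16).reshape(16, 1, 1) * np.ones((16, 16, 16)) * 100.0
f = haptic_force(vol, (5, 5, 5))
```

Because the force is the gradient of a scalar potential, it is conservative by construction, which matches the abstract's claim of a continuous force field without surface extraction.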
- Published
- 2019