Current robotic endoluminal interventions require surgeons to hold a proximal joystick to control the distal flexible robot under 2-D X-ray guidance. However, 2-D X-ray images are not intuitive, which not only increases the risk of surgical misoperation but also adds to surgeons' workload. Moreover, contact teleoperation exposes surgeons to a potentially infectious environment. To address these issues, this article proposes an augmented-reality-assisted touchless teleoperated robot for endoluminal intervention, called ARei, which aims to provide an immersive experience with augmented information. The robot integrates perceptual information obtained from an electromagnetic (EM) sensor, a shape sensor, and the virtual anatomy into a head-mounted display (HMD). The robot is controlled by touchless teleoperation with the assistance of gesture recognition. Results show that the mean error of the calibration between the HMD and the EM tracking system is 4.67 mm, and the mean distance error between the points measured by the EM sensor and the points obtained by shape reconstruction with calibration is 5.19 mm (3.05%).
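The calibration errors reported above are mean distances between corresponding points after aligning two coordinate frames (e.g., the HMD and EM tracker frames). A standard way to compute such a rigid alignment is the Kabsch/SVD point-set registration; the sketch below illustrates that metric. The function names and the use of NumPy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kabsch_registration(src, dst):
    """Estimate the rigid transform (R, t) mapping src points onto dst
    using the Kabsch (SVD-based) algorithm. src, dst: (N, 3) arrays of
    corresponding points in the two coordinate frames."""
    src_c = src - src.mean(axis=0)          # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T                      # rotation: R @ src ~ dst
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def mean_registration_error(src, dst, R, t):
    """Mean Euclidean distance between transformed src points and dst,
    i.e., the kind of mm-scale error figure quoted in the abstract."""
    residuals = (R @ src.T).T + t - dst
    return float(np.linalg.norm(residuals, axis=1).mean())
```

For example, collecting paired positions of a stylus tip in both the HMD and EM frames, registering them with `kabsch_registration`, and evaluating `mean_registration_error` on held-out points would yield a calibration error of the type reported here.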