Copyright is held by the author / owner(s).
SIGGRAPH Asia 2011, Hong Kong, China, December 12 – 15, 2011.
ISBN 978-1-4503-0807-6/11/0012
NAVIgoid: Robot Navigation with Haptic Vision
Junichi Sugiyama*, Dzmitry Tsetserukou, Jun Miura
EIIRIS, Toyohashi University of Technology
1 Introduction
A telepresence robotic system allows a person to feel as if they
were present at a place other than their true location. The sense of
telexistence is provided by stimuli such as vision, hearing, and
touch [1]. The user of such a system can act on the remote location;
hence, the user's position and actions must be sensed and
transmitted to the remote robot (teleoperation).
There is a substantial need for an interface and telepresence robotic
system that allow intuitive and immersive control of the robot.
Commonly used interfaces (joystick, keyboard, mouse, PHANTOM)
provide simple but not immersive control of planar robot
movement [2]. With these controllers, the human hand is occupied
with teleoperation of the remote robot and cannot be used to direct
the robotic arm, hand, and fingers of a manipulator mounted on the
mobile platform. The purpose of our work is to develop a new type
of tactile interface and telepresence robotic system that makes the
operator feel "embodied".
The factors that affect the level of immersion are the type of
visual facilities (monitor, virtual-reality goggles), auditory
feedback, and haptic perception of the remote environment. The
novelty of our idea is to engage the user in teleoperation and
provide a high level of immersion through proprioceptive
telepresence and tactile feedback. The developed interface allows
the operator to use their body posture and gestures to control the
mobile robot while simultaneously feeling the remote object
through tactile stimuli (Fig. 1(a)).
2 Principle and Technologies
The mobile robot is equipped with two laser range finder (LRF)
sensors that together scan 360 degrees. The developed algorithms
allow the mobile robot to detect the distance, shape, and velocity
(with a Kalman filter) of an object robustly despite LRF scan noise.
An example of LRF scan data is given in Fig. 1(b): the black point
depicts the robot location, the red point the moving object location,
and the red line the velocity vector.
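As a rough illustration of the tracking step, a constant-velocity Kalman filter can estimate an object's velocity from its noisy scan positions. The matrices, noise levels, and scan period below are illustrative assumptions, not the system's actual parameters:

```python
import numpy as np

# Minimal constant-velocity Kalman filter tracking an object's 2D
# position and velocity from noisy position measurements (e.g. an LRF
# object detection). All numeric values here are assumed.
DT = 0.1  # scan period in seconds (assumed)

# State x = [px, py, vx, vy]; measurement z = [px, py]
F = np.array([[1, 0, DT, 0],
              [0, 1, 0, DT],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)  # constant-velocity model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # only position is observed
Q = np.eye(4) * 0.01                        # process noise (assumed)
R = np.eye(2) * 0.05                        # measurement noise (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle; returns the new state and covariance."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the measured object position z
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Example: noisy observations of an object moving at (1, 0) m/s
x = np.zeros(4)
P = np.eye(4)
rng = np.random.default_rng(0)
for t in range(50):
    true_pos = np.array([1.0 * t * DT, 0.0])
    z = true_pos + rng.normal(scale=0.05, size=2)
    x, P = kalman_step(x, P, z)
print(x[2:])  # estimated velocity, converging toward (1.0, 0.0)
```

The estimated velocity vector is what would be rendered as the red line in Fig. 1(b).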
The human operator can change the robot's traveling direction in a
smooth and natural manner by twisting and bending the trunk
(Fig. 1(b)). The resistance of each flex sensor changes with the
amount of bend. The torso, together with the flex sensors, acts as a
joystick. For example, to move the robot forward or backward, the
user leans the torso slightly forward or backward, respectively. The
velocity of the robot is proportional to the trunk tilt angle. When
the operator straightens up, the robot stops smoothly. Such
operation lets the human experience a sense of complete, natural,
instinctive, and safe control.
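The torso-as-joystick mapping might be sketched as follows; the gains, dead zone, and tilt limits are assumed values, not those of the actual interface:

```python
# Sketch of mapping trunk tilt to robot velocity. A dead zone around
# the upright posture stops the robot; beyond it, velocity grows
# proportionally with the tilt angle. All constants are assumptions.
MAX_LINEAR = 0.6    # m/s   (assumed)
MAX_ANGULAR = 0.8   # rad/s (assumed)
DEAD_ZONE = 3.0     # degrees; straightening up stops the robot
MAX_TILT = 25.0     # degrees of tilt giving full speed (assumed)

def torso_to_velocity(pitch_deg, twist_deg):
    """Map trunk pitch (forward/backward lean) and twist to a
    (linear, angular) velocity command."""
    def scaled(angle, limit):
        if abs(angle) < DEAD_ZONE:
            return 0.0
        s = (abs(angle) - DEAD_ZONE) / (MAX_TILT - DEAD_ZONE)
        return min(s, 1.0) * limit * (1 if angle > 0 else -1)

    v = scaled(pitch_deg, MAX_LINEAR)   # lean forward -> move forward
    w = scaled(twist_deg, MAX_ANGULAR)  # twist trunk  -> turn
    return v, w

print(torso_to_velocity(0.0, 0.0))   # upright: robot stops
print(torso_to_velocity(25.0, 0.0))  # full forward lean: full speed
```

The dead zone is what makes straightening up stop the robot smoothly rather than leaving it drifting on small postural noise.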
In ProInterface, we employ tactile stimuli as a modality to deliver
information about the remote environment. This frees the operator's
visual faculties for exploring the remote dynamic environment. The
device is a wearable belt integrated with 16 vibration motors
(tactors), four flex sensors, a 3-axis accelerometer, and plastic
holders linked by an elastic band (Fig. 1(c)). The tactors are
equally distributed around the user's waist. The motors vibrate to
produce tactile stimuli indicating the direction, distance, shape,
and mobility of a moving obstacle (object, human, etc.). The
developed algorithm analyzes the information about the environment
and sends it to the wearable master robot. For example, when a
detected obstacle is located on the right side of the robot, the
user feels the vibration of the tactor on their right side.
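The direction cue can be illustrated by mapping an obstacle's bearing to the nearest of the 16 evenly spaced tactors; the indexing convention below is an assumption:

```python
# Sketch of the direction cue: 16 tactors evenly spaced around the
# waist, each covering a 22.5-degree sector. The tactor whose sector
# contains the obstacle's bearing vibrates. Index 0 = robot front,
# counting counterclockwise (assumed convention).
NUM_TACTORS = 16

def bearing_to_tactor(bearing_deg):
    """Map an obstacle bearing in degrees (0 = straight ahead,
    counterclockwise positive) to the nearest tactor index."""
    sector = 360.0 / NUM_TACTORS
    return int(((bearing_deg % 360.0) + sector / 2) // sector) % NUM_TACTORS

print(bearing_to_tactor(270))  # obstacle to the right -> right-side tactor
```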
The belt interface provides the wearer with high-resolution
vibrotactile signals; thus, it can also present the shape and speed
of an object. For example, a convex obstacle is presented by the
simultaneous activation of three tactors with different vibration
intensities: the vibration frequency of the middle tactor is higher
than that of its neighbors (Fig. 1(b)). A mobile object is
represented by tactile stimuli moving along the waist in the
direction of the object's travel (Fig. 1(b)). This haptic vision
allows the operator to feel the entire space around the mobile
robot. The stereoscopic 3D image from the robot's cameras is
transferred to the HMD over a wireless link.
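The shape and motion cues above can be sketched as intensity patterns over the tactor ring; the intensity values and helper names are illustrative assumptions:

```python
# Sketch of shape and motion cues on a ring of 16 tactors, with
# intensities in [0, 1]. A convex obstacle drives three adjacent
# tactors, middle one strongest; object motion is suggested by
# shifting the pattern around the waist. Values are assumed.
NUM_TACTORS = 16

def convex_pattern(center, peak=1.0, side=0.4):
    """Three neighboring tactors active, the middle one strongest."""
    pattern = [0.0] * NUM_TACTORS
    pattern[(center - 1) % NUM_TACTORS] = side
    pattern[center % NUM_TACTORS] = peak
    pattern[(center + 1) % NUM_TACTORS] = side
    return pattern

def moving_pattern(center, step):
    """Shift the pattern by `step` tactors in the object's travel
    direction; calling this each frame sweeps the stimulus along
    the waist."""
    return convex_pattern((center + step) % NUM_TACTORS)

p = convex_pattern(0)
print(p[15], p[0], p[1])  # side, peak, side intensities
```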
The developed technology can potentially have a large impact on
multi-modal communication with remote robots, engaging the user to
employ as many senses as possible: vision, hearing, touch, and
proprioception (posture, gestures). We believe that such a
telepresence robotic system will result in a high level of
immersion into the robot's space.
References
[1] TACHI, S. 2009. Telexistence. World Scientific Pub. Co., Singapore.
[2] CHO, S. K., JIN, H. Z., LEE, J., AND YAO, B. 2010. Teleoperation of a Mobile Robot Using a Force-Reflection Joystick with Sensing Mechanism of Rotating Magnetic Field. IEEE/ASME Transactions on Mechatronics, 15(1): 17-26, Feb. 2010.

*E-mail: sugiyama@aisl.cs.tut.ac.jp
Figure 1: a) Telepresence robotic system b) Mobile robot control and feedback c) ProInterface: tilt torso – feel object
[Figure 1 labels: human operator with wearable master robot (ProInterface belt: elastic band, flex sensors I–IV, accelerometer, geomagnetic sensor, tactors around the user's waist); mobile robot PeopleBot (MobileRobots) with robot-side PC (Intel Core i7), LRF UHG-08LX (HOKUYO), stereo camera FFMV-03 (Point Grey Research); HMD VR920 (Vuzix); robot motion control by torso tilt; tactile and visual feedback.]