AR and tele-existence. The Proceedings of this conference have just appeared and are entirely in English; I recommend them highly to anyone interested in getting a quick survey of current research activities in Japan. Finally, note that a newsgroup, "sci.virtual-worlds," is operating. Tachi showed me the list of recent postings, and it is clear that this is a very active communication vehicle.

DISCUSSION OF THE VR ASPECTS OF THE JAPANESE ECONOMIC PLANNING AGENCY SURVEY

Last year, I electronically distributed a report I wrote summarizing a survey from the Japanese Economic Planning Agency (EPA) on technology through the year 2010 (see 2010.epa, 27 Sep 1991). VR was one of the topics discussed there. In the complete Japanese report more explanatory detail was presented. The EPA estimates that practical realization will be sometime around 2020 (this seems further downstream than I would estimate). EPA continues:

Regarding the comparison of various countries' R&D [research and development] efforts at the present juncture in the VR field, both Japan and the U.S. are actively working on research and development of the system. At this particular point, however, the U.S. is more actively pursuing the system research. In the U.S., for instance, the aerospace industry's R&D projects and national-level projects in support of the industry are conducting truly large-scale simulations and are constructing excellent databases.

Key technologies requiring breakthrough will be those used in development of 3D simulation models, supercomputer application technology, and self-growth database technology.

One form of social constraint standing in the way of practical utilization of VR will be the effects of its possible application to transportation systems as a part of the social infrastructure [sic]. Moreover, it will be influenced by the general public's perception and value judgments regarding the system. Economic constraints will affect business' attempts to establish a market and to drive costs down. Moreover, the need for securing technical specialists in the software development field and the resultant shortage in R&D funds, as well as other difficulties involved in recruiting R&D personnel, must be dealt with.

VR's market scale is estimated as reaching approximately the ¥1T level. Accompanying this market, there will be an increase in the number of related industrial firms' research labs to as many as 100. Software applications fields will be extensive, and a large number of R&D divisions will be pursuing product research and development targeted for various social strata.

Positive impacts created by VR on the industrial economy will include the emergence of a new simulation industry (which probably will maintain supercomputers) and revitalization of computer software houses, computer industries, the entertainment industry, and the aerospace industry. The secondary effects will be experienced by the information industry, the general communication-related industry, the publishing industry, newspaper and magazine businesses, and TV and radio broadcasting.

Negative impacts will be felt by industries which have tended to hold on to hardware-oriented products.

VR WORK AT TACHI'S LABORATORY

The point of this article is not to summarize VR work generally but only to describe the specific directions being taken at one laboratory. Tachi uses the term "tele-existence" to denote the technology that enables a human to have a real sensation of being at another place and to interact with the remote environment, which can be real or artificial. This is clearly an extension of the sensation we have when we use a joystick to move the figures around in a video game. In the United States, the corresponding term is "telepresence."

Related to this is work to amplify human muscle power and sensing capability by using machines, while keeping human dexterity and the sensation of direct operation. This work dates back at least to the 1960s, with the development of exoskeletons that could be worn like a garment and were intended to provide great strength while keeping the wearer safe. Those projects were not successful; damage to the garment would endanger the wearer. Further, the technology at that time did not allow enough room for both the human and the equipment needed to control the skeleton.

Another idea was that of teleoperation, or supervisory control, in which a human "master" moves and a "slave" (robot) is synchronized to follow. Tele-existence extends this notion: the human really should have the sensation that he/she is performing the functions of the slave. This occurs if the operator is provided with a very rich presentation of the sensations that the slave has acquired.

In Tachi's laboratory, the tele-existence master-slave system consists of a master system with visual and auditory presence sensation, a computer system for control, and an anthropomorphic slave robot mechanism with an arm having seven degrees of freedom and a locomotion mechanism (caterpillar track). The motions of the operator's head, right arm, right hand, and other body parts such as the feet are measured in real time by a system attached to the master. This information is sent to four MS-DOS computers (386 processors at 33 MHz with coprocessors), and each computer generates commands for the corresponding part of the slave. A servo controller governs the motion of the slave. A six-axis force sensor on the wrist joint of the slave measures the force and torque exerted on contact with an object, and this measured signal is fed back to the computer in charge of arm control via A-to-D converters. Force exerted at the hand when grasping an object is also measured by a force sensor installed on the link mechanism of the hand and fed back to the appropriate computer via another A-to-D converter. A stereo visual and auditory input system is mounted on the neck of the slave; its output is sent back to the master and displayed on a stereo display system in the operator's helmet. Many characteristics of the robot are similar to those of a human, for example, the dimensions and arrangement of the degrees of freedom, the motion range of each degree of freedom, and the speed of movement, and the motors are arranged so that the appearance of the robot arm resembles a human's.
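To make the data flow above concrete, here is a minimal Python sketch of one cycle of such a bilateral master-slave loop, with stub hardware classes so it runs standalone. The class and method names, the cycle rate, and the force-reflection gain are all assumptions for illustration; they are not the interfaces of Tachi's actual system.

    import time

    # Stub hardware classes so the sketch runs standalone; the real system
    # divides this work among four MS-DOS control computers.
    class MasterStub:
        def read_arm_joints(self):          # 7 joint angles (radians)
            return [0.0] * 7
        def read_grip(self):                # grip opening, 0.0 (closed) to 1.0 (open)
            return 1.0
        def apply_force(self, fxyz):        # force reflected to the operator's hand
            pass

    class SlaveStub:
        def command_arm_joints(self, q):    # position commands to the servo controller
            pass
        def command_grip(self, g):
            pass
        def read_wrist_wrench(self):        # 6-axis force/torque sensor via A-to-D
            return (0.0,) * 6

    CYCLE_S = 0.005      # assumed 200-Hz cycle; the actual rate is not stated
    FORCE_GAIN = 0.8     # assumed scaling of reflected contact force

    def control_cycle(master, slave):
        q = master.read_arm_joints()                     # 1. measure the operator
        slave.command_arm_joints(q)                      # 2. drive the slave arm
        slave.command_grip(master.read_grip())
        wrench = slave.read_wrist_wrench()               # 3. sense contact forces
        master.apply_force([FORCE_GAIN * f for f in wrench[:3]])  # 4. reflect them

    if __name__ == "__main__":
        master, slave = MasterStub(), SlaveStub()
        for _ in range(10):                              # a short simulated run
            t0 = time.time()
            control_cycle(master, slave)
            # Hold the cycle time; on MS-DOS this would be interrupt-driven instead.
            time.sleep(max(0.0, CYCLE_S - (time.time() - t0)))

The point of the sketch is only to show the loop structure the paragraph describes: master measurements drive the slave's servos, and the sensed contact forces are fed back so the operator feels the grasp.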

Measured master movements are also sent to a Silicon Graphics (SG) Iris workstation. This generates two shaded graphic images that are applied to the 3D display via superimposers. Measured information on the master's movement is used to change the viewing angle, the distance to the object, and the relation between the object and the hand in real time. The operator sees the 3D virtual environment in his/her view, which changes with movement. Interaction can be either with the real environment, which the robot observes, or with the virtual environment, which the computer generates.
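As a rough illustration of how measured head motion can drive the stereo rendering, the following Python sketch turns an assumed head pose (position plus yaw and pitch) into a pair of eye positions and a viewing direction. The 65-mm eye separation and the pose format are assumptions, not parameters of the Tokyo system.

    import math

    EYE_SEPARATION_M = 0.065   # assumed interocular distance

    def stereo_cameras(head_pos, yaw, pitch):
        """head_pos: (x, y, z) in meters; yaw and pitch in radians.
        Returns (left_eye, right_eye, view_dir), each a 3-tuple."""
        # Forward (viewing) direction from yaw and pitch.
        view_dir = (math.cos(pitch) * math.cos(yaw),
                    math.cos(pitch) * math.sin(yaw),
                    math.sin(pitch))
        # "Right" direction, perpendicular to the view in the horizontal plane.
        right = (math.sin(yaw), -math.cos(yaw), 0.0)
        half = EYE_SEPARATION_M / 2.0
        left_eye  = tuple(p - half * r for p, r in zip(head_pos, right))
        right_eye = tuple(p + half * r for p, r in zip(head_pos, right))
        return left_eye, right_eye, view_dir

    if __name__ == "__main__":
        # Each display frame, the measured head motion updates both eye cameras,
        # so the rendered scene changes with the operator's movement.
        left, right, direction = stereo_cameras((0.0, 0.0, 1.6), yaw=0.3, pitch=-0.1)
        print(left, right, direction)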

There are many details of the system that are carefully explained in Tachi's papers dating back to the mid-1980s and need not be repeated here.

I wondered about the choice of computer systems. Tachi commented that he preferred DOS to Unix for the control computers because DOS made it easier to process real-time interrupts. On the other hand, if workstation performance is high enough, interrupts can be handled in "virtually" real time. The SG is fast enough for the needed graphics as long as the operator does not move his/her head too rapidly. Clearly, workstation performance is an important consideration in real-time computer graphics.

Tachi demonstrated the system by sitting on a special "barber chair," putting on the large helmet, and inserting his right arm into a movable sling containing a grasping device that approximates the robot's arm. His left arm held a joystick that controls locomotion: forward, back, right, left, and rotation of the robot. Once so configured, he proceeded to make the robot stack a set of three small cubes.

I tried it next. The helmet is large and bulky but is equipped with a small fan so that there is good air circulation. It is also heavy, but it is carried on a link mechanism that cancels all gravitational forces, though not inertia, somewhat like wearing the helmet underwater. The color display (one for each eye) is composed of two 6-inch LCDs (720x240 pixels). Resolution is good, better than I expected, but not crystal clear. Tachi explained that humans obtain higher apparent resolution by moving their heads when looking at objects, and that the same effect works in the helmet. He also explained that the 3D view has the same spatial relation as in direct observation (this is one place where workstation performance is needed), and tuning of the system in this area is one of the things that Tachi and his colleagues have been working on for almost 10 years. Tachi claims that operators can use the system for several hours without tiring or nausea; I am not sure if I could last that long. Nevertheless, I was able, on the first try, to move the robot to within grasping distance of the table, lift the three small blocks, and stack them without dropping any. Training time: zero.
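A small worked example, with invented numbers, shows why such a counterbalanced link can cancel the helmet's weight but not its inertia: the counterweight's moment cancels the gravity torque about the pivot, yet the operator must still accelerate the total moving mass.

    # Toy statics example; the masses and lever arms are invented purely for
    # illustration, as the actual helmet parameters were not given.
    G = 9.81                      # m/s^2

    helmet_mass = 3.0             # kg (assumed)
    helmet_arm  = 0.40            # m, pivot-to-helmet distance (assumed)

    # Static balance: the counterweight's moment equals the helmet's moment,
    #   m_c * g * l_c = m_h * g * l_h
    counter_arm  = 0.20           # m (assumed)
    counter_mass = helmet_mass * helmet_arm / counter_arm   # = 6.0 kg

    gravity_torque = helmet_mass * G * helmet_arm - counter_mass * G * counter_arm
    print("net gravity torque about the pivot:", gravity_torque, "N*m")   # ~0

    # But turning the head still accelerates the *total* moving mass, which is
    # why the helmet feels weightless yet not massless.
    print("mass that must still be accelerated:", helmet_mass + counter_mass, "kg")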

This is one of the most advanced experiments of its kind in Japan. Tachi's laboratory is very close to downtown Tokyo and would be easy to reach.

Please refer to the reports referenced above as well as the references below for further descriptions of this project.

• S. Tachi et al., “Tele-existence (I): Design and evaluation of a visual display with sensation of presence," in Proceedings of RoManSy'84, Udine, Italy, 26-29 June 1984.

• S. Tachi et al., "Tele-existence in real world and virtual world," in ICAR'91 (Fifth International Conference on Advanced Robotics), 19-22 June 1991, Pisa, Italy.

JAPANESE RESEARCH IN INTELLIGENT AUTONOMOUS ROBOT CONTROL

During November 1991, this office supported Prof. Yutaka Kanayama, of the Naval Postgraduate School, to visit Japan and study and report on Japanese research in the area of intelligent autonomous robots. His article centers on a major international conference that he attended, Intelligent Robots and Systems (IROS), as well as several site visits.

by Yutaka Kanayama

INTRODUCTION

Robotics is one of the subareas of computer science in which Japan shows competence comparable to that of the United States. Three Intelligent Robots and Systems (IROS) workshops have already been held in Japan, and this year's workshop is the fourth in the series.

The concept of autonomous robots, which has been attracting many researchers, has numerous applications, such as tasks in high-performance manufacturing, in hazardous environments, and in warfare. The research in autonomous robotics presented at the workshop is summarized below, as are visits to several universities and research laboratories with activities in robotics.

IROS WORKSHOP

The workshop was chaired by

Prof. Fumio Miyazaki
Dept of Mechanical Engineering
Osaka University

1-1 Machikaneyama-cho
Toyonaka, Osaka 560, Japan
Tel: +81-6-844-1151

E-mail: miyazaki@crane.mees.osaka-u.ac.jp


Despite its name, this was more like a conference than a workshop. The presentations given were from many countries including Japan, the United States, France, the United Kingdom, Italy, Germany, Korea, and China. We will focus on the following talks, which represent major activities in autonomous robotics in Japan. The page numbers refer to the Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems '91.

"Estimating Location and Avoiding Collision Against Unknown Obstacle for the Mobile Robot Using Omnidirectional Image Sensor COPIS," Y. Yagi, Y. Nishizawa, and M. Yachida, Osaka University, pp 909-914. COPIS (COnic Projection Image Sensor) is an image sensor using a conic mirror that had already been proposed by the authors. Since this sensor is able to obtain a panoramic 360° view, it is straightforward to extract vertical edges in the robot's environment. This paper reports a method for estimating the location and the motion of the robot by detecting the azimuth of each object in the omnidirectional image. A method for matching azimuth orientations with a given environmental model is described. This research is led by

Prof. Masahiko Yachida
Dept of Information and
Computer Science
Osaka University
1-1 Machikaneyama-cho
Toyonaka, Osaka 560, Japan
Tel: +81-6-844-1151
E-mail: yachida@ics.osaka-u.ac.jp
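The azimuth-based localization idea can be illustrated with a minimal Python sketch: given the map positions of a few vertical edges and the azimuths at which they appear in the panoramic image, search for the robot pose that best explains the measurements. The landmark coordinates, search grid, and brute-force approach below are assumptions chosen for brevity; they are not the authors' actual matching algorithm.

    import math

    def wrap(a):
        """Wrap an angle to (-pi, pi]."""
        return math.atan2(math.sin(a), math.cos(a))

    def pose_error(pose, landmarks, azimuths):
        x, y, th = pose
        err = 0.0
        for (lx, ly), az in zip(landmarks, azimuths):
            predicted = wrap(math.atan2(ly - y, lx - x) - th)
            err += wrap(predicted - az) ** 2
        return err

    def locate(landmarks, azimuths, step=0.1, ang_step=math.radians(2)):
        best, best_err = None, float("inf")
        # Brute-force grid search over a small assumed 5 m x 5 m workspace.
        for ix in range(51):
            for iy in range(51):
                for ia in range(int(2 * math.pi / ang_step)):
                    pose = (ix * step, iy * step, wrap(ia * ang_step))
                    e = pose_error(pose, landmarks, azimuths)
                    if e < best_err:
                        best, best_err = pose, e
        return best

    if __name__ == "__main__":
        landmarks = [(0.0, 0.0), (5.0, 0.0), (5.0, 5.0), (0.0, 5.0)]  # assumed map
        true_pose = (2.0, 1.5, math.radians(30))
        azimuths = [wrap(math.atan2(ly - true_pose[1], lx - true_pose[0]) - true_pose[2])
                    for lx, ly in landmarks]
        print(locate(landmarks, azimuths))   # should be close to the true pose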

"A Method of Indoor Mobile Robot Navigation by Using Fuzzy Control," S. Ishikawa, IBM Japan, Ltd., pp 10131018. This paper was presented by

Shigeki Ishikawa

Tokyo Research Laboratory
IBM Japan, Ltd.

5-19, Sanbancho
Chiyoda-ku, Tokyo 102, Japan
Tel: +81-3-3288-8379
Fax: +81-3-3265-4370
E-mail: isikawa@jjjpntscvm.bitnet

A sensor-based navigation method for an autonomous mobile robot using fuzzy control was presented. The purpose of using fuzzy logic was to encode expert knowledge for efficient and better piloting of the autonomous mobile robot. Its first task was motion tracking: the robot senses positional and orientational errors from its planned path to decide the next incremental motion. The second task was obstacle avoidance: the robot senses the size and shape of the open area ahead to avoid stationary and moving obstacles. The system includes 155 fuzzy rules. Although only simulation results are reported in the paper, a real robot's behavior was presented in the videotape demonstration. Fine tuning of the rule base and membership functions is done on the simulator. Several simulation results for stationary and moving obstacles were presented.
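The flavor of such a fuzzy controller can be conveyed with a minimal Python sketch for the motion-tracking task: fuzzify the lateral and heading errors from the planned path, fire a handful of rules, and defuzzify to a steering correction. The membership functions, the three rules, and all constants here are invented for illustration; the actual system uses 155 tuned rules.

    def tri(x, left, center, right):
        """Triangular membership function."""
        if x <= left or x >= right:
            return 0.0
        if x <= center:
            return (x - left) / (center - left)
        return (right - x) / (right - center)

    def steering_correction(lateral_err_m, heading_err_rad):
        # Fuzzify the two inputs.
        left_of_path  = tri(lateral_err_m, 0.0, 0.5, 1.0)
        right_of_path = tri(lateral_err_m, -1.0, -0.5, 0.0)
        on_path       = tri(lateral_err_m, -0.5, 0.0, 0.5)
        heading_ok    = tri(heading_err_rad, -0.3, 0.0, 0.3)

        # Three illustrative rules: (firing strength, steering output in rad).
        rules = [
            (min(left_of_path, heading_ok),  -0.2),  # drifted left  -> steer right
            (min(right_of_path, heading_ok), +0.2),  # drifted right -> steer left
            (min(on_path, heading_ok),        0.0),  # on path       -> go straight
        ]
        # Weighted-average (centroid-style) defuzzification.
        total = sum(w for w, _ in rules)
        return sum(w * out for w, out in rules) / total if total > 0 else 0.0

    if __name__ == "__main__":
        print(steering_correction(0.3, 0.05))   # slightly left of path -> negative (steer right)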

"Vehicle Command System and Trajectory Control for Autonomous Mobile Robots," S. Iida and S. Yuta, University of Tsukuba, pp 212-217. This paper presents a motion command system called Spur for autonomous mobile robots. Essentially, this functional command system enables a user to describe a path that is a sequence of straight segments and circular arcs. It includes the experimental results of implementing the principle on the robot Yamabico. [Note: I founded and led this robotics research group at the University of Tsukuba until 1984. After I left the university,

Prof. Shin'ichi Yuta

Institute of Information Science
and Electronics
University of Tsukuba
Tsukuba 305, Japan

Tel: +81-298-53-5509
Fax: +81-298-53-5206
E-mail: yuta@is.tsukuba.ac.jp

who had been my assistant, has been continuing the project.]
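One hypothetical way to represent a path built from straight segments and circular arcs, in the spirit of the Spur description above, is sketched below in Python; this is not the actual Spur command set, only an illustration of sampling such a path into (x, y, heading) points.

    import math

    def sample_path(start, commands, step=0.05):
        """start: (x, y, heading in rad); commands: list of
        ('line', length_m) or ('arc', radius_m, signed_angle_rad)."""
        x, y, th = start
        points = [(x, y, th)]
        for cmd in commands:
            if cmd[0] == "line":
                _, length = cmd
                n = max(1, int(length / step))
                for _ in range(n):
                    x += (length / n) * math.cos(th)
                    y += (length / n) * math.sin(th)
                    points.append((x, y, th))
            elif cmd[0] == "arc":
                _, radius, angle = cmd
                n = max(1, int(abs(radius * angle) / step))
                for _ in range(n):
                    dth = angle / n
                    # Advance along the chord of each small arc increment.
                    x += 2 * radius * math.sin(abs(dth) / 2) * math.cos(th + dth / 2)
                    y += 2 * radius * math.sin(abs(dth) / 2) * math.sin(th + dth / 2)
                    th += dth
                    points.append((x, y, th))
        return points

    if __name__ == "__main__":
        # Drive 1 m straight, turn left along a quarter circle of 0.5-m radius,
        # then drive another 0.5 m straight.
        path = sample_path((0.0, 0.0, 0.0),
                           [("line", 1.0), ("arc", 0.5, math.pi / 2), ("line", 0.5)])
        print(path[-1])   # final pose, approximately (1.5, 1.0, 1.57)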


"A Guide Dog Robot Harunobu-5: Following a Person," H. Mori and M. Sano, Yamanashi University, pp 397402. There are 250,000 visually impaired people in Japan, one of the reasons for the development of the guide dog robot. The authors reported that the outdoor autonomous mobile robot Harunobu-5 was able to find a person 3 to 5 meters ahead and to follow him. The robot consists mainly of a mission planner, a digital map, an interactive navigator, and an image understanding system. Shinko Electric Company An interesting technical point in this research is the model of a pedestrian, which consists of several body parts successfully functioning in interpreting color images. The system contains an MC68030 (25 MHz) OS-9 system with color image memory (512*512), two gyroscopes, two lap-top computers, and an MC68020 (16.7 MHz) system for the "follow-person" behavior. This project has been led by

Prof. Hideo Mori
Dept of Electronic Engineering

and Computer Science
Yamanashi University
Takeda-4, Kofu 400, Japan
Tel: +81-552-52-1111
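A minimal Python sketch of a "follow-person" behavior of the kind described for Harunobu-5 is given below: keep the detected pedestrian at a desired distance and centered ahead of the robot. The gains, the desired distance, and the detector interface are assumptions for illustration, not the Yamanashi implementation.

    DESIRED_DISTANCE_M = 4.0     # within the reported 3- to 5-m following range
    K_SPEED = 0.5                # (m/s) per meter of distance error (assumed)
    K_TURN = 1.0                 # (rad/s) per radian of bearing error (assumed)
    MAX_SPEED = 1.0              # m/s cap (assumed)

    def follow_step(distance_m, bearing_rad):
        """distance_m, bearing_rad: range and direction of the detected person
        relative to the robot.  Returns (forward speed, turn rate) commands."""
        speed = K_SPEED * (distance_m - DESIRED_DISTANCE_M)
        speed = max(-MAX_SPEED, min(MAX_SPEED, speed))
        turn = K_TURN * bearing_rad          # turn toward the person
        return speed, turn

    if __name__ == "__main__":
        # Person detected 5 m ahead and slightly to the left: speed up and turn left.
        print(follow_step(5.0, 0.1))   # -> (0.5, 0.1)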

"Vehicle Command System and Trajectory Control for Autonomous Mobile Robots," S. Iida and S. Yuta, University of Tsukuba, pp 212-217. This paper presents a motion command

Shinko Electric Co. has been developing and producing commercial autonomous unmanned vehicles (AUVs) that are used in clean rooms for semiconductor manufacturers.

Mr. Teppei Yamashita
Shinko Electric Co., Ltd.
100 Takegahana-cho
Ise, Mie 516, Japan
Tel: +81-596-36-1111
Fax: +81-596-36-0577

is leading the development team. The autonomous mobile robot can plan a path and automatically navigate itself using ultrasonic sensors or magnetic tapes on the floor. It has a robot manipulator arm/hand to carry wafer cassettes among various equipment for semiconductor processes (see Figure 1). It is equipped with a charge coupled device (CCD) camera to obtain precise