
dynamic programming, etc.)
- Knowledge-based search methods (heuristics)
- Simulation and optimization (artificial intelligence (AI) techniques)
- Alternative evaluation (depends on a list of alternatives)
- Human-computer interaction (not a technique but a way of making changes)

Scientists from SISIR gave an overview of an intelligent dynamic production scheduling system (IDPSS) that they hope to complete after two years of work. Another SISIR project is the creation of a knowledge-based interactive real-time control system (KIRCS). This was done jointly with U.K. scientists in collaboration with IBM. There were several papers on materials handling, as the Port of Singapore is extremely large, and automating processing there motivates factory automation by other local manufacturers.

In a supplement to this report (File ia92.abs), I give the titles, authors, and abstracts of all the papers. Here I mention a few (Asian-based) papers that were of particular interest to me. The single thread that held together many of the papers was integration within a company. That is, individual tools, such as CAD, are difficult to justify alone but instead need to be seen as one part of a company's business strategy. In that sense there were several comments to the effect that use of robotics was less than had been predicted because their efficiency within the total organization was questionable.

G.C.I. Lin's paper on AutoLay was useful because he surveyed and compared the existing PC-based facility layout programs. AutoLay is an Australian package written in LISP and integrated with AutoCAD.

Lin also spoke about CIMOS, a newly developed optimization and simulation package from his group that runs under Windows; CIMOS is an iterative system that progresses from spreadsheet to AI to knowledge-based techniques and takes a strategic view of the manufacturing structure. I thought this was a very impressive piece of work by a scientist who has been involved in manufacturing for decades. Western researchers should make an effort to communicate with Lin. (See abstract in my supplement.)

There were several papers describing experiences with simulation packages (especially on PCs), and these were probably useful to listeners who might not have access to a wide selection of programs to choose from.

To my taste, one of the most interesting papers was presented by M. Ang (National University of Singapore, E-mail: MPEANGH@NUSVM.BITNET). (Ang's Ph.D. degree is from Rochester.) Ang asks, "Why are robots not so pervasive in industries [as they should be]? Automation has penetrated manufacturing industries in doing tasks that humans cannot...requiring high positional accuracies and speed which humans do not possess." Ang's focus is on developing robots to do those things that humans can do well or with ease: opening a bottle, putting a cap onto a pen, picking up a glass of water or some eggs, etc. He feels that we need to be moving toward a generation of robots designed for jobs requiring both force and position control, with the former being the more important. This is not really new; eventually it comes down to "compliance" of the manipulator end effector. (In fact, a similar comment was made by P. Mills, from Adept, in the U.S.)

Compliance is described by the (symmetric positive definite stiffness) matrix that relates forces and torques acting on the end effector to its translational and angular displacements. What Ang wants to do is specify the matrix elements in order to perform a specified task, and then determine how to build a system with those parameters. What is interesting about this approach is that it requires the solution of an optimization problem, because the stiffness matrix elements are undetermined. As an example, Ang works through two specific problems: a cam surface follower and grinding/polishing against a rotating belt. His approach is mechanistic and involves setting various matrix elements to zero, but it strikes me that more careful analysis could make a significant improvement. In any case, this is a very interesting direction and worth encouraging. Ang presented similar ideas at IA'90, co-authored by G. Andeen, from SRI. Earlier, Andeen also wrote about compliance in the following paper.
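The role of the stiffness matrix can be shown with a toy linear model. Everything below (the diagonal K, the numbers, the peg-in-hole reading) is my illustrative assumption, not Ang's formulation; the point is only that zeroing a stiffness entry makes the end effector fully compliant along that axis.

```python
# Toy compliance model (my sketch, not Ang's): a 3x3 stiffness matrix K
# maps a small end-effector displacement dx to a reaction force f = K dx.

def force(K, dx):
    """Reaction force f = K dx for a 3x3 stiffness K (N/m) and displacement dx (m)."""
    return [sum(K[i][j] * dx[j] for j in range(3)) for i in range(3)]

# Stiff laterally (x, y) to correct positional error; zero stiffness along
# the z insertion axis, so a peg can slide in without building up force.
K = [[1000.0, 0.0, 0.0],
     [0.0, 1000.0, 0.0],
     [0.0, 0.0, 0.0]]

dx = [0.002, -0.001, 0.05]   # 2 mm and -1 mm lateral error, 50 mm insertion
print(force(K, dx))          # lateral restoring forces; zero force along z
```

Choosing which entries to zero is exactly the task-dependent design choice Ang formulates as an optimization problem.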

G. Andeen and R. Kornbluh, "Design of Compliance in Robots," in the Proceedings of the IEEE International Conference on Robotics and Automation, Philadelphia, PA, April 1988 (pp. 276-281).

A somewhat related paper was by G.C.I. Lin (University of South Australia) on the development of a soft sensor for force and torque sensing. (This was the third paper by Lin's group.) The prototypical problem is to place a peg into a tight hole. Lin has designed a very simple wrist-mounted force/torque sensor and uses some electro-optical transducers as sensing elements. The software is more complicated, though, and uses a neural net to process the sensor light signals.

A mathematical paper, related to the above but including interesting numerical approximation ideas, was given by Duggal et al. (NTU) on process modeling and fault diagnosis using real-time state variable methods.

Another interesting paper from Singapore, although not well related to the topic of the conference, was Ngoi's, in which he discusses using color segmentation to help in recognizing 3-D objects. The application considered was the segmentation of surface-mounted devices (SMDs).

There was a set of papers introducing and illustrating the use of neural nets and fuzzy control. This is now well integrated into the practicing engineering community, and applications to such things as PCB visual inspection systems were presented.

An excellent survey paper was presented by K.B. Lim (NUS) on the Singaporean precision engineering and metalworking industries, which in 1989 had an output of US$6.6B, about US$4B. Lim gives a history of the industry and presents the difficulties and solutions encountered on the road to automation. Two conclusions are developed: that Singapore's Economic Development Board (EDB) is doing a good job monitoring and assisting these enterprises, and that the lack of skilled workers is the main impediment today. There were also other papers presented on specific applications to metal processing, steel and aluminum coils, etc.

There were too many Australian-based papers to discuss individually, but I note one by H. Thorne (University of Adelaide) on costing system problems at AWA Defense Industries, one of the largest defense systems and software engineering companies in Australia.

A speaker from the Singaporean subsidiary of the Japanese company Yamazaki Mazak spoke about the CIM factory his company recently built in Singapore, the first in southeast Asia. (Since 1983, this company has built and is using an unmanned plant in Japan and has other large plants in Japan, the U.S., and U.K.) It is clear that if Singapore can extract some of this technology to use in its own organizations, it will save years of relearning lessons already understood by established companies in Japan and the West.

Other Japanese contributions were modest and mostly from academia. One exception was M. Yoshitake, who is a senior consultant to the Automation Application Center, set up by the Singaporean government. Yoshitake discussed the use of a versatility index (VI) proposed in IA'90 by H. Makino and related to the number of model changes per year occurring in an assembly process. This seems like a simple idea, but Yoshitake has done a study of various assembly systems and applied the results to draw conclusions about PCB production. Some systems have VI values over 1000, meaning that models change more than once every two hours.
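The "more than once every two hours" figure is easy to check by back-of-envelope arithmetic. The ~2,000-hour working year below is my assumption, not a number from the talk.

```python
# Back-of-envelope check of the versatility-index claim above. The
# 2,000-hour working year is my assumption, not a figure from the talk.
WORKING_HOURS_PER_YEAR = 2000

def hours_between_changes(vi):
    """VI = model changes per year -> mean working hours between changes."""
    return WORKING_HOURS_PER_YEAR / vi

print(hours_between_changes(1000))  # 2.0: one model change every two hours
```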

Several papers were submitted by PRC scientists, but these people were unable to attend.

Loh and Nee (NUS) gave a paper about packing hexahedral boxes, a potentially interesting 3-D problem. Their method is layer by layer and totally heuristic, but it did give 65% packing efficiency, not a bad figure (this degrades rapidly as a function of the depth of the packing volume).
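As a hedged illustration of the layer-by-layer idea (this is my own 2-D "shelf" analogue, not Loh and Nee's 3-D method), a greedy packer can fill one layer at a time and report the resulting efficiency:

```python
# 2-D "shelf" analogue of layer-by-layer packing (my sketch, not Loh and
# Nee's algorithm): sort rectangles tallest first, fill shelves left to
# right, and start a new shelf when a piece no longer fits.

def shelf_pack(rects, bin_w, bin_h):
    """rects: list of (w, h). Returns placed (x, y, w, h) tuples and efficiency."""
    placed, x, shelf_y, shelf_h = [], 0.0, 0.0, 0.0
    for w, h in sorted(rects, key=lambda r: -r[1]):   # tallest first
        if x + w > bin_w:                 # piece does not fit: new shelf
            shelf_y += shelf_h
            x, shelf_h = 0.0, 0.0
        if shelf_y + h > bin_h:           # no vertical room left: skip it
            continue
        placed.append((x, shelf_y, w, h))
        x += w
        shelf_h = max(shelf_h, h)
    used = sum(w * h for _, _, w, h in placed)
    return placed, used / (bin_w * bin_h)

rects = [(4, 3), (3, 3), (5, 2), (2, 2), (6, 1), (3, 1)]
placed, eff = shelf_pack(rects, bin_w=10, bin_h=6)
print(f"packed {len(placed)} of {len(rects)} pieces, efficiency {eff:.0%}")
```

Even this crude heuristic reaches efficiencies in the range the authors report, which suggests why a fully heuristic 3-D method can be good enough in practice.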

From a cultural point of view, the most interesting paper was from two Pakistani scientists, discussing issues related to implementing CIM in their (Third World) country. They discussed justifying CIM to management, the relation of government to industry, and social, technical, and managerial issues. They made an excellent point, which applies equally well in the West, that "it is easier to educate (manufacturing) engineers about computers than computer professionals about manufacturing."

There was one (rather theoretical) paper on the topic of inspection from a Malaysian academic, but no indication was given that the ideas had been implemented in a real situation.

I would like to mention the excellent keynote papers. Because these were all given by Western researchers, I will not dwell on them, except for very brief remarks. G. Netzler gave an overview on the use of AGVs in factory automation, particularly a new laser guidance system that requires no floor marks (angles are measured between reflector tapes on the wall), thereby simplifying route changing, etc. This system is now in use in Singapore (in their very large port facility). Responding to questions, Netzler described vision systems as possible for the future but not yet ready for practical use. He also observed that an obvious future direction is to mount robots on AGVs to make them mobile. (Also see my comments about Loon's paper, above.)
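Why measuring angles between wall reflectors suffices for guidance can be sketched numerically. The code below is my own illustration, not the commercial system's algorithm: differences of bearings to known reflectors are independent of the vehicle's unknown heading, so position can be recovered by minimizing the mismatch, here with a deliberately crude grid search over an assumed reflector map.

```python
import math

# Position-from-angles sketch (my illustration, not the actual guidance
# algorithm): the vehicle measures bearings to reflectors at known map
# positions; bearing *differences* do not depend on the unknown heading.

REFLECTORS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # assumed reflector map

def bearing_diffs(x, y):
    """Bearing of each reflector relative to the first, wrapped to (-pi, pi]."""
    b = [math.atan2(ry - y, rx - x) for rx, ry in REFLECTORS]
    return [math.atan2(math.sin(bi - b[0]), math.cos(bi - b[0])) for bi in b[1:]]

def locate(measured, step=0.05, size=10.0):
    """Brute-force grid search for the position whose predicted diffs fit best."""
    best, best_err = (0.0, 0.0), float("inf")
    n = round(size / step)
    for i in range(n + 1):
        for j in range(n + 1):
            px, py = i * step, j * step
            err = sum(math.atan2(math.sin(m - p), math.cos(m - p)) ** 2
                      for m, p in zip(measured, bearing_diffs(px, py)))
            if err < best_err:
                best, best_err = (px, py), err
    return best

x, y = locate(bearing_diffs(3.0, 4.0))  # simulate measurements taken at (3, 4)
print(f"estimated position: ({x:.2f}, {y:.2f})")   # close to (3.00, 4.00)
```

A production system would of course solve the resection in closed form rather than by search; the sketch only shows that the wall angles alone carry enough information.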

F. Riley's (Bodine Corporation) keynote speech would have been perfect as a closing lecture rather than an opener. He emphasized that "manufacturing engineers must develop their proposals for automation in the light of strategic issues [rather than] reducing unit cost of production to the lowest level. Engineers [must] sell factory automation to the nontechnical management on the basis of its ability to meet customer expectations [and not] put such heavy reliance on direct labor reduction. This is far more important in the emerging countries of the Asian-Pacific area. Present management of countries in this area is not attuned psychologically to turning from their human resource management skills to automation. Engineers must become the instrument of change by changing corporate attitudes toward the usefulness and market benefits of successful automation.... Judge papers not solely in the light of their technological content, but how utilization might be sold within your own corporate structure not only as useful, but most importantly, as ultimately profitable."

SMALL COMMERCIALIZATION OF AUTOMATIC DIFFERENTIATION IN JAPAN

This paper reports on an example of a commercial automatic differentiation product. Automatic differentiation is distinguished from symbolic differentiation and numerical differentiation.

In numerical differentiation, a difference quotient (or a sophisticated variant) is evaluated to obtain an approximate numerical value for the derivative of a given function. Of course, the limit of such a quotient is just the definition of the derivative that is studied by all scientists. The approximation is just that, and its accuracy in principle depends only on the increment used to separate the function values (truncation error). In practice, the accuracy of the approximation also depends on characteristics of the computing environment on which the calculations are performed. Unless rational (exact) arithmetic is used, there will also be some numerical or roundoff error, and the combination of the two determines the accuracy of the final approximation to the derivative. But in any case, the result is an approximation. Within the context of engineering computations, numerical differentiation is the usual approach.
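The interplay of truncation and roundoff error just described can be seen directly: shrinking the increment h first reduces the error of a forward difference, then cancellation makes it grow again. A small sketch (the function and step sizes are my choices):

```python
import math

# Forward-difference error of (f(x+h) - f(x)) / h: truncation error shrinks
# like h/2 * f'', but below some h, floating-point cancellation dominates.

f, x, exact = math.sin, 1.0, math.cos(1.0)
errs = {}
for h in (1e-1, 1e-4, 1e-8, 1e-12):
    approx = (f(x + h) - f(x)) / h
    errs[h] = abs(approx - exact)
    print(f"h = {h:.0e}   error = {errs[h]:.2e}")
```

The error bottoms out near h ~ sqrt(machine epsilon) and then worsens, which is exactly why the result is always an approximation, never exact.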

by David K. Kahaner

Symbolic differentiation begins with an algebraic expression (formula) for the function and produces a formula for the derivative. The latter can be evaluated, and if the evaluation is exact, the value of the derivative will be too. Symbolic differentiation, as part of the more general field of symbolic computation, has grown in popularity over the past few years, and there are now well-known commercial products, such as Mathematica, that are available for use even on personal computers. The advantage of symbolic computation is that a formula can give insight that a number cannot. Disadvantages have been that not all functions are given by expressions (table lookup, multilevel subroutines, special functions of physics given by approximations, etc.) and that symbolic computation can be very expensive.

Automatic differentiation is a process that obtains numerical values without generating a formula for the derivative and without the truncation error of numerical differentiation. The only errors introduced are those associated with the use of real, as opposed to rational, arithmetic. Automatic differentiation requires the use of differentiation arithmetic, which is related to the interval arithmetic introduced by R.E. Moore [1] and further developed by L.B. Rall [2]. It depends on the arithmetic of ordered pairs, i.e., a set of rules for manipulation (+, -, *, /) similar to the ordered-pair rules for complex arithmetic, or the rules for rational arithmetic. Automatic differentiation might be considered as somewhere between symbolic and numerical differentiation. Its computational cost is greater than that of numerical differentiation, although it also produces a numerical value.
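The ordered-pair rules can be sketched with a minimal "dual number" class, each value carrying the pair (f, f') and the four operations propagating both components. This is a generic forward-mode sketch of differentiation arithmetic; the class and the example function are mine, not taken from Rall's texts.

```python
# Minimal differentiation arithmetic on ordered pairs (val, der). The four
# operations propagate the derivative exactly: no truncation error, only
# ordinary floating-point roundoff.

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._wrap(o)
        return Dual(self.val - o.val, self.der - o.der)
    def __mul__(self, o):
        o = self._wrap(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)   # product rule
    __rmul__ = __mul__
    def __truediv__(self, o):
        o = self._wrap(o)
        return Dual(self.val / o.val,
                    (self.der * o.val - self.val * o.der) / o.val ** 2)

x = Dual(3.0, 1.0)            # seed dx/dx = 1
y = (x * x + 2) / x           # f(x) = (x^2 + 2)/x = x + 2/x
print(y.val, y.der)           # f(3) = 11/3, f'(3) = 1 - 2/9 = 7/9
```

Note how the rules mirror complex arithmetic in form: each operation is defined componentwise on pairs, but with the product and quotient rules of calculus in the second slot.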

The process of automatic differentiation can be extended to higher order derivatives, Taylor coefficients, etc. Rall and Corliss have applied it to generate remainder terms for the errors associated with numerical evaluation of integrals with validation, and there is an IBM product associated with that work. It has also been used as part of optimization techniques. John Dennis (Rice) and Andreas Griewank (Argonne National Lab) are currently engaged in a project to do automatic differentiation in optimization problems. There was also an optimization language called PROSE developed in the 1960s, which used automatic differentiation.

It is clumsy to take advantage of automatic differentiation using traditional languages like Fortran or C. The main effort in the United States has been to use languages that have extensions for interval analysis. Languages that permit operator overloading and user-defined data types are better suited than Fortran for this purpose. The Pascal-XSC language has a complete set of routines for automatic differentiation as part of its support package. Automatic differentiation in Japan also has a respectable history; Professor Masao Iri and his colleagues at the University of Tokyo developed a Fortran precompiler some time ago.

In the United States, the main "market" for automatic differentiation has been other scientists who develop algorithms, rather than end users with direct engineering applications. The theory behind automatic differentiation is valid, but its proponents have had difficulty explaining it to other scientists and have not yet been able to produce a sufficiently convincing practical application. Last year, Rall claimed that "my main contribution to the subject was to put up with 25 years of indifference to it and its usefulness."

Recently I discovered an authentic application marketed by a small company in Japan.

Information and Mathematical
Science Laboratory, Inc. (IMS)
Ikebukuro Aoyagi Bldg., 2-43-1
Ikebukuro, Toshima-ku, Tokyo
171 Japan

Tel: +81 3 3590-5211;
Fax: +81 3 3590-5353

IMS was founded in 1974 to do software development, systems integration, and related consultation. Gross sales last year (1991) were 850M yen (about US$7.5M). The company develops and supports a collection of software that includes a Fortran static analyzer and checker, a Unix terminal emulator, and various engineering analysis packages, such as for bearing motion and planar/linear antennas.

IMS's president, Mr. Akira Isono, explained to me that he first learned about automatic differentiation by studying Rall's 1986 paper [3]. With the help of some university consultants he directed the writing (in Fortran) of an automatic differentiation package that is now marketed under the name Texpander. This corresponds very roughly to the tool set that is available with Pascal-XSC, although Texpander produces Fortran as output. Isono admitted that he has had difficulty getting users to buy it; the applicability of the concept is difficult for engineers to grasp. A user manual has several examples, such as the computation of Taylor coefficients for f(x)=sqrt(x), centered around a point "a". But perhaps this is a bit too esoteric for practical engineers. Also, the Fortran output of Texpander is five to six times larger than a comparable numerical differentiation program, and it runs more slowly.
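The manual's sqrt example can be illustrated with a simple recurrence for the Taylor coefficients of f(x) = sqrt(x) about a point a (my own sketch; I did not see Texpander's actual output): with c_k = f^(k)(a)/k!, one has c_{k+1} = c_k (0.5 - k) / ((k + 1) a).

```python
import math

# Taylor coefficients of sqrt(x) about a > 0 via the recurrence
# c_{k+1} = c_k * (0.5 - k) / ((k + 1) * a), with c_0 = sqrt(a).
# (My illustration of the manual's example, not Texpander's output.)

def sqrt_taylor_coeffs(a, n):
    c = [math.sqrt(a)]
    for k in range(n):
        c.append(c[k] * (0.5 - k) / ((k + 1) * a))
    return c

a, h = 4.0, 0.5
c = sqrt_taylor_coeffs(a, 8)
approx = sum(ck * h**k for k, ck in enumerate(c))
print(approx, math.sqrt(a + h))   # truncated series vs. sqrt(4.5)
```

A Taylor-arithmetic package produces such coefficient vectors for arbitrary composed programs, not just for functions whose coefficients have a closed-form recurrence like this one.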

One user applies Texpander to sensitivity analysis for structural engineering, and to some of the optical analysis associated with accelerator beam control. His current project is to enhance Planc to perform parameter optimization; this is similar to the general optimization problems considered by Dennis and Griewank mentioned above, but much more focused.

I was not able to examine the source program for Texpander or Planc. What impressed me was the concept of using automatic differentiation to help solve a specific engineering problem. Feedback from such efforts is very effective in determining the applicability of relatively theoretical concepts, and of course, in making improvements.

REFERENCES

1. R.E. Moore, Methods and Applications of Interval Analysis, SIAM Studies in Applied Mathematics 2 (Soc. Indust. Appl. Math., Philadelphia, 1979).

2. L.B. Rall, Automatic Differentiation: Techniques and Applications, Lecture Notes in Computer Science No. 120 (Springer-Verlag, Berlin-Heidelberg-New York, 1981).

3. L.B. Rall, "The Arithmetic of Differentiation," Mathematics Magazine 59(5), 275-282 (1986).
