strained design styles such as PLA's, gate arrays, standard cell arrays, and even standard floor plan chips. There is now a trend to mix these design styles on single chips, using automatic generators to produce customized PLA, register, RAM, ROM, and random-control logic macros [13]. Test pattern generation is another sophisticated synthesis problem. The most advanced methodologies use special design rules and additional hardware to subdivide the circuitry into manageable combinatorial sections, or to condense the results of long test sequences, or even to administer pseudo-random test patterns on the chip itself.

Such a composite system does not exist, of course, but each of its components does. Clearly, the development of a state-of-the-art automation capability for fast turnaround VLSI design is a very ambitious undertaking indeed.

PROBLEMS FOR THE FUTURE

Fortunately, there are still problems, or, rather, opportunities for creative work. How does one manage the complexity of VLSI design? What happens when computer runs exceed weeks? When tester times exceed hours?

The complexity of VLSI designs has grown to the extent that there are substantial doubts about the designers' ability to keep up with process capability. The implication is that future chips will be designed inefficiently in terms of silicon utilization or performance because of lack of time and design resources. The phrase "silicon is cheap" has always had a certain irony about it, but we may actually be coming to the point that silicon utilization is less important than design time.

While the problems are serious, they are not insurmountable. Clearly some very spectacular chips are being designed. 32-bit microprocessors such as the Intel iAPX 432, the Bell Laboratories BELLMAC-32, and Hewlett-Packard's 32-bit microprocessor chip set [17] are all near the limit of fabrication technology. There is no reason to expect the next generation of microprocessors to leave any unused silicon either. Even so, these projects are costly (50-100 person-years) and therefore rare. If VLSI were as simple to deal with as modules on wire-wrap boards, many more products would appear.

The problem of handling complexity has come up in other disciplines, notably software engineering, and a variety of promising techniques have been proposed. Prof. C. Sequin has a very interesting discussion of this subject elsewhere in this issue. One technique for dealing with complexity has been to use regular structures such as PLA's rather than try to squeeze out every square micrometer through local optimization. This approach, advocated by C. Mead of Caltech [18], has broad implications. How does one obtain a library of useful regular structures or macros to include in one's VLSI design? To be useful to someone other than the designer, a macro must be general, well documented, and configurable to other technology ground rules and to other system environments. Such macros would necessarily be encoded primarily as programs and only secondarily as pictures. This again is a feature of the Caltech approach. To be useful, each of these macro generation programs should be accompanied by a simulation model as well. All this implies a level of interface standardization which has yet to be achieved. Thus one challenge is the invention and development of commercially available VLSI macro generators and the creation of an environment to facilitate their transfer.
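The notion of a macro encoded primarily as a program rather than a picture can be illustrated with a minimal sketch (in Python, with purely illustrative layer names and ground-rule values; no particular design system's interface is implied):

```python
# Hypothetical sketch of a "macro generator": a parameterized program that
# emits geometry (rectangles on mask layers) for a whole family of cells,
# rather than one fixed picture. All names and rule values are illustrative.

def inverter_macro(n_fingers, poly_width=2, poly_space=3, diff_height=10):
    """Generate mask rectangles for an inverter with n parallel gate fingers.

    Each rectangle is (layer, x, y, width, height) in ground-rule units.
    Changing the rule parameters retargets the same macro to another process.
    """
    diff_width = n_fingers * (poly_width + poly_space) + poly_space
    shapes = [("diffusion", 0, 0, diff_width, diff_height)]
    for i in range(n_fingers):
        x = poly_space + i * (poly_width + poly_space)
        # Poly gates overhang the diffusion by 2 units on each side.
        shapes.append(("poly", x, -2, poly_width, diff_height + 4))
    return shapes

# One source program yields a family of layouts:
small = inverter_macro(1)
wide = inverter_macro(4)
```

Because the geometry is computed from parameters, the same macro can be reconfigured for different ground rules or drive strengths by changing arguments rather than redrawing pictures, which is precisely what makes a program a more durable encoding than a static layout.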

A closely related challenge has to do with interactive graphics. We need to develop graphic techniques for specifying not only pictures, but families of pictures with given relationships among their components. Procedural design or algorithmic macro generation is inherently a problem of expressing shapes and their relationships, yet we must still use programming languages which are patterned on speech, rather than use the seemingly more natural medium of interactive graphics. Why can these programs not be specified by diagrams which express the number of repetitions of a shape in two dimensions, the required clearances and overlaps of related shapes, the fact that some can be extended as necessary, and so on? We can generate families of pictures from programs; how can we generate programs from pictures?

Reusing standard macros is one way to deal with design complexity. Another is to automate the design process so that the designer deals only with high-level entities and the machine handles all the details of converting and optimizing the design. In layout, as was previously mentioned, there are automatic design algorithms for gate arrays and standard cells. For such chips the time spent in logic design far exceeds the time spent in layout. There is a need for automated techniques for converting high-level functional descriptions to lower level logic suitable for implementation. This logic synthesis task has always been thought of as impractical for large networks, but recent progress in optimization by local transforms [19] holds out the promise of a solution. The generation of functional chips from high-level functional specifications, whether for gate arrays with unit logic or for standard floor-plan MOS microprocessors, would be a true "silicon compiler" and a worthwhile goal.
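As a toy illustration of optimization by local transforms in the spirit of [19], consider rewriting a small Boolean network by repeatedly applying pattern-matching rules; the expression representation and the particular rule set below are assumptions made for the sketch, not the method of [19] itself:

```python
# Illustrative sketch of logic optimization by local transformations:
# rewrite small local patterns in an expression tree until none applies.

def simplify(node):
    """Apply local rewrites bottom-up to an expression tree.

    A node is a variable name (str) or a tuple ("NOT", x) / ("AND", x, y).
    Rules: NOT(NOT(x)) -> x, AND(x, x) -> x, AND(x, 0) -> 0, AND(x, 1) -> x.
    """
    if isinstance(node, str):
        return node
    op, *args = node
    args = [simplify(a) for a in args]       # simplify subtrees first
    if op == "NOT" and isinstance(args[0], tuple) and args[0][0] == "NOT":
        return args[0][1]                    # double negation
    if op == "AND":
        a, b = args
        if a == b:
            return a                         # idempotence
        if "0" in (a, b):
            return "0"                       # annihilator
        if a == "1":
            return b                         # identity
        if b == "1":
            return a
    return (op, *args)

# NOT(NOT(AND(a, a))) reduces to just a:
result = simplify(("NOT", ("NOT", ("AND", "a", "a"))))
```

Each rule is sound on its own, so the network's function is preserved no matter how the rules interleave; scaling this idea to large gate networks is what the local-transformation approach contributes.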

The issue of simulation and test pattern generation run times is still a very real one. Despite the advances in static verification and other proofs of correctness, there is no better way to verify the initial specifications of a system than through real-time emulation or simulation. The designer often does not understand all the capabilities of a structure which he creates. A period of "playing around" with the design is required. Simulations of VLSI systems running even trivial test programs are almost prohibitively expensive. A potential solution is the hardware simulation engine: a large array of processors and memories tied together with a high-speed communications switching network. It can handle the number-crunching simulation operation at speeds thousands of times greater than a standard serial computer. These engines might have been included earlier in this article as part of the state of the art, but there are still too few of them in use, and their effectiveness in a production environment is undocumented. The simulation problem remains a major challenge.
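The kind of gate-level computation such an engine parallelizes can be sketched serially. The minimal event-driven simulator below (the netlist format is an assumption of the sketch) re-evaluates only the gates affected by a change; distributing this inner loop across many processors is the engine's whole purpose:

```python
# A minimal event-driven gate-level simulator of the kind a hardware
# simulation engine accelerates: only gates whose inputs changed are
# re-evaluated, and changes propagate through an event queue.

from collections import deque

def simulate(gates, values, changed):
    """gates: output net -> (op, tuple of input nets); values: net -> 0/1.

    'changed' seeds the event queue; returns the settled net values.
    """
    ops = {"AND": lambda a, b: a & b,
           "OR": lambda a, b: a | b,
           "NOT": lambda a: 1 - a}
    queue = deque(changed)
    while queue:
        net = queue.popleft()
        # Re-evaluate only the gates driven by the changed net.
        for out, (op, ins) in gates.items():
            if net in ins:
                new = ops[op](*(values[i] for i in ins))
                if new != values[out]:       # schedule fanout only on change
                    values[out] = new
                    queue.append(out)
    return values

netlist = {"n1": ("AND", ("a", "b")), "y": ("NOT", ("n1",))}
v = simulate(netlist, {"a": 1, "b": 1, "n1": 0, "y": 1}, ["a"])
# With a = b = 1, n1 settles to 1 and y to NOT(1) = 0.
```

A serial machine walks this queue one event at a time; the engine's processor array evaluates many independent events concurrently, which is where the speedups of three orders of magnitude come from.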

Test pattern generation speed can also be significantly enhanced by using the same or similar engines. However, there is also the problem of applying the tests in fabrication. This is still a sequential process, carried out by expensive test equipment. One way to cut down both test pattern generation time and testing expense is to have the VLSI chip carry its own built-in tester. While self-test and other hardware-assisted testing techniques impose penalties on silicon utilization, the tradeoff appears favorable. In any case, if there are any fears about designers' ability to use everything the process people can provide, this added testing requirement should allay them.
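The on-chip self-test idea mentioned above (pseudo-random patterns plus condensed results) can be sketched as follows; the LFSR polynomial, word widths, and the circuits under test are purely illustrative, not any production BIST scheme:

```python
# Sketch of built-in self-test: a linear-feedback shift register (LFSR)
# supplies pseudo-random patterns, and the responses are condensed into a
# short signature that is compared against a known-good value.

def lfsr(seed, taps=(3, 2), width=4):
    """Generate successive states of a 4-bit LFSR (illustrative taps)."""
    state = seed
    while True:
        yield state
        bit = 0
        for t in taps:                       # feedback = XOR of tap bits
            bit ^= (state >> t) & 1
        state = ((state << 1) | bit) & ((1 << width) - 1)

def signature(circuit, n_patterns, seed=0b1001):
    """Apply pseudo-random patterns and fold the responses into one word."""
    gen = lfsr(seed)
    sig = 0
    for _ in range(n_patterns):
        pattern = next(gen)
        sig = ((sig << 1) ^ circuit(pattern)) & 0xFFFF   # response compaction
    return sig

good = signature(lambda x: x ^ (x >> 1), 15)
bad = signature(lambda x: x ^ (x >> 2), 15)   # a "faulty" circuit
# Differing signatures expose the fault without storing every response.
```

Only the seed, the pattern count, and the reference signature need to be stored, which is why this style of testing trades a little silicon for large savings in test generation and tester time.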

The most exciting challenge of VLSI design is in the area of applications. There is enough capability today, both in technology and in design techniques, to create radically new electronic systems. In the 1950's computer experts were fond of speculating on the structure of the brain, on robots, and on automatic language translation. Then the IC revolution occurred, and most practical people turned to remapping Von Neumann's computer from one technology to the next.

Some of these questions are being revisited today. Indeed, the logic simulation engine discussed earlier is an example of a step in this direction. It uses the power of many concurrent processors to model the concurrent events in a digital system. The recognition and translation of speech are also composed of many inherently concurrent activities. The efficient searching of a data base is another example of inherently concurrent processing.

The technology exists to produce vast arrays of processing and memory elements. What is not clear is how to have them communicate with each other. The interconnect capability of integrated circuits is hopelessly outclassed by that of biological systems. The easiest arrays to build have interconnections only among nearest neighbors. When it is necessary for each processor to be able to communicate with any other, as it is in the logic simulation engine, the communication network quickly becomes a bottleneck.
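A back-of-envelope calculation shows why nearest-neighbor arrays strain under global communication. In the sketch below (illustrative, not drawn from any particular machine), the average message distance in an n x n mesh grows linearly with n, while each processor keeps only four links:

```python
# Why nearest-neighbor meshes bottleneck on global traffic: the average
# hop count between processors grows with the edge length of the array.

def mesh_hops(src, dst):
    """Manhattan distance between two (row, col) mesh positions."""
    return abs(src[0] - dst[0]) + abs(src[1] - dst[1])

def average_hops(n):
    """Mean hop count over all ordered processor pairs in an n x n mesh."""
    cells = [(r, c) for r in range(n) for c in range(n)]
    total = sum(mesh_hops(a, b) for a in cells for b in cells)
    return total / (len(cells) ** 2)

# The average distance grows linearly with the array edge (roughly 2n/3),
# while each processor still has only four links to carry the traffic:
results = {n: average_hops(n) for n in (2, 4, 8)}
```

When every processor must reach every other, as in the logic simulation engine, this growing distance (and the shared links along the way) is exactly the bottleneck described above.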

Design automation can only play a supporting role in the process of creating these new concurrent systems. Improvements in logic description languages and in simulation techniques will help researchers to study the properties of alternative architectures. On the other hand, these unconventional new VLSI systems will have profound effects on design automation techniques. Programming general-purpose multiprocessor computer systems will require new techniques, but the resulting code should execute thousands of times faster than on uniprocessors. Compilers may begin to understand subsets of natural language. Spoken input and output may develop into an important medium of communication between man and machine. Design automation will be transformed by the VLSI products which it will have helped create.

ACKNOWLEDGMENT

The author would like to acknowledge the valuable advice and help of J. Werner of VLSI Design, B. Lee of Calma Co., and, especially, of Dr. R. Russo of IBM Corp.

REFERENCES

[1] R. N. Noyce, "Microelectronics," Sci. Amer., vol. 237, no. 3, p. 65, Sept. 1977.

[2] J. S. Koford, P. R. Strickland, G. A. Sporzynski, and E. M. Hubacher, "Using a graphic data processing system to design artwork for manufacturing hybrid integrated circuits," in Proc. Fall Joint Computer Conf., 1966, pp. 229-246.

[3] W. M. van Cleemput, Computer Aided Design of Digital Systems—A Bibliography. Woodland Hills, CA: Computer Sci. Press, 1976, pp. 231-236.

[4] R. B. Walford, "The LAMP system," in Proc. Workshop on Fault Detection and Diagnosis in Digital Circuits and Systems (Lehigh Univ., Bethlehem, PA, Dec. 1970), p. 10.

[5] P. N. Agnew and M. Kelly, "The VMS algorithm," Tech. Rep. TR01.1338, IBM System Products Div. Lab., Endicott, NY, 1970.

[6] C. Y. Lee, "An algorithm for path connections and its applications," IRE Trans. Electron. Comput., vol. EC-10, pp. 346-365, Sept. 1961.

[7] E. F. Moore, "Shortest path through a maze," in Annals of the Computation Laboratory of Harvard University, vol. 30. Cambridge, MA: Harvard Univ. Press, 1959, pp. 285-292.

[8] A. Hashimoto and J. Stevens, "Wire routing by optimizing channel assignment within large apertures," in Proc. 8th Design Automation Conf. (Atlantic City, NJ, June 1971), pp. 155-169.

[9] M. Hanan and J. M. Kurtzberg, "Placement techniques," in Design Automation of Digital Systems, M. A. Breuer, Ed. Englewood Cliffs, NJ: Prentice-Hall, 1972, pp. 213-282.

[10] K. H. Khokhani and A. M. Patel, "The chip layout problem: A placement procedure for LSI," in Proc. 14th Annual Design Automation Conf. (New Orleans, LA, 1977) pp. 291-297.

[11] K. A. Chen, M. Feuer, K. H. Khokhani, N. Nan, and S. Schmidt, "The chip layout problem: An automatic wiring procedure," in Proc. 14th Annual Design Automation Conf. (New Orleans, LA, 1977), pp. 298-302.

[12] W. R. Heller, W. F. Mikhail, and W. E. Donath, "Prediction of wiring space requirements for LSI," in Proc. 14th Annual Design Automation Conf. (New Orleans, LA, 1977), pp. 32-42.

[13] R. Donze, J. Sanders, M. Jenkins, and G. Sporzynski, "PHILO—A VLSI design system," in Proc. 19th Annual Design Automation Conf. (Las Vegas, NV, June 1982), pp. 163-169.

[14] J. P. Roth, "Diagnosis of automata failures: A calculus and a method," IBM J. Res. Develop., vol. 10, pp. 278-291, July 1966.

[15] E. B. Eichelberger and T. W. Williams, "A logic design structure for LSI testability," in Proc. 14th Annual Design Automation Conf. (New Orleans, LA, June 1977), pp. 462-468.

[16] L. A. O'Neill et al., "Designer's workbench—Efficient and economical design aids," in Proc. 16th Annual Design Automation Conf. (June 1979), pp. 185-199.

[17] J. W. Beyers et al., in ISSCC Dig. Tech. Papers, pp. 128-129, Feb. 1982.

[18] C. Mead and L. Conway, Introduction to VLSI Systems. Reading, MA: Addison-Wesley, 1980.

[19] J. A. Darringer, W. H. Joyner, L. Berman, and L. Trevillyan, "Logic synthesis through local transformations," IBM J. Res. Develop., vol. 25, no. 4, pp. 272-279, July 1981.

[Copyright © 1980 Jurimetrics Journal, reprinted with permission]

INTELLECTUAL PROPERTY PROTECTION AND INTEGRATED CIRCUIT MASKS

John Craig Oxman

I. INTRODUCTION

The foundation of the electronic industry in the United States today lies in the production of integrated circuits.1 Developed in the late 1950s,2 the integrated circuit (IC)3 has come to dominate all but a few esoteric applications in electronics, and it can safely be said that ICs are responsible for the widespread availability and low cost of products and circuit functions which only a few years ago would have been prohibitively costly or technically infeasible.4

ICs are made today by a process substantially similar to the IsoPlanar process developed by Fairchild Semiconductor as an extension

Copyright 1979. All rights reserved.

1This is apparent to anyone who reads industry publications, including ELECTRONIC DESIGN, the IEEE SPECTRUM, and THE JOURNAL OF ELECTRON DEVICES. For a good introductory discussion, see Microelectronics, SCIENTIFIC AMERICAN, Sept. 1977.

The total sales of electric and electronic equipment were $73.9 billion in 1976. U.S. DEP'T OF COMMERCE, ANNUAL SURVEY OF MANUFACTURES 1976, at 4 (1978). Integrated circuits alone accounted for $2.5 billion of this, or 3.4%. ICs now account for 25% of all electronic component shipments, and by 1983 will account for 35% of all such shipments. U.S. DEP'T OF COMMERCE, 1979 U.S. INDUSTRIAL OUTLOOK 282-84 (1978).

2The integrated circuit was invented in 1958 by Jack Kilby of Texas Instruments, Inc. See WHO'S WHO IN AMERICA 1978.

3The term "integrated circuit" denotes a collection of electronic parts including, but not limited to, transistors, resistors, capacitors, controlled rectifiers, diodes, etc., which are physically all on one, or perhaps more, semiconductor substrates and which together perform a circuit function, such as amplification.

4For example, today one thinks nothing of buying for $5 a four-function electronic calculator made by any of a myriad of manufacturers. In April, 1971, only one company (Sharp) marketed such calculators, which then cost $345. DUN'S REV., Sept. 1972, at 89.

of the original planar transistor process.5 This process involves the use of "masks" which, when employed as described below, define patterns on the IC. It is the interrelationship of the superimposed patterns of several masks which enables a circuit to be created. The optimal juxtaposition of these patterns is crucial both to the performance of the circuit and to the ultimate economy of the IC. As a result, IC makers invest substantially in the development of optimal masks.

Originally, ICs were relatively small and unsophisticated, containing perhaps a half dozen transistors and a few resistors. The effort invested in the creation of a set of masks for such a circuit, while not trivial, was not so substantial or difficult that a prospective manufacturer of the circuit would not make a mask set suited to his own production process. However, times have changed. Today, it is not unusual for manufacturers to vend ICs with more than 50,000 transistors on the chip.7 The level of complexity is so great that it has now become far less economical for manufacturers to create their own mask sets.

In many cases, manufacturers will make or want to make ICs which are directly interchangeable with counterparts manufactured by competitors. There are many reasons for this. First, many of the interchangeable ICs are commonly used circuit functions which a manufacturer of a full line of ICs would want to produce.8 Second, in other cases the interchangeable IC may be part of a system of ICs designed to be used together.9 A manufacturer in this way has immediate access to a market of system users interested in less costly components. It can also ease a manufacturer's entry into the "system" market, allowing him to test the water without having to take the plunge. Third, the manufacturer may have several improvements in mind which would make his IC superior to all other interchangeable counterparts. Since

5J. A. Hoerni, Planar Silicon Transistors and Diodes, IRE Electron Devices Meeting, Washington, D.C. (1960). Hoerni, a Fairchild employee, assigned his patent No. 3,025,589 on the planar process to Fairchild.

6The transistor is the basic gain element of solid-state electronics. Today the term transistor can mean any of several structures, such as bipolar, field effect, or insulated gate field effect, to name a few. Because a transistor can be used both as an amplifier and as a switch, it is used in both analog and digital circuits.

7For example, a 64K static random access memory. See Microelectronics, SCIENTIFIC AMERICAN, Sept. 1977.

8Customers, like most humans, tend to be lazy. If they can satisfy all their needs from one supplier, they will do so, even if another supplier offers a slightly lower price. This is not only due to laziness but to lack of inexpensive information (which itself is a function of small price differentials), desire to maintain goodwill, and past success with the IC maker's products.

9E.g., microprocessor systems. These systems generally consist of chip "sets" which include a μP (microprocessor) chip, an I/O (input/output) chip, memory chips, and other support chips. The system chips form a computer when properly interconnected.
