
That, free traders say, should be a warning to those in Congress who want to wield the big stick of government retaliation in the computer chip battles with Japan.

CHIPS: A GLOSSARY OF TERMS

Silicon: the hard, gray, lightweight material from which chips are made. Wafers of silicon are "doped" with impurities in selected places to change electrical properties and affect the path of the current. Lithography is used to imprint tiny wires, or circuits, on a chip's silicon layers.

Transistor: an electrical switch in a chip that can be turned on and off in a controlled way to store or process data.

Integrated circuit: a combination of transistors. The latest generation contains as many as 100,000.

Memory: a chip that stores information.

Microprocessor: a chip that performs some of the same tasks as a computer; the "brain," or control, in hundreds of pieces of equipment, from car engines to computers.

Microcode: a software program that is the permanent set of instructions on a microprocessor chip.

Bit: A single "on" or "off" signal, a single piece of electronic code. It takes several bits together to represent one letter, punctuation mark or numeral.

C. BY GERALD J. MOSSINGHOFF

U.S. DEPARTMENT OF COMMERCE,
PATENT AND TRADEMARK OFFICE,
Washington, D.C., November 23, 1983.

Hon. ROBERT W. KASTENMEIER,
Chairman, Subcommittee on Courts, Civil Liberties and the Administration of Justice, Committee on the Judiciary, House of Representatives, Washington, D.C.

DEAR MR. CHAIRMAN: I have been following with great interest efforts to develop an appropriate form of protection for semiconductor chip designs (H.R. 1028). Aware that your Subcommittee may hold hearings in the near future, I wanted to report to you the Administration's position on this important subject.

As you know, the Cabinet Council on Commerce and Trade established a Working Group on Intellectual Property to develop policy options on a number of important intellectual property issues. Recognizing the importance of the semiconductor industry to the U.S. economy, the Cabinet Council on Commerce and Trade directed the Working Group to consider the need to protect semiconductor chip designs. It found that while the United States dominates this important market, it faces a serious challenge from foreign competition. It also found that the R&D costs for a single complex chip could reach $4 million, while the costs of copying such a chip could be less than $100,000. This constitutes a significant disincentive for creators to invest in this technology.

There are no effective legal means of stopping the copying of chips under existing United States laws. While a patent would protect against the manufacture, use and sale of the electronic circuitry embodied in a semiconductor chip, the circuits actually placed on chips frequently do not satisfy the patentability requirements of being "new, useful and unobvious."

On the basis of these considerations, the Cabinet Council on Commerce and Trade recommended that the Administration endorse protection for the creators of this valuable technology. Specifically, the Cabinet Council on Commerce and Trade recommended the prompt enactment of legislation protecting semiconductor chip designs and that such legislation have the following characteristics:

(a) It should accord prompt, inexpensive protection to original semiconductor chip designs through a registration system without substantive examination.

(b) The protection should grant to the owner of the chip design the exclusive right to copy, for commercial purposes, the chip design, or chip embodied in that design, as well as the exclusive right to distribute such a chip.

(c) The protection should have a relatively short term, e.g., ten years.

(d) As an exception to the exclusive rights, there should be an express right to reverse engineer for the purpose of teaching, analyzing or evaluating the concepts or techniques embodied in the design of the semiconductor chip.

(e) Unless there are overriding circumstances to the contrary, the protection should be prospective from the current time.

The prompt enactment of legislation along these lines would materially assist U.S. industry by providing protection for this valuable and important new technology. I would be pleased to discuss the recommendations of the Cabinet Council on Commerce and Trade in greater detail with you or your staff and to assist the Subcommittee in any way I can.

Sincerely,

GERALD J. MOSSINGHOFF, Commissioner of Patents and Trademarks.

APPENDIX 3.-SUPPLEMENTAL ARTICLES

[Copyright © 1983 IEEE; reprinted with permission]

VLSI DESIGN AUTOMATION: AN INTRODUCTION

(By Michael Feuer)

INVITED PAPER

(Abstract. This paper is a brief introduction to the automation of the design of very-large-scale integrated circuits (VLSI). The field of design automation has grown so large in the last twenty years that a complete treatment would require an encyclopedia. What follows, therefore, is only a sketch of the history, state of the art, and current key problems of the automation of VLSI design.)

HISTORY

The history of anything to do with VLSI is almost a contradiction in terms. Until recently, VLSI had always been thought of in the future tense. Integrated circuits (IC's), medium-scale integration (MSI), and large-scale integration (LSI) are historical terms, but not VLSI. Only with the advent of microprocessors with some half-million transistors on a chip has there been a grudging acceptance that VLSI may indeed have arrived. These acronymic labels are always applied after the fact, but VLSI was resisted longer than most. Extrapolating from the fact that early IC's contained several logic gates, MSI tens, and LSI hundreds, we might expect VLSI circuits to contain thousands of gates. By the same reasoning, today's 32-bit microprocessors would be examples of ULSI (the U for ultra). Maybe we are running out of acronyms and need to conserve. In any case, for this article, a chip with several thousand logic gates or more qualifies as a VLSI chip.

During the 1950's, Texas Instruments, Fairchild Semiconductor, and others developed the photolithographic process for the fabrication of transistors on crystalline silicon. The steps involved in the design of early IC's are still qualitatively the same today. The first step is the definition and optimization of the process by which the devices and interconnections are to be fabricated. The second is the electrical characterization of the circuit elements. These two steps together are sometimes known as technology definition. Third, the user of the technology generates a design (circuit or logic schematic) to be implemented. Fourth, this logical design is reduced to a series of geometric patterns through which materials are to be added or subtracted in the fabrication of the circuit. Finally, a set of test input signal patterns and responses is generated to detect fabrication defects. Testing is an integral feature of IC manufacture because a significant percentage of chips come off the line with at least one defect. These defects are detected by applying the test patterns to the chip inputs and comparing the output signals to those expected. Defective chips are discarded.

In the 1960's, these five steps were largely manual. Process parameters, such as diffusion temperatures, times, and pressures, and metal line widths and spacings were worked out primarily through trial and error. Yields and electrical properties of the resulting devices were monitored. The process was characterized by a set of electrical and physical design rules for the user of the technology. For digital circuits, the switching characteristics were boiled down to rising and falling delays, fan-out rules, and the like. Physical design rules prescribed widths, spacings, and overlaps required to achieve acceptable yields.

The engineer-user would supply a circuit or logic schematic sketched on a piece of (yellow) paper. The correctness of the circuit could be verified by implementing the same circuit in discrete components ("breadboarding"). An expert layout designer then drew the mask patterns necessary to implement the circuit. The drawings were transferred to a red plastic material called rubylith which was cut away according to the drawing. This step was verified by a careful, independent visual inspection ("eyeballing"). The rubylith pattern was optically reduced to form photolithographic masks. Testing was a manufacturing function. For small circuits, exhaustive functional testing was possible and ac characteristics could be measured.

As time progressed, the number of devices per chip started to double every year (Moore's law [1]). This increased mask complexity, and in the early 1970's the rubylith patterns began to outgrow the space on laboratory floors. By the late 1960's this method began to give way to numerically controlled optical pattern generating machines. These required digitally encoded geometric patterns, and the layouts were transferred to data tapes by tracing over them with electromechanical digitizers. With the patterns now accessible to computer processing, the visual inspection could be enhanced with design rule checking (DRC) programs which detected shorts and spacing violations. Another advantage was that corrections to the drawing could be made much more easily than to the rubylith cutouts.
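To illustrate the kind of checking such DRC programs performed, here is a minimal Python sketch that flags overlaps (shorts) and minimum-spacing violations between axis-aligned rectangles. The rectangles, the MIN_SPACING value, and all names are invented for the example and correspond to no real process design rules.

```python
import math
from itertools import combinations

MIN_SPACING = 3.0  # assumed minimum spacing in layout grid units (illustrative)

def gap(a, b):
    """Smallest distance between two axis-aligned rectangles (x1, y1, x2, y2).
    Zero means the rectangles touch or overlap."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    dx = max(bx1 - ax2, ax1 - bx2, 0)   # horizontal separation, 0 if overlapping
    dy = max(by1 - ay2, ay1 - by2, 0)   # vertical separation, 0 if overlapping
    return math.hypot(dx, dy)

def check_spacing(rects):
    """Flag overlapping shapes (shorts) and minimum-spacing violations."""
    errors = []
    for (i, a), (j, b) in combinations(enumerate(rects), 2):
        d = gap(a, b)
        if d == 0:
            errors.append((i, j, "overlap"))
        elif d < MIN_SPACING:
            errors.append((i, j, "spacing %.1f < %.1f" % (d, MIN_SPACING)))
    return errors

# Three metal shapes: the first two are too close, the third is far enough away.
print(check_spacing([(0, 0, 10, 2), (0, 4, 10, 6), (0, 10, 10, 12)]))
```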

The next step was to display the patterns on a CRT screen, and interactive graphic layout was born, an activity almost synonymous with computer-aided design (CAD) for many years. Commercial turnkey graphics systems began to appear in the early 1970's, although large companies developed in-house systems earlier [2]. The power of interactive graphics was most evident for repetitive patterns such as memory arrays or gate arrays, where a set of geometric data called a cell could be replicated thousands of times in different positions and orientations on the array without having to be redrawn.
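The cell-replication idea can be sketched in a few lines of Python: one set of geometric data is placed many times with different offsets and orientations. The cell outline, spacings, and placements below are invented for illustration only.

```python
def rotate90(point, times):
    """Rotate a point about the origin by 90 degrees, `times` times."""
    x, y = point
    for _ in range(times % 4):
        x, y = -y, x
    return x, y

def place_cell(cell, dx, dy, orientation=0):
    """Return the cell's points rotated and translated to one array position."""
    placed = []
    for p in cell:
        x, y = rotate90(p, orientation)
        placed.append((x + dx, y + dy))
    return placed

nand_cell = [(0, 0), (4, 0), (4, 6), (0, 6)]       # one invented cell outline
# Replicate the same cell over a 2 x 3 array, rotating alternate rows by 180 degrees.
array = [place_cell(nand_cell, col * 5, row * 7, orientation=2 * (row % 2))
         for row in range(2) for col in range(3)]
print(len(array), "cell instances")
```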

As the density of IC's increased, the need for circuit simulation programs became critical. Discrete circuits could be probed and monitored at all nodes, but IC's were inaccessible inside the chip. The only way to tell what they were doing internally was through circuit simulation and through effects accessible at output pins. A series of programs was developed in the decade from the mid-1960's to the mid-1970's: CIRCAL, SCEPTRE, ECAP, ASTAP, SPICE, and others [3]. A byproduct of circuit simulation was the availability of the circuit schematic in machine readable form. This network information was entered on punched cards, then through alphanumeric terminals, and lately as drawings on interactive graphics equipment. The network information made possible not only simulation, but also automatic verification that the layout interconnections indeed matched those of the input network. Because it was impossible to modify a chip to correct a design error, it became important to verify the correctness of the design prior to releasing the chip to manufacturing. Since the simulation of the full analog behavior of large digital circuits became prohibitively expensive, logic simulation with discrete Boolean values became the dominant software verification tool. Switching-level or gate-level simulators evolved through a series of stages ([4] and [5]) until event-driven simulators capable of handling unique delays for several thousand logic blocks became standard tools.
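As a rough illustration of how an event-driven gate-level simulator of this kind operates, the following Python sketch schedules signal changes as timed events and re-evaluates only the gates affected by each change. The netlist, gate delays, and stimuli are invented for the example and are not taken from the paper or any real simulator.

```python
import heapq

# Two-input gate evaluators; the netlist and delays below are invented.
GATES = {"AND": lambda a, b: a & b,
         "OR":  lambda a, b: a | b,
         "NAND": lambda a, b: 1 - (a & b)}

# output net -> (gate type, input nets, gate delay)
NETLIST = {
    "n1":  ("AND",  ("a", "b"), 2),
    "n2":  ("OR",   ("b", "c"), 1),
    "out": ("NAND", ("n1", "n2"), 3),
}

def fanout(net):
    """Output nets of all gates that use `net` as an input."""
    return [out for out, (_, ins, _) in NETLIST.items() if net in ins]

def simulate(stimuli, end_time=20):
    """stimuli: list of (time, net, value) events. Returns final net values."""
    values = {n: 0 for n in ("a", "b", "c", *NETLIST)}
    events = list(stimuli)
    heapq.heapify(events)
    while events:
        t, net, val = heapq.heappop(events)
        if t > end_time or values[net] == val:
            continue                          # only real value changes are events
        values[net] = val
        for out in fanout(net):               # re-evaluate only the affected gates
            kind, ins, delay = NETLIST[out]
            new = GATES[kind](*(values[i] for i in ins))
            heapq.heappush(events, (t + delay, out, new))
    return values

print(simulate([(0, "a", 1), (0, "b", 1), (5, "c", 1)]))
```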

The automation of the layout function began with techniques borrowed from printed circuit board design. Routing algorithms based on work by Lee [6] and Moore [7] were available for finding paths for metal interconnections between pins of logic functions on the chip. A distinction can be made between this sort of automatic design activity and the verification mentioned above: one is synthesis and the other analysis. To facilitate layout, certain constrained design styles such as gate arrays and standard cell arrays were developed in the late 1960's. These led to the invention of the channel router of Hashimoto and Stevens [8], an algorithm unique to IC's. Over the years, routing has become one of the richest areas in design automation in terms of available techniques, and algorithms have been developed to handle the interconnection problem in almost all conceivable situations.
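Maze routers in the Lee tradition are essentially breadth-first wave expansions over a grid followed by a back-trace. The Python sketch below shows the idea on a small invented grid with a few blocked cells; it illustrates the technique only and is not a reconstruction of any production router.

```python
from collections import deque

def lee_route(width, height, blocked, source, target):
    """Return a shortest grid path from source to target avoiding blocked cells,
    or None if no path exists."""
    dist = {source: 0}
    frontier = deque([source])
    while frontier:                                   # wave expansion phase
        x, y = frontier.popleft()
        if (x, y) == target:
            break
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < width and 0 <= ny < height
                    and (nx, ny) not in blocked and (nx, ny) not in dist):
                dist[(nx, ny)] = dist[(x, y)] + 1
                frontier.append((nx, ny))
    if target not in dist:
        return None
    path = [target]                                   # back-trace by descending labels
    while path[-1] != source:
        x, y = path[-1]
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if dist.get(nb) == dist[path[-1]] - 1:
                path.append(nb)
                break
    return list(reversed(path))

# Route around an invented vertical obstacle.
print(lee_route(8, 5, blocked={(3, 1), (3, 2), (3, 3)}, source=(1, 2), target=(6, 2)))
```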

The regularity of standard cells and gate arrays also facilitated the development of automatic placement algorithms of very high quality [9]. The standardization of the size and shape of the units of logic made the placement task more tractable than that of modules on printed circuit boards. Automatic placement and routing together formed a complete automatic layout system [10], [11].
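A flavor of such placement algorithms can be given with a deliberately simplified pairwise-interchange sketch: cells in a single row are swapped whenever a swap shortens the total wiring. The cells, nets, and cost measure below are invented and far cruder than the algorithms cited.

```python
from itertools import combinations

cells = ["A", "B", "C", "D", "E"]                          # one row of equal-width cells
nets = [("A", "D"), ("B", "E"), ("C", "D"), ("A", "B")]    # invented two-pin nets

def wire_length(order):
    """Total wire length, with the slot index standing in for the x coordinate."""
    pos = {cell: i for i, cell in enumerate(order)}
    return sum(abs(pos[u] - pos[v]) for u, v in nets)

def improve(order):
    """Greedy pairwise interchange: accept any swap that shortens the wiring."""
    order = list(order)
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(len(order)), 2):
            before = wire_length(order)
            order[i], order[j] = order[j], order[i]
            if wire_length(order) < before:
                improved = True                            # keep the better arrangement
            else:
                order[i], order[j] = order[j], order[i]    # undo the swap
    return order

best = improve(cells)
print(best, wire_length(best))
```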

The gate array, or masterslice, was recognized by the systems manufacturers, notably IBM, as a design style which reduced design time while still providing reasonable silicon area utilization compared to free-form layout. It became very important to understand how much routing space was required on a gate array to ensure the automatic layout of almost all designs using the array. Too much routing space reduced the gate count, while too little led to low utilization of available gates. This need led to theoretical work on routing space estimation which found substantial usage and payoff [12].

For designs consisting of large functional units of different internal structure, tools were developed for the automatic generation of PLA macros, register stacks, memory macros, and bit sliced data flow macros [13].

Test generation also soon outgrew the capabilities of manufacturing organizations. Exhaustive tests based on the input-output specifications of the circuit require an astronomical amount of time even for moderately large IC's. An exhaustive test requires that all possible input patterns be applied for each internal state of the circuit. For a static (dc) test this number is two raised to the number of primary inputs times two raised to the number of internal latches. Even for an early microprocessor, the Intel 8080, an exhaustive test set would contain over 10 patterns; at 1 µs per input pattern, the test time would be more than 10 years!
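The pattern-count argument can be worked through directly (the exponents in the figures quoted above appear to have been lost in reproduction). The short Python calculation below uses assumed input and latch counts, not the actual Intel 8080 figures, together with the 1-µs-per-pattern rate mentioned in the text.

```python
inputs = 30           # assumed number of primary inputs (illustrative)
latches = 40          # assumed number of internal storage elements (illustrative)
rate_hz = 1_000_000   # one pattern per microsecond, as in the text

patterns = 2 ** inputs * 2 ** latches          # 2^(inputs + latches) static patterns
seconds = patterns / rate_hz
years = seconds / (3600 * 24 * 365)
print("%.3e patterns -> about %.1e years of tester time" % (patterns, years))
```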

One solution was to save the simulation patterns used to verify the logic design and to apply them during test. Unfortunately, this functional testing did not provide a high level of confidence that other valid input patterns would not uncover defects missed by the test. To estimate this risk, researchers studied the circuit structure and classified the likely local faults. One model, appealing because of its mathematical tractability if nothing else, was the single-stuck fault model. With a fault dictionary it was possible to include fault grading into simulation to compute the number of faults which would be uncovered by a set of patterns. The designer could also see which faults would have been missed and could add more patterns to find them. With the single-stuck fault model, test patterns could be automatically generated for combinational unit logic using methods such as Roth's celebrated D-algorithm [14].
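To make the fault-grading idea concrete, the following Python sketch simulates a tiny two-gate circuit once fault-free and once per injected single-stuck fault, then reports which faults a candidate pattern set detects. The circuit, patterns, and fault list are invented for illustration and are far simpler than real fault simulators.

```python
NETS = ["a", "b", "c", "n1", "out"]

def evaluate(a, b, c, fault=None):
    """Evaluate out = (a AND b) OR c; `fault` is (net, stuck_value) or None."""
    v = {"a": a, "b": b, "c": c}
    def force(net):                       # apply the stuck-at fault, if any
        if fault and fault[0] == net:
            v[net] = fault[1]
    for net in ("a", "b", "c"):
        force(net)
    v["n1"] = v["a"] & v["b"];   force("n1")     # AND gate
    v["out"] = v["n1"] | v["c"]; force("out")    # OR gate
    return v["out"]

patterns = [(1, 1, 0), (0, 1, 0), (0, 0, 1)]     # candidate test set (invented)
faults = [(net, val) for net in NETS for val in (0, 1)]
detected = {f for f in faults
            for p in patterns if evaluate(*p) != evaluate(*p, fault=f)}
print("coverage: %d/%d single-stuck faults" % (len(detected), len(faults)))
for f in sorted(set(faults) - detected):
    print("missed:", f)                  # the designer would add a pattern for this
```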

Extensions of automatic test pattern generation algorithms to sequential circuits met with only limited success up to about 5000 equivalent gates, and it became obvious that the test pattern generators would need more assistance from the logic designers. At least in the case of the large systems manufacturers, special circuitry was added to the chips to increase the ease of generating and applying tests. The best known of these is IBM's Level Sensitive Scan Design (LSSD) [15]. Today testability is recognized as one of the key responsibilities of the logic designer. An untestable design, even if otherwise correct, is worthless.
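The scan idea behind LSSD-style designs can be sketched as follows: in test mode the internal latches form one long shift register, so a pattern can be shifted in, the combinational response captured, and the result shifted out. The Python sketch below illustrates only this mechanism with an invented "logic" function; it is not a model of LSSD itself.

```python
class ScanChain:
    """Internal latches reconfigured as a serial shift register in test mode."""
    def __init__(self, length):
        self.latches = [0] * length

    def shift_in(self, bits):
        """Shift a test pattern serially into the chain (test mode)."""
        for b in bits:
            self.latches = [b] + self.latches[:-1]

    def capture(self, combinational_logic):
        """One functional clock: latch the logic's response to the scanned state."""
        self.latches = combinational_logic(self.latches)

    def shift_out(self):
        """Shift the captured response back out for comparison with expectations."""
        return list(self.latches)

# Usage: scan in a pattern, capture the response of some next-state logic,
# and scan out the result. The "logic" here just inverts every bit (invented).
chain = ScanChain(4)
chain.shift_in([1, 0, 1, 1])
chain.capture(lambda state: [1 - s for s in state])
print(chain.shift_out())
```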

STATE OF THE ART

The status of VLSI design automation is particularly difficult to assess because so much of it is carried on inside large electronics companies on a proprietary basis. Most of these activities are reported in the literature, but, since the systems themselves remain inaccessible, others are forced to develop their own tools or to turn either to university sources or to the relatively small vendor design automation industry. This makes for a very uneven state of the art.

VLSI design practices vary from the fully integrated highly automatic gate array design systems of the large systems manufacturers to the computer-assisted largely manual methodologies of the designers of high-density custom MOS microprocessors. The following is a composite state-of-the-art design system:

Hardware

A design automation facility usually consists of a family of interactive terminals attached to each other and to a host mainframe computer by a communications network. Alphanumeric terminals are sufficient for messages, status reporting, and job control. A low-cost graphics terminal for logic entry is desirable in each engineer's office. For layout, a high-function color system is most efficient. The advent of inexpensive VLSI memories and microprocessors is revolutionizing the interactive graphics business. The trend has been to supply more and more processing power and memory at the terminals or work stations. The mainframe computer is reserved for long-running jobs such as simulation, test pattern generation, or design rule checking and for maintenance of the central data base. A high-speed plotter is useful for displaying the finished artwork.

Control and Release System

This is software to track design status, to coordinate the contributions of many designers, to control engineering changes and other levels of design, to ensure that updates do not invalidate previous verification steps, and to prepare data in standard form for manufacturing. Data integrity is the key to success in VLSI design. Not only is the number of devices per design staggering, but the design automation process itself produces volumes of intermediate data which must be controlled.

Multimode Hierarchical Data Base

This is not a data base in the usual sense of small interactive transactions. The data needed for automatic processing are rather large, specially organized files. These files are related to each other in at least three ways. The first was already mentioned: they may describe different versions or levels of the same thing. The second is that they may describe a different aspect or mode of the same entity. Thus a shifter can have a symbolic form for documentation, a behavioral simulation model, another model for test pattern generation, an outline shape for floor planning purposes, a symbolic track description for automatic routing, detailed polygon mask shapes, and "fractured" rectangle shapes for pattern generation. The data base must maintain consistency among these data modes. These modes contribute to the volume of intermediate data mentioned earlier. The third relationship is hierarchy. The same shifter behavioral model can have an expansion to behavioral models of interconnected latches, which, in turn, can be expanded to simulation models of unit logic elements and, finally, to individual transistors. The associated shapes will display a similar hierarchical structure. In a large systems environment, the hierarchy will extend to all packaging levels as well as the chip. The data base must allow for this multiple nesting of design entities. The trend toward relational data base organization (e.g., Mentor Graphics, Portland, OR) also deserves mention. The advantages claimed are simplicity of use and ease of reorganization for future enhancements without invalidating existing programs. The traditional disadvantage of poor performance seems to be yielding to improved software and hardware techniques.
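A minimal Python sketch of the multimode, hierarchical organization described above might look like the following, where each design entity carries several views of the same object plus a list of children. The class and field names, and the shifter example data, are invented for illustration and do not describe any real design automation data base.

```python
from dataclasses import dataclass, field

@dataclass
class DesignEntity:
    name: str
    modes: dict = field(default_factory=dict)       # view name -> data for that view
    children: list = field(default_factory=list)    # hierarchical expansion

    def find_mode(self, view):
        """Return this entity's data for one view, e.g. 'outline' or 'behavior'."""
        return self.modes.get(view)

latch = DesignEntity("latch",
                     modes={"behavior": "D latch model",
                            "shapes": [(0, 0, 4, 6)]})
shifter = DesignEntity("shifter",
                       modes={"behavior": "8-bit shift register model",
                              "outline": (0, 0, 40, 6)},
                       children=[latch] * 8)        # expands into 8 latches

print(shifter.find_mode("outline"), len(shifter.children), "children")
```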

Unified Interactive User Interface

Any large design system must incorporate tools from various sources. It is important, however, that the user be presented with a consistent, well-designed view of the system. Nomenclature, menu layout, message style, and job submission commands should be consistent. The Bell Laboratories Designer's Workbench is an example of such a system [16]. Redundant data entry should be minimized. Errors, especially simple syntactic errors, should be trapped by the system in real time. Even better is a system to guide the user by presenting only options which cannot produce trivial errors.

Automated Verification

With VLSI this is the key function which a design automation system performs: the avoidance of errors. The beginning of the design process currently is the specification of external system behavior. The verification of system specifications is accomplished through design reviews, emulation on existing hardware, and simulation using general-purpose or specially written simulation systems. The state of the art here is understandably rather uneven. The next phase is the design of the system in terms of functional components. For computer systems, these might be ALU's, PLA's, registers, and busses. The verification of this design is usually done using simulators which contain behavioral models for these functional components. The results are examined for consistency with the system specifications. This comparison is typically not automatic because of the lack of precision of the usual specifications. At this point, the designer should also have a plan for partitioning and packaging the system. On single-chip systems, this is the so-called floor plan. Tools are under development to estimate the shape, area, power consumption, pin requirements, and routability of the partitioned subfunctions, but the verification of the feasibility of a partition or floor plan still depends largely on human judgement. The ensuing refinement steps of detailed logic design can all be verified automatically against the next higher level of design. Static verification of logical equivalence and static timing analysis can take the place of simulation. Where simulation is desired, a mixed-mode simulator capable of combining behavioral, unit logic and possibly switch level, and analog circuit level models is ideal.
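Static timing analysis, mentioned above as an alternative to simulation, can be sketched as a longest-path computation over gate delays with no input patterns at all. The Python sketch below uses an invented three-gate netlist and delays; it shows the idea only, not any particular timing analyzer.

```python
from functools import lru_cache

# output net -> (driving input nets, gate delay); primary inputs have no entry.
# The netlist and delay values are invented for illustration.
NETLIST = {
    "n1":  (("a", "b"), 2.0),
    "n2":  (("b", "c"), 1.5),
    "out": (("n1", "n2"), 3.0),
}

@lru_cache(maxsize=None)
def arrival(net):
    """Latest signal arrival time at `net`, assuming primary inputs arrive at t = 0."""
    if net not in NETLIST:
        return 0.0                      # primary input
    inputs, delay = NETLIST[net]
    return max(arrival(i) for i in inputs) + delay

for net in NETLIST:
    print(net, arrival(net))            # the critical path ends at the largest value
```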

Layout verification consists of a comparison with the logic and a check of internal consistency. In a hierarchical system, each level of the layout hierarchy can be checked for spacing violations with the boundaries specified at the next higher level. However, at the lowest levels of design, the verification that a given mask geometry will produce the desired analog devices, and that these, in turn, will perform the desired digital functions is only partly automated today. The usual practice is to limit the design to a specified library of basic structures, to analyze these exhaustively using device analysis and circuit simulation programs, and to generate the appropriate digital models.
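The "comparison with the logic" step can be sketched as reducing both the schematic and the connectivity extracted from the layout to a canonical set of gate records and diffing them. Real systems compare the two as graphs rather than relying on matching net names, so the Python sketch below, with its invented netlists, is only the simplest version of the idea.

```python
def canonical(netlist):
    """netlist: iterable of (gate_type, input_nets, output_net) -> comparable set."""
    return {(kind, tuple(sorted(ins)), out) for kind, ins, out in netlist}

# Invented example: the extracted layout names the output net differently.
schematic = [("NAND", ("a", "b"), "n1"), ("NOR", ("n1", "c"), "out")]
extracted = [("NAND", ("b", "a"), "n1"), ("NOR", ("n1", "c"), "y")]

missing = canonical(schematic) - canonical(extracted)
extra = canonical(extracted) - canonical(schematic)
print("in schematic but not layout:", missing)
print("in layout but not schematic:", extra)
```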

Automated Design

Modern design automation systems provide powerful tools for the synthesis of VLSI circuits. Logic entry is necessarily an interactive task. It is supported by intelligent graphic engineering workstations. The automatic generation of detailed unit logic from register transfer logic has met with practical success. PLA minimization programs are in common use. Layout is either computer assisted on high-function color graphic workstations for free-form designs, or highly automated for more con
