
NATIONAL SEMICONDUCTOR,
August 19, 1983.

Hon. ROBERT KASTENMEIER,

Chairman, Subcommittee on Courts, Civil Liberties and Administration of Justice, Rayburn House Office Building, Washington, D.C.

DEAR CONGRESSMAN KASTENMEIER: National Semiconductor Corporation wishes to place the following statement on record in regard to the proposed Semiconductor Chip Protection Act of 1983.

National Semiconductor Corporation supports the proposed Act as set forth in the document of February 24, 1983, before the House of Representatives (copy enclosed), subject to the following provisos:

1. National agrees with the widely held position that legitimate reverse engineering is not to be prohibited. In furtherance thereof, we submit that specific language making this clear should be included in the proposed Act.

2. The proposed Act should also include language setting forth a simple and rapid procedure for establishing that legitimate reverse engineering has been undertaken. This would allow the parties to avoid drawn-out legal proceedings involving large amounts of time and expense. No injunction could issue in favor of the copyright holder during a reasonable period of time allowed for proof of legitimate reverse engineering in accordance with that procedure.

3. The effective date of the Act should remain ninety days after enactment of the Act.

4. With reference to Sec. 9(2) of the proposed Act, in the case of masks made in or imported into the United States before the effective date of the Act, replacement of such masks should be allowed.

While National supports the proposed Act in accordance with the above, there is some concern as to whether such an Act, if passed, would provide a value in protection that is worth the burden placed on parties in documenting legitimate reverse engineering.

Yours very truly,

JOHN R. FINCH,

Vice President and General Manager, Semiconductor Division.

NATIONAL ASSOCIATION OF MANUFACTURERS,
April 26, 1984.

Hon. PETER W. RODINO,
Chairman, House Committee on the Judiciary,
House of Representatives, Washington, D.C.

DEAR MR. CHAIRMAN: We note the recent reporting of the Semiconductor Chip Protection Act, H.R. 1028 as amended by the Subcommittee on Courts, Civil Liberties and the Administration of Justice. The swift action by the Subcommittee is an encouraging indication that this legislation can be enacted in this session of Congress.

NAM believes that this addition to U.S. intellectual property law, although unusual in terms of traditional concepts of what is copyrightable, will provide U.S. semiconductor companies with much-needed protection against unauthorized copying of semiconductor designs and masks (glass plates that incorporate circuit patterns).

Mr. Ralph Thomas, Senior Vice President, American Electronics Association, recently noted that this legislation "will enable U.S. semiconductor manufacturers to remain competitive in an increasingly combative world marketplace. [The legislation] provides incentives for these firms to invest in vital research and development programs and eliminate the unfair advantage presently available to those who would pirate and subsequently copy semiconductor designs."

We agree: the threat of pirating of semiconductor chip designs is a deterrent to innovation in semiconductor products. We believe that this legislation can help U.S. manufacturers maintain our technological edge.

Sincerely,

RICHARD SEIBERT.

APPENDIX 2.-ADDITIONAL MATERIALS SUBMITTED BY WITNESSES

A. (BY HON. DON EDWARDS)

[Copyright © 1982 National Geographic Society. Reprinted with permission, National Geographic magazine]

ELECTRONIC MINI-MARVEL THAT IS CHANGING YOUR LIFE-THE CHIP

(By Allen A. Boraiko)

It seems trifling, barely the size of a newborn's thumbnail and little thicker. The puff of air that extinguishes a candle would send it flying. In bright light it shimmers, but only with the fleeting iridescence of a soap bubble. It has a backbone of silicon, an ingredient of common beach sand, yet is less durable than a fragile glass sea sponge, largely made of the same material.

Still, less tangible things have given their names to an age, and the silver-gray fleck of silicon called the chip has ample power to create a new one. At its simplest the chip is electronic circuitry: Patterned in and on its silicon base are minuscule switches, joined by "wires" etched from exquisitely thin films of metal. Under a microscope the chip's intricate terrain often looks uncannily like the streets, plazas, and buildings of a great metropolis, viewed from miles up.

Even more incongruous, a silicon flake a quarter inch on a side can hold a million electronic components, ten times more than 30-ton ENIAC, the world's first electronic digital computer. ENIAC was dedicated in 1946, the ancestor of today's computers that calculate and store information, using memory and logic chips. But ENIAC's most spectacular successor is the microprocessor-a "computer on a chip." This prodigy is 30,000 times as cheap as ENIAC, draws the power of a night-light instead of a hundred lighthouses, and in some versions performs a million calculations a second, 200 times as many as ENIAC ever could.

The chip would be extraordinary enough if it were only low-cost, compact electronics, but its ability to embody logic and memory also gives it the essence of human intellect. So, like the mind, the chip has virtually infinite application-and much the same potential to alter life fundamentally.

A microprocessor, for example, can endow a machine with decision-making ability, memory for instructions, and self-adjusting controls. In cash registers the miniature computer on a chip totals bills, posts sales, and updates inventories. In pacemakers it times heartbeats. It sets thermostats, tunes radios, pumps gas, controls car engines. Robots rely on it; so do scientific instruments such as gene synthesizers. Rather than simply slave harder than humans, machines can now work nearly as flexibly and intelligently, to begin priming a surge in productivity we may one day recall as the second industrial revolution.

The chip's condensed brainpower nourishes another phenomenon-personal computers. Last year more than 800,000 were sold, most to people who do not know how these first cousins of the pocket calculator work, nor need to know, because the chip makes them increasingly easy to use.

Piggybacking on personal computers are dozens of new services. Exotic now, computer conveniences such as electronic mail and newspapers and home banking and shopping could in time become as universal as telephone service.

Questions arise. If we can screen out all but the news that interests us most, will we grow parochial? If we shop and pay bills from home and carry less cash, will streets be safer? Must employees who work at home with company computers be electronically monitored? Will children stimulated by computers grow up to find effective cures for poverty, hunger, and war?

These questions were unimaginable in 1959, birth year of the chip, but in a decade they may be current. That would be no surprise, so broadly and swiftly has the chip penetrated our lives.

Recently I spent months gauging the progress and impact of the chip. In laboratories, scientists showed me that the chip, though complex, is understandable. At home a personal computer alternately enraged and enlightened me. And I learned that the chip's every advance incubates another, and that one another and another. Eventually one billion transistors, or electronic switches, may crowd a single chip, 1,000 times more than possible today. A memory chip of such complexity could store the text of 200 long novels.

Chips refrigerated in ultracold liquid helium make feasible a supercomputer vastly more powerful than any yet built, with a central core as compact as a grapefruit.

Naval scientists envision semi-intelligent and autonomous robots that can pilot ships to evade enemy fire as well as rescue sailors and recover sensitive code books from sunken submarines.

Borrowing techniques from drug manufacturers, chemists hope to grow, not build, future computer circuits.

Farfetched? Then consider these coming innovations in light of some breakthroughs already achieved.

Unperfected but promising microelectronics implanted beneath the scalp can restore very rudimentary sight and hearing to some of the blind and deaf.

Robots that see, feel, and make simple judgments are entering factories, where less capable robots have been "reproducing" themselves for some time.

Within limits, computers can talk, heed our speech, or read. Some diagnose illness, model molecules, or prospect minerals with the reasoning and success of expert human doctors, chemists, and geologists.

The shock waves of the microelectronics explosion expand too far, in too many directions, to tally them all. But a few of the deeper tremors, recorded here, yield a sort of seismic profile of what lies beneath and beyond this first instant in the age of the chip.

"Wish we'd had this chip when we were designing it." Dana Seccombe taps the tiny device in the palm of his hand as tenderly as if it were a rare seed, germ of some plant bred to fruit with money. Just so for his employer, the Hewlett-Packard Company, propagator of computers, calculators, and other electronic cash crops.

Dana, head of chip design at an HP plant in Colorado, passes me the chip. It's a microprocessor and quite a handful, so to speak: 450,000 transistors, laced together by 20 meters of vapor-deposited tungsten "wire." Mapping every street and freeway of Los Angeles on the head of a pin would be an equivalent feat-and no harder. That is, in fact, the gist of Dana's complaint.

Every year for more than two decades now, engineers have roughly doubled the number of components on a chip, mainly by shrinking them. They began with soldered wires as thin as cat whiskers. These projected from silicon or germanium crystals sealed in pea-size metal cans. What resembled a three-legged stool was actually a simple electronic switch-a transistor.

The transistor was invented in 1947 at Bell Telephone Laboratories to replace the bulky glass tubes that controlled and amplified electric currents in early TVs and computers such as ENIAC. These vacuum tubes were energy hungry, gave off far more heat than transistors, and frequently burned out.

But the transistor too had a flaw. It often broke off circuit boards, plastic cards embossed with flat, snakelike wires. The remedy, hit on independently by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor: Make the crystal in a transistor serve as its own circuit board. When the snake ate its tail, the integrated circuit-since dubbed the chip-was born.

Today engineers call it the crude oil of electronics, attesting that world dominance in technology rests substantially on the chip. It has strategic virtues indeed. The chip lacks soldered wires, reducing failure points and making it ultrareliable. (A vacuum-tube computer as complex as Hewlett-Packard's microprocessor would fail in seconds.) Since the chip is tiny, electrical signals take short paths from switch to switch, saving time. Further, a chip carrying 1,000 transistors does more work, faster, than one with ten-at about the same cost.

Lured by this fairy-tale performance and economy, engineers raced to jam transistors on the chip: 5,000 produce a digital watch; 20,000 a pocket calculator; 100,000 a small computer equal to older ones as large as rooms. At 100,000 transistors, you enter "very large-scale integration," or VLSI. The chip, engineers joke, comes in grades like olives-large, jumbo, and colossal.

Contemplating the Hewlett-Packard chip-colossal grade-Dana says that to grasp its complexity I must scan its floor plan. He unfurls a roll of drafting paper. Four by eight feet, shingled edge to edge with thousands of squares and rectangles neatly inked in brown and black and green and blue, it's but one section of the chip. "How wide a section, Dana?"

He thinks in microns; one equals thirty-nine millionths of an inch. "Fifteen hundred microns." That's the width of 20 hairs from my head; to spread out the rest of the chip's design would take a gymnasium.
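As a quick check of that arithmetic, a few lines of Python used purely as a calculator (the 75-micron hair diameter is an assumed typical figure, not a number from the article):

```python
# Checking the article's micron arithmetic.
# The 75-micron hair diameter is an assumed typical value, not from the text.
micron_in_inches = 39e-6                 # one micron = thirty-nine millionths of an inch
section_microns = 1500                   # the width Dana quotes

print(section_microns * micron_in_inches)   # ~0.0585 inch across
print(section_microns / 75)                 # ~20 hair widths, as the author says
```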

Dana traces a red line from a black square to a green rectangle, symbols denoting transistors and their precisely mated connections. "It takes 100 calculations to position one of these rectangles properly. We mapped two million of them," he adds. Not so odd, his wish for the computing power of a new chip even while still designing it.

Indirectly but obligingly, the chip goes to its own rescue in the guise of computer-aided design, or CAD. A computer built of earlier chips can store diagrams of transistors, rules on how to link them, and data on the intended function of new chips, information that enables the computer to design a chip circuit, display it on a screen, simulate its operation, and report its performance.

Besides plotting transistors, computers also route the interconnections among them. But no computer can yet calculate, in reasonable time, the optimum way to wire a VLSI chip: Possible wiring patterns number in the millions, so complex have chip designs become. Humans must still tediously debug them-hunt for errors, or bugs-and with video screens and attached electronic pens reroute connections or regroup transistors like building blocks.

By 1990 ambitious engineers expect to squeeze ten million transistors on the chip, enlarging it slightly and making it as complex as a city nearly 1,000 miles square. How do you build a megalopolis almost twice the size of Alaska?

Manufacturing any chip is a painstaking, protracted process. Just south of San Francisco Bay, at the Intel Corporation in Silicon Valley, I found that it can take as long as three months to make a microprocessor (see the article about Silicon Valley beginning on page 459).

"Some magic's involved," engineer Ralph Leftwich said as I pulled on a baggy white nylon jump suit, cap, and bootees. Voila! I was a conjurer's illusion in my bunny suit, required fashion in the "clean rooms" where Intel pioneered the microprocessor in 1971 and where filtered air holds fewer than 100 particles of dust or other contaminants per cubic foot. To a microscopic chip circuit, motes are as menacing as boulders.

In one clean room, trays held razor-thin silicon wafers, polished mirror smooth and racked like diminutive records. They were slices of a sausagelike crystal grown from molten silicon so pure that if contaminants were redheads, there would be but 15 of them on earth. Such crystals yield wafers as large as five inches across; each wafer becomes the base of hundreds of chips.

Two things make silicon, a semiconductor, the favored material for chips. Its ability to carry electricity can be precisely altered by ingraining its crystal structure with measured amounts of chemical impurities, or dopants. And silicon surfaces can be conveniently oxidized-rusted, in effect-into an electrically insulating glaze.

"Chips are sandwiches," Ralph said as I peered at a silvery oxidized wafer. He explained that techniques reminiscent of silk screening would stack and stencil the wafer with layers of insulation and crystal, the crystal doped with infinitesimal pockets of impurities laid out in some 300 identical chip-scale circuit patterns (pages 426-7).

"The impurities form conducting areas that overlap from top to bottom of a wafer. By etching 'windows' between them, we create transistors." At the end, with as many as 12 detailed levels demanding interconnection, a wafer receives an aluminum coating and a final etch that leaves conducting filaments invisible to the naked eye.

A new chip's ultrafine "wiring" offers so little entrée to its transistors that they defy individual quality testing. But their collective performance is judged as needlelike probes jab at metal pads on the rim of each chip on a wafer, running 10,000 electrical checks a second. Sound chips are diced from wafers by a diamond saw, then bonded and wired to gold frames and sealed in small ceramic cases propped on stubby plug-in prongs. Packaged, a wafer's worth of chips looks like a swarm of black caterpillars.

This electronic species shelters by the dozens in a personal computer, and in their cocoon they might metamorphose into a journalist's tool as useful as pen or notebook.

So I fancy at home one day, unpacking a personal computer the size of a portable typewriter. And "floppy discs": plastic platters about the diameter of 45-rpm records. Like cassette tapes, they're invisibly patterned with magnetic fields representing information. To make the computer receptive, there's a master disc. A shoebox-shaped "disc drive" that I hook to the computer sends information back and forth between the disc and the computer's chips.

"Slip disc into drive," directs a manual. "Turn on power." The drive purrs, spinning the disc. It stops. Atop the computer, in the upper left of a TV screen-another attachment-there now hovers a small square of light. It blinks. That's all.

Minutes pass. "How's it going?" calls my wife from another room. Flustered, I tell her truthfully: "Nothing to it!"

That maddening, flashing marker on the screen insists on action, so I yank the computer's plug. A sullen scan of the manual discloses what's really needed: a concise chain of instructions-a program-telling the computer what to do, step by step. In my knotted brain a light goes on, followed by another on the screen.

Prompted by the blinking marker, or cursor, I type a practice game program on the computer's keyboard. Now the machine should display a dot, bouncing like a ball back and forth across the screen.

It beeps instead, heralding an error. I give the computer a very personal command not in any manual, then begin debugging.

"Choose a starting position for dot" is up on the screen; good. So are the commands "if dot on screen, plot new dot position and erase old position." About two dozen other instructions look fine. Wait. I forgot to type: "Move dot again." Short one step of logic in its program, the computer simply quit. As might a dim-witted cook given a recipe that fails to instruct: "Bake cake in 350° oven for 50 minutes."
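The bouncing-dot exercise can be sketched in a few lines of modern Python (the author's machine would likely have run BASIC; the 40-column screen, pacing, and names here are illustrative assumptions, not his original program):

```python
# A minimal sketch of the bouncing-dot program the article describes.
# The 40-column "screen" width and the timing are illustrative assumptions.
import time

WIDTH = 40         # assumed screen width in character cells
position = 0       # choose a starting position for dot
direction = 1      # +1 moves right, -1 moves left

for _ in range(200):
    # Plot the new dot position and erase the old one by redrawing the line.
    line = " " * position + "*" + " " * (WIDTH - position)
    print("\r" + line, end="", flush=True)
    position += direction          # the step the author forgot: move dot again
    if position == 0 or position == WIDTH:
        direction = -direction     # bounce at either edge
    time.sleep(0.05)
print()
```

Without the `position += direction` line, the loop redraws the same dot forever -- exactly the stall the author describes.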

Frustrated and chastened by this machine that demands finicky precision, I can see why last year business and government paid an estimated four billion dollars for ready-made computer programs, or "software." Why by 1990 we may need 1.5 million programmers-more than three times as many as today-to write instructions for computers that issue paychecks, run factories, and target nuclear missiles. And why hundreds of programmers need months to debug 500,000 commands for flight computers aboard the space shuttle.

Fortunately, falling prices for personal computers help swell a rising tide of off-the-shelf programs that make the machines "user friendly." Once only an electronics hobbyist could master a personal computer-by building it. But as the chip reshapes computers into consumer items-some desk-top models cost no more than TV sets, pocket computers even less-they must be simple enough for anyone to use.

To budget money, for example. One program instantly shows a home buyer how changing interest rates affect house payments. Or savings. Programs teach everything from arithmetic to zoology. Game programs-pinball and chess and monster mazes-may number in the thousands.

With a printer and a word-processing program, the computer I used to write this article shifts, copies, or erases a word, line, paragraph, or page of text, to print cleanly edited manuscripts or letters. It also keeps files and corrects mispellings. Misspellings. Misspellings.

It's the nature of computers, of course, to do these things electronically, by switching, storing, and transforming pulses of electricity. But humans can't understand electrical signals; computers comprehend nothing else.

Yet we do communicate with computers-by translating our numbers, letters, and symbols into a code of electrical pulses. In computers, by custom, a high-voltage electrical pulse represents the digit 1; a low-voltage signal stands for 0. Because this system is binary (it contains only two digits), the electrical pulses in a computer are called bits, from binary digits.

Electrical pulses representing two digits may seem a thin resource for expression, but Lincoln's eloquent Gettysburg Address was telegraphed across Civil War America with only a dot and a dash, the "bits" of Morse code. Similarly, ones and zeros can encode numbers, an alphabet, or even the information in photographs and music.

Many computers, including most personal ones, digest information in chains of eight electrical pulses. These pulse strings-called bytes-shuttle through a computer's chips something like trains in a railroad switchyard. Since a byte consists of eight bits that may stand for either 1 or 0, the "cars" in one of these "trains" can be arranged in 256 (2⁸) different ways. That's more than enough combinations to represent uniquely each letter, number, and punctuation mark needed for this article. Or to write the instructions enabling a computer to express and print it.
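A short Python illustration of that byte arithmetic, with today's character codes standing in for the pulse trains the article describes:

```python
# A byte is eight bits, each 1 or 0, so it can be arranged in
# 2**8 = 256 distinct ways -- enough to give every letter, digit,
# and punctuation mark its own code.
assert 2 ** 8 == 256

# ord() gives a character's numeric code; format() writes it as the
# eight high/low "pulses" a computer would shuttle between its chips.
for ch in "chip":
    print(ch, format(ord(ch), "08b"))
```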

To carry out instructions, a computer depends on its central processor; in personal computers this "brain" is a single chip-a microprocessor. If you scanned this silicon sliver by microscope, you would notice what might be railroad tracks. These conduct "1" and "0" electrical pulses, passing through the chip at nearly the speed of light.

Alone, a microprocessor cannot hold all the data it needs and creates when working. So memory chips help out. Magnified, they show transistors in intersecting rows and columns, recalling a city street map. This grid allows the microprocessor to assign a byte a unique "address" for instant storage and recall. Most often, a memory chip permits bytes to be retrieved individually, like the numbers in a telephone book. Some such random-access memory chips, or RAMs, can store the equivalent of four copies of the Declaration of Independence.
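A toy Python model of that street-map addressing; the 256-by-256 grid and the helper names are illustrative assumptions, not any real chip's layout:

```python
# Row-and-column addressing in a memory grid, modeled as a dictionary.
# The 256x256 size is an assumed example, giving 65,536 one-byte locations.
ROWS, COLS = 256, 256

def decode(address):
    """Split a linear address into the row and column a RAM grid would select."""
    return divmod(address, COLS)   # (row, column)

cells = {}                         # stands in for the transistor grid

def write(address, value):
    cells[decode(address)] = value

def read(address):
    # Random access: any address is reached directly, like looking up a
    # number in a telephone book rather than scanning a tape end to end.
    return cells.get(decode(address), 0)

write(1000, 65)
print(decode(1000))                # (3, 232): row 3, column 232
print(read(1000))                  # 65
```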

For Japan, the chip itself is a declaration of independence. In recent years Japanese electronics firms have adopted and refined U.S. technology to win a global lead
