
"front ends" to provide remote users with point-and-click access to information stored on their servers, as well as access through "links" to information stored on other remote servers. Web "browsers" are programs that run on a personal computer or workstation and enable a user to establish connections to these graphical front ends and to view, retrieve and manipulate data provided by those remote servers. Examples of popular, currently available Web browsers include: Mosaic, from the National Center for Supercomputing Applications; Netscape Navigator, from Netscape Communications Corporation; and Enhanced Mosaic, from Spyglass, Inc. Web browsers typically provide support for electronic mail, gopher and ftp sessions, and, most importantly, support retrieval and display of a much broader variety of information (e.g., text, audio, image and multimedia data).

At the root of the Web are several of the established protocols (e.g., gopher, ftp, various e-mail standards) and three new protocols: the Hypertext Markup Language (HTML), a file format for embedding navigational information in graphical and text-based documents; the Hypertext Transfer Protocol (HTTP), a communications protocol for communicating navigational information and other data between the remote server and the requesting computer; and the Uniform Resource Locator (URL) scheme for identifying the location (e.g., the location of the remote server and the location on that server of the file corresponding to the URL) of Web-accessible documents. A number of organizations and groups are also working to develop additional protocols to enable secure communications. Some of these protocols have been published as draft specifications at this point, including the Secure Sockets Layer (SSL), the Secure Hypertext Transfer Protocol (SHTTP) and the Enhanced Mosaic Security Framework. The integration of these various protocols into a single, easy-to-use, understandable interface has led to a tremendous increase in the popularity and use of the World Wide Web and, correspondingly, of the Internet as a means for providing and retrieving information.
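The URL scheme described above can be illustrated with a short Python sketch. The URL shown is hypothetical, used only to show how a single string identifies the protocol, the remote server, and the file's location on that server:

```python
from urllib.parse import urlparse

# A hypothetical Web address, for illustration only.
url = "http://www.example.com/pub/papers/report.html"

parts = urlparse(url)
print(parts.scheme)   # the protocol to use ("http")
print(parts.netloc)   # the location of the remote server
print(parts.path)     # the location of the file on that server
```

A browser performs essentially this decomposition before opening a connection to the named server and requesting the named file.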

C. ACCESS AND USE TECHNOLOGICAL CONTROLS

1. SERVER AND FILE LEVEL CONTROLS

Technology will likely play a central role in implementing controls on the access to and use of protected works at both the file and server level.


Distribution of digital works can be regulated by controlling access to the source of copies of the works: information or data servers. Access to these servers can vary from completely uncontrolled access (e.g., the full contents of the server are available without restriction) to partially controlled access (e.g., unrestricted access is granted to only certain data on the server) to completely controlled access (e.g., no uncontrolled access in any form is permitted). Access control is effected through user identification and authentication procedures that deny unauthorized users access to a server or to particular information on a server.


The most common elements of such systems involve authentication of the user desiring access to the server. Typically, the server will require entry of a user name and a password. More elaborate mechanisms, however, have been developed. For example, some servers do not grant access as soon as a user is verified; rather, they terminate the connection and reestablish it from the server to the registered user's site. Such call-back systems tend to be used in fully controlled server environments (e.g., where access will only be granted to known and verified users). Other systems being implemented use more elaborate authentication methods. For example, a number of companies are developing hardware key systems that require the user, after establishing a preliminary connection, to verify that connection by inserting a hardware device similar to a credit card into the user's computer system. That device then sends an indecipherable code to verify the identity of the user.
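The basic user name and password check described above can be sketched in a few lines of Python. The user record, salt, and password are hypothetical, and the sketch assumes the server stores only a salted hash of each password rather than the password itself (practices vary widely among real systems):

```python
import hashlib
import hmac

# Hypothetical registered-user database: the server keeps a salt and a
# salted SHA-256 hash of the password, never the password itself.
users = {
    "alice": ("salt123",
              hashlib.sha256(b"salt123" + b"opensesame").hexdigest()),
}

def authenticate(name, password):
    """Grant access only if the name is registered and the password matches."""
    record = users.get(name)
    if record is None:
        return False
    salt, stored_hash = record
    attempt = hashlib.sha256(salt.encode() + password.encode()).hexdigest()
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(attempt, stored_hash)
```

Here `authenticate("alice", "opensesame")` would grant access, while an unknown user or a wrong password would be denied.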

Protection of works by means of access control mechanisms assumes that the system in question is in a physically secure environment and is not vulnerable to external means to circumvent access control. Several instances have been reported where the security of a supposedly secure server system was compromised, for example, through passive monitoring during the exchange of unencrypted passwords. As a consequence, many are currently pursuing efforts to improve security at the access control level.

Nearly all service providers, including commercial online services such as CompuServe and America Online, private dial-up bulletin board systems, and servers accessible through the Internet, control access to their systems. For example, via the Internet, users today can connect to a bewildering array of public servers using a variety of schemes, including telnet, ftp, gopher and the World Wide Web. Some information providers grant full unrestricted access to all the information contained on their servers, and use control simply to comport with physical limitations of their servers (e.g., to limit the number of concurrent users). Other information providers restrict access to users with accounts or grant only limited access to unregistered users. For example, using ftp a user can often log on to a remote server through the Internet as an "anonymous" user (e.g., a user for which no account has been created in advance); however, such a user will normally only be able to access specific data on the server. Of course, an information provider can elect not to provide uncontrolled access, and permit only those with pre-established accounts to access the server. This is more common with commercially oriented online service providers. Control over access to a server containing protected works will typically be the first level of protection a content provider will look for before making its protected works accessible through the server.

A second level for controlling access to and use of protected works can be exerted through control measures tied to the electronic file containing the work.

Restrictions on access at the file level can be implemented using features in "rendering" software. For example, a content provider may develop specialized software products or implement features in general purpose software products that would control by whom, and to what degree, a protected work may be used. Such restrictions could be implemented using features in the rendering software, a unique file format or features in an established file format, or a combination of both. "Control" measures could also be implemented to determine whether the content provider had authorized certain uses of the work, as well as some means to control the degree to which a user would be able to subsequently "manipulate" the work. For example, the rendering software could preclude a user who had not obtained the appropriate authority from the content provider, or who enters an unauthorized or expired password, from using the data. Rendering software can also be written to deny general access to the work if the file containing the work is not a properly authenticated copy (e.g., the file has been altered from the version as distributed by the content provider). Such features will be possible provided that sufficient information regarding authorized use can be associated with the file containing the information product (e.g., through inclusion in a file header, packaged in an "electronic envelope" sealed with a digital signature, embedded through steganographic means, etc.).
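The idea of a file "sealed" so that rendering software can refuse an altered copy can be sketched as follows. The signing key and file contents are hypothetical, and a keyed digest stands in for the digital-signature schemes the report discusses (a simplification, since true digital signatures use public-key methods rather than a shared secret):

```python
import hmac
import hashlib

SECRET = b"content-provider-signing-key"  # hypothetical provider key

def seal(data: bytes) -> bytes:
    """Append a keyed 32-byte digest (the 'seal') to the file contents."""
    return data + hmac.new(SECRET, data, hashlib.sha256).digest()

def open_if_authentic(sealed: bytes):
    """Return the contents only if the seal matches; otherwise deny access."""
    data, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(SECRET, data, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None  # rendering software refuses an altered copy
    return data
```

A copy whose contents have been changed after sealing no longer matches its seal, so the rendering software can deny access to it.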

2. ENCRYPTION


In its most basic form, encryption amounts to a "scrambling" of data using mathematical principles that can be followed in reverse to "unscramble" the data. File encryption thus simply converts a file from a manipulable file format (e.g., a word processor document or a picture file that can be opened or viewed with appropriate software) to a scrambled format. Authorization in the form of possession of an appropriate "key" is required to "decrypt" the file and restore it to its manipulable format.


508 See discussion of steganography infra pp. 188-89.

509 For example, the software may deny access to a work if the electronic file containing the work has been altered or if information stored in the file does not match data supplied by a user necessary to open and use the file. See discussion of digital signatures infra pp. 187-88.

510 Rendering or viewing software may integrate encryption and file manipulation into a single software package. In other words, the rendering software, after getting a password, will decode the file and permit the user to manipulate the work (e.g., view it or listen to it), but only with the provided rendering software.

Encryption techniques use "keys" to control access to data that has been "encrypted." Encryption keys are actually strings of alphanumeric digits that are plugged into a mathematical algorithm and used to scramble data using that algorithm. Scrambling means that the original sequence of binary digits (i.e., the 1s and 0s that make up a digital file) that constitutes the information object is transformed, using a mathematical algorithm, into a new sequence of binary digits (i.e., a new string of 1s and 0s). The result is a new sequence of digital data that represents the "encrypted" work. Anyone with the key can decrypt the work by plugging it into a program that applies the mathematical algorithm in reverse to yield the original sequence of binary digits that comprise the file. Although most commonly thought of as a tool for protecting works transmitted via computer networks, encryption can be and is used with virtually all information delivery technologies, including telephone, satellite and cable communications. Of course, once the work is decrypted by someone with the key, there may be no technological protection for the work if it is stored and subsequently redistributed in its "decrypted" or original format.
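The role of the key as an alphanumeric string "plugged into" a mathematical algorithm can be sketched as follows. The keys and data are hypothetical, and deriving a byte stream by hashing the key with a counter is one simple illustration of the idea, not any particular commercial cipher:

```python
import hashlib
from itertools import count

def keystream(key: str, length: int) -> bytes:
    """Expand an alphanumeric key string into a stream of bytes by
    repeatedly hashing the key together with a block counter."""
    out = bytearray()
    for block in count():
        out.extend(hashlib.sha256(f"{key}:{block}".encode()).digest())
        if len(out) >= length:
            return bytes(out[:length])

def transform(data: bytes, key: str) -> bytes:
    """XOR the data with the key-derived stream; the same call,
    with the same key, reverses the transformation."""
    return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

work = b"original sequence of binary digits"
encrypted = transform(work, "A1B2C3")            # scramble with the key
assert transform(encrypted, "A1B2C3") == work    # the same key restores it
assert transform(encrypted, "WRONG0") != work    # a different key does not
```

Only the holder of the original key string can run the algorithm in reverse and recover the original sequence of 1s and 0s.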

A widely publicized technique for sending secure transmissions of data is "public key" encryption. This technique can be used to encrypt data using an algorithm requiring two particular keys: a "public" key and a "private" key. The two keys are affiliated with the recipient to whom the information is to be sent. The "public" key is distributed publicly, while the private key is kept secret by the recipient. Data encrypted using a person's public key can only be decrypted using that person's secret, private key. For instance, a copyright owner could encrypt a work using the public key of the intended recipient. Once the recipient receives the encrypted transmission, he could then use his private key to decrypt that transmission. No secret (private)
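The public/private key relationship can be illustrated with the classic textbook RSA numbers. The primes, exponents, and message below are deliberately tiny, chosen only so the arithmetic is visible; real public-key systems use numbers hundreds of digits long:

```python
# Toy RSA with textbook-sized numbers, for illustration only.
p, q = 61, 53
n = p * q                  # 3233, the modulus shared by both keys
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent (distributed publicly)
d = 2753                   # private exponent: (e * d) % phi == 1 (kept secret)

public_key, private_key = (e, n), (d, n)

def apply_key(m: int, key: tuple) -> int:
    """Modular exponentiation: the same operation encrypts with the
    public key and decrypts with the private key."""
    exponent, modulus = key
    return pow(m, exponent, modulus)

message = 65                                     # a work, coded as a number
ciphertext = apply_key(message, public_key)      # anyone may encrypt: 2790
recovered = apply_key(ciphertext, private_key)   # only the key holder decrypts
assert recovered == message
```

Anyone holding the public key can produce the ciphertext, but only the holder of the matching private key can reverse the operation and recover the original message.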

511 An algorithm is a set of logical rules or a mathematical specification of a process that may be implemented in a computer.
