The safeguards we recommend will not be without costs, which will vary from system to system. The personal-data record-keeping practices of some organizations already meet many of the standards called for by the safeguards. The Social Security Administration, for example, maintains a record of earnings for each individual in the Social Security system, and each individual has the legal right to learn the content of his record. Procedures have been set up to allow an individual to find out easily what is in his record and to have the record corrected if it is wrong. Disclosure of an individual's record outside the system is forbidden, except under certain limited circumstances prescribed by statute and regulation, and there are criminal penalties for unauthorized disclosure. An individual is given notice and opportunity for a hearing when the record is being changed at the initiative of the Social Security Administration. These protections are a normal part of Social Security administration and, in our view, demonstrate the feasibility of building such safeguards into any system when the system's managers are strongly committed to do so.
We believe that the cost to most organizations of changing their customary practices in order to assure adherence to our recommended safeguards will be higher in management attention and psychic energy than in dollars. These costs can be regarded in part as deferred costs that should already have been incurred to protect personal privacy, and in part as insurance against future problems that may result from adverse effects of automated personal data systems. From a practical point of view, we can expect to reap the full advantages of these systems only if active public antipathy to their use is not provoked.13
The past two decades have given America intensive lessons in the difficulty of trying to check or compensate for undesirable side-effects stemming from headlong application and exploitation of complex technologies. Water pollution, air pollution, the annual highway death toll, suburban sprawl, and urban decay are all unanticipated consequences of the too narrowly conceived and largely unconstrained applications of technology. Hence, it is essential now for organizational decision makers to understand why they should be sensitive to issues of personal privacy and not permit their organizations unilaterally to adopt computer-based record-keeping practices that may have adverse effects on individuals. They must recognize where conflicts are likely to arise between an individual's desire for personal privacy and an organization's record-keeping goals and behavior. They must recognize that although individuals and record-keeping organizations do have certain shared purposes, they also have other purposes, some of which are mutual, though not perceived as such, and some of which can be in direct conflict.
Record-keeping organizations must guard against insensitivity to the privacy needs and desires of individuals; preoccupied with their own convenience or efficiency, or their relationships with other organizations, they must not overlook the effects on people of their record-keeping and record-sharing practices. They have the power to eliminate misunderstanding, mistrust, frustration, and seeming unfairness; they must learn to exercise it.
1 Appendix G contains a review of law that bears on the collection, storage, use, and dissemination of information by the Department of Health, Education, and Welfare.
2 44 U.S.C. 3501-3511.
3 5 U.S.C. 552.
4 The privacy implications of the Freedom of Information Act and its application to computer-based record-keeping systems are discussed in Arthur R. Miller, The Assault on Privacy (Ann Arbor: University of Michigan Press), 1971, pp. 152-161.
5 15 U.S.C. 1681-1681t (1970).
6 The Fair Credit Reporting Act is a notable exception.
7 From this conclusion we should not be understood to be unaware of the potential significance of an unqualified right of personal privacy, whether established Constitutionally or by statute. We know of at least one instance in which the existence of such a right in a State constitution served as the basis for the State's Attorney General to deny access to certain public records whose disclosure was not explicitly provided for in the governing State statutes. We would support enactment of a right of personal privacy for many reasons, but not as the only or best way to protect personal privacy in computer-based record-keeping systems.
8 Alan F. Westin, Privacy and Freedom (New York: Atheneum), 1967, p. 7.
9 Ibid., p. 373.
10 Office of Science and Technology of the Executive Office of the President, Privacy and Behavioral Research (Washington, D.C., 1967), p. 8.
11 Charles Fried, "Privacy," The Yale Law Journal, Vol. 77 (1968), p. 482.
12 These comments point up what we regard as the deficiencies of a regulatory approach that would constitute a single Federal agency as the regulatory body. They are not intended to discourage the development of regulation in specific, limited areas of application of computer-based record-keeping systems. For example, where particular institutions or societal functions are already subject to regulation, e.g., public utilities, common carriers, insurance companies, hospitals, it well may be that an effective way to introduce and enforce safeguard requirements would be through the public agencies that regulate such institutions. Such an approach has been adopted with respect to the credit-reporting industry (see discussion, Chapter IV, p. 69).
Many municipal governments have been exploring regulatory or quasi-regulatory mechanisms for applying safeguard requirements to so-called "integrated municipal information systems." The efficacy of such mechanisms has not yet been demonstrated; however, we know of several that appear promising in conception. In addition, at both State and local government levels, efforts are being made to regulate the use of criminal justice information systems.
13 In addition to maintaining and using records of personal information, computer technology is a tremendous new force for development in many ways. Already, for example, computers are controlling traffic on city streets and highway systems, and in the air; supplementing human judgment in making medical diagnoses; monitoring air pollution; predicting the weather; and even acting as surrogates for human decision makers in controlling large electrical power systems, industrial manufacturing processes, and high-speed rail transportation systems. Such computer applications do not typically require identifiable information about people. What information is required is limited and need be retained for only a short time. Thus the social risks from computer systems such as these are beyond the scope of this report.