Many mechanisms have been suggested for providing safeguards against the potential adverse effects of automated personal-data systems. Those who believe a general right of personal privacy should be established, by Constitutional amendment or by statute, propose, in effect, that the courts should be the mechanism. Although we have concluded that a general right of privacy is not a reliable approach to achieving effective protection, the safeguards we recommend in the following chapters of this report would rely in part on the courts.
Some have proposed that there be a public ombudsman to monitor automated personal data systems, to identify and publicize their potential for adverse effects, and to investigate and act on complaints about their operation. We note with approval the efforts of the Association for Computing Machinery, and of many business firms and newspapers, to provide ombudsman service to the victims of computer errors. We believe the benefits of this approach are many and would like to see it extended to more systems. However, the ombudsman concept is basically remedial and will, therefore, work best in the context of established rights and procedures. Furthermore, the function is not well understood or widely accepted in America, and some observers feel it has severe limitations in the context of American legal, political, and administrative traditions.
The "strongest" mechanism for safeguards which has been suggested is a centralized, independent Federal agency to regulate the use of all automated personal data systems. In particular, it has been proposed that such an agency, if authorized to register or license the operation of such systems, could make conformance to specific safeguard requirements a condition of registration or licensure. The number and variety of institutions using automated personal data systems is enormous. Systems themselves vary greatly in purpose, complexity, scope of application, and administrative context. Their possible harmful effects are as much a product of these features as of computerization alone. We doubt that the need exists, or that the necessary public support could be marshaled at the present time, for an agency of the scale and pervasiveness required to regulate all automated personal data systems. Such regulation or licensing, moreover, would be extremely complicated and costly, and might uselessly impede desirable applications of computers to record keeping.12
The safeguards we recommend require the establishment of no new mechanisms and seek to impose no constraints on the application of electronic data-processing technology beyond those necessary to assure the maintenance of reasonable standards of personal privacy in record keeping. They aim to create no obstacles to further development, adaptation, and application of a technology that, we all agree, has brought a variety of benefits to a wide range of people and institutions in modern society.
The proposed safeguards are intended to assure that decisions about collecting, recording, storing, disseminating, and using identifiable personal data will be made with full consciousness and consideration of issues of personal privacy, issues that arise from inherent conflicts and contradictions in values and interests. Our recommended safeguards cannot assure resolution of those conflicts to the satisfaction of all individuals and groups involved. However, they can assure that those conflicts will be fully recognized and that the decision-making processes in both the private and public sectors, which lead to assigning higher priority to one interest than to another, will be open, informed, and fair.
The safeguards we will recommend are intended to create incentives for institutions that maintain automated personal data systems to adhere closely to basic principles of fair information practice. Establishment of a legal protection against unfair information practice to embody the safeguard requirements described in Chapters IV, V, and VI will invoke existing mechanisms to assure that automated personal data systems are designed, managed, and operated with due regard for protection of personal privacy. We intend and recommend that institutions should be held legally responsible for unfair information practice and should be liable for actual and punitive damages to individuals representing themselves or classes of individuals. With such sanctions, institutional managers would have strong incentives to make sure their automated personal data systems did not violate the privacy of individual data subjects as defined.
Of greatest importance, from our point of view, the safeguards we will recommend give the courts a reliable and generally applicable basis for protecting personal privacy in relation to record keeping. The legal concept of fair information practice we recommend will obviate the need to search for new Constitutional doctrines, or to invent ways of extending the existing common law of privacy to cover situations for which it is conceptually ill-suited.