I know everybody's income and what everybody earns;
And I carefully compare it with the income-tax returns;
To everybody's prejudice I know a thing or two;
I can tell a woman's age in half a minute-and I do!
Yet everybody says I am a disagreeable man!
And I can't think why!
King Gama in Gilbert and Sullivan's Princess Ida
Our inquiry has led us to distinguish two categories of personal data systems that deserve separate attention in developing safeguards. One consists of administrative systems; the other of statistical-reporting and research systems. The essential distinction between the two categories is functional. An administrative personal data system maintains data on individuals for the purpose of affecting them directly as individuals: for making determinations relating to their qualifications, character, rights, opportunities, or benefits. A statistical-reporting or research system maintains data about individuals exclusively for statistical reporting or research, and is not intended to be used to affect any individual directly.1
This chapter contains general recommendations for all personal data systems and safeguard requirements for administrative personal data systems used as such. Chapter V contains additional safeguard requirements for statistical-reporting and research applications of administrative systems. Systems maintained exclusively for statistical reporting or research and safeguard requirements for them are addressed in Chapter VI.
Although our specific charge has been to analyze problems of automated systems, our recommendations could wisely be applied to all personal data systems, whether automated or manual. Computer-based systems magnify some record-keeping problems and introduce others, but no matter how data are stored, any maintenance of personal data presents some of the problems discussed in Chapters II and III. Moreover, the distinction between an automated and a non-automated system is not always easy to draw; requiring safeguards for all personal data systems eliminates the need to rule on ambiguous cases. Uniform application of safeguards to all systems will also facilitate conversion from manual to automated data processing when it does occur.
We define an automated personal data system as a collection of records containing personal data that can be associated with identifiable individuals, and that are stored, in whole or in part, in computer-accessible files. Data can be "associated with identifiable individuals" by means of some specific identification, such as name or Social Security number, or because they include personal characteristics that make it possible to identify an individual with reasonable certainty. "Personal data" include all data that describe anything about an individual, such as identifying characteristics, measurements, test scores; that evidence things done by or to an individual, such as records of financial transactions, medical treatment, or other services; or that afford a clear basis for inferring personal characteristics or things done by or to an individual, such as the mere record of his presence in a place, attendance at a meeting, or admission to some type of service institution. "Computer-accessible" means recorded on magnetic tape, magnetic disk, magnetic drum, punched card, or optically scannable paper or film. A "data system" includes all processing operations, from initial collection of data through all uses of the data. Data recorded on questionnaires, or stored in microfilm archives, are considered part of the data system, even when the computer-accessible files themselves do not contain identifying information.
Consistent with the rationale set forth in Chapter III, we recommend the enactment of legislation establishing a Code of Fair Information Practice for all automated personal data systems.
Pending the enactment of a code of fair information practice, we recommend that all Federal agencies (i) apply the safeguard requirements, by administrative action, to all Federal systems, and (ii) assure, through formal rule making, that the safeguard requirements are applied to all other systems within reach of the Federal government's authority. Pending the enactment of a code of fair information practice, we urge that State and local governments, the institutions within reach of their authority, and all private organizations adopt the safeguard requirements by whatever means are appropriate. Labor unions, for example, might find the application of the safeguards to employee records an appropriate issue in collective bargaining.
Establishing Automated Personal Data Systems
We were not charged with developing criteria for determining when and for what purposes to establish personal data systems. It is doubtful that any such criteria are feasible or warranted. Our inquiry, however, has prompted us to make cautionary observations to those who must decide whether, when, and how to establish automated personal data systems.
The general proposition that records and record-keeping systems are desirable and useful does not necessarily apply to every system. Some data systems appear to serve no clearly defined purpose; some appear to be overly ambitious in scale; others are poorly designed; and still others contain inaccurate data.
Each time a new personal data system is proposed (or expansion of an existing system is contemplated) those responsible for the activity the system will serve, as well as those specifically charged with designing and implementing the system, should answer explicitly such questions as:
What purposes will be served by the system and the data to be collected?
How might the same purposes be accomplished without collecting these data?
If the system is an administrative personal data system, are the proposed data items limited to those necessary for making required administrative decisions about individuals as individuals?
Is it necessary to store individually identifiable personal data in computer-accessible form, and, if so, how much?
Is the length of time proposed for retaining the data in identifiable form warranted by their anticipated uses?
A careful consideration of questions such as these might avert the establishment of some systems. Even if a proposed system survives a searching examination of the need for it, the very process should at least suggest limitations on the collection and storage of data.
Formalized administrative procedures and requirements should be followed to assure that questions about the purposes, scope, and utility of systems are raised and confronted before systems are established or enlarged. Members of the public should also have an opportunity to comment on systems before they are created.
It is especially important that such procedures be followed whenever data collection requirements, imposed by any Federal department or agency on States, other grantees, or regulated organizations, are likely to result in the creation or enlargement of personal data systems. In our view, any such data collection requirement should be established by regulations adopted after the public has been given an opportunity to comment, rather than by less formal means, such as program guidelines or manuals. Adoption of a regulation also forces a Federal agency to go through a formal process of internal justification and executive review. In the case of Federal data-collection requirements, the notice of any proposed regulation should contain a clear explanation of why each item of data is to be collected and why it must be collected and stored in identifiable form, if such is proposed.
The Safeguard Requirements
An automated personal data system should operate in conformity with safeguard requirements that, as stated above, should be enacted as part of a code of fair information practice. It is difficult to formulate safeguard requirements that will assure, in every system, an appropriate balance between the interest of the individual in controlling information about himself and all other interests, institutional and societal. However, because the safeguards we recommend are so basic to assuring fairness in personal data record keeping, any particular system, or class of systems, should be exempted from any one of them only for a strong and explicitly justified reason.
If organizations maintaining personal data systems are left free to decide for themselves when and to what extent to adhere fully to the safeguard requirements, the aim of establishing by law a basic code of fair information practice will be frustrated. Thus, exemptions from, or modifications of, any of the safeguard requirements should be made only as specifically provided by statute, and there should be no exemption or modification unless a societal interest in allowing it can be shown to be clearly paramount to the interest of individuals in having the requirement imposed. "Societal interest," moreover, should not be construed as equivalent to the convenience or efficiency of organizations that maintain data systems, the preference of a professional group, or the welfare of individual data subjects as defined by system users or operators.
Existing policies that guide the handling of personal data should not be uncritically accepted or reaffirmed. Nor should the basic "least common denominator" quality of the safeguards discourage law-making bodies, or organizations maintaining personal data systems, from providing individuals greater protection than the safeguards offer. Existing laws or regulations that provide protections greater than the safeguards should be retained; those that provide less protection should be amended to meet the standards set by the safeguards.
SAFEGUARD REQUIREMENTS FOR ADMINISTRATIVE PERSONAL DATA SYSTEMS
I. GENERAL REQUIREMENTS
A. Any organization maintaining a record of individually identifiable personal data, which it does not maintain as part of an administrative automated personal data system, shall make no transfer of any such data to another organization without the prior informed consent of the individual to whom the data pertain, if, as a consequence of the transfer, such data will become part of an administrative automated personal data system that is not subject to these safeguard requirements.
All other safeguard requirements for administrative personal data systems have been formulated to apply only to automated systems. As suggested earlier, the safeguards would wisely be applied to all personal data systems that affect individuals directly, whether or not they are automated. If this is not done, however, it is necessary to assure that individuals about whom an organization maintains records of personal data, which are not part of an automated system, will be protected in the event that personal data from those records are transferred to automated systems. Requirement I.A. is intended to provide such protection by requiring that transfers of personal data to automated systems not subject to the safeguard requirements be made only with the informed consent of the individuals to whom the data pertain.
The requirement is formulated so as not to apply to transfers of personal data that are not in individually identifiable form, e.g., for statistical reporting. (Transfers of individually identifiable data to automated systems used exclusively for statistical reporting and research are covered in Chapter VI, p. 97.)
B. Any organization maintaining an administrative automated personal data system shall:
(1) Identify one person immediately responsible for the system, and make any other organizational arrangements that are necessary to assure continuing attention to the fulfillment of the safeguard requirements;
The obligation to identify a person responsible for the system is intended to provide a focal point for assuring compliance with the safeguard requirements and to guarantee that there will be someone with authority to whom a dissatisfied data subject can go, if other methods of dealing with the system are unsatisfactory. Systems that involve more than one organization may present special problems in this respect, and must be carefully designed to assure that a data subject is not shuffled from one organization to another when he seeks to assert his rights under these requirements.
(2) Take affirmative action to inform each of its employees having any responsibility or function in the design, development, operation, or maintenance of the system, or the use of any data contained therein, about all the safeguard requirements and all the rules and procedures of the organization designed to assure compliance with them;
This requirement takes account of the fact that the actions of many people, with diverse responsibilities and functions located in different parts of an organization, affect the operations of an automated personal data system. Often these people lack a common understanding of the possible consequences for the system of their separate actions. If an organization is to comply fully and efficiently with the safeguard requirements, its employees will have to be made thoroughly aware of all the rules and procedures the organization has established to assure compliance.
(3) Specify penalties to be applied to any employee who initiates or otherwise contributes to any disciplinary or other punitive action against any individual who brings to the attention of appropriate authorities, the press, or any member of the public, evidence of unfair information practice;
The employees of an organization must not be penalized for attempting to prevent or expose violations of the safeguard requirements. Organizations maintaining systems must assure their employees that no harm will come to them as a consequence of bringing evidence of poor practice or willful abuse to the attention of parties who are willing and prepared to act on it.
A personal-data record-keeping system is often one of the least visible aspects of an organization's operations. Organization managers are sometimes ignorant of important facets of system operations, and individual clients or beneficiaries often do not perceive how their difficulties in dealing with an organization may stem from its record-keeping practices. Furthermore, systems tend to be designed, developed, and operated by sizable groups of specialists, no one of whom has a detailed understanding of how each system works and of all the ways in which it can be abused. This diffusion of responsibility, and of practical knowledge of system characteristics, makes the integrity of computer-based record-keeping systems especially dependent on the probity of system personnel. Efforts by associations of data processing specialists to gain nationwide adherence to a code of professional ethics attest to the importance of this aspect of system operations.
(4) Take reasonable precautions to protect data in the system from any anticipated threats or hazards to the security of the system;
The purpose of requirement (4) is to assure that an organization maintaining an automated personal data system takes appropriate security precautions against unauthorized access to data in the system, including theft or malicious destruction of data files.
(5) Make no transfer of individually identifiable personal data to another system without (i) specifying requirements for security of the data, including limitations on access thereto, and (ii) determining that the conditions of the transfer provide substantial assurance that those requirements and limitations will be observed, except in instances when an individual specifically requests that data about himself be transferred to another system or organization;
Requirement (5) is intended to provide protection against any additional risks to data security resulting from transfer of data from one system to another, or from the establishment of regular data linkages between systems. To comply with this requirement, an organization would have to be able to demonstrate that it had carefully followed procedures deliberately designed to assure that the security conditions for a data transfer, including transmission facilities and the data security features and access limitations of the system receiving the data, conform to specified expectations of the transferring organization and its data subjects. In combination with safeguard requirement III(3) (pp. 61-62, below), which requires an organization to obtain the informed consent of individual data subjects before permitting data about them to be put to uses that exceed their reasonable expectations, this requirement would, for example, prevent the sale of data files by one organization to another without the consent of the data subjects if the security features and access limitations of the purchasing organization were such as to open the possibility of uses not anticipated by the data subjects. The exception in requirement (5) is intended to accommodate the possibility that an individual may need or want his record, or data therefrom, to be made available to another organization even though such a transfer may entail risks to security or access that the transferring organization would not otherwise undertake or permit consistent with this safeguard.
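As a purely illustrative sketch of the check requirement (5) contemplates, a transferring organization might compare the receiving system's stated safeguards against its own specified conditions before releasing identifiable data. The condition names (`access_limited_to`, `encrypted_storage`) are invented for the example, not drawn from the report:

```python
# Hypothetical pre-transfer check for requirement (5). A transfer proceeds
# only if the receiving system satisfies every specified security condition,
# or if the data subject himself requested the transfer (the exception).

def transfer_permitted(receiver: dict, required: dict,
                       subject_requested: bool) -> bool:
    """Return True if the transfer may be made under requirement (5)."""
    if subject_requested:  # the individual asked for the transfer himself
        return True
    return all(receiver.get(key) == value for key, value in required.items())

# Conditions the transferring organization has specified (illustrative).
required = {"access_limited_to": "caseworkers", "encrypted_storage": True}

conforming_receiver = {"access_limited_to": "caseworkers",
                       "encrypted_storage": True}
unknown_receiver: dict = {}  # provides no assurance about its safeguards

assert transfer_permitted(conforming_receiver, required, subject_requested=False)
assert not transfer_permitted(unknown_receiver, required, subject_requested=False)
assert transfer_permitted(unknown_receiver, required, subject_requested=True)
```

The point of the sketch is only that the determination in clause (ii) is an explicit, recorded comparison against specified conditions, not an informal judgment.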
(6) Maintain a complete and accurate record of every access to and use made of any data in the system, including the identity of all persons and organizations to which access has been given;
This requirement will contribute significantly to an organization's capacity to detect improper dissemination of personal data. It is not intended to include ordinary system housekeeping entries, such as updating of files, undertaken in the course of normal maintenance by system personnel. To facilitate its compliance with requirement III (4) (p. 62, below), an organization should consider assuring that records of access to and use of data are part of, or are easily associable with, the records of individuals that are accessed and used.
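A minimal sketch of such an access record, keyed to the subject record it concerns so that the two are easily associable, might look like the following. All field names are hypothetical, offered only as an illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative access-log entry; the field names are invented, not
# prescribed by the safeguard requirements.
@dataclass
class AccessRecord:
    subject_id: str   # identifies the individual's record that was accessed
    accessor: str     # person or organization to which access was given
    purpose: str      # the use made of the data
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# Keying the log by subject identifier keeps access records "easily
# associable" with the records of the individuals accessed.
access_log: dict[str, list[AccessRecord]] = {}

def record_access(subject_id: str, accessor: str, purpose: str) -> None:
    access_log.setdefault(subject_id, []).append(
        AccessRecord(subject_id, accessor, purpose)
    )

record_access("A-1024", "State Welfare Agency", "eligibility determination")
assert len(access_log["A-1024"]) == 1
```

Routine housekeeping entries, which the requirement excludes, would simply never pass through `record_access`.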
(7) Maintain data in the system with such accuracy, completeness, timeliness, and pertinence as is necessary to assure accuracy and fairness in any determination relating to an individual's qualifications, character, rights, opportunities, or benefits that may be made on the basis of such data; and
(8) Eliminate data from computer-accessible files when the data are no longer timely.
Requirements (7) and (8) are intended to reduce the number of instances in which individuals are adversely affected by poorly conceived, poorly executed, or excessively ambitious uses of automated personal data systems. Because specific deficiencies in individual records will constitute evidence that requirement (7) has been violated, the effect of the requirement will be to make an organization as alert to isolated errors as it is to sources of recurring errors. To assure alertness, giving high priority to periodic retraining of system personnel and the suitability of their working conditions is essential. In addition, the organization may find that regular evaluation is needed of its data collection procedures and of the accuracy with which data are being converted into computer-accessible form. If particular data are being reproduced for use by another system or organization, steps may also have to be taken to apprise the receiving organization of subtle pitfalls in interpreting the data.
Requirement (7) will discourage organizations from attempting to handle more data than they can adequately process and should also reduce the likelihood that computer-based "dragnet" operations will injure, embarrass, or otherwise harass substantial numbers of individuals. Requirement (8) will promote the development of data-purging schedules that reflect the reasonable useful life of each category of data. Although the requirement would not prohibit the retention of data for archival purposes, it would assure that obsolete data are not available for routine use.
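A data-purging schedule of the kind requirement (8) contemplates can be sketched as a table of retention periods per category of data, with records past their period moved out of computer-accessible files into an archive. The categories and periods below are hypothetical:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: the reasonable useful life of each
# category of data (illustrative values only).
RETENTION = {
    "employment_application": timedelta(days=365),
    "benefit_determination": timedelta(days=5 * 365),
}

def purge_stale(records: list[dict], today: date) -> tuple[list, list]:
    """Split records into those still timely (kept in computer-accessible
    files) and those past their retention period (archived, and so no
    longer available for routine use)."""
    active, archived = [], []
    for rec in records:
        limit = RETENTION[rec["category"]]
        (active if today - rec["recorded"] <= limit else archived).append(rec)
    return active, archived

records = [
    {"category": "employment_application", "recorded": date(2024, 1, 1)},
    {"category": "employment_application", "recorded": date(2025, 6, 1)},
]
active, archived = purge_stale(records, today=date(2025, 12, 1))
assert len(active) == 1 and len(archived) == 1
```

The archived list corresponds to retention for archival purposes only; nothing in it remains in the files consulted for routine determinations.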
II. Public Notice Requirement
Any organization maintaining an administrative automated personal data system shall give public notice of the existence and character of its system once each year. Any organization maintaining more than one system shall publish such annual notices for all its systems simultaneously. Any organization proposing to establish a new system, or to enlarge an existing system, shall give public notice long enough in advance of the initiation or enlargement of the system to assure individuals who may be affected by its operation a reasonable opportunity to comment. The public notice shall specify:
(1) The name of the system;
(2) The nature and purpose(s) of the system;
(3) The categories and number of persons on whom data are (to be) maintained;
(4) The categories of data (to be) maintained, indicating which categories are (to be) stored in computer-accessible files;
(5) The organization's policies and practices regarding data storage, duration of retention of data, and disposal thereof;
(6) The categories of data sources;
(7) A description of all types of use (to be) made of data, indicating those involving computer-accessible files, and including all classes of users and the organizational relationships among them;
(8) The procedures whereby an individual can (i) be informed if he is the subject of data in the system; (ii) gain access to such data; and (iii) contest their accuracy, completeness, pertinence, and the necessity for retaining them;
(9) The title, name, and address of the person immediately responsible for the system.
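The nine items above form, in effect, a fixed schema for the notice. As a purely illustrative sketch, an organization could represent each annual notice as a record type whose fields must all be supplied before publication, so that no required item can be omitted. The field names and sample values are invented:

```python
from dataclasses import dataclass

# Illustrative only: the nine elements of the annual public notice as a
# record whose fields must all be supplied before the notice is published.
@dataclass
class PublicNotice:
    name: str                                 # (1) name of the system
    nature_and_purposes: str                  # (2)
    categories_and_number_of_persons: str     # (3)
    categories_of_data: str                   # (4)
    storage_retention_disposal_policies: str  # (5)
    categories_of_data_sources: str           # (6)
    types_of_use: str                         # (7)
    individual_access_procedures: str         # (8)
    responsible_person: str                   # (9)

notice = PublicNotice(
    name="Employee Payroll Records",
    nature_and_purposes="Computation and payment of wages",
    categories_and_number_of_persons="All current employees (approx. 1,200)",
    categories_of_data="Identifying data; earnings; tax withholding",
    storage_retention_disposal_policies="Retained five years, then destroyed",
    categories_of_data_sources="The employees themselves; the payroll office",
    types_of_use="Payroll processing; reports to tax authorities",
    individual_access_procedures="Written request to the payroll office",
    responsible_person="Payroll Manager, 123 Main St.",
)
assert notice.name == "Employee Payroll Records"
```

Omitting any field when constructing the record raises an error, which mirrors the completeness the notice requirement demands.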
The requirement for announcing the intention to create or enlarge a system stems from our conviction that public involvement is essential for fully effective consideration of the pros and cons of establishing a personal data system. Opportunity for public involvement must not be limited to actual or potential data subjects; it should extend to all individuals and interests that may have views on the desirability of a system.
We have not specified a uniform mechanism for giving notice, but rather expect all reasonable means to be used. In the Federal government, we would expect at least formal notice in the Federal Register as well as publicity through other channels, including mailings and public hearings. We would expect State and local governments to use whatever comparable mechanisms are available to them. For other organizations maintaining or proposing systems, arrangements such as newspaper advertisements may be appropriate. Whatever methods are chosen, an organization must have copies of its notices readily available to anyone requesting them.
III. Rights of Individual Data Subjects
Any organization maintaining an administrative automated personal data system shall:
(1) Inform an individual asked to supply personal data for the system whether he is legally required, or may refuse, to supply the data requested, and also of any specific consequences for him, which are known to the organization, of providing or not providing such data;
This requirement is intended to discourage organizations from probing unnecessarily for details of people's lives under circumstances in which people may be reluctant to refuse to provide the requested data. It is also intended to discourage coercive collection of personal data that are to be used exclusively for statistical reporting and research. (Secondary statistical-reporting and research applications of administrative personal data systems are the subject of Chapter V.)
(2) Inform an individual, upon his request, whether he is the subject of data in the system, and, if so, make such data fully available to the individual, upon his request, in a form comprehensible to him;
We considered having this requirement provide that an individual be informed that he is a data subject, whether or not he inquires. It seems to us, however, that such a requirement could be needlessly burdensome to some organizations, particularly if the character of their operations makes it likely that an individual will know that he is the subject of data in one or more systems (for example, systems that mail their customers monthly statements). Furthermore, since our objective is to specify a set of fundamental "least common denominator" standards of fair information practice, we concluded that it would be sufficient to guarantee each individual the right to ascertain whether he is a data subject when and if he asks to know.
We would, however, urge that organizations take the initiative to inform individuals voluntarily that data are being maintained about them, especially if it seems likely that the individuals would not be made fully aware of the fact as a consequence of normal system operations. For example, in systems where individuals become data subjects as a consequence of providing data about themselves in an application, the form could describe the records that will be maintained about them.
This requirement affords an individual about whom data are maintained in a system the right to be informed, and the right to obtain a copy of data, only if he may be affected individually by any use made of the system. For example, employees about whom earnings data are maintained in individually identifiable form in records kept by their employers would have these rights, but individuals appearing collaterally in records, such as an employee's dependents or character references, would have the rights afforded by this requirement only if they could be affected by the uses made of the records in which they appear.
We recognize that the right of an individual to have full access to data pertaining to himself would be inconsistent with existing practice in some situations. The medical profession, for example, often withholds from a patient his own medical records if knowledge of their content is deemed harmful to him; school records are sometimes not accessible to students; admission to schools, professional licensure, and employment may involve records containing third-party recommendations not commonly made available to the subject.
As indicated earlier (pp. 52-53, above), exemption from any one of the safeguard requirements should be only for a strong and explicitly justified reason. Thus, existing practices restricting an individual's right to obtain data pertaining to himself should be continued only if an exemption from the requirement of full access is specifically provided by law.
Reassessment of existing practices that deprive individuals of full access to data recorded about themselves will be one of the most significant consequences of establishing safeguard requirement III (2). Many organizations are likely to argue that it is not in the interest of their data subjects to have full access. Others may oppose full access on the grounds that it would disclose the content of confidential third-party recommendations or reveal the identity of their sources. Still others may argue that full access should not be provided because the records are the property of the organization maintaining the data system. Such objections, however, are inconsistent with the principle of mutuality necessary for fair information practice. No exemption from or qualification of the right of data subjects to have full access to their records should be granted unless there is a clearly paramount and strongly justified societal interest in such exemption or qualification.
If an organization concludes that disclosing to an individual the content of his record might be harmful to him, it can point that out, but if the individual persists in his request to have the data, he should, in our view, be given it. The instances in which it can be convincingly demonstrated that there is a paramount societal interest in depriving an individual of access to data about himself would seem to be rare.
Similarly, we cannot accede in general to the claim that the sources of recorded comments of third parties should be kept from a data subject if he wants to know them. Disclosure to the data subject of the sources of such comments may be difficult for organizations that have promised confidentiality. Modifying the data subject's right of access in order to honor past pledges may be necessary. However, the practice of recording data provided by third parties, with the understanding that the identities of the data providers will be kept confidential, should be continued only where there is a strong, clearly justified societal interest at stake. Elementary considerations of due process alone cast grave doubt on the propriety of permitting an organization to make a decision about an individual on the basis of data that may not be revealed to him or that have been obtained from sources that must remain anonymous to him.
(3) Assure that no use of individually identifiable data is made that is not within the stated purposes of the system as reasonably understood by the individual, unless the informed consent of the individual has been explicitly obtained;
This requirement is intended to deal with one of the central issues of fair information practice-controlling the use of personal data. Assume that a system maintains no more personal data than reasonably necessary to achieve its purposes. Assume further that its purposes are well understood and accepted by the individuals about whom data are being maintained, and that all data in the system are accurate, complete, pertinent, and timely. The question of how data in the system are actually used still remains.
Because an individual can be adversely affected even by accurate data in well-kept records, the use of personal data in a system should be held to standards of fairness that minimize the risk that an individual will be injured as a consequence of an organization's permitting data about him to be used for purposes that differ substantially from whatever uses he has been led to expect. The public notice called for by safeguard requirement II (pp. 57-58, above) is intended to assure that when an individual first becomes a data subject, he will be able to understand the purposes of the system and the types of uses to which data about him will be put. If, however, an organization expands the previously announced purposes of the system, or enlarges the range of permissible uses of data in identifiable form, it must not only revise its public notice for the system, but must also obtain the prior consent of all existing data subjects.
The objective of requirement III(3), in short, is to make it possible for individuals to avoid having data about themselves used or disseminated for purposes to which they may seriously object. The requirement applies to all new types of uses, whether they will be made by the system that initially collected the data or by some other system or organization to which data are to be transferred. Thus it applies (as noted on p. 56, above) to uses that may result from the transfer of data to a system whose security features and access limitations open the possibility of uses not anticipated by the data subjects.
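The gatekeeping logic of requirement III(3) can be sketched in a few lines: a use within the system's stated purposes proceeds; any other use requires the subject's explicit consent. The purpose names and the consent set below are hypothetical:

```python
# Illustrative sketch of requirement III(3). A use outside the system's
# stated purposes is permitted only with the data subject's explicit,
# informed consent to that particular use.

STATED_PURPOSES = {"payroll", "tax reporting"}  # hypothetical stated purposes

def use_permitted(purpose: str, subject_consents: set[str]) -> bool:
    """Return True if the proposed use of identifiable data is allowed."""
    return purpose in STATED_PURPOSES or purpose in subject_consents

assert use_permitted("payroll", set())
assert not use_permitted("marketing mailing list", set())
assert use_permitted("marketing mailing list", {"marketing mailing list"})
```

Under this reading, silence on the subject's part never expands the permissible uses; only an affirmative entry in his consent set does.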
(4) Inform an individual, upon his request, about the uses made of data about him, including the identity of all persons and organizations involved and their relationships with the system;
This requirement will guarantee the individual an opportunity to find out exactly how and why data about him have been used, and by whom. It provides this right for an individual only when he makes a request; a general rule requiring an organization to take the initiative in all cases to inform an individual how data about him have been used would often not serve any useful purpose, and might lead, for example, to periodic mass mailings to inform individuals of uses of which they are already aware. Nonetheless, there may be instances when data subjects will want to be informed on a regular basis about particular types of data use. It is the intent of this safeguard that an organization provide such service when an individual requests it.
Coupled with requirement I(6) (p. 56, above) this requirement would also afford individuals the opportunity to advise those to whom records about them have been disseminated of any corrections, clarifications, or deletions that should be made.
(5) Assure that no data about an individual are made available from the system in response to a demand for data made by means of compulsory legal process, unless the individual to whom the data pertain has been notified of the demand;
"Compulsory legal process" includes demands made in the form of judicial or administrative subpoena and any other demand for data that carries a legal penalty for not responding. It should be the responsibility of the person or organization that seeks to obtain data by compulsory legal process to notify the data subject of the demand and to provide evidence of such notification to the system. In instances when it may be more practicable for the system to give notice of the demand to the data subject, the cost of doing so should be borne by the originator of the demand.
The intent of requirement (5) is to assure that an individual will know that data about himself are being sought by subpoena, summons, or other compulsory legal process, so as to enable him to assert whatever rights he may have to prevent disclosure of the data.
(6) Maintain procedures that (i) allow an individual who is the subject of data in the system to contest their accuracy, completeness, pertinence, and the necessity for retaining them; (ii) permit data to be corrected or amended when the individual to whom they pertain so requests; and (iii) assure, when there is disagreement with the individual about whether a correction or amendment should be made, that the individual's claim is noted and included in any subsequent disclosure or dissemination of the disputed data.
It is not the intent of this requirement in any way to relieve an organization of the obligation to maintain data in accordance with requirement I(8) (p. 57, above). Rather, in combination with requirement I(8), it is expected to give an organization maintaining a system strong incentives to investigate and act upon any claim by an individual that data recorded about him are incorrect, insufficient, irrelevant, or out-of-date. The provision for obtaining injunctions included in the Code of Fair Information Practice (p. 50, above) will enable individuals to seek court orders for corrective action in regard to their records.
Relationship of Existing Laws to the Safeguard Requirements
As we stated earlier in this chapter, existing laws or regulations affording individuals greater protection than the safeguard requirements should be retained, and those providing less protection should be amended to meet the basic standards set by the safeguards. We have not attempted an exhaustive inventory of existing Federal and State statutes that may need to be amended to bring them into conformity with the safeguards, but in the course of our work we have identified two Federal statutes in regard to which we have specific recommendations.
FREEDOM OF INFORMATION ACT
The Federal Freedom of Information Act2 has a disturbing feature that could be eliminated by means of an amendment quite in keeping with the primary purpose of the Act. As noted in Chapter III, the main objective of the Freedom of Information Act is to facilitate public access to information about how the Federal government conducts its activities. The Act contains a broad requirement that information held by Federal agencies be publicly disclosed. Nine categories of information are specifically exempted from the Act's mandatory disclosure requirement. For seven of the nine, moreover, disclosure is not prohibited or otherwise constrained by the Act, and the decision not to disclose is left entirely to the discretion of the agency holding the information. The agency is completely free to decide whether it will comply with a request that it disclose information falling within any of the seven exemptions.3
Of the seven discretionary exemptions, those that offer the most likely basis for an agency to withhold personal data from the public are:
trade secrets and commercial or financial information obtained from a person and privileged or confidential;
personnel and medical files and similar files the disclosure of which would constitute a clearly unwarranted invasion of personal privacy; and
investigatory files compiled for law enforcement purposes except to the extent available by law to a party other than an agency.
The Act's failure to provide for data-subject participation in a decision by an agency to release personal data requested under the Act is inconsistent with safeguard requirement III(3) (p. 61, above), which calls for an individual's consent to any unanticipated use of data about himself in an administrative automated personal data system. Enactment of this requirement would necessitate modification of the Freedom of Information Act to give the data subject a voice in agency decisions about public disclosure of information covered by the Act, whenever such disclosure is not within the reasonable expectations of individuals about whom a Federal agency maintains data in an automated system.
As we see it, an agency that is the custodian of personal data about an individual should not have unilateral discretion to decide to grant a request for public disclosure of such data, especially if the data fall within one of the exempted categories under the Freedom of Information Act. The data custodian should have to obtain consent from the data subject before releasing identifiable personal data about him from an administrative automated personal data system, except in cases where making the requested disclosure without the individual's consent is within the stated purposes of the system as specifically required by a statute. We expect such cases to be few.
Accordingly, we recommend that the Freedom of Information Act be amended to require an agency to obtain the consent of an individual before disclosing in personally identifiable form exempted-category data about him, unless the disclosure is within the purposes of the system as specifically required by statute. Pending such amendment of the Act, we further recommend that all Federal agencies provide for obtaining the consent of individuals before disclosing exempted-category personal data about them under the Freedom of Information Act.
If the Act were so amended, its purpose of protecting the public's "right to know" about the activities of the Federal government would be brought into a better balance with the no less important public purpose of protecting the personal privacy of individuals who are the subjects of data maintained in the automated personal data systems of the Federal government. There may be other areas of conflict between the safeguard requirements and the Freedom of Information Act. The Act should be given a thorough reappraisal with a view to formulating additional amendments needed to accommodate the safeguard requirements. An amended Freedom of Information Act and the Code of Fair Information Practice we have proposed would, in combination, provide an improved statutory framework within which to resolve the unavoidable conflicts between personal privacy and open government.
FAIR CREDIT REPORTING ACT4
The Fair Credit Reporting Act is the first Federal statute regulating the vast consumer-reporting industry. Its basic purpose, as stated in the Act, is
to insure that consumer reporting agencies exercise their grave responsibilities with fairness, impartiality, and a respect for the consumer's right to privacy.
The consumer-reporting industry comprises credit bureaus, investigative reporting companies, and other organizations whose business is the gathering and reporting of information about individuals for use by others in deciding whether individuals who are the subjects of such reports qualify for credit, insurance, or employment. Consumer-reporting agencies typically operate what we have called administrative personal data systems, many of which contain large quantities of intelligence-type data. Increasingly, these systems are being computerized.
The Fair Credit Reporting Act requires consumer-reporting agencies to adopt reasonable procedures for providing information about individuals to credit grantors, insurers, employers and others in a manner that is fair and equitable to the individual with regard to confidentiality, accuracy, and the proper use of such information. It also places requirements on users of consumer reports and consumer-investigative reports.
The chief requirements imposed by the Act include the following:
Accuracy of Information
Consumer-reporting agencies must follow reasonable procedures in preparing reports to assure maximum possible accuracy of the information concerning the individual about whom the report is prepared. The effect of this requirement extends to all the data gathering, storing, and processing practices of an agency.
Certain items of adverse information may not be included in a consumer report after they have reached specified "ages" (except in connection with credit and life insurance transactions of $50,000 or more and employment at an annual salary of $20,000 or more), viz.: bankruptcies, 14 years; suits and judgments, 7 years; paid tax liens, 7 years; accounts placed for collection or written off, 7 years; criminal arrest, indictment, or conviction, 7 years; any other adverse information, 7 years.
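The obsolescence rule just summarized is, in effect, a simple eligibility test on each adverse item: compare its age against a per-category limit, unless a large-transaction exception applies. The sketch below illustrates that logic only; the category names, field names, and function signature are invented for the example, and the Act itself, not this code, governs any actual case.

```python
# Illustrative sketch of the Fair Credit Reporting Act's obsolescence
# rules as summarized above. Category labels and parameters are
# assumptions made for this example.

OBSOLESCENCE_YEARS = {
    "bankruptcy": 14,
    "suit_or_judgment": 7,
    "paid_tax_lien": 7,
    "collection_or_chargeoff": 7,
    "arrest_indictment_conviction": 7,
    "other_adverse": 7,
}

def may_report(item_category, age_years,
               credit_or_life_insurance_amount=0,
               annual_salary=0):
    """Return True if an adverse item may still appear in a consumer report."""
    # The age limits do not apply to credit or life insurance
    # transactions of $50,000 or more, or to employment at an
    # annual salary of $20,000 or more.
    if credit_or_life_insurance_amount >= 50_000 or annual_salary >= 20_000:
        return True
    return age_years <= OBSOLESCENCE_YEARS[item_category]

print(may_report("bankruptcy", 10))                              # True
print(may_report("suit_or_judgment", 10))                        # False
print(may_report("suit_or_judgment", 10, annual_salary=25_000))  # True
```

A ten-year-old bankruptcy remains reportable, a ten-year-old judgment does not, and the same judgment becomes reportable again in connection with a $25,000-salary employment decision.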
Limited Uses of Information
A consumer-reporting agency may furnish a consumer report about an individual to be used for the following purposes and no other:
- in response to a court order;
- in accordance with written instructions of the individual to whom it relates;
- to determine the individual's eligibility for (i) credit or insurance to be used for personal, family, or household purposes, (ii) employment, including promotion, reassignment, or retention as an employee, or (iii) a license or other benefit granted by a governmental instrumentality required by law to consider an applicant's financial responsibility or status;
- to meet a legitimate business need for a business transaction involving the individual.
A consumer-reporting agency must take all steps necessary to insure that its reports will be used only for the above purposes.
Notices to Individuals
Whenever credit, insurance, or employment is denied, or the charge for credit or insurance is increased, wholly or partly because of information in a report from a consumer-reporting agency, the user of the report must notify the individual affected and supply the name and address of the agency that made the report.
Whenever a consumer-reporting agency reports public record information about an individual which may adversely affect his ability to obtain employment, it must notify the individual that it is doing so, including the name and address of the person to whom the information is reported.
Whenever an investigative report (a report based on information obtained through personal interviews with neighbors, friends, associates, or acquaintances) is to be prepared about an individual, he must be so notified in advance, unless the report is for employment for which the individual has not applied.
Individual's Right of Access to Information
An individual about whom an investigative report is being prepared has the right, upon his request, to be informed of the nature and scope of the investigation.
An individual has the right, upon his request and proper identification, to be clearly, accurately, and fully informed of: (i) the nature and substance of all information, except medical information, about him in the files of a consumer-reporting agency; (ii) the sources of such information, except sources of information obtained solely for an investigative report; and (iii) the recipients of consumer reports furnished about the individual, within the 2 prior years for employment purposes and within the 6 prior months for any other purpose. (The individual has this right whether or not adverse action has been taken.)
Whenever credit is denied, or the charge for it increased, wholly or partly because of information obtained from a source other than a consumer-reporting agency, the individual affected has the right, upon his request, to learn the nature and substance of the information directly from its user.
Individual's Right to Contest Information
If an individual disputes the accuracy or completeness of information in a file maintained about him by a consumer-reporting agency, the agency must reinvestigate and record the current status of that information, or delete the information if it is found to be inaccurate or cannot be reverified. If the reinvestigation does not resolve the dispute, the individual has the right to file a brief statement explaining the dispute; and the agency must, in any subsequent report containing the disputed information, note the dispute and provide at least a clear summary of the individual's statement.
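The dispute procedure just described reduces to a small decision flow: reinvestigate; delete the item if it is inaccurate or cannot be reverified; otherwise retain it, note any unresolved dispute, and carry the consumer's statement into every subsequent report containing the item. The toy model below sketches that flow under stated assumptions; the class and method names are invented for illustration, not taken from any actual system.

```python
class ConsumerFile:
    """Toy model of the dispute procedure described above.
    All names here are invented for the example."""

    def __init__(self):
        self.items = {}  # item_id -> {"text": ..., "dispute": ...}

    def add_item(self, item_id, text):
        self.items[item_id] = {"text": text, "dispute": None}

    def resolve_dispute(self, item_id, reverified, accurate):
        """Reinvestigate a disputed item.

        Information that is inaccurate or cannot be reverified
        must be deleted; otherwise the item is retained."""
        if not reverified or not accurate:
            del self.items[item_id]
            return "deleted"
        return "retained"

    def attach_statement(self, item_id, statement):
        # If reinvestigation does not resolve the dispute, the
        # individual may file a brief statement explaining it.
        self.items[item_id]["dispute"] = statement

    def report(self, item_id):
        # Any subsequent report containing the disputed item must
        # note the dispute and summarize the individual's statement.
        item = self.items[item_id]
        if item["dispute"]:
            return f'{item["text"]} [DISPUTED: {item["dispute"]}]'
        return item["text"]
```

For example, a reverified but still-contested item stays in the file, and every later report of it carries the consumer's annotation; an unverifiable item is simply removed.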
One reason for describing the Fair Credit Reporting Act in such detail is to illustrate the care with which the Congress has responded to the need it found to protect individuals from the adverse effects of unfair information practices in the consumer reporting industry. Although the Congress adopted a regulatory approach in this Act,5 it constitutes a strong precedent for our recommended Code of Fair Information Practice. In regulating the practices of both consumer-reporting agencies and the users of their reports, the Act, in effect, imposes many of the safeguard requirements we recommend.
The chief reason for presenting the Fair Credit Reporting Act, however, is to illustrate the point that existing laws that provide greater protection for individuals than our safeguards offer should be retained, while laws that provide less protection should be amended to meet the standards set by the safeguards. Section 606(a) of the Fair Credit Reporting Act, 15 U.S.C. 1681d(a), for example, requires that an individual be notified that an investigative report is being prepared about him before work on it is begun, whereas safeguard requirement III(2) (p. 59, above) gives an individual the right to be informed that he is the subject of a record only if he asks to know. In this instance, the Act's requirement, responsive to the particular circumstances of the consumer reporting industry, provides the individual with greater protection than our safeguard and should be retained.
Conversely, safeguard requirement III(2), which also guarantees an individual the right to see and obtain copies of data about him, provides more protection for individuals than Section 609(a) of the Fair Credit Reporting Act, 15 U.S.C. 1681g(a). Under the Act's requirement the individual is entitled to be fully informed by a consumer-reporting agency of the content of his record (except medical information and the sources of investigative information), but he is not entitled to see, copy, or physically possess his record. When an individual goes to a consumer-reporting agency to determine what information it has on him, the contents of the record must be read to him, but he must take the agency's word that it is telling him about all information in the record, and about all sources and recipients thereof. We understand that individuals have found this arrangement generally unsatisfactory, and further, that as the proportion of "sensitive" or adverse personal data in a record increases, compliance with the full disclosure requirement tends to diminish.
To bring Section 609(a) more in line with the protection afforded individuals by safeguard requirement III(2), and thus to achieve the objective of the Fair Credit Reporting Act more fully, we recommend that the Fair Credit Reporting Act be amended to provide for actual, personal inspection by an individual of his record along with the opportunity to copy its contents, or to have copies made. The choice between inspecting and copying should be left to the individual, and any charge for having copies made should be nominal.
We further recommend that the exceptions from disclosure to the individual now authorized by the Fair Credit Reporting Act for medical information and sources of investigative information be omitted. It is a disturbing thought that an investigative consumer-reporting agency may have a record of medical information that the individual cannot know about or challenge. We realize that in Section 603(f) of the Fair Credit Reporting Act, 15 U.S.C. 1681a(f), "consumer reporting agencies" is defined broadly enough to apply to some organizations that are customary and appropriate repositories of medical information. However, nothing in the Act should warrant the inference that every type of organization falling within the umbrella definition of "consumer reporting agencies" may, with impunity, conceal from an individual the fact that it is gathering, recording, and reporting medical information about him.
We have explained our skepticism about the propriety of utilizing anonymous data sources when determinations about an individual's character, qualifications, rights, opportunities, or benefits are being made. Moreover, we find no strong societal interest in having an individual routinely denied credit, insurance, or employment on the basis of information provided by any source that must be kept secret from him.6
A Note on Mailing Lists
The use of automated personal data systems to generate mailing lists deserves special comment. Ordinarily such use entails no perceptible threat to personal privacy. Even among individuals who strongly object to receiving quantities of so-called "junk mail," most would probably concede that their objections are not founded on any substantial claim that personal privacy has been invaded. Indeed, it is hard to see how the mere delivery of an item of mail to an individual, even though it is addressed to him by name, in itself entails an offensive or harmful disclosure or use of personal data.
More important than the end use of the mailing list itself is the question of the original source of the personal data from which the list was originally assembled. In most cases, commercial mailing lists are made up of names and addresses gathered during the course of commercial transactions. In the most typical case, buying an item through the mail assures that the buyer's name will be added to the list of a commercial dealer in names, and that the list will in turn be sold, rented, and traded through a chain of further commercial mailers. This exploitation of names may occasionally be irritating, but there is little potential for substantial disclosure of closely held personal information, since nothing beyond name and address was probably revealed in the first place.
A more serious threat to personal privacy arises when mailing lists are compiled from sources that have nothing to do with commercial interests-the membership list of a professional society, the faculty roster of a college, or the donor list of a charity. In these cases, data furnished for one purpose are being used for another, and even though the original source may not have contained more than the name and address, the mere fact of being on the list may reveal something about one's private life.
More serious still are lists derived from actual administrative data systems. There is the strong probability that the original source contained data that might well be intensely personal and that names will be selected for mailing lists on the basis of such data. The data files for driver licenses, for instance, usually contain medical information on disabilities. The administrative files of schools contain grades and other personal items. Any use of files such as these for any but the original intention carries a clear danger of exploitation of truly private personal information.
The Committee staff studied the structure and practices of the mailing-list industry to gauge the threats to personal privacy that could arise from that source, as well as to examine the applicability of the safeguard requirements to the industry. The report of the study is presented in Appendix H; an abstract of its conclusions, which we fully endorse, is given here:
An underlying function of the Advisory Committee's recommended safeguards is to provide effective feedback mechanisms that will help to make automated personal data systems more responsive to the interests of individuals. Systems maintained by most government agencies, and by many private organizations, do not provide for tight links between individuals and the system operators. The direct-mail industry, however, is largely organized around the idea of public feedback; the trade press concentrates almost obsessively on methods for maximizing response and minimizing complaints.
Because most mailings draw a response from only 3 or 4 percent of the addressees, a small change in the response rate can have relatively large economic implications for the mailer. The same is true for the compilers and brokers of mailing lists, because the price a list commands in the rental market depends not so much on its demographic sophistication as on its accuracy and freshness. Lists are cleaned by adding a special imprint to the mailing which gives the Postal Service authority to correct and return (at first-class rates) all undeliverable pieces. Since it costs about four times as much to discover and correct a "nixie" as it does to make a clean mailing in the first place, there is a powerful economic incentive to concentrate lists on known buyers at addresses of known accuracy.
Another feedback mechanism operates on the industry as a whole. Direct-mail advertising is strongly dependent for survival on the official good will of a large number of agencies of the government; opposition from the Postal Service, from motor vehicle registrars, or from the Census Bureau, to name a few examples, would seriously hamper the industry on its present scale. It seems likely that a scandal involving public records, or the development of a public allergy to direct-mail advertising, would lead to government moves to put constraints on the industry.
Constructive publicity emphasizing the rights of the individual relative to direct-mail advertising, especially the methods the industry has adopted for getting off and getting on the larger lists, would go far toward strengthening the feedback mechanisms that already operate. In particular, the Direct Mail Advertising Association's Mail Preference Service deserves wider attention.
If feedback mechanisms stronger than those provided by the economics of the industry should become desirable, there would be formidable practical difficulties in applying the Committee's safeguards to the freewheeling small operators of the direct-mail industry. The most directly applicable of the Committee's safeguards is the requirement for the informed consent of the data subject to be obtained before any collateral use may be made of data from an administrative personal data system. To accomplish this, forms that are used by the system in transactions with individuals (applications, for example) and that are vulnerable to mailing-list uses, could be printed with a block in which the individual-by his deliberate action-could indicate whether or not his name and address could be sold or otherwise transferred to another data system for mailing-list use. Of course, this could not prevent his name and address from being copied by hand out of a public record system, but the cost of such handcopying would sharply curtail much commercial use.
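The consent block described above amounts to a one-bit flag carried with each record; before any transfer for mailing-list use, a system operator would filter on that flag and pass along only name and address. A minimal sketch, with field names invented for the example:

```python
# Minimal sketch of the consent-block idea described above: each
# record carries the individual's deliberate yes/no choice, and
# only consenting records, stripped to name and address, may be
# transferred for mailing-list use. Field names are assumptions.

records = [
    {"name": "A. Smith", "address": "1 Main St.", "mailing_list_consent": True},
    {"name": "B. Jones", "address": "2 Oak Ave.", "mailing_list_consent": False},
    {"name": "C. Brown", "address": "3 Elm Rd.", "mailing_list_consent": True},
]

def transferable_list(records):
    """Return only the records whose subjects consented to
    mailing-list transfer, reduced to the fields a list needs."""
    return [{"name": r["name"], "address": r["address"]}
            for r in records
            if r["mailing_list_consent"]]

for entry in transferable_list(records):
    print(entry["name"])  # A. Smith and C. Brown only
```

Note that the filter also drops every field other than name and address, so the transfer cannot carry along any collateral administrative data.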
In view of the controls already at work in the direct-mail advertising industry, this limited application of the Committee's safeguards seems sufficient. It would provide protection to individuals from having their names unexpectedly appear on mailing lists without their consent. We doubt the utility and feasibility of trying to make the rest of the Committee's proposed safeguard requirements apply to the mailing list as such, as a form of administrative automated personal data system, or to organizations that deal only in mailing lists. If the control of mailing lists is to be undertaken by law, it should be done by legislation that is directed specifically to that purpose.
If the foregoing analysis underestimates the felt need for greater mailbox privacy, it would be feasible to provide greater protections through legislation directed specifically at the direct-mail advertising industry, as the regulation of information practices in the consumer-reporting industry amply demonstrates.
A Note on Intelligence Records
In developing safeguard requirements, we have divided personal data record-keeping systems into two broad categories, (i) administrative systems, and (ii) systems maintained exclusively for statistical reporting and research. The distinction between the two is in their purpose vis-a-vis individuals. Administrative systems are intended to be used to affect individuals as individuals; statistical reporting and research systems are not. According to this classification, intelligence records are properly considered administrative records.
A chief characteristic of intelligence records is that they are compiled for purposes that presuppose the possibility of taking adverse action against an individual. Their focus is on providing a basis for protecting the data-gathering organization, or other organizations that it serves, against the individual. There are many examples of intelligence-type personal-data record-keeping systems. From a historical standpoint, the original and classical intelligence records were those compiled and maintained about individuals who were viewed as possible enemies of the state. The most obvious and perhaps most common ones today are those compiled by the criminal intelligence systems of Federal, State, and local law enforcement agencies about individuals suspected of being engaged in criminal activities, of being threats to public safety or national security, or of being suitable objects of surveillance and investigation for less clearly definable reasons. There are, however, many other examples of intelligence-type records, including investigative records of credit-reporting agencies, private detective agencies, industrial security organizations, and so on. It is hard to know how many types of intelligence data systems exist because their function leads as a rule to careful concealment.
In framing our proposed safeguard requirements for administrative personal data systems, we did not focus on intelligence records as such. We realize that if all of the safeguard requirements were applied to all types of intelligence records, the utility of many intelligence-type records for the purposes they are designed to serve might be greatly weakened. In some instances this would clearly not be a desirable outcome from the standpoint of important societal interests, such as the apprehension and prosecution of individuals engaged in organized crime. It does not follow, however, that there is no need for safeguards for personal-data intelligence record-keeping systems. The risk of abuse of intelligence records is too great to permit their use without some safeguards to protect the personal privacy and due process interests of individuals.
The mere gathering of intelligence data can be a serious threat to personal privacy and should be carried out with strict respect for the Constitutional rights of individuals. Once criminal intelligence data have been compiled, their use in connection with law enforcement prosecutions is safeguarded by all the Constitutional requirements of due process and by laws that establish limitations on the exercise of the police power, including civil and criminal remedies and penalties that may be imposed to enforce such limitations. We have not attempted to assess whether protections now afforded individuals from abuses of intelligence records as used in criminal law enforcement should be strengthened.
We are concerned, however, about the use of criminal intelligence data, and intelligence records maintained by organizations other than law enforcement agencies, for many purposes that involve determinations about the qualifications, character, opportunities, or benefits of individuals to which the protective requirements of due process may not apply or for which they may not be fully effective. Such determinations include suitability for employment, especially in public service or in positions of critical fiduciary responsibility; clearance for access to classified national security information held by the Federal government and its contractors; and eligibility for various public benefits, permits, and licenses.
Enactment of the proposed Code of Fair Information Practice for administrative personal data systems will afford an excellent opportunity to determine precisely what protections for individuals should be applied to intelligence record-keeping systems. Any exception from a safeguard requirement that is proposed for any type of intelligence system must be specifically sanctioned by statute and then only if granting the exception would serve a societal interest that is clearly paramount to the interest served by having the requirement imposed.
The process of considering exceptions for intelligence systems will entail a careful review of existing policies, laws, and practices governing the creation, maintenance, and use of intelligence records about individuals. The need for such a review has seldom seemed more urgent in the history of our Nation.
1 In our brief review of the history of record keeping in Chapter I, we took note of the origins and existence of intelligence records. These should be thought of as a type of administrative personal data system, since intelligence records are maintained about people for the purpose of affecting them directly as individuals. We have not, however, examined intelligence record-keeping systems as such, and it was not with such systems in mind that we developed the safeguard recommendations set forth in this chapter. At the end of the chapter, we have included a brief statement about the application of our safeguards to intelligence records.
2 5 U.S.C. 552 (1970).
3 The remaining two exemptions refer to information that is: "specifically required by Executive order to be kept secret in the interest of the national defense or foreign policy;" and "specifically exempted from disclosure by statute." Legal prohibitions against disclosure of information in these two categories are not affected by the Act.
4 15 U.S.C. 1681-1681t.
5 The Federal Trade Commission has the basic responsibility for enforcing the Act, but where specific types of institutions are already regulated (for other purposes) by other agencies, those agencies are charged with enforcing the Act; e.g., the Comptroller of the Currency (national banks), the Federal Reserve Board (member banks of the Federal Reserve System other than national banks), the Interstate Commerce Commission (common carriers), and the Civil Aeronautics Board (air carriers).
6 Experience under the Fair Credit Reporting Act should be carefully assessed to identify other amendments necessary to assure the effectiveness of its intended protections for individuals. For an analysis of deficiencies of the Act, see "Protecting the Subjects of Credit Reports," The Yale Law Journal, Vol. 80, No. 5 (April, 1971), pp. 1035-1069.