building in privacy


Source

Automatically imported from: http://commons.somewhere.com:80/rre/1996/building.in.privacy.html


This web service brought to you by Somewhere.Com, LLC.

building in privacy

[I've enclosed a paper presented by Ann Cavoukian, the Assistant Privacy Commissioner of Ontario, at a recent meeting on privacy and security technology. It describes the need for (what have come to be called) privacy-enhancing technologies such as biometric encryption. The endnotes are missing. I've taken the liberty of reformatting the paper to 74 columns but haven't edited the text otherwise.]

---

This message was forwarded through the Red Rock Eater News Service (RRE). Send any replies to the original author, listed in the From: field below. You are welcome to send the message along to others but please do not use the "redirect" command. For information on RRE, including instructions for (un)subscribing, send an empty message to rre-help@weber.ucsd.edu

---

GO BEYOND SECURITY -- BUILD IN PRIVACY:

ONE DOES NOT EQUAL THE OTHER

Ann Cavoukian, Ph.D., Assistant Commissioner

CARDTECH/SECURTECH 96 CONFERENCE

ATLANTA, GEORGIA

MAY 14-16, 1996

Privacy vs. Confidentiality/Security

This paper will begin by touching briefly on the meaning of privacy since this term is at times used interchangeably with confidentiality/security. But let me assure you that the two are not one and the same. While privacy may subsume what is implied by confidentiality, it is a much broader concept involving the right to be free from intrusions, to remain autonomous, and to control the circulation of information about oneself.

Privacy involves the right to control one's personal information, and the ability to determine if and how that information should be obtained and used. The Germans have referred to this as "informational self-determination": In 1983, the German Constitutional Court ruled that all citizens had the right to informational self-determination (an individual's ability to determine the uses of one's information). While most countries with privacy laws have this notion of self-control as one of the goals of their legislation, they do not usually have an explicit constitutional guarantee to privacy, as in the case of Germany.

It is in this sense that privacy is a much broader concept than confidentiality since it entails restrictions on a wide range of activities relating to personal information: its collection, retention, use and disclosure. Confidentiality, however, is only one means of protecting personal information, usually in the form of safeguarding the information from unauthorized disclosure to third parties. Confidentiality only comes into play after the information in question has been obtained by a company, organization or government (commonly referred to as "data users"). Data users are expected to be responsible for the safekeeping of the personal information entrusted to them. In this sense, they have a custodial obligation to protect the information in their care. Thus, a relationship of trust exists between data subjects and data users, requiring a duty of care and an expectation of confidentiality. The latter involves containment of the personal information to those permitted access to it, and safeguarding it from disclosure to unauthorized third parties. The means by which this is achieved involves security.

The full spectrum of data security, computer and network security, physical security and procedural controls must be deployed to protect personal information from a wide range of threats: inadvertent or unauthorized disclosure, intentional attempts at interception, data loss, destruction or modification, attempts to compromise data integrity and reliability, and others. Measures that enhance security enhance privacy: the two are complementary, but not one and the same. Therefore, simply focussing on security is not enough. While it is an essential component of protecting privacy, it is not sufficient by itself. For true privacy protection we must turn to the time-honoured principles of data protection commonly referred to as "the code of fair information practices."

These internationally recognized principles were first formally launched by an international organization, the OECD (Organization for Economic Co-operation and Development), in 1980, for the purpose of conferring rights upon data subjects and responsibilities upon data users (both Canada and the United States are signatories). They place limitations on the collection of personal data, place restrictions on its uses, place an onus on purpose specification, declare a need for openness, transparency, and accountability, and create the right of individual access and correction. The eight principles governing data protection are as follows:

Collection Limitation There should be limits to the collection of personal data; data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.

Data Quality Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.

Purpose Specification The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.

Use Limitation Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with Paragraph 9 [Purpose Specification Principle] except: a) with the consent of the data subject, or b) by the authority of law.

Security Safeguards Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data.

Openness There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purpose of their use, as well as the identity and usual residence of the data controller.

Individual Participation An individual should have the right:

a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him;

b) to have communicated to him, data relating to him:

i) within a reasonable time; ii) at a charge, if any, that is not excessive; iii) in a reasonable manner; and iv) in a form that is readily intelligible to him;

c) to be given reasons if a request made under subparagraph (a) and (b) is denied, and to be able to challenge such denial; and

d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed or amended.

Accountability A data controller should be accountable for complying with measures which give effect to the principles stated above.

Building In Privacy

What is needed is a convergence of these principles with those found in systems design and smart card applications: the design correlates of fair information practices. The systems design and architecture should translate the essence of these practices into the language of the technology involved.

In addition, incorporating such a requirement into a privacy impact assessment prior to the actual development of a new system should be viewed as an essential first step. Just as it would be inconceivable to build a new system without any idea of the financial costs involved, it should be equally inconceivable to build a new system without any idea of the privacy costs involved, or the protections needed to minimize those costs. Preparing such an assessment at the beginning of the process, followed by the implementation of fair information practices upon completion, will ensure a much higher degree of privacy protection.

One of the most important principles is the "use limitation" principle, referring to the limitations that should be placed on the uses of personal information. This requires drawing a clear distinction between the primary purpose of the collection and any subsequent or secondary uses, unrelated to the primary purpose. In other words, personal information collected for one purpose (paying taxes), should not be used for another purpose (compiling a mailing list of people with incomes over $100,000), without the consent of the individuals involved. Thus, the use of the information should be limited to the primary purpose (tax collection), which was the purpose specified to the data subject at the time of the data collection. While the above may sound fairly straightforward, it is seldom put into practice without some mechanism (a privacy law or code of conduct) requiring that such principles be followed.

When systems are not built with privacy in mind (which is generally the norm), one may not be able to easily isolate the primary purpose of the collection (if an effort is to be made to restrict uses of the information to that purpose). And if different types of information have been gathered by the same organization for different purposes, then access should be restricted to those who need to access a particular type of information -- not the entire set. This requires the creation of segregated fields of access with a clear demarcation of who should be permitted access to what fields. Even better, however, would be the anonymization of personally identifiable data.
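The segregated fields of access just described can be sketched in a few lines. This is a hypothetical illustration, not from the paper: the roles, field names, and record are invented, and a real system would enforce such rules in the database layer rather than in application code.

```python
# Toy sketch of segregated access fields: each role may read only the
# fields it needs, never the entire record. Roles and fields below are
# hypothetical examples.

RECORD = {
    "name": "J. Smith",
    "address": "123 Main St",
    "tax_owing": 1520.00,
    "income_band": "C",
}

# Clear demarcation of who is permitted access to which fields.
ACCESS_FIELDS = {
    "tax_clerk": {"name", "tax_owing"},
    "statistician": {"income_band"},   # no identifying fields at all
}

def read_record(role, record):
    """Return only the fields this role is permitted to see."""
    allowed = ACCESS_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

print(read_record("statistician", RECORD))  # {'income_band': 'C'}
```

Note that the statistician's view contains no identifying data at all, which is the paper's "even better" option: anonymization rather than mere access control.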

New and emerging information technologies have led to a massive growth in the amount of personal information accumulated by organizations. When retained in identifiable form, this information increasingly jeopardizes the privacy of those from whom it was collected. Minimizing or entirely eliminating identifying data, however, will go a long way toward restoring the balance. To the extent that smart cards can be designed with anonymity in mind, privacy interests will be advanced enormously.

Why is it that every time you engage in a wide range of activities -- using a credit or debit card, making a telephone call, subscribing to a magazine, joining a club, ordering goods from a mail-order catalogue, or buying something at a grocery store or department store -- an identifiable record of each transaction is created and stored in a database somewhere? Why is it that in order to obtain a service or make a purchase (other than with cash or a cash-card), organizations require that you identify yourself? This practice is so widespread that it is treated as a given: it can be no other way. Really? The time has come to challenge this view. Is it not possible for transactions to be conducted anonymously yet securely, with proper authentication? Emerging technologies of privacy make this not only possible but quite feasible.

Consumer polls repeatedly show that individuals value their privacy and are concerned with potential losses in this area when so much of their personal information is routinely stored in computers over which they have no control. Anonymity is a key component of maintaining privacy. Protecting one's identity is synonymous with preserving one's ability to remain anonymous. Technologies that provide authentication without divulging identity not only address privacy concerns, but also provide much-needed assurances to organizations regarding the authenticity of the individuals they are doing business with.

Privacy-Enhancing Technologies

Two examples of privacy-enhancing (anonymizing) technologies will be provided here, each of which relies upon the "blinding" of identity through the use of encryption: in the first case, through an extension of public key encryption; in the second, through the use of biometric encryption.

Blind Signatures

The blind signature is an extension of the digital signature -- the electronic equivalent of a handwritten signature. Just as a signature on a document is proof of its authenticity, a digital signature provides the same authentication for electronic transactions. It provides the necessary assurance that only the individual who created the signature could have done so, and permits others to verify its authenticity.

Digital signatures are an extension of an asymmetric cryptosystem: public key encryption. In a public key system, two different keys are created for each person: one private, one public. The private key is known only to the individual, while the public key is made widely available. When an individual encrypts a document with his or her private key, this is the equivalent of signing it by hand, since the private key is unique to that individual. The intended recipient can decrypt the message using the individual's public key, which corresponds to his/her private key. If the information is successfully decrypted, then one has the necessary assurance that it could only have been transmitted by that individual; otherwise, it would not have been possible to decode the information successfully.
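The sign-with-the-private-key, verify-with-the-public-key mechanism described above can be illustrated with textbook RSA. This is a sketch for exposition only: the primes are far too small for real use, and practical digital signatures also hash and pad the message before signing.

```python
# Textbook-RSA illustration of a digital signature. Tiny primes for
# exposition only; real keys are hundreds of digits long.

p, q = 61, 53
n = p * q                            # public modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def sign(message: int, d: int, n: int) -> int:
    """'Encrypt' with the private key -- this is the signature."""
    return pow(message, d, n)

def verify(message: int, signature: int, e: int, n: int) -> bool:
    """Anyone holding the public key can check authenticity."""
    return pow(signature, e, n) == message

m = 65                               # a message encoded as a number < n
s = sign(m, d, n)
assert verify(m, s, e, n)            # authentic
assert not verify(m + 1, s, e, n)    # any tampering is detected
```

The signature verifies precisely because only the holder of d could have produced s, which is the assurance the paragraph above describes.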

While a digital signature provides proof of authenticity (that a transaction originated from a particular sender), it reveals the identity of the individual in the process. The blind signature, created by David Chaum, Director of DigiCash, is an extension of the digital signature but with one additional feature: it ensures the anonymity of the sender. While digital signatures are intended to be identifiable (to serve as proof that a particular individual signed a particular document), blind signatures provide the same authentication but do so in a non-identifiable or "blind" manner. The recipient is assured of the fact that a transmission is authentic and reliable, without knowing who actually sent it.
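Chaum's blinding idea can likewise be sketched with textbook RSA. The essential point is visible in step 2: the signer authenticates the message without ever seeing it. Again, the tiny parameters are stand-ins for exposition only.

```python
import math
import random

# Toy sketch of Chaum's blind-signature protocol over textbook RSA.

p, q = 61, 53
n = p * q                            # signer's public modulus
e = 17                               # signer's public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # signer's private exponent

m = 42                               # the sender's secret message

# 1. Sender blinds the message with a random factor r.
r = random.randrange(2, n)
while math.gcd(r, n) != 1:
    r = random.randrange(2, n)
blinded = (m * pow(r, e, n)) % n     # looks random to the signer

# 2. Signer signs the blinded value -- learning nothing about m.
blind_sig = pow(blinded, d, n)       # equals (m^d * r) mod n

# 3. Sender unblinds by dividing out r, leaving an ordinary signature.
s = (blind_sig * pow(r, -1, n)) % n  # equals m^d mod n

# 4. Anyone can now verify it against the public key, as usual.
assert pow(s, e, n) == m
```

The recipient gets a valid, verifiable signature on m, yet the signer never saw m -- the "authentication without identification" the paper describes.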

One application of blind signatures involves the use of "e-cash" which can be used as an electronic form of payment that can be transmitted via networks such as the Internet. Just as cash is anonymous, e-cash is also anonymous in that it cannot be traced to a particular individual. Chaum calls it "unconditionally untraceable." The service provider, however, is assured of its authenticity; the only thing missing is the ability to link the transaction to a particular person. Chaum emphasizes that his system provides much-needed protections against fraud and abuse. It is predicated on the use of non-identifier-based technology: "A supermarket checkout scanner capable of recognizing a person's thumbprint and debiting the cost of groceries from their savings account is Orwellian at best. In contrast, a smart card that knows its owner's touch and doles out electronic bank notes is both anonymous and safer than cash."

Biometric Encryption

Biometric measures provide irrefutable evidence of one's identity since they offer biological proof that can only be linked to one individual. The most common biometric measure is the fingerprint. Fingerprints have historically raised concerns over loss of dignity and privacy. The central retention of fingerprints and multiple access to them by different arms of government invokes images of Big Brother watching.

The fundamental problem with identifiable biometric measures has been that once obtained, they are stored in identifiable form together with other personal information in a central database. This may then potentially be accessed by a number of third parties and used for a variety of unintended purposes. All of this facilitates surveillance -- making it easier to track your movements and compile detailed personal profiles. Thus, the threat to privacy comes not from the positive identification that biometrics provide so well, but from the ability of others to access this information in identifiable form and link it to other personal information. This can only occur, however, if the biometric information is kept in identifiable form. All of that changes if the biometric measure is used only to encrypt the information to be stored.

Therein lies the paradox of biometrics: a threat to privacy in identifiable form, a protector of privacy in encrypted form; a technology of surveillance in identifiable form, a technology of privacy in encrypted form. As noted earlier, reliable forms of encryption can anonymize data and prevent unauthorized third parties from intercepting confidential information. In the case of biometrics, they permit authentication without identification of the user.

Take the example of someone receiving welfare benefits. The government needs to ensure that only those eligible to receive such benefits actually receive them, thereby reducing fraud. But if someone is eligible for welfare, then he should get what he is entitled to. So what is needed is confirmation of the fact that person A (whom we have determined to be eligible for welfare benefits) is in fact person A and not someone impersonating him. Biometric encryption can do that anonymously, without revealing the fact that person A is John Smith. So if A uses these benefits to buy groceries (with food stamps, for example), he should be able to do so once his eligibility has been authenticated. You don't need to know that A is John Smith who went to the store to buy a box of cereal, a bag of chips, and some milk. But you need to be sure that someone else can't impersonate him and claim the same benefits.

One company, Mytec Technologies, has done just that. Mytec has developed a new technology that will transform the way we view fingerprints: from posing a threat to privacy, to becoming its protector. Fingerprints encrypting information will now be used to protect people's privacy instead of invading it. Through the "bioscrypt," a term compounding biometrics and encryption, a user's eligibility is authenticated without divulging identity. Further, the bioscrypt bears no physical resemblance to the user's actual fingerprint. Mytec's system does not retain any record, image or template of the individual's actual fingerprint; a copy of the fingerprint is never kept on file. Instead, what is retained, in the form of the bioscrypt, is a number or set of characters encrypted by the finger pattern -- not the finger pattern itself. The bioscrypt cannot be converted back to the fingerprint from which it originated because it is not a fingerprint.
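Mytec's actual bioscrypt algorithm is not described in the paper, but the general idea -- store not the fingerprint but a token locked under a key derived from it -- can be sketched as follows. Everything here is a stand-in for illustration: real fingerprint readings are noisy, so a practical system needs error-tolerant key derivation rather than a plain hash, and a proper cipher rather than XOR.

```python
import hashlib

# Illustrative sketch only -- NOT Mytec's proprietary algorithm.
# Store an encrypted token (the "bioscrypt"), never the fingerprint.

def derive_key(finger_pattern: bytes) -> bytes:
    """Derive a repeatable key from an (idealized, noise-free) pattern."""
    return hashlib.sha256(finger_pattern).digest()

def xor(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for a real cipher."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Enrollment: only the encrypted token is stored -- no fingerprint,
# no template, nothing convertible back to the finger pattern.
finger = b"idealized-ridge-pattern"
token = b"ELIGIBLE:9137"                      # hypothetical eligibility code
bioscrypt = xor(token, derive_key(finger))

# Verification: presenting the live finger re-derives the key and
# unlocks the token; the person is authenticated without being named.
assert xor(bioscrypt, derive_key(finger)) == token

# A different finger yields garbage, so impersonation fails.
assert xor(bioscrypt, derive_key(b"someone-else")) != token
```

The stored value reveals neither the fingerprint nor the identity of its owner, which is the authentication-without-identification property the paper attributes to the bioscrypt.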

Thus, one's finger becomes one's uniquely private key, with which to lock or unlock information. Since the bioscrypt was designed to confirm an individual's identity, it can only be used for comparative purposes, with the individual holding the key. Extension of this technology allows for the development of complete information systems using anonymous databases. George Tomko, President and C.E.O. of Mytec, says that "the bioscrypt precludes the need for a unique identifying number or the centralized storage of fingerprints. People can carry out their transactions privately in a 'blind manner' without the electronic tracing of a person's activities. Now transactions made through monetary systems such as credit or debit cards can be completely anonymous, thus ensuring total user privacy. With this technology, elimination of fraud is a by-product of protecting an individual's privacy."

Both David Chaum's blind signatures and George Tomko's biometric encryption provide for maximum privacy through advanced systems of encryption. Technologies such as these should receive the full support of both those interested in protecting privacy and those interested in eliminating fraud. They achieve the goal of fraud reduction without giving away your identity in the process, or your privacy -- a true win/win scenario.

Conclusion

The process of building privacy into systems and smart card applications begins by recognizing the distinction between privacy and security. Introducing fair information practices into the process will by necessity broaden the scope of data protection, expanding it to cover both privacy and security concerns. The preparation of a privacy impact assessment can be a useful tool to assist in identifying areas where privacy may be negatively impacted, leading either to eliminating the problem areas or building in the necessary protections. The greatest protection, however, may come from de-identifying or anonymizing personal information.

The use of privacy-enhancing technologies such as those described above (DigiCash and Mytec) which minimize or entirely eliminate personally identifiable information are ideal in that they serve the needs of both individuals and organizations: personal privacy is maintained through the anonymity afforded by such systems, while organizations are assured of the authenticity of the individuals they are doing business with. Both needs are met. Mission accomplished.

ENDNOTES: [missing from this copy.]
