Chapter 4. Privacy and Identity

Several years ago, General Motors set out to create a company-wide telephone directory. Of course, in the 21st century you don’t create a printed phone directory for a company as large as GM; you create an online directory. Two years and numerous legal hurdles later, GM had an online phone directory. This tale is amazing, given that GM wasn’t trying to do anything particularly difficult, like aggregate identity pools or implement single sign-on company-wide. They were simply creating a phone directory, something companies have been doing for a hundred years.

GM’s hang-up was not the technology, but rather the legal challenges presented by differing privacy laws and regulations in each of the many countries where GM has employees. European privacy laws are much stricter than those in the United States. Privacy turned a seemingly simple project into a two-year ordeal.

More and more corporations and government agencies are appointing Chief Privacy Officers, high-level officials whose job is to ensure that identity data is protected. The reason is simple: privacy is a big deal. People believe that their identity data should be private. They don’t necessarily believe that everyone else’s data should be private, but they want to protect their own identities.

Who’s Afraid of RFID?

RFID stands for “radio frequency identification.” RFID tags are small integrated circuits that broadcast an identifying code whenever they’re hit with radio frequency radiation. The circuit uses the radiation to charge a small capacitor, giving the tag just enough energy to broadcast its identifying code. RFID tags have been used to automate toll collection and access to parking garages.

One of the really interesting uses of RFID tags is to identify things. This has generated significant interest in the retail industry. Imagine if the barcodes on the groceries you buy could talk. Rather than having to scan the items one at a time, the RFID reader could ask all the items in the grocery cart to identify themselves and total up the purchase all at once. An RFID reader in a refrigerator could keep track of what’s in the fridge and build your grocery list for you. Smart shirts could tell your dryer how high to set the temperature.

Against this backdrop of a consumer Utopia is an Orwellian nightmare of massive intrusions into individual privacy. Who else gets the data from your grocery purchase? When you wear your Gap shirt into Eddie Bauer, do they scan it to know what kinds of clothing to offer you? In general, who else can scan the tags in the things you own?

This debate has created lots of heat over the last few years. Gillette planned a pilot with RFID-enabled shelf displays that would automatically send restocking alerts to the company when a display ran low. They were forced to abandon the pilot because of concerns by privacy advocates. Wal-Mart has led a push for RFID in retail, but so far has limited the roll-out to pallet-level packaging in the supply chain. Part of this is pragmatic: RFID tags are not yet cheap enough to put on everything.

The debate over RFID and privacy highlights a crucial aspect of privacy. People are scared of the theoretical threats to their privacy—especially in technologies they don’t understand. At the theoretical stage, privacy advocates are very effective at using this ignorance as a backdrop for painting draconian pictures.

Privacy Pragmatism

The debate over RFID illustrates the great irony of privacy. As someone who’s been involved in online commerce and services for over a decade, I’ve found that while everyone cares about privacy in the abstract, they’re usually willing to trade their personal data for the most trivial of benefits. A cynic would say that this is because people don’t really understand digital identity and how their privacy can be eroded, but I think that most people make rational choices. People willingly choose to share personal data if there is a payoff that they understand.

Here’s an example: if you’ve been to a grocery store, you’re familiar with their “preferred customer” cards. The premise is simple: scan the card, get a discount. Grocery stores do this, of course, so that they can tie customer identities to purchasing habits—very valuable data for a company looking to drive sales and establish loyalty. Even the most hardened privacy warriors are likely to succumb to a large rebate offer on a TV or a computer and send in the rebate form in order to capture the savings.

While the U.S. has its share of privacy advocates, it has been slow to adopt many of the privacy safeguards that have found a home in Europe and elsewhere (the next section will explore the exceptions). The hesitancy has been based partly on free speech concerns, but more often it is because U.S. legislatures are loath to take action that will have a negative impact on business development. Again, the cynic would say that business has bought off the legislative process, but in my experience, in both the private and public sectors, the reason has more to do with legislative concern about regulatory burdens making U.S. industry less competitive.

Even with the pragmatic attitudes of consumers and the business-friendly climate promoted by many legislatures, your organization should be very careful in handling the personal data of employees and customers and the private information of partners and suppliers. While it’s true that customers are usually willing to trade identity information for some benefit, they react with anger when they feel like they’ve had their identity data “stolen.” Likewise, legislatures sometimes react with ill-conceived legislation when pressed by anecdotes regarding identity theft, fraud, and misbehaving companies.

Privacy Drivers

Even with the relatively pragmatic attitude of most U.S. consumers regarding privacy, there are still many significant laws and regulations affecting organizations. Table 4-1 shows some of the more prominent laws and regulations concerning privacy. In the U.S. and Canada, those laws tend to be limited to specific kinds of organizations (e.g., health care and financial services), but the European Union directive (the one that tripped up GM in its efforts to build an employee phone directory) is extremely far reaching.

Table 4-1. Some privacy laws and regulations

Canadian Personal Information Protection and Electronic Documents Act (PIPEDA)

Applies to employees of firms regulated by the federal government. Recognizes an employee’s right to privacy and imposes principles that employers must follow with respect to the personal data of their employees.

Customer Identification Program (Patriot Act)

Applies to financial services organizations in the U.S. Requires the collection and storage of customer data and its verification against government-owned lists of known or suspected terrorists.

European Data Protection Directive

Applies to organizations operating in the European Union. Imposes wide-ranging obligations regarding the collection, storage, and use of personal information relating to employees and customers.

Health Insurance Portability and Accountability Act (HIPAA)

Applies to any organization that manages health care data in the U.S. Establishes a patient’s right to control access to and use of personal health information (PHI). Requires that organizations control and safeguard PHI. Imposes technical standards for access control, audit, data integrity, and security.

Gramm-Leach-Bliley Act

Applies to financial services organizations in the U.S. Requires physical, administrative, and technical measures to protect customer data. Data can be reused or disclosed only with the specific opt-in of the customer.

The laws listed in Table 4-1 can have a direct impact on your identity management plans. There are other laws and regulations that can indirectly affect the identity management efforts of your organization. For example, NASD Rules 3010 and 3110 and SEC Rule 17a-4 require securities brokers and dealers to retain records for certain periods of time. This clearly affects the design and implementation of a digital identity infrastructure in affected industries.

Another law with indirect effects is Sarbanes-Oxley. Sarbanes-Oxley applies to public companies and, among other things, requires annual reports on the effectiveness of internal controls and procedures. Identity management issues like directories and access control have a direct impact on internal controls and procedures. Consequently, public companies need to design and implement their digital identity infrastructure not only to comply with Sarbanes-Oxley requirements, but also to minimize the cost of compliance where possible.

In addition to national laws, in the U.S. at least, many state and local governments have adopted various laws and regulations that you may be obligated to obey. These can be especially troublesome, because your organization may operate in multiple jurisdictions, each with widely different expectations.

Determining what laws and regulations affect your identity management strategy, and what to do about them, is impossible if you attempt to manage identity the way most IT departments have traditionally managed security. The issues are business issues and require business input. As an example, consider Sarbanes-Oxley. The Audit Committee of your board of directors will determine the ground rules for how your company is going to comply with Sarbanes-Oxley. Failing to understand their directives and to ensure that they’re met would be a career-limiting act.

We’ll spend the final part of this book discussing how you can develop an identity management architecture that ensures business drivers are used to shape your digital identity infrastructure.

Privacy Audits

Chief Privacy Officers and others concerned with privacy in an organization worry about what they don’t know. It’s not the data you know about that will get you in trouble, but the data you don’t. In Chapter 16, we’ll discuss resource mapping and specifically talk about how to create inventories of the data in your organization. Having these data maps is the first step to being able to perform privacy audits. Here are some of the privacy-related questions you might ask about the identity data in your organization:

  • What kinds of identity data are you collecting?

  • How is this identity data collected?

  • Why was the identity data collected?

  • Were special conditions on its use established at any time?

  • Who is the data owner?

  • Who is the custodian?

  • Who uses the data, why, and how do they usually access it (e.g., remotely, via the Web, from home)?

  • Where is it stored?

  • Is any of the data stored on devices that are routinely transported off-site, such as a laptop or PDA?

  • Are there backups? If so, you need to answer these same questions about the backups.

  • Are there access logs for the data?

  • Where are the logs stored?

  • Are the logs protected?

  • What other security measures (firewalls, intrusion detection systems, and so on) are used to protect the data?

Conducting privacy audits and collecting all of this information may seem like a lot of work, but ask yourself what it means if you don’t know the answers to these questions. There’s good news and bad news. The good news is that data maps are useful for more than just privacy, so you can balance the cost and effort with other benefits. The bad news is that it’s hard to get anyone very excited about data. Applications are the stars of the IT world. In Chapter 16, we’ll cover a strategy for moving your organization toward having a better understanding of what data it owns and getting answers to the preceding list of questions.
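
To make these questions actionable, it helps to record the answers for each data source in a structured data-map entry rather than in prose. Here is a minimal sketch of such a record in Python; the class name, fields, and example values are illustrative assumptions rather than a prescribed schema.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DataInventoryRecord:
        """One data-map entry, capturing answers to the audit questions above."""
        data_kind: str                # what identity data is held
        collection_method: str        # how it was collected
        purpose: str                  # why it was collected
        usage_conditions: List[str]   # special conditions placed on its use
        owner: str                    # business owner of the data
        custodian: str                # who administers the systems holding it
        storage_location: str         # where it is stored
        on_portable_devices: bool     # ever stored on laptops, PDAs, etc.?
        backed_up: bool               # backups need these same answers
        access_logged: bool           # are access logs kept?
        log_location: str = ""        # where the logs are stored and protected
        safeguards: List[str] = field(default_factory=list)  # firewalls, IDS, etc.

    # A hypothetical entry for one data source
    shipping_addresses = DataInventoryRecord(
        data_kind="customer shipping address",
        collection_method="checkout form",
        purpose="order fulfillment",
        usage_conditions=["no reuse for marketing without opt-in"],
        owner="VP of Sales",
        custodian="IT operations",
        storage_location="orders database",
        on_portable_devices=False,
        backed_up=True,
        access_logged=True,
        log_location="central log server",
        safeguards=["firewall", "database access controls"],
    )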

Privacy Policy Capitalism

When we view the exchange of identity information through the lens of a transaction where the customer perceives some benefit and thus parts with bits of identifying information in consideration for that benefit, privacy policies take on a new feel. Many companies view their privacy policy as something they have to do to keep their customers from being angry with them, because their industry demands it, or because someone convinced the CEO or CIO that she’d be liable if the company didn’t have one. All of these may be true statements, but they’re only ancillary to the real reason for a privacy policy: your privacy policy represents the terms of service you’re offering for whatever benefit the customer perceives.

For example, say you’re an online merchant. You collect identity information from your customers at various stages of a transaction, and the customer receives some benefit. At the most basic level, whenever a customer visits, you set a cookie in his browser so that your shopping cart works. Cookies are a way of maintaining program state across HTTP, an otherwise stateless protocol. In addition to making the shopping cart work, you realize that you can use the cookie to recognize the customer the next time he returns and even to track his shopping habits. When the customer buys something, you collect personal information, such as his name, address, and credit card number, and can link that to the cookie as you create a customer profile.
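
To make the mechanics concrete, here is a brief sketch of how a cookie carries shopping-cart state across otherwise stateless HTTP requests, and how a purchase links that cookie to personal information. The example uses Flask purely for illustration; the routes, form fields, and in-memory stores are hypothetical stand-ins for the merchant’s real system.

    import uuid
    from flask import Flask, request, make_response

    app = Flask(__name__)
    carts = {}     # session_id -> list of items in the cart
    profiles = {}  # session_id -> personal data collected at purchase

    @app.route("/add-to-cart/<item>")
    def add_to_cart(item):
        # Reuse the visitor's existing cookie, or mint a new opaque ID.
        session_id = request.cookies.get("session_id") or uuid.uuid4().hex
        carts.setdefault(session_id, []).append(item)
        resp = make_response(f"{len(carts[session_id])} item(s) in cart")
        # The cookie is what carries state from one HTTP request to the next.
        resp.set_cookie("session_id", session_id, httponly=True)
        return resp

    @app.route("/checkout", methods=["POST"])
    def checkout():
        session_id = request.cookies.get("session_id", "")
        # Linking purchase details to the cookie turns an anonymous visitor
        # into a profiled customer -- exactly what the policy must disclose.
        profiles[session_id] = {
            "name": request.form.get("name"),
            "address": request.form.get("address"),
            "items": carts.get(session_id, []),
        }
        return "order received"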

What should this online merchant’s privacy policy say? First, tell the truth. Tell customers what data you collect, why you collect it, and what you do with it. Be specific. In this example, the merchant might say, in part:

  • We use cookies. Our shopping cart will not work without them.

  • When you make a purchase, your personal information is stored in our system only if you give us permission by clicking the “Save my information” box on the checkout form. When you do this, we can serve you better by automatically filling out some forms for you when you shop.

  • We use cookies to track the shopping habits of our customers. This data is used to make our search tool better and to help us offer a better product selection. The shopping habits of our customers may be released to partners and suppliers in aggregate, but your individual shopping habits will be released to a third party only with your specific permission, obtained in advance.

  • Advertisements appearing on our system may make use of third-party ad response tracking systems that use cookies to track ad click-through and to target those ads to specific customers.

A real privacy policy would be longer, and your lawyers will probably want to fill it with lots of other information. While it’s a good idea to involve lawyers in the process, since the policy is ultimately a term sheet between you and your customers, make sure that it is readable and understandable by your customers, or it won’t do what you need it to do: inform them, in clear language, of the terms of the bargain you’re proposing.

If you approach your privacy policy as a term sheet, with a clear understanding of what each side is giving and getting in the relationship, you and your customers will be happier with the result.

Anonymity and Pseudonymity

One of the questions that your business should understand clearly is what level of identity is needed for which relationships. For many purposes, you need specific, authenticated, and detailed identity information from your partners, suppliers, and employees. You may be able to provide service to customers, however, while they remain anonymous or at least pseudonymous.

True anonymity is not realistic for most online services, since they usually have to maintain at least some kind of user state, and that requires knowing which individual HTTP requests are related. Once you can distinguish one user from another, the user is no longer truly anonymous. This leads to the concept of pseudonymity.

In a pseudonymous system, users are uniquely identified, but other identifying information is not shared. Pseudonymous systems give subjects a unique ID with which attributes, rights, and privileges can be associated. Pseudonymity is a term usually reserved for people, since the unique ID and its associated attributes, rights, and privileges constitute an identity as we’ve defined it. Pseudonymity implies that this identity cannot be tied to other identities that the subject might have without the subject divulging the connection.
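
One common way to implement a pseudonymous system is to derive an opaque identifier from the subject’s real identifier with a keyed hash and attach attributes, rights, and privileges to that pseudonym instead of the real identity. The sketch below illustrates the idea; the key, function, and attribute store are hypothetical, and a production system would choose its scheme with more care.

    import hmac
    import hashlib

    # Secret known only to this system; without it, the pseudonym cannot be
    # linked back to the subject's other identities.
    PSEUDONYM_KEY = b"replace-with-a-real-secret"

    def pseudonym_for(real_identifier: str) -> str:
        """Derive a stable, opaque ID for a subject.

        The same subject always maps to the same pseudonym within this system,
        so attributes, rights, and privileges can be attached to it, while the
        pseudonym itself reveals nothing about the subject's other identities.
        """
        digest = hmac.new(PSEUDONYM_KEY, real_identifier.encode(), hashlib.sha256)
        return digest.hexdigest()[:16]

    # Attributes are keyed by pseudonym, never by the real identifier.
    attributes = {}
    pid = pseudonym_for("alice@example.com")
    attributes[pid] = {"preferred_language": "en", "loyalty_tier": "silver"}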

Businesses should ask, “What identifying information is required?” early in the design process for an online service. I say “businesses” because this is almost always a business decision, not a technology decision. In keeping with our concept of the privacy policy being a term sheet for an identity transaction, each piece of the customer’s identity that is being requested should also be associated with the need for the data as well as the benefit that the customer will receive by giving the data.

This rule is not followed as often as it should be. You’ve probably been to web forms that ask you for more data than you think the company needs to provide the service. You probably resented it. Adding insult to injury, it’s likely that the company collecting the data never made any use of it whatsoever, whether for your benefit or not. Collecting unnecessary data alienates customers and clutters forms, so don’t do it.
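
One lightweight way to enforce this discipline is to record, for every field on a form, the need it serves and the benefit the customer receives, and to flag any field that has neither. The snippet below sketches that check; the fields and justifications are invented for illustration.

    # Each field carries its justification and the customer's benefit;
    # a field with neither has no business being on the form.
    FORM_FIELDS = [
        {"field": "email", "need": "send order confirmation", "benefit": "shipping updates"},
        {"field": "shipping_address", "need": "deliver the order", "benefit": "the order arrives"},
        {"field": "birth_date", "need": None, "benefit": None},  # candidate for removal
    ]

    unjustified = [f["field"] for f in FORM_FIELDS if not (f["need"] and f["benefit"])]
    print("Fields to drop from the form:", unjustified)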

Privacy Principles

You might be asking, what principles can I use to make sure I’m acting in good faith with respect to the personally identifying data of my employees, customers, and partners? The Canadian Personal Information Protection and Electronic Documents Act described in Table 4-1 contains 10 principles that, if modified slightly, can serve as a guide:

Accountability

Your organization is responsible for the personal information under its control and must designate someone who is accountable for complying with these principles.

Identifying purposes

Any project must specify why it is collecting personal information at or before the time it does so.

Consent

The subject’s consent is required for the collection, use, or disclosure of personal information. Exceptions should be documented.

Limiting collection

Projects may collect only the personal information that’s necessary for the purpose they’ve identified, and must collect it by fair and lawful means.

Limiting use, disclosure, and retention

Unless a project has the consent of the subject, or is legally required to do otherwise, projects may use or disclose personal information only for the purposes for which they collected it, and they may retain it only as long as necessary for those purposes.

Accuracy

The subject’s personal information must be accurate, complete, and up to date.

Safeguards

Security safeguards must be employed to protect personal information.

Openness

The project must make its personal information policies and practices known to the people from whom it collects information.

Individual access

Subjects must be able to access personal information about them, and be able to challenge the accuracy and completeness of it. Exceptions should be documented.

Challenging compliance

Subjects must be able to present a challenge about the project’s compliance with the privacy policy to the person that the organization has designated as accountable.

Even though these principles are not the law in the U.S., or even for most industries in Canada, they provide good guidance for how an organization can protect personally identifying information and be fair about the information it collects. If your organization ignores any of these principles, you should ensure that it does so by choice rather than accident, and that the risks are thoroughly explored.
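
Several of these principles, identifying purposes, consent, and limiting retention in particular, map naturally onto fields in the records your systems keep about collected data. The sketch below shows one hypothetical way to capture them; the class, fields, and retention rule are assumptions for illustration, not requirements of the Act.

    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class ConsentRecord:
        subject_id: str
        data_kind: str
        purpose: str          # identifying purposes: stated at or before collection
        collected_on: date
        consent_given: bool   # consent: explicit, or a documented exception
        retention_days: int   # limiting retention: keep only as long as needed

        def must_be_purged(self, today: date) -> bool:
            """True once the data has outlived its stated purpose."""
            return today > self.collected_on + timedelta(days=self.retention_days)

    record = ConsentRecord(
        subject_id="cust-1042",
        data_kind="purchase history",
        purpose="product recommendations",
        collected_on=date(2024, 1, 15),
        consent_given=True,
        retention_days=365,
    )
    print(record.must_be_purged(date.today()))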

Prerequisites

As we’ve seen, understanding the privacy concerns of employees, customers, and so on is an important part of creating a working digital identity infrastructure. In commercial transactions, people must also believe that your organization is capable of keeping its promise. The perceived credibility and technical competence of your organization will determine whether the promises in your privacy policy are believed.

Even big companies run afoul of this. Microsoft, for example, largely abandoned its plans for Microsoft Passport, because users didn’t believe that Microsoft would do what it said it was going to do. eBay and Yahoo! have drawn fire over the years for changing their privacy policies. When your customers don’t believe your privacy policy, they become cynical and see everything in it as an attempt to create a loophole that will allow their personal data to be sold or given away.

Conclusion

Customers have unrealistic expectations about privacy; get over it. In many cases, the data they claim belongs to them is actually data regarding a transaction that you were both party to, and in fact is mutually owned. Nevertheless, it pays to pay attention to privacy. Managing customer expectations about privacy requires being clear about what data is being collected and why. Providing unmistakable value for the data collected will usually resolve customer privacy concerns.
