Images

Privacy

They who would give up an essential liberty for temporary security deserve neither liberty nor security.

—BENJAMIN FRANKLIN

Images

In this chapter, you will learn how to

Images   Examine concepts of privacy

Images   Compare and contrast privacy policies and laws of different jurisdictions

Images   Describe approaches individuals, organizations, and governments have taken to protect privacy

Images   Explain the concept of personally identifiable information (PII)

Images   Describe issues associated with technology and privacy

Privacy can be defined as the power to control what others know about you and what they can do with that information. In the computer age, personal information forms the basis for many decisions, from credit card transactions for purchasing goods to the ability to buy an airplane ticket and fly. Although it is theoretically possible to live an almost anonymous existence today, the price for doing so is high—from higher prices at the grocery store (no frequent shopper discount), to higher credit costs, to challenges with air travel, opening bank accounts, and seeking employment.

Images Anonymity and Pseudonymity

Information is an important item in today’s society. From instant credit, to digital access to a wide range of information via the Internet, to electronic service portals such as e-commerce sites, e-government sites, and so on, our daily lives have become intertwined with privacy issues. Information has become a valuable entity because it is an enabler of many functions. The creation of an information-centric economy is as dramatic a revolution as the adoption of money to act as an economic utility, simplifying bartering. This revolution and reliance on information imbues information with value, creating the need to protect it.

Data retention is the determination of what records require storage and for how long. There are several reasons for retaining data: billing and accounting, contractual obligations, warranties, and local, state, and national government rules are some of the obvious ones. Maintaining data stores for longer than is required is a source of risk, as is not storing the information long enough. Some information, like protected health information (PHI) for workers in some industries or workers who have been exposed to specific hazards, can have very long retention periods.

Images

Privacy is the right to control information about you and what others can do with that information.

Failure to maintain data in a secure state is a retention issue, just as failing to retain it is. In some cases, destruction of data, specifically data subject to legal hold in a legal matter, can result in adverse court findings and sanctions. Legal hold adds significant complexity to data retention efforts because it effectively forces a separate store of the data until the legal issues are resolved; once data is on the legal hold track, its retention clock does not expire. This makes determining, labeling, and maintaining data subject to legal hold an added dimension beyond normal storage times.
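
The interplay between a normal retention period and a legal hold can be illustrated with a short sketch. The following Python example is only a sketch under assumed record fields and retention periods (the Record class, its fields, and the seven-year period are hypothetical, not drawn from any particular regulation).

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Record:
    created: date
    retention_days: int          # normal retention period for this record type
    legal_hold: bool = False     # once set, the normal retention clock no longer applies

    def is_expired(self, today: date) -> bool:
        """A record may be destroyed only if its retention period has passed
        AND it is not subject to legal hold."""
        if self.legal_hold:
            return False         # legal hold overrides normal expiration
        return today > self.created + timedelta(days=self.retention_days)

# Example: a 7-year billing record vs. the same record under legal hold
billing = Record(created=date(2015, 1, 1), retention_days=7 * 365)
held = Record(created=date(2015, 1, 1), retention_days=7 * 365, legal_hold=True)

today = date(2023, 6, 1)
print(billing.is_expired(today))  # True  -> eligible for destruction
print(held.is_expired(today))     # False -> must be preserved until the hold is lifted
```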

Images Data Sensitivity Labeling and Handling

Effective data classification programs include data sensitivity labeling, which enables personnel handling the data to know whether it is sensitive and to understand the levels of protection required. When the data is inside an information-processing system, the protections should be designed into the system. But when the data leaves this cocoon of protection, whether by printing, downloading, or copying, it becomes necessary to ensure continued protection by other means. This is where data labeling assists users in fulfilling their responsibilities. Training to ensure that labeling occurs, and that labels are used and followed, is important for users whose roles involve handling labeled material.

Training plays an important role in ensuring proper data handling and disposal. Personnel are intimately involved in several specific tasks associated with data handling and data destruction/disposal; if properly trained, they can act as a security control. Untrained or inadequately trained personnel will not be a productive security control and, in fact, can be a source of potential compromise.

A key component of IT security is the protection of the information processed and stored on the computer systems and network. Organizations deal with many different types of information, and they need to recognize that not all information is of equal importance or sensitivity. This requires classification of information into various categories, each with its own requirements for its handling. Factors that affect the classification of specific information include its value to the organization (what will be the impact to the organization if it loses this information?), its age, and laws or regulations that govern its protection. The most widely known system of classification of information is that implemented by the U.S. government (including the military), which classifies information into categories such as Confidential, Secret, and Top Secret. Businesses have similar desires to protect information and often use categories such as Publicly Releasable, Proprietary, Company Confidential, and For Internal Use Only. Each policy for the classification of information should describe how it should be protected, who may have access to it, who has the authority to release it and how, and how it should be destroyed. All employees of the organization should be trained in the procedures for handling the information that they are authorized to access.
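
As a simple illustration of how classification labels can drive handling requirements, the following Python sketch maps business-style categories like those above to hypothetical handling rules; the rules shown (encryption at rest, external sharing) are examples only, not a prescribed scheme.

```python
from enum import Enum

class Classification(Enum):
    PUBLIC = 1
    INTERNAL_USE_ONLY = 2
    COMPANY_CONFIDENTIAL = 3
    PROPRIETARY = 4

# Hypothetical handling rules keyed by label; a real policy would also name
# who may access the data, who may release it, and how it is destroyed.
HANDLING = {
    Classification.PUBLIC:               {"encrypt_at_rest": False, "share_externally": True},
    Classification.INTERNAL_USE_ONLY:    {"encrypt_at_rest": False, "share_externally": False},
    Classification.COMPANY_CONFIDENTIAL: {"encrypt_at_rest": True,  "share_externally": False},
    Classification.PROPRIETARY:          {"encrypt_at_rest": True,  "share_externally": False},
}

def may_share_externally(label: Classification) -> bool:
    """Look up the sharing rule attached to a data label."""
    return HANDLING[label]["share_externally"]

print(may_share_externally(Classification.PROPRIETARY))  # False
```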

Confidential

Confidential data is data whose release to unauthorized parties would harm the enterprise. This data should be defined by policy, and that policy should include details on who has the authority to release it.

Private

Private data is data that is marked to alert people that it is not to be shared with other parties, typically because they have no need to see it. Passwords could be considered private. The term private data is usually associated with personal data belonging to a person and less often with corporate entities.

Public

Public data is data that can be seen by the public and has no needed protections with respect to confidentiality. It is important to protect the integrity of public data, lest one communicate incorrect data as being true.

Proprietary

Proprietary data is data that is restricted to a company because of potential competitive use. If a company has data that could be used by a competitor for any particular reason (say, internal costs and pricing data), then it needs to be labeled and handled in a manner to protect it from release to competitors. Proprietary data may be shared with a third party that is not a competitor, but in marking the data, you alert the sharing party that the data is not to be shared further.

Images Data Roles

Multiple personnel are associated with the control and administration of data. These data roles include data owners, stewards, custodians, and users. Each of these has a role in the protection and control of the data. The leadership of this effort is under the auspices of the privacy officer.

Owner

Data requires a data owner. Data ownership roles for all data elements need to be defined in the business. Data ownership is a business function, where the requirements for security, privacy, retention, and other business functions must be established. Not all data requires the same handling restrictions, but all data requires these characteristics to be defined. This is the responsibility of the data owner.

Steward/Custodian

Data custodians or stewards are the parties responsible for the day-to-day caretaking of data. The data owner sets the relevant policies, and the steward or custodian ensures those policies are followed.

Privacy Officer

The privacy officer is the C-level executive who is responsible for privacy issues in the firm. One of the key initiatives run by privacy officers is the drive for data minimization. Storing data that does not have any real business value only increases the odds of disclosure. The privacy officer also plays an important role if information on European customers is involved, because the EU has strict data protection (privacy) rules. The privacy officer is accountable for ensuring that the handling of consumer data from the EU complies with EU regulations.

Images Data Destruction and Media Sanitization

When data is no longer being used, whether it be on old printouts, old systems being discarded, or broken equipment, it is important to destroy the data before losing physical control over the media it is on. Many criminals have learned the value of dumpster diving to discover information that can be used in identity theft, social engineering, and other malicious activities. An organization must concern itself not only with paper trash, but also with the information stored on discarded objects such as computers. Several government organizations have been embarrassed when old computers sold to salvagers proved to contain sensitive documents on their hard drives. It is critical for every organization to have a strong disposal and destruction policy and related procedures. This section covers data destruction and media sanitization methods.

Burning

Burning is considered one of the gold-standard methods of data destruction. Once the storage media is rendered into a form that can be destroyed by fire, the chemical processes of fire are irreversible and render the data lost forever. The typical method is to shred the material, even plastic disks and hard drives (including SSDs), and then put the shredded material in an incinerator and oxidize it back to base chemical forms. When the material is completely combusted, the information that was on it is gone.

Shredding

Shredding is the physical destruction of an item by tearing it into many small pieces, which can then be mixed, making reassembly difficult if not impossible. Important papers should be shredded, and important in this case means anything that might be useful to a potential intruder or dumpster diver. It is amazing what intruders can do with what appear to be innocent pieces of information. Shredders come in all sizes, from little desktop models that can handle a few pages at a time, or a single CD/DVD, to industrial versions that can handle even phone books and multiple discs at the same time. The ultimate in industrial shredders can even shred hard disk drives, metal case and all. Many document destruction companies have larger shredders on trucks that they bring to their clients' locations to do on-site shredding on a regular schedule.

Pulping

Pulping is a process by which paper fibers are suspended in a liquid and recombined into new paper. If you have data records on paper and you shred the paper, the pulping process removes the ink by bleaching and recombines the shredded fibers into new paper, completely destroying the physical layout of the old paper.

Pulverizing

Pulverizing is a physical process of destruction using excessive physical force to break an item into unusable pieces. Pulverizers are used on items like hard disk drives, destroying the platters so that they cannot be reconstructed. A more modern method of pulverizing the data itself is the use of encryption: the data on the drive is encrypted and the key is then destroyed, rendering the data non-recoverable to the limit of the encryption strength. This method has unique advantages of scale; a small business can destroy its own data this way, whereas it would otherwise need expensive equipment or a third party to physically pulverize the few disks it needs to destroy each year.
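
The encryption-based approach described above is often called cryptographic erasure, and a minimal sketch of the idea follows. It assumes the third-party Python cryptography package is available; real implementations encrypt the drive itself and manage the key in hardware rather than in a script.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# 1. Data is written to storage only in encrypted form.
key = Fernet.generate_key()
cipher = Fernet(key)
stored_blob = cipher.encrypt(b"customer ledger: internal costs and pricing")

# 2. "Pulverizing" the data is then just destroying the key.
del key, cipher   # in practice: zeroize the key in the key store or HSM

# 3. Without the key, the ciphertext is unreadable; recovery now depends on
#    breaking the cipher rather than reassembling physical fragments.
print(stored_blob[:20], "... cannot be decrypted once the key is gone")
```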

Degaussing

A safer method for destroying files on magnetic storage devices (i.e., magnetic tape and hard drives) is to destroy the data magnetically, using a strong magnetic field to degauss the media. Degaussing realigns the magnetic particles, removing the organized structure that represented the data. This effectively destroys all data on the media. Several commercial degaussers are available for this purpose.

Purging

Data purging is a term that is commonly used to describe methods that permanently erase and remove data from a storage space. The key phrase is "remove data": unlike deletion, which typically removes only the reference to the data, purging removes the data itself and opens up the storage space for reuse. A circular buffer is a great example of an automatic purge mechanism. It stores a given number of data elements, after which the space is reused. Once a circular buffer that holds 64 MB is full, each new item added to the buffer overwrites the oldest material.
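
The circular-buffer behavior described above can be demonstrated in a few lines of Python; collections.deque with a maxlen provides the same overwrite-the-oldest semantics (the five-element size here is arbitrary).

```python
from collections import deque

# A circular buffer that holds at most 5 elements; once full, each new
# element silently purges (overwrites) the oldest one.
buffer = deque(maxlen=5)

for event in range(1, 9):        # write 8 events into a 5-slot buffer
    buffer.append(event)

print(list(buffer))              # [4, 5, 6, 7, 8] -> events 1-3 were purged
```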

Wiping

Wiping data is the process of rewriting the storage media with a series of patterns of 1’s and 0’s. This is not done once, but is done multiple times to ensure that every trace of the original data has been eliminated. There are data-wiping protocols for various security levels of data, with 3, 7, or even 35 passes. Of particular note are solid-state drives, as these devices use a different storage methodology and require special utilities to ensure that all the sectors are wiped.

Data wiping is non-destructive to the media, unlike pulping and shredding, and this makes it ideal for another purpose. Media sanitization is the clearing of previous data off of a media device before the device is reused. Wiping can be used to sanitize a storage device, making it clean before use. This can be important to remove old trace data that will later show up in free and unused space.
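
A minimal sketch of multi-pass wiping appears below. It overwrites an ordinary file in place with fixed and random patterns before deleting it, and it assumes a conventional magnetic disk and a filesystem that rewrites blocks in place, which, as noted above, is not a safe assumption for SSDs. The function name and file name are hypothetical.

```python
import os

def wipe_file(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents with 0x00, 0xFF, then random bytes,
    flushing to disk after each pass, then delete the file."""
    size = os.path.getsize(path)
    patterns = [b"\x00", b"\xff", None]          # None -> random data
    with open(path, "r+b") as f:
        for i in range(passes):
            pattern = patterns[i % len(patterns)]
            f.seek(0)
            f.write(os.urandom(size) if pattern is None else pattern * size)
            f.flush()
            os.fsync(f.fileno())                 # force the write out to media
    os.remove(path)

# Usage (hypothetical file name):
# wipe_file("old_customer_report.csv", passes=3)
```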

Images

As little information as ZIP code, gender, and date of birth can resolve to a single person.

Collecting PII

PII is by nature sensitive to end users. Loss or compromise of end-user PII can result in financial and other impacts borne by the end user. For this reason, collection of PII should be minimized to what is actually needed. Here are three great questions to ask when determining whether to collect PII:

Images   Do I need each specific data element?

Images   What is my business purpose for each specific element?

Images   Will my customers/end users agree with my rationale for collecting each specific element?

Images Personally Identifiable Information (PII)

When information is about a person, failure to protect it can have specific consequences. Business secrets are protected through trade secret laws, government information is protected through laws concerning national security, and privacy laws protect information associated with people. A set of elements that can lead to the specific identity of a person is referred to as personally identifiable information (PII). By definition, PII can be used to identify a specific individual, even if an entire set is not disclosed.

PII is an essential element of many online transactions, but it can also be misused if disclosed to unauthorized parties. For this reason, it should be protected at all times, by all parties that possess it.

TRUSTe (www.truste.com), an independent trust authority, defines personally identifiable information as any information…

(i) that identifies or can be used to identify, contact, or locate the person to whom such information pertains, or (ii) from which identification or contact information of an individual person can be derived. Personally Identifiable Information includes, but is not limited to: name, address, phone number, fax number, e-mail address, financial profiles, medical profile, social security number, and credit card information.

The concept of PII is used to identify which data elements require a specific level of protection. When records are used individually (not in aggregate form), PII is the concept of connecting a set of data elements to a specific person. If this can be accomplished, then the information is PII and needs specific protections. The U.S. Federal Trade Commission (FTC) has repeatedly ruled that if a firm collects PII, it is responsible for it through the entire lifecycle, from initial collection through use, retirement, and destruction. Only after the PII is destroyed in all forms and locations is the company’s liability for its compromise abated.

Sensitive PII

Some PII is so sensitive to disclosure and resulting misuse that it requires special handling to ensure protection. Data elements such as credit card data, bank account numbers, and government identifiers (social security number, driver’s license number, and so on) require extra levels of protection to prevent harm from misuse. Should these elements be lost or compromised, direct, personal financial damage may occur to the person identified by the data. These elements need special attention when planning data stores and executing business processes associated with PII data, including collection, storage, and destruction.
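
One common safeguard when sensitive PII must appear in business processes is masking it wherever the full value is not needed. The sketch below masks hypothetical record fields; the field names, the test values, and the choice to reveal the last four characters are illustrative only.

```python
def mask(value: str, visible: int = 4) -> str:
    """Replace all but the last `visible` characters with asterisks."""
    return "*" * max(len(value) - visible, 0) + value[-visible:]

record = {
    "name": "Jane Doe",                 # ordinary PII
    "ssn": "123-45-6789",               # sensitive PII (government identifier)
    "card_number": "4111111111111111",  # sensitive PII (payment data)
}

SENSITIVE_FIELDS = {"ssn", "card_number"}

# Mask only the sensitive elements before display or logging.
display = {k: (mask(v) if k in SENSITIVE_FIELDS else v) for k, v in record.items()}
print(display)
# {'name': 'Jane Doe', 'ssn': '*******6789', 'card_number': '************1111'}
```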

Search for Your Own PII

Modern Internet search engines have the ability to catalog tremendous quantities of information and make wide-area searches for specific elements easy. Using your own elements of PII, try searching the Internet and see what is returned on your name, address, phone number, social security number, date of birth, and so forth. For security reasons, be sure to be anonymous when doing this—that is, log out of Google applications before using Google Search, Microsoft/Live applications before using Bing, or Yahoo! applications before using Yahoo! Search. This step may seem minor, but with search records being stored, the last thing you want to do is provide records that can cross-correlate data about yourself. If you find data on yourself, analyze the source and whether or not the data should be publicly accessible.

If the accidental disclosure of user data could cause the user harm, such as discrimination (political, racial, health related, or lifestyle), then the best course of action is to treat the information as sensitive PII.

Notice, Choice, and Consent

Because privacy is defined as the power to control what others know about you and what they can do with this information, and PII represents the core items that should be controlled, communication with the end user concerning privacy is paramount. Privacy policies are presented later in the chapter, but with respect to PII, three words can govern good citizenry when collecting PII. Notice refers to informing the customer that PII will be collected and used and/or stored. Choice refers to the opportunity for the end user to consent to the data collection or to opt out. Consent refers to the positive affirmation by a customer that they have read the notice, understand their choices, and agree to release their PII for the purposes explained to them.
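
In practice, notice, choice, and consent are often captured as a record tied to the collection event. The following Python data structure is a hypothetical sketch of what such a record might hold; the field names and values are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    notice_version: str        # which privacy notice the user was shown (notice)
    purposes: tuple            # disclosed uses of the PII
    opted_in: bool             # the user's choice
    recorded_at: datetime      # when the affirmation (consent) was captured

consent = ConsentRecord(
    user_id="u-1001",
    notice_version="2024-03",
    purposes=("order fulfillment", "warranty service"),
    opted_in=True,
    recorded_at=datetime.now(timezone.utc),
)

# Collection should proceed only when an affirmative, recorded consent exists.
assert consent.opted_in
```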

Images Fair Information Practice Principles (FIPPs)

In the United States, the Federal Trade Commission has a significant role in addressing privacy concerns. The core principles the FTC uses are referred to as the Fair Information Practice Principles (FIPPs). The FIPPs and their components, as detailed in OMB Circular A-130, are as follows:

Images   Access and Amendment Agencies should provide individuals with appropriate access to PII and appropriate opportunity to correct or amend PII.

Images   Accountability Agencies should be accountable for complying with these principles and applicable privacy requirements, and should appropriately monitor, audit, and document compliance. Agencies should also clearly define the roles and responsibilities with respect to PII for all employees and contractors, and should provide appropriate training to all employees and contractors who have access to PII.

Images   Authority Agencies should only create, collect, use, process, store, maintain, disseminate, or disclose PII if they have authority to do so, and should identify this authority in the appropriate notice.

Images   Minimization Agencies should only create, collect, use, process, store, maintain, disseminate, or disclose PII that is directly relevant and necessary to accomplish a legally authorized purpose, and should only maintain PII for as long as is necessary to accomplish the purpose.

Images   Quality and Integrity Agencies should create, collect, use, process, store, maintain, disseminate, or disclose PII with such accuracy, relevance, timeliness, and completeness as is reasonably necessary to ensure fairness to the individual.

Images   Individual Participation Agencies should involve the individual in the process of using PII and, to the extent practicable, seek individual consent for the creation, collection, use, processing, storage, maintenance, dissemination, or disclosure of PII. Agencies should also establish procedures to receive and address individuals’ privacy-related complaints and inquiries.

Images   Purpose Specification and Use Limitation Agencies should provide notice of the specific purpose for which PII is collected and should only use, process, store, maintain, disseminate, or disclose PII for a purpose that is explained in the notice and is compatible with the purpose for which the PII was collected, or that is otherwise legally authorized.

Images   Security Agencies should establish administrative, technical, and physical safeguards to protect PII commensurate with the risk and magnitude of the harm that would result from its unauthorized access, use, modification, loss, destruction, dissemination, or disclosure.

Images   Transparency Agencies should be transparent about information policies and practices with respect to PII, and should provide clear and accessible notice regarding creation, collection, use, processing, storage, maintenance, dissemination, and disclosure of PII.

Images U.S. Privacy Laws

Identity privacy and the establishment of identity theft crimes are governed by the Identity Theft and Assumption Deterrence Act, which makes it a violation of federal law to knowingly use another’s identity. The collection of information necessary to do this is also governed by the Gramm-Leach-Bliley Act (GLBA), which makes it illegal for someone to gather identity information on another person under false pretenses. In the education area, privacy laws have existed for years. See “Family Educational Rights and Privacy Act (FERPA),” later in the chapter.

Major Elements of the Privacy Act

The Privacy Act has numerous required elements and definitions. Among other things, the major elements require federal agencies to do the following:

Images   Publish in the Federal Register a notice of each system of records that it maintains, including information about the type of records maintained, the purposes for which they are used, and the categories of individuals on whom they are maintained.

Images   Maintain only such information about an individual as required by law, or is needed to perform a statutory duty.

Images   Maintain information in a timely, accurate, relevant, secure, and complete form.

Images   Inform individuals about access to PII upon inquiry.

Images   Notify individuals from whom it requests information as to what authorizes it to request the information, whether disclosure is mandatory or voluntary, the purpose for which the information may be used, and penalties for not providing the requested information.

Images   Establish appropriate physical, technical, and administrative safeguards for the information that is collected and used.

Additional elements can be found by examining provisions of the act itself, although it is drafted in legislative form and requires extensive cross-referencing and interpretation.

Two major privacy initiatives followed from the U.S. government: the Privacy Act of 1974 and the Freedom of Information Act of 1966.

Privacy Act of 1974

The Privacy Act of 1974 was an omnibus act designed to affect the entire federal information landscape. This act has many provisions that apply across the entire federal government, with only minor exceptions for national security (classified information), law enforcement, and investigative provisions. This act has been amended numerous times, and you can find current, detailed information at the Electronic Privacy Information Center (EPIC) web site, http://epic.org/privacy/laws/privacy_act.html.

Freedom of Information Act (FOIA)

The Freedom of Information Act (FOIA) of 1966 is one of the most widely used privacy acts in the United States, so much so that its acronym, FOIA (pronounced “foya”), has reached common use. FOIA was designed to enable public access to U.S. government records, and “public” includes the press, which purportedly acts on the public’s behalf and widely uses FOIA to obtain information. FOIA carries a presumption of disclosure; the burden is on the government, not the requesting party, to substantiate why information cannot be released. Upon receiving a written request, agencies of the U.S. government are required to disclose those records, unless they can be lawfully withheld from disclosure under one of nine specific exemptions in FOIA. The right of access is ultimately enforceable through the federal court system. The nine specific exemptions, listed in Section 552 of U.S. Code Title 5, fall within the following general categories:

1.   National security and foreign policy information

2.   Internal personnel rules and practices of an agency

3.   Information specifically exempted by statute

4.   Confidential business information

5.   Inter- or intra-agency communication that is subject to deliberative process, litigation, and other privileges

6.   Information that, if disclosed, would constitute a clearly unwarranted invasion of personal privacy

7.   Law enforcement records that implicate one of a set of enumerated concerns

8.   Agency information from financial institutions

9.   Geological and geophysical information concerning wells

Images

FOIA is frequently used and generates a tremendous amount of work for many federal agencies, resulting in delays to requests. This in itself is a testament to its effectiveness.

Record availability under FOIA is less of an issue than is the backlog of requests. To defray some of the costs associated with record requests, and to prevent numerous trivial requests, agencies are allowed to charge for research time and duplication costs. These costs vary by agency, but are typically nominal, in the range of $8.00 to $45.00 per hour for search/review fees and $.10 to $.35 per page for duplication. Agencies are not allowed to demand that a requester make an advance payment unless the agency estimates that the fee is likely to exceed $250 or the requester previously failed to pay proper fees. For many uses, the first 100 pages are free, and under some circumstances the fees can be waived.
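
The fee arithmetic described above can be illustrated with a short calculation. The rates below are simply values chosen from within the ranges quoted in this section; actual fees vary by agency and requester category.

```python
# Hypothetical FOIA fee estimate using rates within the ranges cited above.
search_hours  = 3
pages         = 250
hourly_rate   = 25.00      # within the $8.00-$45.00/hour range
per_page_rate = 0.15       # within the $0.10-$0.35/page range
free_pages    = 100        # first 100 pages are often free

fee = search_hours * hourly_rate + max(pages - free_pages, 0) * per_page_rate
print(f"Estimated fee: ${fee:.2f}")   # Estimated fee: $97.50
# Below $250, so the agency would not normally require advance payment.
```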

Family Educational Rights and Privacy Act (FERPA)

Student records have significant protections under the Family Educational Rights and Privacy Act of 1974, which includes substantial restrictions on information sharing. FERPA operates on an opt-in basis, as the student must approve the disclosure of information prior to the actual disclosure. FERPA was designed to provide limited control to students over their education records. The law allows students to have access to their education records, an opportunity to seek to have the records amended, and some control over the disclosure of information from the records to third parties. For example, if the parent of a student who is 18 or older inquires about the student’s schedule, grades, or other academic issues, the student has to give permission before the school can communicate with the parent, even if the parent is paying for the education.

FERPA is designed to protect privacy of student information. At the K–12 school level, students are typically too young to have legal standing associated with exercising their rights, so FERPA recognizes the parents as part of the protected party. FERPA provides parents with the right to inspect and review their children’s education records, the right to seek to amend information in the records they believe to be inaccurate, misleading, or an invasion of privacy, and the right to consent to the disclosure of PII from their children’s education records. When a student turns 18 years old or enters a postsecondary institution at any age, these rights under FERPA transfer from the student’s parents to the student.

U.S. Computer Fraud and Abuse Act (CFAA)

The U.S. Computer Fraud and Abuse Act (as amended in 1994, 1996, 2001, and 2008) and privacy laws such as the EU Data Protection Directive have several specific objectives, but one of the main ones is to prevent unauthorized parties from gaining access to information they should not have. Fraudulent access, or even exceeding one’s authorized access, is defined as a crime and can be punished. Although the CFAA is intended for broader purposes, it can be used to protect privacy related to computer records through its enforcement of violations of authorized access.

U.S. Children’s Online Privacy Protection Act (COPPA)

Children lack the mental capacity to make responsible decisions concerning the release of PII. The U.S. Children’s Online Privacy Protection Act of 1998 (COPPA) specifically addresses this privacy issue with respect to children accessing and potentially releasing information on the Internet. Any web site that collects information from children (under the age of 13), even through simple web forms used to allow follow-up communications and so forth, is covered by this law. Before information can be collected and used, parental permission must be obtained. The act requires that sites obtain this permission, post a privacy policy detailing what information is collected from children, and describe how that information will be used.
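
A site subject to COPPA typically enforces this with an age gate before any form collects data. The sketch below is a simplified illustration: the under-13 threshold comes from the statute, but the function and its parameters are hypothetical.

```python
COPPA_AGE_THRESHOLD = 13   # COPPA applies to children under 13

def may_collect_info(age: int, verified_parental_consent: bool) -> bool:
    """Allow collection for users 13 and older; for children under 13,
    require verifiable parental consent first."""
    if age >= COPPA_AGE_THRESHOLD:
        return True
    return verified_parental_consent

print(may_collect_info(age=16, verified_parental_consent=False))  # True
print(may_collect_info(age=10, verified_parental_consent=False))  # False
print(may_collect_info(age=10, verified_parental_consent=True))   # True
```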

Images

Web sites that are collecting information from children under the age of 13 are required to comply with the Children's Online Privacy Protection Act (COPPA). The U.S. FTC provides an informational web site on COPPA and compliance issues at www.coppa.org.

Video Privacy Protection Act (VPPA)

Considered by many privacy advocates to be the strongest U.S. privacy law, the Video Privacy Protection Act of 1988 provides civil remedies against unauthorized disclosure of personal information concerning video tape rentals and, by extension, DVDs and games as well. This is a federal statute, crafted in response to media searches of rental records associated with Judge Bork when he was nominated to the U.S. Supreme Court. Congress, upset with the liberal release of information, reacted with legislation, drafted by Senator Leahy, who noted during the floor debate that new privacy protections are necessary in “an era of interactive television cables, the growth of computer checking and check-out counters, of security systems and telephones, all lodged together in computers....” (S. Rep. No. 100-599, 100th Cong., 2nd Sess. at 6 [1988]).

This statute, civil in nature, provides for civil penalties of up to $2500 per occurrence, as well as other civil remedies. The statute provides the protections by default, thus requiring a video rental company to obtain the renter’s consent to opt out of the protections if the company wants to disclose personal information about rentals. Exemptions exist for issues associated with the normal course of business for the video rental company as well as for responding to warrants, subpoenas, and other legal requests. This law does not supersede state laws, of which there are several.

Many states have enacted laws providing both wider and greater protections than the federal VPPA statute. For example, Connecticut and Maryland laws brand video rental records as confidential, and therefore not subject to sale, while California, Delaware, Iowa, Louisiana, New York, and Rhode Island have adopted state statutes providing protection of privacy with respect to video rental records. Michigan’s video privacy law is as sweeping as its broad super-DMCA state statute. This state law specifically protects records of book purchases, rentals, and borrowing as well as video rentals.

Protected Health Information (PHI)

HIPAA regulations define protected health information (PHI) as “any information, whether oral or recorded in any form or medium” that “[i]s created or received by a health care provider, health plan, public health authority, employer, life insurer, school or university, or health care clearinghouse”; and “[r]elates to the past, present, or future physical or mental health or condition of an individual; the provision of health care to an individual; or the past, present, or future payment for the provision of health care to an individual.”

Health Insurance Portability and Accountability Act (HIPAA)

Medical and health information also has privacy implications, which is why the U.S. Congress enacted the Health Insurance Portability and Accountability Act (HIPAA) of 1996. HIPAA calls for sweeping changes in the way health and medical data is stored, exchanged, and used. From a privacy perspective, HIPAA includes significant restrictions on data transfers to ensure privacy, along with security standards and electronic signature provisions. HIPAA security standards mandate a uniform level of protections regarding all health information that pertains to an individual and is housed or transmitted electronically. The standards mandate safeguards for physical storage, maintenance, transmission, and access to individuals’ health information. HIPAA mandates that organizations that use electronic signatures have to meet standards ensuring information integrity, signer authentication, and nonrepudiation. These standards leave the specification of technical solutions to industry, mandating compliance with the significant levels of protection set out in the rules released by industry.

HIPAA’s language is built on the concepts of protected health information (PHI) and Notice of Privacy Practices (NPP). HIPAA describes “covered entities,” including medical facilities, billing facilities, and insurance (third-party payer) facilities. Patients are to have access to their PHI and an expectation of appropriate privacy and security associated with medical records. HIPAA mandates a series of administrative, technical, and physical security safeguards for information, including elements such as staff training and awareness, and specific levels of safeguards for PHI when in use, stored, or in transit between facilities.

Notice of Privacy Practices

Visit your local doctor’s office, hospital, or clinic and ask for their Notice of Privacy Practices (NPP). This notice to patients details what information will be collected and the uses and safeguards that are applied. These can be fairly lengthy and detailed documents, and in many cases are in a booklet form.

In 2009, as part of the American Recovery and Reinvestment Act of 2009, the Health Information Technology for Economic and Clinical Health Act (HITECH Act) was passed into law. Although the primary purpose of the HITECH Act was to provide stimulus money for the adoption of electronic medical records (EMR) systems at all levels of the healthcare system, it also contained new security and privacy provisions to add teeth to those already in HIPAA. HIPAA protections were confined to the direct medical profession, and did not cover entities such as health information exchanges and other “business associates” engaged in the collection and use of PHI. Under HITECH, business associates will be required to implement the same security safeguards and restrictions on uses and disclosures, to protect individually identifiable health information, as covered entities under HIPAA. It also subjects business associates to the same potential civil and criminal liability for breaches as covered entities. HITECH also specifies that the U.S. Department of Health & Human Services (HHS) is now required to conduct periodic audits of covered entities and business associates.

HIPAA Penalties

HIPAA civil penalties for willful neglect are increased under the HITECH Act. These penalties can extend up to $250,000, and repeat/uncorrected violations can extend up to $1.5 million. Under HIPAA and the HITECH Act, an individual cannot bring a cause of action against a provider. The laws specify that a state attorney general can bring an action on behalf of state residents.

Gramm-Leach-Bliley Act (GLBA)

In the financial arena, GLBA introduced the U.S. consumer to privacy notices, requiring firms to disclose what they collect, how they protect the information, and with whom they will share it. Annual notices are required as well as the option for consumers to opt out of the data sharing. The primary concept behind U.S. privacy laws in the financial arena is that consumers be allowed to opt out. This was strengthened in GLBA to include specific wording and notifications as well as requiring firms to appoint a privacy officer. Most U.S. consumers have witnessed the results of GLBA, every year receiving privacy notices from their banks and credit card companies. These notices are one of the visible effects of GLBA on changing the role of privacy associated with financial information.

California Senate Bill 1386 (SB 1386)

California Senate Bill 1386 (SB 1386) was a landmark law concerning information disclosures. It mandates that Californians be notified whenever PII is lost or disclosed. Since the passage of SB 1386, numerous other states have modeled legislation on this bill, and although national legislation has been blocked by political procedural moves, it will eventually be passed. The current list of U.S. states and territories that require disclosure notices is up to 49, with only Alabama, New Mexico, and South Dakota without bills. Each of these disclosure notice laws is different, making the case for a unifying federal statute compelling, but currently it is low on the priority lists of most politicians.

U.S. Banking Rules and Regulations

Banking has always had an element of PII associated with it, from who has deposits to who has loans. As the scale of operations increased, both in numbers of customers and products, the importance of information for processing grew. Checks became a utility instrument to convey information associated with funds transfer between parties. As a check was basically a promise to pay, in the form of directions to a bank, occasionally the check was not honored and a merchant had to track down the party to demand payment. Thus, it became industry practice to write additional information on a check to assist a firm in later tracking down the drafting party. This information included items such as address, work phone number, a credit card number, and so on. This led to the co-location of information about an individual, and at times that information was used to commit identity theft. To combat this, the U.S. government issued a series of banking and financial regulations prohibiting this form of information collection. Other regulations addressed items such as credit card numbers being printed on receipts, mandating that no more than the last five digits be exposed.

Payment Card Industry Data Security Standard (PCI DSS)

As described in Chapter 24, the major credit card firms, such as MasterCard, Visa, American Express, and Discover, designed a private-sector initiative to deal with privacy issues associated with credit card transaction information. PCI DSS is a standard that provides guidance on what elements of a credit card transaction need protection and the level of expected protection. PCI DSS is not a law, but rather a contractual regulation, enforced through a series of fines and fees associated with performing business in this space. PCI DSS was a reaction to two phenomena: data disclosures and identity theft.

Fair Credit Reporting Act (FCRA)

The Fair Credit Reporting Act of 1970 brought significant privacy protections to consumer credit reporting and imposed obligations on the consumer credit reporting agencies (CRAs). This act requires that the agencies provide consumers notice of their rights and responsibilities. The agencies are required to perform timely investigations of inaccuracies reported by consumers. The agencies are also required to notify the other CRAs when consumers close accounts. The act also addresses technical issues associated with data integrity, data destruction, data retention, and consumer and third-party access to data. The details of FCRA proved to be insufficient with respect to several aspects of identity theft, and in 2003, the Fair and Accurate Credit Transactions Act (FACTA) was passed, modifying and expanding on the privacy and security provisions of FCRA.

FACTA and Credit Card Receipts

One of the provisions of FACTA compels businesses to protect credit card information on receipts. Before FACTA, it was common for receipts to have entire credit card numbers, as well as additional information. Today, receipts can display only the last five digits of the card number and cannot include the card expiration date. These rules went into effect in 2005, and merchants had one year to comply.
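
The receipt rule can be expressed directly in code. The sketch below formats a printed receipt line under the FACTA constraints described above; the card number is a standard test value and the formatting is illustrative.

```python
def receipt_card_line(card_number: str) -> str:
    """Show no more than the last five digits and omit the expiration date,
    as required for electronically printed receipts under FACTA."""
    digits = card_number.replace(" ", "").replace("-", "")
    return "CARD ****" + digits[-5:]

print(receipt_card_line("4111 1111 1111 1111"))   # CARD ****11111
```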

FTC Disposal Rule

The FTC’s Disposal Rule applies to consumer reporting agencies as well as to any individuals and businesses that use consumer reports, such as lenders, insurers, employers, and landlords.

Fair and Accurate Credit Transactions Act (FACTA)

The Fair and Accurate Credit Transactions Act of 2003 was passed to enact stronger protections for consumer information from identity theft, errors, and omissions. FACTA amended portions of FCRA to improve the accuracy of customer records in consumer reporting agencies, to improve timely resolution of consumer complaints concerning inaccuracies, and to make businesses take reasonable steps to protect information that can lead to identity theft.

FACTA also had other “disposal rules” associated with consumer information. FACTA mandates that information that is no longer needed must be properly disposed of, either by burning, pulverizing, or shredding. Any electronic information must be irreversibly destroyed or erased. Should third-party firms be used for disposal, the rules still pertain to the original contracting party, so third parties should be selected with care and monitored for compliance.

Red Flag Rules

The FTC has adopted a set of red flag rules that are invoked to assist entities in determining when extra precautions must be taken concerning PII records. The following are some examples of red flags that should prompt an organization to initiate additional, specific data handling steps to protect data:

Images   Change of address request. This is a common tool for identity thieves, and as such, firms should provide protection steps to verify change of address requests.

Images   Sudden use of an account that has been inactive for a long time, or radical changes in use of any account.

Images   A suspicious address or phone number. Many fraudulent addresses and numbers are known, and repeated applications should be quickly noted and stopped.

Images   Request for credit on a consumer account that has a credit freeze on a credit reporting record.

Additional information is available from the FTC at www.ftc.gov/tips-advice/business-center/guidance/fighting-identity-theft-redflags-rule-how-guide-business.

Whenever a red flag issue occurs, the business must have special procedures in place to ensure that the event is not fraudulent. Calling the customer and verifying information before taking action is one example of this type of additional action.
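
Several of the red flags listed above lend themselves to automated screening before an account change is processed. The following sketch is hypothetical: the field names, the dormancy threshold, and the suspicious-address list are invented for illustration, and a real program would feed its results into the verification procedures just described.

```python
from datetime import date, timedelta

KNOWN_SUSPICIOUS_ADDRESSES = {"123 Fraud Ln", "PO Box 0000"}   # illustrative list

def red_flags(event: dict, today: date) -> list:
    """Return the red flags raised by a single account event."""
    flags = []
    if event.get("type") == "change_of_address":
        flags.append("verify change-of-address request with the customer")
    last_use = event.get("last_account_use")
    if last_use and (today - last_use) > timedelta(days=365):
        flags.append("sudden use of a long-inactive account")
    if event.get("address") in KNOWN_SUSPICIOUS_ADDRESSES:
        flags.append("address matches a known-suspicious address")
    if event.get("type") == "credit_request" and event.get("credit_freeze"):
        flags.append("credit requested on a frozen credit record")
    return flags

event = {"type": "change_of_address",
         "address": "123 Fraud Ln",
         "last_account_use": date(2021, 2, 1)}
print(red_flags(event, today=date(2023, 6, 1)))
# Multiple flags raised -> call the customer and verify before taking action.
```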

Images International Privacy Laws

Privacy is not a U.S.-centric phenomenon, but it does have strong cultural biases. Legal protections for privacy tend to follow the socio-cultural norms by geography; hence, policies in European nations differ from those in the United States. In the United States, the primary path to privacy is via opt-out, whereas in Europe and other countries, it is via opt-in. What this means is that the fundamental nature of control shifts. In the U.S., a consumer must notify a firm that they wish to block the sharing of personal information; otherwise, the firm has permission by default. In the EU, sharing is blocked unless the customer specifically opts in to allow it. The Far East has significantly different cultural norms with respect to individualism vs. collectivism, and this is seen in privacy laws there as well. Even in countries with common borders, distinct differences exist, such as the United States and Canada: Canadian laws and customs have strong roots in UK history, and in many cases follow European ideals rather than U.S. ones. One of the primary sources of intellectual and political thought on privacy has been the Organization for Economic Co-operation and Development (OECD). This multinational entity has for decades conducted multilateral discussions and policy formation on a wide range of topics, including privacy.

OECD Fair Information Practices

OECD Fair Information Practices are the foundational element for many worldwide privacy practices. Dating to 1980, Fair Information Practices are a set of principles and practices that set out how an information-based society may approach information handling, storage, management, and flows with a view toward maintaining fairness, privacy, and security. Members of the OECD recognized that information was a critical resource in a rapidly evolving global technology environment, and that proper handling of this resource was critical for long-term sustainability of growth.

OECD’s Privacy Code

OECD’s privacy code was developed to help “harmonise national privacy legislation and, while upholding such human rights, [to] at the same time prevent interruptions in international flows of data. [The Guidelines] represent a consensus on basic principles which can be built into existing national legislation, or serve as a basis for legislation in those countries which do not yet have it.” (Source: “OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data,” www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.)

European Laws

The EU has developed a comprehensive concept of privacy, which is administered via a set of statutes known as data protection. These privacy statutes cover all personal data, whether collected and used by government or by private firms. These laws are administered by state and national data protection agencies in each country. With the advent of the EU, this common comprehensiveness stands in distinct contrast to the patchwork of laws in the United States.

Privacy laws in Europe are built around the concept that privacy is a fundamental human right that demands protection through government administration. When the EU was formed, many laws were harmonized across the original 15 member nations, and data privacy was among those standardized. The initial harmonization related to privacy was the Data Protection Directive, adopted by EU members, which has a provision allowing the European Commission to block transfers of personal data to any country outside the EU that has been determined to lack adequate data protection policies. The impetus for the EU directive is to establish the regulatory framework to enable the movement of personal data from one country to another, while at the same time ensuring that privacy protection is “adequate” in the country to which the data is sent. This can be seen as a direct result of early United States Department of Health, Education, and Welfare (HEW) task force and OECD directions. If the recipient country has not established a minimum standard of data protection, it is expected that the transfer of data will be prohibited.

Safe Harbor Principles

Safe Harbor was built on seven principles:

Images   Notice   A firm must give notice of what is being collected, how it will be used, and with whom it will be shared.

Images   Choice   A firm must allow the option to opt out of transfer of PII to third parties.

Images   Onward Transfer   All disclosures of PII must be consistent with the previous principles of Notice and Choice.

Images   Security   PII must be secured at all times.

Images   Data Integrity   PII must be maintained accurately and, if incorrect, the customer has the right to correct it.

Images   Access   Individuals must have appropriate and reasonable access to PII for the purposes of verification and correction.

Images   Enforcement   Issues with privacy and PII must have appropriate enforcement provisions to remain effective.

Although no longer considered sufficient, the Safe Harbor principles are still the starting point for building data privacy initiatives.

Encryption and Privacy

Encryption has long been held by governments to be a technology associated with the military. As such, different governments have regulated it in different manners. The U.S. government has greatly reduced controls over encryption in the past decade. Other countries, such as Great Britain, have enacted statutes that compel users to turn over encryption keys when asked by authorities. Countries such as France, Malaysia, and China still tightly control and license end-user use of encryption technologies. The primary driver for Phil Zimmermann to create Pretty Good Privacy (PGP) was the need for privacy in countries where the government was considered a threat to civil liberties.

The differences in approach between the U.S. and the EU with respect to data protection led the EU to issue expressions of concern about the adequacy of data protection in the United States, a move that could have paved the way to the blocking of data transfers. After negotiation, it was determined that U.S. organizations that voluntarily joined an arrangement known as Safe Harbor would be considered adequate in terms of data protection. Safe Harbor is a mechanism for self-regulation that can be enforced through trade practice law via the FTC. A business joining the Safe Harbor Consortium must make commitments to abide by specific guidelines concerning privacy. Safe Harbor members also agree to be governed by certain self-enforced regulatory mechanisms, backed ultimately by FTC action.

Another major difference between U.S. and European regulation lies in where the right of control is exercised. In European directives, the right of control over privacy is balanced in such a way as to favor consumers. Rather than having to pay to opt out, as with unlisted phone numbers in the United States, consumers have such services for free. Rather than users having to opt out at all, the default privacy setting is deemed to be the highest level of data privacy, and users have to opt in to share information. This default setting is a cornerstone of the European Union’s Directive on Protection of Personal Data and is enforced through national laws in all member nations.

General Data Protection Regulation (GDPR)

Two factors led to what can only be seen as a complete rewrite of EU data protection regulations. In light of the Snowden revelations, the EU began a new round of examining data protection when shared with the U.S. and others. This brought Safe Harbor provisions into the spotlight as the EU wanted to renegotiate stronger protections. Then, the European Court of Justice invalidated the Safe Harbor provisions. This led the way to the passage of the General Data Protection Regulation (GDPR), which goes into effect in May of 2018.

The GDPR ushers in a brand-new world with respect to data protection and privacy. Global trade is important to all countries, and trade rests on information transfers, including transfers of personal data, so the ability to move such data between parties is essential. Enshrined in the Charter of Fundamental Rights of the EU is the fundamental right to the protection of personal data, including when such data elements are transferred outside the EU. Recognizing that, the new set of regulations is more expansive and restrictive, making the Safe Harbor provisions obsolete. Any firm that wishes to trade with the EU now faces a set of privacy regulations that requires specific programs to address its requirements.

The GDPR brings many changes, one being the appointment of a Data Protection Officer (DPO). This role may be filled by an employee or a third-party service provider (for example, consulting or law firm), and it must be a direct report to the highest management level. The DPO should operate with significant independence, and provisions in the GDPR restrict control over the DPO by management.

GDPR

The GDPR will require significant consideration, including the following:

Images   Assess personal data flows from the EU to the U.S. to define the scale and scope of the cross-border privacy-compliance challenge.

Images   Assess readiness to meet model clauses, remediate gaps, and organize audit artifacts of compliance with the clauses.

Images   Update privacy programs to ensure they are capable of passing an EU regulator audit.

Images   Conduct EU data-breach notification stress tests.

Images   Monitor changes in EU support for model contracts and binding corporate rules.

The GDPR specifies requirements regarding consent, and they are significantly more robust than previous regulations. Consent requirements are also delineated for specific circumstances:

Images   Informed/affirmative consent to data processing. Specifically, “a statement or a clear affirmative action” from the data subject must be “freely given, specific, informed and unambiguous.”

Images   Explicit consent to process special categories of data. Explicit consent is required for “special categories” of data, such as genetic data, biometric data, and data concerning sexual orientation.

Images   Explicit parental consent for children’s personal data.

Images   Consent must be specific to each data-processing operation and the data subject can withdraw consent at any time.

The GDPR provides protections for new individual rights, and these may force firms to adopt new policies to address these requirements. The rights include the Right to Information, Right to Access, Right to Rectification, Right to Restrict Processing, Right to Object, Right to Erasure, and Right to Data Portability. Each of these rights is clearly defined with technical specifics in the GDPR. The GDPR also recognizes the risks of international data transfer to other parties, and has added specific requirements that data protection issues be addressed by means of appropriate safeguards, including Binding Corporate Rules (BCRs), Model Contract Clauses (MCCs), also known as Standard Contractual Clauses (SCCs), and legally binding documents. These instruments must be enforceable between public authorities or bodies, as well as all who handle data.
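
The consent requirements above can be modeled as checks applied per data-processing operation. The sketch below is a simplified, hypothetical model: the special-category list is abbreviated and the class and field names are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional
from datetime import datetime

SPECIAL_CATEGORIES = {"genetic", "biometric", "sexual_orientation"}  # abbreviated list

@dataclass
class Consent:
    purpose: str                    # consent is specific to one processing operation
    affirmative: bool               # "a clear affirmative action", not a pre-ticked box
    explicit: bool = False          # needed for special categories of data
    parental: bool = False          # needed for a child's personal data
    withdrawn_at: Optional[datetime] = None

def may_process(consent: Consent, purpose: str, data_category: str, is_child: bool) -> bool:
    if consent.withdrawn_at is not None:        # consent can be withdrawn at any time
        return False
    if consent.purpose != purpose or not consent.affirmative:
        return False
    if data_category in SPECIAL_CATEGORIES and not consent.explicit:
        return False
    if is_child and not consent.parental:
        return False
    return True

c = Consent(purpose="appointment reminders", affirmative=True)
print(may_process(c, "appointment reminders", "contact_details", is_child=False))  # True
print(may_process(c, "marketing", "contact_details", is_child=False))              # False
```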

Canadian Law

Like many European countries, Canada has a centralized form of privacy legislation that applies to every organization that collects, uses, or discloses personal information, including information about employees. These regulations stem from the Personal Information Protection and Electronic Documents Act (PIPEDA), which requires that personal information be collected and used only for appropriate purposes. Individuals must be notified as to why the information is requested and how it will be used. The act has safeguards associated with storage, use, reuse, and retention.

To ensure leadership in the field of privacy issues, Canada has a national-level privacy commissioner, and each province has a provincial privacy commissioner. These commissioners act as advocates on behalf of individuals and have used legal actions to enforce the privacy provisions associated with PIPEDA to protect personal information.

Asian Laws

Japan has the Personal Information Protection Law, which requires protection of personal information used by the Japanese government, third parties, and the public sector. The Japanese law has provisions where the government entity must specify the purpose for which information is being collected, specify the safeguards applied, and, when permitted, discontinue use of the information upon request.

Hong Kong has an office of the Privacy Commissioner for Personal Data (PCPD), a statutory body entrusted with protecting the personal data privacy of individuals and ensuring compliance with the Personal Data (Privacy) Ordinance in Hong Kong. One main task of the Commissioner is public education, creating greater awareness of privacy issues and of the need to comply with the Ordinance.

China has had a long reputation for poor privacy practices. Some of this comes from the cultural bias toward collectivism, and some comes from the long-standing government tradition of surveillance. News of the Chinese government eavesdropping on Skype and other Internet-related communications has heightened this concern. China’s constitution has provisions for privacy protections for its citizens. Even so, enforcement and penalties have been inconsistent, and judicial treatment of privacy matters has been far from uniform.

Images Privacy-Enhancing Technologies

One principal connection between information security and privacy is that without information security, you cannot have privacy. If privacy is defined as the ability to control information about oneself, then the aspects of confidentiality, integrity, and availability from information security become critical elements of privacy. Just as technology has enabled many privacy-impacting issues, technology also offers the means in many cases to protect privacy. An application or tool that assists in such protection is called a privacy-enhancing technology (PET).

Encryption is at the top of the list of PETs for protecting privacy and anonymity. As noted earlier, one of the driving factors behind Phil Zimmermann's invention of PGP was the desire to enable people living in repressive cultures to communicate safely and freely. Encryption can keep secrets secret, and is a prime choice for protecting information at any stage in its lifecycle. The development of Tor routing to permit anonymous communications, coupled with high-assurance, low-cost cryptography, has made many web interactions securable and safe from eavesdropping.
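
To make this concrete, the following minimal sketch shows encryption acting as a PET for a single piece of PII. It assumes the widely used third-party cryptography package (an illustrative choice, not something the chapter mandates), and the data and key handling are deliberately simplified.

from cryptography.fernet import Fernet

# Generate a key; whoever holds this key controls access to the data.
key = Fernet.generate_key()
cipher = Fernet(key)

# A piece of PII that should not be readable if intercepted or leaked.
pii = b"ship-to: 42 Main St, Springfield"

token = cipher.encrypt(pii)      # opaque ciphertext, safe to store or transmit
print(token)                     # reveals nothing about the underlying data

print(cipher.decrypt(token))     # only a key holder can recover the plaintext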

Other PETs include small application programs called cookie cutters that are designed to prevent the transfer of cookies between browsers and web servers. Some cookie cutters block all cookies, while others can be configured to selectively block certain cookies. Some cookie cutters also block the sending of HTTP headers that might reveal personal information but might not be necessary to access a web site, as well as block banner ads, pop-up windows, animated graphics, or other unwanted web elements. Some related PET tools are designed specifically to look for invisible images that set cookies (called web beacons or web bugs). Other PETs are available to PC users, including encryption programs that allow users to encrypt and protect their own data, even on USB keys.
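
As a rough illustration of what such a tool looks for, the short sketch below scans HTML for images declared as 1x1 pixels, a common web beacon signature. It uses only the Python standard library; the page content and tracker URL are invented for the example, and real PETs inspect far more than image dimensions.

from html.parser import HTMLParser

class BeaconFinder(HTMLParser):
    """Collects the src of any image declared as a 1x1 pixel."""
    def __init__(self):
        super().__init__()
        self.beacons = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        if a.get("width") == "1" and a.get("height") == "1":
            self.beacons.append(a.get("src"))

page = ('<html><body><p>Welcome back!</p>'
        '<img src="https://tracker.example/pixel.gif" width="1" height="1">'
        '</body></html>')

finder = BeaconFinder()
finder.feed(page)
print(finder.beacons)   # ['https://tracker.example/pixel.gif']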

Images Privacy Policies

One of the direct outcomes of the legal statutes associated with privacy has been the development of a need for corporate privacy policies associated with data collection. With a myriad of government agencies involved, each with a specific mandate to “assist” in the protection effort associated with PII, one can ask, what is the best path for an industry member? If your organization needs PII to perform its tasks, obtaining and using it is fine in most cases, but you must ensure that everyone in the organization complies with the laws, rules, and regulations associated with these government agencies. Policies and procedures are the best way to ensure uniform compliance across an organization. The development of a privacy policy is an essential foundational element of a company’s privacy stance.

Privacy Compliance Steps

To ensure that an organization complies with the numerous privacy requirements and regulations, a structured approach to privacy planning and policies is recommended:

1.   Identify the role in the organization that will be responsible for compliance and oversight.

2.   Document all applicable laws and regulations, industry standards, and contract requirements.

3.   Identify any industry best practices.

4.   Perform a privacy impact assessment (PIA) and a risk assessment.

5.   Map the identified risks to compliance requirements.

6.   Create a unified risk mitigation plan.

Privacy Impact Assessment

A privacy impact assessment (PIA) is a structured approach to determining the gap between desired privacy performance and actual privacy performance. A PIA is an analysis of how PII is handled through business processes and an assessment of risks to the PII during storage, use, and communication. A PIA provides a means to assess the effectiveness of a process relative to compliance requirements and identify issues that need to be addressed. A PIA is structured with a series of defined steps to ensure a comprehensive review of privacy provisions.

The following steps comprise a high-level methodology and approach for conducting a PIA:

1.   Establish PIA scope. Determine the departments involved and the appropriate representatives. Determine which applications and business processes need to be assessed. Determine applicable laws and regulations associated with the business and privacy concerns.

2.   Identify key stakeholders. Identify all business units that use PII. Examine staff functions such as HR, Legal, IT, Purchasing, and Quality Control.

3.   Document all contact with PII:

Images   PII collection, access, use, sharing, disposal

Images   Processes and procedures, policies, safeguards, data-flow diagrams, and any other risk assessment data

Images   Web site policies, contracts, and HR and administrative sources of other PII

4.   Review legal and regulatory requirements, including any upstream contracts. The sources are many, but some commonly overlooked issues are agreements with suppliers and customers over information sharing rights.

5.   Document gaps and potential issues between requirements and practices. All gaps and issues should be mapped against where the issue was discovered and the basis (requirement or regulation) that the gap maps to.

6.   Review findings with key stakeholders to determine accuracy and clarify any issues. Before the final report is written, any issues or possible miscommunications should be clarified with the appropriate stakeholders to ensure a fair and accurate report.

7.   Create a final report for management.

Images Web Privacy Issues

The Internet acts as a large information-sharing domain, and as such can be a conduit for the transfer of information among many parties. The Web enables extensive communication between machines, people, and systems, and this exchange of information can raise privacy concerns, depending on the content of the information and the reason for the exchange.

Cookies

Cookies are small bits of text that are stored on a user’s machine and sent to specific web sites when the user visits these sites. Cookies can store many different things, from tokens that provide a reference to a database server behind the web server to assist in maintaining state through an application, to the contents of a shopping cart. Cookies can also hold data directly, in which case there are possible privacy implications. When a cookie holds a token number that is meaningless to outsiders but meaningful to a back-end server, then the loss of the cookie represents no loss at all. When the cookie text contains meaningful information, then the loss can result in privacy issues. For instance, when a cookie contains a long number that has no meaning except to the database server, then the number has no PII. But if the cookie contains text, such as a ship-to address for an order, this can represent PII and can result in a privacy violation. It is common to encode the data in cookies, but Base64 encoding is not encryption and can be decoded by anyone, thus providing no confidentiality.
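
The following small sketch illustrates why encoding is not protection. The cookie contents here are hypothetical, but the point holds for any Base64-encoded value: anyone who obtains the cookie can decode it with a single standard-library call.

import base64

# What a site might (incorrectly) place directly in a cookie:
plaintext = "ship-to=42 Main St, Springfield"

cookie_value = base64.b64encode(plaintext.encode()).decode()
print(cookie_value)   # looks opaque, but it is only encoded, not encrypted

# Anyone holding the cookie recovers the PII trivially:
print(base64.b64decode(cookie_value).decode())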

Cookies provide the useful service of allowing state to be maintained in the stateless process of web serving (see “Cookies” in Chapter 17). But because of the potential for PII leakage, many users have sworn off cookies. This leads to issues on numerous web sites, because when properly implemented, cookies pose no privacy danger and can greatly enhance web site usefulness.

The bottom line for cookies is fairly clear: Done correctly, they do not represent a security or privacy issue. Done incorrectly, they can be a disaster. A simple rule solves most problems with cookies: never store data directly in a cookie; instead, store only a reference value that a back-end web application can use to look up the data and take the correct actions.
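
A minimal sketch of that rule follows. The names (SESSIONS, create_session, load_session) are illustrative, not part of any particular web framework; the idea is simply that the cookie carries a random token while the PII stays server side.

import secrets

SESSIONS = {}   # server-side store; in practice a database or cache with expiry

def create_session(order):
    token = secrets.token_urlsafe(32)   # meaningless to anyone who steals the cookie
    SESSIONS[token] = order             # the ship-to address never leaves the server
    return token                        # this value goes into the Set-Cookie header

def load_session(token):
    return SESSIONS.get(token)          # unknown or expired tokens resolve to nothing

cookie = create_session({"ship_to": "42 Main St, Springfield"})
print(cookie)                # a random string only; nothing to leak
print(load_session(cookie))  # the server maps it back to the order data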

Images Privacy in Practice

With privacy being defined as the power to control what others know about you and what they can do with that information, there remains the question of what you can do to exercise that control. Information is needed to obtain services, and in many cases the information is reused, often for additional and secondary purposes. Users agree to these uses through acceptance of a firm’s privacy policy.

Shared information still requires control, and in this case the control function has shifted to the party that obtained the information. That party may store the information for future use, for record-keeping purposes, or for other uses. If it fails to adequately protect the information from loss or disclosure, the information can end up being used in ways the owner never authorized. Data disclosures and information thefts both result in unauthorized use of information. Users can take actions both to protect their information and to mitigate the risk from unauthorized sharing and use of their information.

User Actions

Users have to share information for a variety of legitimate purposes. Information has value, both to the authorized user and to those who would steal the information and use it for unauthorized purposes. If users are going to control their information, they have to take certain precautions. This is where security and privacy intersect at an operational level. Security functionality enables control and thus enables privacy functionality.

One aspect of maintaining control over information is applying the proper security precautions presented throughout this book, so they will not be repeated here. A second level of actions can be employed by users to keep track of how their information is being used. The value of information is in its use, and in many cases this use can be tracked. The two main types of information that have immediate value are financial and medical. Financial information, such as credit card information, identity information, and banking information, can be used by criminals to steal from others. Many times the use of identity or financial information will show up on the systems of record associated with the information. This is why it is important to actually read bank statements and verify charges.

Images

Users should periodically, at least annually, request copies of their credit bureau reports and examine them for unauthorized activity. Likewise, users should periodically verify activity with their healthcare insurers, looking for unauthorized claims as well. These checks do not take much time and provide a means to prevent long-term misuse of a stolen identity.

In the same vein, one should periodically examine one's credit report, looking for unauthorized credit requests or accounts. Periodic checks of healthcare insurance accounts and statements are essential for the same reason. Don't shred unopened envelopes from the insurance company just because you have paid all your copays. If someone else is using your information and you fail to alert the insurance company to the misuse, the fraudulent activity can continue unchallenged under your name.

Images

Data breaches continue to plague firms. Here are some recent major breaches and the number of records they affected:

Images   Equifax 143,000,000 records

Images   Friend Finder Network 412,000,000 records

Images   River City Media 1,370,000,000 records

Images   Spambot 700,000,000 records

Images   Philippine Commission on Elections 55,000,000 records

Images   Uber 57,000,000 records

There are many additional breaches, varying in size and in data sensitivity. While large numbers of e-mail addresses capture the headlines, the release of all Swedish car registrations went largely unnoticed because of the comparatively small number of cars in Sweden, yet the impact for Swedes could be significant. For further reference and additional information, see http://www.informationisbeautiful.net/visualizations/worlds-biggest-data-breaches-hacks/.

Data Breaches

When a company loses data that it has stored on its network, the term used is data breach. Data breaches have become an almost daily news item, and the result is that people are becoming desensitized to their occurrence. A data breach acts as a notification that security efforts have failed. Verizon regularly publishes a Data Breach Investigations Report, examining the root causes behind hundreds of breach events. In the 2017 report, Verizon found that nine out of ten breaches can be described by the following distinct patterns:

Images   Point-of-sale (POS) intrusions

Images   Web app attacks

Images   Insider and privilege misuse

Images   Physical theft and loss

Images   Miscellaneous errors (misdelivery, misconfiguration, user errors)

Images   Crimeware

Images   Payment card skimmers

Images   Denial of service

Images   Cyber-espionage

In 2017, over 42,000 security incidents were analyzed, with 1,935 confirmed data breaches across 84 countries. While the Verizon report is considered the gold standard in breach analysis, it has received some flak in recent years for not covering the industrial control system (ICS) attacks that have been noted in recent years. To get this information, one must use data from the ICS-CERT, which is part of the US-CERT.

Images For More Information

Rebecca Herold, Privacy Professor

Monthly Privacy Professor tips www.privacyguidance.com/eTips.html

Blog www.privacyguidance.com/blog/

Videos www.privacyguidance.com/eMy_Videos.html

Data Breaches

Information is Beautiful (visualizations) www.informationisbeautiful.net/visualizations/worlds-biggest-data-breaches-hacks/

Verizon data breach investigations report www.verizonenterprise.com/DBIR

Chapter 25 Review

Images   Chapter Summary


After reading this chapter and completing the exercises, you should understand the following aspects of privacy.

Examine concepts of privacy

Images   Privacy is the power to control what others know about you and what they can do with that information.

Images   The concept of privacy does not translate directly to information about a business because it is not about a person.

Compare and contrast privacy policies and laws of different jurisdictions

Images   Numerous U.S. federal statutes have privacy provisions, including FERPA, VPPA, GLBA, HIPAA, and so on.

Images   The number of state and local laws that address privacy issues is limited.

Images   A wide array of international laws address privacy issues, including those of the EU, Canada, and other nations.

Describe approaches individuals, organizations, and governments have taken to protect privacy

Images   Policies drive corporate actions, and privacy policies are required by several statutes and are essential to ensure compliance with the myriad of mandated actions.

Images   Cookies represent a useful tool to maintain state when surfing the Web, but if used incorrectly, they can represent a security and privacy risk.

Images   Data sensitivity labels are used to identify the sensitivity level of data so that it can be handled and protected appropriately.

Images   Assignment of duties to data owners, stewards/custodians, and privacy officers is done by management.

Describe issues associated with technology and privacy

Images   A direct relationship exists between information security and privacy—one cannot have privacy without security.

Images   Privacy-enhancing technologies (PETs) are used in the technological battle to preserve anonymity and privacy.

Explain the concept of personally identifiable information (PII)

Images   Specific constituent elements of PII need to be protected.

Images   Corporate responsibilities associated with PII include the need to protect PII appropriately when in storage, use, or transmission.

Images   Key Terms


choice (819)

consent (819)

cookie cutters (831)

cookies (833)

data custodian (815)

data owner (815)

data protection (827)

data retention (813)

data roles (815)

data sensitivity labeling (813)

data steward (815)

Disposal Rule (826)

Fair Information Practice Principles (FIPPs) (819)

Freedom of Information Act (FOIA) (821)

General Data Protection Regulation (GDPR) (829)

Health Insurance Portability and Accountability Act (HIPAA) (823)

identity theft (825)

notice (819)

Notice of Privacy Practices (NPP) (824)

opt-in (827)

opt-out (827)

Personal Information Protection and Electronic Documents Act (PIPEDA) (830)

personally identifiable information (PII) (817)

privacy (812)

Privacy Act of 1974 (821)

privacy-enhancing technology (PET) (831)

privacy impact assessment (PIA) (832)

privacy officer (815)

privacy policy (832)

protected health information (PHI) (824)

pulping (816)

pulverizing (816)

purging (817)

red flag (826)

red flag rules (826)

Images   Key Terms


Use terms from the Key Terms list to complete the sentences that follow. Don’t use the same term more than once. Not all terms will be used.

1.   In the United States, the standard methodology for consumers with respect to privacy is to _______________, whereas in the EU it is to ______________.

2.   _______________ is the right to control information about oneself.

3.   The FTC mandates firms’ use of _______________ procedures to identify instances where additional privacy measures are warranted.

4.   The new set of privacy rules and regulations in the EU are referred to as the _______________ .

5.   Data that can be used to identify a specific individual is referred to as _______________.

6.   Programs used to control the use of ___________ during web browsing are referred to as _________.

7.   The major U.S. privacy statutes are the ____________ and the _______________.

8.   Medical information in the United States is protected via the _______________.

9.   Many privacy regulations have specified that firms provide an annual _______________ to customers.

10.   To evaluate the privacy risks in a firm, a(n) _______________ can be performed.

Images   Multiple-Choice Quiz


1.   HIPAA requires the following controls for medical records:

A.   Encryption of all data

B.   Technical safeguards

C.   Physical controls

D.   Administrative, technical, and physical controls

2.   Which of the following is not PII?

A.   Customer name

B.   Customer ID number

C.   Customer social security number or taxpayer identification number

D.   Customer birth date

3.   A privacy impact assessment:

A.   Determines the gap between a company’s privacy practices and required actions

B.   Determines the damage caused by a breach of privacy

C.   Determines what companies hold information on a specific person

D.   Is a corporate procedure to safeguard PII

4.   Which of the following should trigger a response under the red flag rule?

A.   All credit requests for people under 25 or over 75

B.   Any new customer credit request, except for name changes due to marriage

C.   Request for credit from a customer who has a history of late payments and poor credit

D.   Request for credit from a customer with a credit freeze on their credit reporting record

5.   Which of the following is an acceptable PII disposal procedure?

A.   Shredding

B.   Burning

C.   Electronic destruction per military data destruction standards

D.   All of the above

6.   Key elements of GDPR include:

A.   Conduct EU data-breach notification stress tests

B.   Appoint a Data Protection Officer reporting directly to top-level management of the firm

C.   Right to Erasure

D.   All of the above

7.   European privacy laws are built upon:

A.   General Data Protection Regulation

B.   Personal Information Protection and Electronic Documents Act (PIPEDA)

C.   Safe Harbor principles

D.   Common law practices

8.   In the United States, company responses to data disclosures of PII are regulated by which of the following?

A.   Federal law, the Privacy Act

B.   A series of state statutes

C.   Contractual agreements with banks and credit card processors

D.   The Gramm-Leach-Bliley Act (GLBA)

9.   What is/are the primary factor(s) behind data-sharing compliance between U.S. and European companies?

A.   U.S. firms adopting provisions of the GDPR

B.   Safe Harbor provisions

C.   U.S. FTC enforcement actions

D.   All of the above

10.   Privacy is defined as:

A.   One’s ability to control information about oneself

B.   Being able to keep one’s information secret

C.   Making data-sharing illegal without consumer consent

D.   Something that is outmoded in the Internet age

Images   Essay Quiz


1.   Privacy and technology often clash, especially when technology allows data collection that has secondary uses. In the case of automotive technology, black boxes to collect operational data are being installed in new cars in the United States. What are the privacy implications, and what protections exist?

2.   Privacy policies are found all over the Web. Pick three web sites with privacy policies and compare and contrast them. What do they include and what is missing?

3.   The EU has dramatically changed its privacy infrastructure and requirements as a result of several events, including court cases, the Snowden revelations, and government activism. Examine the new world of data privacy regulations under the GDPR and then compare and contrast this to both the U.S. systems and the previous EU system.

Lab Projects

   Lab Project 25.1

Privacy-enhancing technologies can do much to protect a user’s information and/or maintain anonymity when using the Web. Research onion routing and the Tor project. What do these things do? How do they work?
