Thus far we have focused primarily on laws that affect the security of data, systems, and networks, and the ability of the government and the private sector to conduct surveillance on this infrastructure to prevent cybercrime and other harms. However, an examination of cybersecurity law would be incomplete without an overview of privacy law.
Privacy law limits companies' collection, use, sharing, and retention of personal information. While data security laws provide the safeguards that companies must have in place to prevent hackers from accessing customer data, privacy law restricts companies' ability to use customer data. For instance, privacy law may prevent a company from selling customer web-browsing activities to third-party marketers, building customer profiles based on the videos they view online, or using facial recognition.
Some might argue that privacy law is outside of the scope of cybersecurity law, and they may be correct. At least under some conceptions of cybersecurity law, it is irrelevant how companies choose to legitimately use customer data. However, cybersecurity is an emerging field and there is not a single, settled definition. Nevertheless, privacy does often intersect with cybersecurity, and, consequently, all cybersecurity professionals should have a basic understanding of privacy legal principles.
Any examination of cybersecurity law would be incomplete without an overview of the legal restrictions on the use and disclosure of personal information. As with data security, the Federal Trade Commission regulates privacy under Section 5 of the FTC Act, which prohibits unfair and deceptive trade practices. However, the United States, unlike other jurisdictions, such as the European Union and Canada, does not have a general privacy law that applies to all companies. Instead, privacy regulation in the United States, like data security regulation, is a web of federal and state laws, some of which focus on specific industries or types of data. This chapter provides an overview of the regulation of privacy under Section 5 of the FTC Act, as well as the most prominent federal and state privacy laws that restrict the private sector's cyber-related use and disclosure of personal information.1
As described more thoroughly in Chapter 1, Section 5 of the Federal Trade Commission Act declares illegal “unfair or deceptive acts or practices in or affecting commerce.”2 The statute states that “unfair” practices are those that cause or are likely to cause “substantial injury to consumers which is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition.”3
As with data security, the FTC has not promulgated privacy regulations under Section 5. Rather, it takes a case-by-case approach to determine whether a company's privacy practices are unfair or deceptive. In general, the FTC expects companies to disclose material aspects of the manner in which they handle personal data (e.g., whether they share information with third parties) and to be honest in their statements about data processing (e.g., not to lie in their privacy policies). Transparency, full disclosure, and honesty are among the most important principles in complying with the FTC's privacy expectations. Unlike data protection regulators in other countries, the FTC does not impose a specific set of data privacy requirements on all companies.
The FTC most clearly articulated its privacy expectations in March 2012, when it issued a 73-page report, entitled Protecting Consumer Privacy in an Era of Rapid Change.4 Although the report is not legally binding, it provides the best guidance to date as to the types of actions that companies can take to reduce the chances of facing FTC enforcement actions regarding their privacy practices. The following are the principles in the FTC's report:
The FTC followed up on its Privacy Report in February 2013, when the FTC staff issued a report with best practices for mobile app privacy.5 The mobile report contained privacy recommendations for app platforms (e.g., the iTunes store and mobile operating systems) and app developers (the creators of the apps). Although the report does not constitute binding law, it provides a good indication of the FTC's privacy expectations for mobile apps.
In the mobile report, the FTC staff stated that app platforms should consider the following features for their apps and privacy disclosures:
The FTC recommended that app developers consider the following:
Transparency is perhaps the FTC's most important expectation for companies' privacy practices. The FTC provides companies with tremendous flexibility in determining how to collect, use, and share customers' personal information, but it expects that companies will accurately disclose these practices to consumers. If a company does not disclose a material practice – or, even worse, misrepresents it – the FTC may bring an enforcement action.
For example, in August 2012, the FTC announced a $22.5 million civil penalty against Google for violating a previous privacy-related consent order. The FTC alleged that Google surreptitiously placed advertising tracking cookies in consumers' Safari web browsers. These cookies allowed targeted advertising via Google's DoubleClick advertising network. However, Google had informed Safari users that they did not need to take any action to opt out of targeted advertisements because Safari's default setting effectively prevents such cookies. Typically, the FTC settles privacy and data security cases by entering into consent orders that require a company to take specific remedial actions, without initially imposing a fine. However, the FTC imposed the $22.5 million penalty because Google already was operating under a 2011 consent order, arising from alleged misrepresentations of its privacy practices on a social network, Google Buzz. In a statement accompanying the fine, the FTC stated that the penalty "signals to Google and other companies that the Commission will vigorously enforce its orders."
Similarly, in 2011, the FTC announced a settlement and consent decree with Facebook arising from allegations about the social network's privacy practices. Among other things, the FTC alleged that Facebook made users' friends lists public without providing them with advance warning. Additionally, the FTC alleged that Facebook's "Friends Only" setting still allowed user content to be shared with third-party applications used by the users' friends.
Companies that tout their privacy protections in their marketing to consumers must ensure that their products and services live up to those promises. For instance, in May 2014, the FTC reached a settlement with Snapchat, a mobile messaging application that marketed the fact that messages sent via the service would "disappear forever" after a designated time period. The FTC brought the complaint because users could circumvent this feature and store the messages beyond the expiration date.
The FTC in recent years has increasingly focused on particularly sensitive data collected by new technologies, such as geolocation. For instance, in December 2013, it announced a settlement with Goldenshores Technologies, the creator of a popular flashlight app for Android. The FTC alleged that the company provided third parties, including ad networks, with users' precise location and unique device identifiers, but it failed to disclose this sharing in its privacy policy. Moreover, the FTC stated that when customers first opened the app, they had an option to "accept" or "refuse" the end user licensing agreement, but that this presented a "false choice" because the information already had been sent to third parties as soon as the app was downloaded.
Although the FTC has not enacted specific privacy regulations for all companies, some sectors are legally required to abide by detailed privacy regulations. Among the most restrictive is healthcare, owing in part to the sensitive nature of health records.
Chapter 3 described the data security and breach notification requirements of the Health Insurance Portability and Accountability Act (HIPAA), enforced by the U.S. Department of Health and Human Services.6 Also under HIPAA, the Department has adopted a Privacy Rule, which limits the ability of health plans, healthcare providers, healthcare clearinghouses, and their business associates to use and disclose “protected health information,” which is information that relates to an individual's physical or mental health or condition, the provision of healthcare to the individual, or the individual's payments for healthcare.7 The Privacy Rule only applies if the information identifies the individual; it does not apply to the use or disclosure of de-identified information.
The Privacy Rule allows covered entities to use and disclose protected health information:
If the use or disclosure is not explicitly covered by one of the exemptions above, the covered entity or business associate is required to obtain the individual's written authorization, which specifically allows the use and disclosure. For instance, in order for a healthcare provider to be permitted to use an individual's protected health information for marketing purposes, the written authorization must explicitly give permission to use the data for marketing.9
When covered entities and business associates use or disclose protected health information, they typically must make “reasonable efforts” to only use or disclose the “minimum necessary” information for the intended purpose. In other words, if a health insurer needs the healthcare provider's records of a patient's most recent physical in order to process its payment, the healthcare provider should not provide the insurer with records from the patient's ten most recent visits. The “minimum necessary” limit does not apply in a few select cases – notably if needed for treatment, disclosed to the individual who is the subject of the information, or if disclosed under an authorization by the individual.10
Covered entities also must fulfill a number of administrative requirements under the Privacy Rule. They must designate a privacy official responsible for implementing their privacy policies and procedures. Covered entities also must train all of their employees regarding the handling of protected health information.11 HIPAA further imposes a number of data security requirements, which are discussed in Chapter 3.
The HIPAA Privacy Rule also requires covered entities to provide consumers with “adequate notice of the uses and disclosures of protected health information that may be made by the covered entity, and of the individual's rights and the covered entity's legal duties with respect to protected health information.”12 If the covered entity has a direct treatment relationship with the individual, it must make a good-faith effort to obtain a written acknowledgment that the customer has received the privacy notice.13
In addition to restricting the use and disclosure of protected health information, HIPAA provides individuals with a relatively broad right of access to their information.14 Individuals do not, however, have a right to access psychotherapy notes or information that is compiled in reasonable anticipation of, or for use in, a civil, criminal, or administrative action or proceeding.15
As with health information, nonpublic financial data also receives special protection under U.S. law. The Gramm-Leach-Bliley Act (GLBA), whose data security requirements were discussed in Chapter 3, also imposes privacy requirements on financial institutions. GLBA's privacy requirements, known as the Privacy Rule, generally are less burdensome than the HIPAA requirements that healthcare providers face, owing in part to the greater sensitivity of healthcare data.
GLBA imposes two general requirements: notice and choice. Under the notice requirement, a financial institution generally must provide customers with privacy notices at the time that the relationship with the customer is formed and at least once a year after that.16 The notices must provide “clear and conspicuous disclosure” of the institution's privacy practices, including its policies for disclosing nonpublic personal information to nonaffiliated third parties, the categories of persons to whom the information may be disclosed, the categories of nonpublic personal information that the financial institution collects, the institution's confidentiality and security policies, and other disclosures.17 Financial regulators have developed model privacy notices that could satisfy this requirement.18 In 2014, the Consumer Financial Protection Bureau adopted a new regulation that allows certain institutions to satisfy this requirement by posting the notices online, provided that they do not share nonpublic personal information with unaffiliated third parties.19
GLBA's Privacy Rule also requires financial institutions to allow users to choose whether to permit certain types of information sharing. If the financial institution is sharing information with a nonaffiliated third party that is performing services for the financial institution (e.g., marketing the institution's services), the institution does not need to provide the user with choice before sharing the data. However, if the financial institution intends to disclose nonpublic personal information to nonaffiliated third parties for other purposes (e.g., to market another company's services), the institution first must clearly and conspicuously notify the individual of the planned sharing and provide the individual with an opportunity to opt out before the institution shares the information.20
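The choice requirement described above can be reduced to a simple decision rule. The following sketch, with invented function and parameter names, illustrates when the GLBA Privacy Rule requires a financial institution to offer an opt-out before sharing nonpublic personal information; it is a simplified illustration of the rule as summarized here, not a compliance tool.

```python
def opt_out_required(nonaffiliated: bool,
                     performs_services_for_institution: bool) -> bool:
    """Return True if the institution must give the customer notice and an
    opt-out opportunity before sharing nonpublic personal information.

    Sharing with affiliates, or with nonaffiliated parties that perform
    services for the institution (e.g., marketing its own products),
    does not trigger the opt-out requirement under GLBA.
    """
    return nonaffiliated and not performs_services_for_institution


# Sharing with an outside marketer of another company's services:
print(opt_out_required(nonaffiliated=True,
                       performs_services_for_institution=False))  # True

# Sharing with a service provider marketing the institution's own services:
print(opt_out_required(nonaffiliated=True,
                       performs_services_for_institution=True))   # False
```

Note that, as discussed below, California's SB-1 layers a stricter opt-in rule on top of this federal baseline for most unaffiliated sharing.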
A California law imposes more restrictive choice requirements on financial institutions. The California Financial Information Privacy Act, also known as SB-1,21 requires companies to receive opt-in consent from consumers before sharing their data with most unaffiliated third parties (unless the sharing is necessary to provide the financial services to the customers). This opt-in requirement is significantly more restrictive than GLBA's opt-out requirement because opt-in requires the customer to provide explicit consent before the sharing occurs. In contrast, under the opt-out system, if a customer does nothing after receiving notice, the information sharing is permitted. The California Financial Information Privacy Act also restricts financial institutions' ability to share information with affiliated entities. To do so under the California law, financial institutions must obtain opt-out consent from customers. This also is more restrictive than the GLBA Privacy Rule, which does not restrict financial institutions' sharing of data among affiliated companies.
In the early 2000s, as email was becoming an important component of business and personal lives, policy makers focused on the increasing volume of junk “spam” email messages that were flooding inboxes around the nation. States began to develop their own patchwork of anti-spam laws, and in 2003, Congress passed a single national restriction on spam, the Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003 (CAN SPAM Act).22 The law is enforced by the Federal Trade Commission and Federal Communications Commission. The statute has been criticized by some consumer groups because it preempts more stringent state laws and prevents consumers from bringing private lawsuits against spammers.
Among the key requirements of the CAN SPAM Act are the following:
Companies that violate the CAN SPAM Act can face FTC penalties of up to $16,000 per email.
The Video Privacy Protection Act (VPPA),25 passed in 1988 in an effort to protect the privacy of video cassette rental information, has had a surprisingly large impact on the data processing abilities of websites and apps that deliver online video content.
Congress passed the VPPA in response to a newspaper's publication of the video rental records of Judge Robert Bork, who had been nominated to the United States Supreme Court. The VPPA prevents "video tape service providers" from knowingly disclosing an individual's personally identifiable video requests or viewing habits, unless the individual has provided informed, written consent. The requirement is rather broad, though it contains a few exceptions, including disclosures that are incidental to the ordinary course of business for the service provider (debt collection activities, order fulfillment, request processing, and the transfer of ownership) and to law enforcement under a warrant, subpoena, or court order. Companies may use opt-out consent to share only customers' names and addresses, provided that their video viewing information is not disclosed.
Why should websites and apps be concerned about a law that restricts the ability of “video tape service providers” to share information? The statute rather broadly defines “video tape service providers” as “any person, engaged in the business, in or affecting interstate or foreign commerce, of rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials[.]”26 This definition is broad enough to encompass not only video rental stores, but websites and apps that provide video (whether it be streaming movies, television shows, or news clips). For instance, in 2012, a federal court ruled that Hulu's online movie streaming is covered by the VPPA because Hulu provides “similar audio visual materials.”27
VPPA disputes often arise when websites or apps provide individually identifiable video viewing information to third-party analytics companies. Unless companies can convince a court that they are not covered by the VPPA or that an exception applies, they must obtain a very specific form of consent from the consumer in order to share the data. The VPPA requires that the request for consent be "separate and distinct" from any other legal or financial notice. For instance, to obtain a consumer's consent online, the provider must use a separate online pop-up seeking consent to disclose video viewing information, and the customer must take an affirmative act, such as clicking "I agree." The notice may not be buried in a larger privacy policy or terms of service. Once a website or app obtains consent, it may share video viewing information for two years, or until the consumer revokes consent.
Companies have good reason to care about compliance with the VPPA. The statute allows damages of at least $2,500 per violation. This large amount makes the VPPA a particularly attractive tool for class action plaintiffs' lawyers. Imagine that a newspaper's website shared the video viewing information of 100,000 registered users with its online analytics provider and did not obtain proper consent. A VPPA class action lawsuit could recover $250 million. For this reason, it is important that companies take extra precautions to ensure that they obtain adequate consent before sharing video viewing information.
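The exposure arithmetic above can be made concrete with a short calculation. This sketch uses the statute's $2,500 minimum liquidated damages figure; the class size is the hypothetical one from the text, and the function name is invented for illustration.

```python
# Minimum liquidated damages per violation under the VPPA.
VPPA_MIN_DAMAGES = 2_500  # dollars

def class_action_exposure(num_class_members: int) -> int:
    """Minimum statutory damages if each class member establishes
    one unconsented disclosure of video viewing information."""
    return num_class_members * VPPA_MIN_DAMAGES


# The hypothetical from the text: 100,000 registered users.
print(f"${class_action_exposure(100_000):,}")  # $250,000,000
```

Even before attorneys' fees, the per-violation floor turns a routine analytics integration into nine-figure potential liability, which is why consent mechanics matter so much here.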
A number of states have enacted statutes that are similar to the VPPA. One notable state law is the Michigan Video Rental Privacy Act, which is broader than the VPPA. The Michigan law restricts information sharing by companies that are “engaged in the business of selling at retail, renting, or lending books or other written materials, sound recordings, or video recordings.”28 In 2014, a federal court ruled that this includes not only online video providers but also magazines that share information about their subscribers.29 Accordingly, at least for Michigan subscribers, companies must be careful about sharing subscriber information not only for videos but for virtually all forms of online content.
The Children's Online Privacy Protection Act (COPPA)30 restricts the online collection of personal information from minors who are under 13 years old. The Federal Trade Commission has promulgated regulations31 under COPPA and enforces the law.
COPPA applies to two types of websites and online services: (1) those that are directed to children under 13 and (2) those that have actual knowledge that they are collecting or maintaining information from children under 13. To determine whether a website or online service is directed to children under 13, the FTC's regulations state that the Commission considers:
Websites and online services that are covered under COPPA must provide clear notice on their sites about the information that they collect from children, how they use the information, and their disclosure practices for this information. The websites and services must obtain “verifiable parental consent” before collecting, using, or disclosing any personal information from children under 13. The FTC broadly defines personal information as including:
To obtain verifiable parental consent, the regulations state, websites and online services must use methods that are “reasonably calculated, in light of available technology, to ensure that the person providing consent is the child's parent.”34 Included among the examples of such methods listed in the regulations are:
If a website or online service collects personal information only for internal operations, and will not disclose it to any outside party, it also can obtain parental consent using the "email plus" method, in which the parent provides consent via a return email message, provided that the website or online service also takes an additional confirming step: either (1) requesting in the initial message that the parent provide a phone number or mailing address to which the operator sends a confirming phone call or letter, or (2) sending, after a "reasonable" delay, another email confirming consent.36
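The eligibility test for "email plus" can be expressed as a simple predicate. The sketch below, with invented names, restates the two conditions from the regulations as summarized above; it is illustrative only and does not capture every nuance of the COPPA rule.

```python
def email_plus_available(internal_use_only: bool,
                         discloses_to_outside_parties: bool) -> bool:
    """'Email plus' consent is available only when personal information
    is collected solely for internal operations and is never disclosed
    to any outside party."""
    return internal_use_only and not discloses_to_outside_parties


def confirming_step(option: int) -> str:
    """The operator must take one of two confirming steps after the
    parent's return email."""
    if option == 1:
        return ("Request a phone number or mailing address in the initial "
                "message; send a confirming call or letter.")
    if option == 2:
        return "After a reasonable delay, send a second email confirming consent."
    raise ValueError("Not a recognized email-plus confirming step")


print(email_plus_available(True, False))   # True
print(email_plus_available(True, True))    # False: disclosure bars email plus
```

If the service shares children's information with any outside party, it must fall back on one of the stronger verification methods listed in the regulations.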
Even if a website or online service has obtained verifiable parental consent, it must provide parents with an ongoing opportunity to access the personal information collected from their children, delete the information, and prevent further use or collection of the information. The websites and online services also are required to maintain the confidentiality, security, and integrity of information that they collect from children, though the regulations do not specify particular data security safeguards that the companies must enact.
If the FTC determines that a website or online service has violated COPPA, it can bring an enforcement action in court, seeking damages of up to $16,000 per violation. The largest COPPA enforcement action to date resulted in a $3 million settlement. Playdom, which develops online multi-player games, operated a website called Pony Stars, a virtual online world directed to children. The FTC alleged that from 2006 to 2010, hundreds of thousands of children registered for Pony Stars, even though Playdom did not properly obtain verifiable parental consent, nor did it properly post a COPPA-compliant privacy policy.
Some of the most stringent online privacy laws were adopted not by Congress but by the California legislature. Although the laws only apply to California residents, they have become de facto requirements for most companies that conduct business in the United States. In addition to the California Financial Information Privacy Act, described above, California has imposed requirements on companies via the California Online Privacy Protection Act (CalOPPA), the California Shine the Light law, and the California “Eraser Button” law.
Until 2004, many U.S. websites did not contain privacy policies. The general rule had been that companies do not need to post a privacy policy, but if they did, the policy must accurately reflect the company's data processing practices. This rule changed in 2004, when CalOPPA went into effect. The statute requires all operators of commercial websites or online services that collect personally identifiable information about California customers to “conspicuously” post a privacy policy.37 The privacy policy must, at minimum, contain the following elements:
CalOPPA defines personally identifiable information as individually identifiable information including:
Since it went into effect, CalOPPA has effectively set a nationwide requirement that all companies post a privacy policy describing how they handle customers' personal information. The California Attorney General aggressively enforces the statute, and in recent years has taken the position that CalOPPA also requires mobile apps to post privacy policies.
In 2005, the California Shine the Light law40 went into effect, adding additional privacy requirements for websites. The statute applies to businesses that have established business relationships with customers and have disclosed their personal information in the past calendar year to third parties for direct marketing.
The following are the categories of personal information under the California Shine the Light law:
Upon request from a customer, a business that is covered by this statute must provide the following information to the customer, for free:
Businesses that are required to comply with the California Shine the Light law must designate mailing and email addresses (or, at their discretion, a toll-free phone or fax number), to which customers may direct requests for information. Businesses must take one of the following steps to ensure compliance with the law:
Businesses also may comply with this requirement by stating in their privacy policy that either (1) they do not disclose personal information of customers to third parties for the third parties' direct marketing purposes unless the customer opts in or (2) that the business will not disclose personal information to third parties for the third parties' direct marketing purposes if the customer opts out (provided that the customer is notified of this right and provided with a cost-free method to exercise that right).
Companies that receive requests under the Shine the Light law must respond within 30 days. If the request is received in a manner other than the designated addresses or phone numbers, the business generally must respond within 150 days.
Businesses with fewer than twenty full-time or part-time employees are exempt from the California Shine the Light law, and businesses are not required to respond to a single customer more than once per calendar year.
California's latest endeavor into privacy law went into effect in 2015. Known as the "eraser law,"42 the statute imposes a number of restrictions on websites, online services, and apps that are directed to minors. Unlike the federal COPPA, which only applies to minors under 13, the California law applies if the website, service, or app is targeted at minors under 18.
A website, service, or app is considered to be “directed to minors” and therefore covered by the statute if it “is created for the purpose of reaching an audience that is predominately comprised of minors, and is not intended for a more general audience comprised of adults.”43 The statute is known as the “eraser law” because it provides minors with a limited ability to request the removal of certain information.
The statute requires covered websites, services, and apps to allow minors who are registered users to request and obtain removal of content and information that the minor posted on the service. The sites must notify minor registered users of the instructions to remove the data.44
Covered websites, services, and apps are not required to remove content or information under any of the following circumstances:
This statute received a great deal of media attention because it allows users to request the removal of certain content. However, the right is limited. First, it only applies if the minor was a registered user, and it only covers content that the minor provided. If, for example, the minor's friend posted personal information about the minor on a social media site, the minor would not have a right to request removal.
Less discussed in the media coverage, but perhaps more significant, are the restrictions that the statute places on online marketing. It prohibits covered websites, services, and apps from marketing certain categories of products and services:
The Illinois Biometric Information Privacy Act46 is being increasingly used by plaintiffs' lawyers to limit online services' use of facial recognition and other new technologies. The statute prohibits companies from obtaining or disclosing “biometric identifiers or biometric information” unless the companies first obtain the individuals' opt-in consent.
The statute broadly defines “biometric identifier” to include “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.” The statute excludes a number of types of information from the definition of “biometric identifier,” including photographs, writing samples, and physical descriptions of individuals. The statute defines “biometric information” as “any information, regardless of how it is captured, converted, stored, or shared, based on an individual's biometric identifier used to identify an individual.”
Private parties can bring lawsuits under the statute for $5,000 per violation. As with other statutes, this can lead to significant, bet-the-company damages if a plaintiff brings a class action lawsuit on behalf of thousands of customers.
The statute received significant attention in May 2016 when a federal judge refused to dismiss a class action lawsuit under the statute against Facebook. The plaintiffs claimed that Facebook violated the Illinois law with its “Tag Suggestions” program, in which Facebook scans photos uploaded by users and uses facial recognition to suggest that the users tag the photo subjects by name. Facebook moved to dismiss the lawsuit, claiming that the statute does not apply because it explicitly states that it does not cover photographs. The court disagreed and denied the motion to dismiss, reasoning that Facebook's facial recognition technology constitutes a “scan of face geometry,” which is covered by the statute. The court reasoned that the exclusion for photographs is “better understood to mean paper prints of photographs, not digitized images stored as a computer file and uploaded to the Internet.”47
The Facebook decision was significant because it broadly applies the Illinois law to facial recognition technologies. Companies must ensure that they obtain adequate consent before using facial recognition or other new technologies, or they could find themselves on the hook for significant penalties under the Illinois law.