CHAPTER 3

Privacy Operational Lifecycle: Assess

In this chapter, you will learn about

•   Baselining a privacy program to gauge future progress

•   Assessing service providers as a part of a third-party risk management program

•   Performing physical assessments of work centers and processing centers

•   Protecting physical records, devices, and media

•   Dealing with mergers, acquisitions, and divestitures

•   Incorporating privacy impact assessments into business processes

This chapter covers Certified Information Privacy Manager job practice III, “Privacy Operational Lifecycle: Assess.” The domain represents approximately 22 percent of the CIPM examination.

“Trust but verify” is a Russian proverb that is commonly used by privacy and cybersecurity industry professionals. The complexity of information processing and management, which includes layers of underlying business processes and information systems, invites seemingly minor changes that can bring disastrous consequences.

This chapter covers a variety of privacy assessment and privacy management topics:

•   Baseline assessments of privacy programs help privacy leaders understand the initial state of an organization’s privacy program so that progress can be more easily measured over time.

•   Third-party risk management, or TPRM, is a business practice of growing concern, given that the majority of organizations now outsource a growing proportion of IT services to outside organizations.

•   Physical assessments include reviews of work center and processing center protective measures and environmental controls, as well as document protection, media destruction, and device and media security.

•   Mergers, acquisitions, and divestitures have numerous implications for a privacy program that privacy leaders need to influence and manage.

•   Privacy impact assessments need to be incorporated into existing business change processes to ensure continuous and effective compliance with privacy laws and other legal obligations. The chapter includes discussions of privacy threats, vulnerabilities, and countermeasures.

Monitoring and auditing of a privacy program are covered in Chapter 5.

Privacy Program Baseline

It can be difficult to know whether we have made progress unless we know where we began. When building, reinvigorating, or improving a privacy program, the act of developing a baseline will result in an important business record that will help privacy leaders and management understand in tangible terms the progress that has been made since the baseline was created. The development of a baseline involves documenting the current state of a program so that a later analysis of a future state will highlight the progress made. Privacy programs need to make progress to keep up with rapidly emerging privacy laws and changing societal norms.

The remainder of this section is focused on the functional areas of a privacy program that should be included in a baseline.

Process Maturity

When baselining a program of any kind, whether privacy, security, information management, or other programs, the organization must include the concept of process maturity as an important measurement tool. The maturity of a process gauges its health on a chaos–order continuum: maturity is a measure of how organized a process is, whether it is performed consistently, whether it is documented, whether it is measured, and whether measurements are examined from time to time to make improvements in the process.

Maturity is not the only measure to be used, however, and maturity can be misleading if we do not understand what each process is intended to achieve. A process can be well designed and highly mature, but if its scope is too narrow, the gaps outside that scope can bring serious harm to the organization. For example, an organization may have a nicely designed and managed security awareness program for headquarters employees that the remote salesforce is not required to use. Or a business continuity program may be too confined and may not include parts of the organization that are more vital than realized.

The gold standard for process maturity is the Capability Maturity Model Integration (CMMI), originally developed at Carnegie Mellon University and now owned by ISACA. CMMI is discussed in detail in Chapter 1.

Baselining Program Elements

The following functions should be examined and documented when baselining an organization’s privacy program:

•   Education and awareness Examine privacy and security training at each employee’s time of hire and periodically after that. See if training includes competency scores and if a minimum score is required. Determine whether training is required before an employee is granted access to business applications. Look for a variety of messaging on privacy and security awareness such as e-mails, intranet pages, posters, and management reiterating the importance of privacy and security in the organization.

•   Monitoring regulatory developments and incorporating change Determine whether the corporate legal team has subscription or advisory services to alert them to new laws and developments in privacy and security. See if the legal team has formally documented all of the laws and regulations the organization is required to comply with. Sometimes, smaller legal teams will retain outside counsel on specific subject matter, including cybersecurity and privacy.

•   Internal compliance to policy Determine whether the organization tracks its compliance with internal and external privacy policies, and whether deficiencies are followed up. Further, it is important to know whether the privacy policy itself is compliant with applicable privacy laws, regulations, standards, and other obligations such as legal agreements with customers and other organizations.

•   Data management practices Determine whether the organization has a data classification policy and handling procedures, and whether the workforce is aware of them. See if there are any automatic controls such as data loss prevention (DLP) that scan, monitor, or intervene in data storage and data movement activities warranting action. Determine whether procedures are in place to follow up on data movement alerts. Identify whether the organization maintains data inventories and data flow diagrams (DFDs).

•   Risk management and risk assessments See if the organization has a formal privacy and security risk management program, including the performance of risk assessments to identify privacy and cybersecurity risks. Also, look for a risk register and a risk treatment process with business records to see who is making business-level decisions about privacy and security and whether risk assessment findings are properly addressed.

•   Incident response and remediation Determine whether the organization has a formal privacy and security incident response process that includes severity levels, escalations, recordkeeping, after-action reviews, periodic tabletop testing, and training. Examine the incident record to understand the history of privacy and security incidents in the past. The absence of an incident record often suggests that the organization is not equipped to recognize and respond even to minor incidents.

•   Audits See if there have been internal or external audits of the organization’s privacy and security programs. If so, determine who was apprised of the results and whether significant issues were remediated and confirmed. Determine whether audits are performed by qualified (such as CISA [Certified Information Systems Auditor] or ISO 27001 Lead Auditor–certified) personnel, and whether their frequency and scope have adequately revealed weaknesses in the organization’s privacy and security programs.

•   Staff competence and capability Identify all of the organization’s privacy and security staff and active stakeholders and participants in existing business processes to understand the organization’s ability to operate sound programs. Privacy and security programs cannot succeed without qualified and competent staff.

•   IT service management Benchmark the IT organization’s service management processes to understand the integrity of IT business processes and systems. An effective privacy program requires a sound cybersecurity program, which depends upon the integrity of IT service management.

•   Business continuity and disaster recovery planning Determine whether the organization has business continuity and/or disaster recovery planning programs. If so, see if they adequately address privacy and security issues so that contingency plans do not compromise privacy or security compliance requirements. See how often a business impact analysis (BIA) is performed and how frequently it is updated.

•   Program metrics and reporting Investigate the metrics gathered, analyzed, and reported in the organization’s privacy and security programs. Determine to whom reporting is delivered, notably what is reported to the board of directors.

A program baseline can be used as a basis for a compliance gap analysis between the organization’s current state and requirements in applicable privacy regulations, industry practices, and customer and societal expectations. As detailed in Chapter 2, the privacy leader can develop roadmaps to close the capability gaps to bring the program to the desired state.
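The baseline-to-gap comparison described above lends itself to a simple tabulation. The following Python sketch is purely illustrative: the element names and the 0–5 maturity scale (loosely patterned on CMMI) are assumptions, not values prescribed by any framework.

```python
# Illustrative sketch: computing maturity gaps between a documented
# baseline and a desired target state for each program element.
# Element names and the 0-5 maturity scale are hypothetical.

BASELINE = {
    "education_awareness": 2,
    "regulatory_monitoring": 3,
    "policy_compliance": 1,
    "data_management": 2,
    "risk_management": 1,
    "incident_response": 3,
}

# Assume the organization targets maturity level 4 for every element.
TARGET = {element: 4 for element in BASELINE}

def gap_analysis(baseline, target):
    """Return (element, gap) pairs sorted by the largest maturity gap first."""
    gaps = {element: target[element] - level for element, level in baseline.items()}
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

for element, gap in gap_analysis(BASELINE, TARGET):
    print(f"{element}: gap of {gap} maturity level(s)")
```

A roadmap of the kind described in Chapter 2 would then prioritize the elements at the top of this list.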

Third-Party Risk Management

Third-party risk management (TPRM) refers to activities used to discover and manage risks associated with external organizations performing operational functions for an organization. Many organizations outsource some of their information processing to third-party organizations, often in the form of cloud-based software as a service (SaaS) and platform as a service (PaaS), and often for economic reasons: it is less expensive to pay for software in a leasing arrangement than to develop, implement, integrate, and maintain software internally. Similarly, many organizations prefer to lease virtual server infrastructure using infrastructure as a service (IaaS) rather than purchasing their own hardware.

TPRM involves the extension of techniques used to identify and treat privacy and security risk within the organization. The same risks present in third parties’ services are present within an organization’s processing environment. The discipline of third-party risk exists because of the complexities associated with identifying risks in third-party organizations, as well as risks inherent in doing business with specific third parties. At its core, TPRM is similar to other risk management; the difference lies in acquiring relevant information to identify risks outside of the organization’s direct control.


NOTE    Organizations lacking a mature TPRM program should implement a process that asserts both security and privacy requirements to relevant service providers.

Cloud Service Providers

Organizations moving to cloud-based environments often assume that their cloud service providers have taken care of many or all information security functions, when generally this is not the case at all. This often results in security and privacy breaches, because each party believes that the other was performing critical data protection tasks. Many organizations are unfamiliar with the shared responsibility model that delineates which party is responsible for specific operations and security functions. Tables 3-1 and 3-2 depict shared responsibility models in terms of operations and security, respectively.


Table 3-1 IT Operational Shared Responsibility Model


Table 3-2 Security and Privacy Shared Responsibility Model


NOTE    The specific responsibilities for operations and security between an organization and any specific service provider may vary somewhat from these tables. It is vital that an organization clearly understand its specific responsibilities for each third-party relationship, so that no responsibilities that may introduce risks to the organization are overlooked or neglected.

The privacy officer should recognize that third-party service providers generally play little or no role in the data-handling aspect of privacy. These functions are wholly the responsibility of the organization using the third-party services, not the third party itself.

TPRM has been the subject of many standards and regulations that compel organizations to be proactive in discovering any security risks present in the services provided by critical third parties. Historically, many organizations were not voluntarily assessing these third parties. Statistical data about breaches over several years has revealed that more than half of all breaches have a nexus in third parties. This statistic has illuminated the magnitude of the third-party risk problem and has resulted in the enactment of laws and regulations in many industries that now require organizations to build and operate effective third-party risk programs in their organizations.

Privacy Regulation Requirements

In GDPR parlance, organizations that use third-party service providers are often, but not always, considered data controllers, which are entities that determine the purposes and means of the processing of personal data and that can include directing third parties to process personal data on their behalf. The third parties that process data on behalf of data controllers are known as data processors. The CCPA uses the terms business and service provider analogously to GDPR’s data controller and data processor, respectively. Increasingly, organizations not subject to GDPR or CCPA are also using these and similar terms to identify roles and expectations in contractual relationships.

HIPAA requires that covered entities (organizations subject to HIPAA regulations) establish a business associate agreement (BAA) with every service provider with access to the covered entity’s information or information systems. Sarbanes–Oxley requires that organizations perform up-front and periodic due diligence on financially relevant service providers.

TPRM Life Cycle

The management of business relationships with third parties is a life-cycle process. The life cycle begins when an organization contemplates using a third party to augment or support the organization’s operations. The life cycle continues during the third party’s ongoing relationship and concludes when the organization no longer uses the third party’s services: all connections are severed, and all data stored at the third party is removed or destroyed.

Initial Assessment

Before establishing a business relationship with a third party, an organization will assess and evaluate the third party for suitability. Often this evaluation is competitive, with two or more third parties vying for the formal relationship. During the evaluation, the organization will require that each third party provide information describing its services, generally in a structured manner through a request for information (RFI) or a request for proposal (RFP) process.

In their RFIs and RFPs, organizations often include sections on privacy and security to help determine how each third party protects the organization’s information. This, together with information about the services themselves, pricing, and other information, reveals details that the organization uses to select the third party that will provide services.

Legal Agreement

Before services can begin, the organization and the third party will negotiate a legal agreement that describes the services provided, along with service levels, quality, pricing, and other terms included in typical legal agreements. Based on the details discovered in the assessment phase, the organization can develop a section in the legal agreement that addresses privacy and security and typically covers these subjects:

•   Privacy and/or security program Requires the third party to have a formal privacy and/or security program, including but not limited to governance, policy, risk management, annual risk assessment, internal audit, vulnerability management, incident management, secure development, privacy and security awareness training, data protection, and third-party risk.

•   Security and/or privacy controls Requires the third party to have a controls framework, including linkages to risk management and internal audit.

•   Vulnerability assessments Requires the third party to undergo penetration tests or vulnerability assessments of its service infrastructure and applications, performed by a competent security professional services firm, with reports made available to the organization upon request.

•   External audits and certifications Requires the third party to undergo annual SOC 1 and/or SOC 2 Type 2 audits (SOC stands for System and Organization Controls), TrustArc audits, ISO/IEC 27001 certifications, HITRUST certifications, Payment Card Industry Reports on Compliance (PCI ROCs), or other industry-recognized and applicable external audits, with reports made available to the organization upon request.

•   Privacy and security incident response Requires the third party to have a formal privacy and security incident capability that includes testing and training.

•   Privacy and security incident notification Requires the third party to notify the organization within a specific timeframe (typically 24–48 hours) in the event of a suspected or confirmed breach. The language around “suspected” and “confirmed” needs to be developed very carefully so that the third party cannot sidestep this responsibility.

•   Right to audit Requires the third party to permit the organization to conduct an audit of the third-party organization without cause. If the third party does not want to permit this, one fallback position is to insist on the right to audit in the event of a suspected or confirmed breach or other circumstances. Further, include the right to have a competent security professional services firm perform an audit of the third party’s privacy and security environment on behalf of the organization; this is useful for several reasons, including geographic proximity and the greater objectivity of an external audit firm.

•   Periodic review Requires the third party to permit an annual review of its operations, privacy, and security. This can improve confidence in the third party’s privacy and security.

•   Third-party disclosures Requires the third party to list any contracted parties it uses to perform services, along with a suitable third-party due diligence process.

•   Annual due diligence Requires the third party to respond to annual questionnaires and evidence requests as a part of the organization’s third-party risk program.

•   Cyber insurance Requires that the third party carry a cyber-insurance policy with minimum coverage levels. Require that the third party comply with all policy requirements so that the policy will pay out in the event of a privacy or security event. A further option is to have the organization named as a beneficiary on the policy, which guards against a widespread breach in which a large payout would be diluted among many affected customers.

Organizations with many third parties may consider developing standard privacy and security clauses that include all of these provisions. When a new third-party service is being considered, the organization’s privacy and security teams can perform their upfront examination of the third party’s privacy and security environment and then make adjustments to the privacy and security clauses as needed.

During the vetting process, organizations will often find one or more shortcomings in the third party’s privacy or security program that the third party is unwilling or unable to remediate right away. There are still options, however: the organization can compel the third party to enact improvements in a reasonable period after starting the business relationship. For example, a third-party service provider may not have had an external audit such as a SOC 1 or SOC 2 audit, but it may agree to undergo such an audit one year later. Or a third-party service provider that has never had external penetration testing could be compelled to begin testing at regular intervals. Alternatively, the third party could be required to undergo a penetration test and remediate all critical- and high-level issues before the organization will begin using the third party’s services.

Classifying Third Parties

Organizations utilizing third parties often discover a wide range of risks: some third parties may have access to large volumes of operationally critical or personal information, others may have access to small volumes of personal information, and still others do not access data associated with critical operations at all. Because of this wide span of risk levels, many organizations choose to develop a scheme consisting of risk levels based on criteria important to the organization. Typically, this risk scheme will have two to four risk levels, with a level assigned to each third party.

Organizations need to assess their third parties periodically to ensure that they remain at the right classification level. Third parties that provide a variety of services may initially be classified as low risk. However, in the future, if the third party is retained to provide additional services, this could result in reclassification at a higher level of risk.

The purpose of this classification is explained in the following sections on questionnaires and assessing third parties.
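To illustrate, a classification scheme of this kind can be reduced to a few rules. The Python sketch below is hypothetical: the criteria (personal data access, record volume, operational criticality) and the three-tier scheme are assumptions that an organization would tailor to its own risk appetite.

```python
# Hypothetical three-tier third-party risk classification.
# Criteria and thresholds are illustrative assumptions only.

def classify_third_party(handles_personal_data, record_count, operationally_critical):
    """Assign an illustrative risk tier ("high", "medium", or "low") to a third party."""
    # High risk: personal data at large volume, or personal data plus
    # operational criticality.
    if handles_personal_data and (record_count > 100_000 or operationally_critical):
        return "high"
    # Medium risk: either personal data access or operational criticality alone.
    if handles_personal_data or operationally_critical:
        return "medium"
    # Low risk: no personal data and not operationally critical.
    return "low"

print(classify_third_party(True, 250_000, False))  # a bulk data processor
print(classify_third_party(False, 0, True))        # critical but no personal data
print(classify_third_party(False, 0, False))       # commodity supplier
```

Reclassification, as noted above, is simply a matter of rerunning the same rules when a third party's services change.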

Questionnaires and Evidence

Organizations that utilize third parties need to assess those third parties periodically. Generally, this consists of creating and sending a privacy and/or security questionnaire to the third party, with the request to answer all of the questions and return it to the organization within a reasonable amount of time. The organization may choose not to rely simply on the answers provided, but may also request that the third party furnish specific artifacts that serve as evidence supporting the responses in the questionnaire. Here are some typical artifacts that an organization will request of its third party:

•   Privacy and security policies

•   Privacy and security controls

•   Privacy and security awareness training records

•   New-hire checklists

•   Details on employee background checks (not necessarily actual records but a description of the checks performed)

•   Nondisclosure and other agreements signed by employees (not necessarily signed copies, but blank copies)

•   Vulnerability management process

•   Secure development process

•   Copy of general insurance and cyber-insurance policies

•   Incident response plan and evidence of testing

An organization that uses many third parties may find that it utilizes various services: some store or process large volumes of personal or critical data, others are operationally critical but do not access personal information, and still others fall into other categories. Often it makes sense for an organization to utilize different versions of questionnaires, one or more for each category of third party, so that the majority of questions asked of each third party are relevant. Organizations that don’t utilize different questionnaires risk having large portions of some questionnaires being irrelevant, which could be frustrating to third parties that would rightfully complain of wasted time and effort.

As described earlier on the classification of third parties, organizations often use different questionnaires according to third parties’ risk levels. For example, third parties in high-risk categories would be asked to complete very extensive questionnaires that include requests for many pieces of evidence on a regular basis, typically at least annually. In contrast, third parties of medium risk would receive shorter questionnaires and on a less frequent basis, and low-risk third parties would receive very short questionnaires on a much less frequent basis. Although it is courteous to send questionnaires of appropriate length to various third parties (mainly to avoid overburdening low-risk third parties with huge questionnaires), remember that this practice also increases the organization’s burden, since someone has to review the questionnaires and attached evidence. An organization that uses the services of hundreds of third parties does not want to overburden itself with the task of analyzing hundreds of questionnaires, each with hundreds of questions, when most of the third parties are lower risk and warrant shorter questionnaires.
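As an illustration of tiered questionnaires, the following Python sketch maps a third party's risk tier to a questionnaire profile and computes when its next assessment is due. The question counts and intervals are hypothetical examples, not recommendations.

```python
# Hypothetical questionnaire profiles keyed by risk tier.
# Question counts and assessment intervals are illustrative only.

QUESTIONNAIRES = {
    "high":   {"questions": 300, "evidence_required": True,  "interval_months": 12},
    "medium": {"questions": 100, "evidence_required": False, "interval_months": 24},
    "low":    {"questions": 25,  "evidence_required": False, "interval_months": 36},
}

def next_assessment(tier, last_assessed_year, last_assessed_month):
    """Return the questionnaire profile and the (year, month) the next assessment is due."""
    profile = QUESTIONNAIRES[tier]
    # Convert to a month count, add the interval, then convert back.
    total = last_assessed_month + profile["interval_months"]
    due_year = last_assessed_year + (total - 1) // 12
    due_month = (total - 1) % 12 + 1
    return profile, due_year, due_month

profile, year, month = next_assessment("medium", 2023, 6)
print(profile["questions"], year, month)  # 100 2025 6
```

The point of the sketch is the trade-off described above: shorter questionnaires and longer intervals for lower-risk third parties reduce the review burden on both sides.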

Assessing Third Parties

To discover risks, organizations need to assess their third parties, not only at the onset of the business relationship (before the legal agreement is signed, as explained earlier), but periodically after that. Business conditions and operations often change over time, necessitating that third parties be assessed throughout the timespan of the relationship.

Organizations assessing third parties often recognize that IT, privacy, and security controls are not the only forms of risk that require examination. Organizations generally will seek other forms of information about their more critical third parties, including

•   Financial risk

•   Geopolitical risk

•   Inherent risk

•   Recent security breaches

•   Lawsuits

These and other factors can influence the overall risk to the organization and manifest in various ways, including degradations in overall privacy and security, failures to meet production or quality targets, and even failure of the business.

Because of the effort required to collect information on these other risk areas, organizations often rely on outside service organizations that collect information on companies and make it available on a subscription basis. Of course, these are also third-party organizations that require an appropriate measure of due diligence.

Risk Mitigation

When assessing third parties, organizations that carefully examine the parties’ information often discover some unacceptable aspects. In these cases, the organization will analyze the issues and decide on a course of action.

For instance, suppose a highly critical third party indicates that it does not perform annual privacy and security awareness training for its employees, and the organization finds this unacceptable. The organization needs to analyze the risk (in a manner not unlike any risk found internally) and decide on a course of action. In this example, the organization contacts the third party and attempts to compel it to institute annual privacy and security awareness training for its employees.

Sometimes, a deficiency problem in a third party is not easily solved. For example, suppose a third party providing services for many years indicates in its annual questionnaire that it does not employ encryption of stored personal information. At the onset of the third-party business relationship, this was not a common practice, but it has become a common practice in the organization’s industry over time. The service provider, when confronted with this, explains that it is not operationally feasible to implement encryption of personal information in a manner acceptable to the organization, mainly for financial reasons: the third party would have to increase its prices to cover these costs. In this example, the organization and the third party would need to find the most pragmatic course of action that leaves both parties satisfied with the level of risk and the cost of controls.

Metrics and Reporting

A mature TPRM program will include several operational metrics representing a measure of activities performed within the program. Perhaps the program will also include one or more key risk indicators (KRIs) used to identify trends in risks among key service providers.

As a part of a privacy and security program that manages risk, information from the TPRM program should also flow into the organization’s overall risk reporting and any dashboards used for management reporting. After all, third-party service providers are extensions of the organization’s business operations.

Physical Assessments

Privacy and security program practices include periodic assessments to identify whether controls are effective and whether the organization is compliant with its policies. The purpose of such assessments is to identify operational risks that could lead to compliance issues and incidents. Several types of assessments used in privacy programs are discussed in this section.

Assessing Processing Centers and Work Centers

In the context of a privacy program, assessments of processing centers and work centers are performed to understand controls that—directly or indirectly—contribute to the protection of personal information. Areas of interest in these assessments include the following:

•   Access controls Processing centers and work centers should have access controls in place so that only authorized personnel can enter these facilities. Controls often used include keycard entry systems. Additional entry controls such as biometrics or PIN pads are used for higher security areas such as processing centers. All facilities should have formal visitor procedures that record each visitor, their purpose for visiting, and the name of the work personnel responsible for them.

•   Surveillance Work centers and processing centers often employ video surveillance to observe facility ingress and egress points. Processing centers often include video surveillance to observe activities where data processing equipment is located. Modern video surveillance systems include continuous or motion-activated recording, with recordings retained for 90 days or more. Although facial recognition is a controversial capability that is disallowed in some jurisdictions, some organizations may use it.

•   Hazards Processing center policy often requires that flammable materials, such as equipment packaging, be removed from facilities to reduce fire risk.

•   Clean desk/screen Work and processing centers frequently have “clean desk” and “clean screen” policies that forbid workers from leaving sensitive information in printed form on desks and work surfaces and/or viewable on screens and monitors. Workers in higher traffic areas are often equipped with screen privacy filters to reduce the risk of onlookers viewing sensitive information on displays. Related controls include automatic screen locking and shred bins.

•   Environmental controls Processing center equipment requires finely tuned environments with temperature and humidity within a narrow range to ensure prolonged equipment life and the absence of equipment failure due to overheating, static discharge, or condensation. HVAC systems require maintenance and testing that should be documented. Processing centers also require continuous clean power that often includes the use of one or more UPSs (uninterruptible power supplies) and electric generators. These power systems require periodic maintenance and testing that should all be reflected in detailed operational records.

In addition, the cleanliness and orderly placement of furniture and equipment contribute to an overall sense of control and organization in work centers and processing centers. Skilled auditors often look for these less tangible attributes when examining these locations.

Document Storage

Organizations that continue to work with paper records (and those that have switched to electronic records in the past ten years) often have a document management system that functions as a working inventory of paper records. These inventories provide information that can include, but is not limited to, the types of records being retained, their retention time frames and destruction dates, where they are located, and how they should be protected from damage and unauthorized access. All of the preceding should be documented in terms of procedures and records.

Organizations that have paper records often assign one or more employees the role of records manager, who maintains the inventory of paper records, retrieves records when needed, and discards records that have reached the end of their retention period. A records manager is also responsible for ensuring that only authorized personnel are permitted to access paper records and that appropriate measures are in place to protect them from threats such as water and fire.

Document storage controls include periodic reviews to ensure that actual physical records align with inventory records and that documents continue to be properly protected.

Document and Media Destruction

Organizations retaining sensitive and personal information usually have a formal records retention schedule that stipulates the maximum time that various records are retained. When specific records exceed their storage period, those records are discarded. Because such information is often sensitive, an organization will employ secure means for destroying records, so that no one can reconstitute them. Destruction techniques for paper records usually involve shredding, although burning and pulping are sometimes used. Electronic media is generally erased, degaussed, or shredded.
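The retention-schedule logic described above can be sketched in code. This is a minimal illustration, not a production records system; the record fields and retention periods are hypothetical:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record type -> maximum retention in days
RETENTION_SCHEDULE = {
    "customer_invoice": 7 * 365,
    "job_application": 2 * 365,
    "support_ticket": 3 * 365,
}

def records_due_for_destruction(records, today=None):
    """Return records whose retention period has elapsed.

    Each record is a dict with 'record_type' and 'created' (a date).
    """
    today = today or date.today()
    due = []
    for record in records:
        max_days = RETENTION_SCHEDULE.get(record["record_type"])
        if max_days is None:
            continue  # unknown type: escalate to the records manager instead
        if today - record["created"] > timedelta(days=max_days):
            due.append(record)
    return due
```

Records flagged by such a check would then be routed to the secure destruction methods described next, with destruction dates and methods logged for the records manager.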

Many organizations utilize shredders for the immediate destruction of sensitive paper records, or they secure shred bins in work centers so that personnel can dispose of printed matter containing sensitive information. Often, external document destruction services periodically collect shred bins’ contents and shred these materials onsite or at a centralized location. Personnel should be directed to use shred bins for the disposal of all printed matter, as well as data storage media such as optical discs.

Organizations disposing of older computers and devices should have detailed procedures that outline secure data wipe methods and proper tracking concerning the destruction of sensitive and personal information on internal hard drives, solid-state drives, and optical discs. Data stored in office equipment such as copiers and printers should be wiped using secure data wipe methods as well.

Detailed document and media destruction records are maintained by a records manager or custodian.

Device Security

Because so much of an organization’s information is in electronic form, considerable attention to the security of devices and media containing information is required. Several use cases are discussed here:

•   Desktop computers The use of desktop computers should be controlled to prevent the loss of sensitive information caused by device loss or attack. Information stored on desktop computers should be encrypted, rendering stored information inaccessible in the case of theft. Desktop computers should be equipped with capabilities to observe and control the use of sensitive information.

•   Laptop computers The use of laptop computers should be controlled to prevent the loss of sensitive information through device loss or compromise. All information stored on laptop computers should be encrypted. Laptops should include the full range of security and privacy controls so that the storage and use of personal information are observable and controllable and so that laptops are protected from attack and compromise. Workers assigned laptops should be required to safeguard them from loss and compromise.

•   Mobile devices The use of mobile devices for conducting company business should be controlled so that sensitive information cannot be revealed through device loss or compromise. Often, this involves a mobile device management (MDM) system to monitor mobile devices, control the storage and use of sensitive information, and permit remote wiping in the event of loss or theft.

•   Storage media Information on backup media such as magnetic tape should be encrypted so that only authorized personnel can recover information. Backup media should be securely stored to prevent loss and stored in locations with appropriate environmental controls to ensure long life. The use of USB external storage devices should be controlled (such as requiring encryption or permitting only company-issued secure devices) or prohibited.

•   Scanners and copiers Many types of scanners and copiers have storage devices (hard drives or solid-state drives) where printed, scanned, and copied information persists for potentially extended periods. Safeguards are needed to protect this stored information and ensure its destruction when copiers and printers are maintained and eventually discarded. While it is uncommon for scanners and copiers to include security tools such as antimalware or firewalls, these devices should be protected on and by the network to prevent data loss in the event of an attack.

•   Device forensics Organizations need to have device forensics capabilities in the form of trained staff and forensics tools to support investigations of various kinds in determining events and actions on computers and networks. Smaller organizations often outsource forensics services to outside firms in the form of retainers.

Mergers, Acquisitions, and Divestitures

Changes to an organization resulting from mergers, acquisitions, and divestitures have a high potential for disrupting the organization’s privacy program. The potential impacts include a change in scope for a program, the gain (or loss) of privacy and security staff, and the addition (or reduction) in regulations, standards, and other compliance obligations.

A privacy leader will be involved in several different ways, in terms of timing and influence. In the best case, the privacy leader is aware in advance of the merger, acquisition, or divestiture—early enough and in a position to influence the details of the transaction. The worst case is when the privacy leader is in the dark until after the action has been completed and announced to the public. Both cases are discussed in this section.

Influencing the Transaction

A privacy leader involved during the development of the terms of a merger, acquisition, or divestiture can play the role of a subject matter expert and advisor to executive management during the transaction’s planning and development. The privacy leader has an array of considerations:

•   Change in regulatory scope The post-transaction organization may be subject to additional (or fewer) privacy regulations. For instance, if a US-based company is acquiring or merging with a company with customers in Europe, it will be required to comply with the GDPR.

•   Post-transaction structure The privacy leader needs to understand the organization’s structure after the transaction has been completed. For instance, in the case of a merger or acquisition, understanding the degree of integration is vital. Will the two pre-transaction organizations function as before, or will some (or all) corporate functions be merged? If they are to remain separate, there may be two similar privacy and security programs that continue long-term after the transaction has closed. Further considerations include how corporate integration will take place—whether all-at-once or through a series of small steps over time.

•   Cultural impact Engineering a merger, acquisition, or divestiture is challenging, and the impact on corporate culture is a key consideration. From immediate concerns such as job security to longer-term issues such as career paths and changing organization charts, the workforce can become distracted by anticipated and actual changes, affecting productivity and morale. Maintaining the integrity and effectiveness of key activities in privacy and cybersecurity can be a significant challenge, particularly when some workers decide to leave the organization for greener pastures. Some mergers and acquisitions represent a clash of different cultures that brings considerable angst and conflict (or just the fear of it).

•   Divestiture In the case of an organization spinning off one or more portions of its business, numerous questions arise, including the division of the workforce, assets, and work centers. In many divestitures, one organization may continue providing services to the other until the latter can acquire its own capabilities. For instance, one entity may lease data center space and/or some of its IT infrastructure to the other until it has had an opportunity to acquire its own assets. Some staff members may continue performing services for the other organization until it can hire its own staff.

Images

NOTE    Privacy and security leaders can utilize the techniques of third-party risk management to assess organizations targeted for merger and acquisition, to obtain a high-level view of risk in the target organization.

Integrating Programs

After a merger or acquisition, IT, security, and privacy leaders must figure out how their new merged programs will take shape and operate on a day-to-day basis. In many cases, the new organization will have “two of everything.” It will need to develop a strategy to return to single processes and solutions that support the new organization’s needs. The development of a strategy for a privacy program in the new whole organization after a merger or acquisition is not too different from strategy development in a new program: existing capabilities are cataloged, the new end state is envisioned, a gap analysis is performed, and the roadmap is developed to take the program to its new, merged future state. Program strategy development is discussed in Chapters 1 and 2, and the principles there apply to cases of mergers and acquisitions, and even to divestitures.

Privacy Impact Assessments and Data Privacy Impact Assessments

A privacy impact assessment (PIA), sometimes confused with a data protection impact assessment (DPIA, as coined in the GDPR, Article 35), is a targeted risk assessment undertaken to identify impacts to individual privacy and to an organization’s ability to protect information resulting from a proposed change to a business process or information system.

PIAs are not a new concept. They have been required since 2002 for US government electronic services and processes under the E-Government Act of 2002, and the European Union Article 29 Working Party endorsed the requirement to conduct PIAs for radio-frequency identification applications in 2011. However, the GDPR raised the visibility of this process because it applies to all businesses that process personal data. Generally, a PIA is conducted for a new process or system that will collect, store, or transmit personally identifiable information (PII), or for a significant modification to a process or system that may create a new privacy risk. The purpose of a PIA is to ensure that personal information collected is used only for the intended purpose. It identifies the impact(s) that any process or system change has on the organization’s compliance with its privacy policy and applicable privacy laws and regulations.

One could also say that the purpose of a PIA is to validate the proposed change from a privacy perspective. By “validate,” I mean confirming that the proposed change has been well designed (presumably with privacy and security by design) and that its impact on privacy is neutral or better. A PIA thus reinforces privacy by design principles and practices.

Images

EXAM TIP    CIPM candidates are not expected to memorize PIA procedures. You should, however, be familiar with the concepts, purposes, and approaches for performing PIAs.

Two excellent resources for PIAs and how PIAs achieve these objectives are

•   www.gsa.gov/reference/gsa-privacy-program/privacy-impact-assessments-pia

•   www.ftc.gov/site-information/privacy-policy/privacy-impact-assessments

Privacy Threshold Analysis

When an organization is considering a change to a business process or an information system, a privacy threshold analysis (PTA) is performed. A PTA determines whether the process or system is associated with personal information. If so, the PTA will direct the performance of a PIA. If the process or system does not involve personal information (or influence processes and systems that do), a PIA is unnecessary.
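The PTA decision described above amounts to a simple screening check. A sketch, assuming a hypothetical yes/no questionnaire (the questions are illustrative, not drawn from any regulation):

```python
# Hypothetical PTA questionnaire: any "yes" answer triggers a full PIA.
PTA_QUESTIONS = [
    "Does the system collect, store, or transmit personal information?",
    "Does the change alter how personal information flows or is used?",
    "Does the system influence other processes or systems that handle "
    "personal information?",
]

def pta_requires_pia(answers):
    """Privacy threshold analysis: decide whether a full PIA is directed.

    'answers' maps each PTA question to True (yes) or False (no);
    unanswered questions are treated conservatively as "no" here, though
    a real PTA would require an explicit answer to each.
    """
    return any(answers.get(q, False) for q in PTA_QUESTIONS)
```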

PIA Procedure

Following is the procedure for conducting a PIA:

1. Obtain a description of the project or proposed change, including the purpose of PII data collection and relevant details on business processes or information systems.

2. Identify all changes to data collection, data flows, storage, protection, and use of personal information.

3. Determine whether the proposed change violates any terms of the organization’s external privacy policy, internal privacy policy, or security policy. Identify and describe all such violations. Identify any compensating controls or changes that would reduce or eliminate the violations. This determination must include identifying the original purpose of collecting and/or using personal information and whether the proposed change violates or exceeds the purpose.

4. Determine whether the proposed change violates any terms of privacy or security laws, regulations, or related guidelines or codes of conduct.

5. Determine whether the proposed change introduces any new security risks and, if so, what potential alterations to the proposed change may reduce or eliminate those risks.

6. Determine whether the proposed change alters any previously known security risks and, if so, what potential alterations to the proposed change may reduce or eliminate those risks.

7. Develop a list of all such impacts (and possible countermeasures) identified in the preceding steps.

8. Write a formal report describing all of the above.
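The outputs of steps 7 and 8 above can be organized as a simple report structure. A minimal sketch with hypothetical field names, not a prescribed PIA format:

```python
from dataclasses import dataclass, field

@dataclass
class PIAFinding:
    step: str               # which procedure step surfaced the finding
    description: str        # the impact or violation identified
    violation: bool         # violates a policy, law, or stated purpose?
    countermeasures: list = field(default_factory=list)

@dataclass
class PIAReport:
    project: str
    findings: list = field(default_factory=list)

    def add(self, finding):
        self.findings.append(finding)

    def impacts(self):
        """Step 7: list all identified impacts with possible countermeasures."""
        return [(f.description, f.countermeasures) for f in self.findings]
```

The formal report of step 8 would then be rendered from such a structure, preserving each finding alongside its proposed countermeasures.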

Images

NOTE    A PIA is nothing more than a risk assessment that focuses on the privacy and security of subject data in some specific context.

Engaging Data Subjects in a PIA

The GDPR, Article 35, Paragraph 9 suggests that the organization consult with data subjects or their representatives to obtain their opinion of proposed changes as part of a PIA. Such engagement could take the form of

•   Surveys

•   Focus groups

•   Announcement of the proposed change with a request for comments

Images

NOTE    The EU GDPR describes the PIA procedure in Articles 35 and 36.

The Necessity of a PIA

Not all regulations explicitly require an analysis of proposed changes to stated or implied data subject rights or the security of their personal information. However, the absence of a specific requirement for a risk assessment does not absolve an organization from performing one. It is a standard business practice to analyze various forms of impact of a proposed change to a business process or supporting information system. A failure to assess the impact of a proposed change could even be seen as negligence: a reasonable person would find such an organization at fault for not seeking to understand the potential impacts of a proposed change upon the security, privacy, or proper use of personal information.

Integrating into Existing Processes

Organizations building their privacy programs need to place “hooks” into three types of existing business processes:

•   Product development The PIA is a tool designed to promote privacy by design by ensuring that privacy is considered in technical, organizational, and security measures at the beginning stage of product development and throughout the product life cycle.

•   IT change control This process needs to include a security risk assessment and a privacy impact assessment. The data privacy officer must be informed of all changes and counted among the approvers for changes that potentially impact privacy.

•   Business process change control This process must include a PIA. The data privacy officer needs to be informed of all changes and counted among the approvers for changes that potentially impact privacy.

Images

CAUTION    Organizations lacking IT change control or business process change control need to implement these processes and ensure that all relevant personnel are aware of, and will comply with, the terms of these processes.

Recordkeeping and Reporting

All PIAs that are performed must be preserved as a part of the organization’s recordkeeping. The data protection officer (DPO) is typically assigned this responsibility. The DPO may elect to include statistics about PIAs as a part of regular privacy metrics reported to executive management and the board of directors. Aspects of this reporting may include

•   The number of PIAs performed and the level of effort expended

•   The projects associated with PIAs that were performed

•   The number of exceptions noted where processes or systems required remediation

•   The regulations in scope for PIAs (applicable for organizations subject to multiple privacy laws)

•   Whether PIA findings are placed into a risk register and, if so, statistics on the register’s contents, including the number of items, aging, time to remediation, and context
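The metrics above can be aggregated from PIA records. A minimal sketch, assuming hypothetical record fields rather than any standard schema:

```python
from collections import Counter

def pia_metrics(pias):
    """Summarize PIA records for executive reporting.

    Each PIA record is assumed to be a dict with 'project', 'hours',
    'exceptions' (count requiring remediation), and 'regulations'
    (a list of laws in scope) -- illustrative fields only.
    """
    return {
        "pias_performed": len(pias),
        "total_effort_hours": sum(p["hours"] for p in pias),
        "projects": sorted({p["project"] for p in pias}),
        "exceptions_requiring_remediation": sum(p["exceptions"] for p in pias),
        "regulations_in_scope": Counter(
            reg for p in pias for reg in p["regulations"]
        ),
    }
```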

Risks Specific to Privacy

To perform an effective PIA, the privacy specialist needs to be aware of privacy-specific threats, vulnerabilities, and attacks. A typical risk assessment considers threats and vulnerabilities in the context of some particular asset. Being familiar with specific threats and vulnerabilities will result in a better PIA.

Privacy Vulnerabilities

During a PIA, the privacy specialist must identify vulnerabilities in an information system, business process, or whatever the PIA is focused upon. A good definition of the term vulnerability is “a weakness that may be present in a system that makes the occurrence of one or more threats more likely.”

Weaknesses in processes or systems represent opportunities for business processes to deviate from what is expected. The nature of the deviation may be a skipped step in a manual procedure, a software program that behaves unexpectedly when presented with unexpected input, or the ability for a worker to deliberately perform something incorrectly without being detected.

Identifying Vulnerabilities When the privacy specialist examines business processes, the following may indicate the presence of vulnerabilities:

•   Manual steps that rely upon human decision-making

•   Steps that require workers to be proactive (such as checking a mailbox for incoming requests)

•   Steps that involve data entry (the possibility of miskeying data)

•   Steps performed that do not include recordkeeping

•   The absence of reconciliation procedures (such as matching the number of incoming requests to the number of outgoing replies)

•   Lack of written documentation describing the steps in processes and procedures

•   Lack of training for personnel who perform processes and procedures

•   Lack of oversight for personnel who perform processes and procedures
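As one concrete example, the reconciliation control noted above (matching incoming requests to outgoing replies) can be automated. A sketch using hypothetical request identifiers:

```python
def reconcile_dsr_log(incoming_ids, outgoing_ids):
    """Reconcile incoming data subject requests against outgoing replies.

    Returns request IDs with no recorded reply and reply IDs with no
    matching request -- either set points to a process vulnerability
    (a dropped request or a reply issued outside the process).
    """
    incoming, outgoing = set(incoming_ids), set(outgoing_ids)
    return {
        "unanswered_requests": sorted(incoming - outgoing),
        "orphan_replies": sorted(outgoing - incoming),
    }
```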

When the privacy specialist examines information systems, the techniques performed in typical system vulnerability assessments apply and include the following:

•   Misconfiguration

•   Missing security patches

•   Software vulnerabilities that enable an attacker to cause software to behave in unintended ways (such as script injection, SQL injection, denial of service)

•   Poor user access management controls (including easily guessed passwords, failure to remove user accounts for terminated users, password settings that invite brute-force attacks)

Automated tools such as port scanners, vulnerability scanners, and code scanning tools are often used to perform these activities.

Vulnerability Severity When identifying vulnerabilities, the privacy specialist should also rate the severity of each vulnerability. The severity can be expressed on a high–medium–low scale or a numeric scale such as 1–5 or 1–10. A severity rating is generally associated with the ease of exploiting the vulnerability, the skill level required to exploit it, and the result of exploiting it.

A standard vulnerability rating scale, the Common Vulnerability Scoring System (CVSS), is used in the information security world. CVSS scores range from 0 to 10, where 10 is the most severe. A CVSS score is calculated from several inputs, including attack complexity, whether authentication is required, whether the attacker must be in physical proximity to the target system, and the impact of exploitation upon the confidentiality, integrity, and availability of information in the system.
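CVSS v3.x also defines a published mapping from numeric base scores to qualitative ratings, which is often what appears in reports. A sketch of that mapping:

```python
def cvss_v3_severity(score):
    """Map a CVSS v3.x base score (0.0-10.0) to its qualitative rating,
    per the CVSS v3.x qualitative severity rating scale."""
    if not 0.0 <= score <= 10.0:
        raise ValueError("CVSS scores range from 0.0 to 10.0")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"
```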

Data Storage and Data Flow Vulnerabilities It’s also necessary to understand the flow of personal information in the context of the PIA. The examiner needs to identify all instances of data storage and then examine access control processes and the security of systems associated with stored data. Also, the examiner needs to identify all instances of data movement within the organization and data leaving and entering the organization. For each case, the measures taken to protect the confidentiality and integrity of personal information in transit must be identified.

Application Security Vulnerabilities If an application is new or is undergoing significant changes, it should be subjected to various tests to ensure that it is free of vulnerabilities that could be exploited by an attacker to obtain personal information illicitly or cause a malfunction of the application. Testing includes penetration tests, static application security testing (SAST) code reviews, and dynamic application security testing (DAST).

Privacy Threats

A privacy analyst should consider reasonable and likely threats that may occur within the business process or system being examined through the performance of a PIA. As mentioned, a PIA is a risk assessment focused on the potential impact on privacy compliance and security, targeted to a process or system undergoing changes. A good definition of the term threat is “an event that, if realized, would bring harm to an asset.”

In this case, of course, the asset is personal information, but the assessment must also consider the business process being examined, the underlying information systems that facilitate its operation, the persons who perform the process, and those who operate and manage the systems. All of these are part of the attack surface.

When analyzing a variety of threats, the privacy specialist considers each threat and determines

•   Whether the threat is relevant

•   How the threat may be carried out (including consideration for any corresponding vulnerabilities that have been identified)

•   The likelihood that the threat will be carried out

•   The impact on the organization (and the data subject) if the threat were carried out

In a typical PIA, the analyst will create a chart listing all reasonable threats. Each threat is scored in terms of relevance, likelihood of occurrence, and impact of occurrence. The scoring may be in the form of qualitative values such as high–medium–low or on a numeric scale such as 1–5 or 1–10 (where the highest number in the scale represents the highest probability and impact).
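The threat chart described above can be computed mechanically once each threat has been scored. A sketch using a 1–5 numeric scale and the common likelihood-times-impact risk product (field names are hypothetical):

```python
def score_threats(threats):
    """Build a threat chart sorted from highest to lowest risk.

    Each threat is a dict with 'name', 'relevant' (bool), and 1-5
    'likelihood' and 'impact' scores. Risk here is the common
    likelihood x impact product; threats judged irrelevant are
    dropped from the chart.
    """
    chart = [
        {**t, "risk": t["likelihood"] * t["impact"]}
        for t in threats
        if t["relevant"]
    ]
    return sorted(chart, key=lambda t: t["risk"], reverse=True)
```

On this scale a relevant threat scored at likelihood 4 and impact 3 (risk 12) would rank above one scored at likelihood 2 and impact 5 (risk 10), though many organizations weight impact more heavily.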

Images

NOTE    The lists of threats found in Appendix E of NIST SP 800-30 and Appendix C of ISO/IEC 27005 represent good starting points for threat analysis.

Privacy Countermeasures

After completing the vulnerability and threat analysis of the process or system being examined, the privacy analyst may conclude that one or more threats or vulnerabilities represent unacceptable conditions and may suggest that one or more countermeasures be enacted to reduce risks considered unacceptable.

Here are some example countermeasures:

•   In a data subject request (DSR) process, have a second employee check the contents of the response to be sent back to the data subject.

•   Implement a web application firewall (WAF) to offer further protection for a web application that collects and manages personal information.

•   Implement automated search tools to ensure more accurate (and timely) results in a DSR.

PIA Case Study

Let’s take a look at a case study that represents a likely scenario. An organization in the retail industry has conducted most of its business through a web application. The organization has written a mobile app for Apple and Android devices to make it more convenient for its customers to place orders and check on their order status. The privacy officer was informed of the mobile app late in its development.

As is typical in the organization, a security firm was commissioned to perform a vulnerability analysis on the mobile app. Several vulnerabilities were identified and later remediated by the organization’s developers. A retest confirmed that the vulnerabilities were remediated.

The privacy officer determined that the mobile app uses simple user ID and password authentication with no other options. As a result, the privacy officer’s PIA made the following recommendations:

•   Make multifactor authentication available to mobile app users who prefer to use it.

•   Change the system’s development life-cycle process so that a PIA can be completed as a new application or system is designed, rather than conducting an analysis of the nearly finished product.

•   Include privacy requirements in future systems development projects.

Chapter Review

When an organization is building, reinvigorating, or improving a privacy program, the act of developing a baseline will result in an important business record that will help privacy leaders and management understand in real terms the progress that has been made since the starting point.

Third-party risk management (TPRM) refers to activities used to discover and manage risks associated with external organizations performing operational functions for an organization. Many organizations outsource some of their information processing to third-party organizations, often in the form of cloud-based software as a service (SaaS) and platform as a service (PaaS), and often for economic reasons: it is less expensive to pay for software in a leasing arrangement than to develop, implement, integrate, and maintain software internally.

Because of the wide span of risk levels associated with third-party service providers’ services, many organizations choose to develop a scheme consisting of risk levels based on criteria critical to the organization. Typically, this risk scheme will have two to four risk levels, with a level assigned to each third party.

The assessment of work centers and processing centers gives privacy leaders a depiction of potential risks related to protecting personal information and related business processes.

Organizations retaining sensitive and personal information usually have a formal records retention schedule that stipulates the maximum time that various records are retained. When specific records exceed their storage period, those records are discarded.

Devices and media require specialized protection techniques to prevent the compromise of personal information.

Organizational changes in the form of mergers, acquisitions, and divestitures have a high potential for disrupting the organization’s privacy program. The potential impacts include a change in scope for a program, the gain (or loss) of privacy and security staff, and the addition (or reduction) in regulations, standards, and other compliance obligations.

A privacy impact assessment (PIA) is a targeted risk assessment that identifies impacts on individual privacy and an organization’s ability to protect information resulting from a proposed change to a business process or information system.

A privacy threshold analysis (PTA) determines whether a process or system is associated with personal information. If so, the PTA will direct the initiation of a PIA when an organization is planning changes to a process or system.

Quick Review

•   The gold standard for process maturity is the Capability Maturity Model Integration (CMMI), originally developed at Carnegie Mellon University and now owned by ISACA.

•   The specific responsibilities for operations and security between an organization and any specific service provider may vary. It is vital that an organization clearly understand its specific responsibilities for each third-party relationship so that no responsibilities that may introduce risks to the organization are overlooked or neglected.

•   Organizations need to assess their third parties periodically to ensure that they remain at the right classification level. Third parties that provide a variety of services may initially be classified as low risk. However, in the future, if the third party is retained to provide additional services, this could result in reclassification at a higher level of risk.

•   Because personal information is often sensitive, an organization will employ techniques for destroying records so that they cannot be reconstituted.

•   Ideally, a privacy leader will be involved in the planning stages of a merger, acquisition, or divestiture to ensure the ongoing integrity of privacy operations in post-transaction organizations.

Questions

1. A privacy leader is documenting the current state of an organization’s privacy program so that progress over time can be better understood. The documentation of the current state is known as a(n):

A. Gap analysis

B. Risk assessment

C. Baseline

D. Audit

2. What is the purpose of the cloud services shared responsibility model?

A. Defines responsibilities when assigned to a project team

B. Defines which parties are responsible for which aspects of privacy

C. Defines which parties are responsible for which aspects of security and privacy

D. Defines which parties are responsible for which aspects of security

3. An organization that receives and transforms information on behalf of another organization is known as a:

A. Vendor

B. Fourth party

C. Controller

D. Processor

4. An organization retained a service provider for low-risk services, and the provider was classified at the lowest risk tier in the organization’s TPRM program. Later, the organization expanded its use of the service provider, which now collects personal information from customers. What, if any, change is required in the organization’s TPRM program?

A. No change is needed if the vendor’s contracts are unchanged.

B. Inform accounts payable of changes in payment levels.

C. Issue the questionnaire more frequently.

D. Reclassify the vendor’s risk tier and reassess accordingly.

5. An organization is negotiating a contract with a service provider classified at the highest vendor risk tier. The organization’s attorney is contemplating language in the right-to-audit section of the legal agreement. Which of the following is the best term to use?

A. Right to audit in the event of a new privacy law

B. Right to audit in the event of a confirmed breach

C. Right to audit in any circumstance

D. Right to audit in the event of a suspected breach

6. When assessing a third-party service provider that has been classified at a high-risk tier, which of the following is the best method for confirming the answers provided in a privacy assessment questionnaire?

A. Require that the service provider attest that the questionnaire is accurate.

B. Require that the service provider provide specific program artifacts.

C. Perform a site visit to observe controls.

D. Require that the service provider be certified to ISO/IEC 27701.

7. A new privacy leader wants to baseline the existing program to help identify improvements over time. Which of the following is NOT required for a baseline?

A. Format of privacy records

B. List of applicable regulations

C. Privacy program metrics

D. Size and competence of staff

8. An organization has sent a questionnaire to a selected vendor for performing expense management services. The vendor stated in the questionnaire that it does not perform security awareness training. What is the organization’s best response?

A. Accept the risk and proceed.

B. Contractually require the vendor to begin performing security awareness training.

C. Select a different service provider.

D. Create an entry in the risk register.

9. An organization wants to limit the use of USB external storage for the storage of personal information. What is the best first step to accomplish this?

A. Implement software to detect uses of USB storage of personal information.

B. Implement software to block uses of USB storage of personal information.

C. Create a policy that defines the limitations of USB storage.

D. Disable USB ports on end-user computers.

10. Which of the following represents the best practice for protecting sensitive data on laptop computers?

A. Encrypted over-the-air backup

B. User-directed encrypted directories

C. Encrypted thumb drives

D. Whole-disk encryption

11. Executive management is considering entering negotiations that, if successful, will result in the acquisition of another organization. What is the best time for the organization’s privacy leader to become involved in the acquisition?

A. During final negotiations

B. As early as possible

C. After negotiations have concluded

D. When the transaction closes

12. Two organizations are planning to merge later in the year. The privacy leader in organization A has determined that the maturity level of the privacy program in organization B is around 2, while the maturity level in the privacy leader’s organization is around 3.5. What should be the target maturity of the post-merger organization’s privacy program?

A. 1.5 – the difference between the maturity of the two programs

B. 3.5 – the highest common denominator of the two programs

C. 2.75 – an average of the two maturity levels

D. 2 – the lowest common denominator of the two programs

13. What is the purpose of a privacy threshold analysis (PTA)?

A. Determine the circumstances in which an organization should perform PIAs.

B. Determine whether an organization should ever perform PIAs.

C. Determine whether a full PIA is required in a given situation.

D. Determine whether the PIA was properly performed.

14. A PIA is:

A. A risk assessment focused on a proposed business process change

B. A risk assessment focused on a completed business process change

C. A vulnerability assessment focused on a new business process

D. A threat assessment focused on a new business process

15. A privacy strategist wants to eliminate data leakage opportunities in an organization’s workforce related to workers’ use of laptop and desktop computers. What is the best first step?

A. Perform a penetration test on a sampling of laptop and desktop computers to identify likely scenarios.

B. Block access to unsanctioned file storage sites.

C. Block access to personal webmail.

D. Perform a risk assessment to identify all data leakage scenarios and potential remedies for each.

Answers

1. C. The documentation of the characteristics of a process or program for later comparison is known as a baseline. Later measurements will reveal changes or improvements.

2. A. The cloud services shared responsibility model is a guide that helps organizations understand which categories of privacy and security activities are the responsibility of the service provider and which remain the responsibility of the organization using the service. Service providers vary somewhat in the details of the privacy and security capabilities they offer.

3. D. A processor is an organization that processes information on behalf of another organization, which is typically known as a controller. The terms controller and processor are defined in the GDPR but are used more broadly by privacy and security professionals.

4. D. When a vendor’s role expands and its risk classification changes, the organization will need to reassess the vendor at its new risk classification level. This may involve more frequent assessments, more rigorous assessments, or both.

5. C. The best starting point regarding right-to-audit language in a legal agreement with a service provider is to assert a right to audit in any circumstance. This is a strong opening position that may later be reduced to a right to audit in the event of a suspected or confirmed breach, which often is still an acceptable position.

6. C. A site visit is the best way to verify that controls are effective. However, site visits can be prohibitively expensive, which is why many organizations rely on other means, such as requiring the third party to provide program artifacts such as policy and procedure documents.

7. A. The format of initial state privacy records probably has little bearing on the overall health of the privacy program. However, the slate of applicable regulations, a sampling of program metrics, and the size and competency level of program staff are all relevant starting points for a baseline.

8. B. If the organization has already made a decision to use this provider, the best way forward is to contractually require the service provider to implement a security awareness training program. This is a fundamental activity in an organization’s cybersecurity program; the risk should not be accepted.

9. A. The best first step to limiting or blocking the storage of personal information on external USB devices is to implement a tool that provides visibility into the use of USB storage. Then organizations can implement tools that block USB storage with foreknowledge of any issues that may arise. For example, there may be legitimate procedures involving USB storage and personal information that can be altered, or exceptions can be made, so that USB blocking does not disrupt these procedures.
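The visibility-first approach described here can be illustrated with a small sketch. The following is a minimal, assumption-laden example, not a substitute for a commercial data loss prevention tool: it scans a mounted USB drive for one crude indicator of personal information (email addresses), so an organization can see what is actually stored on removable media before deciding what to block. The function name and the single-pattern detector are illustrative only.

```python
import re
from pathlib import Path

# Crude detector for one common category of personal information:
# email addresses. Real DLP tools use far richer pattern libraries.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scan_for_personal_info(mount_point, max_bytes=1_000_000):
    """Return (file path, match count) pairs for files under mount_point
    that appear to contain email addresses. Reads at most max_bytes of
    each file and skips anything unreadable rather than failing the scan."""
    findings = []
    for path in Path(mount_point).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_bytes()[:max_bytes].decode("utf-8", errors="ignore")
        except OSError:
            continue  # locked or unreadable file; skip it
        hits = EMAIL_RE.findall(text)
        if hits:
            findings.append((str(path), len(hits)))
    return findings
```

In practice, the mount point would be something like /media/usb or E:\, and the findings would feed the exception-review process described above before any blocking control is switched on.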

10. D. Whole-disk encryption is the best practice for protecting sensitive data on laptop computers. Whole-disk encryption applies to all files stored on a laptop computer, and in enterprises with central management, end users cannot disable this encryption.

11. B. An organization’s privacy leader should be involved in acquisition proceedings as early as possible, so that they can understand the capabilities and risks associated with the target organization’s business early enough to influence the terms of the acquisition and keep privacy risks as low as possible.

12. B. The maturity level of the new, merged privacy program should be at least as high as the higher of the two pre-merger programs. No other information in this scenario suggests that a target other than 3.5 would be appropriate.

13. C. The purpose of a PTA is to determine the necessity for a PIA in each individual circumstance. Where a PTA determines that a proposed business change has no bearing on personal information, a PIA need not be performed. If, however, a PTA identifies the presence of personal information in a business process that is to be changed, then a PIA will need to be performed to understand the impact of the change on privacy.

14. A. A PIA, or privacy impact assessment, is a risk assessment that is focused on a planned change to a business process. The purpose of the PIA is to identify the potential privacy and compliance risks that would exist in the business process after the proposed changes are completed.

15. D. Before the privacy strategist can make specific recommendations, it is first necessary for the strategist to conduct a risk assessment to identify all reasonable data leakage methods and identify potential remedies for each. After the risk assessment has been completed, specific actions can be taken to reduce data leakage risks.
