Binod Vaidya1 and Hussein T. Mouftah1
1School of Electrical Engineering and Computer Science (EECS), University of Ottawa, Ottawa, ON, Canada
An intelligent energy infrastructure, which links together various elements of city operations, would be one of the essential features in a smart city.
A smart grid is an electric power grid that employs information and communication technologies (ICTs) to improve its efficiency, reliability, resiliency, and flexibility. Such a grid not only provides consumers with information about their energy usage and costs so that they can make decisions independently but also transforms the electric power grid through remote monitoring and control, automation, and a self‐healing approach, as well as providing safe and reliable integration of distributed renewable energy resources (Strasser et al., 2015).
Smart meters are one of the key enablers of smart grids. An advanced metering infrastructure (AMI) allows bidirectional energy flows and utilizes two‐way communication, enabling energy service providers (ESPs) to receive consumers' energy consumption data and to send pricing or control signals back to consumers in real time. By measuring consumers' near real‐time energy consumption at high temporal resolution, AMI enables ESPs to control and optimize supply and distribution and even offer their customers pricing schemes based on current supply and demand (Siano, 2014). Distribution system operators can monitor the electric power grid at a higher sampling rate and granularity than before. Electricity customers can also benefit from smart meter deployment by receiving timely information about consumed power and managing their power consumption accordingly.
Furthermore, the smart grid allows consumers to sell electricity to the grid or other consumers by producing electricity using photovoltaic or wind turbines (Depuru, Wang, and Devabhaktuni, 2011b). Similarly, the smart grid also includes vehicle‐to‐grid (V2G), in which electric vehicles (EVs) can communicate with power grid operators to trade demand response services by delivering stored electricity into the electric power grid (Yu et al., 2016).
The AMI is a predominant and fundamental component in the development and deployment of the smart grid in smart cities; nonetheless, V2G networks for charging/discharging EVs are also expanding rapidly in urban areas.
Although the smart grid delivers numerous performance benefits to the electric power industry and also enables consumers to optimize their power consumption, the smart grid infrastructure (i.e., AMI and V2G) in the smart city has become increasingly susceptible to a wide range of cyber‐threats.
Basically, smart grid technologies capture customer data containing sensitive information that is used for various purposes such as real‐time pricing and demand response; however, privacy can be threatened and breached by a number of practices that are normally considered unacceptable (Finster and Baumgart, 2015; Li et al., 2012).
If realistic security and privacy‐preserving approaches are not employed in the smart grid, new challenges to security protection, privacy, and data protection will emerge.
The objective of this chapter is to provide insights into the privacy protection of electricity consumers in the smart city. It mainly focuses on aspects of privacy principles, including privacy by design (PbD). It also provides a basis for better understanding the current state of the art in privacy engineering as well as privacy impact assessment and privacy‐enhancing technologies. The rest of the chapter is organized as follows. Section 20.2 discusses privacy concerns in the smart grid, including AMI and V2G. Section 20.3 discusses privacy principles, while Section 20.4 presents privacy engineering, focusing on privacy protection goals and pertinent frameworks. Section 20.5 depicts privacy impact assessment, and Section 20.6 explores privacy‐enhancing technologies. Finally, Section 20.7 concludes the chapter.
The smart grid (i.e., AMI and V2G) introduces substantial benefits and opportunities to the smart city, but it also raises several challenges related to privacy. For instance, fine‐grained smart metering data and control messages provide ESPs with the information about real‐time electricity consumption status. However, using the fine‐grained electricity consumption, not only varied information about the consumer premises can be inferred but also the probability of leaking customers' privacy including personal information, daily activities, individual behaviors, etc., increases (Lisovich, Mulligan, and Wicker, 2010; Jokar, Arianpoo, and Leung, 2016).
Indeed, security and privacy are considered crucial components for the success of secure smart grid networks, including AMI and V2G networks. Privacy emphasizes the individual's ability to control the collection, use, and dissemination of his or her personally identifiable information (PII), whereas security offers mechanisms to ensure confidentiality and integrity of information and availability of ICT systems. Yet the notions of privacy and security do intersect, and ensuring privacy is more complicated than ensuring security (SGIPCSWG‐V1, 2014; SGIPCSWG‐V2, 2014; SGIPCSWG‐V3, 2014).
Privacy considerations in the smart grid embrace examining the rights, values, and interests of individuals. According to NIST IR 7628v2 (SGIPCSWG‐V2, 2014), four dimensions of privacy are considered: 1) privacy of personal information; 2) privacy of the person; 3) privacy of personal behavior; 4) privacy of personal communications. Though most smart grid stakeholders directly address only the first dimension, since most data protection laws and regulations mainly cover privacy of personal information, the remaining three dimensions are also essential privacy considerations in the smart grid.
Not only can potential risks in AMI become ingress points for adversaries, but potential insider risks can also be exacerbated to a magnified threat level. Serious privacy risks in AMI include eavesdropping, traffic analysis, statistical disclosure, and consumption profiling (Jiang et al., 2014; Krishna, Weaver, and Sanders, 2015). Malicious entities could use smart metering personal data for malevolent purposes such as identity theft, burglary, vandalism, stalking, etc.
Privacy concern is also one of the obstacles to the successful deployment of V2G networks. Privacy protection issues in V2G networks are more challenging than in AMI networks: owing to electric mobility (E‐mobility), EVs may join or depart the EV‐charging network frequently, so their privacy requirements are more stringent. Privacy issues in V2G networks include location privacy of EVs and privacy‐leaking attacks such as eavesdropping, man‐in‐the‐middle (MiTM), impersonation, Sybil, and physical attacks.
Primarily, potential privacy consequences of smart grid systems (i.e., AMI and V2G) include identity theft; determination of personal behavior patterns; determination of specific appliances used; real‐time surveillance; revelation of activities through residual data; targeted home invasions; accidental invasions; activity censorship; decisions and actions based upon inaccurate data; profiling; unwanted publicity and embarrassment; tracking the behavior of renters/leasers; behavior tracking (possibly combined with personal behavior patterns); location tracking; and public aggregated searches revealing individual behavior (Asghar et al., 2017; Cintuglu et al., 2017).
A primary data flow in the AMI encompasses customer electricity data from the smart meter to the utility. Smart meters take readings of fine‐grained customer electricity data and send them to the utilities, which use these data for a variety of purposes. For instance, the customer electricity data can be used for applying time‐variant pricing (i.e., time‐of‐use rates, critical peak pricing, real‐time pricing), for better understanding customer demand, or for detecting meter tampering (Depuru, Wang, and Devabhaktuni, 2011a; Amin et al., 2015). Utilities also make the data available to customers for optimizing their power consumption (Hubert and Grijalva, 2012). However, this information flow in AMI raises the following privacy concerns.
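As an illustration of the time‐variant pricing mentioned above, the following sketch computes a bill from fine‐grained interval readings under a hypothetical time‐of‐use schedule; the rates and period boundaries are invented for illustration and do not reflect any actual utility tariff.

```python
from datetime import datetime

# Hypothetical time-of-use (TOU) rates in $/kWh; real tariffs vary by utility.
TOU_RATES = {"off_peak": 0.08, "mid_peak": 0.12, "on_peak": 0.17}

def tou_period(hour: int) -> str:
    """Classify an hour of the day into a TOU period (illustrative boundaries)."""
    if 7 <= hour < 11 or 17 <= hour < 19:
        return "on_peak"
    if 11 <= hour < 17:
        return "mid_peak"
    return "off_peak"

def tou_bill(readings) -> float:
    """Compute a TOU bill from (timestamp, kWh) interval readings."""
    total = 0.0
    for ts, kwh in readings:
        total += kwh * TOU_RATES[tou_period(ts.hour)]
    return round(total, 2)

# One day's worth of sample interval readings for a single household.
readings = [
    (datetime(2024, 6, 3, 2, 0), 0.5),   # off-peak
    (datetime(2024, 6, 3, 9, 0), 1.2),   # on-peak
    (datetime(2024, 6, 3, 13, 0), 0.8),  # mid-peak
]
print(tou_bill(readings))
```

Note that exactly this dependence of the bill on when each kilowatt-hour was consumed is what requires the utility to collect the fine‐grained readings that raise the privacy concerns discussed next.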
A significant shift in the smart grid, relative to its predecessor, is that third parties will be enabled to collect metering data. In the case of V2G networks, charging station operators collect meter data and process them; these metering data may be transferred to the utility company for third‐party billing. Hence, when energy usage information is collected by a non‐utility‐owned metering device, the customer's privacy is a prime concern.
Cyber security is a critical concern owing to the growing potential of cyber‐attacks and incidents against critical energy infrastructures in the smart city. The cyber‐security solution for such an energy infrastructure must tackle not only intentional attacks from industrial espionage and hackers but also unintentional compromises of the electric power grid infrastructure due to user errors, equipment failures, and natural disasters.
Since private information about consumers (e.g., home energy consumption) is included in the energy usage data exchanged between multiple stakeholders of the smart grid, particular individuals could be harmed if the consumer data are not used with appropriate protection measures. Thus, privacy risks and challenges introduced by the smart grid (i.e., AMI and V2G) have to be properly addressed. Protecting critical energy infrastructures in the smart city has to be given great precedence, and technical measures to protect customers' privacy should be considerably prioritized.
Privacy protection comprises preventing any valuable information (i.e., private information) related to the identity of an entity to be known by other entities. Only well‐protected smart grid systems in the smart city ecosystem would be considered robust and secure.
The degree of privacy protection should be well premeditated. Applying PbD in conjunction with engineering aspects is critically important, such that a privacy impact assessment (PIA) can be properly conducted and appropriate privacy‐enhancing technologies (PETs) can be deployed.
There are several privacy principles that can help enforce privacy protection in ICT‐based systems, including smart grid networks.
Because system design and architecture may involve collecting PII, there is a possibility of violating privacy protection. A better methodology for protecting the privacy of consumer data is to apply a privacy‐by‐design (PbD) approach (IPCO, 2009; Cavoukian, 2011), originally proposed by A. Cavoukian in the 1990s. This approach identifies a set of foundational principles that should be followed when designing and developing privacy‐sensitive applications (IPCO, 2009). These seven foundational principles are depicted in Figure 20.1.
With a proactive approach, PbD embeds privacy directly into the design of technologies, business practices, and networked infrastructures. It treats privacy as a foundational requirement, consequently preventing privacy‐invasive occurrences before they happen. By making privacy the default setting within an organization, its customers' privacy can be well protected (Cavoukian, 2011).
Nevertheless, as an early approach for interpretation of privacy and personal information, the Federal Trade Commission (FTC), USA, developed the Fair Information Practice Principles (FIPPs) in 1973. Its core principles of privacy in the context of information are: 1) notice/awareness; 2) choice/consent; 3) access/participation; 4) integrity/security; and 5) enforcement/redress (Landesberg et al., 1998).
Different adaptations of the FIPPs have been demarcated. For instance, inferring privacy and data protection, the Guidelines on the Protection of Privacy and Trans‐border Flows of Personal Data were developed by the Organization for Economic Cooperation and Development (OECD) in 1980 and revised in 2013 as OECD Privacy Framework (OECD, 2013). The principles in the OECD documents have been widely adopted.
In order to realize a universal privacy framework, the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) have published ISO/IEC 29100 as an international standard in 2011 (ISO/IEC JTC 1/SC 27 IT Security techniques, 2011). Privacy principles described in the ISO/IEC 29100:2011 Privacy Framework are mainly derived from existing principles developed by various international bodies such as the FTC and OECD. This framework not only targets organizations and intends to support them in defining their privacy preservation requirements but also aims at enhancing the existing security standards by the privacy perspective whenever PII is processed. Hence, these privacy principles are used to guide the design, development, and implementation of privacy policies and privacy controls.
The American Institute of Certified Public Accountants (AICPA) and the Canadian Institute of Chartered Accountants (CICA) have developed privacy principles and applicable criteria, known as Generally Accepted Privacy Principles (GAPPs), in order to assist organizations in the design and implementation of comprehensive privacy practices and policies (Cornelius, 2009).
The relationship among three major privacy principles, namely, OECD Privacy Framework Privacy Principles, Privacy Principles of the ISO/IEC 29100:2011 Privacy Framework, and AICPA Generally Accepted Privacy Principles (GAPPs) is shown in Figure 20.2.
By studying privacy principles from other relevant industries that handle and store sensitive information, smart grid actors, i.e. utilities, can provide utmost protection to the consumer electricity data.
The National Institute of Standards and Technology (NIST), USA, is one of the central players in promoting the growth of the smart grid by developing a framework that embraces interoperable standards and protocols so that all components of the smart grid are able to work together. In this regard, NIST published Smart Grid Cyber Security Strategy and Requirements (NIST IR 7628) in 2010 and revised it in 2014 (SGIPCSWG‐V1, 2014; SGIPCSWG‐V2, 2014; SGIPCSWG‐V3, 2014). NIST IR 7628 Vol 2 (SGIPCSWG‐V2, 2014) is dedicated to privacy in the smart grid. NIST IR 7628 applies the FIPPs when deliberating privacy considerations for the smart grid and uses the GAPPs as one of its sets of privacy principles.
In IPCO (2010), by adapting the PbD approach with its seven foundational principles to the smart grid context, best practices for smart grid PbD have been created: smart grid systems should not only proactively embed privacy requirements into their designs and ensure that privacy is the default but also be visible and transparent to consumers and be designed with respect for consumer privacy. A related approach that emphasizes embedding privacy into the design of the electricity system is depicted in Cavoukian, Polonetsky, and Wolf (2010). Further application of privacy by design for third‐party access to customer energy usage data is provided in Cavoukian and Polonetsky (2013).
Wicker and Thomas (2011) have advocated a framework of privacy‐aware design practices for embedding privacy awareness into information networks, consisting of a set of principles derived from the FIPPs. This approach, comprising five privacy‐aware principles, is specifically intended for the demand response platform (Wicker and Schrader, 2011) and is depicted in Table 20.1.
Table 20.1 Five Privacy‐Aware Principles for the Demand Response Platform.
Privacy‐aware principle | Requirement |
Provide full disclosure of data collection | |
Require consent to data collection | |
Minimize collection of personal data | |
Minimize identification of data with individuals | |
Minimize and secure data retention | |
Provisioning a PbD approach alone is not sufficient, owing to the lack of holistic and systematic methodologies that address the intricacy of privacy and the absence of a translation of privacy principles into engineering activities (Alshammari and Simpson, 2016). Privacy engineering deals with designing, implementing, adapting, and evaluating guidelines, protocols, and techniques to methodically capture and address privacy issues in the development of ICT systems (Gurses and Del Alamo, 2016; Spiekermann and Cranor, 2009). Hence, privacy engineering focuses on providing guidance that enables organizations to make consistent decisions about resource allocation and effective deployment of controls in ICT systems in order to decrease privacy risks (Cavoukian, Shapiro, and Cronk, 2014).
There are not only several efforts to formulate concepts for privacy engineering but also approaches to define engineering, technical, and operational aspects for PbD (Hoepman, 2014; Kroener and Wright, 2014; Antignac and Le Metayer, 2015; Bringer et al., 2015).
Protection goals are regarded as authoritative components while evaluating information security of ICT systems and choosing appropriate technical and operational safeguards in various technologies (Meis, Wirtz, and Heisel, 2015).
Classically, information security in ICT systems features three security protection goals, namely confidentiality, integrity, and availability, commonly known as CIA triad. The CIA triad is typically deemed as critical to evaluate security conditions of the ICT systems. Confidentiality guarantees preserving authorized controls on information access and disclosure including means for protecting personal privacy and sensitive information from malicious people. Integrity ensures protecting against unauthorized and improper information alteration or destruction and includes ensuring information non‐repudiation and authenticity. Availability provides a guarantee of timely and reliable access to the information by authorized people.
Three privacy‐specific protection goals, namely unlinkability, transparency, and intervenability (the UTI triad) (Hansen, Jensen, and Rost, 2015), have been identified in order to strengthen the privacy perspective by accompanying the above security protection goals, i.e., the CIA triad.
Meaningful protection goals should balance the requirements derived from both triads (i.e., the CIA triad and the UTI triad) concerning legal, technical, and organizational processes. Considerations of fairness, impartiality, and accountability provide guidance for balancing the requirements as well as for determining better strategies and appropriate protections.
In a similar manner, NIST has also provided three privacy engineering goals—disassociability, predictability, and manageability (i.e., DPM triad)—for developing and operating privacy‐preserving ICT systems (Brooks et al., 2017). These goals are designed to enable ICT designers to build ICT systems that are capable of implementing an organization's privacy protection goals and reinforce the management of privacy risk.
The combination of the three privacy engineering objectives (i.e., the DPM triad), complemented by the CIA triad to address unauthorized access to personal information, stipulates a core set of information system capabilities that support the well‐balanced realization of business goals and privacy goals and assist in mapping controls to mitigate identified privacy risks.
The UTI triad and DPM triad can be loosely associated to provide privacy protection goals for a particular privacy‐preserving ICT system. A mapping between UTI and DPM triads is shown in Figure 20.3.
Numerous methodologies for designing privacy into modern ICT systems have been considered (Notario et al., 2014; Notario et al., 2015; Kung, 2014). Aspects of privacy engineering can be articulated by incorporating privacy requirements into the areas of the systems engineering life cycle (SELC), which can facilitate core privacy protection objectives and other organizational objectives. For some organizations, the prime motivation for privacy engineering may be regulatory compliance or reducing organizational risk.
Furthermore, MITRE Corporation has formulated a privacy engineering framework such that privacy engineering operationalizes the PbD logical framework within ICT systems (MITRE‐CoP, 2014) by:
Figure 20.4 demonstrates the mapping of the fundamental privacy engineering activities into stages of the typical SELC. Such a mapping holds for every SELC, including agile development.
Table 20.2 Privacy Engineering Activities and Methods.
Life cycle activity | Privacy method |
Privacy requirements definition | Baseline and custom privacy system requirements; privacy empirical theories and abstract concepts |
Privacy design and development | Fundamental privacy design concepts; privacy empirical theories and abstract concepts; privacy design tools; privacy heuristics |
Privacy verification and validation | Privacy testing and review; operational synchronization |
The primary life cycle activities for privacy engineering and privacy methods are listed in Table 20.2, and a brief discussion of life cycle activities for privacy engineering framework is given as follows.
Fhom and Bayarou (2011) proposed a step‐by‐step engineering flow for smart grid systems to address privacy concerns. The overall privacy‐aware engineering flow is depicted in Figure 20.5, and the steps and guidelines of the proposed methodology are outlined as follows.
Effective compliance with the privacy protection goals, privacy principles, and other regulations requires a greater understanding of the privacy risks in demand response systems (i.e., the smart grid).
Basically, a privacy risk model aims to provide a structured, repeatable, and quantifiable method for addressing privacy risk in ICT systems. The model may be defined with an equation and a series of inputs designed to enable (i) the identification of problems that may occur from the processing of personal information and (ii) the calculation of how such problems can be reflected in an organizational risk management approach that allows for prioritization and resource allocation to achieve organizational goals while minimizing overall adverse events.
In general, the system privacy risk is the product of three inputs: personal information collected or generated, data actions performed on that information, and the context surrounding the collection, generation, and processing of this personal information (Brooks et al., 2017).
As the system privacy risk is the risk of problematic data actions occurring, its inputs can be explained as follows.
The system privacy risk model can then be expressed as an equation, as shown in Figure 20.6.
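Since the equation itself appears only in Figure 20.6, a hedged sketch of such a model, interpreting system privacy risk as the sum over data actions of (likelihood the action becomes problematic in its context) times (impact on individuals), can illustrate the inputs described above. The data actions, likelihoods, and impact scores below are invented for illustration and are not NIST's exact formulation.

```python
# Illustrative privacy risk model: per-action risk = likelihood x impact,
# system risk = sum over all data actions. All values below are invented.

data_actions = [
    # (data action, likelihood of problematic outcome [0..1], impact [1..10])
    ("collect fine-grained meter readings", 0.30, 7),
    ("share usage data with a third party", 0.20, 9),
    ("retain data beyond billing period",   0.10, 5),
]

def action_risk(likelihood: float, impact: float) -> float:
    """Privacy risk of a single data action: likelihood x impact."""
    return likelihood * impact

def system_privacy_risk(actions) -> float:
    """Aggregate privacy risk of the system across all of its data actions."""
    return sum(action_risk(p, i) for _, p, i in actions)

# Rank data actions by risk to support prioritization and resource allocation.
for name, p, i in sorted(data_actions, key=lambda a: -action_risk(a[1], a[2])):
    print(f"{name}: {action_risk(p, i):.2f}")
print(f"system privacy risk: {system_privacy_risk(data_actions):.1f}")
```

Ranking the per-action risks in this way is one simple means of supporting the prioritization described in the organizational risk management approach above.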
Subsequently, a system privacy risk model can assist the organization in identifying possible privacy risks beyond security risks. Given the emphasis on the operations of the system while processing personal information, an ICT system's privacy risk can hence be described as a function of the likelihood that a data action causes problems for individuals.
A PIA is a comprehensive and methodical process for identifying and addressing privacy issues in an ICT system that processes PII. Basically, the PIA is used for determining the privacy, confidentiality, and security risks associated with the collection, use, and disclosure of PII as well as assessing the potential effects on privacy of a process and an information system. The PIA also describes the measures that can be used to mitigate and probably eliminate the identified risks.
The main goals of the PIA include: (i) to ensure that information handling obeys the pertinent legal, regulatory, and policy requirements regarding privacy; (ii) to determine the risks and effects of collecting, maintaining, and disseminating PIIs within the ICT system; and (iii) to examine and evaluate protections and alternate processes for handling information to alleviate potential privacy risks.
Being integral to the privacy risk treatment process for evaluating and managing privacy impacts and ensuring compliance with privacy protection rules and responsibilities, the PIA is considered an approach for making privacy engineering more specific and effectual.
In recent years, risk management focusing on PIA has started to take on a more prominent role in privacy and data protection (Wright, Finn, and Rodrigues, 2013; Kalogridis et al., 2014; Engage, 2011). Several organizations and regulatory entities have developed PIA guidelines and strategies, which often differ in scope, context, and goal, depending on the concerned organizations and their purposes (IPCO, 2015).
The ISO/IEC 29134 standard (ISO/IEC JTC 1/SC 27 IT Security techniques, 2017), which is currently under development, is a standard on privacy impact assessment that provides a set of guidelines for managing PIAs. The PIA guidelines of the ISO/IEC 29134 standard can be summarized as follows: (i) determine the PIA requisite in the given system and define the information flows and other privacy impacts; (ii) distinguish privacy risks and possible solutions; (iii) formulate and implement the PIA‐related recommendations; (iv) conduct a third‐party review and/or audit of the PIA; (v) update the PIA if revisions occur; and (vi) embed privacy awareness throughout the organization and ensure accountability. These guidelines assist the organization in conducting the resulting PIA.
In NIST IR 7628 v2 (SGIPCSWG‐V2, 2014), NIST has depicted a comprehensive consumer‐to‐utility PIA meant for the smart grid. Such a smart grid PIA activity delivers a structured, repeatable type of analysis aimed at determining how collected meter data can reveal personal information about individuals, and the focus of the PIA can be on a segment within the electric power grid or on the power grid as a whole. Privacy principles and corresponding recommendations for the smart grid high‐level consumer‐to‐utility privacy impact assessment, as depicted in NIST IR 7628 v2, are shown in Table 20.3.
Table 20.3 Privacy Principles and Corresponding Recommendations for Smart Grid High‐Level Consumer‐to‐Utility Privacy Impact Assessment.
Principle | Recommendations |
Management and accountability | • assign privacy responsibility • establish privacy audits • establish or amend incident response and law enforcement request policies and procedures |
Notice and purpose | • provide notification for personal information collected • provide notification for new information use purposes and collection |
Choice and consent | • provide notification about choices |
Collection and scope | • limit the collection of data to only that necessary for smart grid operations • obtain the data by lawful and fair means |
Use and retention | • review privacy policies and procedures • limit information retention |
Individual access | • access to energy usage data • dispute resolution |
Disclosure and limiting use | • limit information use • disclosure |
Security and safeguards | • associate energy data with individuals only when and where required • de‐identify information • safeguard personal information • do not use personal information for research purposes |
Accuracy and quality | • keep information accurate and complete |
Openness, monitoring, and challenging compliance | • policy challenge procedures • perform regular privacy impact assessments • establish breach notice practices |
Hofer et al. (2013) propose a PIA for the e‐Mobility system that mainly focuses on the ISO/IEC 15118 standard. The e‐Mobility PIA includes the following guidelines: (i) stipulating scope and purpose definition; (ii) identifying stakeholders; (iii) determining information assets; (iv) identifying information requirements and use; (v) determining information handling and other considerations; and (vi) conducting evaluation.
In the EU, several privacy risk assessment methodologies have been developed (Papakonstantinou and Kloza, 2015), for instance, the French Commission Nationale de l'informatique et des Liberte (CNIL) methodology for privacy risk management (CNIL, 2015) and the UK Information Commissioner's Office (ICO) privacy impact assessments code of practice. More prominently, the Data Protection Impact Assessment (DPIA) template for the smart grid and smart metering has been developed so that smart grid actors can conduct an assessment prior to deployment of any smart metering application (EU‐SGTF 2014).
Privacy‐enhancing technologies (PETs; Senicar, Jerman‐Blazic, and Klobucar, 2003) refer to methods that work in accordance with data protection laws to prevent situations that might result in violations of privacy. For instance, PETs allow electricity consumers to protect the privacy of the PII provided to and handled by other stakeholders in the smart grid. PETs are used to protect electricity consumers, largely against activity and behavioral analysis. Thus, PETs aim to protect privacy by minimizing personal data, thereby preventing excessive processing of personal data, without loss of functionality of the smart grid system (Jawurek, Kerschbaum, and Danezis, 2012).
The main objective of PETs is to protect personal data as well as to assure customers that their information remains confidential and that data protection is a priority for the service providers responsible for handling PII. In this regard, PETs aim to furnish functionality and benefits to all stakeholders and to make personal data available to third parties without disclosing any sensitive information (Kement et al., 2017; Jo, Kim, and Lee, 2016; Tonyali et al., 2017).
To address the privacy concerns in the smart grid, several privacy‐preserving protocols (PPP) and PETs have been proposed and deployed (Souri et al., 2014; Ferrag et al., 2016; Han and Xiao, 2016b). The majority of PETs are focused on AMI networks (Diao et al., 2015; Birman et al., 2015; Li et al., 2015; Wang, Mu, and Chen, 2016); however, some PETs are dedicated for V2G networks (Liu et al., 2014; Wang et al., 2015; Han and Xiao, 2016a; Liu et al., 2016).
Design of innovative PETs should assure consumers' privacy and allow ESPs to monitor and control the grid securely (Abdallah and Shen, 2016; He et al., 2017; Li et al., 2017; Liao et al., 2017). PETs can be classified into several categories: anonymization, trusted computation, cryptographic computation, perturbation, and verifiable computation (Jawurek, Kerschbaum, and Danezis, 2012).
The basic notion of anonymization is that the data consumer (e.g., ESP or utility) can still perform the needed calculations even though the direct association between a data item (i.e., a smart meter reading of consumed electricity) and the data producer (e.g., a household or customer) has been severed. In other words, an anonymization technique removes user‐specific features from metering data before sending it to the authorized data consumer, so that the ESP obtains anonymous metering data (i.e., data without any PII) that cannot simply be attributed to any specific customer. Thus, the ESP can perform statistical data processing and the required computations; however, it is difficult or impossible for data consumers to associate the received metering data with a specific smart meter, household, or electric vehicle.
The data consumer's inability to attribute information derived from electricity customer data items to their producers can be viewed as a substantial privacy‐enhancing effect of the anonymization. Thus, anonymity is one fundamental form of privacy protection that can be beneficial. Several privacy‐preserving mechanisms using the anonymization approach have been discussed.
Efthymiou and Kalogridis (2010) propose a mechanism for anonymizing customers' metering data using pseudonymous IDs through an escrow service by a trusted third party (TTP), so data consumers can be assured of the legality of the received metering data, but they would not be able to link the data to a specific customer. Similarly, Gong et al. (2016) propose utilizing two different IDs (i.e., anonymous ID and attributable ID) for providing privacy‐preserving incentive‐based demand response.
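The pseudonymous‐ID idea can be sketched minimally as follows: a trusted third party derives a period‐scoped pseudonym from the meter's real ID using a secret key it holds in escrow, so the data consumer sees only pseudonymous readings. This is a hedged illustration, not the actual escrow protocol of Efthymiou and Kalogridis (2010); all names, the key handling, and the pseudonym format are assumptions for the sketch.

```python
import hashlib
import hmac

# Key held only by the trusted third party (TTP) in a real deployment;
# hard-coded here purely for illustration.
ESCROW_KEY = b"ttp-secret-key"

def pseudonym(meter_id: str, billing_period: str) -> str:
    """Derive a period-scoped pseudonym; only the TTP can map it back to the meter."""
    msg = f"{meter_id}:{billing_period}".encode()
    return hmac.new(ESCROW_KEY, msg, hashlib.sha256).hexdigest()[:16]

def anonymize(reading: dict, billing_period: str) -> dict:
    """Strip PII from a reading, replacing the real meter ID with a pseudonym."""
    return {"pid": pseudonym(reading["meter_id"], billing_period),
            "kwh": reading["kwh"], "ts": reading["ts"]}

r = {"meter_id": "SM-12345", "kwh": 1.4, "ts": "2024-06-03T09:00"}
out = anonymize(r, "2024-06")
assert "meter_id" not in out                            # real ID removed
assert out["pid"] == pseudonym("SM-12345", "2024-06")   # stable within a period
```

Scoping the pseudonym to a billing period lets the data consumer link a customer's readings for billing within that period while limiting longer‐term tracking across periods.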
Badra and Zeadally (2014) propose a different approach to providing authorized data consumers with anonymous metering data, namely a virtual ring architecture. In contrast, Finster and Baumgart (2013) propose a pseudonymous smart metering protocol without a TTP, in which a smart meter uses a blinded pseudonym signed by the authorized data consumer, so the smart meter does not reveal its identity.
Rottondi, Mauri, and Verticale (2015) propose a data pseudonymization protocol, which uses a secret splitting scheme to construct a unique pseudonym from different intermediate trusted nodes such that once the data consumer receives all the shares attached with the same pseudonym, it can recover the metering data associated with the pseudonym. Furthermore, Vaidya, Makrakis, and Mouftah (2014) propose ID‐based partially restrictive blind signature for V2G network such that the blindness property of the e‐token keeps EV's real ID anonymous to the local aggregator.
Evidently, an anonymization technique is practical only if the computation result does not have to be attributed to a specific data producer. This technique can also be ineffective, since it is sometimes possible to re‐identify the owner of the data. Jawurek, Johns, and Rieck (2011) find that pseudonymized consumption traces (i.e., separated from PIIs) can still be attributed to individuals using auxiliary information, such as household observations that correlate power events with physical events.
In trusted computation approaches, the data consumer does not have direct access to the electricity usage information of the individual consumer. Instead, it only receives an aggregation of metering data, which is computed either by the data producers themselves or by an additional TTP introduced as an external aggregator. From the aggregation results alone, the data consumer cannot recover consumers' personal details (i.e., PIIs), yet it obtains sufficiently accurate aggregated metering data. The aggregation of metering data is mostly done in either a temporal (i.e., power traces of a single user over time) or a spatial (i.e., power traces of multiple users in a certain time interval) manner.
In this approach, the TTP is mainly used for providing unlinkability between readings and the smart meters and for supporting fraud/loss detection. However, the disclosure of individual data to the data consumer constitutes one of the major threats, and such disclosure can be performed by the aggregating entities. Thus, these types of privacy‐preserving protocols typically demand strong assumptions about the trustworthiness of the aggregating entities.
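A minimal sketch of spatial aggregation by a trusted external aggregator, assuming readings arrive as hypothetical (meter ID, time slot, watt‐hours) tuples:

```python
from collections import defaultdict

def spatial_aggregate(readings):
    """Trusted aggregator: sums the readings of all meters per time slot
    and forwards only the totals, never the individual traces."""
    totals = defaultdict(int)
    for meter_id, slot, wh in readings:
        totals[slot] += wh
    return dict(totals)

# Illustrative readings from two meters over two 15-minute slots.
readings = [
    ("m1", "08:00", 1200), ("m2", "08:00", 700),
    ("m1", "08:15", 900),  ("m2", "08:15", 1100),
]
print(spatial_aggregate(readings))  # {'08:00': 1900, '08:15': 2000}
```

The privacy guarantee here rests entirely on the aggregator discarding the per‐meter inputs, which is precisely the strong trust assumption noted above.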
Li, Luo, and Liu (2010) propose a mechanism that aggregates Paillier‐encrypted data while routing it through a minimum spanning tree of smart meters toward the authorized data consumer. Similarly, Ruj, Nayak, and Stojmenovic (2011) propose a two‐tier system for aggregation of smart metering data and subsequent access by authorized data consumers with the help of Paillier encryption.
Chen, Lu, and Cao (2015) propose a privacy‐preserving data aggregation scheme with fault tolerance (so‐called PDAFT) for smart grid communications that uses a homomorphic Paillier encryption technique to encrypt sensitive consumer data such that the data consumer can obtain the aggregated data without knowing individual ones.
In the cryptographic computation approach, either encryption schemes based on the homomorphic property or secret‐sharing schemes can be deployed. Metering data items arrive at the data consumer as either ciphertexts or secret shares, and the privacy‐preserving protocol should ensure that the data consumer can only decrypt the aggregate of the data items, not any individual data item.
In the homomorphic encryption technique, the individual power consumption items are encrypted by the data producers (i.e., smart meters) using the public key of the data consumer (Tonyali, Saputro, and Akkaya, 2015). Essentially, the encrypted data items undergo a homomorphic operation before reaching the data consumer; if the individual data items have to be aggregated, a homomorphic addition operation is employed. Finally, after receiving the aggregated data, the data consumer can obtain the decrypted result using its private key.
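The additive homomorphism can be sketched with a toy Paillier implementation in Python. The tiny hard‐coded primes are for illustration only; real deployments use moduli of 2048 bits or more:

```python
import random
from math import gcd

# Toy Paillier key generation with tiny primes (ILLUSTRATION ONLY).
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # Carmichael lambda of n

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    """Paillier encryption: c = g^m * r^n mod n^2 with random r coprime to n."""
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Each meter encrypts its reading; the aggregator multiplies the ciphertexts,
# which corresponds to adding the plaintexts (additive homomorphism).
readings = [120, 75, 310]
agg_cipher = 1
for m in readings:
    agg_cipher = (agg_cipher * encrypt(m)) % n2

print(decrypt(agg_cipher))  # 505 = 120 + 75 + 310
```

Only the holder of the private key (lam, mu) can decrypt, and it learns only the aggregate, not the individual readings.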
In a secret‐sharing scheme, a secret is split into multiple parts, and each part is given to an authorized participant. All, or a sufficient subset, of these participants have to contribute their shares to reconstruct the secret. In the context of the smart grid, a consumer's electricity consumption reading obtained from a smart meter can be used as the secret to be shared.
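A minimal sketch of additive secret sharing applied to spatial aggregation, assuming three non‐colluding aggregators; the modulus and data values are illustrative:

```python
import random

P = 2**31 - 1  # public prime modulus

def split(reading, n_shares):
    """Split a reading into n random shares that sum to it modulo P."""
    shares = [random.randrange(P) for _ in range(n_shares - 1)]
    shares.append((reading - sum(shares)) % P)
    return shares

# Three meters each split their reading among three aggregators;
# aggregator i receives share i from every meter.
readings = [120, 75, 310]
per_meter = [split(r, 3) for r in readings]

# Each aggregator sums the shares it received (each partial sum alone
# reveals nothing about any individual reading) ...
partials = [sum(shares[i] for shares in per_meter) % P for i in range(3)]

# ... and only combining all partial sums reveals the aggregate.
total = sum(partials) % P
print(total)  # 505
```

Privacy holds as long as the aggregators do not collude to pool all shares of a single meter.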
Garcia and Jacobs (2011) propose a privacy‐friendly energy metering system that uses an aggregation protocol based on the homomorphic property of encryption and specifically targets the detection of energy theft at substations. Likewise, Nateghizad, Erkin, and Lagendijk (2016) put forward a privacy‐preserving cryptographic protocol for smart metering based on homomorphic encryption that reduces communication cost and improves the efficiency of computation on encrypted inputs.
Tonyali et al. (2016a) investigate the feasibility and performance of fully homomorphic encryption (FHE) aggregation in smart grid AMI networks over the reliable transport protocol TCP and propose a novel packet reassembly mechanism for TCP to overcome the packet reassembly problem.
Rottondi, Verticale, and Capone (2012) describe an approach in which trusted privacy‐preserving nodes and a central configurator are introduced into the smart metering system, using Shamir's secret‐sharing scheme. Similarly, Rottondi, Fontana, and Verticale (2014) propose employing Shamir's secret‐sharing scheme in a privacy‐preserving mechanism for V2G networks, in which three types of EV data (i.e., plug‐in time period, charge level of the battery, and amount of recharged electricity) are split into parts. Each local aggregator in a set holds one part of the data, and the required information is reconstructed from the parts contributed by all the local aggregators.
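The following sketch shows Shamir's scheme and the linearity that makes it attractive for aggregation: adding shares pointwise yields shares of the sum. The field modulus and thresholds are illustrative choices, not those of the cited works:

```python
import random

PRIME = 2**61 - 1  # Mersenne prime used as the field modulus

def make_shares(secret, threshold, n_shares):
    """Shamir: the secret is the constant term of a random degree-(t-1)
    polynomial; share i is the polynomial evaluated at x = i."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME
    return [(i, f(i)) for i in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Because sharing is linear, adding shares pointwise shares the sum:
a = make_shares(120, 2, 3)   # reading of meter A, threshold 2 of 3
b = make_shares(75, 2, 3)    # reading of meter B
summed = [(x, (ya + yb) % PRIME) for (x, ya), (_, yb) in zip(a, b)]
print(reconstruct(summed[:2]))  # 195 = 120 + 75
```

Any two aggregators can jointly reconstruct the aggregate, while fewer than the threshold learn nothing about the individual readings.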
Privacy‐enhancing protocols using a perturbation technique add a certain amount of noise to individual data items or to the final aggregate such that the data consumer can still perform the required computation, but the result cannot be used to derive any sensitive information about the data producer (i.e., the privacy of the data producer is sufficiently protected).
One choice among perturbation‐based PETs is the differential privacy technique, in which a differentially private aggregation function appends an adequate quantity of random noise to its result so that individual input data items cannot be deduced from the function's result. Some PETs use perturbation mechanisms that add noise to every measurement in a distributed manner such that the resulting aggregate noise is just adequate for achieving differential privacy (Jawurek and Kerschbaum, 2012).
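A minimal sketch of the Laplace mechanism for a differentially private sum; the epsilon and sensitivity values are illustrative assumptions:

```python
import math
import random

def laplace(scale):
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_sum(readings, epsilon, sensitivity):
    """Epsilon-differentially-private sum: add Laplace(sensitivity/epsilon)
    noise to the true aggregate before releasing it."""
    return sum(readings) + laplace(sensitivity / epsilon)

# Illustrative readings in kWh; sensitivity is the assumed maximum
# contribution of any single household (bounded at 3 kWh here).
readings = [1.2, 0.7, 0.9, 1.1]
noisy_total = dp_sum(readings, epsilon=0.5, sensitivity=3.0)
```

The released noisy total remains useful for grid monitoring, while the calibrated noise masks whether any single household's reading was included.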
Acs and Castelluccia (2011) put forward a PET that guarantees differential privacy through an innovative way of adding appropriate noise to the computed aggregate; the distributed noise generation mechanism removes the need for a TTP to act as an aggregator. Similarly, Shi et al. (2011) describe an aggregation protocol in which individual data producers add random noise in such a way that the sum of the random noise of all data producers ensures differential privacy for the aggregate outcome. Every data producer encrypts its noisy measurement before sending it to the aggregator, which uses a final share that allows decrypting the differentially private aggregate of all measurements.
Bao and Lu (2015) propose a data aggregation scheme for smart metering, named differentially private data aggregation with fault tolerance (DPAFT), which can ensure differential privacy of data aggregation along with fault tolerance by applying the Boneh–Goh–Nissim cryptosystem.
Since smart meters periodically send fine‐grained power consumption data to the utility company in AMI networks, consumer privacy is one of the prime concerns. In Tonyali et al. (2016b), a meter data obfuscation scheme is proposed to protect consumer privacy from eavesdroppers and from the utility companies themselves while preserving the utility companies' ability to use the data for state estimation.
In the verifiable computation paradigm, the aggregator not only provides an aggregation outcome but also furnishes a proof that the computation has been performed as claimed. Hence, such privacy‐preserving protocols can be deployed in untrusted environments (i.e., with untrusted aggregators), since the untrusted aggregators can perform the required computations while the integrity of the aggregation outcome is guaranteed. Such protocols are normally based on zero‐knowledge proof (ZKP) protocols, in which the verifier validates the legitimacy of a statement made by the prover without any private information being revealed.
The verifiable computation–based PET is suitable for deploying protocols related to billing purposes. This is due to the fact that the verifiable computation protocols using ZKP can provide integrity and accuracy of the aggregate result that can be used for billing purposes without disclosing any private information.
In the case of the smart grid, the data producer can be cast as the prover and the data consumer as the verifier. The data producer (i.e., smart meter) computes the total energy consumption for a certain duration and sends it to the data consumer (i.e., utility company). Subsequently, the data consumer can verify the validity of the result without learning individual smart meter readings. Furthermore, a zero‐knowledge proof enables the data consumer to verify the cumulative price, that is, that the total amount due for the energy consumption has been calculated correctly.
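The commitment‐based verification idea can be sketched with toy Pedersen commitments, whose homomorphism lets the verifier check a claimed total against per‐interval commitments. The group parameters are illustrative; in practice h must be generated so that its discrete logarithm with respect to g is unknown, and full billing protocols add ZKPs over these commitments:

```python
import random

# Toy Pedersen commitment parameters (ILLUSTRATION ONLY).
p = 2**89 - 1   # Mersenne prime modulus
g, h = 3, 7     # in practice, log_g(h) must be unknown to the prover

def commit(m, r):
    """Pedersen commitment C(m, r) = g^m * h^r mod p: hiding (r is random)
    and binding (opening to a different m is infeasible)."""
    return (pow(g, m, p) * pow(h, r, p)) % p

# The meter commits to each interval's consumption (in Wh) with fresh blinds.
readings = [120, 75, 310]
blinds = [random.randrange(p - 1) for _ in readings]
commitments = [commit(m, r) for m, r in zip(readings, blinds)]

# The meter reveals only the totals; the verifier multiplies the
# per-interval commitments, which commits to the sum of the readings.
total_m = sum(readings)
total_r = sum(blinds)
product = 1
for c in commitments:
    product = (product * c) % p

print(product == commit(total_m, total_r))  # True: totals match commitments
```

The verifier thus confirms that the claimed total is consistent with the committed per‐interval readings without ever seeing those readings in the clear.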
Rial and Danezis (2011) propose an approach in which a smart meter produces metering data together with commitments to the data and a signature over the commitments and sends them to a user device. The user device obtains the tariff from the service provider and calculates the fee along with the required proof. The scheme implements ZKP protocols for various tariffs, including cumulative tariffs, interval linear tariffs, and even cumulative polynomial tariffs.
Wan, Zhu, and Wang (2016) propose a privacy‐preserving mechanism called PRAC (privacy via randomized anonymous credentials) for V2G communication, which ensures anonymous authentication and rewarding as well as unlinkable credentials and rewards; they use zero‐knowledge proofs to show that the scheme ensures integrity and anonymous authentication. Likewise, Rahman et al. (2017) propose a solution that establishes a secure and privacy‐preserving communication channel between a bidder and a registration manager and utilizes ElGamal public key encryption and the Schnorr signature scheme for ensuring zero‐knowledge proofs.
This work was funded by The Ontario Ministry of Energy Smart Grid Fund.