Chapter 15

The Legal Challenges of Big Data Application in Law Enforcement

Fraser Sampson

Abstract

This chapter considers the specific issues that Big Data presents for law enforcement agencies (LEAs). In particular, it looks at the dilemmas created for LEAs seeking to use the advantages Big Data gives them while remaining compliant with the developing legal framework governing privacy and the protection of personal data, and how those very advantages can present challenges in law enforcement.

Keywords

Dilemma; Human rights; Jurisdiction; Law enforcement; Privacy; Purpose limitation

Introduction

Big Data “calls for momentous choices to be made between weighty policy concerns” (Polonetsky and Tene, 2013). Those policy concerns must balance the most efficient and effective use of available resources against the fundamental rights and freedoms of individuals. One of the weightiest policy concerns is that of law enforcement. The law enforcement setting raises several dilemmas for Big Data; because Big Data is such an expansive, dynamic, and complex subject, this chapter is necessarily selective and succinct.
In the opinion of the EU’s Article 29 Data Protection Working Party,1 “Big Data” refers to exponential growth in both the availability and the automated use of information. Big Data refers to “gigantic digital datasets held by corporations, governments and other large organisations, which are then extensively analysed using computer algorithms.”

Attractions of Big Data

One of the principal attractions—if not the principal attraction—of Big Data is its enabling of analytics, the almost limitless power that attends the super-synthesis of information.
Offering what perhaps are the obverse attractions of nano-technology, Big Data’s giga-analytics can produce macro-level pictures of trends, pathways, and patterns that might reveal pictures hitherto unseen even by the data owners. Such tele-analytics allow not only a better understanding of what may be happening here and now, but a reliable basis for predictions of what is to come.
Aside from the obvious attraction for commercial suppliers trying to understand, predict, and influence consumer behavior, Big Data analytics also holds out a phenomenological capability for law enforcement agencies in trying to understand, predict, and influence behaviors of offenders and potential offenders.
As Professor Akhgar from CENTRIC2 puts it, “When we look at ways to advance the use of data and analytics for public security and safety, the potential has never been greater. We now have the computing power to not only understand past events, but also to create new knowledge from billions of data points—quickly. In minutes, we can run analyses that used to take days” (Akhgar, 2014).

Dilemmas of Big Data

With so much data so readily available, one might ask on what basis would law enforcement agencies (LEAs) not seize it and run with it as far and as fast as possible, if doing so meant preventing terrorist attacks, disrupting serious organized crime, or preventing wide-scale child sexual exploitation, human trafficking, and so forth?
Take, for example, successful work in Greater Manchester3 that has shown the power of having a range of agencies literally in the same room. Why not have the totality of their data virtually present in the same place, too? Because Big Data can be applied to mass datasets to reveal high-level trends and patterns, it might be thought that the extent to which it can assist in preventing and detecting criminality is limited. Not necessarily. As the Article 29 Working Party4 noted, not only can the awesome capability offered by Big Data be used to identify general trends and macro-correlations, it can also be processed—rapidly and almost effortlessly—to directly affect the individual.5
From a practical operational perspective, then, there is a vast potential for Big Data in law enforcement. From a legal perspective, the point at which Big Data focuses this astonishing power on individuality can become highly contentious. One such point is where it is used for law enforcement, whether that is in the context of criminological extrapolation or criminal suspect extradition.
The challenging question from a pragmatic law enforcement perspective is: If information is lawfully held within the databases of willing and socially responsible organizations that might help prevent people becoming victims of crime or bring perpetrators to justice, why would LEAs not only feel justified in accessing those data but obliged to do so?
Part of the answer is that the application of informatics within a law enforcement environment is arguably different from that of Big Data application in most other settings. There are several strands to the answer, first among which is the high level of legal regulation of this area. Yes, there are substantial and significant exceptions within most legal data frameworks to allow access by LEAs to data held by others, particularly when their principal purpose is to prevent or investigate crime or pursue the interests of national security, but they are not always that clear and seldom amount to a blank check. Before looking more closely at some of the components of the law enforcement dilemma, it is necessary to look at the broad components of the legal framework within which the pragmatic law enforcement activity takes place.

Legal Framework

The legal framework regulating the Big Data challenges for law enforcement in the United Kingdom (UK) is dominated by the framework that applies throughout all European Union (EU) member states. The primary components (but by no means all) of that framework are to be found in:
• The European Convention on Human Rights
• The European Charter of Fundamental Rights
• EU Data Protection Directive 95/46/EC
• The Council of Europe Convention 1086—providing the main point of reference for the directive applying to data protection in policing and criminal justice
• The Data Protection Act 1998 (based on the central principles of the Directive)
• The Freedom of Information Act 2000, which created rights of access to information, superseding the Code of Practice on Access to Government Information and amending the Data Protection Act 1998 and the Public Records Act 1958
• The Protection of Freedoms Act 2012, a very wide-ranging act making provision for: the retention and destruction of fingerprints, footwear impressions, and DNA samples and profiles taken in the course of a criminal investigation; the requirement that schools and further education colleges obtain parental consent before processing the biometric information of a child under 18 years of age; the further regulation of closed circuit television, automatic number plate recognition, and other surveillance camera technology operated by the police and local authorities; the need for judicial approval before local authorities can use certain data-gathering techniques; and data provision with respect to parking enforcement and counter-terrorism powers.
These are supported, extended, and elaborated upon in various other instruments too numerous to list here7 (for a guide, see Bignami, 2007; Holzacker and Luif, 2013).
Article 13 of the EU Directive provides that “member states may adopt legislative measures to restrict the scope of the obligations and rights provided for in Article 6 (1)…when such a restriction constitutes a necessary measure to safeguard…national security; defence; public security; the prevention, investigation, detection and prosecution of criminal offences.” However, a qualified test must be applied to any restriction to ensure that the legislative measure meets the criteria that allow derogation from a fundamental right. There are two limbs to this test: First, the measure must be sufficiently clear and precise to be foreseeable; second, it must be necessary and proportionate, consistent with the requirements developed by the European Court of Human Rights.

Human Rights

Much of the legislation and jurisprudence relating to data protection across the EU derive from human rights and fundamental freedoms. Clearly, there is not the space here to review the legal and political provenance of this subject. However, it is worth pausing at this stage to note and distinguish the two “distinct but related systems to ensure the protection of fundamental and human rights in Europe” (Kokott and Sobotta, 2013). The first, the European Convention on Human Rights, is probably known and understood by law enforcement personnel in the UK better than the second. The Convention is an international agreement between the States of the Council of Europe, to which all EU member states are party, as are non-EU states such as Switzerland, Russia, and Turkey. Matters engaging the Convention are ultimately justiciable in the European Court of Human Rights, which has jurisdiction over actions brought by individuals against member states for alleged breaches of human rights, and a substantial body of jurisprudence has been built up around this area.
The second, less familiar system arises from the jurisprudence of the Court of Justice of the European Union (ECJ), which guarantees the protection of fundamental human rights within the EU. Respect for these rights is part of the core constitutional principles of the EU. Both systems are engaged by some activities around data capture, retention, and analysis, but a key distinction in relation to Big Data is that for most purposes, human rights protections treat the protection of personal data as an extension of the right to privacy.8 (Article 8 of the European Convention on Human Rights incorporates this in the respect for an individual’s private and family life, home, and correspondence.) Article 8 prohibits interference with the right to privacy, except where such interference is in accordance with the generally applicable departures from the Convention article necessary in a democratic society.9 The EU Charter of Fundamental Rights, however, specifically enshrines data protection as a fundamental right in itself (somewhat unhelpfully under Article 8). This is distinct from the protection of respect for private and family life (Article 7). The Charter also establishes the principle of purpose limitation, requiring personal data to be processed “fairly for specified purposes” and stipulating the need for a legitimate basis for any processing of such data.
Even the EU’s own legal framework for enshrining rights and freedoms for data subjects is not immune from challenge. For example, the ECJ found that the Data Retention Directive10 allowed the data retained under its aegis to be kept in a manner so as to allow the identity of the person with whom a subscriber or a registered user had communicated to be revealed as well as identify the time of the communication and the place in which that communication occurred.11 The Directive sought to ensure that data were available to prevent, investigate, detect, and prosecute serious crimes, and that providers of publicly available electronic communications services or of public communications networks were obliged to reveal the relevant data. The ECJ held that those data might permit “very precise conclusions to be drawn concerning the private lives of the persons, whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them.” The ECJ also held that the retention of data might have a chilling effect on the use of electronic communication covered by the Directive on the exercise of freedom of expression guaranteed by Article 11 of the Charter of Fundamental Rights.12
Then there is the indiscriminate—or at least non-discriminating—nature of Big Data analytics. The automation of processing is not just a strength; it is almost a sine qua non of Big Data use. The dilemma for agencies tasked with the exercise of discretionary powers is that the greater the automation, arguably the less scope there is for intervention by the controlling mind and the application of discretion (which, as once described by Lord Scarman,13 is the police officer’s daily task). Much has been written and said of the use of “no fault” or “without cause” powers by the police and the absence of Scarman’s “safeguard of reasonable suspicion” (see, e.g., Staniforth, 2013), and the general trend for law enforcement in the UK has been to move away from the blanket application of powers.
Interference by a member state with an individual’s rights under the European Convention must be “necessary in a democratic society” and have a legitimate aim to answer a “pressing social need,” but even then an identified interference must be proportionate and remains subject to review by the Court (Coster v. United Kingdom, 2001; 33 EHRR 479).14 Whereas the relationship between accuracy and reliability is clearly important in any form of data analysis, when the analysis is used at the level of the individual, biometrics, demographics, and social epidemiology take on a different legal quality. Almost by definition, Big Data deals with the supra-personal, the yotta-aggregation of data that is unconcerned with the binary constructs of personal identity and individuality.
However, the Working Party puts it thus: “The type of analytics application used can lead to results that are inaccurate, discriminatory or otherwise illegitimate. In particular, an algorithm might spot a correlation, and then draw a statistical inference that is, when applied to inform marketing or other decisions, unfair and discriminatory. This may perpetuate existing prejudices and stereotypes, and aggravate the problems of social exclusion and stratification.”15
Just how little information Big Data needs to pinpoint an individual can be seen in Tene’s (2010) graphic citing of research that has shown how “a mere three pieces of information—ZIP code, birth date, and gender—are sufficient to uniquely identify 87 per cent of the US population.”
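The arithmetic behind such re-identification is easy to sketch. The following Python fragment, using a small invented dataset (the records and values are purely illustrative, not drawn from the research Tene cites), counts how many records are pinned down by the three quasi-identifiers quoted above: any record whose combination of ZIP code, birth date, and gender occurs only once in the dataset is uniquely identifiable from those attributes alone, even though no name appears anywhere.

```python
# Sketch of quasi-identifier uniqueness counting.
# The records below are hypothetical, purely for illustration.
from collections import Counter

# Each record holds only the three attributes cited by Tene (2010):
# (ZIP code, birth date, gender). No names or identifiers are present.
records = [
    ("10001", "1980-03-14", "F"),
    ("10001", "1980-03-14", "F"),  # shares a combination: not unique
    ("10001", "1975-07-02", "M"),
    ("60614", "1990-11-30", "F"),
    ("94110", "1988-01-09", "M"),
]

combo_counts = Counter(records)

# A record is re-identifiable from these attributes alone when its
# combination occurs exactly once in the dataset.
unique = [r for r in records if combo_counts[r] == 1]
share_unique = len(unique) / len(records)

print(f"{len(unique)} of {len(records)} records "
      f"({share_unique:.0%}) are uniquely identified")
```

Scaled up to population-level data, the same per-combination counting is what produces uniqueness estimates of the kind quoted above; the point is that no sophisticated analytics are required, only a tally.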

Purpose Limitation and Further Processing

Within the legal framework protecting human rights are several key and interlinking concepts. The first such concept is purpose limitation. Purpose limitation is a key legal data protection principle16 that appears (as discussed above) in both limbs of the European framework engaging with data protection: the Convention on Human Rights and the EU Charter of Fundamental Rights. Through this framework the law seeks to protect data subjects (in crude shorthand, those individuals to whom the relevant data relate) by setting limits, albeit flexible, on how the data controllers (equally crudely, those who are able to manage and direct the manner in which the data are used) are able to use their data.
Purpose limitation, which has parallels in other jurisdictions (such as Article 6 of Law n. 121/1981 in Italy; see Chapter 16 for more information), has two components. First is purpose specification, which means that the collection of certain types of data such as “personal data”17 must be for a “specified, explicit, and legitimate” purpose. The second component is “compatible use.” This means that the data must not be further processed (see below) in a way that is incompatible with those purposes.
Arguably, the whole concept of Big Data analytics is predicated on some further, perhaps even ulterior, processing of data collected as a separate set or for a different, more specific purpose. The subsequent use of data represents a key barrier to lawful processing because of the requirement for compatibility. That is not to say that there can be no further processing, but such processing as there is will generally need to be compatible with the original lawful purpose or be exempt from that compatibility requirement. Even the recycling of personal data that have already been made publicly available remains subject to the relevant data protection laws.
An important aspect of the further processing issue is the nature of the relationship between the controller and the data subject; in general terms, compatibility assessments should be more stringent if the data subject has not been given sufficient—or any—freedom of choice.
Exemptions for processing personal data within the UK are widely drafted and include purposes such as the administration of justice, statutory functions, and public interest provisions, which cover the work of a whole range of public bodies. However, the number of community outcomes for which the police alone are responsible is vanishingly small and (certainly in the UK) almost every activity that keeps people safe and thriving is the product of collaborative enterprise and partnership. This level of engrenage is not specifically reflected by the law regarding data protection and processing. There are restrictions on data sharing, particularly when the organizations involved are in different jurisdictions. Then there are limitations on the aggregation and analysis of huge datasets generally, which can present barriers to the proper activities of LEAs and problems regarding reliability of extrapolation, interpolation, and identification. Public bodies such as police forces have no general power to share data and must do so only when they are able to indicate a power (expressed or implied) that permits them to do so.18
A key challenge of Big Data for law enforcement therefore arises from the almost total reliance on partnerships within the British neighborhood policing model, which makes sectoral and functional separation (i.e., separation into public health, education, research) all but impossible. The best one can hope for is to identify the legitimate outcomes toward which the law enforcement partnership is working, understand the key elements of the relevant data protection framework applicable to that setting, and aim for compliance.
The relevant legislative frameworks, however, presuppose a “neat dichotomy” (Tene, 2010), whereas the increasingly collaborative manner in which businesses operate precludes a neat dichotomy between controllers and processors. Many decisions involving personal data have become a joint exercise between customers and layers upon layers of service providers. With the rise of cloud computing and the proliferation of online and mobile apps, not only the identity but also the location of data controllers have become indeterminate (Tene, 2010).
This is challenging enough when the LEAs and partners are within EU member states. When non-member states are involved—as occurs in many cases, particularly those involving serious organized crime—there is an additional requirement of “adequacy of protection.” It is a key principle of the relevant legislation in member states that personal data must not be transferred outside the European Economic Area (EU member states and Norway, Iceland, and Liechtenstein) unless there is an ensured adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data.

Public Trust and Confidence

Finally, and perhaps most important, there is public trust. The consensual model of policing in the UK entirely depends on the support of the communities within which the police operate. The principal factor keeping relative order on the streets of the UK is not so much the presence of 140,000 police officers; rather, it is the legitimacy (Stanko, 2011) they enjoy among the 60 million people who tolerate and support them.
Some key features of Big Data, such as behavioral targeting, have a different cachet in LEA settings, and the history of data processing within UK policing has not been without its difficulties. There have been various legal challenges to the use and retention of personal data by the police: for example, S & Marper v. United Kingdom (2008) ECHR 1581 (police retention of DNA samples of individuals arrested, but who are later acquitted or have the charges against them dropped, was a violation of the right to privacy) and R (on the application of GC & C) v. The Commissioner of Police of the Metropolis (2011) UKSC 21 (successful challenge to an Association of Chief Police Officers policy allowing the retention of biometric samples, DNA, and fingerprints indefinitely save in exceptional circumstances).
Police monitoring of public protests has produced a series of legal challenges in which LEAs have not always managed to strike the fine balance between the obligations of the state to ensure the security and safety of its citizens and its duty to ensure the protection of their human rights and fundamental freedoms (see The Queen (on the application of Catt) v. The Association of Chief Police Officers of England, Wales and Northern Ireland and The Commissioner of Police for the Metropolis (2013) EWCA Civ 192). The Catt case involved a lawful demonstration and the indefinite retention of data about the applicant on the National Domestic Extremism Database. The case shows that even where the relevant event takes place in public, the recording and retention of personal data about individuals involved can be an unlawful interference with the right to respect for private life under Article 8 of the European Convention on Human Rights.
Aside from the litigious challenges over operational retention and use of personal data, the police have also experienced the ignominy of having their official recognition removed by the Office for National Statistics because their data processing approaches for recording crime were found to be unreliable. The police found themselves the subject of a Parliamentary report called “Caught red-handed: Why we cannot count on police recorded crime statistics,” published by the Public Administration Select Committee,19 whose chair, Bernard Jenkin, MP, said in the press release accompanying the report: “Poor data integrity reflects the poor quality of leadership within the police. Their compliance with the core values of policing, including accountability, honesty and integrity, will determine whether the proper quality of Police Recorded Crime data can be restored.”20 Shortcomings in data quality and reliability in the LEA context are not just about compliance: they can have real and immediate detrimental impacts on and within the criminal justice process.21
The Public Administration Committee’s report was followed by a report of HM Inspector of Constabulary on the reliability of crime recording data created and maintained by the police forces of England and Wales.22 The interim report published on May 1, 2014, which drew upon several previous reports, referred to the Inspectorate’s “serious concerns” about the integrity of police crime recording data.
Conversely, the failings of the police in England and Wales to retain relevant data in a searchable and shareable way, so as to enable the tracking of dangerous offenders such as Ian Huntley,23 were widely reported and criticized in the Bichard Report,24 which led to wholesale changes in the police approach to operational information technology capabilities.
The corrosive effect of such cases and the media’s reporting of them can be expected to damage public trust and confidence in the police and to affect the legitimacy they need to operate. When taken against the wider international context of “data-gate” and the Snowden revelations25 of how governments have been using Big Data analytics and high-tech information and communications technology monitoring capabilities, this reduced trust and confidence represents a serious impediment to even the lawful and compliant use of Big Data by LEAs in the future, particularly as we move into an era of “omniveillance” (Blackman, 2008).

Conclusions

Although the attractions of Big Data for LEAs are immediate and obvious, so, too, are the dilemmas it creates. The benefits of a capability of the scale offered by Big Data are readily apparent in every aspect of law enforcement, particularly where technology is used by perpetrators. For example, where the proscribed activities take place within the galactic setting of social media communications (such as radicalization in terrorism and the online grooming of children and vulnerable victims in sexual offending, influencing behaviors and searching out prospects), the modus operandi almost invites a Big Data approach to both detection and prevention.
It is one thing to get private organizations from the retail sector or business-to-business suppliers working to certain data protocols, but what about LEAs? Staples such as individual consent and the right to be forgotten become much more difficult to apply, whereas exceptions such as the investigation, detection, and prevention of crime or—even broader—the public interest are much more readily applicable.

How Far Should Big Data Principles Such as “Do Not Track” and “Do Not Collect” Be Applicable to LEAs, Either in Qualified Format or at All?

Can the developing legal framework around human rights and concepts such as privacy and identity offer sufficient protection, engender legitimacy, and foster public trust? At this point, the proposed Data Protection Regulation (Article 6 (4)) contains a broad exception from the compatibility requirement and, if enacted, will allow a great deal of latitude for the further processing of personal data, including a subsequent change of contractual terms. This potentially allows a data controller not just to move the goal posts, but to wait and see where the ball lands and then erect the goal around it. How will such relaxation of the rules be viewed by citizens, and what safeguards can they legitimately expect from their states?
When it comes to Big Data, the higher the stakes, the greater the challenges for LEAs that risk being condemned for not using all available data to prevent terrorist atrocities or cyber-enabled criminality and damned if they do so to the detriment of individual rights and freedoms.
As Polonetsky and Tene (2013) put it: “The NSA revelations crystallized privacy advocates’ concerns of ‘sleepwalking into a surveillance society’ even as decision-makers remain loath to curb government powers for fear of terrorist or cybersecurity attacks.”
One thing seems certain: The continued expansion of Big Data capability will inflate the correlative dilemmas it presents to our LEAs.
The resolution of the dilemmas of Big Data for LEAs—and by extension, for their partners in key areas such as safeguarding, fraud prevention, and the proper establishment of the rule of law in cyberspace—will be as much a challenge for the law as the technology. The dilemmas for LEAs are but one example of how our legal systems and principles need to catch up with the practices of their citizens’ lives. It will need a new breed, a form of lex veneficus,26 perhaps, to work alongside the technical wizards who have set the height of the Big Data bar.

References

Akhgar B. Big Data and public security. Intelligence Quarterly Journal of Advanced Analytics. 2014;2Q:17–19.

Blackman J. Omniveillance, Google, privacy in public, and the right to your digital identity: a tort for recording and disseminating an individual’s image over the Internet. Santa Clara Law Review. 2008;49:313–392.

Bignami F.E. Privacy and law enforcement in the European Union: the data retention directive. Chicago Journal of International Law. 2007;8:233–255.

Boehm F, Cole M.D. Data Retention after the Judgment of the Court of Justice of the European Union. 2014 (Münster/Luxembourg).

Holzacker R.L, Luif P. Freedom, Security and Justice in the European Union: Internal and External Dimensions of Increased Cooperation after the Lisbon Treaty. New York: Springer Science+Business Media; 2013.

Kokott J, Sobotta C. The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR. International Data Privacy Law. 2013;3(4):222–228.

Polonetsky J, Tene O. Privacy and Big Data: making ends meet. Stanford Law Review Online. 2013;66:25.

Stanko B. Observations from a decade inside: policing cultures and evidence based policing. In: 5th SIPR Annual Lecture. Scottish Police College. Delivered 20 October 2011. 2011.

Staniforth A. In: Sampson F, ed. The Routledge Companion to UK Counter-Terrorism. London: Routledge; 2013.

Tene O. Privacy—the new generations. International Data Privacy Law. 2010:1–13.


1 Article 29 Data Protection Working Party 00569/13/EN WP 203 Opinion 03/2013, p. 35.

2 The Centre for Excellence in Terrorism, Resilience, Intelligence and Organised Crime Research at Sheffield Hallam University, UK.

3 See “Greater Manchester against crime: A complete system for partnership working,” available at: https://www.ucl.ac.uk/jdi/events/mapping-conf/conf-2005/conf2005-downloads/dave-flitcroft.pdf.

4 This Working Party is made up of EU member state national data protection authorities and is an independent advisory body on data protection and privacy. Established under Article 29 of the Data Protection Directive (95/46/EC), its role is to contribute to the uniform application of the Directive across member states.

5 Data Protection Working Party loc. cit.

6 Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data, Council of Europe Treaties 108 (01/1981).

7 See also, for example, Framework Decision 2008/977/JHA for the protection of personal data processed in the framework of police and judicial cooperation in criminal matters (Data Protection Framework Decision) and the Council Decision 2008/615/JHA of June 23, 2008 on the stepping up of cross-border cooperation, particularly in combating terrorism and cross-border crime (the Prum Decision).

8 For an unusual police-related case, see ECtHR June 25, 1997, Halford v. The United Kingdom (no. 20605/92, 1997-III).

9 See, for example, Copland v. The United Kingdom (no. 62617/00 Reports of Judgments and Decisions 2007-I); ECtHR January 12, 2010, Gillan and Quinton v. The United Kingdom (no. 4158/05, Reports of Judgments and Decisions, 2010).

10 EU Data Retention Directive 2006/24/EC.

11 Judgment in Joined Cases C-293/12 and C-594/12 Digital Rights Ireland and Seitlinger and Others.

12 For a fuller explanation, see Boehm and Cole (2014).

13 Report on the Brixton Disorders, April 10–12, 1981 (Cmnd. 8247), February 4, 1984.

14 See also Article 40 of the UN Convention on the Rights of the Child of 1989, which states that it is the right of every child alleged to have infringed a penal law to be treated in a manner consistent with the promotion of the child’s dignity and worth, reinforcing the respect for the child’s human rights and fundamental freedoms.

15 Loc. cit. at p. 45.

16 Article 6 (1)(b) of Directive 95/46/EC of the European Parliament and of the Council of October 24, 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (OJ L 281, November 23, 1995, p. 31).

17 Personal data in England and Wales means data relating to an identified/identifiable living individual (Data Protection Act, 1998).

18 For instance, the Anti-terrorism, Crime and Security Act, 2001, s. 17.

19 Report of the Public Administration Select Committee 13th session 2013/14 HC 760, The Stationery Office, London.

20 See http://www.parliament.uk/business/committees/committees-a-z/commons-select/public-administration-select-committee/news/crime-stats-substantive/.

21 See http://www.telegraph.co.uk/news/uknews/crime/11117598/Criminals-could-appeal-after-Home-Office-admits-potentially-misleading-DNA-evidence-presented-to-juries.html.

22 See http://www.justiceinspectorates.gov.uk/hmic/programmes/crime-data-integrity/.

23 Convicted on December 17, 2003 of the murder of 10-year-old schoolgirls Holly Wells and Jessica Chapman.

24 Report of the Bichard Inquiry HC 653 June 22, 2004, The Stationery Office, London.

25 See http://www.theguardian.com/world/the-nsa-files.

26 Literally a legal magician.
