Predictive profiling and its legal limits: Effectiveness gone forever

Hans Lammerant, Paul de Hert

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Scientific › peer-reviewed

Abstract

We examine predictive group profiling in the Big Data context as an instrument of governmental control and regulation. We first define profiling by drawing some useful distinctions (section 6.1). We then discuss examples of predictive group profiling from policing (such as parole prediction methods taken from the US) and from combatting fraud (the iCOV and SyRI systems in the Netherlands) (section 6.2). Three potential risks of profiling – the negative impact on privacy; social sorting and discrimination; and opaque decision-making – are discussed in section 6.3. We then turn to the legal framework. Is profiling by governmental agencies adequately framed? Are existing legal checks and balances sufficient to safeguard civil liberties? We discuss the relationship between profiling and the right to privacy (section 6.4) and between profiling and the prohibition of discrimination (section 6.5). The jurisprudence on the right to privacy clearly sets limits to the use of automated and predictive profiling. Profiling and data screening that interfere indiscriminately with the privacy of large parts of the population are disproportionate. Applications need some link to concrete facts to be legitimate. An additional role is played by the prohibition of discrimination, which requires strengthening through the development of audit tools and discrimination-aware algorithms. We then discuss the current safeguards in Dutch administrative, criminal procedure and data protection law (section 6.6), and observe a trend of weakening safeguards at the very moment when they should be applied with even more rigor. In our conclusion, we point to the tension between profiling and legal safeguards. These safeguards remain important and need to be overhauled to make them effective again.
Original language: English
Title of host publication: Exploring the boundaries of big data
Editors: B. van der Sloot, D. Broeders, E. Schrijvers
Publisher: Amsterdam University Press/WRR
Pages: 145-173
Number of pages: 29
Volume: 32
ISBN (Electronic): 978-90-4853-393-0
ISBN (Print): 978-94-6298-358-8
Publication status: Published - 2016

Cite this

Lammerant, H., & de Hert, P. (2016). Predictive profiling and its legal limits: Effectiveness gone forever. In B. van der Sloot, D. Broeders, & E. Schrijvers (Eds.), Exploring the boundaries of big data (Vol. 32, pp. 145-173). Amsterdam University Press/WRR.