Profiling and automated decision-making: Legal implications and shortcomings

Chapter 6, pages 123-153. Springer, November 2018.
DOI: https://doi.org/10.1007/978-981-13-2874-9_6

Abstract

The increased use of profiling and automated decision-making systems raises a number of challenges and concerns. The underlying algorithms embody a considerable potential for discrimination and unfair treatment. Furthermore, individuals are treated as passive objects of algorithmic evaluation and decision tools and are unable to present their values and positions. They are no longer perceived as individuals in their own right: all that matters is the group they are assigned to. Profiling and automated decision-making techniques also depend on the processing of personal data, and a significant number of the available applications are highly privacy-intrusive. This article analyses how the European General Data Protection Regulation (GDPR) responds to these challenges. In particular, Art. 22 GDPR, which provides the right not to be subject to automated individual decision-making, as well as the information obligations under Art. 13 (2) (f) and Art. 14 (2) (g) GDPR and the access right under Art. 15 (1) (h) GDPR, are examined in detail. General data protection principles, particularly the principle of fairness, as well as specific German scoring provisions and anti-discrimination rules, are also considered. In conclusion, various shortcomings of the present legal framework are identified and discussed, and a short outlook on potential future steps is presented.
