Data is power: Towards additional guidance on profiling and automated decision-making in the GDPR

Frederike Kaltheuner, Elettra Bietti


In contrast to automated decision-making, profiling is a relatively novel concept in European data protection law. It is now explicitly defined in Article 4(4) of the EU General Data Protection Regulation (GDPR), and refers to the automated processing of data (personal and non-personal) to derive, infer, predict or evaluate information about an individual (or group), in particular to analyse or predict an individual’s identity, attributes, interests or behaviour.
Through profiling, highly sensitive details can be inferred or predicted from seemingly uninteresting data, leading to detailed and comprehensive profiles that may or may not be accurate or fair. Increasingly, profiles are being used to make or inform consequential decisions, from credit scoring, to hiring, policing and national security.
Ever since the approval of the regulation in 2016, debates have focussed on the GDPR’s potential to limit or offer protection against increasingly sophisticated means of processing data, in particular with regard to profiling and automated decision-making. While the GDPR offers new rights and protection, their scope and limits are open to debate, partly due to the clumsy syntax of the relevant articles and the lack of authoritative guidance concerning their interpretation.
The European Data Protection Board, which will replace the Article 29 Working Party (the Working Party on the Protection of Individuals with regard to the Processing of Personal Data), is specifically tasked with publishing ‘guidelines, recommendations and best practices’ on the GDPR. In October 2017, the Working Party published draft guidance on profiling and automated decision-making.
In this article we offer suggestions to contribute to the development of guidelines that provide the strongest protections for data subjects.


Copyright (c) 2018 Frederike Kaltheuner, Elettra Bietti

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.