What does the UK GDPR say about automated decision-making and profiling?

In detail

The UK GDPR gives people the right not to be subject to solely automated decisions, including profiling, which have a legal or similarly significant effect on them. These provisions restrict when you can carry out this type of processing and give individuals specific rights in those cases.

  • What type of processing is restricted?
  • What does ‘solely’ automated mean?
  • What types of decision have a legal or similarly significant effect?
  • Automated decision-making systems are a key part of our business operations – do the UK GDPR provisions mean we can’t use them?
  • We profile our customers to send relevant marketing to them – does Article 22 stop us doing this?

What type of processing is restricted?

Article 22(1) of the UK GDPR limits the circumstances in which you can make solely automated decisions, including those based on profiling, that have a legal or similarly significant effect on individuals.

“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

Article 22(1)

What does ‘solely’ automated mean?

Solely means a decision-making process that is totally automated and excludes any human influence on the outcome. A process might still be considered solely automated if a human inputs the data to be processed, and then the decision-making is carried out by an automated system.

A process won’t be considered solely automated if someone weighs up and interprets the result of an automated decision before applying it to the individual.

Example

A factory worker’s pay is linked to their productivity, which is monitored automatically. The decision about how much pay the worker receives for each shift they work is made automatically by referring to the data collected about their productivity.

This is an example of solely automated decision-making.

Many decisions that are commonly regarded as automated actually involve human intervention. However, the human involvement has to be active and not just a token gesture. The question is whether a human reviews the decision before it is applied and has discretion to alter it, or whether they are simply applying the decision taken by the automated system.

Example

An employee is issued with a warning about late attendance at work. The warning was issued because the employer’s automated clocking-in system flagged the fact that the employee had been late on a defined number of occasions. However, although the warning was issued on the basis of the data collected by the automated system, the decision to issue it was taken by the employer’s HR manager following a review of that data.

In this example the decision was not taken solely by automated means.
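
To make the contrast between these two examples concrete, here is a minimal sketch in Python. It is purely illustrative and not taken from the guidance; the function and parameter names (automated_pay_decision, hr_manager_review and so on) are hypothetical. The structural point is that in the first pipeline no human can alter the outcome before it is applied, so the decision is solely automated, while in the second a human reviews the automated flag and has genuine discretion over the final decision.

```python
# Illustrative sketch only - hypothetical names, not taken from the ICO guidance.

def automated_pay_decision(productivity_score: float) -> float:
    """Solely automated: pay for the shift is calculated and applied directly
    from the monitored productivity data, with no human influence on the outcome."""
    base_pay = 80.0
    bonus = 20.0 if productivity_score >= 0.9 else 0.0
    return base_pay + bonus  # applied to the worker as-is


def attendance_warning_decision(late_count: int, hr_manager_review) -> bool:
    """Not solely automated: the system only flags the data; a human with real
    discretion decides whether a warning is actually issued."""
    flagged = late_count >= 3  # automated flag from the clocking-in data
    if not flagged:
        return False
    # Active, meaningful human involvement - not a token gesture:
    return hr_manager_review(late_count)


# Example: the HR manager reviews the flagged data and decides not to issue a warning.
print(automated_pay_decision(0.95))                     # 100.0
print(attendance_warning_decision(4, lambda n: False))  # False
```

If the reviewer could not in practice depart from the automated flag, that review would be a token gesture and the decision would still count as solely automated.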

What types of decision have a legal or similarly significant effect?

A decision producing a legal effect is something that affects a person’s legal status or their legal rights. For example, a decision about whether a person, in view of their profile, is entitled to a particular social benefit conferred by law, such as housing benefit.

A decision that has a similarly significant effect is something that has an equivalent impact on an individual’s circumstances, behaviour or choices.

In extreme cases, it might exclude or discriminate against individuals. Decisions that might have little impact generally could have a significant effect for more vulnerable individuals, such as children.

Example

A social security process which automatically evaluates whether an individual is entitled to benefit and how much to pay is a decision ‘based solely on automated processing’ for the purposes of Article 22(1).

As well as having a legal effect, the amount of benefit received could affect a person’s livelihood or ability to buy or rent a home, so this decision would also have a ‘similarly significant effect’.

Other examples of decisions with a similarly significant effect include:

  • automatic refusal of an online credit application; or
  • e-recruiting practices without human intervention.

Example

An individual applies for a loan online. The website uses algorithms and automated credit searching to provide an immediate yes/no decision on the application.

Example

As part of their recruitment process, an organisation decides to interview certain people based entirely on the results achieved in an online aptitude test. This decision has a significant effect, since it determines whether or not someone can be considered for the job.

By contrast, the following example is unlikely to have a significant effect on an individual:

Example

Recommendations for new television programmes based on an individual’s previous viewing habits.

If you are unsure whether a decision has a similarly significant effect on someone, you should consider the extent to which it might affect, for example, their:

  • financial circumstances;
  • health;
  • reputation;
  • employment opportunities;
  • behaviour; or
  • choices.

The guidelines produced by WP29 include more advice on identifying legal or similarly significant effects.

Automated decision-making systems are a key part of our business operations – do the UK GDPR provisions mean we can’t use them?

Used correctly, automated decision-making is useful for many businesses. It can help you to interpret policies correctly and make decisions fairly and consistently.

The UK GDPR recognises this and doesn’t prevent you from carrying out profiling or using automated systems to make decisions about individuals unless the processing meets the definition in Article 22(1), in which case you’ll need to ensure it’s covered by one of the exceptions in Article 22(2). See ‘When can we carry out this type of processing?’ for more information about the exceptions.

We profile our customers to send relevant marketing to them – does Article 22 stop us doing this?

Creating or applying a profile to someone in order to tailor your marketing is a decision about them, but for profiling or automated decision-making to be restricted by Article 22 there needs to be:

  • no human involvement; and
  • a legal or similarly significant effect on the individual.

The UK GDPR isn’t designed to stop you from running your business or promoting your products and services. However, there could be situations in which marketing may have a significant effect on the recipients. You need to think about your target market. For example, vulnerable groups of individuals may be more easily influenced and affected by behavioural advertising.

The UK GDPR highlights that children in particular deserve additional protection, especially where their personal information is used for the purposes of marketing and creating online profiles.

“Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles...”

Recital 38

Also remember that people can object to you profiling them for marketing purposes under Article 21. You must bring details of this right to their attention and present it separately from other information. If you receive this type of objection, you must stop the processing and confirm to the individual that you have done so within one month of receipt.
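
As a rough illustration of that obligation, the sketch below shows one way a system might record an Article 21 objection: it suppresses further marketing profiling for that person and notes the deadline for confirming that processing has stopped. It is an assumption-based example; the class and method names (MarketingProfileStore, record_objection) are hypothetical and the 30-day figure is only an approximation of one calendar month.

```python
# Hypothetical sketch of handling an Article 21 objection to marketing profiling.
from datetime import date, timedelta


class MarketingProfileStore:
    def __init__(self):
        self.profiling_enabled: dict[str, bool] = {}
        self.confirmation_due: dict[str, date] = {}

    def record_objection(self, person_id: str, received: date) -> date:
        """Stop profiling this person for marketing and note the deadline for
        confirming to them that the processing has stopped."""
        self.profiling_enabled[person_id] = False   # stop the processing
        due = received + timedelta(days=30)         # approx. one calendar month
        self.confirmation_due[person_id] = due
        return due


store = MarketingProfileStore()
deadline = store.record_objection("customer-42", date(2024, 5, 1))
print(f"Confirm to the individual by {deadline}")
```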

Any telephone or electronic direct marketing you carry out must also meet the requirements of the Privacy and Electronic Communications Regulations (PECR) – see our direct marketing guidance for more information.

Further reading - ICO guidance

ICO Guidance on Children and the UK GDPR

Further reading – European Data Protection Board

The European Data Protection Board (EDPB), which has replaced the Article 29 Working Party (WP29), includes representatives from the data protection authorities of each EU member state. It adopts guidelines for complying with the requirements of the GDPR. EDPB guidelines are no longer directly relevant to, and are not binding under, the UK regime. However, they may still provide helpful guidance on certain issues.

WP29 adopted guidelines on automated individual decision-making and profiling – Chapter IV, which have been endorsed by the EDPB.
