The Risks and Pitfalls of Automated Decision Making

Advances in technology mean that it is now easier than ever to conduct workplace surveillance, but what do we mean by this? 

Workplace surveillance extends beyond monitoring via CCTV. Increasingly, it involves collecting data about employees by other methods and using that data to make decisions about them by automated means. This is known as automated decision-making. The General Data Protection Regulation (GDPR) requires additional safeguards to be put in place to protect individuals from being subject to automated decisions that have legal or similarly significant effects on them.

The UK GDPR does not prevent businesses from using automated systems, as long as the processing meets the requirements set out in Article 22 of the Regulation. Article 22 provides that this type of processing can only be carried out where one of the following exceptions applies:

  1. The decision is necessary for a contract;
  2. The decision is authorised by law; or
  3. The decision is based on the individual’s explicit consent.

In each case, suitable measures should be put in place to protect the rights and freedoms of the data subject. These should include the right to obtain human intervention on the part of the data controller, the right for the data subject to express their point of view, and the right to contest the decision. The data subject should also be able to obtain an explanation of a decision reached by automated means.

In a recent case before the Amsterdam Court of Appeal, a group of London-based employees successfully brought three cases against Uber and Ola Cabs for breaches of the GDPR. The employees sought, amongst other things, greater transparency about how decisions were made about them by automated means. They were able to bring their claims before the court in Amsterdam on the basis that this is where Uber’s European regional head office is located and where the personal data is controlled.

The first case involved Uber’s dismissal of four drivers, with limited human intervention in the company’s automated decisions to dismiss them. 

The second and third cases involved requests for access to personal data made by six Uber drivers and three Ola Cabs drivers. The requests were denied by both Uber and Ola Cabs. 

The Court rejected Uber’s argument that the UK-based workers did not have a right to bring the complaints before the Amsterdam Court of Appeal.

 


Automated Decision Making

In relation to the first case, the Amsterdam Court of Appeal held that there had been only limited human intervention in Uber’s decision to dismiss the workers. Meaningful human intervention would have prevented the decision from being regarded as ‘automated’. Uber argued that the drivers had been dismissed for ‘fraudulent activity’ after its risk team carried out a manual investigation into each of the workers. The Court, however, found that Uber had “not been able to make this sufficiently plausible”, concluding that “there was insufficient evidence of actual human intervention”.

Consequently, the Court has ordered Uber and Ola Cabs to inform the workers of the factors considered in reaching the decisions. 

Uber was also found to have profiled the workers and managed their performance through automated means. Profiling is often part of an automated decision-making process, whereby an individual’s personal data is used to evaluate certain aspects of them. Uber has now been ordered to disclose how automated decision-making and profiling are used to determine factors such as pay and pricing, and how work is allocated to drivers. The Court has also ordered Ola Cabs to disclose information about the automated decision-making used to determine worker earnings, and about the ‘fraud probability scores’ used in automatically allocating work and fares.

As part of their defence, both Uber and Ola Cabs argued that being obliged to explain the allegations made against workers, and other automated decisions negatively affecting them, would threaten their right to protect trade secrets. The Court rejected this argument, ruling that withholding such explanations would be entirely disproportionate when weighed against the impact on workers of unexplained automated dismissals and disciplinary decisions. 

What should employers be doing to ensure compliance with data protection law?

While the ruling arose from breaches of the EU GDPR, the relevant data protection principles are the same under the UK GDPR. Although automated decision-making can prove useful, especially where data is processed on a large scale, the additional obligations that apply must be recognised. 

As a matter of best practice, and to ensure compliance with the relevant data protection principles, we recommend that employers:

  • Be transparent about monitoring decisions, and inform individuals if their data is going to be used for automated decision-making purposes; 
  • Consider whether the processing is necessary for the relevant purpose;
  • Ensure that privacy notices and other relevant data protection documentation are up to date and accurately reflect your processing practices; 
  • Ensure that there is a process in place to deal with requests or appeals from individuals who are dissatisfied with how their data is being used; 
  • Carry out a Data Protection Impact Assessment (DPIA), as automated decision-making is likely to be regarded as high-risk processing; 
  • Adopt and implement procedures to mitigate any risks identified as part of the DPIA and to ensure compliance with the relevant requirements under the UK GDPR; and
  • If you intend to rely on explicit consent, ensure that it meets the threshold set out under the UK GDPR for the consent to be considered valid. 

If your business carries out automated decision-making or you are thinking about introducing a new technology to collect personal data and would like more information about your compliance obligations, please click on our contact us button on this page or sign up for our Data Protection Hub.

Expert
Maria Spencer
Solicitor