Automated decisions are decisions based solely on data processing, without human intervention. Examples of such processing are the automatic rejection of an online credit application or an online recruitment procedure conducted without any human involvement.
According to the General Data Protection Regulation, such automated decisions are only permissible if consumers explicitly consent to them, if they are necessary for entering into or performing a contract, or if Union or Member State law authorises them. In these cases, data subjects can demand human intervention on the part of the controller in the automated decision-making process. Consumers also have the right to express their own point of view on the decision and to contest the automated decision.
In addition, the General Data Protection Regulation stipulates that data subjects must be informed about the “logic involved” and the effects of the automated decision. What exactly is meant by the “logic involved” has yet to be clarified by the courts. The Article 29 Working Party, the EU’s advisory body on data protection law, has published an interpretation. According to the Working Party, explaining the “logic involved” does not mean providing a detailed explanation of complex algorithms. Rather, it means explaining in a simple and understandable manner which criteria fed into the automated decision. In addition, concrete examples should be used to illustrate the significance of an automated decision for data subjects.
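As a purely illustrative sketch of the Working Party’s reading, a controller’s notice about the “logic involved” might simply list the decision criteria in plain language rather than describe the algorithm itself. All criteria names and descriptions below are invented for illustration:

```python
# Hypothetical sketch: rendering a plain-language notice about the
# "logic involved" in an automated decision. It lists the criteria that
# were taken into account, not the internals of the algorithm.

CRITERIA = {
    "payment_history": "whether past invoices were paid on time",
    "income_to_rate_ratio": "the ratio of declared income to the requested instalment",
    "existing_loans": "the number of loans currently being repaid",
}

def explain_logic(used_criteria):
    """Return a simple, understandable description of the decision criteria."""
    lines = ["This automated decision took the following criteria into account:"]
    for key in used_criteria:
        lines.append(f"- {CRITERIA[key]}")
    return "\n".join(lines)

print(explain_logic(["payment_history", "existing_loans"]))
```

Such a notice addresses the transparency duty without disclosing trade secrets about the underlying model.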
The two terms profiling and scoring are closely linked to the concept of automated decision-making. Profiling is the automated processing of personal data in order to make predictions about individual persons or groups of people, for example about expected work performance, likely health, or personal preferences. Profiling is often used in online advertising, for instance to place personalised ads based on presumed interests or preferences.
Scoring is a special form of profiling that is mainly used in connection with loans or insurance contracts. In credit scoring, for example, the probability of repaying loans is calculated on the basis of the subject’s data.
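A minimal sketch of how such a repayment probability might be computed is a logistic model over a few applicant attributes. All field names and weights here are invented for illustration; real credit-scoring procedures are far more elaborate:

```python
import math

# Hypothetical credit-scoring sketch: a logistic model mapping applicant
# attributes to an estimated repayment probability. Weights are invented.

WEIGHTS = {
    "on_time_payment_rate": 3.0,     # share of past payments made on time
    "debt_to_income": -2.5,          # existing debt relative to income
    "years_of_credit_history": 0.1,  # length of credit history
}
BIAS = -0.5

def repayment_probability(applicant: dict) -> float:
    """Estimated probability that the applicant repays the loan."""
    z = BIAS + sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # logistic function maps z to (0, 1)

applicant = {"on_time_payment_rate": 0.95, "debt_to_income": 0.3,
             "years_of_credit_history": 7}
print(round(repayment_probability(applicant), 2))  # → 0.91
```

The resulting probability value is the “score” that a lender would then feed into its decision on the contract.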
In Germany, scoring is regulated by the Federal Data Protection Act. The data used to calculate the probability value must be based on a scientifically recognised mathematical-statistical procedure and must not rely exclusively on the address data of those concerned. If address data are used, data subjects must be informed of this.
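The rules on address data can be pictured as a simple compliance check: a score may not rest on address data alone, and any use of address data triggers a notification duty. The field names below are invented for illustration:

```python
# Hypothetical compliance check reflecting Section 31(1) BDSG:
# scoring must use more than address data (no. 3), and if address data
# are used, the data subject must be notified in advance (no. 4).

ADDRESS_FIELDS = {"street", "postcode", "city"}

def check_scoring_inputs(used_fields: set) -> dict:
    uses_address = bool(used_fields & ADDRESS_FIELDS)
    return {
        # permitted only if at least one non-address field is used
        "permitted": bool(used_fields - ADDRESS_FIELDS),
        # notification duty applies whenever address data are used
        "notification_required": uses_address,
    }

print(check_scoring_inputs({"postcode", "payment_history"}))
# → {'permitted': True, 'notification_required': True}
```

A score based on `{"postcode", "city"}` alone would fail the check, while one that adds behavioural data passes but still requires notification.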
Article 22 GDPR (Automated individual decision-making, including profiling)
1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
2. Paragraph 1 shall not apply if the decision:
(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;
(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or
(c) is based on the data subject’s explicit consent.
3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.
4. Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place.
Source: Regulation (EU) 2016/679 (see also recital 71)
Article 14(2) GDPR (Information to be provided where personal data have not been obtained from the data subject)
In addition to the information referred to in paragraph 1, the controller shall provide the data subject with the following information necessary to ensure fair and transparent processing in respect of the data subject:
(g) the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.
Source: Regulation (EU) 2016/679 (see also recitals 60, 61 and 62)
Section 31 BDSG (Protection of commercial transactions in the case of scoring and credit reports)
(1) For the purpose of deciding on the creation, execution or termination of a contractual relationship with a natural person, the use of a probability value for certain future action by this person (scoring) shall be permitted only if
1. the provisions of data protection law have been followed;
2. the data used to calculate the probability value are demonstrably essential for calculating the probability of the action on the basis of a scientifically recognized mathematic-statistical procedure;
3. other data in addition to address data are used to calculate the probability value; and
4. if address data are used, the data subject was notified ahead of time of the planned use of these data; this notification shall be documented.
(2) The use of a probability value calculated by credit reporting agencies to determine a natural person’s ability and willingness to pay shall be permitted in the case of including information on claims only as far as the conditions of subsection 1 are met and only claims concerning a performance owed which has not been rendered on time are considered
1. which have been established by a final decision or a decision declared enforceable for the time being, or if an executory title has been issued under Section 794 of the Code of Civil Procedure,
2. which have been established under Section 178 of the Insolvency Act and have not been disputed by the debtor at the verification meeting,
3. which the debtor has explicitly acknowledged,
4. for which
a) the debtor has received at least two written reminders after the due date of the claim,
b) at least four weeks have elapsed since the first reminder,
c) the debtor was previously informed, at least in the first reminder, of possible consideration by a credit reporting agency and
d) the debtor has not disputed the claim, or
5. the contractual relationship on which the claim is based can be terminated without prior notice for payment in arrears and the debtor has been informed of possible consideration by a credit reporting agency.
The lawfulness of the processing, including the calculation of probability values, of other data relevant for credit reports pursuant to general data protection law shall remain unaffected.
Section 37 BDSG (Automated individual decision-making, including profiling)
(1) In addition to the exceptions given in Article 22 (2) (a) and (c) of Regulation (EU) 2016/679, the right according to Article 22 (1) of Regulation (EU) 2016/679 not to be subject to a decision based solely on automated processing shall not apply if the decision is made in the context of providing services pursuant to an insurance contract and
1. the request of the data subject was fulfilled, or
2. the decision is based on the application of binding rules of remuneration for therapeutic treatment and the controller takes suitable measures, in the event that the request is not granted in full, to safeguard the data subject’s legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision; the controller shall inform the data subject of these rights no later than the notification indicating that the data subject’s request will not be granted in full.
(2) Decisions pursuant to subsection 1 may be based on the processing of health data as referred to in Article 4 no. 15 of Regulation (EU) 2016/679. The controller shall take appropriate and specific measures to safeguard the interests of the data subject in accordance with Section 22 (2), second sentence.