Statement on the assessment memorandum on the need to regulate automated decision-making within public administration in general legislation (TAS 296/2020, issued on 31 August 2020)
The Ministry of Justice requested a statement from the Ombudsman for Equality concerning the assessment memorandum on the need to regulate automated decision-making within public administration in general legislation (Publications of the Ministry of Justice, Reports and guidelines 2020:14).
The Ombudsman for Equality stated that the need for general legislation on automated decision-making is clear. It is important that legislation concerning a new type of decision-making system is prepared carefully. The Ombudsman for Equality considered it especially important to ensure that automated decision-making will not lead to discrimination prohibited by the Act on Equality between Women and Men or the Non-Discrimination Act. In fact, in the opinion of the Ombudsman for Equality, ensuring non-discrimination and promoting equality should be taken into account in the preparation of the regulations as comprehensively as data protection, for example.
The Ombudsman for Equality stated that automated decision-making carries a risk of discriminatory decisions. For example, if statistical data is used as the basis for assessing a discretionary matter, assumptions about an individual may be made on the basis of gender or other prohibited grounds for discrimination without taking the individual's circumstances into account. There is also a risk of indirect discrimination in statistics-based decision-making if a seemingly neutral criterion in reality classifies individuals based on gender, age or origin, for example.
The more discretion the decision-making requires, the higher the risk of a discriminatory decision. Because of this, in the view of the Ombudsman for Equality, automated decision-making should only be used in situations in which the law leaves the authority no room for discretion regarding the end result of the decision. In fact, the Ombudsman for Equality supported the solution presented in the memorandum, according to which automated decision-making should be limited to situations where a decision can be mechanically derived from legislation based on known, straightforward facts and involves no element of discretion. The Ombudsman for Equality considered it necessary to limit the scope of application in this way in order to minimise the risk of discrimination arising from automated decision-making.
Similarly, the Ombudsman for Equality supported the proposed starting point, according to which automated decision-making under the general legislation could only be based on deduction rules pre-determined by the relevant authority in accordance with the legislation, and not, for example, on learning artificial intelligence. In the view of the Ombudsman for Equality, this would also reduce the risk of discriminatory decisions.
The Ombudsman for Equality considered it important that the prohibition of discrimination is also taken into account when an authority establishes an automated decision-making system and defines the deduction rules used as the basis of decision-making. The Ombudsman for Equality supported the proposal that the decision-making rules used in an automated decision-making system should be approved with an express decision. In addition, the Ombudsman for Equality supported the proposal to impose on the authority an obligation to present a description of the decision-making rules applied in automated decision-making and of the selection of issues to be decided automatically. In the view of the Ombudsman for Equality, these regulations promote the transparency of the decision-making system, which is key to monitoring that the system is non-discriminatory and to proving potential discrimination after the fact, among other things.
The Ombudsman for Equality stated that it is important to ensure that faults in automated decision-making can be addressed quickly. Self-regulation by the authority is a good method for this purpose. In ex post facto monitoring, it should also be ensured that feedback on the decision-making system from outside the authority can be processed effectively and that the resulting needs for correction are addressed. It would also be worth investigating whether a separate monitoring system should be created for automated decision-making in addition to self-regulation by the authorities and ordinary legal protection measures.