Risk class: What is the assistant used for?

The risk class is determined by the purpose of the AI assistant. The AI Act divides use into different categories:

High-risk
  • Typical municipal areas: Use in education, employment and recruitment, access to essential services (including welfare), and democratic processes.
  • Clarifications and exceptions: If the assistant performs a limited, procedural task that does not pose a risk to health, safety or fundamental rights, an exception can be claimed.

Not high-risk
  • Use that poses minimal risk, for example tasks that perform a purely limited procedure, or that are not intended to replace or significantly affect a human assessment.

Prohibited use
  • Systems with an unacceptable risk to fundamental rights (e.g. deliberate manipulation, social scoring, or assessing a person’s risk of committing a crime).
  • Clarifications and exceptions: Largely irrelevant for municipalities.

Exceptions are very common for municipal use of AI assistants. Even if an assistant is used in an area originally defined as high-risk (such as education or welfare), the use can qualify for an exception from the strictest requirements. This applies particularly if the assistant:

  • Is not intended to replace or influence a human assessment.
  • Is designed to perform a limited, procedural task.
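The classification logic above (prohibited use, high-risk areas, and the exception for limited procedural tasks) can be sketched as a small decision helper. This is purely an illustration of the reasoning, not a legal tool: the flag names, the `HIGH_RISK_AREAS` set, and the `classify` function are all hypothetical simplifications of the AI Act's actual criteria.

```python
from enum import Enum

class RiskClass(Enum):
    PROHIBITED = "prohibited"
    HIGH_RISK = "high-risk"
    NOT_HIGH_RISK = "not high-risk"

# Illustrative shorthand for the high-risk areas named above;
# these labels are not the AI Act's own terms.
HIGH_RISK_AREAS = {"education", "employment", "essential services",
                   "democratic processes"}

def classify(area: str,
             unacceptable_risk: bool,
             limited_procedural_task: bool,
             replaces_human_assessment: bool) -> RiskClass:
    """Sketch of the risk-class reasoning described in the text."""
    # Prohibited use trumps everything else.
    if unacceptable_risk:
        return RiskClass.PROHIBITED
    if area in HIGH_RISK_AREAS:
        # Exception: a limited, procedural task that does not replace
        # or significantly affect a human assessment.
        if limited_procedural_task and not replaces_human_assessment:
            return RiskClass.NOT_HIGH_RISK
        return RiskClass.HIGH_RISK
    return RiskClass.NOT_HIGH_RISK

# Example 2 below (rule lookups for schools): an education-area assistant
# that only explains published rules and makes no individual assessment.
print(classify("education", False, True, False).value)  # not high-risk
```

The point the sketch makes is that the area alone does not settle the class: an assistant in an area such as education still falls outside the high-risk class when both exception conditions hold.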

To illustrate that there are many safe and highly useful use cases that are not classified as high-risk, here are two typical examples from municipal day-to-day work:

Example 1: Quality assurance of case documents (Human assessment retained)

An adviser uses an AI assistant to quality-assure case documents before they are sent to the city council. The AI assistant refers to relevant templates and gives feedback on the documents: it checks that previous case reference numbers are cited in the right place, that the content meets minimum requirements, and that the written style meets plain-language requirements.

  • Why is this an exception? The AI assistant is only used to improve the quality of work the caseworkers have already done. The adviser reads through the feedback, makes their own assessments, and manually edits the documents before sending. The assistant neither replaces the human assessment nor makes independent decisions.

Example 2: Rule lookups for schools and after-school programmes (Limited and procedural)

A school department head builds an AI assistant based on the municipality’s own admission regulations, enrolment rules and priority criteria. The purpose is for colleagues to easily look up answers to procedural questions: What documentation is required when applying for an after-school place? What are the deadlines? How are the school catchment zone boundaries defined?

  • Why is this an exception? Even though this concerns “education”, the AI assistant only performs a limited, procedural task — explaining rules already published on the municipality’s website. It makes no assessment of an individual child or a specific case. The employee takes the information, assesses it, and writes a reply to the parents.