The use of ChatGPT and its peers to make work easier and faster – whether permitted, tolerated or prohibited – is already part of everyday working life in many companies. However, the spread of the technology has raced far ahead of the law, so the legal consequences of its use (employment rights and obligations, data protection, employee inventions, etc.) are in many cases still far from clear.

In one of the first judicial decisions in this area, the Hamburg Labour Court has recently addressed one particular unresolved question around the use of AI: the extent of the employer’s obligation to consult with works councils before AI is brought into the workplace. In its decision of 16 January 2024 (case no. 24 BVGa 1/24), the Court held that the works council has no right of co-determination pursuant to Section 87 (1) nos. 1, 6 and 7 Works Constitution Act where the employer chooses to permit the use of ChatGPT and comparable generative AI systems by staff using their own private web-based accounts, and to regulate that use by an AI policy. To be clear, a right of co-determination goes further than a right merely to be informed and consulted about workplace measures: co-determination is the strongest form of the works council’s participation rights. In matters of co-determination the employer may not make decisions without the consent of the works council; instead, the works council must authorise any planned measures, or the parties must come to an agreement (e.g. a works agreement).

Background to the case
The employer decided to allow its employees to use generative AI as a new tool to support their work and published an AI Policy on its intranet stipulating the terms and rules to be observed in that use. The AI tools were accessible via a web browser, and employees who wanted to use them had to obtain a private account (in this case, with ChatGPT) at their own expense; the employer itself did not provide a company AI tool. No doubt fearful of the possible longer-term impact of such tools on job security, the works council considered the consent to use ChatGPT and the publication of the AI Policy to be a gross violation of its co-determination and participation rights. Among other things, it demanded that the employer block ChatGPT and prohibit its use.

Use of AI tools does not concern “issues relating to the organization of the company and the orderly behaviour of employees in the company”

The Labour Court ruled that the works council’s right of co-determination under Section 87 (1) No. 1 Works Constitution Act was not affected: generative AI systems constitute a work tool and therefore concern the work behaviour of employees (which is not subject to co-determination) rather than their orderly behaviour (which is).

Permission to use AI tools not considered as “introduction and use of technical equipment designed to monitor the behaviour or performance of employees”

The Court also rejected the works council’s claim to a right of co-determination under Section 87 (1) No. 6 Works Constitution Act. It held that the use of ChatGPT via personal, browser-based accounts does not constitute technical equipment for the collection and storage of personal data and therefore does not give rise to a right of co-determination on the part of the works council. Where employees use an account they have created themselves, and (importantly) to which the employer has no access, the employer does not know which employees have used ChatGPT, when, for how long or for what purpose. This applies even though employees do have to disclose when they achieve work results using AI, because that particular form of monitoring is not a function of the technical equipment itself. Had the employer required that AI usage by its employees go through its own company account, at least some of these outcomes would likely have been different.

Use of AI tools does not require “regulations on the prevention of accidents at work and occupational illnesses as well as on health protection within the framework of statutory regulations or accident prevention regulations”

Finally, in the opinion of the Court, the works council’s right of co-determination pursuant to Section 87 (1) No. 7 Works Constitution Act was not engaged by the possibility of any psychological stress caused by the use of AI. No concrete threat to mental health was identifiable.

Further comment by the Labour Court

In an additional comment (as this was not part of the case at hand), the Labour Court did, however, point to the works council’s information and consultation right under Section 90 (1) No. 3, (2) Works Constitution Act. This provision gives a works council the right to be properly informed about the proposed use of AI systems prior to their introduction and to be consulted on that proposal. However, this information and consultation right does not entitle the works council to demand a works agreement as a condition of allowing the proposal to go ahead, nor to block the employer’s eventual decision on the use of AI systems.

Practical implications
This decision cannot be applied across the board to all companies and types of AI use. In particular, co-determination under Section 87 (1) No. 6 Works Constitution Act may have to be assessed differently if the employer either requires the use of AI systems developed in-house or requires staff to use external AI systems but only through company accounts set up with the external providers. In those cases, the employer could access or view (and hence monitor) the accounts and the corresponding data around the employees’ use of the system. If you are considering allowing the use of AI systems at your company, how you do so may therefore significantly affect the required extent of works council involvement.

We also recommend introducing a reasonably detailed set of rules for the use of generative AI (covering the type of work, scope, labelling requirements, confidentiality obligations, etc.) to avoid any misunderstanding or mismatch of employee expectations in your workplace.