26.09.2024 Newsletter

Focus IT&C – 3rd Quarter 2024

We have compiled some important and exciting new developments and case law from IT law and data protection for you. We hope you enjoy reading it!

 

1. Regional Court of Kiel: fraudulent misrepresentation when answering risk questions - insurer contests cyber insurance

2. AI lawsuit at the Regional Court of Hamburg

3. To which systems does the EU AI Regulation apply?

4. New FAQ by the EU Commission on the Data Act: important insights one year before its applicability

5. New DSK decision on asset deals: requirements for handling personal data

6. News about the DSA and DDG: future guidelines for greater online protection of minors and the first certified national dispute resolution centre for online platforms

 

 

1. Regional Court of Kiel: fraudulent misrepresentation when answering risk questions - insurer contests cyber insurance 

In view of the tense cyber security situation and the high damage potential of cyber attacks, the need for appropriate insurance cover is increasing rapidly. However, many questions regarding insurance cover are still unresolved.

The first judgement on cyber insurance was handed down by the Regional Court of Tübingen in May 2023 and showed how important it is for insurers to carry out a risk assessment by asking extensive risk questions (see our Focus IT&C 1st quarter). Now, in its ruling of 23 May 2024 (case no. 5 O 128/21), the Regional Court of Kiel has addressed the risk questions and once again underlined their high relevance - also for policyholders.

The case

The parties had concluded a cyber insurance policy. The contract had been concluded via an online portal according to the so-called “invitatio” model, whereby a series of risk questions on the state of the IT systems and the risk circumstances had to be answered first. This was undertaken by an employee of the insurance broker in consultation with the head of the plaintiff's IT department. Based on this information, the defendant as insurer then made an offer, which the plaintiff accepted.

The plaintiff was later the victim of a cyber attack. Malware was infiltrated via one of the plaintiff's computers used as a web server. It came to light that the plaintiff's IT system had several serious security flaws. The web server was running the Windows 2008 operating system, for which software and security updates were no longer provided in the absence of a support agreement with the manufacturer. There was also no virus scanner. Other computers were running the Windows 2003 operating system, likewise without a virus scanner, and security updates from the manufacturer were not available here either. The plaintiff’s domain controller - a server for the centralised authentication of users and computers in a network - was moreover still in its delivery state from 2019 and had not been updated since.

The defendant contested, among other things, the contract on the grounds of fraudulent misrepresentation, as some of the risk questions had been answered incorrectly and security defects had been concealed.

The court’s decision

The court upheld the contestation of the insurance contract on the grounds of fraudulent misrepresentation and dismissed the policyholder's claim. In full awareness of his lack of knowledge, the plaintiff's employee had made his statements at random, without any factual basis, and had thus objectively incorrectly answered the following risk questions in the context of the invitatio with "yes":

  • "All stationary and mobile work computers are equipped with up-to-date software to recognise and prevent malware" (risk question no. 3).
  • "Available security updates are carried out without culpable delay, and only products for which security updates are provided by the manufacturer are used for the software required to operate the IT system (this applies in particular to operating systems, virus scanners, firewalls, routers, NAS systems)" (risk question no. 4).

Answering the risk questions was a necessary step for entering into contract negotiations with the defendant and at the very least influenced the amount of the insurance premium. Whether these questions had also been asked in text form in accordance with Section 126b of the German Civil Code (Bürgerliches Gesetzbuch, BGB), as required in the case of Section 19 (1) of the German Insurance Contracts Act (Versicherungsvertragsgesetz, VVG), was irrelevant. This is because a case of fraudulent misrepresentation can also exist if questions asked verbally before the contract is concluded are answered incorrectly, so this applies all the more to questions that are only visible on screen via an online portal.

The term "work computer" in risk question no. 3 encompasses all computer systems that perform functions in a company, for the network as a whole is only as secure as its weakest links.

The fact that the head of the IT department had not given any thought to the systems concerned and was unaware that the domain controller was still in its delivery state was considered by the court to be a "deliberate lack of knowledge" in the sense of a "so what?" attitude, which fulfils the requirements of fraudulent misrepresentation.

Conclusion

The judgement of the Regional Court of Kiel shows that policyholders must take the pre-contractual risk questions seriously. They should always answer the questions truthfully, familiarise themselves with the requested content and inform themselves about their systems. Otherwise, there is a risk that insurers may be released from their obligations in the event of a claim and policyholders will be left with the costs.

Dr. Hanna Schmidt

Back

2. AI lawsuit at the Regional Court of Hamburg

The training of artificial intelligence ("AI") is known to require large amounts of data. This data is often obtained directly from the internet or made available by providers of suitable databases. It has not yet been clarified in detail to what extent the copyright limitation rule for text and data mining under Section 44b of the German Copyright Act (Urheberrechtsgesetz, UrhG) can be used to obtain such data sets.

According to Section 44b UrhG, reproductions of lawfully accessible works for text and data mining are generally permitted. Exceptionally, this is not the case if the right-holder has reserved the right of use of the work for text and data mining (opt-out). However, the right-holder must declare such a reservation of use in a machine-readable format in order for it to be valid (see Section 44b (3) UrhG).

1. Facts of the case

A lawsuit is currently pending before the Hamburg Regional Court (case no. 310 O 227/23) in which, for the first time, the applicability and requirements of Section 44b UrhG in the context of AI training are the subject of a court decision.

The defendant is the non-profit organisation LAION e.V., which has set itself the goal of promoting research in the field of AI. LAION provided a dataset with almost six billion image-text pairs for AI training under the name LAION 5B. The dataset includes an image created by photographer Robert Kneschke. Robert Kneschke had previously uploaded his image to the website Bigstock. Bigstock's terms of use stipulate that the images may not be used for "automated programmes".

Key questions of the proceedings before the Hamburg Regional Court are (i) whether Section 44b UrhG even applies to AI training datasets and (ii) what requirements are to be placed on the machine-readability of a reservation of use.

2. Preliminary legal opinion of the court

According to its preliminary legal opinion, the court readily considers Section 44b UrhG to be applicable to AI training datasets. In its view, this is confirmed in particular by the AI Regulation that recently came into force, with which the legislator expresses its interest in promoting developments in the field of AI. This applicability has been disputed to date in the legal literature: the applicability of Section 44b UrhG to generative AI is called into question with the argument that, at the time Section 44b UrhG was written, the legislator only intended to cover automated pattern recognition, not AI.

The Hamburg Regional Court has still left unanswered the question of what requirements are to be placed on the machine-readability of a reservation of use pursuant to Section 44b (3) UrhG, in particular whether such a reservation can be effectively expressed in general terms and conditions. According to a broad understanding of machine-readability, it suffices if the reservation of use is included verbatim in the GTC. No special technical formulation is necessary. A narrow understanding, on the other hand, requires that the reservation is technically recognisable by the AI. For search engines, for example, these narrow requirements have been implemented for years through use of the robots.txt file format. It therefore remains to be seen which interpretation the Hamburg Regional Court will follow.
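To illustrate the narrow understanding: a technically readable reservation could be expressed in the way website operators have long addressed search-engine crawlers, namely via a robots.txt file. The following is merely an illustrative sketch and not taken from the proceedings; the named user agents (e.g. "GPTBot", "CCBot") are examples of known AI/text-and-data-mining crawlers, and the mechanism only works insofar as crawlers actually honour the file:

```
# Illustrative robots.txt signalling a TDM reservation of use
# by excluding known AI crawlers from the entire site

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# All other crawlers remain permitted
User-agent: *
Allow: /
```

Under the broad understanding, by contrast, a verbatim reservation in the GTC would already suffice, without any such technical measure.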

3. Outlook

The Hamburg Regional Court will announce its decision on 27 September 2024. Providers and operators of AI systems would be well advised to keep an eye on the proceedings and adapt their AI training to correspond with the decision.

Dr. Axel Grätz

Back

3. To which systems does the EU AI Regulation apply?

Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 ("AI Regulation") entered into force on 1 August 2024. The AI Regulation only applies to "AI systems". The term “AI system” is defined in Art. 3 No. 1 of the AI Regulation. Companies wishing to know whether and in relation to which software they must comply with the AI Regulation must therefore check whether they are using an AI system, i.e.

  1. "a machine-based system, 
  2. designed to operate with varying levels of autonomy and
  3. that may exhibit adaptiveness after deployment, and
  4. that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions, 
  5. that can influence physical or virtual environments."

The definition mentions the results generated by an AI system only vaguely: as mentioned above, they must be "outputs such as predictions, content, recommendations, or decisions". An AI system cannot therefore be recognised by the results it produces. Rather, companies need to understand how the results are generated.

Since all AI systems are computer-implemented and therefore "machine-based", adaptiveness is only an optional criterion and all output influences at least one virtual environment, it is generally agreed that the decisive criteria of the definition are (i) the AI system being designed "to operate with varying levels of autonomy" and (ii) its ability to derive "how" outputs are created from inputs: 

  • According to the ISO standard on artificial intelligence (ISO/IEC 22989:2022), autonomy means the ability "to modify its intended domain of use or goal without external intervention, control or supervision". Below this threshold, there are merely various levels of automation. According to sentence 2 of Recital 12 of the AI Regulation, mere "automatic execution" is not sufficient, however. On the other hand, AI-supported applications that modify their domain of use or goal without human intervention do not (yet) exist. On this understanding, the definition would therefore currently be meaningless. In addition, the wording "with varying levels" suggests that the requirements for "autonomous operation" are not overly stringent. Fulfilment of the criterion should therefore be affirmed if the system - in contrast to applying rules defined by humans - independently arrives at (unpredictable) results (but in doing so does not change the specified domain of use or the defined goals).
  • The further criterion of derivability ties in seamlessly with this: the AI system must have the ability to derive "how" outputs are generated from the "inputs received", i.e. it must to a certain extent establish the applicable rules itself. In this sense, sentence 2 of Recital 12 of the AI Regulation states that an AI system may not be "based on rules defined exclusively by natural persons [...]".
  • It is questionable whether the rules must be derived from inputs (e.g. prompts) made by the operator during the utilisation phase. This would be problematic, as the AI applications currently available - with a few exceptions - only derive rules from the training data during training phases. However, Art. 3 No. 1 of the AI Regulation refers to "inputs", while Art. 3 No. 33 of the AI Regulation defines the term "input data", specifically to justify provider obligations for high-risk AI systems (see Art. 12 (3), 13 (3) (b) (vi), 15 (5) and 26 (4) of the AI Regulation): According to this, input data is "the data provided to or directly acquired by an AI system on the basis of which the system produces an output". Firstly, Art. 3 No. 1 AI Regulation uses a different term and, secondly, Art. 3 No. 33 AI Regulation itself does not differentiate between the training and utilisation phase or data input by providers or operators.

There is therefore much to suggest that "AI systems" refers to software that itself defines the rules it applies, particularly during the training phase, and therefore acts independently of rules defined by humans, i.e. autonomously. However, a clear definition of the central term "AI system", as called for in the first sentence of Recital 12 of the AI Regulation, specifically does not exist. Even if this term is clarified in future by ECJ case law, companies purchasing AI applications must in the meantime check, or at least obtain binding assurances, as to whether or not this requirement is met.

Dr. Marc Hilber LL.M.

Back

4. New FAQ by the EU Commission on the Data Act: important insights one year before its applicability

The EU Commission has recently published detailed FAQ on the Data Act (Regulation (EU) 2023/2854), which aim to provide guidance on the interpretation and implementation of key provisions of the Data Act. As the Data Act will be directly applicable in just under one year, companies for whose products or services the Data Act is or could be relevant should in future take the FAQ into account when considering how to implement the Data Act.

1. Background - scope of application of the Data Act

In particular, the Data Act is intended to create a legal framework for access to and the use of data. A key point of the Data Act is the obligation of so-called data owners (typically the manufacturers of connected products or providers of "connected services") to grant users of products or connected services access to the data generated when using these products/services. "Users" can be both companies and consumers. The Data Act is therefore relevant in both the B2B and B2C sectors. Manufacturers must design their products or services in such a way that the data generated is accessible to the user "easily, securely, free of charge and in a comprehensive, structured, commonly used and machine-readable format". If data owners wish to use the data themselves, a data licence agreement must be concluded with the user. Another key point of the Data Act is the right to transfer data to third parties "on the instructions" of the user, albeit with restrictions for so-called gatekeepers within the meaning of the Digital Markets Act (Regulation (EU) 2022/1925) to avoid any further strengthening of their dominant market position. Such gatekeepers include players such as Google, Meta and Amazon. In addition, the Data Act aims to strengthen the interoperability of cloud providers and make it easier to switch between them.
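By way of a simplified illustration, data generated by a connected product could be made available in a structured, commonly used and machine-readable format such as JSON. All field names and values below are hypothetical; the Data Act itself does not prescribe any particular format:

```
{
  "device_id": "HYPOTHETICAL-1234",
  "timestamp": "2024-09-26T10:00:00Z",
  "readings": {
    "temperature_celsius": 21.5,
    "operating_hours": 1042
  }
}
```

JSON is merely one commonly used option; CSV or XML exports, for example, could equally satisfy the requirement, provided they are structured and machine-readable.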

2. Key points of the new FAQ

The new FAQ address the following questions in particular:

  1. The relationship between the Data Act and the General Data Protection Regulation (GDPR). Unfortunately, the EU Commission limits itself to making general statements in this context and in particular does not address which legal bases under the GDPR can be invoked by data owners with regard to the granting of data access, e.g. when a user requests access to personal data of a third party under the Data Act.
  2. The territorial applicability of the Data Act, e.g. when a product is only temporarily located in the EU.
  3. The more precise definition of the parties addressed by the Data Act, namely users, data owners and third-party data recipients.
  4. The conditions under which data owners, in order to protect trade secrets, only have to disclose data under certain conditions (e.g. after concluding a confidentiality agreement) or not at all.
  5. The possibility for users to demand remuneration for the use of the relevant data by the data owner.
  6. The right of the data owner to implement a process to identify a user who requests access to data.

In its introductory remarks, the EU Commission emphasises that the FAQ are not to be understood as an official statement.

3. Evaluation and outlook

The FAQ provide a clear overview of the most important contents of the Data Act. However, they are largely limited to a repetition of the wording of the law and therefore do not answer most of the open legal questions arising from the Data Act. As the Data Act will become applicable on 12 September 2025, companies would be well advised to take a close look at the consequences for their respective business models. Not only will this entail new obligations; in many cases, it will also give rise to opportunities to open up new areas of business by gaining access to data from other companies that was previously inaccessible.

Marco Degginger

Back

5. New DSK decision on asset deals: requirements for handling personal data

On 11 September 2024, the Data Protection Conference (Datenschutzkonferenz, DSK) of the independent federal and federal state data protection authorities published a new decision on the transmission of personal data in the context of asset deals.

Asset deals, in which a company's assets, such as customer data, are transferred, have long been a topic of controversial debate in data protection law. The new DSK decision aims to ensure a standardised application of the EU General Data Protection Regulation (GDPR) and provide companies with a clear framework for the lawful transmission of data in asset deals.

This article summarises the important contents of the new DSK decision and shows the practical implications for transactions.

Key requirements of the DSK decision

Regarding the question of whether and under what conditions personal data may be lawfully transferred as part of the asset deal, the DSK distinguishes between the central phases of asset deals and the type of data transferred:

Data transmissions during the due diligence phase

During the due diligence phase, potential buyers examine the economic and legal circumstances of the target company. During this phase, the transfer of personal data to the potential acquirer is generally not permitted. This rule applies to the personal data of customers, suppliers and employees.

However, such data may be transferred if the data subjects have given their voluntary and express consent in the individual case (Art. 6 (1) lit. a GDPR). This consent must fulfil the data protection requirements of the GDPR. In the case of employees, the voluntariness of their consent is usually lacking, as a certain degree of dependency exists in the employment relationship. This is different only where the sellers and the employees concerned pursue "aligned interests"; in this case, the consent may be valid in the employment context. In any case, consent must be given in writing or electronically (Section 26 (2) of the German Federal Data Protection Act (Bundesdatenschutzgesetz, BDSG)).

During advanced transaction negotiations, a legitimate interest (Art. 6 (1) lit. f GDPR) may exceptionally justify the transmission of personal data of key players, such as employees with management responsibility or key competences, or main contractual partners.

Transmission of customer data

The DSK defines the requirements for transferring customer data according to the different phases of the asset deal:

  • During the contract initiation phase - i.e. the specific contract negotiations between the seller and the customer - customer data may only be transferred if the customers concerned continue with the acquirer, without objection, the contract negotiations they started with the seller. The data transmission is justified if the data is required in order to continue the negotiations (Art. 6 (1) lit. b GDPR). In addition, the seller may only transfer customer data if this does not conflict with any overriding interests of the customer (Art. 6 (1) lit. f GDPR). In order for these interests to be taken into account, the customers concerned must be informed of the planned data transfer and given the opportunity to object to it. Under this so-called objection solution, the transfer is also permitted based on legitimate interests (Art. 6 (1) sentence 1 lit. f in conjunction with (4) GDPR), provided that the data subjects do not object within a reasonable period. For this purpose, the DSK considers an objection period of about six weeks to be reasonable.
  • This is different in the case of ongoing contractual relationships with customers, for example due to unfulfilled contractual performance obligations of the seller or existing warranty claims. In such cases, the seller may transfer the data required to fulfil the contract based on the contract (Art. 6 (1) lit. b GDPR), as the acquirer assumes the seller's contractual obligations vis-à-vis the customers. In this case, the consent of the customers concerned is not required. A transfer is also to be permitted if existing customer contracts or obligations are assumed by the acquirer in accordance with the requirements of civil law (assumption of contract or debt).
  • In cases where the contractual relationship has already ended, customer data may only be transferred within the scope of the statutory retention obligations. This data must be stored by the acquirer separately from active customer data (so-called "two filing cabinets solution"). The seller and acquirer must conclude a data processing agreement (Art. 28 (1), (3) GDPR). The acquirer may only use the data for its own purposes with the express consent (Art. 6 (1) lit. a GDPR) of the customers.
  • The DSK provides for separate rules for the transfer of customer data as the only asset (e.g. the sale of a customer database). These cases regularly require the consent of the data subjects. An exception may apply for micro and small enterprises (fewer than 10 employees, or fewer than 50 employees and a maximum turnover of EUR 10 million). In this case, an objection solution may be sufficient (objection period of about 4 to 6 weeks). The data transfer may then only take place if the respective customer does not object within the deadline.

Transmission of supplier data

The transmission of personal data of suppliers and their employees is permitted, provided that this does not conflict with any overriding interests of the data subjects (Art. 6 (1) lit. f GDPR). According to the GDPR, this is generally the case for business-related contact data, as the transfer may even be in the supplier’s interests.

Transmission of employee data

The transfer of employee data is permitted in particular in case of a transfer of business pursuant to Section 613a of the German Civil Code (Bürgerliches Gesetzbuch, BGB). The legal basis in this case is the fulfilment of the contract (Art. 6 (1) lit. b GDPR). For special categories of personal data, Section 26 (3) BDSG applies.

Practical implications for transactions

Companies planning asset deals should implement the requirements of the DSK decision in order to minimise data protection risks. In particular, sellers and buyers should establish the legal basis for the transfer of data in advance. In the early phase of an asset deal, they should ensure that the data protection conditions are clear in order to avoid delays later on.

Documentation of the legal basis for the processing of personal data is also crucial. Companies should be able to explain clearly why and on what basis they process personal data. In addition, buyers should consider measures such as the anonymisation or pseudonymisation of data during the due diligence phase in order to ensure the protection of data subjects and avoid potential breaches.

Conclusion

The new DSK decision replaces the previous DSK decision on asset deals from 2019 and provides companies with more detailed and, in some cases, clearer guidelines for the data protection-compliant handling of personal data in the context of asset deals.

However, the scope of action for companies remains unclear as far as the so-called objection solution is concerned. While the DSK decision dating from 2019 exhaustively listed the case groups in which the seller and acquirer were permitted to proceed in accordance with this approach, the new DSK decision only touches on the objection solution in relation to special constellations such as the sale of customer data as the only asset. It therefore remains unclear whether a transfer of data based on the objection solution is to remain generally permissible or whether the DSK wishes to limit the scope of application of this approach to the constellations mentioned.

It therefore remains to be seen how the authorities will further develop their practice with regard to the transfer of data in the context of asset deals.

Valentino Halim

Back

6. News about the DSA and DDG: future guidelines for greater online protection of minors and the first certified national dispute resolution centre for online platforms

The Digital Services Act ("DSA") is an EU regulation that has been fully applicable since 17 February 2024 and forms the basis for a new set of rules for regulating digital services in the EU. At national level, the DSA is flanked by the German Digital Services Act (Digitale-Dienste-Gesetz, “DDG”), which in particular ensures the enforceability of the obligations arising from the DSA and defines the responsibilities of the authorities (see also our previous articles on the DSA and DDG in Newsletters 04/2023 and 02/2024).

Guidelines for the online protection of minors

One of the central objectives of the DSA is to protect minors from harmful content, cyberbullying and other dangers when using the internet. Art. 28 (1) sentence 1 DSA therefore expressly stipulates that providers of online platforms that are accessible to minors must take appropriate and proportionate measures to ensure a high level of privacy, security and protection of minors within their service. It is clear that they must adopt a risk-based approach and are therefore required to carry out impact assessments and implement measures to mitigate any potential risks to minors. Such an approach is not unique, as it can already be found in various other legal texts of the EU legislator, for example in the GDPR when taking appropriate technical and organisational measures in accordance with Art. 32 GDPR. However, it is still unclear how the measures can be designed in such a way that they are considered suitable and proportionate. The DSA expressly provides that the EU Commission can draft guidelines on this in order to offer guidance to providers of online platforms (cf. Art. 28 (4) DSA).

At the end of July, the EU Commission initiated the first steps for such guidelines and invited all stakeholders to submit contributions to the proposed scope and approach of the guidelines with the aim of ensuring a high level of privacy, security and protection for minors on the internet. Key aspects of the guidelines are, in particular, a harmonised approach to age verification and the restriction of access to inappropriate content. It should be noted that the guidelines will generally apply to all online platforms that are accessible to minors, whether intentionally or unintentionally (e.g. due to inadequate age verification measures). The only exceptions are micro and small enterprises within the meaning of Recommendation 2003/361/EC of the EU Commission. When designing their services, online platforms should therefore ensure that the rights and welfare of minors are adequately taken into account.

Feedback on the EU Commission's request is possible until 30 September 2024. There will be a separate consultation following the drafting of the guidelines. The final guidelines are expected before summer 2025.

User Rights GmbH as the first certified dispute resolution centre in Germany

The DSA extensively formalises the procedure for dealing with user complaints. In addition to the requirements of the notification and redress procedure under Art. 16 DSA, online platforms must maintain an internal complaints management system (Art. 20 DSA) and participate in out-of-court dispute resolution procedures (Art. 21 DSA).

The out-of-court dispute resolution organisations are certified by the Digital Services Coordinator ("DSC") at the German Federal Network Agency (Bundesnetzagentur), provided they fulfil the requirements set out in Art. 21 (3) DSA. Users can contact these certified organisations, for example, if they wish to have an online platform's decision on the deletion of content or the restriction of use or blocking of an account or user account reviewed.

On 12 August 2024, the DSC certified User Rights GmbH as the first out-of-court dispute resolution body in Germany. Users can now submit arbitration requests there free of charge in order to defend themselves against decisions made by online platforms. User Rights GmbH focuses on dispute resolution for social media platforms and is initially only processing applications in connection with the platforms TikTok, Instagram and LinkedIn. However, other platforms will follow.

Tobias Kollakowski LL.M.

Back

Legal Tech Tools - Digital applications for more efficient solutions

Discover our wide range of legal tech tools! Learn more ...

 

New: Oppenhoff Taskforce AI

 

Find out here how the interdisciplinary Oppenhoff AI Taskforce ensures from the very outset your compliance with the requirements of the EU’s AI Act and the many other legal requirements for AI, and that AI systems can be used in a legally compliant manner.

Download the brochure here.

 

Back to list


Dr. Marc Hilber
LL.M. (Illinois)

Partner
Rechtsanwalt

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 612
M +49 172 3808 396

Email

LinkedIn

Dr. Hanna Schmidt

Dr. Hanna Schmidt

Junior Partner
Rechtsanwältin

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 613
M +49 172 1475 126

Email

Marco Degginger

Marco Degginger

Junior Partner
Rechtsanwalt

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 365
M +49 162 1313 994

Email


Tobias Kollakowski
LL.M. (Köln/Paris 1)

Junior Partner
Rechtsanwalt
Legal Tech Officer

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 423
M +49 173 8851 216

Email

LinkedIn