04.06.2024 Newsletter

Focus IT&C – 2nd Quarter 2024

We have compiled some important and exciting new developments and case law in IT law and data protection for you. We hope you enjoy reading it!

 

1.  Ways of avoiding falling under the scope of application of high-risk AI systems (Part III)

2. Update and brief overview of the Cyber Resilience Act 

3. Turning point for healthcare research: Improved Use of Health Data Act

4. Non-material damages under the GDPR: Current case law of the ECJ

5. The German Digital Services Act: Patchwork despite harmonisation?

 

 

1. Ways of avoiding falling under the scope of application of high-risk AI systems (Part III)

The AI Act strictly regulates high-risk AI systems. Operators and providers of high-risk AI systems would therefore be well advised to take measures to avoid falling within the scope of application of these rules.

Pursuant to the exemption set out in Art. 6 (3) AI Act, AI systems that would generally be categorised as high-risk AI systems under Art. 6 (2) AI Act are exempt from this categorisation if certain criteria are met (see Part I of our series).

Art. 25 (2) AI Act exempts the original provider of a high-risk AI system that has already been placed on the market or put into operation from its obligations under Art. 16 AI Act insofar as the system bears the label of the operator (who thereby becomes the provider itself), e.g. in the case of white label systems (see Part II of our series).

Operators can also avoid the application of the AI Act to their high-risk AI systems by continuing, to the extent possible, to use their legacy systems unchanged after the AI Act comes into force:

Solution 3: unchanged use of legacy systems by operators, Art. 111 (2) AI Act

According to Art. 111 (2), the AI Act does not apply to operators of those high-risk AI systems that were placed on the market or put into operation up to 24 months after the date of its entry into force, i.e. up to mid-2026, as long as the operator does not significantly change them. Operators therefore do not have to fulfil the comprehensive obligations under Art. 26 and 27 AI Act for these high-risk AI systems.

However, there is a need for further clarification in detail:

Operators must also fulfil the high-risk AI obligations for legacy systems if they significantly change the system after the AI Act comes into force. However, the AI Act itself only defines the term "substantial modification" in Art. 3 No. 23. Such a substantial modification is deemed to exist if (1) the modification was not provided for in the original conformity assessment and conformity is impaired as a result, or (2) the intended purpose of the system is changed.

A conformity assessment in accordance with the AI Act will generally not exist for legacy systems. At best, operators could rely on a hypothetical conformity assessment. The definition of substantial modification in Art. 3 No. 23 AI Act therefore cannot simply be used for the assessment of a significant change within the meaning of Art. 111 (2) AI Act, even though applying corresponding criteria would be the obvious approach.

For the time being, operators of legacy systems would therefore be advised to continue to operate them unchanged if possible.  

Dr. Axel Grätz


2. Update and brief overview of the Cyber Resilience Act 

Having been adopted by the EU Parliament on March 12, 2024, the Cyber Resilience Act (CRA) will enter into force twenty days after its publication in the Official Journal of the European Union once it has received the required approval of the Council of the European Union. The CRA stipulates various implementation deadlines for its obligations and will become fully applicable thirty-six months after the date of its entry into force (Art. 71 CRA).
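Purely for orientation, this staggered timeline can be expressed as simple date arithmetic. The sketch below is illustrative only; the publication date used is a hypothetical placeholder, as the CRA had not yet been published in the Official Journal at the time of writing.

```python
from datetime import date, timedelta

def add_months(d: date, months: int) -> date:
    """Add calendar months to a date, clamping the day to the end of the target month."""
    month_index = d.month - 1 + months
    year, month = d.year + month_index // 12, month_index % 12 + 1
    days_in_month = [31, 29 if (year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)) else 28,
                     31, 30, 31, 30, 31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(d.day, days_in_month))

# Hypothetical placeholder: the actual publication date in the Official Journal was not yet known.
publication = date(2024, 10, 20)

entry_into_force = publication + timedelta(days=20)   # twenty days after publication
fully_applicable = add_months(entry_into_force, 36)   # Art. 71 CRA: thirty-six months later

print("Entry into force:", entry_into_force)
print("Fully applicable:", fully_applicable)
```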

The CRA applies to "products with digital elements" (PDEs) that are made available on the European market and is intended to ensure the cybersecurity of these products.

PDEs are software or hardware products and their remote data processing solutions, including software or hardware components to be placed on the market separately (Art. 3 no. 1 CRA). The decisive factor is that the purpose of the product includes establishing a direct or indirect data connection with other devices or networks. Mere internal device processing, in contrast, is not sufficient. Standalone cloud solutions generally only fall within the scope of application if they are categorised as remote data processing solutions (Recital 12, Art. 3 (2) CRA).

The CRA therefore has a product-specific reference point and is expected to cover a large number of smart products. The CRA applies to all economic operators involved in the life cycle chain of PDEs, i.e. (i) manufacturers (and their authorised representatives), (ii) importers and (iii) distributors, with the CRA imposing the most extensive obligations on manufacturers of PDEs.

Since the allocation to these economic operators is based on the natural or legal person, it is conceivable, especially in group structures, that group companies involved in the same PDE may act as manufacturers, importers and distributors and will each be subject to the specific obligations of the CRA.

In principle, the CRA has a broad scope of application and covers a large number of PDEs. However, there are exemptions, in particular for those that are already subject to specific sectoral regulations (Art. 2 CRA). With regard to the scope of this exemption (as well as the other exemptions under the CRA), the question arises as to whether - based on the product-specific reference point of the CRA - all economic operators involved in the value chain of the PDE can invoke the exemptions of the CRA or whether the exemption criteria only apply to the company directly subject to the sectoral regulation (while the others remain subject to the CRA).

In addition to product requirements for cybersecurity and vulnerability management, the requirements and obligations of the CRA also include various documentation and retention obligations as well as information and reporting obligations. Furthermore, a conformity assessment procedure must be carried out before a PDE can be made available on the European market. The scope of these obligations depends on the role of the company under the CRA.

Violations of the CRA are subject to administrative fines of up to EUR 15,000,000 or up to 2.5% of the total worldwide turnover for the preceding financial year, whichever is higher, depending on the nature of the violation.
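As a rough illustration of how this "whichever is higher" cap works, the following sketch compares the two alternatives; the turnover figure is a hypothetical example, and the actual fine is set by the authorities within this cap depending on the violation.

```python
def cra_fine_cap(worldwide_turnover_eur: float) -> float:
    """Illustrative upper cap for the most serious CRA violations:
    EUR 15,000,000 or 2.5% of total worldwide annual turnover of the
    preceding financial year, whichever is higher."""
    fixed_cap = 15_000_000.0
    turnover_cap = 0.025 * worldwide_turnover_eur
    return max(fixed_cap, turnover_cap)

# Hypothetical manufacturer with EUR 2 billion worldwide turnover:
print(cra_fine_cap(2_000_000_000))  # 50,000,000.0 -> the turnover-based cap applies
```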

Christian Saßenbach


3. Turning point for healthcare research: Improved Use of Health Data Act

At the end of April, the German Improved Use of Health Data Act (Gesetz zur verbesserten Nutzung von Gesundheitsdaten, GVNG) and the German Act to Accelerate the Digitisation of the Healthcare System / Digital Act (Gesetz zur Beschleunigung der Digitalisierung des Gesundheitswesens, Digitalgesetz) came into force. Together, these two laws mark a decisive turning point in the utilisation of health data, especially for medical research purposes.

To date, healthcare research in Germany has been hampered by a narrow legal framework, in particular an inconsistent and strict interpretation of the GDPR, and a patchwork of specific federal state laws.

Data utilisation in the healthcare sector difficult in the past

The German healthcare system is still far from the desired level of digitisation. Patient data is usually stored locally in patient files at the service providers. The use of such patient data often requires compliance with a large number of different federal and federal state data protection laws, especially when the data is processed for research purposes through "secondary utilisation". In most federal states, only a hospital's own research work is permitted without consent, while the transfer of data to third parties and even the transfer of health data to processors (Art. 28 GDPR) may be restricted.

Changes to electronic patient records

The German Social Code Book Five (SGB V) regulates the legal framework for a standardised electronic patient record (EPR, in German: ePA). The EPR was already introduced as a voluntary service in 2021, but so far only around one percent of those with statutory health insurance have opted to receive an EPR. The Digital Act now changes the legal framework for EPRs: according to Sec. 342 (1) SGB V as amended, statutory health insurers must set up an EPR for their insured by January 15, 2025, unless the insured person objects. The purpose of this newly introduced opt-out procedure is to enable the nationwide introduction of EPRs. According to Sec. 347 (1) SGB V as amended, service providers, in particular doctors, are obliged to store all treatment data in their patients' EPR unless the patient has objected.

Amendments to SGB V: Release of additional data for research purposes

In particular, the GVNG introduces amendments to the SGB V aimed at improving data availability and access for research purposes.

For example, health data stored in EPRs will now be automatically transferred in pseudonymised form to the Health Data Lab (HDL) (Forschungsdatenzentrum Gesundheit) at the Federal Institute for Drugs and Medical Devices (Bundesinstitut für Arzneimittel und Medizinprodukte, BfArM), unless the insured person in question has objected to the transfer (see Sec. 363 (1), (2) SGB V as amended). It is therefore to be expected that the data available to the HDL will increase immensely from January 15, 2025. However, this will only be the case if the technical and practical launch and filling of the EPRs by the service providers goes as planned.

With the regulations on data transparency, the SGB V enables third parties to obtain service data and health data on treatment by service providers from the EPR for scientific research purposes and to process them without the patient’s consent. The GVNG removes the previous restriction that only certain (public) bodies could request such data from the HDL in accordance with Sec. 303e SGB V in its previous version. Data requests now depend solely on the purposes for which the requester seeks to access and process the data.

The permitted purposes are defined quite broadly and allow, among other things, processing for the purpose of "scientific research on health-related issues" (see Sec. 303e (2) no. 4 var. 1 SGB V). This means that, for example, research-based pharmaceutical companies will be able to obtain personal health data for scientific research purposes from the HDL in the future.

GDNG: Standardised framework for the use of health data for research purposes

Together with the GVNG, a new, separate law on the improved use of health data (Gesundheitsdatennutzungsgesetz, GDNG) was introduced. The GDNG applies to the processing of health data within the meaning of Art. 4 no. 15 GDPR for research purposes (Sec. 1 (2) GDNG).

Sec. 6 (1) GDNG allows healthcare providers to process health data collected during the treatment of their patients for quality control, research and statistical purposes without separate consent. This is a significant improvement compared to the previous legal situation, as it provides for standardised nationwide requirements for service providers' own research.

Sec. 3 GDNG provides for the establishment of a central data access and coordination centre for health data at the BfArM. The access and coordination centre is the central point of contact for data users who apply for data access at the HDL and is intended to make the access process as effective as possible.

Within the framework of data access via the HDL, Sec. 4 GDNG allows the linking of data available in the HDL with data available in the clinical cancer registries of the federal states if certain requirements are met.

In the case of research projects involving multiple controllers responsible for the processing of health data and for which multiple supervisory authorities would be responsible under the GDPR and/or German data protection laws, the research partners involved can file an application to ensure that one supervisory authority is the lead authority (Sec. 5 GDNG). This ensures more effective and consistent data protection supervision.

Sec. 7 GDNG regulates research confidentiality, which is comparable to the confidentiality obligations of certain professional groups (in particular doctors). This research confidentiality is protected by criminal law in the event of a breach (Sec. 9 GDNG).

Conclusion

From a purely legal perspective, the GVNG significantly liberalises the use of data for research purposes in the healthcare sector. It has the potential to sustainably strengthen Germany as a research location, especially by enabling private companies (including research-based pharmaceutical companies) to access personal health data for research purposes.

However, a major uncertainty factor is the timely implementation of the technical and organisational requirements for the effective provision of the relevant data. The availability of data in the HDL depends, among other things, on whether the EPR is actually available to the majority of people with statutory health insurance from mid-January 2025 and whether doctors have all the technical requirements to fill in the EPRs. Another crucial factor is that the BfArM is able to fulfil its extended tasks in terms of expertise and personnel, so that access to the data is possible.

Marco Degginger


4. Non-material damages under the GDPR: Current case law of the ECJ

For a long time, it was unclear when data subjects could claim non-material damages for data breaches under the EU General Data Protection Regulation (GDPR).

Since 2023, the European Court of Justice (ECJ) has developed guidelines for damage claims under Art. 82 GDPR in a series of judgments, such as in the Cases Österreichische Post (C-300/21) of May 4, 2023 (read our article of May 5, 2023), Gemeinde Ummendorf (C-456/22) and Natsionalna agentsia za prihodite (C-340/21) of December 14, 2023, Krankenversicherung Nordrhein (C-667/21) of December 21, 2023 and MediaMarktSaturn (C-687/21) of January 25, 2024.

According to the now established case law of the ECJ, a damage claim under Art. 82 GDPR requires three conditions to be met:

  1. an infringement of the provisions of the GDPR for data processing,
  2. a concrete damage suffered by the data subject and
  3. a causal link between the unlawful processing and the damage.

Furthermore, there is no de minimis limit or materiality threshold for minor damage. With further judgments from April and June 2024, the ECJ has now further tightened the eligibility requirements for non-material damages.

Judgment in Case C-741/21 of April 11, 2024 (juris)

The judgment in the Case juris (C-741/21) of April 11, 2024 concerns an action brought by a lawyer, who was also a client, against the operator of the legal database “juris”. The plaintiff had objected to the processing of his data for advertising purposes. Nevertheless, he received advertising letters containing test codes which, when entered on the defendant's website, led to an order form being pre-filled with the plaintiff's personal data. The defendant argued in its defence that an employee had disregarded instructions on how to handle objections to marketing.

The ECJ clarifies that the mere fear of a misuse of personal data resulting from a loss of control can be considered non-material damage in individual cases (as in the Cases MediaMarktSaturn (C-687/21) and Natsionalna agentsia za prihodite (C-340/21)). However, the data subject must substantiate such a fear and its effects; a purely unfounded fear is not sufficient.

The ECJ also commented on when the data controller can exculpate itself from liability under Art. 82 (3) GDPR in the event of misconduct by a person under its authority contrary to prior instructions (Art. 29 GDPR). Accordingly, a controller cannot exculpate itself from liability solely by referring to an employee who has acted contrary to instructions. This would impair the practical effectiveness of damage claims. However, exculpation is possible if the controller can prove that there is no causal link between the breach in question and the damage that has occurred.

Finally, the ECJ ruled on the question of how to calculate the amount of non-material damages, stating that the legal criteria for assessing administrative fines (Art. 83 (2), (5) GDPR) should not be used. The ECJ emphasises that damages under Art. 82 GDPR have a purely compensatory function, not a deterrent or punitive one (as already held in the Case Krankenversicherung Nordrhein (C-667/21)). As a result, neither the severity of the offence nor the degree of fault are to be taken into account in the assessment. The sole decisive factor is the actual, concrete damage incurred, which must be presented by the plaintiff.

Judgments in Cases C-182/22 and C-189/22 of June 20, 2024 (Scalable Capital)

On June 20, 2024, the ECJ handed down judgments following a referral from the Munich Local Court on a non-material damage claim in two related Scalable Capital cases (C-182/22 and C-189/22). The plaintiffs had opened accounts on Scalable Capital’s trading application in which they had stored personal data. Following a cyber attack in 2020, their login details and securities account data were accessed by unknown third parties, but not (yet) used fraudulently.

The ECJ first confirms that the compensation under Art. 82 GDPR does not serve any deterrent or punitive purpose, but is intended to fully and effectively compensate for the actual damage suffered (see Case juris (C-741/21)).

The ECJ also ruled that the GDPR recitals specify various possible physical, material or non-material damages and do not rank them in any order of priority. A breach of the protection of personal data is therefore not fundamentally less serious than bodily harm. However, the ECJ reiterates that the breach itself is not an indication of damage, which must be specifically presented and proven by the plaintiff.

Finally, with regard to identity theft or fraud (Recital 75), the ECJ has ruled that in these cases it is not sufficient that an unauthorised third party has obtained data concerning data subjects’ identities; rather, plaintiffs must present and prove that a third party has actually assumed the identity of the person whose personal data have been stolen.

Judgment in Case C-590/22 of June 20, 2024 (PS GbR)

On the same date, the ECJ handed down a further judgment (C-590/22) on a referral by the Wesel Local Court in a lawsuit brought by two clients against their tax adviser, who had sent their tax return to an outdated address of the plaintiffs. Third parties had allegedly gained knowledge of the plaintiffs' names, dates of birth (including those of their children), tax identification numbers, bank details, religious affiliations, the severely disabled status of a family member, as well as their professions and places of work.

In this case, the ECJ confirmed a large number of previous judgments, in particular that the fear that personal data could be misused by third parties, even if only for a short period of time, can only constitute "non-material damage" within the meaning of Art. 82 GDPR in individual cases if this is specifically presented and proven.

Furthermore, the ECJ reaffirms that breaches of national provisions that relate to the protection of personal data but whose purpose is not to specify the rules of the GDPR are not to be taken into account when assessing the amount of damages. Notwithstanding the foregoing, a court may award damages in excess of those provided for in Art. 82 (1) GDPR, but only if national law permits this. Under German law, for example, it is not clear on which provisions such a more extensive claim for non-material damages could be based.

Dr. Jürgen Hartung and Valentino Halim


5. The German Digital Services Act: Patchwork despite harmonisation? 

With the entry into force of the German Digital Services Act (Digitale-Dienste-Gesetz, DDG) in Germany on May 14, 2024, the regulation of digital services has reached a new stage. Together with the EU's Digital Services Act (Regulation (EU) 2022/2065) (DSA), it addresses the risks associated with the fundamental transformation of the internet economy. The EU Regulation and the DDG as its national implementing act aim to regulate liabilities in the event of infringements, protect the freedom of expression and information of users of digital services and minimise influence on elections and threats to social cohesion. In addition, consumer protection is being strengthened in the e-commerce sector, particularly against counterfeit products.

With the entry into force of the DDG, the German Network Enforcement Act (Netzwerkdurchsetzungsgesetz, NetzDG) largely ceased to apply and the German Telemedia Act (Telemediengesetz, TMG) ceased to apply in its entirety. The provisions contained therein can now be found, mutatis mutandis, in the DSA, which has been in full force since February 17, 2024, or have been transferred to the DDG. For example, the mandatory information in the imprint is now regulated in Sec. 5 DDG (formerly Sec. 5 TMG). As the term "digital services" has replaced the term "telemedia", some editorial changes also had to be made in various laws. For example, the Telecommunications Telemedia Data Protection Act (Telekommunikation-Telemedium-Datenschutz-Gesetz, TTDSG) is now called the Telecommunications Digital Services Data Protection Act (Telekommunikation-Digitale-Dienste-Datenschutz-Gesetz, TDDDG).

Significantly expanded scope of application through the DSA

The purpose of the DDG is to enforce the DSA. The term "intermediary service", which is central to the question of the applicability of the DSA, includes "mere conduit", "caching" and "hosting" services (see Art. 3 lit. g DSA). The DSA regulates far more digital intermediary services than the former German Network Enforcement Act (NetzDG) and German Telemedia Act (TMG). From now on, not only social networks and video sharing platforms, but all hosting service providers must provide procedures for reporting illegal content. It is disputed whether the provision of a simple chat or comment function on a website or in an online shop can lead to the applicability of the DSA. If they have not already done so, providers of digital intermediary services should urgently familiarise themselves with the requirements that apply to them.

Multi-level regulatory approach

The DSA follows a multi-level regulatory approach and imposes different obligations on intermediary services depending on their type and size. If a service provider is categorised as an online platform, it must, for example, adequately label advertising and make the ranking of search results more transparent. Very large online platforms and very large online search engines are subject to even stricter regulations: they must independently assess and minimise systemic risks such as the spread of hate speech and election interference, which have led to some concerns about over-blocking.

Safeguarding freedom of information and special national approaches

In order to guarantee freedom of information, liability privileges continue to exist throughout the EU for intermediary services that do not play an active role in the dissemination and selection of information. In particular, there is no obligation for operators to proactively screen information shared by users for illegal content.

While the DSA generally adheres to the exclusive privileged treatment of paid service providers, the German legislator has decided to continue to apply the liability privileges to unpaid services as well. Together with the already existing exemption from liability for Wi-Fi providers, this creates a patchwork that was not actually intended by the EU. The obligation to appoint an authorised recipient for non-European service providers is also maintained in Germany in order to make it easier for private individuals to enforce their rights in the area of social networks, although the DSA itself no longer provides for this.

The DSA itself will also lead to differences in application in the individual EU member states as some terms require interpretation: Although the notice and action mechanisms for illegal content already known from the NetzDG are now regulated directly in the EU regulation, the core requirement of "illegal content" is (also) determined by the law of the individual EU member states. This makes the review of reported content by hosting service providers highly complex and also involves the risk of over-blocking.

Central role of the Federal Network Agency and sanction options

In Germany, the Federal Network Agency (Bundesnetzagentur) is responsible for the supervision of intermediary services and enforces the DDG and the DSA with the help of the newly created "Coordination Centre for Digital Services" (Koordinierungsstelle für digitale Dienste), which is not bound by instructions. It also serves as a complaints centre for users. The supervisory authorities have extensive powers, which can go as far as temporarily blocking an intermediary service in the event of infringements. Given the high fines, which can reach up to 6% of a company's total worldwide turnover for the preceding financial year, intermediary services should familiarise themselves with the new regulations as soon as possible.

In addition, the DSA clarifies that users can also claim damages under private law in the event of a DSA violation (see Art. 54 DSA).

Outlook

Despite various amendments during the legislative process for the DSA, some questions remain unanswered, although these are not exclusively attributable to the German legislator. For example, unlike the NetzDG, the DSA does not mention the specific criminal offences that trigger a reporting obligation for hosting service providers, but instead refers to the concept of a criminal offence "that poses a threat to the life or safety of a person or persons", which is subject to interpretation. This may lead to arbitrary interpretations by the companies concerned. It remains to be seen to what extent the courts will provide more clarity here. In view of the high fines provided for in the DSA, this is to be welcomed.

In future, it will also be crucial to see how both the German Federal Network Agency and the new "European Board for Digital Services" assess developments on the part of service providers in their annual activity reports and what experiences are gathered in other EU countries.

Tobias Kollakowski


 

Legal Tech Tools - Digital applications for more efficient solutions

Discover our extensive range of legal tech tools! Learn more ...

 

New: Oppenhoff Taskforce AI

Find out here how the interdisciplinary Oppenhoff AI Taskforce ensures your compliance with the requirements of the EU’s AI Act and the many other legal requirements for AI from the very outset, and that AI systems can be used in a legally compliant manner.

Download the brochure here.
