IT Law and Data Protection
29.03.2023 Newsletter

Focus IT&C – 2nd Quarter 2023

Find out more about our IT law & data protection practice group - now regularly summarised for you at a glance! On a quarterly basis, we will be presenting you with the most important developments in IT law and data protection. In addition to informing you of the latest draft laws and developments in the field, we advise you on classic IT law, data protection law and new media. Please also feel free to contact us for audits, IT project support and consulting, including cloud computing, e-commerce topics and social media issues.

1. Europe and digital identity - reform of the eIDAS Regulation

2. From BAG to ECJ and back: dismissal of data protection officers

3. AI-based systems: processing of personal employee data to detect attacks (cyber threat monitoring tools)

4. OLG Brandenburg: limitations of the request for information under data protection law to determine errors in the calculation of insurance premiums

5. Artificial intelligence: will the AI Act drive ChatGPT out of Europe?

6. BGH: renewed referral to ECJ on standing to sue for GDPR infringements

1. Europe and digital identity - reform of the eIDAS Regulation 

Digital wallets on the rise

It is the European Commission’s vision that, by 2030, at least 80 percent of the population should be able to identify themselves digitally when making use of public services. A digital wallet for the European digital identity, the so-called “EUid wallet”, is intended to make this vision possible. The wallet is to be operated via mobile phone, among other devices. It will enable not only personal identification, but also the provision of driving licences, certificates and health certificates. If things go according to Commission President Ursula von der Leyen, it should even be possible to hire a bicycle via the wallet. The EU Commission has worded the scope of the planned project in correspondingly broad terms: 

“All EUid wallets should allow users to electronically identify and authenticate themselves online and offline across borders in order to access a wide range of public and private services."

Legal framework

Legally, this vision is embedded in the EU Commission's new draft of 3 June 2021 amending the eIDAS Regulation on electronic identification and trust services (draft eIDAS Regulation), on the basis of which the EU Parliament formally approved various amendment applications on 16 March 2023. The first EUid wallets are expected to be available as early as 2024. To date, the eIDAS Regulation is known in particular for regulating so-called qualified electronic signatures (QES). A QES is relevant in cases where the law stipulates the written form as a validity requirement for a legal transaction, as this can be replaced by a QES in accordance with Sec. 126a (1) of the German Civil Code [Bürgerliches Gesetzbuch, BGB]. A frequent aid in this context is the ID card, whose electronic functions have to be enabled for the QES. Following on from this, citizens should also be able to create such QES with the EUid wallet in future (cf. Art. 3 no. 42 draft eIDAS Regulation).

Comparable to the already existing QES, the member states do not have to issue and manage the EUid wallets themselves (even under the new eIDAS Regulation). Rather, they may use third parties for this purpose or entrust this task to a recognised, independent body (Art. 6a (2) draft eIDAS Regulation). The issuance of the QES is also reserved for so-called qualified trust service providers, which are certified in Germany by the responsible Federal Network Agency [Bundesnetzagentur].

Acceptance and data protection

It also remains to be seen how the EUid wallet will be accepted by the population. Although most Germans are familiar with the online ID card function, according to a mid-2021 survey only seven percent were using it. The technical implementation and intuitive handling of the EUid wallet are therefore likely to be decisive factors. As is so often the case with digital services, data protection will also play a major role. Special requirements exist here, since the legislative plan also includes the processing of sensitive data (Art. 9 GDPR). At the same time, the functions of the EUid wallet are to be usable throughout Europe. It will be interesting to see what technical and legal challenges the European digital identity will pose.

Dr. Axel Grätz


2. From BAG to ECJ and back: dismissal of data protection officers

Following the referral by the German Federal Labour Court [Bundesarbeitsgericht, BAG] of two cases to the European Court of Justice (ECJ) for a preliminary ruling in connection with the dismissal of a data protection officer, the matter is back before the BAG now that the ECJ has answered the referred questions. Firstly, the BAG has now ruled that the office of works council chairperson is not compatible with the function of company data protection officer (judgement dated 6 June 2023 - 9 AZR 383/19). Secondly, the BAG played the ball back to the court of appeal for further clarification of the facts and repealed the judgement of the Saxon Regional Labour Court [Landesarbeitsgericht, LAG] (judgement dated 6 June 2023 - 9 AZR 621/19).

Referral to the ECJ

Back in 2019, the BAG had referred various questions to the ECJ for a preliminary ruling on the dismissal of a company data protection officer for cause (“wichtiger Grund”). The ECJ commented on this matter in February of this year (cf. our article of 9 February 2023).

The BAG has now addressed the matter again and has issued two decisions regarding a conflict of interests in case of a company data protection officer and the resulting possibility of a dismissal from office.

Incompatibility of the works council office with the appointment as data protection officer (9 AZR 383/19)

In the case to be decided by the BAG, the plaintiff had held the position of works council chairman at the defendant's business. With effect from 1 June 2015, he was appointed as company data protection officer. Following the entry into force of the GDPR, however, the defendant revoked the appointment for operational reasons, arguing that the plaintiff's role as chairman of the works council gave rise to a conflict of interests with his function as data protection officer. In his legal action, the plaintiff sought a declaration to the effect that his legal status as company data protection officer continued unchanged.

The lower courts upheld the action and the BAG suspended the proceedings in order to obtain preliminary clarification of individual issues by the ECJ. While the ECJ did not specifically answer the question of whether a conflict of interest existed in the case at hand, it did state that a conflict of interest within the meaning of Art. 38 (6) GDPR exists when a data protection officer is assigned other tasks or duties that would require him to determine the purposes and means of the processing of personal data. The assessment should be made on a case-by-case basis and should take all relevant circumstances into consideration.

The BAG now states that the works council decides by council resolution under which specific circumstances it requests which personal data from the employer when exercising its statutory duties and in which way it subsequently processes this data. Within this framework, it determines the purposes and means of processing the personal data. The prominent function of the works council chairperson, who represents the works council within the framework of the resolutions passed, cancels out the reliability required for the performance of the duties of a data protection officer within the meaning of Sec. 4f (2) sentence 1 German Federal Data Protection Act [Bundesdatenschutzgesetz, BDSG] (old version). Accordingly, the dismissal from office for cause was also justified in the case at hand.

Further clarification of facts by the court of appeal required (9 AZR 621/19)

In other proceedings before the BAG (9 AZR 621/19), the plaintiff processed citizens' financial data in the course of his professional activities as an application consultant. The defendant dismissed the plaintiff as data protection officer on the grounds that his professional activities conflicted with his function as data protection officer. With his legal action, the plaintiff claimed that his dismissal was invalid, arguing that there was no cause for it.

While the lower courts ruled in favour of the plaintiff, the BAG first had the ECJ clarify the relationship between national provisions on dismissals from office and the provision of Art. 38 (3) sentence 2 GDPR. The ECJ then stated in its judgement of 9 February 2023 (C-453/21) that Art. 38 (3) sentence 2 GDPR, according to which a data protection officer may not be dismissed from office due to the performance of his duties, was not conclusive. The member states could adopt stricter national rules protecting a data protection officer from dismissal from office. The prerequisite for this, however, is that the national provision does not impair the achievement of the objectives of the GDPR, in particular the protection of the independence of the data protection officer.

The BAG has now overturned the appeal ruling of the Saxon LAG and referred the dispute back for a new hearing and decision. There is no BAG press release regarding these proceedings, but it can be assumed that the description of the plaintiff's activities to date does not yet sufficiently indicate whether the activities impair the independence of the data protection officer.

Conclusion

Fortunately, the BAG has clarified that the office of works council chairperson is not compatible with that of a data protection officer. The decision was made with regard to Sec. 4f (2) sentence 1 BDSG (old version). However, in its answer to the question presented for a preliminary ruling, the ECJ expressly made it clear that the provision of Sec. 38 (2) in conjunction with Sec. 6 (4) sentence 1 BDSG (new version), which allows a dismissal from office only for cause within the meaning of Sec. 626 BGB, does not fundamentally conflict with the Union law provision of Art. 38 (3) sentence 2 GDPR. This also applies if the dismissal from office is not related to the performance of the duties of the data protection officer. Accordingly, under the provisions of the GDPR and the BDSG (new version) as well, companies may dismiss a data protection officer insofar as that officer simultaneously holds the office of works council chairperson.

The BAG has, however, left open the question of whether membership of the works council as such fundamentally precludes the assumption of the office of data protection officer. Such a preclusion is supported by the fact that, although the works council chairperson represents the works council, consent to any decisions must be given by the council and thus also by the individual works council members. A supreme court decision on this point is still awaited.

As the referral back to the court of appeal in the further proceedings of the BAG has shown, however, whether there is a conflict of interests that is incompatible with the office of data protection officer must always be examined on a case-by-case basis. Companies should therefore examine the specific activities conducted by the data protection officer in their capacity as employee before any dismissal from office.

Annabelle Marceau 


3. AI-based systems: processing of personal employee data to detect attacks (cyber threat monitoring tools)

The demands on companies to mitigate cyber risks are constantly increasing. Companies that operate critical infrastructure have been required since 1 May 2023 by Sec. 8a (1a) of the German Act on the Federal Office for Information Security [Gesetz über das Bundesamt für Sicherheit in der Informationstechnik, BSIG] to deploy attack-detection systems. While these defence systems typically do not process content data (such as the content of e-mails or other communications), they do process network usage data, including IP addresses, which can be assigned to specific employees based on anomalies or incidents that have occurred.

Requirement of a legal basis

For this reason, a legal basis is required for the processing of personal data. First of all, the material scope of application of Sec. 26 (1) of the German Federal Data Protection Act [Bundesdatenschutzgesetz, BDSG] does not apply (to the extent that it even comes into consideration as a legal basis at all following the ECJ ruling of 30 March 2023 (docket no. C-34/21)), since the purpose of the data processing is prevention in the area of IT security and the focus is on the clarification of incidents other than criminal offences. This leaves recourse to Art. 6 (1) GDPR, which is rightly regarded as permissible according to the prevailing opinion.

  • The processing of personal employee data for identifying and defending against IT security risks can fundamentally be in the prevailing legitimate interest of the employer and therefore permissible pursuant to Art. 6 (1) sentence 1 lit. f GDPR. This is because data processing conducted to defend against cyber-attacks also serves to prevent hackers from accessing the data of employees and customers and is therefore also in their interests (Regional Labour Court [Landesarbeitsgericht, LAG] of Munich, decision dated 23 July 2020 - 2 TaBV 126/19, margin nos. 118, 123 et seq.). However, data processing must always be limited to what is absolutely necessary, conducted by means of black box processing to the extent possible, and a viable authorisation concept must be drawn up for accessing the data. Finally, appropriate deletion and archiving rules need to be implemented.
  • Insofar as the monitoring serves the protection of critical infrastructure, the processing can arguably also be based on Art. 6 (1) sentence 1 lit. c GDPR (processing to comply with a legal obligation), as its purpose is the implementation of the preventive IT security measures required under Sec. 8a BSIG.
    • Accordingly, operators of critical infrastructures must take appropriate organisational and technical precautions to prevent disruptions to the availability, integrity, authenticity and confidentiality of their information technology systems, components or processes.
    • The provision of Sec. 8a BSIG has been supplemented by the 2nd IT Security Act with paragraph (1a) BSIG, which even explicitly obliges these companies to use attack-detection systems (Sec. 2 (9b) BSIG) as of 1 May 2023.
    • In our opinion, these obligations should also be sufficiently concrete to derive from them legal obligations to conduct concrete processing operations (also according to the LAG Munich loc. cit. margin no. 125). Were stricter requirements to be imposed, Art. 6 (1) sentence 1 lit. c GDPR would run the risk of becoming meaningless.
  • Since the establishment and use of cyber threat monitoring tools is subject to co-determination pursuant to Sec. 87 (1) no. 6 German Works Constitution Act [Betriebsverfassungsgesetz, BetrVG], a works agreement is required as basis for the processing of employee data pursuant to Art. 88 (1) GDPR in conjunction with Sec. 26 (4) sentence 1 BDSG.  Although, because of Art. 88 (2) GDPR, it is questionable whether data processing by way of works agreement is even permissible beyond what is permitted under Art. 6 GDPR, there is still the benefit, at least in practical terms, that a works agreement can at least specify the modalities of permissible implementation.

Notification obligations

The information to be provided to employees as part of the mandatory information pursuant to Art. 13 and Art. 14 GDPR is of a general nature and does not have to contain details about the functioning of the system used. This is because this would be counter-productive and could make it easier to circumvent the systems. Independently of this, an individual data subject must be informed insofar as their data are processed in connection with a detected anomaly, unless there are concrete indications to suggest that tracks would have to be covered up or the functioning of the system would have to be disclosed after the receipt of such information.

Data protection impact assessment

If the system creates systematic employee profiles (IP-based or similar), a data protection impact assessment (DPIA) is required in accordance with Art. 35 (1) GDPR. Otherwise, in the absence of special circumstances, it is more likely that no such obligation exists (cf. in this respect no. 8 of the list of the Conference of Independent Data Protection Authorities of the Federation and the Länder ("DSK") (DSK DPIA Positive List) as well as the guidelines of the Art. 29 Data Protection Working Party ("WP 248 Rev. 01", confirmed in 2018 by the European Data Protection Board)). According to these guidelines, a DPIA is required for a "company systematically monitoring its employees' activities, including the monitoring of the employees' work station".

Dr. Marc Hilber


4. OLG Brandenburg: limitations of the request for information under data protection law to determine errors in the calculation of insurance premiums

In a recent decision (judgement of 14 April 2023 - 11 U 223/22), the Higher Regional Court [Oberlandesgericht, OLG] of Brandenburg addressed the question of where to draw the line in the case of a request for information based on Art. 15 GDPR. Against the background of the protective purpose of the GDPR, the Senate’s opinion was that the provision of information on executed premium adjustments constitutes an abusive use of rights if its sole purpose is to enforce possible payment claims against an insurer due to possible formal deficiencies.

The customer of a private health insurance company sought information on premium adjustments from his insurer by means of an action by stages in order to subsequently obtain a declaratory judgement to the effect that possible premium increases were invalid. This approach has recently become more common in insurance law, although the OLG reinterpreted the action by stages as a general accumulation of actions.

In the case, the lower court, like the OLG, granted a claim to information on premium adjustments for the years 2013 to 2020 through the submission of insurance policies and supplements pursuant to Secs. 3 (3) and (4) of the German Insurance Contracts Act [Versicherungsvertragsgesetz, VVG]. However, the OLG denied a further-reaching, comprehensive right to information with regard to all cover letters and enclosures, in particular with regard to Art. 15 GDPR.

Legally abusive nature of a request for information

The case demonstrates the relevance of the objective of the request for information expressed by the data subject for the question of the legality of such request, and at the same time the relevance of the data controller’s right of refusal under Art. 12 (5) sentence 2 lit. b) GDPR.

Here, the OLG makes reference to the question referred by the German Federal Court of Justice [Bundesgerichtshof, BGH] to the European Court of Justice (ECJ) on 29 March 2022 (VI ZR 1352/20, margin no. 12 et seq.), whilst simultaneously emphasising that the question of whether the right to information can be restricted in terms of content when pursuing other, non-privacy-related yet legitimate purposes is specifically not relevant in this case. In this case, the plaintiff's request for information was not only not based on a data protection objective; it also failed to serve any legitimate purpose whatsoever. It therefore was to be regarded as an abuse of rights.

When assessing whether a request for information constitutes an abuse of rights - and is thus at the same time "excessive" within the meaning of Art. 12 (5) sentence 2 lit. b) GDPR - the protective purpose of the GDPR is of particular importance. Recital 63 GDPR states that the right of access stipulated in Art. 15 GDPR serves to enable the data subject to gain awareness, easily and at reasonable intervals, of the processing of their personal data and to verify the lawfulness of such processing.

In this case, however, according to the plaintiff's own submissions, the request had nothing to do with any 'gaining of awareness' for the purpose of checking the permissibility of the processing of his personal data under data protection law. Rather, the purpose of the information requested was exclusively to review possible premium adjustments made by the insurer for possible formal defects under insurance contract law. In the opinion of the OLG, such an approach is not covered by the protective purpose of the GDPR.

With this decision, the court distinguishes itself from a decision of the OLG Celle. The latter’s decision had been based on the argument that the plaintiff’s motivation was irrelevant, because the GDPR does not make the right to information dependent on the plaintiff’s specific objective and, accordingly, the request for information does not have to be substantiated (judgement dated 15 December 2022 - 8 U 165/22, margin no. 121).

No claim to information on grounds of a different claim basis

All other claim bases coming into consideration in this context were ultimately also ruled out:

A claim to the requested documents arose neither from Sec. 3 (3) and (4) VVG, the scope of which does not include cover letters and enclosures, nor from Sec. 810 BGB, which permits the inspection but not the provision of information and sending of documents.

Finally, Sec. 242 BGB in conjunction with the insurance contract was also excluded as a claim basis. This would require, among other things, sufficient indications that a specific enforceable claim exists (BGH, judgement of 11 February 2015 - IV ZR 213/14, margin no. 29). In the case at hand, however, the purpose of the information was to provide the initial information as to whether a claim for payment actually existed against the insurance company.

Conclusion

The decision is an example of how the GDPR at times only protects those who share its particular concern: the protection of personal data. Just as the obligations of data controllers under the GDPR may not be dismissed as a nuisance, the rights given to the data subject may not be understood as a "cure-all" that can be pulled out of a hat whenever personal data is involved in some way. According to the decision of the OLG Brandenburg, the right to information under Art. 15 GDPR is reserved for those data subjects who are at least also concerned about protecting their personal data.

Dr. Hanna Schmidt


5. Artificial intelligence: Will the AI Act drive ChatGPT out of Europe?

Media coverage of Artificial Intelligence (AI) has long since moved beyond the merely exciting and ground-breaking. Warnings about dangers such as disinformation campaigns, malfunctions and the commission of crimes are increasing. Recently, AI experts from research, science and the tech industry have even warned of the risk of human extinction.

For example, the co-founder of the ChatGPT provider OpenAI warns of the dangers of uncontrolled AI. He was one of several hundred experts who, at the end of May, raised the call for the regulation of AI in a single but very memorable sentence:

"Reducing the risk of extinction from AI should be prioritised globally - on par with other risks to society as a whole, such as pandemics and nuclear war."

As a result, calls for firm security standards for AI development and regulation are growing louder. Discussions on legal issues of data protection and copyright are also gaining momentum.

The first EU draft of the Artificial Intelligence Act (“AI Act” for short) dates back to April 2021.

The introduction of ChatGPT and similar applications in the meantime has led to numerous changes to the draft. On 11 May 2023, a "compromise" draft of the regulation was published that accounts for the current developments surrounding the AI hype. This Commission draft was tightened up and published by the EU Parliament in mid-June.

What rules is the EU planning for ChatGPT & Co?

The AI Act focuses specifically on the regulation of "high-risk AI systems". These are systems that pose a significant risk to the health, safety or fundamental rights of individuals, such as AI applications in the healthcare or human resources management sectors. Providers of such high-risk applications must register their systems in an EU-wide database and conduct conformity assessment procedures before putting them into operation. Particularly risky applications, such as social scoring or facial recognition, are to be banned in their entirety.

Language models like ChatGPT and image generators like Midjourney have in common that they do not have a predefined purpose but can be used in a variety of ways. The term general purpose AI, i.e. multi-function or multi-purpose AI, has therefore emerged for these models.

The new draft introduces the concept of foundation models for these applications. These include, for example, language and image generators: basic models that can be adapted and further developed for a much wider range of specific tasks. A foundation model thus forms the basis for the development of more specialised AI applications.

One of the difficulties of regulating foundation models lies in their open-ended purpose. Is it the base model or the "fine-tuning" that gives rise to the greater risks?

For representatives from the industry, science and research, one thing is clear: "The same rules should apply to manufacturers of systems like ChatGPT and Midjourney as apply to manufacturers of high-risk systems."

(How) can you regulate something that holds unimagined possibilities?

The AI Act does not, however, classify foundation models as high-risk AI systems. It nevertheless imposes a variety of duties on providers of programmes like ChatGPT.

Providers of generative AI, for example, are responsible for ensuring that their models are designed and trained in such a way that no illegal content can be generated and no copyrighted data is published.

The regulation also places high demands on the transparency of the applications. Firstly, this requires that AI systems be developed and deployed in a way that is comprehensible and interpretable. Users must always be aware that they are communicating or interacting with an AI system. They must be educated about the capabilities and limitations of the system and provided with rights of appeal and information.

In addition, the EU is handing over some of the ongoing regulatory work to providers by requiring them to identify and mitigate in advance any "foreseeable risks" that their models could pose.

Violators face fines of up to EUR 20 million (for data management or transparency violations) or up to EUR 40 million for placing prohibited AI systems on the market.

Will tech giants benefit from the planned regulation?

Considering all possible applications and risk scenarios in advance could become an unmanageable mammoth task for smaller companies, scientists and smaller developer communities.

Lawyers, AI scientists and entrepreneurs therefore fear that strict regulation through the AI Act could slow down research in Europe and shift the development of foundation models to large corporations outside of the EU in the future.

As of January 2022, about 73 percent of major AI models were developed in the US and 15 percent in China. In the US, as many as 542 companies with a focus on "artificial intelligence" were founded last year, compared to just 41 in Germany.

What influence future EU regulation will have on this development remains to be seen.  

After publication of the EU Parliament's position in mid-June, the way is clear for the trialogue. In negotiations with the EU Commission and the member states, agreement is to be reached on the final version of the law by the end of the year, meaning that it can probably enter into force at the beginning of 2024. Companies will then have two years to adapt to the changed framework conditions.

Michael Lamberty


6. BGH: renewed referral to ECJ on standing to sue for GDPR infringements

Who can sue for GDPR violations? Legal standing of associations and competitors under scrutiny

The German Federal Court of Justice [Bundesgerichtshof, BGH] has again referred questions to the European Court of Justice (ECJ) regarding the possibility for consumer associations to take legal action on the basis of the German Injunctive Relief Act [Unterlassungsklagegesetz, UKlaG]. Additionally, the BGH has for the first time referred the question to the ECJ of whether the GDPR precludes national regulations granting competitors a right of action in the event of assumed data protection violations.

Possibility of legal action by consumer associations

The BGH's questions on the possibility of legal action by associations concern the interpretation of Art. 80 (2) GDPR. Among other things, the provision allows EU member states to grant certain non-profit bodies active in the field of personal data rights protection (especially consumer associations) judicial remedies against companies that violate their data protection obligations to the detriment of data subjects. Already in 2020, the BAG had nothing to do with this; rather, the BGH had submitted the question to the ECJ whether Art. 80 (2) GDPR precludes national legal action options that allow consumer associations to take legal action against data protection violations regardless of the violation of concrete rights of individual data subjects, without their mandate (decision dated 28 May 2020, docket no. I ZR 186/17). This is because the UKlaG provides for such an "independent possibility to take legal action". The ECJ answered this question in the negative, stating that the GDPR does not stand in the way of a national right of consumer associations to take legal action without a mandate from specific data subjects affected (judgement of 28 April 2022, case C-319/20).

The BGH has now referred to the ECJ the question of whether the infringement of information obligations under the GDPR can also be challenged by consumer associations by way of legal action (decision dated 10 November 2022, docket no. I ZR 186/17). The background to the question is that, according to its wording, Art. 80 (2) GDPR only allows an action if the complaining association is of the opinion that the rights of a data subject under the GDPR have been violated "as a result of a processing". This is questionable in the proceedings up for decision by the BGH because the plaintiff, the consumer association Verbraucherzentrale Bundesverband e.V. (vzbv), is challenging a deficient provision of information pursuant to Art. 12, 13 GDPR by the Facebook parent company Meta. However, the fulfilment of the information requirements does not constitute a "processing" according to the terminology of the GDPR; rather, it is only upstream of such processing. Furthermore, it is unclear what exactly constitutes the "legal infringement" that, according to the wording of Art. 80 (2) GDPR, must have occurred "as a result of the processing".

The question now referred to the ECJ by the BGH is also highly relevant for the possibility of legal action by the actual data subjects concerned. This is because Art. 79 (1) GDPR, which is relevant in this regard, also allows the data subject to bring an action (only) if they are of the opinion that "the rights to which they are entitled under [the GDPR] have been infringed as a result of the processing of their personal data [in breach of the GDPR]". Whether these requirements are met when the data subject takes legal action to obtain an injunction against certain data protection violations has been a matter of dispute to date. This is because, in the case of legal actions for forbearance from certain processing operations, the data subject - unlike in the case of actions for damages, for example - does not object to an "infringement of their rights as a result of a processing", but directly to the processing itself. If the data subject does not request forbearance from the "processing" but rather forbearance from other data protection violations (e.g. the lack of information), the question arises - as in the case of the BGH - of how the term "processing" is to be interpreted in the context of Art. 79 (1) GDPR. Depending on the outcome of the proceedings before the ECJ, data subjects with a request for an injunction could be limited to filing a complaint with a competent supervisory authority pursuant to Art. 77 (1) GDPR.

Possibility of legal action by competitors

Another current referral case concerns the possibility of competitors taking legal action against a controller who violates provisions of the GDPR (BGH, decision dated 12 January 2023, docket no. I ZR 223/19).

In the proceedings up for decision by the BGH, a pharmacist is suing another pharmacist who sells pharmacy-only drugs via Amazon. The plaintiff is of the opinion that the order data constitutes health data, for the processing of which the defendant must obtain consent, which it did not obtain. On the basis of Secs. 3a, 8 of the German Unfair Competition Act [Gesetz gegen den unlauteren Wettbewerb, UWG], the plaintiff demands that the defendant forbear from processing the data without obtaining consent. The BGH has referred to the ECJ, among other things, the question of whether the GDPR precludes national regulations that grant competitors the power to take legal action against the infringer for violations of the GDPR. The background to this question is that the GDPR – unlike in the case of data subjects themselves and consumer associations (see above) - does not regulate any possibility of legal action. Insofar as the ECJ considers the legal protection possibilities of the GDPR to be conclusive, an action by competitors on the basis of the UWG would be excluded.

If the ECJ comes to the conclusion that there is room for national legal action in addition to the GDPR, the BGH will have to decide whether data protection provisions constitute market conduct rules within the meaning of Sec. 3a UWG. This question has been controversial in German case law up to now. The decision of the ECJ and the possible subsequent decision of the BGH will finally clarify whether the UWG allows competitors to issue warning letters and file lawsuits in the event of data protection violations. This question is of high practical relevance: were such a possibility of issuing warnings to exist, warnings for GDPR violations could ultimately become a mass phenomenon.

Marco Degginger




Tobias Kollakowski
LL.M. (Köln/Paris 1)

Junior Partner
Rechtsanwalt
Legal Tech Officer

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 423
M +49 173 8851 216
