Danish DPA bans Google Workspace use and US transfers

🧨 Danish DPA with most significant (?) decision since the Schrems II ruling - banning certain use of Google products and US transfers.


By Rie Aleksandra Walle

TL;DR: The Danish data protection authority banned a municipality's use of Google Workspace for Education and suspended all US transfers in a landmark decision of July 2022.

Listen to the Grumpy GDPR discussing this decision. And don't miss our follow-up episode speaking with the Danish DPA - coming soon!

In what’s perhaps the most significant event since the Schrems II ruling, and almost exactly two years later, the Danish data protection authority (DPA) Datatilsynet published a decision relating to the municipality Helsingor and their processing of personal data in the primary and lower secondary school (“folkeskole”).

Note that their decision is for this specific municipality. However, the DPA also writes that many of their conclusions "likely will apply to other municipalities using the same processing setup" and expects them to "take necessary actions".

The whole discussion around using Google products in Danish schools actually dates back to 2010, when the DPA concluded that the intended processing of special category personal data in Google Apps couldn’t fulfil the security requirements of the (then) Danish Personal Data Act. However, in 2017, the DPA gave Hedensted municipality permission to use Google Apps for Education with the 2010 SCCs (case file 2014-323-0221, case referenced on Dataethics.eu).

The decision of 14 July 2022, however, is a result of a prior decision issued to the municipality in September 2021. As both decisions are part of the same case and closely connected, let’s start by summarizing the latter.

The first decision – September 2021

The case against Helsingor municipality dates back to a complaint lodged by a parent in 2019, after his son became very anxious after reading a negative comment on his YouTube account – an account the parent was completely unaware of.

And neither was the municipality; at some point Google updated their terms to include several additional services such as YouTube and Gmail, but the municipality evidently hadn’t read these (!).

And even though the municipality hadn’t even conducted an initial risk assessment for the use of Google products in their schools, they managed to conclude, after the complaints from parents, that it was “unlikely” that the incident could represent any risk for the children’s rights and freedoms – so they decided not to notify the DPA.

But Helsingor isn’t the only one underperforming in privacy and data protection classes – a survey done by the Danish news outlet Version2 in 2021 shows that an astonishing 24 municipalities had never conducted a risk assessment at all, nor a DPIA, despite using Google products in schools for years. Another one admitted that they chose to continue using Google after the Schrems II ruling, despite identifying high risks. Perhaps worse is that they decided that a DPIA was “unnecessary”, although this is a clear requirement for high-risk processing activities.

The use of Google products in Danish schools has created a lot of debate over the past years, shown by numerous news articles and comment fields on 🔥. Although we won’t be able to cover it all in this article, you can read more in the various links.


To sum up the September 2021 decision, the DPA held that the municipality:

  • Failed to assess risks for the pupils’ rights and freedoms, consequently failing to demonstrate compliance with the GDPR Article 5(1)(c) as they didn’t minimize the personal data processed when creating users (the municipality admitted that it was indeed possible to minimize data).
  • Failed to demonstrate compliance with Article 5(1)(f), especially as they couldn’t demonstrate how they had configured new user accounts and, thus, couldn’t ensure sufficient security for the processing activities.
  • Lacked a legal basis in (and thus violated) Article 6(1) for processing personal data in Google’s “Additional Services” (such as YouTube and Gmail).
  • Failed to assess possible risks of using Additional Services after Google changed their terms of service. Since the DPA considers that the use of novel and complex technology, including software – and especially in the education space, where the data subjects are children and youth – generally represents a high risk, they held that the municipality had breached Article 35(1).
  • Failed, in several instances, to protect children whose names should be changed to an alias, resulting in a personal data breach as per Article 4(12).
  • Breached Article 32(1) by i) not conducting a risk assessment to identify risks connected with using Google’s Additional Services, ii) not analyzing the consequences of pupils accessing these Additional Services, iii) not testing the functionality and scope of these services before starting to use them, iv) not ensuring that pupils with protected names and addresses were anonymized and, finally, v) not sufficiently securing the computers against unauthorized access.
  • Failed to notify the DPA for several violations with a clear risk for the data subjects, in breach of Article 33(1).

Based on the above, the DPA ordered the municipality to bring their processing activities in line with the GDPR, by i) assessing the risks arising from the use of Chromebooks and G Suite (now Workspace) for Education – an assessment that maps the personal data flows the processing activities entail – and, ii) if the risk is deemed high, conducting a DPIA.

💡 Finally, the DPA instructed the municipality to contact all parents to ensure the necessary corrections, anonymizations or erasures are made where their kids’ personal data has been unintentionally published or shared.

The second decision – July 2022

Many were slightly shocked to see that the DPA on 14 July ordered Helsingor municipality to suspend all US transfers and imposed a general ban on all processing in Google Workspace, giving them a deadline until 3 August. ⏰

However, as the prior decision clearly shows, this has been ongoing for several years, so the municipality has had ample time to rectify the violations. What’s particularly interesting in this case, though, is the focus on transfers to the US and the DPA’s findings.


In the July 2022 decision, the DPA held that the municipality:

  • Breached Article 5(2), cf. 5(1)(a) by i) not including all risk scenarios, ii) not testing the functionality and scope of selected hardware and software, and, iii) not documenting how the municipality controls Google’s access to the personal data, especially in Chrome OS and Workspace’s interaction with Google’s backend.
  • Failed to document that the appointed processor could provide sufficient guarantees to satisfy the requirements of Article 28(1). Because they couldn’t exclude the possibility of the processor breaching their agreement, the DPA also held that this is a high risk in itself and that the municipality should have conducted a DPIA as per Article 35(1).
  • Failed to ensure that transfers of personal data to the US are sufficiently protected from possible surveillance by the US government, thus breaching Article 44, cf. Article 46(1)(c). The municipality had localized storage to Europe, but their agreement with Google was contingent on allowing support from outside of the EEA. For this, they had the 2021 SCCs in place. The DPA, however, found that Google is an “electronic communications service provider” as defined in 50 U.S.C. § 1881(b)(4) and, thus, subject to FISA 702 – which, as we all know, is highly problematic (read more here). The DPA therefore held that contractual and organizational measures were insufficient and that the municipality has to put technical measures in place.

In sum, the DPA expresses serious criticism of the municipality’s processing of personal data, which is in violation of Articles 5(2), cf. 5(1)(a), 24, cf. 28(1), 35(1) and 44, cf. 46(1).

What happens next

As mentioned, the municipality has until 3 August to remedy all the violations outlined above. And the debate has taken off once again: Will they manage in time? Who will pay? What will they use instead of Google products? Will the kids get their fundamental privacy rights back?

One thing’s for sure – we’ll watch this case closely. 🧐 (We might have another very interesting podcast episode coming up!)

PS: If your GDPR work is in a mess, contact us for an Actionable GDPR Audit to Identify & Manage Your Compliance Risks
