On the EU-US data transfer problem

At Runbox we are always concerned about data privacy – “privacy is priceless” – and we put some effort into keeping ourselves updated on how the EU’s General Data Protection Regulation (GDPR) affects privacy-related issues. We do this because we want to be prepared if something happens within this area that affects the Runbox organization, our services, and, most importantly, our customers.

The case of EU-US data transfer is highly relevant because Runbox operates as a virtual organization, which could lead to an opportunity to engage consultants residing in the US. We know that many of our customers are as concerned as we are about data privacy, so we believe it is pertinent to share our findings.

In blog post #15 in our series on the GDPR we referred to the Executive Order signed by US President Joe Biden on 7 October 2022. This happened six months after the US President and the President of the EU Commission, Ursula von der Leyen, signed the Trans-Atlantic Data Privacy Framework with much publicity on 25 March 2022.

Joe Biden and Ursula von der Leyen at a press conference in Brussels. [Xavier Lejeune/European Commission]

In this blog post we will take a closer look at the Trans-Atlantic Data Privacy Framework, and the process thereafter.

Trans-Atlantic Data Privacy Framework

The objective of the Framework is to (re)establish a legal mechanism (with regard to the GDPR) for transfers of EU personal data to the United States, after two former attempts (Safe Harbour and Privacy Shield) were deemed illegal by the Court of Justice of the European Union (CJEU).

The Framework affirms the United States’ commitment to implement new safeguards ensuring that ‘signals intelligence activities’ (SIGINT, intelligence gathering by interception of signals) are necessary and proportionate in the pursuit of defined national security objectives. In addition, the Framework commits the US to create a new mechanism for EU individuals to seek redress if they believe they are unlawfully targeted by signals intelligence activities.

Following up the 25 March 2022 Biden–von der Leyen agreement, the US President signed the Executive Order (EO) ‘Enhancing Safeguards for United States Signals Intelligence Activities’ on 7 October 2022.

US compliance with the GDPR

Subsequently, a process was initiated within the EU Commission on 13 December 2022 to assess whether the US, after implementation of the EO, meets the requirements for inclusion on the list of nations compliant with GDPR Article 45, “Transfers on the basis of an adequacy decision”. That is, whether the European Commission has decided that a country outside the EU/EEA offers an adequate level of data protection. Personal data may be transferred seamlessly from the EU/EEA to those countries, without any further safeguards being necessary.

Inclusion of the US on that list is of course very important, not least for companies like Facebook and Google, as well as for US companies offering cloud-based services. Since the CJEU has deemed the earlier transfer schemes (Safe Harbour and Privacy Shield) illegal, “the whole world” is waiting for the EU Commission’s adequacy decision.

This came, as a draft, on 14 February 2023, in which the Commission concludes (page 54) that “… it should be decided that the United States ensures an adequate level of protection within the meaning of Article 45 of Regulation (EU) 2016/679 …”.

(The figure below illustrates the “road” for legislative decisions in the EU. A more comprehensive description of the legislative procedure can be found here.)

However, on the same day, 14 February 2023, the Committee on Civil Liberties, Justice and Home Affairs of the European Parliament concluded that “… the EU-US Data Privacy Framework fails to create actual equivalence in the level of protection …” and “… urges the Commission not to adopt the adequacy finding”.

Incompatible legislative frameworks

There are two important arguments, among others, behind the Committee’s conclusion: 1) there is no federal privacy and data protection legislation in the United States, and 2) the EU and the US have differing definitions of key data protection concepts such as the principles of necessity and proportionality (for surveillance activities etc.), as pointed out by the Court of Justice of the European Union (CJEU).

Shortly thereafter, on 28 February 2023, the European Data Protection Board (EDPB) made public its opinion on the EU Commission’s draft adequacy decision. The EDPB has some concerns that should be clarified by the Commission, for instance relating to exemptions to the right of access and the absence of key definitions.

Furthermore, the EDPB remarks that the adequacy decision is only applicable to US organizations which have self-certified, and that the possibility for redress provided to EU data subjects in case of violation of their rights is not clear. “The EDPB also expresses concerns about the lack of a requirement of prior authorization by an independent authority for the collection of data in bulk under Executive Order 12333, as well as the lack of systematic independent review ex post by a court or an equivalently independent body,” as stated in Opinion 5/2023.

The next step in the process is a vote on the Commission’s proposal in the European Parliament, probably in April, after which the adequacy decision must be approved by the member states before the EU Commission makes its final decision.

The Commission may set aside the result of the vote in the Parliament, but one should expect that the criticism from the Committee on Civil Liberties, Justice and Home Affairs, and the concerns of the EDPB, will impact the implementation of the Framework.

Here it would be prudent to recall the statement made by the Austrian non-profit organization NOYB, chaired by Maximilian Schrems: “At first sight it seems that the core issues were not solved and it will be back to the CJEU sooner or later.” This refers to the verdicts of the CJEU condemning the former frameworks Safe Harbour and Privacy Shield – the verdicts bearing the names Schrems I and Schrems II, respectively.

Bottom Line: The final outcome of the process is unclear, but in any event we have to wait for the final decision of the EU Commission.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought regarding any specific circumstances.


Privacy, GDPR, and Google Analytics

This is blog post #15 in our series on the GDPR.


Four European Data Protection Authorities (DPAs) have thus far concluded that the transfer of personal data to the United States via Google Analytics is unlawful according to the General Data Protection Regulation (GDPR).

It is quite certain that other European DPAs, including the Norwegian Data Protection Authority, will follow suit, because all EU/EEA members are committed to complying with the GDPR.

Website analytics vs privacy

Everyone who manages a website is (or should be) interested in the behavior of users across web pages. For this purpose there are analytics platforms that measure activities on a website, for example how many users visit, how long they stay, which pages they visit, and whether they arrive by following a link or not.

To help measure those parameters (and a lot of others) there exists a market of web analytics tools, of which Google Analytics (GA), launched in 2005, is the dominant one. In addition, GA includes features that support integration with other Google products, for example Google Ads, Google AdSense, and many more.

The use of GA implies collecting data that is personal by the GDPR’s definition, for instance IP addresses, which can be used to identify a person even if only indirectly. GA may apply pseudonymization, using its own identifier, but the result is still personal data.
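To illustrate why such pseudonymization falls short, here is a minimal Python sketch (our own example, not Google’s actual implementation) of the kind of IP truncation used in web analytics. Even after the last octet is zeroed, a record that carries a persistent client identifier still relates to an identifiable person, and thus remains personal data under the GDPR.

```python
import ipaddress

def truncate_ip(ip: str) -> str:
    """Zero the last octet of an IPv4 address, a common
    'IP anonymization' technique in web analytics."""
    net = ipaddress.ip_network(ip + "/24", strict=False)
    return str(net.network_address)

# A pseudonymous analytics record: no name, but the client_id
# (a hypothetical identifier for this example) persists across
# visits, so the record can still single out one user.
record = {
    "client_id": "GA1.2.123456789.1660000000",
    "ip": truncate_ip("203.0.113.42"),
    "page": "/pricing",
}
print(record["ip"])  # -> 203.0.113.0
```

The truncation removes some precision, but the combination of a stable identifier, timestamps, and visited pages is exactly what makes the data personal in the GDPR’s sense.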

The fact that data collected by GA, some of which is personal, is transferred to the US and processed there has led the DPAs of Austria, Denmark, France, and Italy to conclude that the use of Google Analytics is not compliant with the GDPR.

None Of Your Business

This conclusion has been reached after complaints submitted by the Austrian non-profit organization NOYB (“my privacy is None Of Your Business”) to a number of European DPAs.

The complaints are based on the Court of Justice of the European Union (CJEU) having concluded that the transfer of personal data to the US, without special measures, violates the GDPR.

According to NOYB, the Executive Order recently signed by US President Joe Biden will not solve the problem with EU-US data transfers with regard to the potential for mass surveillance.

DPAs on the case

The Danish DPA writes that even if Google has indicated that they have implemented such measures, these measures are not satisfactory in order “to prevent access to transferred personal data by US law enforcement authorities”.


The Norwegian DPA has thus far received one complaint regarding Google Analytics, and they are saying on their web site that the case is being processed.

They “will place great emphasis on what other countries have come up with”, they say in an email conversation.

Runbox will continue following these developments and keep you updated.

Note: Runbox used GA for a short period between 2011 and 2013. When we became aware of how Google collects data and how they could potentially use these data across their various services, we terminated our use of GA in October 2013. Since then we have used only internal statistics to monitor our service and visitor traffic on our web site, and in accordance with our Privacy Policy these data are not shared with anyone.


The Norwegian Consumer Council’s voice is heard worldwide

The Norwegian Consumer Council (NCC) has taken a strong position against commercial surveillance online, and has made it very visible how the Ad-Tech industry is exploiting personal data for business purposes.

Since the advent of social platforms on the Internet, “big data” has been accumulated and used unscrupulously by some companies for profit. Some of the players in the field share information they collect on users with third-party advertisers without their users’ knowledge or consent. The driver is all the money connected to targeted advertising. However, sharing personal data in this way is prohibited under the EU’s General Data Protection Regulation (GDPR).

The NCC has no authority to enforce personal data legislation, but the Norwegian Data Protection Authority (NDPA) does. And so, the NCC can freely report findings of breaches of the GDPR and Norwegian data protection regulations to the NDPA.

NCC and NDPA at the forefront

A good illustration of this interaction is the case against Grindr. Earlier this year the NCC, based on the report “Out of Control” (2020), raised the case against Grindr and five Ad-Tech companies that were receiving personal data through the app: Twitter’s MoPub, AT&T’s AppNexus, OpenX, AdColony, and Smaato.

All the complaints were filed with the NDPA (in cooperation with the European Center for Digital Rights, noyb.eu) because of violations of the GDPR. The complaints concern Grindr sharing sensitive personal data, such as group affiliation, sexual orientation, and geographic location, with several other parties without encrypting the traffic.

Even if data is anonymised, such as when third parties operate with their own proprietary identification numbers, it is possible to combine data from various sources with openly available information to produce a picture that can identify an individual.

In January, the NDPA announced a fine of NOK 100 million (€9.63 million or $11.69 million) against Grindr. In May this year the NCC also acted against eight companies, asking for details of their surveillance through the services Perfect365 and MyDays.

The Norwegian urge to protect personal data was also illustrated in May 2021, when the NDPA submitted an advance notification of an administrative fine of NOK 25 million to Disqus Inc. The company’s widget performs tracking, analysis, and profiling, and discloses personal data to third-party advertisers, thereby violating multiple articles of the GDPR (e.g. Articles 6 and 7).

The privacy movement grows stronger

All of these cases illustrate the NCC’s mission, but the NCC is working from a broader perspective: to establish a broad, international movement against surveillance-based advertising.

This movement got a push with the NCC’s seminal report Out of Control (2020), which has received media coverage in more than 70 countries, including the US and Japan (see our previous blog post).

Recently (June 2021), the NCC released another report: Time to ban surveillance-based advertising, with the subtitle The case against commercial surveillance online.

On page 4 there is quite a good summary of what the driving force is:

…today’s dominant model of online advertising is a threat to consumers, democratic societies, the media, and even to advertisers themselves. These issues are significant and serious enough that we believe that it is time to ban these detrimental practices.

In a coalition with more than 60 organizations from Europe and the US, including some 10 consumer organisations and the umbrella organisation BEUC (the European Consumer Organisation), the NCC sent an open letter to EU and US policymakers on 23 June 2021. The letter urges the policymakers to “…take a stand and consider a ban of surveillance-based advertising as part of the Digital Services Act in the EU, and for the U.S. to enact a long overdue federal privacy law.” The coalition backs up its call with the NCC’s reports.

On behalf of the NCC, the consumer research company YouGov conducted a survey among a representative sample of the internet population aged 18 and over about their attitudes to surveillance-based advertising. The result was unambiguous: only 10% responded positively to the idea of commercial actors collecting information about them online, and only one in five think that ads based on personal information are OK.

Runbox has a clear standing against the collection of consumer data and surveillance-based advertising: Our service is ad-free, and we never expose our customers’ data for commercial purposes. We are very strict when law enforcement authorities in Norway or foreign countries request that we disclose data about our customers.

At Runbox we are proud to reside in a country that puts privacy first, and we wholeheartedly support the appeal to ban surveillance-based advertising. Therefore Runbox will annually donate to support noyb.eu, and we have joined the list of individuals supporting the appeal.


Out of Control: Apps that share personal data revealed by the Norwegian Consumer Council

“If you are not paying for the product, then you are the product.”

This is a common saying when referring to online services that are offered for no financial payment (“free”).

The reason is that such services often collect personal data about you or your use of the service, which the provider can then sell to the online advertising and marketing industry. The payment they receive covers the cost of providing the service to you and also allows for a profit.

And so, they earn their money, and the app users are their product.

Apps as a source for big personal data

At Runbox we collect only the data that is required in order for us to provide our services to you, and that data is never shared with anyone for marketing or financial purposes.

However, it is common knowledge that companies like Google and Facebook use our personal data for targeted advertising. The personal data collected is anonymized and often aggregated to produce larger data sets, which enables them to target individuals or groups based on common preferences — for instance that they live in a certain location or like to drink coffee.

The idea that your data is anonymized might provide some comfort. But because of smartphones and the smartphone software applications (“apps”) many people use, companies can collect a wide range of data types and thus trace individuals without asking for personal details such as your name. Examples of this type of data are your smartphone’s unique identifier (IMEI number[1]) and IP address (when connected via WiFi).

Combined with your email address, GPS data, app usage etc., this makes it possible to identify specific individuals – namely you!
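A toy Python sketch (entirely hypothetical data, our own illustration) shows how such “anonymous” records can be re-identified: joining a pseudonymous ad-tech log with auxiliary information on a shared quasi-identifier, such as a device ID, is enough to attach a name or email address to the record.

```python
# Hypothetical pseudonymous ad-tech log: no names, only a device ID.
ad_log = [
    {"device_id": "a1b2c3", "gps": "59.91,10.75", "app": "period-tracker"},
    {"device_id": "d4e5f6", "gps": "60.39,5.32", "app": "dating"},
]

# Auxiliary data obtained elsewhere (e.g. a sign-up form or a data
# broker) linking the same device ID to an email address.
aux = {"a1b2c3": "alice@example.com"}

# Joining on the shared identifier re-identifies the "anonymous" record.
for row in ad_log:
    email = aux.get(row["device_id"])
    if email:
        print(email, "was at", row["gps"], "using", row["app"])
```

No single data set here contains a name, yet the join reveals who was where and which sensitive app they used — which is why stable identifiers count as personal data.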

Exposing the AdTech industry

To investigate this issue, the Norwegian Consumer Council (NCC), a government-funded organization representing consumer interests in Norway, published a groundbreaking report last year about how the online marketing and AdTech (Advertising Technology) industry operates.

The report’s title immediately raised the flag: “Out of Control” (OuC)[2]. And the subtitle outlines the findings: “How consumers are exploited by the online advertising industry”.

The report tested and analyzed 10 popular “apps” under the umbrella “social networking apps”, and the findings were concerning. Most users of such apps know that registering your personal data is optional, and after the introduction of the GDPR every app is careful to ask for your consent and encourages you to click OK to accept their Privacy Policy.

What many users will not know is how much and how far their personal data is distributed. Only a few users will be aware that clicking OK implies that their data is fed into the huge AdTech and MarTech industry, which is predicted to grow to USD 8.3 billion in annual revenues by 2021[3].

The players in this industry are giants such as Amazon, Facebook, Google and Twitter. If that was not enough, both iOS (Apple) and Android (Google) have their ways to track consumers across different services.

Apple, being more privacy-minded than some others, has recently developed options that allow the user to reset the “unique” advertising identifier on their devices, and to stop tracking across WiFi networks, breaking the identification chain and making it harder to target a specific user.

But the industry also has a large number of third-party data and marketing companies, operating quietly behind the scenes.

The far-reaching consequences of AdTech

This is what the NCC’s report is about, and the findings are concerning:

The ten apps that were tested transmit “user data to at least 135 different third parties involved in advertising and/or behavioral profiling” (OuC, page 5).

A summary of the findings is presented on OuC page 7, and here we find social networking apps, dating apps and apps that are adapted to other very personal issues (for instance makeup and period tracking). The data that is gathered can include IP address, GPS data, WiFi access points, gender, age, sexual orientation, religious beliefs, political view, and data about various activities the users are involved in.

This means that companies are building very detailed profiles of users, even if they don’t know their names, and these data are sent to, for instance, Google’s advertising service DoubleClick and to Facebook. Data may also be sold in bidding processes to advertising companies for targeted advertising.

It is one thing to see ads when you perform a Google search, but it’s quite another to be alerted on your phone with an ad while you are looking at a shop’s window display, or passing a shop selling goods the advertiser knows you are interested in. Scenarios like these are quite possible, if you have clicked “OK” to a privacy policy in an app.

Personalized, targeted ads are annoying, but even worse is that the collection and trade of personal data could result in data falling into the hands of those who may then target users with insults, discrimination, widespread fraud, or even blackmail. And there is clear evidence that personal data has recently been used to affect democratic elections[4].

What happened after the Norwegian Consumer Council published “Out of Control” will be covered in our next blog post, but we can reveal that one of the companies studied had a legal complaint filed against them for violating the GDPR and was issued an administrative fine of €9.6 million.

So stay tuned!

References:

  1. IMEI stands for International Mobile Equipment Identity.
  2. The report Out of Control was referred to in our previous blog post GDPR in the Wake of COVID-19: Privacy Under Pressure.
  3. Source: https://bidbalance.com/top-10-trends-in-adtech-martech/
  4. Source: https://privacyinternational.org/learn/data-and-elections


The Norwegian COVID-19 contact tracing app is banned by the Data Protection Authority

GDPR in the Wake of COVID Spread: Privacy under Pressure – Part 2

Our previous blog post in this series concerned mobile phone applications under development, or already developed, in various countries for tracing the spread of COVID-19 infections. In particular the blog described the situation in Norway, and we expressed our concerns, but also our trust, in the fact that The Norwegian Data Protection Authority (‘Datatilsynet’) would be on the spot to safeguard privacy – as regulated by strict Norwegian privacy regulations.

The Norwegian Data Protection Authority — more than a watchdog

Temporary suspension of the Norwegian Covid-19 contact tracing app
The Norwegian Smittestopp app

We were right, and we are proud of the intervention by the Norwegian Data Protection Authority (NDPA), which in June banned the Norwegian COVID-19 tracing app Smittestopp. The ban illustrates the NDPA’s independence, and that the NDPA has the legal power to enforce privacy protection when public (and private) organizations violate the law.

This power is anchored in the Personal Data Act (personopplysningsloven), the Norwegian implementation of GDPR, and the Personal Data Regulations (personopplysningsforskriften).

After evaluating the Smittestopp app as it was implemented in April this year, the NDPA concluded that the app violated the privacy legislation in two main respects:

  1. The app was not a proportionate intervention in the users’ fundamental right to data protection.
  2. The app was in conflict with the principle of data minimization.

On June 12, the NDPA notified the Norwegian Institute of Public Health (NIPH) that the app would be banned, which was confirmed on July 6. Consequently, NIPH immediately stopped collecting data from the around 600,000 active users of the app, and deleted all stored data on their Azure server.

What the requirement for proportional intervention means

The breach of the requirement for proportionate intervention concerned the app’s expected low value for infection tracking, due to the relatively small proportion of the population in the testing areas actually using the app (only 16%).

The reason for the breach of the principle of data minimization was that the app was designed to cover three different purposes:

  1. Tracing the movements of individuals (for research purposes).
  2. Monitoring the spread of the infection among the population.
  3. Assessing the effectiveness of infection control measures.

The NDPA was also critical of the app because users could not choose which of the three purposes their data would be used for.

A new app is already being planned

The government has decided to terminate further development of Smittestopp, and will instead focus on the development of a new app. After seeking advice from NIPH, the government has decided to base a new app on the Google Apple Exposure Notification (GAEN) System, or ENS, which they call “the international framework from Google and Apple” because many countries (for instance Denmark, Finland, Germany, Great Britain) are going “the GAEN way”.

Important arguments for the government’s decision are that GAEN supports digital infection tracking only (Bluetooth-based), involves no central data storage, and includes the possibility to exchange experiences and handle users’ border crossings. In the meantime the EU has implemented a recommendation for decentralized Corona tracking applications, putting GAEN “squarely in the frame“.
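As a rough illustration of why the decentralized approach requires no central data storage, here is a simplified Python sketch of the GAEN idea (our own simplification, not the actual protocol, which uses cryptographically derived rotating keys): each phone broadcasts random identifiers over Bluetooth, remembers the identifiers it hears, and matching against the published identifiers of confirmed cases happens locally on the device. No server ever sees who met whom.

```python
import secrets

class Phone:
    """Simplified decentralized exposure notification (GAEN-style sketch)."""

    def __init__(self):
        self.own_ids = []    # random identifiers this phone has broadcast
        self.heard_ids = []  # identifiers received from nearby phones

    def broadcast(self) -> str:
        rid = secrets.token_hex(8)  # rotating random identifier
        self.own_ids.append(rid)
        return rid

    def hear(self, rid: str):
        self.heard_ids.append(rid)  # stored locally on the device only

    def check_exposure(self, published_ids) -> bool:
        # Matching happens on the device; the server only publishes the
        # identifiers of confirmed cases, never any contact graph.
        return any(rid in self.heard_ids for rid in published_ids)

alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())           # a Bluetooth encounter
published = alice.own_ids             # Alice later tests positive
print(bob.check_exposure(published))  # -> True
```

The privacy properties follow from the data flow: identities and encounters stay on the handsets, and the only centrally published data is a list of random identifiers belonging to people who chose to report a positive test.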

NIPH was given the task of specifying a request for proposals in an open competition for the development of the new app, and now (October 20) the Danish company Netcompany has been hired to do the development. Netcompany has a similar contract with the Danish health authorities, and was the only bidder (!). The new app is expected to be implemented this year (2020).

The privacy debate continues

Three main issues are still being debated, and the first is technical: Is Bluetooth reliable enough? Experiences show that false positives, but also false negatives, do occur when Bluetooth is being used.

The second issue is of course privacy. Even if personal data is stored locally on the phone, notifications between phones have to be relayed through a network – so what about hacking? In addition, Trinity College Dublin has found that on Android phones, GAEN will not work unless it sends owner and location information back to Google.

This leads to the third issue: Is it sensible to let the tech giants control a solution that involves processing very personal information? “Do Google or Apple get to tell a democratically elected government or its public health institutions what they may or may not have on an app?”

The Norwegian Data Protection Authority published a report on digital solutions for COVID-19 (‘Coronavirus’) infection tracking on September 11 this year. The report was developed by Simula Research Laboratory, who did not bid on the contract for the new GAEN-based application (arguing that they are a research institution and not a software development company).

The report “… focuses on efficiency, data privacy, technology-related risks, and effectiveness for government use. In terms of privacy and data protection, the report notes that if location data is still stored by Google, the COVID-19 app Smittestopp would be less privacy intrusive than the GAEN one.”

Conclusion

We will conclude with a quote (in our translation): “There is no perfect solution for digital infection tracking. Effective infection control and privacy stand in opposition to each other.”

For us at Runbox, privacy is priceless, and we are still wondering if the pros outweigh the cons.
