Out of Control: Apps that share personal data revealed by the Norwegian Consumer Council

“If you are not paying for the product, then you are the product.”

This is a common saying about online services that are offered at no financial cost (“free”).

The reason is that such services often collect personal data about you or your use of the service, which the provider can then sell to the online advertising and marketing industry. The payment covers the cost of providing the service to you and also allows for a profit to be made.

And so, they earn their money, and the app users are their product.

Apps as a source of big personal data

At Runbox we collect only the data that is required in order for us to provide our services to you, and that data is never shared with anyone for marketing or financial purposes.

However, it is common knowledge that companies like Google and Facebook use our personal data for targeted advertising. The personal data collected is anonymized and often aggregated to produce larger data sets, which enable them to target individuals or groups based on common preferences — for instance that they live in a certain location or like to drink coffee.

The idea that your data is anonymized might provide some comfort. But because of smartphones and the software applications (“apps”) many people use, companies can collect a wide range of data types and thus trace individuals without asking for personal details such as your name. Examples of such data are your smartphone’s unique identifier (the IMEI number¹) and your IP address (when connected via WiFi).

Combined with your email address, GPS data, app usage etc., such data makes it possible to identify specific individuals – namely you!
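
To make the linking concrete, here is a minimal Python sketch with entirely hypothetical data, showing how two separately collected “anonymous” data sets can be joined on a shared device identifier:

```python
# Hypothetical illustration: joining two "anonymous" data sets on a
# device identifier re-identifies a person without asking for a name.
ad_profile = {
    # collected by an app's advertising SDK, keyed on the device's IMEI
    "353299-08-176148-1": {"gps": (59.91, 10.75), "interest": "coffee"},
}
account_records = {
    # collected by another service running on the same device
    "353299-08-176148-1": {"email": "jane.doe@example.com"},
}

for imei, profile in ad_profile.items():
    if imei in account_records:  # the device ID acts as the linking key
        print(account_records[imei]["email"], "->", profile)
```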

Exposing the AdTech industry

To investigate this issue, the Norwegian Consumer Council (NCC), a government-funded organization representing consumer interests in Norway, last year published a groundbreaking report about how the online marketing and AdTech (Advertising Technology) industry operates.

The report’s title immediately raises a flag: “Out of Control” (OuC)². And the subtitle outlines the findings: “How consumers are exploited by the online advertising industry”.

The report tested and analyzed 10 popular apps under the umbrella of “social networking apps”, and the findings were concerning. Most users of such apps know that registering personal data is optional, and since the introduction of the GDPR every app is careful to ask for your consent and encourages you to click OK to accept its Privacy Policy.

What many users do not know is how much of their personal data is distributed, and how far. Only a few will be aware that clicking OK means their data is fed into the huge AdTech and MarTech industry, which is predicted to grow to USD 8.3 billion in annual revenues by 2021³.

The players in this industry include giants such as Amazon, Facebook, Google, and Twitter. If that was not enough, both iOS (Apple) and Android (Google) have their own ways of tracking consumers across different services.

Apple, being more privacy-minded than some others, has recently developed options that allow the user to reset the “unique” advertising identifier on their devices, and to stop tracking across WiFi networks, in order to break the identification chain and make it harder to target a specific user.

But the industry also has a large number of third-party data and marketing companies, operating quietly behind the scenes.

The far-reaching consequences of AdTech

This is what the NCC’s report is about, and the findings are concerning:

The ten apps that were tested transmit “user data to at least 135 different third parties involved in advertising and/or behavioral profiling” (OuC, page 5).

A summary of the findings is presented on OuC page 7, where we find social networking apps, dating apps, and apps adapted to other very personal matters (for instance makeup and period tracking). The data gathered can include IP address, GPS data, WiFi access points, gender, age, sexual orientation, religious beliefs, political views, and data about various activities the users are involved in.

This means that companies are building very detailed profiles of users, even if they don’t know their names, and this data is sent to, for instance, Google’s advertising service DoubleClick and to Facebook. Data may also be sold through bidding processes to advertising companies for ad targeting.
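
To illustrate what changes hands in such bidding processes, here is a sketch of the kind of payload a real-time bid request can carry, loosely modeled on OpenRTB-style field names; the values are simplified, hypothetical assumptions rather than an actual request:

```python
# Sketch of a real-time bidding request (simplified, hypothetical values).
bid_request = {
    "id": "auction-123",
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # advertising ID
        "ip": "192.0.2.10",
        "geo": {"lat": 59.91, "lon": 10.75},
    },
    "user": {
        "gender": "F",
        "yob": 1990,  # year of birth
        "keywords": "makeup,period tracking",
    },
}
# A request like this may be broadcast to dozens of bidders at once, and
# each bidder can retain the data whether or not it wins the auction.
```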

It is one thing to see ads when you perform a Google search, but quite another to be alerted on your phone with an ad while you are looking at a shop’s window display, or passing a shop selling goods the advertiser knows you are interested in. Scenarios like these are quite possible if you have clicked “OK” to a privacy policy in an app.

Personalized, targeted ads are annoying, but even worse is that the collection and trade of personal data could result in data falling into the hands of those who may target users with insults, discrimination, widespread fraud, or even blackmail. And there is clear evidence that personal data has recently been used to affect democratic elections⁴.

What happened after the Norwegian Consumer Council published “Out of Control” will be covered in our next blog post, but we can reveal that one of the companies studied had a legal complaint filed against it for violating the GDPR and was issued an administrative fine of €9.6 million.

So stay tuned!

References:

  1. IMEI stands for International Mobile Equipment Identity.
  2. The report Out of Control was referred to in our previous blog post GDPR in the Wake of COVID-19: Privacy Under Pressure.
  3. Source: https://bidbalance.com/top-10-trends-in-adtech-martech/
  4. Source: https://privacyinternational.org/learn/data-and-elections


The Norwegian COVID-19 contact tracing app is banned by the Data Protection Authority

GDPR in the Wake of COVID Spread: Privacy under Pressure – Part 2

Our previous blog post in this series concerned mobile phone applications, under development or already developed in various countries, for tracing the spread of COVID-19 infections. In particular, the post described the situation in Norway, and we expressed our concerns, but also our trust that the Norwegian Data Protection Authority (‘Datatilsynet’) would be on the spot to safeguard privacy, as required by strict Norwegian privacy regulations.

The Norwegian Data Protection Authority — more than a watchdog

Image: The Norwegian COVID-19 contact tracing app Smittestopp, temporarily suspended.

We were right, and we are proud of the intervention by the Norwegian Data Protection Authority (NDPA), which in June banned the Norwegian COVID-19 tracking app Smittestopp. The ban illustrates the NDPA’s independence, and its legal power to enforce privacy protection when public (and private) organizations violate the law.

This power is anchored in the Personal Data Act (personopplysningsloven), the Norwegian implementation of GDPR, and the Personal Data Regulations (personopplysningsforskriften).

After evaluating the app Smittestopp as it was implemented in April this year, the NDPA concluded that the app violated privacy legislation in two main respects:

  1. The app was not a proportionate intervention into the users’ fundamental right to data protection.
  2. The app was in conflict with the principle of data minimization.

On June 12, the NDPA notified the Norwegian Institute of Public Health (NIPH) that the app would be banned, and the ban was confirmed on July 6. Consequently, NIPH immediately stopped collecting data from the approximately 600,000 active users of the app, and deleted all data stored on their Azure server.

What the requirement for proportional intervention means

The breach of the requirement for proportionate intervention concerned the expected low value of the app for infection tracking, due to the relatively small share of the population in the testing areas actually using the app (only 16%).

The breach of the principle of data minimization stemmed from the app being designed to cover three different purposes:

  1. Tracing the movements of individuals (for research purposes).
  2. Tracking the spread of the infection among the population.
  3. Measuring the effectiveness of infection control measures.

The NDPA was also critical of the app because it was not possible for users to choose which of the three purposes their data would be used for.

A new app is already being planned

The government has decided to terminate further development of Smittestopp and will instead focus on developing a new app. After seeking advice from NIPH, the government has decided to base the new app on the Google Apple Exposure Notification (GAEN) system, or ENS, which it calls “the international framework from Google and Apple”, because many countries (for instance Denmark, Finland, Germany, and Great Britain) are going “the GAEN way”.

Important arguments for the government’s decision are that GAEN supports digital infection tracking only (Bluetooth-based), involves no central data storage, and makes it possible to exchange experiences and handle users’ border crossings. In the meantime the EU has adopted a recommendation for decentralized Corona tracking applications, putting GAEN “squarely in the frame”.

NIPH was given the task of specifying a request for proposals in an open competition for the development of the new app, and now (October 20) the Danish company Netcompany has been hired to do the development. Netcompany has a similar contract with the Danish health authorities, and was the only bidder (!). The new app is expected to launch this year (2020).

The privacy debate continues

Three main issues are still being debated. The first is technical: Is Bluetooth reliable enough? Experience shows that both false positives and false negatives occur when Bluetooth is used.
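
The reason is that Bluetooth proximity is typically inferred from received signal strength (RSSI) via a path loss model, and the result depends heavily on the surroundings. A minimal sketch, with typical but assumed constants:

```python
def estimate_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Estimate distance in meters from RSSI using the log-distance
    path loss model; tx_power_dbm is the expected RSSI at 1 m. Both
    parameters are device- and environment-dependent assumptions."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same signal strength maps to very different distances depending
# on the environment (walls, bodies, and pockets change the exponent):
for n in (1.8, 2.0, 3.0):
    print(f"n={n}: {estimate_distance(-75.0, path_loss_exponent=n):.1f} m")
```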

The second issue is of course privacy. Even if personal data is stored locally on the phone, notifications between phones have to be relayed through a network – so what about hacking? In addition, Trinity College Dublin has found that on Android phones, GAEN will not work unless the phone sends owner and location information back to Google.

This leads to the third issue: Is it sensible to let the tech giants control a solution that involves processing very personal information? “Do Google or Apple get to tell a democratically elected government or its public health institutions what they may or may not have on an app?”

The Norwegian Data Protection Authority published a report on digital solutions for COVID-19 (‘Coronavirus’) infection tracking on September 11 this year. The report was developed by Simula Research Laboratory, which did not bid on the contract for the new GAEN-based application (arguing that it is a research institution and not a software development company).

The report “… focuses on efficiency, data privacy, technology-related risks, and effectiveness for government use. In terms of privacy and data protection, the report notes that if location data is still stored by Google, the COVID-19 app Smittestopp would be less privacy intrusive than the GAEN one.”

Conclusion

We will conclude with a quote (in our translation): “There is no perfect solution for digital infection tracking. Effective infection control and privacy stand in opposition to each other.”

For us at Runbox, privacy is priceless, and we are still wondering if the pros outweigh the cons.


GDPR in the Wake of COVID-19: Privacy Under Pressure

Tech companies all over the world are rushing to support health authorities in combating the spread of the SARS-CoV-2 virus, which causes the better-known disease COVID-19. Whether those companies do so by invitation, by commitment, or by sheer self-interest, country after country is embracing mobile phone tracking and other technological means of tracking their citizens.

It might be worthwhile to take a deep breath and understand what’s currently technologically possible, and what might be at stake.

Tracking the infection

Everyone wants to avoid infection, and every government wishes to decrease the consequences of the pandemic within its country. And modern technology makes it possible to impose on citizens surveillance systems that represent a significant step towards realizing a Big Brother scenario.

In fighting the spread of the virus, it is crucial to know who is infected, track where the infected are located, and inform others who have been, or may come, in contact with the infected. It is precisely in this context that mobile phone tracking is playing a role, and it is currently being explored and implemented in some countries, raising ethical and privacy-related questions.

Smartphone tracking apps

Once tracking of individuals’ phones is established for this particular and possibly justifiable reason, it could be tempting for a government or company to use it for other purposes as well. For instance, tracking data could be combined with other personal data such as health data, travel patterns, or even credit card records. Or the location of the infected individuals could be presented on a map along with the persons’ recent whereabouts, perhaps supplemented with warnings to avoid that area. Privacy is under pressure.

A smartphone can also be used as an “electric fence” to alert authorities when someone quarantined at home leaves their premises, or to fulfill an obligation from the authorities to send geolocated selfies to confirm the quarantine. Some authorities even provide individuals with wristbands that log their location and share it with the relevant authorities. The examples are many, and they are real, underlining the ongoing pressure on privacy.

Big tech gets involved

Very recently, two of the world’s biggest tech companies, Apple and Google, announced that they are joining forces to build an opt-in contact-tracing tool using Bluetooth technology, drawing on beacon technology as well. The tool will work between iPhones and Android phones, and opens up future applications one cannot currently imagine.

In the first version, the solution is announced as an opt-in API (application programming interface) that will let iOS and Android applications become interoperable, and — now comes crux no. 1 — the API will be open for public health authorities to build applications that support Bluetooth-based contact tracing. A second step is planned — here is crux no. 2 — in which an upcoming update of both iOS and Android will make the API superfluous. Of course, you can opt out, but then you can’t download the operating system update at all.
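
For the technically curious, the privacy mechanism behind such Bluetooth contact tracing is to broadcast short-lived, rotating identifiers derived from a key that never leaves the phone. Below is a deliberately simplified sketch of that idea, not the actual GAEN key schedule (which is specified by Apple and Google and uses HKDF and AES):

```python
import hashlib
import os

def daily_key() -> bytes:
    """A random per-day key that stays on the phone."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """A short-lived identifier broadcast over Bluetooth; it rotates
    every ~15 minutes so observers cannot track the device."""
    return hashlib.sha256(day_key + interval.to_bytes(4, "big")).digest()[:16]

# If a user tests positive, their daily keys are published; other phones
# re-derive the rolling IDs locally and compare them with IDs they heard.
key = daily_key()
heard_ids = {rolling_id(key, 42)}                    # overheard by a contact
published = {rolling_id(key, i) for i in range(96)}  # 96 15-min intervals/day
print(bool(heard_ids & published))                   # True -> exposure match
```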

It is a double-edged sword: It is great that big tech companies are mobilizing resources to help in a public health crisis, but do we really want these companies to potentially know even more about our personal lives (in the name of the common good)? Privacy is under pressure.

Norway’s privacy oriented approach

Norway has also launched a mobile phone application to help limit the spread of the infection, but this development is done under a strict regime of privacy regulations and in accordance with the GDPR. The decision to implement the app was taken by the Government in a regulation containing specifications and strict requirements ensuring that the GDPR is adhered to, including limiting the app’s use until December 1, 2020.

It should be added that some of the GDPR’s exceptions for authorities are put into effect because of the extraordinary situation. However, the Norwegian parliament (Stortinget) may terminate the law supporting the regulation at any time if one third of its members so decide.

Even if it might, at least in theory, be feasible to reuse a similar app from another country, it is crucial that the software is developed from scratch in Norway. This ensures that Norwegian authorities maintain control over all functions and data, and that the privacy regulations in the GDPR are respected.

It is also comforting that the app is developed in cooperation with the Norwegian Data Protection Authority (Datatilsynet). Other countries allow similar apps to store health information, access images or video from cameras, or even establish direct contact with the police. Such functionality is naturally out of the question in Norway’s case.

The app is designed and will be used for tracking the pandemic only, and installation and usage are voluntary. When installed and activated, the app collects location data using GPS and Bluetooth, which is encrypted and stored in a registry.

In the case of a diagnosed infected individual, health personnel will check whether the person has installed the app. Individuals who have been closer than two meters to the “infected phone” for more than 15 minutes will be notified by text message. The location data is kept for up to 30 days, and when the virus is no longer a threat the app will stop collecting data. Users may at any time delete the app and all personal data that has been collected.
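
Expressed as code, the notification criterion described above might look like the following sketch (the data structure and field names are hypothetical, not taken from Smittestopp):

```python
from dataclasses import dataclass

@dataclass
class Encounter:
    distance_m: float    # estimated distance between the two phones
    duration_min: float  # how long the phones stayed within that range

def should_notify(encounters: list[Encounter]) -> bool:
    """True if any encounter meets the 2 m / 15 min criterion."""
    return any(e.distance_m <= 2.0 and e.duration_min > 15.0
               for e in encounters)

print(should_notify([Encounter(1.5, 20.0)]))                        # True
print(should_notify([Encounter(1.5, 10.0), Encounter(3.0, 60.0)]))  # False
```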

What does it take to succeed?

In order for the tracking to have any impact on the spread of infections, around 60% of the population* must use the application. At the time of writing (late April), 1,218,000 inhabitants had downloaded the application, that is, about 30% of the population for which downloading is allowed (age limit 16 years).
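
The 60% target matters more than it may appear, because an encounter can only be traced when both phones run the app. Under the simplifying assumption of uniform, independent adoption, the share of traceable encounters is roughly the adoption rate squared:

```python
# Share of encounters where both parties run the app, assuming uniform,
# independent adoption (a simplification).
for adoption in (0.30, 0.60):
    print(f"{adoption:.0%} adoption -> ~{adoption ** 2:.0%} of encounters traceable")
# 30% adoption -> ~9% of encounters traceable
# 60% adoption -> ~36% of encounters traceable
```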

However, the number of downloads is not a good metric, and there are a few obstacles to making the app operable. For instance, the app must be installed on the phone, permission to use GPS and Bluetooth must be given, the four-page privacy declaration* has to be accepted, and the battery must provide sufficient power at all times.

The battery issue turns out to be a problem because of GPS positioning* combined with the simultaneous use of Bluetooth, which seems necessary to obtain precise location data.

Furthermore, not everyone is accustomed to using the smartphone functionality that is needed, depending on the user interface. For instance, elderly people and people with vision impairments* may find the app difficult to use. And will the criterion of two meters for more than 15 minutes be too coarse a filter to provide useful results and subsequent notifications to users?

For these reasons, skeptics may wonder whether using the app means trading privacy for uncertain and unreliable infection tracking results.

Even if 60% adoption is not realized, the application will provide data for later research. For instance, data from mobile phone operators, who can trace phones’ movements between base stations, could be correlated with instances of infection.

In the name of fighting the pandemic, the main telecommunication companies* are now, under strict privacy safeguards, cooperating with the Norwegian Institute of Public Health to analyze movement patterns of the population compared with reported infections. Data is collected in groups of at least 20 people (phones), and identification of individual persons (phones) is not possible*.

Bottom Line

At Runbox we are very concerned about privacy and any type of user tracking that may infringe on this right. While various nations are developing and implementing technological solutions to combat the spread of the disease, we are grateful that we reside in a country with strong privacy traditions. In fact, the first version of personal data protection legislation was implemented in Norway as early as 1978.

It is crucial that the Norwegian Institute of Public Health and the Norwegian Data Protection Authority ensure that the app developers at Simula Research Laboratory (a Norwegian non-profit research organization) attend to both privacy and information security in a responsible manner, according to the well-established tradition in Norway.

When privacy is under threat, as in this case, it is absolutely justified that objections arise. It is often too easy to accept privacy intrusions in the name of a perceived common good.

But one related point could be made as a final remark: Perhaps it would be more appropriate to be concerned about personal data that is collected and shared through one’s use of social media, where personal data is traded and used for purposes that are literally out of control.

* Article unfortunately only available in Norwegian.


GDPR implementation part 8: “Personal data” in the EU and the US is not the same

We usually think of “personal data” as a term covering, for instance, a person’s full name, home address, email address, telephone number, and date of birth.

These are ordinary data that can obviously identify a specific person. But the personal data category of linked personal information also includes data such as social security numbers, passport numbers, and credit card numbers – data that can identify us, and that we usually feel more protective of.

Linkable and non-linkable information

But there is another category of data that on its own may not identify a person, but that combined with other information could identify, trace, or locate one. Examples are gender, race, sexual orientation, workplace, and employment status. This is linkable personal information.

Then we have the category of non-personally identifiable information: data that cannot be used on its own to identify or trace a person, for example IP addresses, cookies, device IDs, and software IDs (non-linkable personal information).
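
To summarize, the taxonomy can be expressed as a small lookup; the sketch below follows this post’s examples rather than any formal standard:

```python
# The three categories described above, with example fields.
CATEGORIES = {
    "linked personal information":   {"full_name", "ssn", "passport_no",
                                      "credit_card_no"},
    "linkable personal information": {"gender", "race", "sexual_orientation",
                                      "workplace", "employment"},
    "non-linkable (non-PII)":        {"ip_address", "cookie", "device_id",
                                      "software_id"},
}

def category_of(field: str) -> str:
    """Return the privacy category of a data field, or 'unknown'."""
    for category, fields in CATEGORIES.items():
        if field in fields:
            return category
    return "unknown"

# Note: under the GDPR, IP addresses and cookies ARE personal data,
# even though this US-style taxonomy treats them as non-PII.
print(category_of("ip_address"))
```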

Privacy regulations differ in the EU and the US

Now, we know that there are industries that exist almost under the radar while taking advantage of personal data. For instance, companies in the AdTech and MarTech industry base their business on collecting and trading personal data for targeted advertising and marketing.

Many of these actors try to take the protection of personal data seriously, and refer to the rules and regulations for processing personal data. In Europe this is the GDPR (General Data Protection Regulation) within the EU/EEA area¹, and in the US it is the responsibility of the FTC (Federal Trade Commission).

However, what the EU/GDPR and US government agencies mean by “personal data” differs. Specifically, the EU/GDPR definition is more comprehensive than the definitions often referenced by US agencies, such as that of NIST (the National Institute of Standards and Technology).

For example, the EU concept of personal data includes information such as cookies and IP addresses, which are not considered personal data in a US setting.²

This means that US websites stating in their privacy policies that they are GDPR compliant may still breach the GDPR if they combine their data with other data sets. For example, under the GDPR they must have the user’s consent to collect their IP address.

Definitions of “personal data”

National Institute of Standards and Technology’s definition

NIST’s definition of personal data is contained in its definition of Personally Identifiable Information (PII):

PII is any information about an individual maintained by an agency, including (1) any information that can be used to distinguish or trace an individual’s identity, such as name, social security number, date and place of birth, mother’s maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information.

US Office of Privacy and Open Government’s definition

Another PII definition comes from the US Office of Privacy and Open Government (OPOG):

The term personally identifiable information refers to information which can be used to distinguish or trace an individual’s identity, such as their name, social security number, biometric records, etc. alone, or when combined with other personal or identifying information which is linked or linkable to a specific individual, such as date and place of birth, mother’s maiden name, etc.

EU’s GDPR definition

Compare these PII definitions with the definition of personal data in GDPR Article 4(1):

‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

It is obvious that the GDPR defines personal data much more broadly than both NIST’s and OPOG’s PII, and this is underlined by this statement found in GDPR Recital 30:

Natural persons may be associated with online identifiers provided by their devices, applications, tools and protocols, such as internet protocol addresses, cookie identifiers or other identifiers such as radio frequency identification tags. This may leave traces which, in particular when combined with unique identifiers and other information received by the servers, may be used to create profiles of the natural persons and identify them.

The US is lacking comprehensive regulation

That said, US authorities are moving towards stronger protection of privacy and personal data, but as late as March 2019 the US Congressional Research Service wrote:

Despite the increased interest in data protection, the legal paradigms governing the security and privacy of personal data are complex and technical, and lack uniformity at the federal level. The Supreme Court has recognized that the Constitution provides various rights protecting individual privacy, but these rights generally guard only against government intrusions and do little to prevent private actors from abusing personal data online. At the federal statutory level, while there are a number of data protection statutes, they primarily regulate certain industries and subcategories of data. The Federal Trade Commission (FTC) fills in some of the statutory gaps by enforcing the federal prohibition against unfair and deceptive data protection practices. But no single federal law comprehensively regulates the collection and use of personal data (our emphasis).

Conclusion

When US websites claim to follow the rules for processing personal data, the claim is dubious at best compared to the regulations in the EU/EEA – on which Norwegian legislation is based, and to which Runbox adheres.

However, it should be mentioned that some US states, for instance California, do classify some anonymous data (e.g. IP addresses, aliases, and account data) as PII.

In addition, as stated in our Privacy Policy, the personal data we ask customers to register in order to use our services is very limited. We are conscious of the trust our customers place in us when they register personal data in our systems, and in return we can demonstrate that we comply with the regulations.

Addendum

Above we referred to the AdTech and MarTech industries and their use of personal data to identify, trace, or locate a person for advertising and marketing purposes. That topic is outside the scope of this blog post, but is absolutely worth writing about in a later post.

1 EEA = European Economic Area, that is, the EU plus three countries: Iceland, Liechtenstein, and Norway.

2 https://www.forbrukerradet.no/out-of-control/ footnote on page 102.


GDPR implementation part 7: Information and Tools for Implementation of Users’ Rights


One of the main objectives for the European Union (EU) when developing the replacement for the Data Protection Directive 95/46 (from 1995) was to expand individuals’ control over the use of their personal data.

This can be seen, in a broader view, as an implementation of the right to one’s private life, as laid down in the European Convention on Human Rights (Article 8). The right to respect for one’s private and family life is also stated in the EU Charter of Fundamental Rights (Article 7).

Norway has signed both of these agreements, and these rights are implemented in Articles 100 and 102 of the Norwegian Constitution and in the Norwegian Human Rights Act.

Already in Article 1 of the GDPR¹ we see the connection between the GDPR and the Charter of Fundamental Rights in particular:

This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data

Article 1(2) of the GDPR

Observe the expression “rights and freedoms of natural persons“, which is very important throughout the Regulation and is used 31 times in all.

Before we go further into the subject of this post, it is important to state that Norway’s legislation on the processing of personal data was already compliant with the GDPR before the latter was declared the new framework for legislation in Norway. The Norwegian Personal Data Act (PDA²), compliant with the GDPR, took effect on 20 July 2018.

First and foremost, the GDPR states that no processing of personal data shall take place unless the data subject has given consent (Article 6(1)(a)). Runbox obtains consent to the registration of our users’ personal data when they sign up for an account and accept our Terms of Service.

The GDPR (Article 6(1)(f)) also allows a controller – that is, Runbox in our context – to process personal data when there is a legitimate reason for doing so, i.e. something that is necessary in order to provide our services.

It is an important objective of the GDPR to secure one’s control over one’s own personal data. In this respect, the GDPR has given data subjects eight fundamental rights (Articles 15–22).

When implementing these rights at Runbox, we found that most of them were already in place. However, the introduction of the GDPR provided us with a checklist and an opportunity to analyze our status, and to improve our services in this respect.

Our Privacy Policy provides exhaustive information about how we process personal data, but here is an overview of the data subject’s rights, and our implementation of them:

  • The right of access (Article 15): Since Runbox does not collect other information than what users register themselves, users can easily check which personal data is processed. The data processing is done only in order to provide your email, and optionally your website and domain name.
  • The right to rectification (Article 16): You may at any time log in to your email account and change your personal information.
  • The right to erasure (‘right to be forgotten’) (Article 17): You may terminate your subscription at any time, and your account contents will subsequently be deleted after 6 months. Your personal details will be deleted after 5 years, in accordance with Norwegian accounting regulations. However, you may send a request to dataprotectionofficer@runbox.com for immediate erasure of your account contents.
  • The right to restriction of processing (Article 18): Runbox will never use your personal information for purposes other than providing our services to you, so restrictions are not necessary in our context.
  • The right to be informed (Article 19): Runbox uses your personal information only in order to provide our services to you.
  • The right to data portability (Article 20): If you wish to move to another email service provider and export your data, you will find information on how to do this in our services and documentation.
  • The right to object (Article 21): Since we will never use your personal data for purposes other than delivering the services you have agreed to, this right is implicitly fulfilled.
  • The right to individual decision-making (Article 22): This article is intended to protect data subjects against automated data-processing that might involve profiling them based on personally identifiable information, which is something Runbox doesn’t do.

Regarding questions or concerns about our implementation of the GDPR, customers may use the email address dataprotectionofficer@runbox.com as a direct channel to our appointed Data Protection Officer.

Some final remarks about consent: Runbox uses cookies in order to provide our services, and new users must give express consent to this on our signup page. On this page, and on the Account page once logged in, you may also give or revoke consent to receiving future news and offers from Runbox.

In our next post in this series, we will consider our contractual situation regarding GDPR requirements. Stay tuned.

Footnotes

1. The GDPR means Regulation (EU) 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). “Article” refers to an Article in the GDPR, unless stated otherwise.

2. The Personal Data Act (the PDA) means the regulations currently in force in Norway for the protection of individuals in connection with the processing of personal data, including the implementation of the GDPR in Norway (in force from 2018-07-20).


GDPR implementation part 6: Access Control and Permissions

In part 3 of this blog series we described how we mapped the “world” of our operations, including the following components:

  • Server infrastructure, including all servers and other hardware as well as the links between these.
  • Software components that comprise our application stack from the operating system level to the front-end application level.
  • Data networks, including how and where our servers are connected to the Internet, but also the Local Area Network at our premises.
  • Data inventory, i.e. all personal data including customer and employee data, financial records, information about partners/associates, etc.
  • Applications necessary to run the company itself, meaning software that is managerial in nature.

Access control concerns permissions attached to system-related objects. Within each of the components listed above, there may be several sub-objects — servers, software modules, data files, catalogues etc., to which restricted access should be implemented.

Creating an Access Control Table

These objects then form one axis of an Access Control matrix, or table (ACT). The other axis of the table includes organizational units, broken down into person-related objects (for instance segments or groups, but also individuals) for each unit.

After breaking these objects down to an appropriate level, we attached roles to each of these components. In terms of the GDPR, data processor and data controller are examples of roles to use in this context.

To each of the defined roles, we attached categories of tasks, for instance sysadmin, developer, and support staff tasks.

For our email service systems we found it convenient to structure the system-related objects into three main categories:

  • General software.
  • Application software.
  • Personal data.

Within each of these categories there are various numbers of objects to which access permissions are attached, together comprising the Access Control Table for the realm in question. For other realms of our “world” we used a similar approach, resulting in a number of ACTs that implement the principle of least privilege.
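
As a concrete illustration, an ACT over these categories reduces to a deny-by-default lookup from role and object to permitted operations. A minimal sketch, with illustrative role and object names rather than our actual configuration:

```python
# Illustrative Access Control Table: roles map to the objects they may
# touch and the operations allowed on each (names are examples only).
ACT = {
    "sysadmin":  {"general_software":     {"read", "write"},
                  "application_software": {"read", "write"},
                  "personal_data":        {"read"}},
    "developer": {"application_software": {"read", "write"}},
    "support":   {"personal_data":        {"read"}},
}

def is_permitted(role: str, obj: str, operation: str) -> bool:
    """Deny by default: allow only what the table explicitly grants."""
    return operation in ACT.get(role, {}).get(obj, set())

print(is_permitted("developer", "application_software", "write"))  # True
print(is_permitted("developer", "personal_data", "read"))          # False
```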

With this, the groundwork was laid for establishing the various mechanisms that implement the access control regime, in order to secure our most precious pieces of hardware, software, and data.

In our next blog post in this series we will look at Information and Tools for Implementation of Users’ Rights.


GDPR implementation part 5: Risk Assessment and Gap Analysis

In previous posts in this blog series we have referred to our main planning document, Rules and Regulations for Information Security Management (RRISM for short), where our road to GDPR compliance started. In that document we worked out the structure of the project, based on descriptions and definitions of its various components.

Obviously, risk management has to be taken very seriously, and the RRISM lays the groundwork for how we should handle this aspect of information security. The baseline is that risk management is an essential part of the company’s life, and one that comprises all its assets.

Defining and assessing risks

As usual, we first had to agree upon some definitions, and we found the following, taken directly from NIST (the National Institute of Standards and Technology), to be adequate for our purpose:

Risk is the net negative impact of the exercise of a vulnerability, considering both the probability and the impact of occurrence. Risk management is the process of identifying risk, assessing risk, and taking steps to reduce risk to an acceptable level.

Risk is a function of the likelihood of a given threat-source’s exercising a particular potential vulnerability, and the resulting impact of that adverse event on the organization.

In order to assess risks, we first have to identify possible threats that may exploit vulnerabilities in our systems or our organization.

In short: the primary objective of risk management is to protect assets that are at potential risk.

Analyzing assets

Then we outlined the methodology we adopted:

  1. Identify the assets that could be at risk.
  2. Identify possible threats and vulnerabilities.
  3. Identify the possible consequences of each potential vulnerability.

Each threat was characterized by probability and criticality, which together give one of four risk levels: Very High (red), High (orange), Medium (yellow), and Low (green). This helped us decide what to prioritize in terms of improvements, measures, and other actions.
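
As an illustration, such a scheme boils down to a small mapping from probability and criticality to a risk level. A sketch, assuming 1–3 scales and thresholds chosen for the example:

```python
def risk_level(probability: int, criticality: int) -> str:
    """Map probability and criticality (each 1-3) to one of four levels."""
    score = probability * criticality  # 1..9
    if score >= 7:
        return "Very High"  # red
    if score >= 5:
        return "High"       # orange
    if score >= 3:
        return "Medium"     # yellow
    return "Low"            # green

print(risk_level(3, 3))  # Very High -> act immediately
print(risk_level(1, 2))  # Low -> monitor
```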

Analyzing our assets, we actually found more of them than anticipated: 21 different asset types, ranging from our customer base, general software in use, and our own key business systems, through hardware and communication lines, to employees and partners – and more.

Threat, vulnerability, and gap analysis

Then we reviewed the vulnerability potential (what could go wrong) for each asset and created scenarios for the possible consequences if something happened that exploited a vulnerability.

The question raised thereafter was: Do we have the necessary measures and remedies in place to eliminate the potential vulnerabilities, or to mitigate the consequences if things go wrong — or is there a gap?

The next step was to find out what actions should be taken in order to close the gaps in cases where we were not satisfied with the situation, and this will be the topic of future blog posts in this series.

Conclusion

Our mantra through this process has been: Threats we can imagine will sooner or later become reality, but never as we expect them to happen, and never where we expect them.

We live in an ever-changing environment, which means that risks have to be monitored continuously, and so our risk assessment and gap analysis is continually evolving as well.


GDPR implementation part 3: Mapping our “world”

This is the third post in our series on Runbox’ GDPR implementation.

After structuring our GDPR project, the next piece of necessary groundwork was to map out the status of relevant facts about important areas of our business. The reason is that it is impossible to establish and maintain good security and privacy – and to determine GDPR compliance – if the “territory” is not clearly described.

The “territory”

The “territory” in question was first and foremost:

  • The email service delivery system, that is, the Webmail and backend systems and files – the development platform used, the components from which the system is built, the dependencies between those components, descriptions of access points, etc. – while being well aware that GDPR compliance also includes Privacy by Design requirements.

Other realms that needed describing were, for example:

  • The economic system in which the company operates, i.e. mapping out the network of organizations with which our company is involved – including partners, associates, suppliers, financial institutions, government agencies, and so on – in order to serve our customers.
  • Server infrastructure with all physical links and channels, and not the least: All software components.
  • Data networks, including how and where our servers are connected to the Internet, but also the Local Area Network at our premises.
  • Data catalogue, including of course all personal data, that is, what kind of data is registered about customers as well as employees and partners/associates.
  • Applications of all sorts necessary to run the company – applications that are managerial in nature.

Level of description

One problem encountered was how detailed the descriptions should be. Too many details would make the job unnecessarily big in the first place, and would be followed by a lot of maintenance to keep the documentation current.

We chose to start with a “helicopter view” to obtain an overview of the different realms, with the intention of fine-graining the documentation as required by the ultimate goal: to identify areas where privacy and security are of concern, ticking off issues that are well taken care of in light of the GDPR, or following up with measures to improve the situation and achieve GDPR compliance.

Of course, the GDPR Implementation Project is not a sequential one, as development projects seldom are. Therefore, from time to time we had to go back and adjust our planning tools when needs arose.

The next blog post in this series will concern our Information Security Policy.


Data Privacy Day

January 28th is Data Privacy Day, initiated by the Council of Europe in 2007. Since then, many advances in protecting individuals’ right to privacy have been made.

The most important of these is the European Union’s General Data Protection Regulation (GDPR), which took effect on May 25, 2018. Runbox has promoted data privacy for many years, anchored in Norway’s strong privacy legislation.

At Runbox, located in the privacy bastion of Norway, we believe that privacy is an intrinsic right and that data privacy should be promoted every day of the year.

Your data is safe in the privacy bastion of Norway

We’re pleased that Data Privacy Day highlights this important cause. Many who use the Internet and email services in particular may think they have nothing to hide, not realizing that their data may be analyzed and exploited by corporations and nation states in ways they aren’t aware of and can’t control.

While threats to online privacy around the world are real and must be addressed, we should not be overly alarmed or exaggerate the problem. Therefore we take the opportunity to calmly provide an overview of Norway’s and Runbox’ implementation of data privacy protection.

Norway enforces strong privacy legislation

First of all, Norway has enacted strong legislation regulating the collection, storage, and processing of personal data, mainly in the Personal Data Act.

The first version of Norway’s Personal Data Act was implemented as early as 1978. This was a result of pioneering work by the Department of Private Law at the University of Oslo, where one of the first academic teams worldwide within IT and privacy was established in 1970.

Additionally, the Norwegian Data Protection Authority, an independent authority, facilitates protection of individuals from violation of their right to privacy through processing of their personal data.

For an overview of privacy-related regulations in the US, Europe, and Norway, and a description of how Runbox applies the strong Norwegian privacy regulations in our operations, see this article: Email Privacy Regulations

Runbox enforces a strong Privacy Policy

The Runbox Privacy Policy is the main policy document regulating the privacy protection of account information, account content, and other user data registered via our services.

If you haven’t reviewed our Privacy Policy yet, we strongly encourage you to do so, as it describes how data is collected and processed when you use Runbox, explains your rights as a user, and helps you understand your options with regard to your privacy.

Runbox is transparent

Runbox believes in transparency and we provide an overview of requests for disclosure of individual customer data that we have received directly from authorities and others.

Our Transparency Report is available online to ensure that Runbox is fully transparent about any disclosure of user data.

Runbox is GDPR compliant

Runbox spent 4 years planning and implementing the EU’s General Data Protection Regulation, starting the process as early as 2014.

We divided the activities implementing the GDPR at Runbox into 3 main areas:

  • Internal policies and procedures
  • Partners and contractors
  • Protection of users’ rights

This blog post describes how we did it: GDPR and Updates to our Terms and Policies


More information

For more information about Runbox’ commitment to data privacy, we recommend reviewing the Runbox Privacy Commitment.


“Drop Gmail, Outlook, and iCloud: Norwegian challenger clearly best on privacy”

Runbox is hailed in a major Norwegian news outlet for providing superior privacy protection.

The article is based on a study by the Vienna University of Economics and Business, which compares Runbox to 4 other major email services.

Gmail is slammed in the same article for its poor default privacy settings and a pattern of privacy violations.

Image: Comparison of email providers.

In the study, Runbox scores high in all categories:

  • Informational Control: 7/7
  • Decisional Control: 7/7
  • Behavioral Control: 6/7
  • Privacy Friendly Defaults: 7/7
  • Technology Paternalism: 5/7
  • Privacy By Design: 6/7
  • Service Appeal: 5/7

At Runbox we are very happy with this increasing focus on privacy, which supports our long-held privacy commitment and our work on compliance with the EU’s General Data Protection Regulation.

The full digi.no article (in Norwegian) is available in PDF format.

