Report on the National Privacy Conference 2018
On Tuesday 30 January 2018, ECP|Platform for the Information Society and Privacy First jointly organised the first-ever National Privacy Conference at the Amsterdam Volkshotel.
With the rapid digitalisation of our society, the right to privacy is under increasing pressure. How can we as a society digitise while preserving and strengthening our privacy? How could the Netherlands develop into an international Privacy Guiding Country? During the conference, we jointly tried to find answers to these questions.
Speakers
Aleid Wolfsen (Personal Data Authority)
Our first keynote speaker was Aleid Wolfsen, chairman of the Personal Data Authority. He began his presentation with the video that can be found at www.hulpbijprivacy.nl, the Personal Data Authority's new campaign on the General Data Protection Regulation (AVG, the Dutch abbreviation of the GDPR), which comes into force on 25 May 2018. This is an awareness campaign for everyone about the rights and obligations under the new European privacy legislation.
The current Personal Data Protection Act (Wbp) will be repealed, as it is outdated and no longer suited to today's digital society. The General Data Protection Regulation is technology-neutral and addresses that need.
The right to privacy is a fundamental right and is also included in several European treaties. This right is elaborated by the EU in legislation through directives and regulations. The General Data Protection Regulation has direct effect in member states.
"When we talk about privacy, we are also talking about the foundations of the Dutch legal order. If you violate people's privacy rights, you touch those foundations. And those foundations of the Dutch legal order are 1) equality, 2) solidarity, 3) the classic rights of freedom and lastly the democratic rule of law."
Privacy used to be a lot simpler: you closed the curtains and no one could see what you were watching on TV or what books you had on the shelf. Privacy law is becoming increasingly important now that all data can be stored and linked together. If we were to look into the phones of visitors to this conference and look up their latest search-engine queries, we might know more about them than their partners do, and more about their health than their GPs. That is why it is so important that European law now stipulates that your personal data is protected.
What is privacy really all about?
Privacy is protection from an all-powerful, all-knowing government. The right to be left alone, to be able to be yourself everywhere without being followed and spied on or having cameras everywhere. That you can live freely as a free citizen, in a free country. That you only reveal what you think should be revealed. That is why data protection is becoming more and more important by the day.
The three pillars of the AVG
The first pillar is strengthening and broadening people's privacy rights, such as the right of access, stricter requirements for consent, the right to rectification, the right to be forgotten, data portability and the right to complain. From 25 May this year, if you think you have been wronged, you can lodge a complaint with the Personal Data Authority, and the Authority is obliged to handle that complaint. The second pillar is that the Personal Data Authority has been given more powers as a European privacy regulator. If the Authority finds that certain actions were unlawful, it will act, either with a warning or by imposing (potentially draconian) fines.
The third pillar is more responsibility for organisations. The duty to report data processing operations to the Authority will lapse, as every government and company now processes data. Instead, organisations must keep a register of what they process and must be able to justify it. Every government agency and some private organisations are required to appoint a Data Protection Officer. In addition, companies and governments must carry out a Privacy Impact Assessment. Two other important principles for data controllers are privacy by design (anything you develop should be developed privacy-friendly from the start) and privacy by default.
What does the law mean for the regulator?
The Personal Data Authority will be responsible for monitoring and enforcing the law, will gain stronger powers to impose fines and cross-border powers, will cooperate internationally, and will educate both the general public and organisations.
When is a Data Protection Officer mandatory?
Governments and public organisations are always required to have a Data Protection Officer. A DPO is also mandatory when an organisation processes special categories of personal data, such as a healthcare institution, or when it monitors people on a large scale.
Answers to questions from the audience
- What happens to the money from the fines?
The money goes back to the national treasury, into general funds.
- Can you talk more about the AVG Implementation Act and does it affect European cooperation?
The Implementation Act is currently still before the House of Representatives. The Netherlands is trying to implement the AVG as neutrally as possible, because the more we would deviate, the more difficult it would become to cooperate at European level.
- Is there a transition period for enforcement?
No, there is no transition period, as the lead time has already been very long. When the regulation was adopted in 2016, organisations were given two years to become compliant with the AVG.
Click here for the presentation by Aleid Wolfsen (pdf).
Gerrit-Jan Zwenne (Leiden University)
The second keynote speaker was Gerrit-Jan Zwenne, a professor at Leiden University's eLaw Centre for Law and Digital Technology. Zwenne made a number of observations on privacy law in his talk, starting with a quote from Niels Bohr: "It's hard to make predictions. Especially about the future." Our lawmakers make predictions about the future and about technological developments. Where did modern thinking about privacy begin? In almost all theses that Zwenne sees, there is a footnote or quotation from an important 1890 publication by Samuel Warren and Louis Brandeis in the Harvard Law Review. They wrote an article on the right to be left alone in response to a technological development, namely the portable camera. A number of other developments, such as the train and the newspaper press, meant that an image could reach the entire continent within a short time. This led Warren and Brandeis to wonder: should all this simply be allowed, or should you have the right to be left alone?
Where does our own current privacy legislation come from? This, too, is a reaction to technological developments, chiefly the computer. As early as the 1950s, computers raised fears about what happened to data. In the early 1970s, the social debate focused on computer profiling. Should we use legislation to correct the balance of power upset by computers capturing data, and give citizens rights over what is recorded about them? In the Netherlands, the Koopmans Committee was set up at the time; it wrote a report that was debated in the Lower House, but nothing was subsequently done with it. In Germany, legislation did come in at the time, as a direct response to technological developments. Ultimately, that was the basis for the European Privacy Directive (95/46/EC) and our Personal Data Protection Act (Wbp). With new legislation you try to anticipate upcoming technological developments, because you want the law to be future-proof. In the legal domain we solve this through open concepts and vague standards. We invent new rights, and sometimes these are experimental. One bold and interesting experiment is the right to be forgotten. It is actually a somewhat curious experiment: we have no idea yet what it means for our society, and it is also rather blunt. There is also data portability, which lets you transfer data from one party to another. Parties have to provide the data to the consumer in a clear form; you often get your own 'locker' with companies: my-telecom-provider, my-music-service, my-energy-service. There are also privacy risks associated with this, both in terms of security and when you want to switch from one party to another: you can do that with a single click, but then you also want strong authentication.
The right to be forgotten
The Chinese emperor Qin Shi Huangdi had all the books of previous regimes burnt. That, a little, is the feeling Gerrit-Jan Zwenne has about the right to be forgotten: is it a good idea to rewrite history that way? He is now somewhat more nuanced: in the age of Big Data we should have something akin to a right to be forgotten, but as it stands it is perhaps too absolute. Maybe the right should last five or ten years, after which you get a year's notice and have to submit a new request if you want it to last longer. We need to balance this right against the right to information and the right to freedom of expression. An absolute right to be forgotten is not only untenable (via, say, a VPN in the United States you will find the information anyway), it is also undesirable. The right to be forgotten has not yet been sufficiently thought through.
There are other flaws in privacy law, according to Zwenne. You know the definition, of course: "'personal data' means any information relating to an identified or identifiable natural person". This is a core concept in the AVG and the Wbp, and people who understand data and information find it odd: in information science, information is defined in terms of data, whereas privacy legislation defines personal data in terms of information. According to Zwenne, that reversal simply does not work.
Click here for the presentation by Gerrit-Jan Zwenne (pdf).
Researcher
Suddenly, a researcher entered the room. In a playful way, he tested the willingness of attendees to hand over their mobile phone to their neighbour. With the questions he asked, he tried to identify which information people are willing to share and which they are not. The audience at this conference was wary and shared little. The researcher was an actor from 'Wat we doen' ('What we do'), a theatre group that aims to bring a social issue (this time privacy) to the attention of schoolchildren, among others. For more information, please refer to https://watwedoen.nl/.
Jaap-Henk Hoepman (Radboud University)
The third speaker of the day was Jaap-Henk Hoepman, director of the Privacy & Identity Lab at Radboud University Nijmegen. Hoepman spoke in particular about privacy by design:
For many people, privacy by design is a vague concept. It is about including privacy considerations throughout the entire systems development process: from the moment you start thinking about the concept of the system, through the design, to the implementation. That effectively makes privacy a software quality attribute, similar to security and performance. Privacy by design is also a process, as it must be included throughout the development process.
Why is privacy by design important?
Privacy by design limits privacy risks and thus also any reputational damage or remediation costs. Those who reduce, mitigate or prevent privacy risks can limit this damage, under the motto: "What you don't have you can't lose." Another underexposed aspect is that privacy by design enables new business, just as security by design enabled internet banking. Privacy by design is necessary if you want to do things responsibly in, for example, healthcare, the Internet of Things or Quantified Self. And finally, privacy by design is necessary because it is a requirement under the AVG.
In Nijmegen, they developed privacy design strategies in which vague legal standards are translated into eight (technical) design strategies. These are eight issues that, as a techie, you need to discuss with the business and the legal department when designing a system.
Data-oriented strategies
If you look at an information-processing system such as a database in which data about individuals is collected, with different attributes per individual such as age, address and name, then minimisation is the first thing you can do: do not collect every possible attribute, but make a selection. Second, you can separate attributes across different databases, so that data cannot be combined. Third, you can abstract data, for example by recording whether someone is over 18 rather than their exact age. And finally, you can adequately protect the smaller database that remains.
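To make these data-oriented strategies concrete, here is a minimal Python sketch of how minimisation, abstraction and separation could look for a toy registration record. All field names, the helper function and the pseudonymisation step are illustrative assumptions, not taken from Hoepman's materials or any real system.

```python
# Toy sketch of the data-oriented strategies: minimise, abstract, separate, protect.
from datetime import date
import hashlib

raw_record = {
    "name": "Alice Example",
    "address": "Examplestraat 1, Nijmegen",
    "birthdate": date(1990, 4, 1),
    "favourite_colour": "green",   # not needed for the service at all
}

# 1. Minimise: keep only the attributes the service actually needs.
minimised = {k: raw_record[k] for k in ("name", "birthdate")}

# 2. Abstract: store a coarse property ("over 18") instead of the exact value.
def is_adult(birthdate, today=None):
    today = today or date.today()
    years = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return years >= 18

profile = {"name": minimised["name"], "over_18": is_adult(minimised["birthdate"])}

# 3. Separate: keep contact details in a different store, linked only by a
#    pseudonymous key, so the two cannot be trivially combined.
pseudonym = hashlib.sha256(raw_record["name"].encode()).hexdigest()[:16]
profile_store = {pseudonym: profile}
contact_store = {pseudonym: {"address": raw_record["address"]}}

# 4. Protect (hide): whatever remains should be encrypted and access-controlled;
#    that part is outside the scope of this sketch.
print(profile_store)
print(contact_store)
```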
Process-oriented strategies
First, inform users about the processing of their personal data. Next, give users control over the processing of their personal data. Next, commit to and enforce privacy-friendly processing of personal data. And finally, demonstrate that you process personal data in a privacy-friendly manner.
Separating data
It is important to think about physically separating data and to consider designing a system in which data is not collected centrally, but kept on users' individual devices, for example. An interesting example of this is the ability on an iPhone to have folders created and photos selected automatically based on facial recognition; this is done on the smartphone itself and not in a central system. Another good example relates to social media. Facebook is centrally organised: all information is stored centrally at Facebook. But if you look at the functionality, namely direct contact with friends and acquaintances, you could also do this peer-to-peer. The data you want to share would then stay on your own smartphone, and you would share it directly with the smartphones of friends and acquaintances. That is a whole different way of thinking, and you can consider it when developing any system.
Abstracting data
This involves not processing data in detail, but doing so at a more abstract level. A telling example of where this has been applied is the smart energy meter. With the first smart meters, the idea was that your usage would be transmitted to the supplier in real time. This caused a lot of fuss, because such data reveals a lot about your private life. That is why the smart meter now collects your consumption and reports the total every three months. Another example is age verification: if you only need to know whether someone is over 18, you don't necessarily need their date of birth.
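As an illustration of the smart-meter example, here is a small, hypothetical Python sketch in which detailed readings stay local and only per-quarter totals are reported. The function and the sample data are invented for illustration and do not describe how Dutch smart meters actually report.

```python
# Toy sketch of data abstraction: keep detail locally, report only coarse totals.
from datetime import date

def quarterly_totals(readings):
    """Aggregate (date, kwh) readings into totals per calendar quarter."""
    totals = {}
    for day, kwh in readings:
        quarter = (day.year, (day.month - 1) // 3 + 1)
        totals[quarter] = totals.get(quarter, 0.0) + kwh
    return totals

# Detailed readings stay on the meter (or in the home)...
local_readings = [(date(2018, 1, 1), 9.2), (date(2018, 2, 14), 11.5),
                  (date(2018, 4, 3), 8.1)]

# ...and only the aggregated totals per quarter are sent to the supplier.
report_to_supplier = quarterly_totals(local_readings)
print(report_to_supplier)
```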
Inform
Make sure that users can directly access the data you have collected from them in an easy way. Most companies now have a portal such as my-internet-provider, my-phone-provider et cetera, where users are informed.
Asked afterwards by the conference organisers what his experience of the afternoon was, Hoepman replied: "Good to see more and more privacy-friendly solutions entering the market. Privacy by design is mainly a matter of *wanting* and then doing. However, extra attention is needed for a good exchange of knowledge between the various parties: much more is possible in technology than most organisations know. As a result, potentially very privacy-friendly solutions are left on the shelf."
Click here for the presentation by Jaap-Henk Hoepman (pdf).
Panel
Moderator Marjolijn Bonthuis (ECP) put a number of statements to the panel. The audience's reaction to the statements was often unanimous, but sometimes varied. From the panel, Tim Toornvliet (Netherlands ICT) later responded: "I thought it was a nice mix of privacy experts and privacy advocates. It was inspiring to see how the nominees aim to improve user privacy with innovative technical solutions." "Privacy is alive and well!" is a quote from panellist Lennart Huizing (Privacy Company). He found it very funny that the audience responded en masse to the statement "privacy is more important than security" with: "that's a false dichotomy!" Then you know you have found an interesting audience... He also found the tangible discomfort around handing mobiles to the neighbour fascinating.
Dutch Privacy Awards
Nominees
There are four categories for which entries could be nominated:
- Consumer solutions category (from businesses for consumers)
- Business solutions category (within a company or business-to-business)
- Public services category (from government for citizens)
- Incentive award for a pioneering technology or person.
From the various entries, the independent expert jury selected the following nominees in each category:
Consumer solutions:
- IRMA (I Reveal My Attributes)
- Schluss
Business solutions:
- TrustTester
- Personal Health Train
Public services:
- Youth Privacy Implementation Plan (Amsterdam municipality)
After the discussion with panel and audience, the five nominees were given the chance to tell their stories: Schluss, Personal Health Train, IRMA, TrustTester and the Youth Privacy Implementation Plan (Amsterdam municipality) shared their initiatives with the audience.
Presentations
- IRMA (pdf) https://www.youtube.com/watch?v=q6IihEQFPys
- Schluss (pdf) https://www.youtube.com/watch?v=f-cU5Izs5EI
- TrustTester (pdf) https://www.youtube.com/watch?v=JCivU3LQg-8
- Personal Health Train (pdf) https://www.youtube.com/watch?v=Sn-KK4reRGg
- Youth Privacy Implementation Plan, Amsterdam municipality (pdf)
Winners
After the Award pitches by the nominees, jury chairman Bas Filippini (Privacy First) took the floor and Bart Jacobs of IRMA was able to receive the coveted first-ever Dutch Privacy Award. IRMA (I Reveal My Attributes) is a state-of-the-art open source identity platform in which users can use an app to authenticate themselves based on one or more attributes from their various roles (contextual authentication). This authentication is non-identifying: using a 1-to-1 relationship between user and service provider, so-called "brokers" are no longer needed and the user can use services anonymously, without a password and with a minimum of required characteristics ("attributes"). The system was developed by Radboud University Nijmegen's Digital Security research group. Since late 2016, IRMA has been hosted by the independent Privacy by Design Foundation.
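To illustrate the data-minimisation idea behind attribute-based disclosure, here is a deliberately simplified Python sketch. It is emphatically not the IRMA protocol (IRMA relies on attribute-based credentials and zero-knowledge techniques rather than keys shared with verifiers); the HMAC construction and all names here are illustrative assumptions only.

```python
# Toy illustration of selective attribute disclosure: the issuer vouches for
# individual attributes, and the user reveals only the one a service asks for.
# NOT the IRMA protocol; it only shows the data-minimisation idea.
import hmac, hashlib

ISSUER_KEY = b"demo-issuer-secret"   # shared with verifiers in this toy model only

def sign_attribute(name, value):
    return hmac.new(ISSUER_KEY, f"{name}={value}".encode(), hashlib.sha256).hexdigest()

# Issuance: the user receives a separately signed value per attribute.
wallet = {
    "over_18": ("yes", sign_attribute("over_18", "yes")),
    "city":    ("Nijmegen", sign_attribute("city", "Nijmegen")),
}

# Disclosure: a webshop only asks whether the user is over 18.
requested = "over_18"
value, proof = wallet[requested]          # name, birthdate, city stay hidden

# Verification: the verifier checks the issuer's signature on just that attribute.
assert hmac.compare_digest(proof, sign_attribute(requested, value))
print(f"Verified: {requested} = {value}")
```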
The jury commends the development of IRMA from academia into a general-purpose privacy by design application for both the private and public sectors. As a means of privacy-friendly authentication, IRMA greatly appeals to the jury for the innovative power of the open-source technology used, its immediate deployability and its potential societal impact.
Within the broader framework of the Netherlands as an international Privacy Guiding Country, IRMA was therefore unanimously chosen by the jury as the winner of the 2018 Dutch Privacy Awards.
'Sleepwet' students
On the initiative of five Amsterdam students, a national referendum on the controversial new Intelligence and Security Services Act ('Sleepwet') will be held on 21 March. Regardless of the outcome of this referendum, it will lead to greater (and probably more critical) Dutch privacy awareness. This fact alone was a unanimous reason for the jury to reward these students with a Dutch Privacy Award (incentive award).
Click here for the entire jury report (pdf) with participation criteria and explanation of all nominees and winners.
The afternoon ended with drinks, during which the winners received congratulations and the speakers elaborated on their stories. This first edition left us wanting more! More information about next year will follow soon.
For more information on the privacy by design community please refer to https://ecp.nl/community-privacy-by-design.
Privacy First and ECP | Platform for the Information Society
Privacy First organised this edition of the Dutch Privacy Awards with support from the Democracy & Media Foundation and the Adessium Foundation, in partnership with ECP. Would you also like to become a partner of the Dutch Privacy Awards? Then get in touch with Privacy First!