
Privacy in Europe: who stands up for citizens?

Report on the Privacy First debate night

On 19 September, Privacy First organised an energising debate night on privacy in Europe in a packed auditorium, moderated by Nelleke Groen. Below is a brief impression:

By Ellen Timmer  

Europe forgets the citizen

University lecturer and Platform for Civil Rights chairman Tijmen Wisman spoke about the theory and reality of data protection and civil rights in Europe. The thrust of his talk was that while Europe speaks highly of protecting civil rights and privacy, many regulations that are in the pipeline or have already been realised give little thought to citizens. Examples include the smart energy meter and the European rules for the 'smart' car (including eCall): companies and governments collect personal data unchecked, while the risks to citizens are virtually ignored. Governments believe the sales pitches of the IT industry, which is after large-scale data to improve its machine learning systems and claims that IT will make government better, more efficient and (above all) cheaper. In his view, the European 'Health Data Space' will turn citizens into cash cows of medical data. The citizen is a billiard ball on the European billiard table, rolled around wherever governments and companies find it lucrative. Wisman argues that both Europe and the Netherlands should renew their focus on citizens' interests, citing Herman Tjeenk Willink's publications, among others.

He ends with a call for citizens to organise and provide a counterweight to the lobbying of governments and big business.

Techies make the rules: 'standards'

The second speaker, Jan Smits, explains that the new trend is for rules to be made by technical people. Those rules used to be called 'norms' (think of the Dutch NEN norms) and are now called 'standards', following English terminology. These standards conceal all kinds of policy choices that are never democratically discussed.

Smits worries that under the AI Act, rules will effectively be written by tech companies, with the danger that citizens' interests are not taken into account. This is partly because the AI Act is based on Article 114 of the Treaty on the Functioning of the European Union, which concerns the internal market. Fundamental rights play only a limited role in Article 114.

Smits is deeply concerned about the rule of law in Europe: the interests of business are becoming ever more paramount, oversight is being privatised, and standard-setting bodies have traditionally not concerned themselves with the interests of citizens. One bright spot, according to Smits, is that the AI Act (or its implementing rules) mentions a role for civil rights organisations, but it does not answer the question of where those organisations will find the funds to pay the people who contribute to drafting standards on their behalf (traditionally, the participating business organisations bear those costs themselves).

Mass claims over fundamental rights violations

Privacy lawyer Eliëtte Vaal talks about the origins and role of the General Data Protection Regulation (GDPR, in Dutch: AVG) and its increasing, though still far from sufficient, enforcement. The problem is that while the Dutch Data Protection Authority (Autoriteit Persoonsgegevens) has many powers, it has too few resources, leaving much undone. It is difficult for individual citizens to take cases to court, so companies and organisations lack an incentive to comply with the GDPR.
Collective actions offer a possible solution. For instance, a collective action against Facebook before the Amsterdam court showed that people are not afraid to take on Big Tech. The new Dutch mass claims act (WAMCA) has the advantage that interest groups can claim damages directly on behalf of data subjects. As a result, a growing number of civil rights organisations are launching proceedings against GDPR violators. One example is the proceedings launched by the Dutch Consumers' Association (Consumentenbond) against Google.

Eliëtte herself is involved in the mass claim foundation CUIC (Consumers United in Court), co-founded by Privacy First, which is going to launch proceedings against antivirus software provider Avast. The foundation calls on anyone who has used Avast software in the past to participate.

Medical data hunger

The last speaker was Jonah Walk, who is concerned about the limitless appetite of companies and governments for medical personal data. Even though useful applications of medical data can be imagined, such as improving the chances of success of treatments and preventing side effects, the uninhibited collection and uncontrolled dissemination of medical personal data is very risky for citizens.

There is insufficient oversight of the misuse by companies of the data they obtain.
Another risk is that the government may use the data to interfere with people's health in inappropriate ways, which may infringe fundamental rights. Influencing people becomes easy when a lot of personal data is available.

The move towards large medical data collections and the large-scale, uncontrolled sharing of that data is taking place without the public discussion it deserves. That discussion should address questions such as:

  • for what purposes the data are collected and disseminated;
  • how it is checked that governments and companies handle the data carefully;
  • how a citizen can signal that mistakes are being made, and how those mistakes will be corrected;
  • what alternatives are offered to those who do not want to provide or share their data;
  • how dependence on large IT suppliers outside the EU can be avoided.

Both doctors and those they treat have a naive trust in digital systems and do not consider the risks.

Also read the summary of Jonah's argument on her weblog. Jonah is also active in the association The Fourth Wave.

In conclusion

It was an interesting evening on topics to which the political parties taking part in the upcoming elections should pay more attention.

 

Tijmen Wisman
Jan Smits and moderator Nelleke Groen
Eliëtte Vaal
Jonah Walk
Privacy First board members Paul Korremans, Wilmar Hendriks and Jacqueline Stokman