
CPDP - AI & 14 dataspaces

A network of 14 linked European 'dataspaces' containing, where possible, data on everyone, for (among other things) training AI. Brussels likes the idea.

In the closing speech at the CPDP conference, the European Data Protection Supervisor, Wojciech Wiewiórowski, addressed the risks of the "fragmentation of legislation" that this could cause. Currently, much of our right to privacy is still enshrined in the General Data Protection Regulation (GDPR).

With the first European data space, the European Health Data Space (EHDS), we see how that could start to change. In it, our right to consent for medical data is abolished, citing the 'public interest'. But what is the 'public interest'?

Together with Enrique Echeverria and Guido van 't Noordende, Privacy First organised a workshop on the EHDS at the CPDP conference.

AI

AI systems are hungry, self-learning algorithms, fed with both public and private data and used to generate and steer a new world: political propaganda, porn, deepfakes, Taylor Swift, disinformation and other things that look real but never actually happened. The big platforms are responsible for removing content that should not exist, but they are not transparent about the algorithms they use to do so.

In addition, every AI has a bias, which arises because data are by definition not neutral. In a workshop by Cosmonauts & Kings we experimented with AI to develop a campaign post, and it became clear how you can shift reality if you do not verify the AI's answers.
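To make that bias concrete, here is a minimal, hypothetical sketch (our own illustration, not from the workshop). A deliberately trivial "model" that only learns label frequencies from a skewed sample reproduces the skew of its training data rather than any neutral reality; any statistical learner inherits the same effect, only less visibly.

```python
from collections import Counter

# Hypothetical, deliberately skewed training sample: 90% of the
# (made-up) examples carry the label "approve".
training_labels = ["approve"] * 90 + ["deny"] * 10

# A trivial "model": always predict the most frequent label it has seen.
model = Counter(training_labels).most_common(1)[0][0]

# The 10% minority simply disappears from the model's output:
# the data's skew, not reality, determines what the AI reproduces.
print(model)  # -> "approve"
```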

The regulation and oversight of AI are still in their infancy. The first ideas are there, but if one thing became clear, it is that for now there are more questions than answers.

At the same time, there are the multinationals, which have to keep growing or risk being taken over themselves. BigTech could simply become BigAI.

Déjà vu - the false memory experience from The Matrix (1999)

Too big to fail

"To govern, or to be governed. That's the question" was rightly the central question at this year's CPDP congress. Big institutions are centralising power at the expense of individual autonomy. All that data is a goldmine, for science as well as for training BigAI. Can we (literally) pull the plug on that any time soon, if it does not turn out as we expect?

In passing, questions about energy requirements, natural resources and climate impact also came up.

CPDP Conferences 2024 | photo: CPDP

EHDS Opt-Out turns out to be fake

Two-thirds of the participants in our workshop said they wanted to use the 'opt-out'. In the EHDS version adopted by the European Parliament, everyone is supposed to get an opt-out for secondary use, and yet not really.

The state may override this opt-out if there is a 'public interest'. So this is not a real opt-out: everyone will be included in the EHDS, and only the making available of data is blocked. With a real opt-out, your data is blocked at its source, and there is no "balancing between privacy and data availability", as the Dutch health minister wants.

With a true opt-out, you are not a member of the club and do not exist in the system.
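To make the distinction concrete, here is a minimal, hypothetical sketch (our own illustration, not EHDS code or an official data model). In the EHDS variant, every record is ingested and the opt-out flag only gates release for secondary use, with a 'public interest' override able to ignore it; with a source opt-out, the record never enters the system at all.

```python
# Hypothetical illustration of the two opt-out models (not EHDS code).

patients = [
    {"id": 1, "opted_out": True,  "record": "..."},
    {"id": 2, "opted_out": False, "record": "..."},
]

def ehds_style_opt_out(db, public_interest_override=False):
    """Everyone is ingested; the flag only gates *release* for secondary
    use, and a 'public interest' override can ignore it."""
    ingested = list(db)  # all records enter the system regardless
    releasable = [p for p in ingested
                  if public_interest_override or not p["opted_out"]]
    return ingested, releasable

def source_opt_out(db):
    """A true opt-out: opted-out records never enter the system at all."""
    ingested = [p for p in db if not p["opted_out"]]
    return ingested, ingested

print(len(ehds_style_opt_out(patients)[0]))  # 2: opted-out data is still held
print(len(source_opt_out(patients)[0]))      # 1: opted-out data never stored
```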

Public interest

'Public interest' is that which is in the public interest, i.e. in the interest of every citizen in Europe. That goes without saying, you would think, until millions are made from a drug developed using our 'common' data and the manufacturer argues that the availability of that medication is in the 'public interest'. The same can happen with an AI trained on our data and then used in the software of very expensive medical devices.

While the EHDS states that the results, or outputs, of the use of our data must be made public within 18 months, it fails to clarify what it means by 'results' or 'outputs'.

In earlier versions of the EHDS, the use of data for commercial purposes, such as training AI, was mentioned separately, but in the latest version that has disappeared, so we have to assume that any interest counts as a 'public interest'.

Purposes for which electronic health data can be processed for secondary use: [...] (a) public interest in the area of public and occupational health [...]

- EHDS, Article 34

Of our participants, none were willing to share data for commercial use. For research by universities or non-profit organisations, there was a lot of enthusiasm. Such freedom of choice is, by the way, not offered in the EHDS.

State of democracy

When we can no longer decide for ourselves what personal data we share, and that data can be used to create an alternative reality, we no longer live in a free, democratic state governed by the rule of law.

We are no longer able to perceive whether that is the case or not.

This article was also published at PONT Data & Privacy; see AI & 14 European dataspaces - PONT Data&Privacy (privacy-web.co.uk).