‘I find that we no longer treat each other as “people”. Instead, we are the passive data subjects of an overarching business model’. That was how Giovanni Buttarelli, the European Data Protection Supervisor (whom I wrote about in my post of 1 October), opened our conversation on the impact of our digital environment. We met in his office in Brussels, in the European quarter, a stone’s throw from the European Parliament. The office, plain and simple like all EU offices, holds traces of his previous life as a judge: eighteen ring binders in a subtle shade of salmon pink line the floor to the right of his desk, carefully ordered in contrast to the informality of our meeting today…

We are talking about the unyielding link between personal data, privacy and the construction of algorithms. He was one of the first contributors to DigiDig.it and approached it with much interest and curiosity; he says he drew inspiration from it for his interventions at the 38th International Privacy Conference in Marrakesh a few weeks ago (which I will cover in a future post).

In our interview, he shares his thoughts on some of the specific topics at the heart of DigiDig.it: algorithms, creation, ownership, negotiability. ‘Besides the regulatory aspect, which we cannot ignore, I would like to know what values inspire us as we develop new technologies’. As it stands today, the protection of personal data, which constitutes a right relating to personality and thus a right that may be waived, rests on a series of guarantees, provisions and duties that are non-negotiable – even if the individual concerned gives consent. Below are his thoughts, suggestions and analysis.

Zangrandi. It is becoming increasingly urgent, and indeed mandatory, for personal data protection authorities to deal with algorithms: they need to play a role, both now and in the future, but what should that role be?

Buttarelli. Data protection authorities have a vast amount of expertise, but it is limited by the fact that the information about individuals that feeds algorithms is held by others (for instance, your purchase history). The concept of ‘personal data protection’ will disappear in the near future, as will the concept of ‘personal data’. We will all be easier to predict and identify even without data about our individual identities: it will be easier to reuse information, group it together with other information and interpret it accordingly.

Zangrandi. Will we then be moving towards the protection of clusters?

Buttarelli. The concept of ‘data-subject’ and therefore of the individual will in all likelihood disappear too, as we will be grouped together according to segments of information…

Zangrandi. What then?

Buttarelli. The difference between ‘personal’ and ‘non-personal’ will then be blurred; the concept of anonymous data is also at risk. This will certainly need looking into from a regulatory perspective. We are analysing it from many angles: from those that are strictly personal to those relating to law enforcement, and also from that of freedoms. Take movement, for example: the internet of things, the use of drones and everything else will result in machines interacting without us being aware of it. We will also have to focus on the speed of the machines, since that speed will produce automated decisions and assessments more rapidly than we can produce them ourselves. In this respect, organisations (data controllers) will not be fully aware of what is going on; instead, they will be left wondering: what is that machine doing?

Zangrandi. So we’d be in a Minority Report scenario? When can all this be negotiated?

Buttarelli. It all depends on our ability to build the future. Today, with Big Data, we see that individuals are in a position in which they do not control the data relating to them. This means that, across the many areas of online interaction – on social networks, search engines, and the devices (tablets, smartphones) through which we manage our lives and the services we use – we cannot know what use is being made of our electronic footprint.

Zangrandi. Is this simply ignorance?

Buttarelli. It is certainly a lack of knowledge, but also of education, of awareness and, no doubt, of ability. Europe has built up a series of deep-rooted rights in this respect, but we must now make them practicable and practical.

Zangrandi. Can we go back to the question about whether all this can be negotiated?

Buttarelli. It will take time, but some form of negotiation may develop in the future. Today, even when we do not exercise our right of access to personal data stored and protected in a traditional or even automated way, we still have the right to receive an intelligible account of its content, to rectify it if there are errors, complete it if it is incomplete, delete anything that is not relevant, update it if it is out of date or obsolete, and take action if any of the data violates our legitimate interests in any way. In any case, we have the right to know and to intervene, since whoever controls the personal data in question is required to be transparent.

Zangrandi. It seems difficult to apply those concepts to the creators and managers of algorithms that are based on personal data…

Buttarelli. If one day the same in-depth analysis or request for disclosure were targeted at managers of digital data who had been involved in encoding various types of models and algorithms, the only reply they could give would be along the lines of: ‘I’m sorry, I can’t tell you a great deal, because you are implicit in the algorithm itself. You came up in the results of an analysis whose details I can’t tell you about now, because it was based on an enormous amount of information that we might not even have any more, but which is the result of a lengthy certification process, in all likelihood carried out over a long period of time. And you are what the algorithm…’

Zangrandi. …has ‘decided’ that you are?

Buttarelli. No, it ‘thinks’ that you are. The algorithm perhaps knows you better than anyone else, but it will not be able to give you an overview of the items on which the diagnosis and analysis of your person or personality were based; in all likelihood the analysis will be very precise, so there will still be much space for transparency…

Zangrandi. …or, at least, for a type, or gradient, of transparency that is becoming less definable and identifiable, and is losing its reference to the individual. Consequently, since it is becoming objectively difficult to modify algorithms – they are changing in nature and are constantly being developed – the negotiability of their characteristics should be addressed in advance. And this ties in with a more general question of responsibility, of reporting and possibly of ethics in constructing and creating the algorithm itself.

Buttarelli. Absolutely. Experience shows that, for a certain period of time to come, human beings will continue to exert a certain type of influence on the programming of a great many applications, such as translation tools, automated decision-making through facial recognition, and a wide range of other automated decision-making processes…

Zangrandi. …or fingerprints registered and recognised by smartphones, tablets, laptops, PCs and the like, the future use of which is, in fact, already known…

Buttarelli. …certainly. In all of these examples, to which I would add the programming of drones, human beings will have a fundamental role. It will take some time before we have self-learning machines and autonomous artificial intelligence, such as robots that can programme themselves.

Zangrandi. And what can a supervisory institution like the EDPS do in such a situation? How can it manage the future while this situation is unfolding at a much faster rate than legal or regulatory instruments can be drawn up, debated, approved and implemented – instruments which, by definition, can at best only manage the present, and are often based solely on the past?

Buttarelli. I have around twenty years’ experience in this field, and for a long time I have said that data protection can provide many answers only if it is managed in a creative and innovative way: by looking specifically to the future, focusing on the guarantees as well as the formal requirements, and thinking about how those principles, which remain set in stone, can be transferred into and accommodated by the digital society.

Zangrandi. I don’t think that will be enough.

Buttarelli. It won’t be enough, because some of the questions and some of the answers revolve around ethical values and respect for the individual. I find that nowadays – even without the technological developments to come – we don’t actually treat each other as ‘people’ in the ways in which we interact; instead, we are the passive subjects, or objects, of an overarching business model focused solely on maximising the accumulation of information. That model has already realised that the winner takes all: those who amass the greatest amount of information will have not only the clients, but also the keys to establishing the winning models linked to their business…

Zangrandi. …irreversibly transforming not only the business, but also the relationship with the client, this time even more radically than in the past. What are the anticipated effects?

Buttarelli. The companies that are flourishing the most globally are those based on the gathering of information; we must think about what will happen in the future, when real Big Data, rather than industry Big Data, dominates.

Zangrandi. How do we explain this?

Buttarelli. With industry-based Big Data, we have legitimate initiatives for the combined use of information in order to organise our activities better. Big Data understood in its more restricted sense represents a concentration, a collection of data that currently cannot be ‘processed’ using standard software tools and instruments. As a result, relatively few people will own the analytical programmes and the technology.

Zangrandi. With momentous consequences…

Buttarelli. This will result in a profound overhaul and a change in the rules of the market; the concepts of sovereignty (even State sovereignty) and territory will also change, and the imbalances that currently exist between people and business will be even more pronounced.

Zangrandi. The problem, therefore, is what can be done…

Buttarelli. The time has come to open a large-scale debate on the ethical dimension.

Zangrandi. What do we mean by ‘ethical’ here?

Buttarelli. We want to understand what it means with respect to people, what ‘dignity’ means in the 21st century…

Zangrandi. …or, at least, given the pace of innovation and technological improvement in the first part of this century, perhaps within ten years we’ll already be well beyond dignity…

Buttarelli. …true; in any case, these issues have been the subject of a reform at European level which is now being applied throughout the world. We have counted 111 countries outside Europe in the broad sense (non-EU) and in neighbouring regions that have personal data protection rules in place which, unlike in the past, are principally modelled on the regulation adopted by the European Union.

Zangrandi. In a word?

Buttarelli. They are modelled on an ‘umbrella’ law that covers not only the effects on privacy, but also data protection as part of sound administration.

Zangrandi. And we are still focusing on the present…

Buttarelli. We may disagree in the future about the extent to which these rules have been updated to reflect the challenges faced by the information society, but one thing is certain: at least the European approach, which should still apply in full and be enforced from 25 May 2018, will not be changed in substance over the subsequent twenty-year period. Given the current rate of change, twenty years represent more than a century in the information and digital societies.

Zangrandi. The problem of facing the future remains…

Buttarelli. …a future that will be filled with instruments designed four or five years ago which are still around today but which…

Zangrandi. …are becoming obsolete?

Buttarelli. I wouldn’t go as far as to call them obsolete, but they risk becoming so if we do not immediately adopt creative measures to keep them up to date and in line with technological advances. We all have the antibodies to keep the ‘privacy-by-design’ and ‘privacy-by-default’ principles alive, but this needs to be done in addition to, and not as an alternative to, a legislative approach. And besides the regulatory aspect, which we cannot ignore at this time, I would like to know what values inspire us as we develop new technologies. We have said many times that we attach importance not only to what is technically achievable, but also to what is morally sustainable and socially acceptable; there can be no overall laissez-faire attitude. Just as eCommerce would not have developed without robust data protection, there can be no acceptance of digital evolution without a scrupulous analysis of its ethical direction. At a certain point, there is a risk that we will witness a rejection of the system.

Zangrandi. The question of whether the algorithm can be negotiated by the consumer remains open, as does the room for negotiating with those who design, use and market the algorithm and the indications stemming from it.

Buttarelli. I don’t see the scope for negotiation. Today, the protection of personal data, which constitutes a right relating to personality and thus a right that may be waived, is based on a series of guarantees, provisions and duties that are non-negotiable – even if the individual concerned gives consent. When we speak of correctness, proportionality and necessity in addition to consent, we mean that the fact that I say ‘yes’ in the relationship with my provider can be set aside: irrespective of that ‘yes’, a supervisory authority or a court can rule that the limits of acceptability have been exceeded. This is why it is not just a question of an individual’s acceptance; the model must be fundamentally uniform. And there is an imbalance that will only increase with the changes we have been talking about.

Zangrandi. What a shame. So the algorithm is in no way universal?

Buttarelli. What I’m saying must not be construed as Neo-Luddism; it is simply a proactive contribution to prevent rejection – that is, to ensure that all the benefits we anticipate, such as economic prosperity and well-being, can actually be pursued by the many and not just by a few.
