ESRs Bárbara and Onntje at Digital Legal Talks 2022

On November 24th 2022, ESRs Bárbara Lazarotto and Onntje Hinrichs participated in the Digital Legal Talks in Utrecht. The event was the third annual conference organised by the Digital Legal Lab, a research network comprising Tilburg University, the University of Amsterdam, Radboud University Nijmegen, and Maastricht University. The conference was dedicated to law and technology and covered a wide variety of topics, such as responsible data sharing, enforcement and the use of technology, and the recently proposed and passed EU data laws, with keynote speakers Sandra Wachter from the Oxford Internet Institute and Thomas Streinz from NYU School of Law (for more information, consult the program of the conference). Onntje and Bárbara both presented their research in relation to the Data Act Proposal (DA proposal), but each focused on different aspects: whereas Onntje discussed what the DA does or does not do for consumers, Bárbara focused on the business-to-government data sharing aspects of the proposal.

In his presentation “The Data Act Proposal: A Missed Opportunity for Consumers”, Onntje presented his perspective on why the DA might fail its objective of “empowering individuals with regard to their data” (a more extensive version of his findings has been published in Privacy in Germany 06/2022 (PinG)). His presentation therefore focused on the B2C data sharing provisions in Chapter II of the proposal. The DA contains various provisions that are supposed to strengthen consumer protection, such as (i) an obligation to design IoT products in a way that data are accessible by default; (ii) pre-contractual information obligations on data generated by IoT products; (iii) an obligation for data holders to make data available free of charge upon request by consumers; (iv) a requirement that data holders may only use non-personal data generated by IoT devices on the basis of a contract; and (v) a right to share data with third parties (including various provisions that protect consumers in their relations with third parties).

Onntje concluded, however, that these provisions might not be sufficient to empower consumers with regard to ‘their’ data as claimed by the proposal – instead, they might strengthen (or at least confirm) the position of data holders as de facto owners of IoT-generated data in B2C relations. First, the access rights are likely to be designed in a very restrictive way: instead of allowing consumers to port data, data holders will likely only be obliged to ‘make data available’. Second, the obligation imposed on data holders to conclude a contract as the basis for the use of any non-personal data generated by IoT devices would not provide any meaningful protection to consumers. Due to the complete lack of safeguards with regard to that contract, it would not solve any problems related to control but only legitimize them. By confirming data holders’ position as de facto owners of IoT data, the DA would therefore fail to create any incentive for data holders to design their products in a more privacy-friendly way. To provide more meaningful protection, the DA could have introduced, for instance, provisions that sanction unfair contract terms with regard to devices that excessively collect and process data entirely unrelated to the product or the services it provides.

In her presentation “The implications of the Proposed Data Act to B2G data sharing in smart cities”, Bárbara analysed how the business-to-government data sharing provisions of the DA could apply in smart city contexts. Chapter V creates a general obligation to make privately held data available based on “exceptional needs”, highlighting the circumstances in which those needs would exist, namely public emergencies and situations in which the lack of data prevents the public sector from fulfilling a specific task in the public interest. In a general analysis of Chapter V, Bárbara argued that some recent modifications made by the Czech Presidency gave the Proposal a different dimension, especially when it comes to the possible development of smart cities. The adoption of a more detailed Recital 58, which now provides guidance on what can be considered lawful tasks in the public interest, opens the path to a discreet development of smart cities. The modifications have also strengthened the connections of the Act with other regulations, especially the GDPR, demanding stricter personal data protection measures from the public sector, a matter that had been a source of criticism in many opinions on the Proposal.

Overall, Bárbara concluded that the recent changes can be considered a good step towards enhancing personal data protection in business-to-government data sharing and towards creating a lawful basis for sharing data in the public interest, which can benefit the development of smart cities. Nevertheless, she highlighted that the Proposal still falls short of having a broad impact on these contexts, since it does not counter the power imbalance between municipalities and the private sector, nor does it create measures against the “silo mentality” that permeates data sharing between businesses and governments in smart city contexts. Bárbara was also invited to participate in a panel on “Transparency Rules for Digital Infrastructures” organized by Max van Drunen and Jef Ausloos. The panel started with a general comparison between the Data Act and the Digital Services Act on access to data for research purposes. The issues with labeling “vetted researchers” and what it means to be a researcher were a topic of discussion. Finally, the “dependence on data” to conduct research was debated.

IRIT and LeADS Conference on Data Sciences and EU Regulations

The LeADS project will co-organize a conference titled “Conference on Data Sciences and EU Regulations”, in a hybrid format, at IRIT, Université Toulouse III – Paul Sabatier, Toulouse, France, on Tuesday 6 December 2022.

Program:

9:00 – 9:15 – Introduction by Jean-Marc Pierson (IRIT)

9:15 – 10:30 – Session 1

Keynote talk on “Technological barriers & opportunities for Data Sciences” – Jean-Michel Loubes (IMT-ANITI)

Keynote talk on “Barriers and opportunities for Data Sciences brought by EU regulations” – Emanuel Weitschek (Italian Competition Authority)

Debate

10:30 – 11:00 – Session 2

Posters presentation by Early Stage Researchers of the LeADS project

11:00 – 12:30 – Session 3

Panel on “Best practices for digital technology development in the era of big regulation” – Gabriele Lenzini (University of Luxembourg), Jessica Eynard (Université Toulouse 1 – Capitole), Nicolas Viallet (Université de Toulouse), Teesta Bhandare (Art Garde)

Debate

12:30 – 14:00 – Lunch


Participation is free and open to all, but prior registration is mandatory by 4 December at the latest.

ESRs’ Research Pitches are available on YouTube!

Our Early Stage Researchers present their research topics on the LeADS channel on YouTube!

Each researcher has prepared an individual one-minute research pitch in which they present their research topic, their main research questions, why their topic is important, and how they aim to research it. A summary of each topic is also available in the description of the video.

The videos are also organized in an easy-to-watch playlist that can be accessed here. We invite you to watch them and to follow the LeADS project on YouTube!

ESR Bárbara Lazarotto’s participation in the 1st Democracy & Digital Citizenship Conference Series

Early Stage Researcher Bárbara Lazarotto (ESR 7) presented her research at the 1st Democracy & Digital Citizenship Conference Series, hosted by Roskilde University on September 29–30, 2022 in Roskilde, Denmark.


As a part of the panel named Data bodies/digital panopticon, Bárbara presented her topic “The myth of ‘dataism’ and the construction of citizen-centered cities”, exploring the role of sociotechnical imaginaries in the datafication of cities.

To do so, Bárbara first presented the concept of sociotechnical imaginaries, explaining its role in pushing for the datafication of society, and especially of cities, by promising impartial, reliable, and legitimate decision-making while resulting in the datafication and categorization of citizens and entire populations.

Bárbara then proposed placing citizens at the center of the decision-making process for creating smart cities, connecting with Lefebvre’s idea of the “Right to the City”. This new participatory city-making does not intend to replace the current democratic system but to add new forms of citizen participation to it. Finally, Bárbara highlighted that data protection is also essential to increasing citizen empowerment in cities, by enhancing data minimisation, transparency, and proportionality.


A Recap of LeADS Training Module 5

From the 14th to the 23rd of September 2022, the 15 ESRs met again for their fifth and final training module. This time they had the chance to meet in the beautiful city of Kraków at Jagiellonian University, one of the seven beneficiaries of the project. The LeADS training program is structured around several training modules that together aim to train a new generation of researchers as legally attentive data scientists: experts in both law and data science, capable of working within and across the two disciplines. Whereas the fourth training module in Crete focussed on the computer science perspective, this training module focussed again more on the legal perspective.

Training Week 1: On Data and Ownership

The first week kicked off with issues surrounding the propertization of information. Ewa Laskowska-Litak from Jagiellonian University introduced the ESRs to existing legal regimes as well as technical solutions that can provide legal or de facto protection to data. Dariusz Szostek and Rafał Prabucki introduced the ESRs to the benefits and risks of blockchain as an opportunity for privacy and intellectual property rights. Furthermore, Adrianna Michałowicz presented and discussed current approaches in two key deliverables of the European Strategy for Data, the Data Governance Act and the Data Act, and how they reflect the latest European approach to data governance. Marietjie Botes from the University of Luxembourg presented legal approaches to ownership and how, prior to the European Strategy for Data, the Commission had reflected on introducing new property rights in machine-generated non-personal data.

ESRs on their first day in Kraków


Katarzyna Południak-Gierz from Jagiellonian University focussed her presentation more on the consumer perspective and on how personalisation techniques (e.g. behavioural or contextual tailoring) are used on consumers online. Together with Fryderyk Zoll, the ESRs had the opportunity to discuss the methodology of legal research and the difficulties they had encountered throughout their first year of research. A completely different topic was subsequently presented by Katherine Lee, research engineer at Google Brain. In her talk on language models, i.e. models that learn a probability distribution over a sequence given the previous tokens, she elaborated on how such models are trained on data and on how they might pose privacy risks if they leak information. The first training week ended with a session by Marietjie Botes on discrimination and the debiasing of algorithms.
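For readers less familiar with the concept, the sketch below illustrates, in plain Python, the idea described above: a language model assigns a probability to a sequence by chaining the conditional probability of each token given the tokens that precede it. The toy model and the helper names (next_token_probs, sequence_log_prob) are purely illustrative assumptions and are unrelated to the models discussed in the talk.

    # Minimal illustrative sketch (hypothetical toy model): a language model
    # factorises P(t1, ..., tn) = P(t1) * P(t2 | t1) * ... * P(tn | t1..tn-1).
    import math

    # Toy conditional distributions; a trained model would condition on the
    # full prefix and learn these probabilities from data.
    TOY_MODEL = {
        "<s>": {"the": 0.6, "a": 0.4},
        "the": {"cat": 0.5, "dog": 0.5},
        "a": {"cat": 0.3, "dog": 0.7},
        "cat": {"sleeps": 1.0},
        "dog": {"sleeps": 1.0},
    }

    def next_token_probs(prefix):
        """Return P(token | prefix); this toy only looks at the last token."""
        return TOY_MODEL.get(prefix[-1], {})

    def sequence_log_prob(tokens):
        """Sum log P(token_i | preceding tokens), starting from a <s> marker."""
        prefix, log_prob = ["<s>"], 0.0
        for token in tokens:
            probs = next_token_probs(prefix)
            log_prob += math.log(probs.get(token, 1e-12))  # floor for unseen tokens
            prefix.append(token)
        return log_prob

    print(sequence_log_prob(["the", "cat", "sleeps"]))  # log(0.6 * 0.5 * 1.0)

A model that has memorised rare training sequences will assign them unusually high probabilities, which gives one intuition for why training data leakage can become detectable, one of the privacy risks touched upon in the talk.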

Training Week 2: The Next Steps of the LeADS Project and Conference on Human Rights Centred AI

Week 2 kicked off with a session by Arianna Rossi from the University of Luxembourg on ethical and legal aspects of social media data mining from a researcher’s perspective. In her lecture on transparency and algorithmic decision-making, Agnieszka Jabłonowska focussed on the promises of transparency as well as its shortcomings. Using the example of consumer law, she discussed with the ESRs the effectiveness of newly imposed transparency obligations on online platforms with regard to the disclosure of ranking parameters. Looking at the concrete example of a platform where consumers can reserve hotels, the group concluded that, in their current implementation, the imposed transparency obligations might not achieve their goal of informing consumers about why certain offers are ranked above others. An IP law perspective was adopted in the following lectures. Michał Wyrwiński explained approaches to the “Protection of Image” in different jurisdictions. Furthermore, Żaneta Zemła-Pacud and Gabriela Lenarczyk discussed approaches to the propertization of information in the life sciences and the extent to which the IP system already provides control over data.

Patricia Thaine again adopted a more technical perspective in her presentation on privacy-enhancing technologies such as differential privacy and synthetic data. Mohamed Ali Kandi from the University of Toulouse introduced the ESRs to the fundamentals of networking, e.g. by presenting the technical structure ‘behind’ the internet. Claudia Castillo Arias from INDRA, one of the LeADS partners closely involved in LeADS activities, provided the ESRs with insights into the company’s business activity in her lecture “Military operations planning: difficulties and challenges”. On Wednesday 21st, all ESRs attended the conference “Designing Human Rights Attentive AI – An Interdisciplinary Perspective”, co-organized by LeADS and the LiderLab of Scuola Superiore Sant’Anna. The conference perfectly reflected the interdisciplinary approach of the LeADS project, with diverse presentations on topics such as “Can we trust Fair-AI?” (Salvatore Ruggieri – University of Pisa), “Ethical and Legal Issues in Accessing Biobank Data to develop AI tools” (Andrea Parziale – Maastricht University) and “Explaining AI in Cyber Defence” (Marco Antonio Sotelo – Indra).

Conference on Human Rights Attentive AI


On the day after the conference, the ESRs and beneficiaries of the LeADS project met to discuss and evaluate the overall progress of the project. The training in Kraków constituted the last training module for the ESRs. The ESRs will next come together in December in Toulouse for the Technical Innovations in Law Laboratories (TILL) workshop, where they will work on challenges that the non-academic partners of the LeADS project are confronted with in their business activities.


Finding a Way Out of the Ethico-Legal Maze of Social Media Data Scraping

For the latest issue of the European Research Consortium for Informatics and Mathematics (ERCIM) News journal, Arianna Rossi from LeADS beneficiary University of Luxembourg wrote an article titled “Finding a Way out of the Ethico-Legal Maze of Social Media Data Scraping”.

In her contribution, Arianna writes about the experience of the interdisciplinary project “DECEPTION” with regard to compliance with data protection rules and research ethics principles when doing research on internet data.

She concludes that, in order to help researchers find their way through the ethico-legal maze, practical guidance drafted in lay terms, as well as best practices and toolkits tailored to specific technologies, should be created. Her article is available on this website. A more extensive version of the findings has been published in the Privacy Symposium 2022.

TcIoT Workshop “Trusted Computing and the Internet of Things”


We are pleased to invite you to the workshop “Trusted Computing and the Internet of Things” that will take place on Thursday, November 10 in hybrid mode at Institut de Recherche en Informatique de Toulouse – IRIT (a LeADS partner). Participation in this event is free and open to all, but prior registration is mandatory. For more information and registration, go to the event website.

Program:

9:15 – 9:30 – Introduction

9:30 – 10:30 – Session 1 

  • Trust management in large-scale IoT – Pr. Mawloud OMAR (Université Bretagne Sud)
  • How to authenticate things (objects) remotely? Opportunities and challenges of today’s technologies – Dr. Gabriele LENZINI (University of Luxembourg)

10:30 – 11:00 – Coffee Break

11:00 – 12:00 – Session 2

  • Analyzing the risk of IoT-enabled cyber-physical attack paths against critical systems – Pr. Panayiotis KOTZANIKOLAOU (University of Piraeus)
  • Privacy Preserving Authentication for Internet of Vehicles (IoV) – Dr. Khaled HAMOUID (ESIEE Paris)

12:00 – 14:00 – Lunch Break

14:00 – 15:00 – Session 3

  • Secure integration of IT and OT – Prof. Sokratis KATSIKAS (Norwegian University of Science and Technology)
  • Trust in the IoT ecosystem – Dr. Youcef IMINE (Université Polytechnique Hauts-de-France)

15:00 – 15:30 – Coffee Break

15:30 – 16:30 – Session 4

  • Greater reliability in the IoT thanks to the group – Pr. Maryline LAURENT (Télécom SudParis)
  • Blockchain-based cryptographic key management for IoT – Dr. Mohamed Ali KANDI (Paul Sabatier University – Toulouse 3)

16:30 – 17:30 – Round table and conclusion

The social contract sauce. Contains: Europol, big data, spyware, employment contracts (May contain traces of privacy)

Credit: Europol

On the 19th and 20th of October, I was invited to participate in the Europol Cybercrime Conference held at the headquarters of Europol in The Hague, Netherlands.

This year’s theme was “The evolution of policing”, and it brought together law enforcement (“LE”) agents, private and public cybersecurity experts, Data Protection Officers, researchers and professors from all around the world to answer the question of whether there is a need for a social contract in cyberspace.

Although the topic of cyber policing may seem somewhat distant from the scope of LeADS, the two are surprisingly connected. Many links exist between aspects of European cybersecurity and law enforcement and key issues in the LeADS project, such as the regulation of cyberspace amid the tension between individual freedom and public interests, the concept of trust and its different declinations in the policing of the metaverse, fair vs effective data governance, the use of big data vs machine learning, as well as the opportunities and challenges of portability, interoperability and re-usability of data for policing purposes.

Introducing one of the first debates, the Commissioner for Home Affairs, Ms Ylva Johansson (one of only 8 female speakers out of 39 at the conference), opened with the statement that security is the social contract. It is understandable why Rousseau’s idea of the social contract would be intertwined with that of entrusting security to the power of the state, with its checks and balances, and of taking the pursuit of justice away from the hands of people who are driven by their individualistic amour propre. However, one part of that reading is missing, a part that I personally believe is the most important and that has too many times been discounted throughout the conference, sometimes accidentally, sometimes wilfully: in Rousseau’s vision of the social contract, each person should enjoy the protection of the state “whilst remaining as free as they were in the state of nature.”

Security is a necessary pillar for the existence and evolution of democratic societies, but it is only a starting point, one of the foundations, not the social contract itself. It is a condition for the existence of the social contract, but it is far from exhausting its functions. There is so much more that citizens of a democratic society can and should expect from a national state other than the mere prevention and investigation of crimes, offline and online: for example, the upholding of policies for improving social welfare, civil rights, healthcare, and protection from discrimination or anti-competitive behaviours. Understanding the social contract in its myopic, security-only meaning would legitimize Orwellian-like states that secure people through mass surveillance and social credit scoring. Privacy, in this context, is the first line of protection for that Rousseauian individual freedom, together with personal data protection, which functions as a proxy for the protection of every other fundamental right, freedom and legitimate interest enshrined in the European “constitutions”. The essence of the social contract lies in anticipating the moment for law enforcement action before fundamental rights are violated, while keeping all fundamental rights in the balance, security being only one among many.

This underlying leitmotiv of the conference resurfaced on many occasions. Representatives of law enforcement repeatedly lamented that bureaucracy concerning the rule of law and privacy most times ends up dulling investigative tools, for example by limiting the collection of personal data to specific legal bases, along with the time for its retention and analysis. However, what these laws limit is only the indiscriminate, trawl-like collection of non-contextual data for unspecified uses and unlimited time, in case they might come in handy in the future. It also seems clear that LE is still holding on to the promise of big data analytics, with its tenet of always collecting and retaining everything possible, while discounting the use of privacy-friendlier alternatives powered by machine learning algorithms that do not need such amounts of data, but rather smaller, sanitized, quality datasets to train and test models. A hybrid system that combines machine learning models with targeted data analysis would dramatically reduce the need for voluminous, noisy, cumbersome, leakable data collection and storage, while respecting the privacy of citizens who are not of interest: the first would help in the hunt for suspicious activities online, while the second circumscribes the area of investigation to suspected individuals only, thereby upholding proportionality.

LE’s request for more access to data depends on the trust of people in governmental institutions. And such trust is hard to establish, but breaks easily. One investigative journalist, in this regard, raised the thorny issue of the use of the Pegasus spyware by European LE agencies. The reference was to the spyware found installed on phones belonging not only to criminal suspects, but also to journalists, European prime ministers, members of parliament, and civil society activists; in total, it collected 4 petabytes of data on innocent people before being exposed by Citizen Lab, a Canadian research centre. Mutatis mutandis, but with the same critical lens, we should look at the current EDPS legal action against Europol. In this case, pending before the ECJ, the EDPS is challenging the legitimacy of the new Articles 74a and 74b of the Europol Regulation, which retroactively legalize Europol’s processing of large volumes of individuals’ personal data with no established link to criminal activity. It is no wonder that such events erode people’s trust in LE. Transparency in operations and decision-making could have played a positive role in establishing trust between private citizens and LE, yet on these occasions the lack thereof backfired abundantly, perhaps irremediably.

The problem that LE is facing is not only the need for more data and easier access to it, but also the need for data to be formatted, visualized and shared in a way that is actionable. Data actionability, in the context of coordination and crime prevention, requires both understandability by operators (starting with human-readability) and portability to receiving systems (starting with machine-readability). Unfortunately, on the side of operators, many high-level officers lamented the extreme lack of human resources with data science skills, which is in stark contrast with their pledge to big data and their concomitant jettisoning, or not hiring, of digitally competent personnel coming from civil society or the private sector (most open vacancies at Europol are restricted to seconded officials). On the side of portability and interoperability of data and systems, there is a lack of standardization, which renders communication and coordination among national police forces cumbersome and inefficient, much like in the European market for data.

All in all, the conference left a bitter taste in my mouth. One of the biggest lessons that years of research into the regulatory aspects of technology taught me is that technology regulation is complex. To make sense of it, analysts need a granular, expert and sensible look at the specific context in which technologies are deployed, but also an understanding of their effects in the macroscopic picture of international geopolitical, economic and social systems. Cybercrime prevention and repression is one such complex system, whose analysis and management need multidisciplinarity, out-of-the-box thinking, lateral and longitudinal vision, innovative skills, and state-of-the-art tools. But most importantly, this evolutionary process of policing will need to be built on the essence of Rousseau’s social contract, the credo that security is a corollary of freedom, not the other way around, and that it must serve freedom’s purposes.

Unfortunately, at least from an organizational standpoint, it seemed that Europol is following a different, if not altogether opposite, path to reach its security goals: the call for more data retention, the discounting of machine learning, the lack of expertise in digital skills and the admission of having difficulties in acquiring it, and the hunt for human and technical resources only from inside LE seem less like the evolution of a trustworthy, pioneering, EU values-driven agency, and more like a gradual transformation into an old-school police department.

Scuola Superiore Sant’Anna ESRs at Bright Night – Night of the Researchers

On 30.09.2022, teams from Scuola Superiore Sant’Anna and the Consiglio Nazionale delle Ricerche were scouting for “intergalactic parliamentarians” to solve the most pressing legal challenges of the next 100 years. And what better parliamentarians than those who will be living then? Children and parents, participants in the discussion game Regolare Tecnologie che Regolano (“regulating technologies that regulate”), got to try it for themselves!

The ESRs joined efforts to deliver an electrifying interactive spectacle during the Bright Night – Night of the Researchers, as the discussion game gave participants a unique chance to learn about the regulatory aspects of new technologies while engaging in dynamic and family-friendly debates based on the idea of a tug-of-war. Participants were presented with an idea or problem [“Should we install one thousand new CCTV cameras in Pisa?”, “Should we restrict video game time to only three hours a week?”, “Should we provide home-care robots to everyone over 65?”] and then asked to express their opinion by moving around the room and placing themselves in one of the five sectors of the parliament (strongly in favor, in favor, not sure, in disagreement, strongly in disagreement).

Participants were then confronted with a set of facts, based on real-life events, specifically picked to question their initial opinions, and hopefully make them switch sides, repeatedly! In line with the first goal of the game, this first part was designed to make them reflect on how hard it is to regulate technologies, as changes in context and use would significantly affect their “gut” opinions. After the round of facts was finished, teams were formed based on participants’ positioning in one (Agree) or the other (Disagree) “hemisphere” of the parliament. Eventually, the two opposing sides clashed in a heated and often unexpectedly funny debate about the pressing issues. Based on the outcomes of the debate, a final vote was cast, and the proposition was either adopted, abandoned, or modified to reach an agreement. The highlight of the game was when the kids, stacked against their parents, snatched the result to play 1.5 hours a day!

The procedure was designed to achieve goal 2 of the game, to critically evaluate facts and put forward the most convincing argumentation, and goal 3, to learn how democratic debates develop by mimicking the actual rules of parliamentary democracies (albeit in a slightly simplified version).

The game lasted for more than 4 hours, and dozens of participants debated more than seven available topics that touched on different areas of regulatory challenges. Given how much interest and enthusiasm the game has generated, we expect re-editions in the upcoming years.

Join us there!

RGPD: Une maturité sans cesse challengée Conference

The LeADS supervisor Prof. Jessica Eynard is co-organizing a conference titled “RGPD: Une maturité sans cesse challengée” (GDPR: a maturity constantly challenged) at Université Toulouse 1 Capitole – Amphithéâtre Maury, Toulouse, on Friday, October 21, 2022.

Program:

13:30 – Introduction by Prof. Jessica Eynard

13:45 – Introduction by Prof. Reinout Van Tuyll

14:00 – Quelle effectivité des droits de la personne concernée? (How effective are the data subject’s rights?) – Prof. Jessica Eynard and Remi Cauchois

15:00 – Le casse-tête des durées de conservation des données (The puzzle of data retention periods) – Prof. Guillaume Desgens-Pasanau and Dr. Benjamin Laroche

16:00 – Pause

16:30 – Les impossibles (?) transferts de données vers les États-Unis (The impossible (?) data transfers to the United States) – Prof. Cécile de Terwangne and Reinout Van Tuyll

17:30 – Une approche par le risque à renouveler? (A risk-based approach to be renewed?) – Fabien Crozet and Prof. Yves Poullet

18:30 – Closing remarks

For registration, please contact julie.del.jarrit@ut-capitole.fr