Special Edition Blog Series on PhD Abstracts (Part IV)

This post is a continuation of the blog post series on PhD abstracts. You can find the first part of the series here.

 

Barbara Lazarotto: Business-to-Government Data Sharing in the EU and the Protection of Personal Data: Making Sense of a Complex Framework.

Data is a crucial resource that plays an essential role in the economy and society. Yet, due to market failures, data has often been treated as a commodity and held in silos by a few actors, often large companies. In light of recent developments, there has been growing discussion about moving data out of the exclusive control of certain groups and making it accessible for public use. The European Union has taken a step in this direction by introducing the “European Data Strategy”, a set of rules and regulations that, among other objectives, aims to make it easier for stakeholders to share data among themselves and with governments. However, this regulatory framework, which includes different modalities of business-to-government data sharing, is fairly new, and how its instruments work together remains to be seen, since many of them may overlap and potentially contradict one another.

Against this backdrop, there is a pressing need to analyze the current legal and regulatory landscape for business-to-government data sharing in the EU, how its instruments interact with one another, and their possible consequences for the rights of data subjects. The analysis will delve into the complexities of the regulatory conundrum associated with business-to-government data sharing and explore whether the current framework effectively addresses data subjects’ data protection rights as enshrined in the GDPR. Ultimately, this research aims to provide a comprehensive understanding of the legal and regulatory landscape for business-to-government data sharing and its connections with data subjects’ rights.

Fatma Dogan: Navigating the European Health Data Space: A Critical Analysis of Transparency Implications in Secondary Data Use under GDPR.

This thesis aims to critically examine the European Health Data Space (EHDS) proposal, with a specific focus on its secondary use framework and the implications of the transparency requirements of the General Data Protection Regulation (GDPR). The research delves into the intricate intersection of EHDS provisions, GDPR transparency requirements, and the proportionality principle. In this context, it will explore whether a rights-based approach to privacy regulation still suffices to address the challenges triggered by new data processing techniques such as the secondary use of data. The GDPR’s rights-based approach grants individuals a set of rights, and the obligation to provide transparency is one of them. However, it remains highly unclear how data subjects could exercise these rights under the EHDS secondary use framework.

Xengie Doan: Tools and Methods for User-Centered, Legal-Ethical Collective Consent Models: Genomic Data Sharing.

Health data is sensitive and sharing it carries many risks, which is especially true for genetic data. One’s genome might indicate physical or health risks that could be used for more personalized healthcare, or for personalized insurance premiums. These risks affect not only the individual who initially consented to the collection and sharing, but also those who may be identified from the DNA, such as genetic relatives or those who share a genetic mutation. How can the relevant individuals come together to consent to genetic data sharing? Collective consent stems from indigenous bioethics, where indigenous tribes fought for their right to consent to biomedical research as a community, not just as individuals. It has been used in research partnerships with indigenous groups to improve stakeholder involvement instead of treating indigenous populations as test subjects. Though it has been proposed, no digital collective consent (wherein multiple individuals consent via different governance structures, such as families or tribal leaders) exists for the general public. Challenges span legal-ethical issues and technical properties such as transparency and usability. In order to build digital collective consent that meaningfully addresses real-world challenges, this work uses genetic data sharing as a use case to better understand which tools and methods can enable user-friendly, transparent, and legal-ethically aware collective consent. I conducted a theoretical and empirical study on collective consent processes for health data sharing. First, I explored the privacy and biomedical gaps in collective consent, as it has not been implemented widely outside of indigenous populations. I then surveyed user goals and attitudes towards engaging elements within different consent mediums, and analyzed the transparency and user-relevancy of policies from notable DTC genetic testing companies to identify gaps. Last, I validated the framework for transparent, user-centered collective consent through a use case with a company in Norway.

LeADS Organises Innovation Challenge and Final Conference on Legally compliant data-driven society

Pisa, October 2024—The LeADS Project (Legality Attentive Data Scientists) hosted its final three-day meeting at the LeADS beneficiary Sant’Anna School of Advanced Studies. The event featured a diverse range of activities, from an innovation challenge on the AI Act to intensive panel discussions.

The event opened with an innovation challenge on the AI Act. External participants of the challenge needed to create a solution that helps developers or deployers of AI systems navigate the AI Act’s risk classification system and understand which requirements apply to them.
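To make the task concrete, the sketch below shows one very simplified way such a tool could be structured: a rule-of-thumb mapping from a few self-reported attributes of an AI system to the AI Act’s broad risk tiers. It is purely illustrative and not any participating team’s actual solution; the attribute names are assumptions made only for this sketch.

```python
# Illustrative sketch only: a toy mapping from self-reported attributes of an
# AI system to the AI Act's broad risk tiers. Not any challenge team's solution;
# the attribute names below are assumptions made for illustration.
from dataclasses import dataclass

@dataclass
class AISystemProfile:
    uses_prohibited_practice: bool   # e.g. social scoring by public authorities (Art. 5)
    in_annex_iii_area: bool          # e.g. employment, credit scoring, law enforcement
    interacts_with_humans: bool      # e.g. chatbots, AI-generated content

def risk_tier(profile: AISystemProfile) -> str:
    """Return the broad AI Act risk tier suggested by the profile."""
    if profile.uses_prohibited_practice:
        return "prohibited practice (Article 5): may not be placed on the market"
    if profile.in_annex_iii_area:
        return "high-risk (Annex III): the high-risk requirements apply"
    if profile.interacts_with_humans:
        return "limited risk: transparency obligations apply"
    return "minimal risk: no specific obligations under the AI Act"

# Example: a CV-screening assistant with a conversational interface
print(risk_tier(AISystemProfile(uses_prohibited_practice=False,
                                in_annex_iii_area=True,
                                interacts_with_humans=True)))
```

A real tool would of course need far more granular questions, for instance distinguishing Annex I product-safety cases from Annex III use cases and handling exemptions, but this is roughly the decision structure the challenge asked participants to make navigable.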

The third day focused on the final LeADS Conference, named “Legally compliant data-driven society,” which explored how a multidisciplinary approach to governance can reap the benefits of new technologies while guaranteeing fundamental rights and freedoms. The conference had outstanding speakers across three panels, each addressing critical facets of a data-driven society.

The first panel included an introduction by Giovanni Comandé from Sant’Anna School of Advanced Studies, followed by a keynote from Giovanni Pitruzzella, Judge at the Italian Constitutional Court, who discussed regulatory challenges and opportunities in data markets. Giuseppe Turchetti of Sant’Anna School then explored innovation ecosystems fueled by data, while Antonio Buttà from the Italian Competition Authority reflected on the evolving competition landscape shaped by data flows. The panel concluded with a lively discussion on fostering innovation while preserving market fairness.

The second panel focused on the topic of “Research and Secondary Use of Data.” Comandé introduced the session, followed by Paul de Hert from Vrije Universiteit Brussel, who addressed the ethical and legal frameworks supporting secondary data use in research. Regina Becker of Luxembourg National Data Service presented a European perspective on data stewardship, and Piotr Drobek from Poland’s Personal Data Protection Office (UODO) emphasized the challenges of privacy in secondary data applications.

Finally, the third panel explored the topic of “Data Society and Technological Sovereignty,” and featured an introduction by Michelle Sibilla from Université Toulouse III. Jorge Maestre Vidal from Indra Digital Labs explored the relationship between data sovereignty and security, while Giovanni Comandé provided insights on the legal implications of emerging data technologies. Nicola Lattanzi from IMT School for Advanced Studies Lucca concluded the panel with a reflection on how data policies can foster technological independence.

During the engaging three-day event, the LeADS ESRs (early-stage researchers) had the opportunity to participate in a productive and enlightening discussion. The conversation emphasized the crucial need to harmonize technological progress with the fundamental principles of sovereignty and security.

 

Contact details:

Veronica Virdis, LeADS Project Manager

pm@legalityattentivedatascientists.eu

 

Special Edition Blog Series on PhD Abstracts (Part III)

This post is a continuation of the blog post series on PhD abstracts. You can find the first part of the series here.

Mitisha Gaur: Re-Imagining the Interplay Between Technical Standards, Compliances and Legal Requirements in AI Systems Employed in Adjudication Environments Affecting Individual Rights

The doctoral thesis investigates the use of AI technology in automated decision-making systems (ADMS) and the subsequent application of these ADMS within public authorities as Automated Governance systems, in their capacity as aides for dispensing public services and conducting investigations pertaining to taxation and welfare-benefits fraud. The thesis identifies Automated Governance systems as sociotechnical systems comprising three primary elements: social (workforce, users), technical (AI systems and databases) and organisational (public authorities and their internal culture).

Fuelled by this sociotechnical understanding of Automated Governance systems, the thesis’ investigation is conducted through three primary lenses – Transparency, Human Oversight and Algorithmic Accountability – and their effect on the development, deployment and subsequent use of Automated Governance systems. Further, the thesis investigates five primary case studies against the policy background of the EU High-Level Expert Group’s Ethics Guidelines for Trustworthy AI and the regulatory backdrop of the AI Act (and, on occasion, the GDPR).

Finally, the thesis concludes with observed gaps in the ethical and regulatory governance of Automated Governance systems and recommends core areas of action, such as ensuring adequate agency for the decision subjects of AI systems and enforcing contextual clarity within AI systems deployed in high-risk scenarios such as Automated Governance, and it advocates strict ex-ante and ex-post requirements for the developers and deployers of Automated Governance systems.

Maciej Zuziak: Threat Detection and Privacy Risk Quantification in Collaborative Learning

This thesis compiles research at the intersection of privacy, federated learning and data governance to address numerous issues concerning the functioning of decentralised learning systems. The first chapters introduce an array of issues connected with European data governance, followed by an introduction of Data Collaboratives – a concept built upon common management problems that serves as a generalization of numerous approaches to collaborative learning discussed in recent years. The subsequent work presents the results of experiments conducted on selected problems that may arise in collaborative learning scenarios, mainly concerning threat detection, the quantification of clients’ marginal contributions, and the assessment of re-identification attack risk. It formalizes the problem of marginal contribution quantification, introducing formal notions of Aggregation Masks and a Collaborative Contribution Function that generalizes many existing approaches, such as the Shapley Value. In relation to that, it presents an alternative solution to the problem in the form of Alpha-Amplification functions. The contribution analysis is tied back to threat detection, as the experimental section explores using Alpha-Amplification as an experimental method for identifying possible threats in the pool of learners. The formal privacy issues are explored in two chapters dedicated to spoofing attacks in Collaborative Learning and to the correlation between such attacks and membership inference attacks, since a lack of correlation would imply that similar (deletion-based) metrics are safe to employ in the Collaborative Learning scenario. The last chapter is dedicated to selected compliance issues that may arise in the previously presented scenarios, especially those concerning hard memorization by the models and consent withdrawal after training completion.
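For readers less familiar with contribution quantification, the sketch below shows the classic Shapley-value computation of each client’s marginal contribution to a collaborative learning task, one of the existing approaches the abstract says the Collaborative Contribution Function generalizes. It is not the thesis’ own method, and the coalition_utility stub is a hypothetical stand-in for, e.g., the validation accuracy of a model aggregated from a given coalition of clients.

```python
# Minimal sketch of Shapley-value contribution quantification in collaborative
# learning. Purely illustrative; `coalition_utility` is a hypothetical stand-in
# for evaluating a model trained or aggregated on a coalition of clients.
from itertools import combinations
from math import factorial

def coalition_utility(coalition: frozenset) -> float:
    """Toy utility of a coalition, e.g. validation accuracy of the aggregated model."""
    toy_scores = {0: 0.4, 1: 0.3, 2: 0.1}  # illustrative per-client signal
    return sum(toy_scores[c] for c in coalition)

def shapley_values(clients: list) -> dict:
    """Exact Shapley value per client: its utility gain averaged over all
    possible orderings of clients (exponential in the number of clients)."""
    n = len(clients)
    values = {c: 0.0 for c in clients}
    for client in clients:
        others = [c for c in clients if c != client]
        for k in range(len(others) + 1):
            for subset in combinations(others, k):
                s = frozenset(subset)
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                marginal = coalition_utility(s | {client}) - coalition_utility(s)
                values[client] += weight * marginal
    return values

if __name__ == "__main__":
    # With an additive toy utility the Shapley values equal the per-client scores.
    print(shapley_values([0, 1, 2]))  # roughly {0: 0.4, 1: 0.3, 2: 0.1}
```

Because exact Shapley computation scales exponentially with the number of participants, approximations and alternative formulations (such as the Alpha-Amplification functions mentioned above) become attractive in practice.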

PUBLIC PRESENTATION – INNOVATION CHALLENGE “AI Act Compass: Navigating Requirements for High-Risk AI Systems”

PISA – 10 OCTOBER 2024

1  CHALLENGE

7  TEAMS FROM ALL OVER EUROPE

7  INNOVATIVE IDEAS

1   WINNER (OR MAYBE 3)!

Join us to discover the 7 innovative solutions that will help developers or deployers of AI systems to navigate the risk classification system of the AI Act.

The EU project “LeADS – Legality Attentive Data Scientists – GA 956562”, in collaboration with the Pisa Internet Festival, is happy to invite you to attend the 7 presentations and discover which team will find the BEST solution to the Innovation Challenge “AI Act Compass: Navigating Requirements for High-Risk AI Systems” and win 2.500 €.

WHERE

Sala Kinzica – Officine Garibaldi, Via Vincenzo Gioberti 39, Pisa, Italy

WHEN

10 October 2024

16.00-18.00 presentations

19.00  Winners Announcement

 

 

LeADS Final Conference: Legally compliant data-driven society

11th of October 2024

Aula Magna – Sant’Anna School of Advanced Studies  

Piazza Martiri della Libertà 33, Pisa

Free Event – Organized in the framework of the Pisa Internet Festival 

Data drive our societies, opening them up to new technological solutions and scientific discoveries. Data create new market opportunities as well as new challenges, including to security. These processes require a multidisciplinary approach to governance, one able to reap the benefits of data while guaranteeing fundamental rights and freedoms. The LeADS final conference tackles this task in three key domains with its outstanding speakers.

Panel 1:  12.00 – 13.30 Data-driven Markets and Innovation

12.00 – 12.05 Giovanni Comandé – Sant’Anna School of Advanced Studies: Introduction
12.05 – 12.25 Giovanni Pitruzzella – Italian Constitutional Court
12.25 – 12.45 Giuseppe Turchetti – Sant’Anna School of Advanced Studies
12.45 – 13.05 Antonio Buttà – Autorità Garante della Concorrenza e del Mercato
13.05 – 13.30 Discussion

 

Panel 2: 14.00 – 15.30 Research and secondary use of data

14.00 -14.05 Giovanni Comandé SSSA: Introduction
14.05 – 14.25 Paul de Hert – Vrije Universiteit Brussel
14.25 – 14.45 Regina Becker – Luxembourg National Data Service (LNDS)
14.45 – 15.05 Piotr Drobek – UODO – Personal Data Protection Office of Poland
15.05 – 15.30 Discussion

Panel 3: 16.00 – 17.30 Data Society and Technological Sovereignty / Security

16.00 – 16.05 Michelle Sibilla – Université Toulouse III – Introduction
16.05 – 16.25 Jorge Maestre Vidal – Indra Digital Labs
16.25 – 16.45 Giovanni Comandé – SSSA
16.45 – 17.05 Nicola Lattanzi – IMT Scuola Alti Studi di Lucca
17.05 – 17.30 Discussion

Registration form