Delete Cookies, Not Digital Rights

Salzburg Global Fellows believe data protection needs to move beyond pop-ups and back to principles

Written by
Ethan Auer
European Governance Lab
Nino Kharabadze
Elisa Alejandra Payan Camargo
Central European University
Alexander Vogt
Council of Europe
A hand on a laptop, choosing to accept or reject a website's cookie policy (Photo Credit: Shutterstock.com/2486814287)

This thought piece was written by Salzburg Global Fellows Ethan Auer, Nino Kharabadze, Alejandra Payan, and Alexander Vogt. They attended the Public Policy New Voices Europe session on "Rebuilding Trust and Cohesion in European Public Policy".

The views expressed in this article are those of these Fellows individually and should not be taken to represent those of Salzburg Global or any organizations to which they are affiliated.

We live in the age of big data. The surge of AI has rekindled the debate on data ownership and use, but the problem with online data collection goes far beyond using content to train AI.

Data is at the core of the success of platforms like Google and Meta, while cybercriminals steal and trade it. In the end, big corporations and criminals alike harvest, and profit from, our data, and that is far from harmless. Data is powerful: even small amounts can reveal a great deal about us. This is not just about personal data in the narrow sense, but about what can be inferred by linking datasets, since seemingly anonymized data can often be combined to identify a person. Almost 90% of people living in the U.S. can be uniquely identified by combining their ZIP code, birth date, and legal sex, while more than half can be identified merely from their city, birth date, and legal sex.
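To see how little it takes, consider a minimal sketch of the linkage step, written here in TypeScript with invented toy data (the field names and values are ours for illustration): count how many rows of an "anonymized" dataset are unique on exactly those three fields, because every unique row can be matched against any named dataset, such as a voter roll, that shares them.

```ts
// Toy linkage-attack sketch: all data and field names are invented.
type Row = { zip: string; birthDate: string; sex: string };

// An "anonymized" dataset: names removed, quasi-identifiers kept.
const rows: Row[] = [
  { zip: "5020", birthDate: "1990-04-12", sex: "F" },
  { zip: "5020", birthDate: "1990-04-12", sex: "F" }, // shares its combination
  { zip: "1010", birthDate: "1985-11-30", sex: "M" },
  { zip: "8010", birthDate: "1973-06-02", sex: "F" },
];

// Count how often each (ZIP, birth date, sex) combination occurs.
const key = (r: Row) => `${r.zip}|${r.birthDate}|${r.sex}`;
const counts = new Map<string, number>();
for (const r of rows) counts.set(key(r), (counts.get(key(r)) ?? 0) + 1);

// A row whose combination occurs exactly once is re-identifiable by joining
// it with any named dataset (e.g. a voter roll) carrying the same fields.
const unique = rows.filter((r) => counts.get(key(r)) === 1).length;
console.log(`${unique} of ${rows.length} rows are unique on (ZIP, birth date, sex)`);
```

Here two of the four rows are unique, and so, in principle, re-identifiable; in Sweeney's U.S. study the share was close to nine in ten.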

Even more strikingly, in 2013, researchers built a model that predicted personal traits from users' Facebook likes. Accuracy varied between 60 and 90%, for traits such as whether a user's parents were still together before the user turned 21, or political party affiliation. All of this goes to show that data is sensitive, and that even seemingly insignificant data points have the potential to de-anonymize us. Not only is our data at risk, but also our autonomy and freedom.

A straightforward solution would be a mass exodus from the online world, but refraining from using the internet is almost impossible today. Whether it is a job hunt, finding an apartment, or accessing government services, we are increasingly bound to the digital sphere, whether we like it or not. While data collection, handling, and storage are crucial issues to tackle, it is equally important that users are aware of what happens to their data and what they consent to when they use digital services. Importantly, the risk lies not only in what data is collected now, but in how the data collected today could be used in the future.

At the European level, the General Data Protection Regulation (GDPR) and the ePrivacy Directive were designed to protect and inform users. In reality, the mandatory cookie notifications that pop up on every website have led to consent desensitization, draining consent of its meaning.

From Harmless Clicks to Real-World Harm: The Importance of Data Security

You might think all of this has little to do with you. After all, being identifiable online may not seem dangerous. But consider the following cases, which show how easily companies can exploit our data, often without us realizing it. It is important to remember that large platforms will always use our information in their own interest unless we demand proper protection.

Between 2013 and 2018, a “fun” personality quiz on Facebook secretly harvested data from users and their friends, which was later sold to a political consulting firm. That information was used to micro-target political messages and influence voting behavior, turning a harmless-looking game into a tool for manipulation.

In 2017, the fitness app Strava published a global “heat map” that unintentionally exposed the locations and routines of military personnel around sensitive bases. Many users were unaware that their workout data could reveal such details, demonstrating how “unintentional” data sharing can have serious security consequences.

A prominent EU case is the 2020 Vastaamo scandal in Finland, where a psychotherapy clinic was hacked and thousands of sensitive patient records were stolen. Attackers later tried to extort both the company and individual patients, showing how failing to protect personal data, especially sensitive health information, can have severe real-world consequences.

These are only snapshots of a much broader problem, one that often goes unnoticed precisely because it is invisible. And yet, at the same time, the EU is considering regulatory changes that could weaken existing protections. The European Commission has proposed delaying key parts of the AI Act and easing GDPR rules, making it easier for tech companies to use personal data to train AI without consent and reducing the need for cookie-consent requests. Civil society groups warn that these changes would weaken digital rights, let companies use more sensitive data, and reduce privacy protections.

These examples show that our personal information carries far more power than we tend to assume. That also means users can have far more power than they are currently granted. The issue is not the absence of data-protection mechanisms, but the lack of public awareness of how personal information can be misused. That is why raising awareness and teaching users how to protect themselves is essential.

Meaningful Consent in the Age of Data

It is time to restore genuine consent in digital spaces through a system that is simple and transparent. Giving consent should not require a law degree; every user must be able to understand what lies behind the “accept all” button. Current cookie-consent banners are often excessively long and filled with jargon. Instead, websites and platforms should provide short, clear summaries that tell users exactly what data is collected, why, how long it is stored, who owns it, and for what purpose it is used. In short: no euphemisms and no buried disclosures. Users should still be able to access the full text of what they are agreeing to, but the crucial first interaction must be transparent, simple, and understandable.

Our idea is a two-step consent model: the first step is a concise executive summary; the second is an optional detailed section for those who wish to examine further what they are consenting to. For this process to truly benefit users, manipulative design features, such as pre-accepted cookies or the absence of a “deny” button on the initial pop-up, must be banned. Such deceptive design erodes user trust by making websites and companies appear opaque. And even if transparency were not an issue, who would willingly spend the equivalent of a work week every month reading the privacy policy of each website they visit or app they use? Nobody should have to spend half their life wading through tedious data policies and terms and conditions.
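As a rough illustration of what the first step could contain, here is a minimal sketch in TypeScript. The field names and the example site are invented rather than an existing standard, and a real banner would render an interface rather than console text; the point is only that the entire first screen fits in a handful of plain sentences.

```ts
// Minimal sketch of a first-step consent summary; all names here are
// illustrative, not an existing standard or library.
interface ConsentSummary {
  collected: string[];   // what data is gathered
  purpose: string;       // why it is gathered
  retention: string;     // how long it is stored
  controller: string;    // who is responsible for it
  sharedWith: string[];  // who else receives it
  fullPolicyUrl: string; // step two: the complete text, on request only
}

// Render the executive summary: short, concrete, no euphemisms, and with
// "Accept" and "Deny" offered as equal choices (nothing pre-ticked).
function renderSummary(s: ConsentSummary): string {
  return [
    `We collect: ${s.collected.join(", ")}`,
    `Why: ${s.purpose}`,
    `Kept for: ${s.retention}`,
    `Responsible: ${s.controller}`,
    `Shared with: ${s.sharedWith.length > 0 ? s.sharedWith.join(", ") : "no one"}`,
    `[ Accept ]  [ Deny ]   Full policy: ${s.fullPolicyUrl}`,
  ].join("\n");
}

// A hypothetical news site's banner.
console.log(renderSummary({
  collected: ["pages visited", "device type"],
  purpose: "measuring which articles are read",
  retention: "90 days",
  controller: "example-news.eu",
  sharedWith: [],
  fullPolicyUrl: "https://example-news.eu/privacy",
}));
```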

For people to understand what is at stake, the rights to privacy and data protection must be taught as fundamentals of digital literacy. Schools and workplaces must integrate short modules on online privacy, consent, and personal data rights. People need to understand the implications of data collection and selling. The internet is not neutral territory: with every click and like, we position ourselves and contribute to the making of the world online (and offline). As users, we might feel that decisions rest with those who build our platforms, but we hold power through our consumption.

Making Digital Consent Work

Restoring the meaning of digital consent, and then working towards reclaiming it, also requires policy change at the European level. As it stands (especially after recent back-pedalling), the EU's data protection framework offers much in principle but little consistency in practice. “Consent” is meant to be informed and freely given, yet the way privacy notices are presented varies significantly from one platform to another, leading to unnecessary confusion and, ultimately, growing desensitization. As a first fix, a standardized consent format should be mandated across platforms and websites. Harmonization of this kind would help ensure that “choice” means the same thing across the web.
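One way to picture the payoff of harmonization: if every site declared its processing purposes in one shared, machine-readable shape, a browser or extension could apply a user's stored preference everywhere, instead of renegotiating it banner by banner. The sketch below assumes a hypothetical schema; neither it nor the example sites exist.

```ts
// Hypothetical standardized declaration: not an existing schema.
type Purpose = "essential" | "analytics" | "advertising";

interface ProcessingDeclaration {
  site: string;
  purposes: Purpose[]; // every purpose the site processes data for
}

interface UserPreference {
  allowedPurposes: Purpose[]; // decided once, applied everywhere
}

// With one shared format, the same stored preference yields the same
// answer on every site: this is what "choice" meaning the same thing
// across the web would look like in practice.
function decide(d: ProcessingDeclaration, p: UserPreference): "accept" | "deny" {
  return d.purposes.every((x) => p.allowedPurposes.includes(x))
    ? "accept"
    : "deny";
}

const pref: UserPreference = { allowedPurposes: ["essential", "analytics"] };
console.log(decide({ site: "a.example", purposes: ["essential"] }, pref));                // accept
console.log(decide({ site: "b.example", purposes: ["essential", "advertising"] }, pref)); // deny
```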

Disclosure should be pushed further as well. When handling personal data gathered in what is increasingly a digital public sphere, companies should clearly disclose what data is stored where, by whom, why, and for how long. The GDPR already mandates much of this, but the obligations are typically met through privacy notices that are all but unreadable for ordinary users. Improving disclosure practices will help reclaim users' rights by enabling real understanding.

Beyond the reactive practice of doling out fines for infringements, regulators must use the powers they already have by stepping up enforcement. Mandatory routine, independent audits by public authorities or certified third-party bodies would make compliance part of daily business for companies, rather than a pesky afterthought.

Given that it is ultimately in companies' own interest to restore their trustworthiness in the eyes of consumers, ethical considerations must become part and parcel of corporate governance. The GDPR requires organizations whose core activities involve large-scale processing of personal data to appoint a Data Protection Officer. Beyond extending this requirement to all companies handling personal data, it is worth considering, given the far-reaching societal impact of data practices, whether the duties of such officers should be broadened, or whether companies should additionally appoint “data ethics officers.” Such officers would assess practices against broader notions of fairness and bias, monitor consent and privacy practices internally to detect potential misuse early, and report regularly to authorities and the wider public.

Ethan Auer

Ethan Auer is a value-driven policy advocate and researcher specializing in inequality, social justice, and civil society. With an interdisciplinary approach at the intersection of philosophy, economics, and politics, Ethan has expertise in migration, class, environment, gender, and European policy and institutions. He holds a bachelor's degree in Philosophy, Politics, and Economics and a master's degree in Political Science from the Vrije Universiteit Amsterdam. Most recently, he completed a research master's degree in Philosophy at Utrecht University and currently serves as a Migration Policy Research Intern for the European Governance Lab.

Nino Kharabadze

Nino Kharabadze is a former intern at the U.S. Embassy in Georgia and Transparency International - Georgia. With a strong foundation in international affairs, governance, and policy analysis, her academic and professional experiences have shaped her commitment to developing effective and impactful public policy solutions. She holds a bachelor's degree in International Relations and a master's degree in Conflict and Developmental Studies.

Elisa Alejandra Payan Camargo

Alejandra is a BA student at Central European University in the multidisciplinary program Culture, Politics, and Society, majoring in Political Science and Human Rights. Her academic interests extend to migration studies, gender-based violence, women's rights, human rights, peacebuilding, and democracy. Alejandra has over four years of experience in different roles at international organizations focused on women's rights and social justice. Most recently, as a participant in GenderSAFE, she has contributed to designing policies addressing gender-based violence in academia, both within her institution and as part of broader efforts to inform policy at the EU level.

Alexander Vogt

Alexander Vogt currently works as a policy officer at the Council of Europe in Strasbourg, France, where he is part of the secretariat of the newly created intergovernmental Steering Committee on Democracy (CDDEM). He holds a Master of Science in International Politics from KU Leuven, Belgium, and a Master of Arts in Political Science from the University of Vienna, Austria, as well as Bachelor of Arts degrees in Philosophy and in Political Science, also from the University of Vienna. His academic and professional interests include the values and cultural aspects underlying democracy, processes of democratisation, and the challenges of democratic backsliding and erosion.
