Geneva
March 7, 2025

Redefining the Urban Social Contract: Gen Z’s Expectations for Digital Integrity

Sophie Zermatten

In an era where digital interactions shape nearly every aspect of life, how can we redefine the social contract between citizens, governments, and businesses?


This question was at the heart of our 5th Digital Intercity Dialogue, where participants from different Edgelands cities explored how Gen Z navigates today’s hyperconnected world and what they expect from the systems that govern it.

Through this conversation, key themes emerged around choice, control, censorship, and surveillance, underscoring a deep concern for digital integrity, but also a desire to reimagine the power dynamics of the digital landscape.

Living Under Surveillance: A False Sense of Security?

Participants shared first-hand experiences of surveillance in everyday life: facial recognition at airports, AI-driven behavioral tracking in supermarkets, and health insurance apps that monitor daily habits such as physical activity and diet. These technologies promise efficiency and security, yet many expressed discomfort with the trade-offs, especially as data collection becomes more intrusive.

A notable concern was the geopolitical dimensions of surveillance, particularly regarding social media giants like TikTok. While most major tech companies engage in similar data collection practices, certain platforms are scrutinized more than others, often due to political tensions rather than genuine privacy concerns.

Beyond data collection and geopolitical concerns, surveillance also plays a role in enabling censorship. It was pointed out that increased surveillance of digital spaces, often disguised as the need to create safer spaces, raises important questions about freedom of expression and the spread of misinformation.

Despite these anxieties, participants acknowledged a troubling paradox: they and their peers continue to use and depend on platforms they know are abusive, accepting invasive policies without question and signing off on terms and conditions they haven’t read. This silent compliance, they all agreed, has become an unspoken rule of digital engagement.

The Double-Edged Sword of Digital Platforms and Digital Governance

Digital platforms and AI tools present both opportunities and risks, and this tension was evident in global examples raised during the discussion.

In Kenya, for example, activists used AI-powered tools to translate complex legal bills, making information accessible and helping to combat misinformation from public officials. AI also helped index corruption cases, exposing misconduct and holding leaders accountable.

Similarly, social media platforms have facilitated grassroots organizing and public mobilization against government policies. However, the same technologies have been weaponized for repression: they have been deployed to track and intimidate political dissidents, with reports of individuals being abducted or disappearing after government crackdowns.

These cases illustrate a central paradox: AI and social media platforms can enhance democracy, but they can also strengthen authoritarian control.

Participants also reflected on the growing surveillance culture in academic and professional spaces. Cloud-based systems raise questions about how universities, employers, and governments access and use stored data. Some highlighted that researchers investigating sensitive topics face unique vulnerabilities, often lacking the same protections afforded to journalists.

While open-source alternatives with better privacy and security practices exist, they remain difficult to adopt and integrate widely, leaving many users trapped in digital environments they don’t trust but can’t escape.

Rethinking Digital Power

Throughout the conversation, one theme stood out: the need to rethink who holds power in digital spaces. Participants called for more public safe spaces, open conversations, stronger regulations, and alternative digital infrastructures to address growing concerns over corporate overreach, surveillance, and the erosion of digital autonomy.

Among the proposals that emerged:

  • Expand the availability of open-source software and publicly funded alternatives to reduce dependence on corporate-controlled platforms.
  • Governments must create consistent, long-term policies on data privacy and ethical AI frameworks that do not fluctuate with political leadership.
  • AI and digital spaces should be community-driven and serve public needs.
  • The power to shape digital spaces should be decentralized. Citizens should have more control over their digital interactions, whether through increased data privacy rights, access to non-commercial platforms, or the ability to challenge opaque algorithms.
  • Users must have clearer access to information about how their data is collected and used. Platforms should provide simplified, accessible breakdowns of key terms.
  • Digital literacy education should be embedded in school curricula, equipping young people to critically navigate these systems.

The discussion underscored a collective concern: without meaningful intervention, digital spaces will continue to be shaped by those who profit from them, rather than by those who live in them. Reclaiming digital autonomy requires public awareness, policy shifts, and a fundamental rethinking of the digital social contract.