DIGITAL PUBLIC INFRASTRUCTURE
Community Privacy, Agency, and Consent
ISSUE 1: THE UN/MAKING ISSUE

Big tech platforms regularly use, sell, or exploit our data, with few, if any, options to exercise agency or say no. Meta and Reddit use our content to train AI; OpenAI and Midjourney train their generative models on copyrighted materials. Our data is also often used in ways that benefit law enforcement, surveillance, and defense: Israel using WhatsApp data in the targeted killing of Palestinians; Telegram handing user data to police; even Pokemon Go being sold to the Saudi government.
Most of us are not aware of the extent to which our own information is shared or used against us without our consent. Car companies, including General Motors, Kia, Subaru, and Mitsubishi, sell our driving data to insurance companies, resulting in increased premiums and insurance denials; therapy platforms, including BetterHelp and Talkspace, sell sensitive mental health data to social media platforms and advertisers; period tracking app Flo shared the health data of millions of users with third-party advertisers such as Facebook and Google.
In addition to advertisers, data brokers also sell your information to scammers, law enforcement (particularly ICE), health insurance companies, banks, employers, and more. This practice of non-consensual sharing now often results in "online redlining," with many people denied housing and employment because their information, often inaccurate, is shared against their will. As an example: data broker Gravy Analytics harvests data from games like Candy Crush, dating apps like Tinder, and even pregnancy tracking and religious prayer apps, then sells your location, through its subsidiary Venntel, to clients including Immigration and Customs Enforcement, Customs and Border Protection, the IRS, the FBI, and the Drug Enforcement Administration.
Furthermore, any company that can be subpoenaed can be compelled to hand your unencrypted data over to law enforcement. These practices exacerbate the risk of retribution and attack against vulnerable communities, including immigrants, domestic violence survivors, abortion seekers, racial and gender minorities, and more. Meanwhile, the rise of facial recognition and travel surveillance continues to harm immigrants.
Currently, few tools and protections exist against these forms of surveillance capitalism. Even relying on cultural or legal frameworks for consent means violations can still occur, with consequences only ever following after harm has already been perpetrated.
What if privacy, agency, and consent could be fundamentally integrated into digital public infrastructure?
Perhaps consent starts with holding agentic control of your own, or our own, data.
What is Consent?
Consent can be defined as “agreement or permission expressed through affirmative, voluntary words or actions that are mutually understandable to all parties involved.” There are a variety of frameworks for consent: cultural frameworks include Planned Parenthood’s FRIES framework (Freely given, Reversible, Informed, Enthusiastic, Specific), and a similar framework posed by The Intimacy Directors and Coordinators organization (CRISP: Considered, Reversible, Informed, Specific, Participatory). In community governance, sociocracy proposes consent decision making, a framework for group facilitation and collective decision-making. Existing institutional frameworks for consent include: informed consent in medicine, where patients can make informed and voluntary decisions about a given procedure and may revoke consent at any time; and affirmative consent, under Title IX in federal law, which promotes consent as a “knowing, voluntary, and mutual decision among all participants to engage in sexual activity,” which can also be withdrawn at any time.
Digital Frameworks for Consent
There are a handful of existing frameworks for establishing consent digitally, each of which carries its own points of interest and drawbacks. The Digital Impact Alliance, for instance, describes consent mechanisms in digital public infrastructure as meaningful and informed; names considerations around data exchange, privacy-preserving metadata, access controls, and accessibility and inclusion; and recommends policy support for enforcing consent in digital infrastructure.
US laws such as California's Privacy Rights Act (CPRA) and Virginia's Consumer Data Protection Act (VCDPA) follow the opt-out model: consent is assumed by default, and individuals must take an affirmative action to refuse, withdraw, or decline to participate.
By contrast, in the European Union, the General Data Protection Regulation (GDPR) requires websites to ask for explicit consent before storing cookies, which can contain personally identifiable information and other data. The EU's GDPR and Brazil's Lei Geral de Proteção de Dados (LGPD) follow the opt-in model, which requires an affirmative action of giving consent online, defined as: freely given, informed, specific, unambiguous, and documented. Under the opt-in model, consent must be explicitly given before any data is collected.
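The difference between the two defaults can be sketched in a few lines of code. This is a minimal illustration, not drawn from any statute; the class and method names are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Tracks whether a person has consented to a given data use."""
    opt_in_default: bool                           # True: opt-in regime; False: opt-out regime
    decisions: dict = field(default_factory=dict)  # purpose -> explicit True/False decision

    def may_process(self, purpose: str) -> bool:
        # Opt-in: processing is allowed only after explicit consent.
        # Opt-out: processing is allowed until consent is withdrawn.
        if purpose in self.decisions:
            return self.decisions[purpose]
        return not self.opt_in_default

    def give(self, purpose: str) -> None:
        self.decisions[purpose] = True

    def withdraw(self, purpose: str) -> None:
        self.decisions[purpose] = False

# Opt-out regime (CPRA/VCDPA-style): consent is assumed until withdrawn.
opt_out = ConsentRecord(opt_in_default=False)
assert opt_out.may_process("ads") is True
opt_out.withdraw("ads")
assert opt_out.may_process("ads") is False

# Opt-in regime (GDPR/LGPD-style): nothing is allowed until consent is given.
opt_in = ConsentRecord(opt_in_default=True)
assert opt_in.may_process("ads") is False
opt_in.give("ads")
assert opt_in.may_process("ads") is True
```

The asymmetry is entirely in the default branch: under opt-out, silence counts as consent; under opt-in, silence counts as refusal.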
While these examples are non-exhaustive, new frameworks for digital consent continue to emerge. Further policy regulations for data privacy are needed to keep up with the rapid pace of data production and AI adoption today.
Consentful Technology
A growing movement towards consensual software provides an alternative framework for digital infrastructure. The Consentful Tech Project advocates for and provides resources on digital consent and consentful technologies, building on frameworks of consent culture, design justice, digital justice, and community technology; recommends technical mechanisms such as differential privacy, homomorphic encryption, and decentralization; and advocates for community-based strategies for addressing harm in digital spaces. The project asks, what does consent mean for digital bodies?
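One of the technical mechanisms named above, differential privacy, can be sketched concretely. The following is a minimal illustration of the standard Laplace mechanism for a counting query (the function name and parameters are chosen for this example, not taken from any library):

```python
import math
import random

def dp_count(true_count: int, epsilon: float = 1.0) -> float:
    """Release a count plus Laplace noise of scale 1/epsilon.

    This is the classic Laplace mechanism: a counting query changes by at
    most 1 when any one person's data is added or removed (sensitivity 1),
    so noise of scale 1/epsilon yields epsilon-differential privacy.
    """
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    # Inverse-CDF sampling of a Laplace(0, 1/epsilon) random variable.
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# A community can publish a useful aggregate without exposing any member:
noisy = dp_count(100, epsilon=1.0)
```

Smaller values of epsilon add more noise and give stronger privacy; the aggregate stays statistically useful while no single member's presence in the data can be confidently inferred.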

Personal Data Stores
Personal data stores involve storing data primarily on secure and/or decentralized personal servers. Entities—individuals, organizations, etc.—can choose what data to share, with whom, and can also revoke access to any of these data slices at any time.
This flips existing data storage architecture, based on centralization and large depersonalized servers, on its head: instead of our data being stored in centralized servers outside of our control, our data becomes fundamentally ours to manage, with agency and choice over how this information can be accessed.
One example is Solid, founded by Tim Berners-Lee, the inventor of the World Wide Web. Solid is an open standard with the aim of giving people “more agency over their data,” is being explored by the BBC, and is a part of a growing shift towards data sovereignty in digital infrastructure.
How can privacy-preserving personal data stores be extended into community data stores? Why would community data stores matter?
Collective Data Privacy
Importantly, the economic value of data comes less from any one individual than from data in bulk; our data is extracted and exploited through collection and aggregation, so we must respond by determining how we will collect, store, and govern our own data together. As such, movements towards collective data propose treating data less as an individual asset, and more as a collective resource. Emerging models for community-stewarded data, such as data co-ops, data coalitions, data trusts, data unions, data commons, and more, introduce new possibilities for collective governance and stewardship, with the potential to redistribute power away from tech monopolies and towards communities.
Open Data Manchester proposes a few frameworks for consent interfaces within data cooperatives:
- Granular consent gives every member the option to exercise agency over what type of data is shared, with whom, and in what form; while this grants fine-grained choice, it can also be burdensome to both the individual and the collective.
- Persona/archetype-based permissions allow co-op members to adopt a predefined set of guidelines or behaviors, reducing the burden of choice.
- Traffic light consent lets members categorize pieces of data as permissive (green), controlled (yellow), or restricted (red), with each category defining how a piece of data is used, or whether it needs further consideration.
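The traffic light scheme in particular maps naturally onto a small policy check. The field names and categories below are hypothetical, invented for this sketch:

```python
from enum import Enum

class Light(Enum):
    GREEN = "permissive"    # may be shared under the co-op's standing rules
    YELLOW = "controlled"   # requires case-by-case review before sharing
    RED = "restricted"      # never shared

# A hypothetical member's labelling of their own data fields.
labels = {
    "step_count": Light.GREEN,
    "location": Light.YELLOW,
    "health_history": Light.RED,
}

def may_release(field_name: str, reviewed: bool = False) -> bool:
    """Decide whether a labelled field may be shared with a third party."""
    light = labels[field_name]
    if light is Light.GREEN:
        return True
    if light is Light.YELLOW:
        return reviewed  # only after further consideration
    return False         # RED is restricted outright

assert may_release("step_count") is True
assert may_release("location") is False
assert may_release("location", reviewed=True) is True
assert may_release("health_history", reviewed=True) is False
```

Three coarse categories keep the per-member burden far below fully granular consent while still preserving a meaningful veto over sensitive data.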
Data is relational; so, too, is privacy. For example, if my family member submits genetic data to 23andMe, they’re also submitting genetic information about me, whether I consent to it or not. Social media sites such as Facebook use social network information and contact lists to create shadow profiles of users who are not even members of the site, making it almost impossible to opt-out of this type of non-consensual data collection. Privacy, therefore, is more than just an individual choice; privacy becomes a collective responsibility.
It is important to note here that inclusive and equitable digital public infrastructure requires holistic considerations from and for marginalized communities, along with deliberate efforts towards digital enfranchisement.
Community Co-Design and Accessibility
In addressing holistic community needs, participatory co-design, or design by and with communities, prioritizes the voices of those most directly impacted throughout the design process. Similar design frameworks that center the needs of marginalized communities include: inclusive design, which emphasizes accessibility to people of all abilities, cultures, genders, ages, and backgrounds; design justice, which explores design processes led by those who are historically marginalized and centers the voices of those most directly impacted; and design from the margins (DFM), which similarly centers communities most impacted and marginalized from beginning to end.
Usability is also important in promoting the adoption of privacy-preserving alternatives over existing extractive technologies. For example, an iPhone user messaging an Android user may find Signal's cross-platform interface more enjoyable to use than SMS. By making the privacy-preserving option more usable and convenient than the extractive alternative, privacy and security become easier for communities to adopt.
Co-creating fundamentally new infrastructures for community privacy, agency, and consent will take a collective effort across a broad range of domains, including technologists, designers, researchers, community organizers, policymakers, and more. True digital public infrastructure for community privacy, agency, and consent calls for holistic, integrated work on all fronts, for all communities: in technology, building infrastructure for consent mechanisms, collective data governance, and collective privacy; in culture, shifting towards consent culture and supporting solidarity economies, community organizing, and collective action; and in policy, advocating for equitable civic infrastructure, legal protections, and data privacy regulations.
Special thanks to Jess Zhou, Ying Tong Lai, and Nick Sweeting for their thoughtful feedback and support on this piece.