DIGITAL PUBLIC
INFRASTRUCTURE
Community Privacy,
Agency, and Consent
“Most of our fears or anxieties about technology are best understood as fears or anxiety about how capitalism will use technology against us.” — Ted Chiang, from an interview with Ezra Klein, 2021
Most of us are not aware of the extent to which our own information is shared or used against us without our consent. Car companies, including General Motors, Kia, Subaru, and Mitsubishi, sell our driving data to insurance companies,6 resulting in increased premiums and insurance denials; therapy platforms, including BetterHelp and Talkspace, sell sensitive mental health data to social media platforms and advertisers;7 and period tracking app Flo shared the health data of millions of users with third-party advertisers such as Facebook and Google.8
In addition to advertisers, data brokers also sell your information9—particularly to ICE,10 health insurance companies,11 and banks.12 This practice of non-consensual sharing now often results in “digital redlining,” with many people denied housing and employment as a result of their information13—often inaccurate14—being shared against their will. As an example, data broker Gravy Analytics uses data from games like Candy Crush, dating apps like Tinder, and even pregnancy tracking and religious prayer apps, to track and sell your location—through its subsidiary, Venntel—to clients including Immigration and Customs Enforcement, Customs and Border Protection, the IRS, FBI, and the Drug Enforcement Administration.15
Furthermore, any company that can be subpoenaed can make your unencrypted data available to law enforcement. These practices exacerbate risks of retribution and attack against vulnerable communities,16 including immigrants,17 domestic violence survivors,18 abortion seekers,19 racial and gender minorities,20 and more. Meanwhile, the rise of facial recognition and travel surveillance only continues to harm immigrants.21
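Encryption is one of the few available self-defenses here: if data is encrypted on a person’s device before it is uploaded, and the key never leaves that device, a subpoenaed provider can produce only unreadable ciphertext. A toy sketch using Node’s built-in crypto module (illustrative only; real systems need careful key management and authenticated protocols):

```typescript
// Toy client-side encryption: the provider stores only ciphertext,
// so a subpoena to the provider cannot yield readable data.
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

const key = randomBytes(32); // 256-bit key, kept on the user's device

function encrypt(plaintext: string) {
  const iv = randomBytes(12); // fresh nonce per message
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data }; // safe to hand to a server
}

function decrypt(box: { iv: Buffer; tag: Buffer; data: Buffer }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, box.iv);
  decipher.setAuthTag(box.tag); // tampered ciphertext fails to decrypt
  return Buffer.concat([decipher.update(box.data), decipher.final()]).toString("utf8");
}
```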
Currently, few tools and protections exist against these forms of “surveillance capitalism.”22 Even relying on cultural or legal frameworks for consent means violations can still occur, with consequences arriving only after harm has already been done.
What if privacy, agency, and consent could be fundamentally integrated into digital public infrastructure?
Perhaps consent starts with holding agentic control of your own, or our own, data.
What is Consent?
Consent can be defined as “agreement or permission expressed through affirmative, voluntary words or actions that are mutually understandable to all parties involved.”23 There are a variety of frameworks for consent. Cultural frameworks include Planned Parenthood’s FRIES framework (Freely given, Reversible, Informed, Enthusiastic, Specific)24 and a similar framework proposed by the Intimacy Directors and Coordinators organization: CRISP (Considered, Reversible, Informed, Specific, Participatory).25 In community governance, sociocracy proposes consent decision-making, a framework for group facilitation and collective decision-making.26 Existing institutional frameworks for consent include informed consent in medicine, where patients can make informed and voluntary decisions about a given procedure and may revoke consent at any time,27 and affirmative consent, under Title IX in federal law,28 which promotes consent as a “knowing, voluntary, and mutual decision among all participants to engage in sexual activity,” one that can likewise be withdrawn at any time.
Digital Frameworks for Consent
There are a handful of existing frameworks for establishing consent digitally, each of which carries its own points of interest and drawbacks. The Digital Impact Alliance, for instance, describes consent mechanisms in “digital public infrastructure”29 as “meaningful and informed,”30 identifies key considerations around data exchange, privacy-preserving metadata, access controls, and accessibility and inclusion, and recommends policy support for enforcing consent in digital infrastructure.
U.S. laws such as the California Privacy Rights Act (CPRA) and the Virginia Consumer Data Protection Act (VCDPA) follow the opt-out model,31 in which a person must actively refuse, withdraw consent, or decline to participate; when opt-out is the default, consent is assumed until withdrawn.
By contrast, in the European Union, the General Data Protection Regulation (GDPR)32 requires websites to ask for explicit consent before storing cookies, which can contain personally identifiable information and other data. The EU’s GDPR and Brazil’s Lei Geral de Proteção de Dados (LGPD)33 follow the opt-in model, which requires an affirmative act of giving consent online, defined as: freely given, informed, specific, unambiguous, and documented. When opt-in is the default, consent must be explicitly given before data is collected. The difference between the two defaults is sketched below.
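As a minimal sketch (with hypothetical names, drawn from no statute’s actual text), the two models differ only in what silence means:

```typescript
// Hypothetical consent record for a single data subject.
type ConsentModel = "opt-in" | "opt-out";

interface ConsentState {
  model: ConsentModel;
  decision?: "granted" | "withdrawn"; // undefined until the person acts
}

// Under an opt-out default (CPRA/VCDPA-style), silence counts as consent;
// under an opt-in default (GDPR/LGPD-style), silence counts as refusal.
function mayProcess(state: ConsentState): boolean {
  if (state.decision === "granted") return true;
  if (state.decision === "withdrawn") return false;
  return state.model === "opt-out"; // the default does all the work
}
```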
While these examples are non-exhaustive, new frameworks for digital consent continue to emerge.34 Further data privacy regulation is needed to keep pace with the rapid rate of data production and AI adoption today.
Consentful Technology
A growing movement towards consensual software provides an alternative framework for digital infrastructure.35 The Consentful Tech Project advocates for and provides resources on digital consent and consentful technologies, building on frameworks of consent culture, design justice, digital justice, and community technology;36 recommends technical mechanisms such as differential privacy, homomorphic encryption, and decentralization; and advocates for community-based strategies for addressing harm in digital spaces. The project asks: what does consent mean for digital bodies?37
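Of the technical mechanisms the project points to, differential privacy is the simplest to illustrate. Below is a minimal sketch of the Laplace mechanism for a count query; it is illustrative only (real deployments need privacy-budget accounting and a cryptographically sound noise source), and the function names are hypothetical:

```typescript
// Sample Laplace(0, scale) noise via inverse transform sampling.
function laplaceNoise(scale: number): number {
  const u = Math.random() - 0.5;
  return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
}

// Release a count with epsilon-differential privacy. A counting query
// has sensitivity 1: adding or removing one person changes the true
// count by at most 1, so noise with scale 1/epsilon suffices.
function privateCount(records: boolean[], epsilon: number): number {
  const trueCount = records.filter(Boolean).length;
  return trueCount + laplaceNoise(1 / epsilon);
}
```

The smaller the epsilon, the noisier the released count and the harder it becomes to infer whether any single person is in the data.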
Perhaps consent starts with holding agentic control of your own, or our own, data.
Personal Data Stores
Personal data stores involve storing data primarily on secure and/or decentralized personal servers. Entities (individuals, organizations, and so on) can choose what data to share and with whom, and can revoke access to any of these data slices at any time.
This flips existing data storage architecture, based on centralization and large depersonalized servers, on its head: instead of our data being stored in centralized servers outside of our control, our data becomes fundamentally ours to manage, with agency and choice over how this information can be accessed.38
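A toy sketch of this inversion (hypothetical names and shapes, not any particular project’s API) might look like the following, where the owner of the store holds both the data and the access-control list:

```typescript
// A toy personal data store: the owner grants and revokes scoped
// access at will, and revocation takes effect immediately.
type Scope = "location" | "health" | "contacts";

class PersonalDataStore {
  private data = new Map<Scope, unknown>();
  private grants = new Map<string, Set<Scope>>(); // grantee -> scopes

  put(scope: Scope, value: unknown): void {
    this.data.set(scope, value);
  }

  grant(grantee: string, scope: Scope): void {
    if (!this.grants.has(grantee)) this.grants.set(grantee, new Set());
    this.grants.get(grantee)!.add(scope);
  }

  revoke(grantee: string, scope: Scope): void {
    this.grants.get(grantee)?.delete(scope);
  }

  read(grantee: string, scope: Scope): unknown {
    if (!this.grants.get(grantee)?.has(scope)) {
      throw new Error(`${grantee} has no consent for ${scope}`);
    }
    return this.data.get(scope);
  }
}
```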
One example is Solid, an open standard created by Tim Berners-Lee, the inventor of the World Wide Web, with the aim of giving people “more agency over their data.”39 The BBC has trialled Solid in its personal data store research, one signal of a growing shift towards data sovereignty in digital infrastructure.40
How can privacy-preserving personal data stores be extended into community data stores? Why would community data stores matter?
Data is relational; so, too, is privacy.
Collective Data Privacy
Importantly, the economic value of data comes less from any one individual than from data in bulk. Our data is extracted and exploited through collection and aggregation, so we must face this problem by determining how we will collect, store, and govern our own data together. As such, movements towards collective data propose treating data less as an individual asset and more as a collective resource. Emerging models for community-stewarded data, such as data co-ops,41 data coalitions,42 data trusts,43 data unions,44 and data commons governance45 introduce new possibilities for collective governance and stewardship, with the potential to redistribute power away from tech monopolies and towards communities.
Open Data Manchester proposes a few frameworks for consent interfaces within data cooperatives:
- Granular consent gives every member the option to exercise agency over the type of data shared, with whom, and in what form; while this grants fine-grained choice, it can also be burdensome to both the individual and the collective.
- Persona/archetype-based permissions allow co-op members to choose a set of guidelines or behaviors, reducing the burden of choice.
- Traffic light consent lets members categorize pieces of data as permissive (green), controlled (yellow), or restricted (red), with each category defining how each piece of data may be used, or whether it needs further consideration.46 A rough sketch of this mechanism follows the list.
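The sketch below models traffic light consent in code. The category names follow Open Data Manchester’s description, while the field names and the function itself are hypothetical:

```typescript
// Traffic light consent: green data flows under standing co-op rules,
// yellow flows only with its stated controls attached, and red never
// flows without a fresh, explicit decision by the member.
type Light = "green" | "yellow" | "red";

interface FieldPolicy {
  field: string; // e.g. "monthly_energy_use"
  light: Light;
  controls?: string; // conditions attached to yellow data
}

function canShare(policy: FieldPolicy, explicitApproval = false): boolean {
  switch (policy.light) {
    case "green":
      return true; // permissive
    case "yellow":
      return policy.controls !== undefined; // controlled
    case "red":
      return explicitApproval; // restricted
  }
}
```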
Data is relational; so, too, is privacy.47 For example, if my family member submits genetic data to 23andMe, they’re also submitting genetic information about me, whether I consent to it or not.48 Social media sites such as Facebook use social network information and contact lists to create “shadow profiles” of users who are not even members of the site, making it almost impossible to opt out of this type of non-consensual data collection.49 Privacy, therefore, is more than just an individual choice; privacy becomes a collective responsibility and the foundation for collective data rights.50
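To make the shadow profile mechanics concrete, here is a toy model (hypothetical, not any platform’s actual pipeline) of how contact list uploads assemble records about people who never signed up:

```typescript
// Each member's uploaded contact list asserts facts about people
// who never joined the service and never consented.
interface Contact {
  name: string;
  phone: string;
}

// phone number -> names and relations observed across all uploads
const shadowProfiles = new Map<string, Set<string>>();

function ingestContactList(uploader: string, contacts: Contact[]): void {
  for (const c of contacts) {
    if (!shadowProfiles.has(c.phone)) shadowProfiles.set(c.phone, new Set());
    // The person at c.phone took no action and consented to none of this:
    shadowProfiles.get(c.phone)!.add(`known to ${uploader} as "${c.name}"`);
  }
}

// Two members' address books are enough to link a non-member's number
// to their name and social ties:
ingestContactList("member-1", [{ name: "Aisha", phone: "+1-555-0199" }]);
ingestContactList("member-2", [{ name: "A. Khan", phone: "+1-555-0199" }]);
```

No action by the person at that number can prevent or undo this accumulation, which is exactly why opting out of relational data collection is so difficult.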
It is important to note here that inclusive and equitable digital public infrastructure requires holistic considerations from and for marginalized communities, along with deliberate efforts towards digital enfranchisement.
Community Co-Design and Accessibility
In addressing holistic community needs, participatory co-design, or design by and with communities, prioritizes the voices of those most directly impacted throughout the design process.51 Similar design frameworks that center the needs of marginalized communities include: inclusive design, which emphasizes accessibility for people of all abilities, cultures, genders, ages, and backgrounds;52 design justice, which explores design processes led by those who are historically marginalized and centers the voices of those most directly impacted;53 and design from the margins (DFM), which similarly centers the most impacted and marginalized communities from the beginning of the design process to the end.54
Usability is also important in promoting the adoption of privacy-preserving alternatives over existing extractive technologies. For example, an iPhone user messaging an Android user may actually find the cross-platform interface of Signal more enjoyable to use.55 By making the privacy-preserving option more usable and convenient than an existing extractive alternative, privacy and security become easier for communities to adopt.
Co-creating fundamentally new infrastructures for community privacy, agency, and consent will take a collective effort across a broad range of domains, including technologists, designers, researchers, community organizers, policymakers, and more. It means integrating layers of technology (building infrastructure for consent mechanisms, collective data governance, and collective privacy), culture (shifting towards consent culture and supporting solidarity economies, community organizing, and collective action), and policy (advocating for equitable civic infrastructure, legal protections, and data privacy regulations). True digital public infrastructure for community privacy, agency, and consent calls for holistic consideration on all fronts, for all communities.
Special thanks to Jess Zhou, Ying Tong Lai, and Nick Sweeting for their thoughtful feedback and support on this piece.
NOTES
1. Deborah Sophia, “Meta to start using public posts on Facebook, Instagram in UK to train AI,” Reuters, September 13, 2024. See also Emma Roth, “Google cut a deal with Reddit for AI training data,” The Verge, February 22, 2024.
2. Noor Al-Sibai, “OpenAI Pleads That It Can’t Make Money Without Using Copyrighted Materials for Free,” Futurism, January 8, 2024.
3. Julia Conley, “Report Indicates Israel Uses WhatsApp Data in Targeted Killings of Palestinians,” Truthout, May 19, 2024.
4. Joseph Cox, “Telegram Confirms it Gave U.S. User Data to the Cops,” 404 Media, October 2, 2024.
5. Samantha Bradshaw and Dean Jackson, “Gotta Track ’em All: Data Privacy and Saudi Arabia’s Pokémon Go Acquisition,” Tech Policy Press, March 17, 2025.
6. Kashmir Hill, “Automakers Are Sharing Consumers’ Driving Behavior With Insurance Companies,” The New York Times, March 11, 2024.
7. A.W. Ohlheiser, “Teletherapy can really help, and really hurt: From privacy breaches to bad providers, teletherapy services often come with a hidden cost,” Vox, May 16, 2024.
8. Federal Trade Commission, “Developer of Popular Women’s Fertility-Tracking App Settles FTC Allegations that It Misled Consumers About the Disclosure of their Health Data,” January 13, 2021.
9. Justin Sherman, “Data Brokers Are a Threat to Democracy,” Wired Magazine, April 13, 2021.
10. Johana Bhuiyan, “US immigration agency explores data loophole to obtain information on deportation targets,” The Guardian, April 20, 2022.
11. Marshall Allen, “Health Insurers Are Vacuuming Up Details About You — And It Could Raise Your Rates,” National Public Radio (NPR), July 17, 2018. See also Matt LoJacono, “Sanford Report on Data Brokers & Health Data,” Sanford School of Public Policy, Duke University, February 20, 2023.
12. U.S. House Committee on Energy and Commerce, “Expert Warns Data Brokers Profit from Unregulated Surveillance,” May 18, 2023. See also Federal Trade Commission, “FTC Charges Data Brokers with Helping Scammer Take More Than $7 Million from Consumers’ Accounts,” August 12, 2015.
13. Lauren Kirchner, “When Zombie Data Costs You a Home,” The Markup, October 6, 2020. See also Steven Melendez, “When Background Checks Go Wrong,” Fast Company, November 17, 2016.
14. Suzanne Smalley, “‘Junk inferences’ by data brokers are a problem for consumers and the industry itself,” The Record from Recorded Future News, June 12, 2024. See also Electronic Privacy Information Center, “Data Brokers.”
15. Joseph Cox, “Candy Crush, Tinder, MyFitnessPal: See the Thousands of Apps Hijacked to Spy on Your Location,” Wired, January 9, 2025.
16. Brooke Tanner and Samantha Lai, “Examining the Intersection of Data Privacy and Civil Rights,” Brookings, July 18, 2022.
17. Electronic Privacy Information Center, “How Data Brokers Harm Immigrants,” 2024. See also Max Rivlin-Nadler, “How ICE Uses Social Media to Surveil and Arrest Immigrants,” The Intercept, December 22, 2019.
18. Catherine Fitzpatrick, “For Domestic Violence Victim-Survivors, a Data or Privacy Breach Can Be Extraordinarily Dangerous,” Tech Xplore, December 4, 2023.
19. Joseph Cox, “Location Data Firm Offers to Help Cops Track Targets via Doctor Visits,” 404 Media, December 10, 2024.
20. Jon Keegan and Alfred Ng, “Gay/Bi Dating App, Muslim Prayer Apps Sold Data on People’s Location to a Controversial Data Broker,” The Markup, January 27, 2022.
21. Ashley Del Villar and Myaisha Hayes, “How Face Recognition Fuels Racist Systems of Policing and Immigration — and Why Congress Must Act Now,” American Civil Liberties Union, July 22, 2021.
22. Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019).
23. Indiana University, “What Is Consent?: Policies & Key Terms: Stop Sexual Violence,” Stop Sexual Violence, 2019.
24. Planned Parenthood, “Sexual Consent.”
25. Intimacy Directors & Coordinators, “Defining Consent: From FRIES to CRISP!” Idcprofessionals.com, 2022.
26. Ted Rau, “Consent Decision Making,” Sociocracy For All, July 13, 2022.
27. Parth Shah, Imani Thornton, Nancy L. Kopitnik, and John E. Hipskind, “Informed Consent,” StatPearls Publishing, November 24, 2024.
28. St. Lawrence University, “Affirmative Consent,” Stlawu.edu.
29. Romina Bandura, Madeleine McLean, and Sarosh Sultan, “Unpacking the Concept of Digital Public Infrastructure and Its Importance for Global Development,” Center for Strategic & International Studies, 2023.
30. Digital Impact Alliance, “Good Digital Public Infrastructure Relies on Effective Consent Mechanisms. Here’s How They Work,” December 10, 2024.
31. Shreya and Cookieyes Team, “Opt-in vs Opt-Out: What They Mean and How to Comply,” CookieYes, May 5, 2020.
32. Richie Koch, “Cookies, the GDPR, and the EPrivacy Directive,” GDPR.eu, May 9, 2019.
33. International Association of Privacy Professionals, “Brazilian General Data Protection Law (LGPD, English Translation),” Iapp.org.
34. Miranda Bryant, “Denmark to Tackle Deepfakes by Giving People Copyright to Their Own Features,” The Guardian, June 27, 2025.
35. Danielle Leong, “Consensual Software: How to Prioritize User Safety,” InfoQ, May 18, 2017. See also Una Lee and Dann Toliver, “A Framework for Good Consent in Technology,” And Also Too, 2017.
36. For an example of digital justice and community tech in action, see the Detroit Community Technology Project.
37. According to The Consentful Tech Project, digital bodies are “made up of pieces of personal data. Like our physical bodies, our digital bodies exist in relationship with others and can participate in communities. They can also experience harm. Although the harm to them might not be physical, our digital bodies are frequently acted upon in non-consensual ways.” See The Consentful Tech Project, “What Is Consentful Tech?,” Consentfultech.io, November 24, 2021.
38. Khalid U. Fallatah, Mahmoud Barhamgi, and Charith Perera, “Personal Data Stores (PDS): A Review,” Sensors 23, no. 3 (January 28, 2023): 1477.
39. Solid Project, “About Solid,” 2025.
40. Eleni Sharp, “Personal Data Stores: Building and Trialling Trusted Data Services,” BBC Research & Development, September 28, 2021.
41. Katharine Miller, “Radical Proposal: Data Cooperatives Could Give Us More Power over Our Data,” Stanford Human-Centered Artificial Intelligence, October 20, 2021. See also Data2X, “How to Build a Data Cooperative: A Practitioner’s Handbook,” June 24, 2024.
42. Jack Henderson and Matt Prewitt, “Data Coalitions and Escrow Agents,” RadicalxChange, May 31, 2023.
43. Jack Hardinges, “What Is a Data Trust?” The ODI, July 10, 2018.
44. Eli Freedman, “Data Unions: The Need for Informational Democracy,” California Law Review, May 2, 2023.
45. Mozilla, “Data Commons Governance,” data.org, accessed October 7, 2025.
46. Julian Tait, “Designing a Data Cooperative to Help Make Homes More Energy Efficient,” Open Data Manchester, January 22, 2021.
47. Salomé Viljoen, “A Relational Theory of Data Governance,” The Yale Law Journal, November 2021. See also Carissa Véliz, “Privacy Is a Collective Concern,” New Statesman, October 22, 2019.
48. Mario Trujillo and Jason Kelley, “A Sale of 23andMe’s Data Would Be Bad for Privacy. Here’s What Customers Can Do,” Electronic Frontier Foundation, October 9, 2024.
49. Jürgen Graf, “Investigating Shadow Profiles: The Data of Others,” Tech Xplore, September 22, 2023.
50. Martin Tisne, “Collective Data Rights Can Stop Big Tech from Obliterating Privacy,” MIT Technology Review, May 25, 2021.
51. Inclusive Design Research Centre, “Introduction to Community-Led Co-Design,” Community-Led Co-design Kit, 2025.
52. Sam Waller, Joy Goodman-Deane, Mike Bradley, Ian Hosking, and John Clarkson, “What Is Inclusive Design?,” Inclusive Design Toolkit, 2024.
53. Sasha Costanza-Chock, “Design Justice: Towards an Intersectional Feminist Framework for Design Theory and Practice,” 2018.
54. Afsaneh Rigot, “Design from the Margins.” The Belfer Center for Science and International Affairs, May 13, 2022.
55. nina-signal, “Signal Is for Everyone, and Everyone Is Different,” Signal Messenger, 2023.