These are notes from my session on community-driven tech and techniques of opacity at Mozilla Festival House.

“Violence is not a concept to be taken apart”: it is people’s lives and experiences, it is unfixed, and it sometimes exists in traceless ways.
Earlier this year, I began a 10-month research project that sought to investigate how certain countries used discriminatory policies to expand information controls, narrative power, and digital surveillance against gender minorities. For years, my work has investigated the visibility and invisibility of marginalised communities within technology systems and cultures. I must say that, like many research projects, mine began with hypotheses, concepts, speculations, and extrapolations.
In mid-May 2024, to keep up with timelines, I reluctantly travelled to my research location for the planned focus group discussions. At the end of the day on 31st May, I stopped recording the final conversation. It was a relief, but it left me with a great sense of anger and pain. I could not have anticipated the stories told, or the details of trauma communicated through them. I felt hopeless, and was forced to take a long pause before advancing this work (I hope to go into more detail in another piece).
Between collecting stories as data, sitting painfully with the heaviness they left on me, and simply doing the things life required of me, including my job, I spoke to another researcher. He was conducting work on how caste inequalities are reconfigured within technology and its affordances. In the conversation, he said,
“I do research from a place of rage. This industry still has many people who are privileged, and you can tell when they do the work on marginalised groups it comes from a place of pity, yet when we do the work, there is a genuine rage and pain that inspires it.”
The pain he described was what I felt while sitting in the room, reassessing and engaging with stories about death, sexual violence, and lost livelihoods, all the while wishing I had never committed to the work in the first place.
As I reviewed, analysed, and re-listened to the stories, there were relations they described that did not fall entirely within the realms of privacy and security per se. These relations with the technologies they used moved “covertly, traceless, and in an unfixed manner”. How they pronounced certain words, the language they used to describe specific actions, or how they spoke in low tones that only those in the room could hear but that arrived distorted and barely captured on my recorder, seemed to go beyond the concepts popularised in technology studies today. It was a practice they used to provide access to parts of themselves only in particular ways and only to particular people.
Édouard Glissant characterises these practices as the right to opacity: the “displacement of reduction”, or that which cannot be reduced. In a recent anthology, Queer Data Studies, Nikita Shepard and Gary Kafer make the case that data was, in some instances, understood to be an important component of liberatory action, whereas in other cases it served to engender violence through surveillance. Kafer, like Glissant, argued that opacity is a form of immeasurability, and that most predictive analytic systems that seem to make people ‘transparent’ in fact accommodate a degree of opacity. People are only ‘transparent’ or ‘measurable’ from the perspectives of those seeking to make them legible in the first place; the resulting values and interpretations may be false in relation to the person’s actual life.
Surveillance, in the same way, seeks to make people legible and transparent through various means, and ‘making’ in this sense is often forceful. For most of the people I spoke with, opacity lies in knowing when data is required for survival or liberatory action, and when it is used to facilitate violence and exclusion. Glissant claims that we need to give up the old obsession with discovering what lies at the bottom of natures, and this claim is the basis of my interest in how opacity could be a grounding principle for care and dignity within our techno-sciences.
If we focus on the texture of the weave and not the nature of its components, as Glissant proposes, what forms of technologies would we create? How would minoritised and targeted communities relate to them differently, without the constant discomfort and urge to navigate surveillance?
Exploring this further, I took these questions to a session I facilitated at Mozilla Festival House in Zambia called Flowers and Gardening: Techniques of Opacity as a Method to Achieving Trustworthy AI. I experimented with the analogy of the plant by asking people in the room about their gardening techniques: how they cared for their plants, and whether they needed an X-ray of their plants’ anatomical composition in order to care for them, and to grieve for them when they died. Most explained that their practices were experimental; the ones that stuck with me were ‘less is more’ and ‘doing the opposite of what the nursery recommends could be useful.’ Plants are made of various opacities and, as Glissant argues, so is death. Plants signify life, energies, and death all at once. They rarely speak, but in unique ways they communicate what is necessary to keep them alive, or signal their end. For the everyday person, a plant is not transparent or legible. We do not dig them up or cut them open to see what lies beneath, yet we cohabit with them and care for them.
Opacity, to me, and in relation to the stories from the research location, became a necessary resistant practice, one that may lead us towards the visions of liberatory technologies we have long strived for. It may encode dignity and care, while forcing us to relate to each other without the violence of reduction. Currently, people in digital rights communities may practise aspects of opacity by fighting for encryption or choosing alternative digital products. It may be people who use text-only formats to access newsletters without tracking links, or those who refuse to open or click on a newsletter containing such links.
Once my brain discovers a potential approach through community survival practices, my next step is to workshop it with public interest technologists and experiment otherwise. At the festival, The Engine Room’s African AI Pluriverse caught my attention as a set of characteristics that opacity, as care and dignity, may wriggle and fit into, perhaps completing the circle.
During the discussion, Kiito Shilongo, a technology researcher and policy expert, proposed that for opacity to be built into our techno-societies, it may need to be a data point: the basis may be that wanting to opt out is a data point too, but not a reason to see more. I may need to interrogate Kiito further on what this means.
For now, opacity still largely exists as a socio-technical and community practice. It may have to remain so for a while to be truly useful.
