Kerry Mackereth

Gamergate, Harassment and Feminist Design with Caroline Sinders

In this episode we talk to Caroline Sinders, a human rights researcher, artist, and the founder of Convocation Design + Research. We begin by talking about Gamergate, when there was all this kerfuffle about women being gamers. We also talk about what it's like doing high-risk research on abusive misogynists online, and experiences of doxing. Just to give you a heads-up: we do talk about online harassment in today's episode. If you're facing online harassment and you need immediate help, Caroline's organization offers pro bono support: just email rapid@convocation.design and they'll get back to you.


Caroline Sinders is a machine-learning-design researcher and artist. For the past few years, they have been examining the intersections of technology's impact on society, interface design, artificial intelligence, abuse, and politics in digital, conversational spaces. Sinders is the founder of Convocation Design + Research, an agency focusing on the intersections of machine learning, user research, designing for public good, and solving difficult communication problems. As a designer and researcher, they have worked with Amnesty International, Intel, IBM Watson, the Wikimedia Foundation, and others.


Sinders has held fellowships with the Harvard Kennedy School, the Mozilla Foundation, Yerba Buena Center for the Arts, Eyebeam, STUDIO for Creative Inquiry, and the International Center of Photography. Their work has been supported by the Ford Foundation, Omidyar Network, the Open Technology Fund, and the Knight Foundation. Their work has been featured at Tate Exchange in Tate Modern, the Victoria and Albert Museum, MoMA PS1, LABoral, Ars Electronica, and the Houston Center for Contemporary Craft, and covered in Slate, Quartz, and Wired, among others. Sinders holds a master's degree from New York University's Interactive Telecommunications Program.


Reading List:



Phillips, Amanda. Gamer Trouble: Feminist Confrontations in Digital Culture. 2020.


Bradley, Gracie Mae, and Luke de Noronha. Against Borders: The Case for Abolition. 2022. [Kerry calls this 'Abolish Borders' in the episode, but it's 'Against Borders'!]


Drage, Eleanor, and Kerry McInerney. The Good Robot: Why Technology Needs Feminism. 2024. [see the section on 'Good Designs'].


Transcript:


KERRY MCINERNEY:

Hi, I'm Dr. Kerry McInerney. Dr. Eleanor Drage and I are the hosts of The Good Robot podcast. Join us as we ask the experts: what is good technology? Is it even possible? And how can feminism help us work towards it? If you want to learn more about today's topic, head over to our website, www.thegoodrobot.co.uk, where we've got a full transcript of the episode and a specially curated reading list by every guest. We love hearing from listeners, so feel free to tweet or email us. We'd also really appreciate you leaving us a review on the podcast app. But until then, sit back, relax, and enjoy the episode.


ELEANOR DRAGE:

Today we talk to Caroline Sinders, a human rights researcher, artist, and the founder of Convocation Design + Research.


We begin by talking about Gamergate, when there was all this kerfuffle about women being gamers. We also talk about what it's like doing high-risk research on abusive misogynists online, and experiences of doxing.


Just to give you a heads-up: we do talk about online harassment in today's episode.


If you're facing online harassment and you need immediate help, Caroline's organization offers pro bono support: just email rapid@convocation.design and they'll get back to you. We'll put that email address in the show notes that go with today's episode.


We hope you enjoy the show.


KERRY MCINERNEY:

 So thank you so much for joining us here today. It really is a pleasure to get to chat to you. So just to kick us off, could you tell us a little bit about who you are, what you do, and what's brought you to thinking about feminism, gender, and technology?


CAROLINE SINDERS:

Yeah, thank you so much. I'm Caroline Sinders.


I'm a human rights researcher and artist, and a lot of my work, the core part of my work, looks at how technology impacts marginalized groups through a lens of intersectionality and human rights. Looking at gender and technology is then a very key part of my entire practice. Since technology is in some ways a mirror of societal bias, but also amplifies bias and asymmetries, taking an intersectional look at the harms that technology helps facilitate is incredibly important.


ELEANOR DRAGE:

Awesome, we're looking forward to talking about it. But first, we're The Good Robot. So: what is good technology? Is it even possible? And how can feminism help us get there?


CAROLINE SINDERS:

Oh, that's such a good question. I think Kerry's probably heard me rant about this a little bit, because we were on a panel together and we had a lovely coffee. I have such a hard time with the word good, one, because I think, even though there are many societies and many collectives within our human existence, good is something we should aim for, but good is not universally defined. And I don't think it can be. What is good or fair to one group is not necessarily equity for another group.


I have to believe, I think, in human rights, that better is possible, and so that's how I think about good: that with whatever we have as the status quo right now, whatever the reality is that we're existing in, there is a better one, and there are ways to get there.


But I don't know if we can ever have good technology, in the way that I don't think we can ever have feminist technology, in the way that I don't think we can ever have truly equitable technology.


Not to sound like a broken millennial record, but I think a lot of that is based on the way that society is structured. I don't know if we can have feminist technology if we live under capitalism, if we live under a VC corporate structure, right? I don't think that's possible if so much of our technology is coming from one place, like Silicon Valley. So I'm often trying to aim for something that's better, with the understanding that the true North Star I'm aiming for might not ever be something I get to experience in my lifetime, or in the next generation's lifetime.


That doesn't mean we shouldn't aim for those things and very much push for regulatory changes and societal shifts to get towards that better.


KERRY MCINERNEY:

And that's really fascinating and actually also reminds me quite a lot of some of the work that Eleanor has done on utopia as a method rather than a blueprint.


So utopia as something that we're always maybe trying to practice and understand, but that we might not ever actually fully realize. At the same time, what's really interesting about your answer to me is that on the one hand you say, I don't think that we can have feminist technology under the current structures that we live in, which I would certainly be really sympathetic to, but you're also the founder of a very famous project called Feminist Dataset.


So could you tell us a little bit more about why you started this particular project with this general criticism in mind? What is Feminist Dataset? What does it do, and why did you make it?


CAROLINE SINDERS:

Sure, so Feminist Dataset is also an older project of mine.


I'm in my mid-thirties, and so I started thinking about the beginnings of Feminist Dataset, before it had a name, in 2015 and 2016, when I was a design researcher at IBM working on IBM Watson. So I was working on natural language processing APIs. And at this time in my career, I'd been focusing a lot on online harassment.


So I'd really cut my teeth as a young researcher looking at Gamergate and how design can impact people's safety, and how do we think of design as a tool to facilitate safety? And one thing I want to contextualize: when I started that work in 2013, that wasn't really a conversation we were seeing in STS circles.


It was also not a conversation we were seeing in design circles.


ELEANOR DRAGE:

Tell everyone, what is Gamergate? And also, STS is science and technology studies.


CAROLINE SINDERS:

Yeah, so Gamergate was, I would argue, a worldwide phenomenon: a massive online harassment campaign that was impacting women, people of color, and marginalized groups in video games.


And there's actually a lot of fascinating things about Gamergate; we could probably have a two-hour-long conversation about it. It's something I really want to write more about. But Gamergate, at its core, was also, I think, representative of a schism that was happening inside of video games.


So for a lot of people that don't know, video games is a billion-dollar industry. It's as big as the film industry. But there's the way games get portrayed in pop culture. And I think we've evolved now, in 2023 versus 2013, to a slightly different mindset that's more nuanced about games, but for a long time, to self-identify as a gamer


was to take on a nerd identity, to take on, in some ways I want to say, a lower-social-class identity, right? And that came with all different kinds of harmful pop-cultural stereotypes: that gamers were white, overweight, older men. That's not actually the main group of gamers.


I think it was in 2015 or 2016 there was a census done on who was buying video games and who was playing them. And with the rise of mobile games, and Kim Kardashian's game, actually the largest group of gamers is women aged 18 to 24, and that really disrupts and turns upside down this idea of who we think of as gamers. But everyone's a gamer.


If you have a cell phone, you're a gamer, right? In the way that you don't identify as a 'booker' if you read books, or a 'filmer' if you watch movies. Games are a form of entertainment. Every culture has forms of games. We all play games as children, as adolescents, as adults. Gaming, and games generally,


even outside of video games, are a very large part of society, I would argue.


KERRY MCINERNEY:

You gave me the most visceral, emotional and physical, visual flashback. I definitely played that Kim Kardashian game. It's...


CAROLINE SINDERS:

That game was really fun.


But also, play is a massive part of our society, right? Play and entertainment. And these are not superfluous things; they're core parts of our society. How do children learn? Play is actually a part of that. Play is how we learn how to build empathy, how we engage with others.


Play, and playing, are great ways to just understand how systems work. We could also, again, have another whole side conversation, I think, on issues of gamification, because I don't think that's the answer to education, but play is a core part of our society. Gamergate came out of this movement in video games that was happening way before Gamergate.


This is my theory, and this is from interviewing different games critics and talking to indie gamers. There are indie filmmakers, right? Likewise, there is an indie game scene. It's a very thriving economy. It's also, again, worldwide: here in the UK, places like Manchester and Bristol have really thriving game studios and game centers.


This is true across Europe. It's true across Australia. It's true across Asia, the African continent, the Middle East, et cetera, North America, South America. There are game studios everywhere, and there are big game studios and small game studios, right? So again, it's a thriving economy. And prior to 2013, there were a lot of tools being built to help people make all different kinds of games.


And I think there was also a criticality coming into games that was really necessary, happening in the 2000s. And it was the normal criticality that you would see in other industries as well: people reviewing games and asking, what's the story, versus just, how does the game play?


And gameplay is a thing you should consider when looking at games, obviously, but also the story, the plot, who the characters are, right? Are there women characters in this world? For example, I'm blanking on one of the games, I think it's Assassin's Creed, but there were no women at all in that video game.


Not even as background characters. That's absurd, right? Some of these are absurd things. And this criticality was coming into how people were reviewing games. Again, this is normal. You see it if you read a book review, or a film review, or a play review, or an art review, right? These are normal aspects of criticality you would bring into any medium.


And so there was pushback from a lot of gamers online, who were white men, of 'this kind of social justice is not needed, people are misunderstanding this, these people are not real gamers'. And a lot of this also coincided with how a lot of mainstream or well-known games media, like Kotaku, which was part of the Gizmodo Media Group and really well known, Gamasutra, which is also a games blog, Polygon, et cetera, were reviewing games, and the games they decided to start reviewing.


So people started reviewing more indie games, which tend to be more experimental, and which also tend to be, at times, more intersectional, right? And so this was leading to a lot of criticism and pushback. So already there was this schism forming inside of games, right? There is a documentary that came out during Gamergate, about women and gamers, that was filmed pre-Gamergate, about the amount of harassment that women game developers were facing over the kinds of games they were starting to make, and these were indie developers. Zoe Quinn's Depression Quest is a really great example of this. I think that's also one of the reasons people really jumped on Zoe Quinn, we could argue, as the victim zero, if you will, of Gamergate.


But there's also this canonical essay that came out by Leigh Alexander, who's now a narrative designer and at the time was a games journalist. She's based here in the UK. It's about how the gamer is dead, we're all gamers. And again, this is the type of essay I would expect to see in any kind of entertainment medium.


I'm trained as a photographer. One of my undergrad professors, Fred Ritchin, has a whole essay on how the photographer is dead, we're all photographers, because of cell phone photography. Again, a very normalized framework for thinking about, what is the identity of this medium? But again, all this is building up to 2013.


So there's this massive schism. And I think what ended up breaking it open was the harassment of Zoe Quinn. Her ex-boyfriend wrote this long diatribe about her and framed her as this problem that was indicative of games.


ELEANOR DRAGE:

Zoe Quinn is this American...


CAROLINE SINDERS:

Narrative designer. She writes a lot of comics now, still also works in games, and she was known at the time for authoring this really amazing game, which I recommend all of you play, called Depression Quest. And it really asks, how does it feel to have depression? It's very text-based.


Also, a lot of people didn't like it because it was a text-based game, which was seen as a lesser form of games. It's not; we've had hypertext games, hypertext art, and electronic media for a very long time. But this essay, I think, encapsulated a lot of the really misogynistic, harmful thinking that you could see perpetuating in the games industry.


Again, that was early games fandom, if you will, that was bubbling up from this schism happening around this criticality. So Gamergate became a name for that. Now, for a lot of the harassers and bad actors and fans of Gamergate, I will say that this became a rallying call to group together a lot of disparate claims and thoughts they had about video games, claims that I would argue are harmful, and harmful in the way that they manifested.


One of the claims was, it's about ethics in video games. Now, we could have a lot of conversations about ethics in journalism, et cetera, ethics in any medium, but their claims were more like: these are friends writing about friends, and women developers are sleeping with the journalists and that's how they're getting their games covered, and this is unfair.


And these claims were not proven. But one thing I do want to highlight is that when you work in tech, let's say, or any industry, and you're a creator, you do start to know the people that cover your work. You might become friends with some of them. You might not. But it's not weird to have the email address of a journalist and to say, I launched this paper.


Do you want to read it? That is a very normal thing. And for whatever reason, this kind of normalcy was not making it into Gamergate. So it involved a lot of conspiratorial thinking, again, that manifested in real-world harm: doxing, rape threats, people being driven from their homes, bomb threats


when people were speaking. This happened repeatedly to Anita Sarkeesian. And so a lot of this harassment was happening online under a hashtag called Gamergate, across multiple platforms, and Gamergate also had open and closed channels where they would talk about this coordinated harassment.


They would talk about it on their subreddit, Kotaku In Action, which I don't believe was ever shut down by Reddit, even though there were multiple kinds of documentation that harassment was coming from there. There were two different boards on 4chan and 8chan called GG or some variation of that.


There were private channels that folks were talking on, like IRC channels. This was fairly pre-Discord; I'm sure if Discord had been around at the time, they would have been on Discord and Twitch. What was really interesting for me as a young online harassment researcher, and to put this into context, this is 2013, 2014, 2015, was seeing that when harassment happened on one platform, even though there was heavy evidence of it,


people coordinating on Reddit to harass someone on Twitter, Twitter would not consider that evidence, right? Because the coordination, the planning, wasn't happening on Twitter; it was just manifesting on Twitter. And I think this also really helped change and evolve current forms of trust and safety, of where we are now,


by understanding that this coordination can happen out in the open. And it's not just conjecture, right? These are actual plans. So for me, a lot of my work, the way you could classically define it, is related to trust and safety and online gender-based violence, and it came out of studying Gamergate.


One of the things I would do is interview a lot of victims, because I ended up actually knowing a lot of people affected by Gamergate, because I was making video games for fun when I was in grad school. So I knew a lot of people personally, but I could also see this unfolding around my community.


And in a way, I was engaging in varying forms of digital, and then non-digital, ethnography and anthropology, by purposely starting to attend more and more events. Now, we can get into whether that is the best way to do those methods; I have a lot of thoughts about that we can chat about later.


For example, I would infiltrate some of their spaces. I received criticism from a researcher that I should have announced, for example, that I was there studying them. And I was like, do you understand how harassment works online? I'm absolutely not going to announce that I'm here.


That is a really great way for me to get harassed. And eventually I did end up facing a lot of harassment: Gamergaters sent a SWAT team to my mom's house; they would repeatedly contact my work. Anyway, all this was happening while I was graduating and working at IBM. I was doing a lot of this research outside of my time at IBM, and I was thinking a lot about design interventions, and whether there could ever be technical interventions.


And at a certain point, this work evolved into me also studying the rise of the American alt-right and the American far right, because around the election of Trump, a lot of actors in Gamergate started moving into this, again, classical space we could call the far right. And so at a certain point, I needed a palate cleanser that was not just studying online harassment and white supremacy.


I needed to look at something else. At the time I was still really excited about the potential of AI, as a very young researcher and young designer. And so I was curious: could you ever have feminist AI? So what started in 2017 as a supposedly very short project has emerged into not a short project, and into something that is maybe not as hopeful as when I started it, but something I'm still very much invested in, which is really trying to answer this question of what is feminist AI, and is it possible?


And what does it look like? Which is a long way of getting to where we are now.


ELEANOR DRAGE:

Wow, what a story. It's so hard when you do that kind of in-depth research on harassment online. I have students who are doing something similar and they want to get their ethics approval forms signed, and it's just really tricky, and I'm trying to work out now what's safe to put yourself through, how this kind of research should be conducted. Yeah, it's really hard. So what is a feminist approach to trust and safety in practice? What should we be doing?


CAROLINE SINDERS:

I think it's hard to say. So one of the things that's important, to contextualize at least a lot of my work, is that we always have to place the criticism in a context. So feminist trust and safety, generally, at its core would be: do you have very intersectional practices?


Are you able to embody those practices at scale? For example, if we actually look a little bit at the policy that Meta has outlined, particularly for Facebook, it's actually not very bad. It's actually quite good. The reason it's not necessarily feminist is in how it's actually embodied, in that it's not very consistent, right?


It's also not very transparent in terms of how they make decisions. And this is a very thorny thing to try to summarize, and part of that is that trust and safety has many different levels and aspects to it. So you can have a trust and safety team, and you'll have a content moderation team, which also might do forms of investigation.


And this is not consistent across all companies or platforms or social networks. I used to work on what is now a trust and safety team at Wikimedia; it used to be called the anti-harassment team. And how Wikimedia looks at trust and safety, and how it's handled, is very different from, for example, Facebook. And I will not say that it's necessarily better, because it's not, because Wikimedia relies on a lot of volunteer labor.


And I would argue that's not a feminist value or principle, even if stuff is happening in the open. There are multiple areas to unpack there, but something to think about is that harassment investigations should take a little bit of time. There are many things that victims need that systems cannot provide for them.


Some of that is care. Some of that is guidance, depending upon the kind of harassment they're facing. A lot of these gaps that I noticed in my research are things that my human rights lab tries to help fill. We provide free, rapid response for anyone facing online harassment, any form of online harm. Right now we've been doing a lot of work to help mitigate doxing that people are facing related to protest, related to aspects of their identity. But one could argue that these are things perhaps a platform should be providing.


If you're facing harassment on their platform, shouldn't they be the ones helping you walk through and think about what you can do to mitigate these harms? Because if you're, again, being doxed on Twitter, there's a lot of things you can do even outside of Twitter, right,


to ensure that you're safe. And this is hard information sometimes to pass to the general public, because it does require, again, a little bit of a background in security and privacy. And when we think about what is feminist trust and safety, there's also this much larger context.


We don't live in a harassment-literate society. We don't live in a security-safe, or security-savvy, society, right? And that's not the failing of victims; that's a societal problem. And so a lot of these platforms, even from the very get-go, are not designed to be safe in terms of this kind of intersectional space that we're looking at, meaning when you sign up for something like Facebook, Twitter, or TikTok, a lot of things are turned on automatically for you. You're extremely exposed. And I think people end up making decisions because they haven't faced harassment yet, so they don't understand what it means to be this exposed in a system this large, right?


If we look at a theory of proxemics, of how there are different kinds of social spheres you're existing in, from the very intimate, to the personal, to the social, to the public, the public sphere of TikTok is larger than probably any public sphere you're used to in a physicalized environment.


So we don't have a one-to-one understanding of what it means to be that exposed, even if, let's say, 'exposed' is the right term for it. So we're asking people to make actually incredibly nuanced decisions about what it means to exist publicly. We're asking them to make a lot of decisions that I think are really hard to conceptualize, even for me, an expert in online harassment. I have a hard time understanding those trade-offs as well, because nothing has ever prepared me for those trade-offs.


And what's really awful is that even if you are digitally literate, because a lot of our clients end up being millennials and zoomers, there's nothing that really prepares you in your life for online harassment until you face harassment. And a lot of preemptive steps you can take to reduce harassment also fly in the face of norms that social media has set up.


So deciding to be semi-private, right, can be hard, or deciding to use a different name, or trying to hide aspects of where you live. Those are like anti-norms to social media. And so it's asking a lot of users and consumers to do that preemptive work when it goes against the design norms of the social network itself.


To even try to answer, what is feminist trust and safety, really goes back to maybe a different question of what is a feminist social network, and then an even deeper question: sometimes, to create a feminist social network right now actually means creating a semi-closed network.


And is that truly feminist, if those that face less harm than us can actually be more public? And this gets into a really big issue that I see a lot in the work that I do with human rights groups, with the harassment that journalists face, which is the self-censoring effect: people will self-censor or leave platforms because they face harassment.


So what we're seeing is less freedom of expression. And this is an example of why 'can we ever have feminist AI' is such a hard question to answer, because there is a theory and a practicality that have to come together when we answer these questions. And so to try to envision what is feminist AI, what is feminist trust and safety, means we have to start to try to envision a world without harassment, and I don't know if we have that. So a lot of my work is thinking, instead of getting to that future, what is the bridge to build to the safety that people need immediately, right now? What should things look like in the next few years?


Once we get to that, is that the space where we can start to imagine truly what feminist trust and safety actually is? And I guess a much shorter answer is: maybe we need a variety of social networks, because maybe a social network with 100 people, maybe that actually is the feminist social network. Maybe that is where feminist trust and safety can truly be embodied, at a small scale. And then maybe that becomes the blueprint for these larger imaginings. But this is to say it's complicated, and I know that's not a very satisfying answer.


KERRY MCINERNEY:

That's such an important one though, and I think the tensions you raised, between not only what it means to be present and to be public versus what it means to be more private or more safe, but also the tensions between wanting this kind of harassment-free future and also desperately needing actions towards safety now.


I think those tensions are so central to so many of our feminist concerns when it comes not only to technology, but to our societies. And I'm really reminded of, I think it's Abolish Borders, by Luke de Noronha and Gracie Mae Bradley. Just for all our listeners, we do have a reading list on our website.


We'll put in some of the books and articles that we are flagging, and of course all of Caroline's super cool projects, so you can go check that out at www.thegoodrobot.co.uk. But something that Luke and Gracie talk about is this idea of non-reformist reforms: what are the policies and the actions that we take towards abolishing national borders, while simultaneously not seeing those things as making borders better, but rather as necessary steps to guard against their harms until we can have this better future?


And I think this question around what it would mean to feel safe, versus the unfairness of feeling unsafe, was really brought forward for me when I went to a discussion group around anti-Asian violence during COVID. And I remember there was this real tension amongst participants around, what can we do to keep ourselves safe, and then this feeling of, we shouldn't have to feel this unsafe.


So why are we deliberately accommodating the sense of unsafety? And I don't think there was ever really a resolution to that tension. But on that note, because I recognize we're in this kind of strange, weird, slightly confusing and distressing space, which is often where feminist work takes us,


I want to ask you about the future. Again, this is not compelling you towards a positive answer, but just because I think these questions are really interesting. So what, for you, is the future of your work, like Feminist Dataset, of your kind of work thinking about feminism and technology? Where do you see this work going in the future?


CAROLINE SINDERS:

Thank you so much for asking that. A big thing I often advocate for is that I want us to live in a harassment-literate society, partially because I think that would make a lot of my work so much easier, in a way. One of the things we used to say at Wikimedia, which was very idealistic, is that the goal is to put ourselves out of jobs, meaning that we could help try to build or facilitate a system in such a way that


you wouldn't necessarily need this much trust and safety. I will say now that that was something one of my male colleagues would say, and I always took it more as a sign that we're aiming for reduction. But I think we're always going to need trust and safety, because community doesn't mean friendship, and it doesn't mean conflict-free.


And I think it's important to understand, and while I do not like this book, I like the title, that conflict is not abuse, but more in the sense that, as a community, we might not like everyone or each other, and that doesn't mean that we have to be friends, but it does mean we need to exist together.


So I think there's always a role for someone to be a mediator, to understand what's happening. Because I also believe that there's a role for restorative and transformative justice, for abolition, and for education. And I don't want to be too idealistic and say that would solve everything.


But I think those are things we can look towards. When I think about what facilitated the start of Feminist Dataset, it's called Dataset because I started with data, but it's really much more about intersectional machine learning. And again, at the time in 2017, we were using 'feminism' a lot more instead of 'intersectional'. To rename it


now, I'd call it intersectional ML, perhaps. But I think a very core understanding is that if we lived in a harassment-literate society, that would actually help better reveal a lot of the tensions that Feminist Dataset is trying to reveal. And what I mean by that is, I see this a lot with open source groups that I work with, and I see this a lot with communities, where sometimes there's an idea that if we just have a code of conduct, that's enough.


It's not enough. You have to be trained to implement a code of conduct. You have to be trained to understand that harassment is a very wide spectrum. I think folks sometimes have an idea of what they would do with, let's say, more quote-unquote extreme forms of harassment, let's say physical violence. But, Kerry, touching on what you were bringing up,


safety actually looks like a variety of things, and to be safe you need a variety of tactics, so many tactics. And a lot of this involves what I call threat modeling, which is being able to assess a situation and understand the likelihoods of threats, from the micro to the macro, from the more extreme to the less extreme, from the most immediate to maybe the most downstream or unrealistic, right, and then being able to understand what's available to you to mitigate those harms. Now, this is a kind of mindset that those in the security and privacy space have, especially those providing support to folks that need security help.


This is not a universal fluency, but in some ways we actually do have universal threat modeling. If you can look at a busy street and decide when to cross it, you already know how to threat model. It's just that we aren't taught how to apply this in other ways, shapes, or forms, to interactions we have in communities and in online spaces, because that requires a deep level of knowledge of how settings work, of how a system technically works, of what's available to you, and also trying to understand all these different kinds of later-on impacts.


So if you've never been doxed, you might not understand all the ways that could impact your life. Doxing is the release of public information and/or private documents about you: a Social Security number, or an immigration number, or a visa number, or your address, or your personal email, or your personal phone number, or your mom's personal phone number.


Understanding the whole gamut of doxing, and threat modeling for it, does take a lot of reading about doxing. And so one of the things I think would make our lives better, in a way, is if we could get to harassment literacy, to understanding all the different forms of harassment, not just some of the more extreme forms. And I think this would even impact how we're able to look at all different kinds of conflicts with a lot more nuance, versus seeing things as a binary.


Related to this, one of the things I often say is that one of the hard things about people understanding harassment, about us getting to a harassment-literate society, is people internally grappling with their own feelings about a harasser. This might not be someone who's harassing you. One thing I often say is, harassment's straightforward; what's complicated is your feelings about the bad actor or the harasser. So if the harasser is your friend, and they're harassing someone you don't know, you might try to really understand or create a logic framework around why that person is engaging in something.


And sometimes what people lean on is, what did the victim do? They must have done something to cause this. And sometimes the sad answer is they didn't do anything. They were just hanging out. They were just there. That person just did not like them. And that's, again, a deeply unsatisfying answer, but being able to understand that that's an answer, that's a reason someone could engage in harm, that nothing happened, removing that cause-and-effect idea around harassment, is what sort of gets us there. And I think that's more helpful in understanding, how did a machine do X? Or I think it's even more helpful in understanding how a piece of software isn't neutral. I think it helps us broaden the idea, and deepen the understanding, of the nuance of harm and what harm looks like across the physical and the digital, because those are two linked worlds.


The hope I have is that, especially with Generation Z and the next generations, they're starting to actually have much more nuanced understandings of what harm and justice look like, in a way that I don't see in my generation, and I don't see in older generations. And so I'm hoping that we are going to get to this space of harassment literacy much sooner, but we're still not there yet.


And so I think that's my hope, but that's also this complicating factor.


ELEANOR DRAGE:

Thank you so much for drawing that out for us. And I think that the utopia there is working with each other, and the possibility of something beyond this that we're working towards. So just to come back to Kerry's thoughts at the beginning, this is why learning from these processes is super important, but also knowing what we don't want.


And the idea that I find most compelling at the moment is Hannah Arendt's idea of living together in this unchosen way. We don't choose who we occupy the earth with, and that makes things really complicated. Anyway, we could talk all day, but thank you so much.


CAROLINE SINDERS:

Yeah, thank you. I feel so bad because we barely talked about Feminist Dataset. One thing I do want to say about Feminist Dataset is that one of the things that did change my thinking around it, from 2017, I think, to a few years later, was actually thinking about what does feminism mean in every step of the machine learning pipeline?


Because I started it thinking about data, and then my practice grew much more deeply, in the sense that I was engaging a lot more with policy, and not just rapid response, but understanding how policy, design, and technology actually all need to fit together. That really changed the way I started approaching and thinking about building software, and making software in context.


And I'm sure these are, again, things y'all have covered so much on the podcast. It's not just the dataset. It's the context of the product and the software. It's how it's built. It's, I think, a redress process for when harm arises: what does that look like, and how do you really greet and respond to that harm?


I think that involves so many things, and that is what really shifted my thinking. And then even thinking, too, of where things are breaking down. Like, I use a MacBook Pro. Is that a feminist tool? No, but there are aspects of it that are good and better than other tools for me. I sometimes need accessibility tools.


Apple products are designed really well for that. And so a large part of my human rights thinking is thinking about the reality of these trade-offs and what they mean, and trying to walk away from a binary of good and bad: this tool might benefit you for a particular need.


We can still critique the tool; that's not necessarily critiquing you for using the tool. It's understanding that we live in this incredibly imperfect world. And I think my utopian idea is that we can call out those imperfections and work towards fixing them, understanding that there are always going to be imperfections, but that there can be things built and addressed and redressed as harm continues to arise.


And that we actually greet that as a design and research ethos, to say it's on all of us, including the companies, but it's on us to say we are maintainers and builders, and we're stewards of care, and care requires a constant maintenance process.


And so that's my utopian thinking.


ELEANOR DRAGE:

This episode was made possible thanks to the generosity of Christina Gaw and the Mercator Foundation. It was produced by Eleanor Drage and Kerry McInerney and edited by Eleanor Drage.
