In this episode, we talked to Dr. Matt Mahmoudi, a researcher and advisor on artificial intelligence and human rights at Amnesty International, and an affiliated lecturer at the Department of Sociology at the University of Cambridge. We discuss how AI is being used to surveil Palestinians in Hebron and East Jerusalem, both in their bedrooms and in their streets, which Dutch and Chinese companies are supporting this surveillance, and how Israeli security forces have been pivotal to the training of US police. We also think about creative resistance projects like plastering stickers on cameras to notify passersby that they're being watched.
Matt Mahmoudi is Researcher/Adviser on Artificial Intelligence & Human Rights at Amnesty Tech, where he has spent the last two years leading the effort to ban facial recognition technologies. He is an Affiliated Lecturer at the University of Cambridge. Mahmoudi is co-author of the book Digital Witness, published by Oxford University Press.
Reading List:
Resisting Borders and Technologies of Violence: https://www.haymarketbooks.org/books/2094-resisting-borders-and-technologies-of-violence
Digital Witness: https://global.oup.com/academic/product/digital-witness-9780198836070?cc=us&lang=en&
Automated Apartheid: https://www.amnesty.org/en/documents/mde15/6701/2023/en/
Transcript:
KERRY MCINERNEY:
Hi, I'm Dr. Kerry McInerney. Dr. Eleanor Drage and I are the hosts of The Good Robot Podcast. Join us as we ask the experts: what is good technology? Is it even possible? And how can feminism help us work towards it? If you want to learn more about today's topic, head over to our website, www.thegoodrobot.co.uk, where we've got a full transcript of the episode and a specially curated reading list from every guest. We love hearing from listeners, so feel free to tweet or email us. We'd also so appreciate you leaving us a review on the podcast app. But until then, sit back, relax, and enjoy the episode.
ELEANOR DRAGE:
In this episode, we talked to Dr. Matt Mahmoudi, a researcher and advisor on artificial intelligence and human rights at Amnesty International, and an affiliated lecturer at the Department of Sociology at the University of Cambridge.
We discuss how AI is being used to surveil Palestinians in Hebron and East Jerusalem, both in their bedrooms and in their streets, which Dutch and Chinese companies are supporting this surveillance, and how Israeli security forces have been pivotal to the training of US police. We also think about creative resistance projects like plastering stickers on cameras to notify passersby that they're being watched. We hope you enjoy the show.
KERRY MCINERNEY:
Um, amazing. Thank you so much for joining us here today. Uh, you're someone I've really wanted to have on the podcast for a super long time. So could you tell us a little bit about who you are, what you do, and what's brought you to thinking about gender, race, and technology?
MATT MAHMOUDI:
Yeah, thanks for, for having me on. Um, so I am Matt Mahmoudi. I am a researcher and advisor on artificial intelligence and human rights at Amnesty's Technology and Human Rights Project, and I'm an affiliated lecturer at the Department of Sociology back at Cambridge. Um, I am basically working on hunting bad tech companies for a living.
Um, so I look at the ways in which technologists provide, uh, products to governments and to various other institutions that are then often used, ostensibly in pursuit of security-oriented goals, but really towards the oppression and discrimination of historically marginalized communities.
And whilst a lot of this work has historically, at least within the NGO sector, been focused a lot on sort of like the negative outcomes of various forms of artificial intelligence, be they along racial lines, gender lines, lines of sexuality, my interest to this comes more from a structural, uh, point of view.
And so I'm more interested, um, from my own research, in the ways in which racial capitalism, which underpins the way the tech sector itself was set up to start with, has led to the forms of discrimination that have often been associated with the sector as sort of a side effect, or as the sector having gone rogue, but really thinking more deeply about how those are actually, uh, design features. They're not bugs, as it were. And my foray into this comes from the migration angle. So specifically how communities that are mobile, communities that are made stateless in other ways, are subjected to the use and discriminatory aspects of these technologies and, sort of, stuck into a place in which value can be extracted from them, again ostensibly in search of security-oriented goals, but really for profit.
ELEANOR DRAGE:
Well if any of us thought we were underachievers, now is the moment... I sat next to you during your podcast and you also have those beautiful books where you wrote all these lovely notes and I just thought, oh my God, I'm definitely a second-tier academic. Um, so I'm so excited for you to give your take on our good robot questions. What is good technology? Is it even possible? And how can feminism help us work towards it?
MATT MAHMOUDI:
Right. Okay. Um, so I'm gonna start with the second part of your question in relation to what is good technology, so, is it even possible, if that's okay? Um, so yes, good technology is possible. A different world is always possible, and it's the moment we start losing sight of that that contemporary systems of technology-augmented global racial capital take full hold of our imaginations.
So good technologies, in the way that I understand them, are tools that are agnostic of the sort of fantasies around efficiency, profit maximization, carcerality. Instead, they create infrastructures that redistribute resources equitably and advance a just society. With that said, I'm not convinced that these technologies really exist today.
Of course, there are sort of admirable applications of technologies that are being used to resist hegemonic power structures. So in my own work, for example, I've looked at, uh, Syrian women who have organized entire women's-only newcomer Wikipedias, as it were, through social media groups, um, which in turn have helped upset power dynamics not only between local authorities who are purporting to provide services to refugees and the refugees themselves, but also between Syrian men and women. And this has often worked in profoundly productive ways. But I've also seen how groups resisting domestic and state-sanctioned surveillance have plastered stickers on cameras in smart cities, um, notifying passersby that they were being watched, right? So there's different ways in which we see technologies being used in resistive ways.
But these applications from where I sit are rooted in the struggle against global apartheid and racial capitalism. And so until we turn a corner on these structural realities, we're in resistance mode, as it were and as we should be. Um, but I'm conscious that the aspiration is to be in a place of building, a place of imagination and a place of creativity.
And it's harder for me to imagine that we are in that place today. As far as the question around how feminism can help us move towards it, you know, feminism, especially intersectional feminist scholarship rooted within the Black radical tradition, is not a new player to this. Feminists have long laid the groundwork for growing a different world rooted in internationalism, solidarity, equitable infrastructure, and more. From neoism to xenofeminism to sort of the new framing around the New Jim Code, feminist activist scholars in particular have long provided by far the most compelling groundwork, I think, for why we struggle and how we get from our technical status quo to a different reality, and they have laid the groundwork not just for dystopian thinking and sort of critique, but also for how we turn that critique into a utopian world. Most recently, a book from a critical race and digital studies scholar that we'll all be familiar with, Ruha Benjamin, Viral Justice, tried to do exactly that, which is to look at some of the ways in which small forms of equitable, just societies were forming at sort of the beginnings and the height of the pandemic.
And the ways in which they were challenging some of the dominant structures that were leading to them weathering various forms of injustice and inequality, and thinking about how those efforts need to start now in smaller but viral ways to scale up to something larger, eventually.
KERRY MCINERNEY:
Oh, that's so interesting to hear you reflect on these questions, because I feel like, since so much of your work is in that resistance space and is on mapping and tracking how technologies are used in really carceral and oppressive ways, we don't often get the opportunity to talk about these more, you know, hopeful liberatory possibilities.
And, you know, I also love Ruha's work and just see her as someone who really lives out those forms of small viral justices that she talks about as well, just in her everyday interactions with people and the interactions I've had with her. So that's just, I think a really special and important thing in a scholar.
Um, I also love the idea, though, of, like, setting different technologies against each other: the technology of the camera versus the technology of the sticker, such a simple technology in opposition and yet very effective in some ways. Um, I actually wanna pivot, though, to the huge piece of research you recently put out with Amnesty Tech, which is, um, a report called Automated Apartheid, on the use of facial recognition technology and surveillance in the Occupied Palestinian Territories.
So that's what the bulk of our conversation today will be about. So to start us off, could you tell us a bit about the history of surveillance in this region?
MATT MAHMOUDI:
Yes, of course. So what we are seeing here is really the latest in not just a slew of technological forms of surveillance, but really practices of surveillance that harken back to British imperial, British colonial rule, um, under the British Mandate in Palestine. And here, you know, various rosters, uh, information books, diaries, databases were created, keeping stock of the various different communities, um, of villages unnamed, at least to the eyes of the British, of the various forms of resistance that were happening at the time, taking stock of individuals down to what they looked like and, and really down to their propensity to engage in violence and resistance. And those forms of, you know, tracking
are really the basis for how the state of Israel emerges in the first place: a sort of empowered state, um, that is capable of using these earlier forms of tracking and surveillance to also perform aspects of its apartheid policies against Palestinians in the Occupied Palestinian Territories.
And, and again, even this is considered a relatively modern form of surveillance and colonial tracking. You might go as far back as the British invention of the census in British India at the time, um, to really get a sense of, you know, how the metropole was trying to translate, as it were, the quote-unquote periphery to various members of the aristocracy, to sort of, uh, give an illusion of control over its colonial subjects.
And so the census was started in an effort to keep track of who people were, um, who was occupying various places in the power echelon, and the ways in which you might be able to incentivize them to collaborate, et cetera. And so again, these enumeration strategies, as they're often referred to, that form much of the colonial imaginary aren't just something that we're witnessing today.
It's something that goes far, far further back. It's just that now we're seeing it scaled up in a different way. Now, instead of having ID cards showing your ethnic and racial attribution, as you might have had in the case of Rwanda or apartheid in South Africa, you're seeing those things inscribed into our bodies and our biometrics, in the ways that they are extracted from our faces using facial recognition or extracted from our fingerprints, based on sort of notions of what matters and notions of who occupies what place in the digital racial hierarchy, as it were.
ELEANOR DRAGE:
Let's pull out this surveillance thought a little bit more and look at Palestine and the way that it's always been surveilled, but now that surveillance is exacerbated through the use of facial recognition. One of these programs has been called Red Wolf. So can you tell us what it is and what it does?
MATT MAHMOUDI:
So the Red Wolf system is a system of facial recognition that is operating at checkpoints in the occupied Palestinian Territories. And in particular, I should say that the only place in which we've been able to investigate this, uh, somewhat fruitfully has been in Hebron and in particular in H2.
So what's important to understand is that, uh, Hebron is divided into two sectors, um, H1 and H2, where H2 falls effectively under the rule of the civil administration, which falls under the state of Israel, where there is a military presence and occupation, and where the 33,000 Palestinians that live there, versus the 850 illegal Israeli settlers, are effectively kept in place within their neighborhoods and subject to stringent movement restrictions. That's part of the argument for why this is considered apartheid in the first place, given the sort of arbitrary restrictions on the freedom of movement.
The way that Red Wolf works is that the system operates at checkpoints that Palestinians need to pass through in order to access things such as education, work, medical services. Oftentimes you'll have situations, which we heard about through our testimonies, in which, say, families that need to get access to an ambulance have to first pass through a checkpoint; the ambulance can't come through to their neighborhood, it has to wait outside the checkpoint, and they may find themselves in a situation in which the soldier doesn't let them pass unless they're friendly with them, or for whatever reason happens to know them and act in good faith. By introducing an algorithm at the checkpoint, you're removing any form of that sort of human leniency that might have existed in the first place. So what we're witnessing is a Palestinian coming into the checkpoint, which is catered only to Palestinians, it's not for use by Israeli settlers, whose face is being scanned the moment they walk in through a turnstile, and a light indicator shows up on a computer screen in front of the operator.
It might go green, in which case you're lucky, you can pass through. It might go yellow, in which case oftentimes there's either a bug or there's simply no clarity around whether they should let you pass. And if it goes red, which is what you're hoping it doesn't, you're dealing with a situation in which either you haven't been biometrically registered yet, in which case the soldier will lock down both turnstiles and you will be coerced into producing an ID to be matched with an image that was taken of you in the first place, so as to train the system to be able to pick you up in future, or you may have been scheduled for detention, for questioning, or for whatever else that you are unaware of. But crucially here, people's faces are registered into the system without their knowledge and consent. Folks rock up to the checkpoint, they have no idea how they're known, how they're being recognized, they don't know the soldier, but they're being addressed by their name and their details. So this system feeds into a much, much larger database that we believe was curated using the Blue Wolf system, which is a system of app-based facial recognition that was first documented in November 2021. That system effectively empowered soldiers to walk around and take as many pictures of Palestinian faces as possible, and they were incentivized to do so. You know, military units were ranked by how many faces they'd captured on any given day and were given prizes, including gift cards and days off. Um, that system helped create the kind of biometric database that underpins much of the surveillance system in Hebron in the first place.
The reason why Red Wolf is so particularly problematic or so nefariously different to, to what we might have seen previously is the ways in which it reinforces those restrictions on the freedom of movement and subjects movement to an algorithmic calculation.
ELEANOR DRAGE:
It's really chilling. And out of interest, who's providing the kind of infrastructure required for something like Red Wolf to function? Is this something which is very much being created and enforced by the Israeli state, or is it the result of different kinds of, say, international partnerships?
MATT MAHMOUDI:
So there has in the past been documentation of which companies have been providing software-based facial recognition to the state of Israel. However, as far as the Red Wolf system goes, it's all behind closed doors. We have no idea who actually develops the software. We just know that it exists.
We know from the soldiers who spoke to us and provided testimony, and we know from the Palestinian families, activists, and scholars who move through those checkpoints and talk about their experiences. We do, however, know that these systems require physical infrastructure in order to operate, right?
So they require cameras, and these don't need to be the world's fanciest cameras. They can be anywhere up to five to ten years old; as long as they're networked and provide a sort of facial capture, that is, they're cameras with a high enough resolution, they're able to plug into these systems. And so in places like East Jerusalem, for example, with a different facial recognition system, we were able to document Hikvision cameras, Hikvision being a Chinese surveillance manufacturer. Uh, we also found TKH Security, which is a Dutch surveillance manufacturer. In Hebron, most of the cameras that were placed outside of the checkpoints were stripped of any branding, and so it was very difficult to identify exactly what company was behind them. However, they looked awfully similar to a lot of the other companies' products that we'd identified elsewhere.
So that's to say that these sort of infrastructures of violence, these infrastructures of technology-augmented apartheid, don't come into existence just out of the blue one day when the Israeli state decides to develop the software, right? They come on the back of the, uh, provision of products, of CCTV cameras, from companies outside of the state of Israel, um, who are effectively at risk of being complicit in the international crime of apartheid unless they find ways of moving their products out of the area.
KERRY MCINERNEY:
Yeah, I think that's really important, to think about what kinds of infrastructures, but also what kinds of collaborations, are necessary to make these surveillance regimes possible. And I guess I was also wondering about the extent to which international collaborations were involved when it came to data ownership.
So, you know, the kind of second and third order effects of not only the immediate chilling effect of the surveillance and the way that it constrains movement, but also then what is happening with the huge amount of data that is then being collected.
Um, so I wanted to ask you a little bit more about, uh, the kinds of experiences you heard about when you were doing the research for this really comprehensive report, because you conducted a lot of interviews with many Palestinian families, activists, students, and experts across Hebron and East Jerusalem.
So could you share some of the stories that they told you about how surveillance is shaping and affecting their everyday lives?
MATT MAHMOUDI:
Yeah, so a lot of the stories really relate to something we might take for granted, something as mundane as, you know, just access to any form of associational life. We had a number of testimonies referring to the idea that in Hebron, surveillance had killed any form of social life.
And that refers to really moments in which somebody might be crossing a street to go grocery shopping and find themselves effectively looking at the barrel of a gun, uh, unless they comply with having their face registered within a facial recognition system, in this case Blue Wolf.
Uh, we have instances of families who spoke about having to move weddings and other ceremonies that were of cultural significance to Palestinians away from H2, or just choosing not to participate, even if it was their immediate relatives, such as their sons and daughters, because they didn't wanna subject them to possibly being stopped and held within the checkpoints, or be stopped and held themselves, and therefore having to resort to simply cutting ties. As far as other experiences are concerned, we found sort of chilling, uh, impacts on the ability to exercise protest and dissent. So in Hebron, a group known as Youth Against Settlement was effectively put under house arrest, and any form of attempt to move outside of the building from which the group operated was met with cameras outside the building being turned inwards towards the building from which Youth Against Settlement were operating. And they were effectively saying that they didn't know whether they could walk out freely, how, uh, the footage of them walking out might be perceived, even if they were just crossing the road to, uh, chat with a neighbor or meet a family member or have dinner, as it were. Um, in many instances, in both Hebron and East Jerusalem, we have documentation of especially women speaking to the idea of being unsafe in the home because of surveillance, of cameras turning inwards towards the bedroom and towards their homes, um, exposing very private aspects of their lives.
It had a profound impact on how easy and comfortable women could be, especially with regards to practices such as, um, not wearing the hijab. Uh, in some instances we had, uh, individuals speaking to basically taking up the task of wearing it inside their home, where they might not have previously done so, because of how the cameras were turned inwards towards their homes. Um, in East Jerusalem, what's important to understand is that following the evictions of some seven Palestinian families in Sheikh Jarrah, in East Jerusalem, uh, there was a protest in which many Palestinians came out and tried to resist and call for an end to Israeli apartheid against them.
Now, in this context, there was a lot of backlash, a lot of violence, and the state of Israel has, since that, um, event, uh, deployed more CCTV cameras around Sheikh Jarrah, around Damascus Gate, around Al-Aqsa Mosque, around, um, places like Silwan, which are all neighborhoods and areas that are of profound significance to Palestinian communities. And so even for as mundane and touristy an act as going to Damascus Gate and having a cup of tea or coffee with a friend, we had students speaking to how they couldn't do so without having to run a calculation in their head around whether they might be registered, whether there would happen to be a protest that day, whether they might be associated with it just by virtue of walking into the same general area.
So I think that speaks to how, you know, surveillance isn't just a problem at the checkpoint. It's creating a chilling effect and effecting a coercive environment that is intended to make Palestinians uneasy and effectively intended to force them out of areas that are of strategic interest to Israeli authorities.
ELEANOR DRAGE:
And you can see this trend across the globe in different ways, that the border is no longer where we thought it was: it encroaches in on us, it enters our bedrooms even, in that example that you gave.
I wanted to ask about what it means that Israel is becoming this AI superpower, and generally the militarization of AI is something that we look at. Um, it's something that is very strong here in the UK even. And particularly because Tel Aviv is this prominent tech hub, uh, the Israeli defense minister is now a spokesperson for AI, saying that it's gonna become a beacon of what Israel is creating and giving the world, and will create also autonomous weapons and streamline combat decision-making, all the stuff that I think people are generally a bit scared about.
How will that affect Palestine and also how will that affect the world?
MATT MAHMOUDI:
That's a huge one to unpick. So, so I think it's worth noting, first of all, that the state of Israel's justifications for deploying AI against Palestinians in the first place have tended to be under the auspices of protecting what is considered, under international law, the presence of illegal Israeli settlers in the Occupied Palestinian Territories.
And here the usage of AI-driven surveillance is effectively in place, or is claimed to be in place, to ensure their safety. However, under international humanitarian law, of course, the state of Israel has an obligation also to ensure that the people being occupied are safe, but never mind that they're shirking that responsibility. Given that failure, the only security justification that the state of Israel has, uh, legally speaking, is to ensure the safety of illegal Israeli settlers as they move out of the area that is being occupied. But they're not doing so; they continue the deployment. So again, I think we can very easily dismiss the idea that these technologies are being deployed for security, also because they do not apply at large, they just apply supposedly to Israeli settlers. Now, as far as AI goes in general, I don't think it's a secret at this stage that the Occupied Palestinian Territories, and in particular Hebron, with this sort of smart city initiative and smart city project, has become a bit of a lab for the development of security products, whether it's surveillance or whether it's sort of military products of other kinds.
And this fantasy that's rooted in AI-driven violence, in the exercise of and efficiency in the use of AI-driven violence, is the latest iteration of what has been a slew of different, um, products and techniques and tactics that have been developed over the past many years.
So Jeff Halper has often spoken about this idea of global Palestine and this idea that, you know, what happens in in Palestine goes elsewhere. And this has been pretty clear, for example, in the ways that Israeli security forces have been pretty pivotal to the training of, for example, US police personnel in the past.
And there's an exchange of practices and tactics and strategies there, and certainly we know of US technology companies, such as Cubic Technologies amongst others, that provide, like, transport services in the US, but as far as the state of Israel goes, they provide military simulation software for the use of Israeli security forces.
So there is an exchange there of ideas, of capacity, of tools. And my greatest fear is, never mind the security justification around Israeli settlers in Palestine, that this sort of pursuit of becoming an AI-dominating power will lay the groundwork for the further dehumanization of Palestinians and Palestine, but also globally, that we don't think about the ways in which our experience of surveillance here in the United Kingdom, or in New York for that matter, or in India, is connected intimately with experiences of violence, surveillance, restrictions on the freedom of movement, and apartheid in Palestine. Because they are: every time you see a camera that is operated by TKH Security in New York or in London, you're effectively witnessing the enablement of an economy that also maintains apartheid in Palestine. And every time we think about facial recognition just in the context of allowing or not allowing protest, which is not to downplay the importance of protest, it matters tremendously, but if we think about it just in those terms, and if you're someone who's not participating in protests and finding yourself saying, well, that's not gonna affect me, so whatever, let them roll it out, then you're not thinking about the ways in which that technology could be used to keep you within your home indefinitely, under security justifications, under auspices of a threat outside, or that you might even be a threat. If we don't go to the logical conclusion of what the worst possible violent, um, extreme could be, simply by looking at Palestine, then we're on a slippery slope that is going to lead to the erosion of our rights. Palestine should be a reminder of that, but I think it should also serve as a bit of a productive lead, insofar as we should know that by calling for an end to the sale and development of products that are being used to uphold apartheid in places like Palestine, we're also securing rights elsewhere. These things, again, are intimately connected, and by stopping the pipeline in one place, we could stop the whole thing.
KERRY MCINERNEY:
Absolutely, and I think this is something that's so important for us to do: to think about how, historically, one of the manifestations of colonialism and colonial violence has been the transformation of colonized ground and colonized peoples into testing grounds. And Eleanor's heard me talk about this, I think, on previous episodes of the podcast; you know, I think about this in relation to nuclear testing in the Pacific, how the so-called Cold War was in fact, you know, hugely devastating for the peoples and the places that were transformed into these completely devastated and toxic areas, um, due to quote-unquote great powers using them as testing grounds for weapons.
Um, and I think, you know, I really value your analysis around Palestine becoming this testing ground for the development of horrific military technologies that are used not only to continue to surveil and control Palestinian populations, but that also have broader ramifications for human rights worldwide.
I do want to pivot slightly here to think about, again, this kind of question of commercialization, AI, and surveillance. Um, 'cause again, I think your reflections here are so on the money in terms of how surveillance goes beyond the act of the protest, even though that's incredibly important, and Eleanor does really phenomenal work on protest recognition software and the impact that can have on democracy.
But yeah, how surveillance can kind of affect the way we move through the world. And I think one of my favorite speculative pieces on this was a game where, uh, the kind of conceit was that, instead of it being like a classic stay-out-of-sight game, you lived in a dystopian state where you always had to be in sight of a camera.
And if you stepped out of the kind of view of the camera, you got taken away. And the idea was, you know, if you have nothing to hide, maybe that's what the game was called, Nothing to Hide, like you shouldn't have a problem having to stay in sight of a camera. And yet, whenever you move through this world, you are having to really shape every part of your kind of virtual body to make sure that you are always in sight.
Um, but I'm getting off track anyway, so this anonymous game can go to the side for now. I actually wanted to take you now to a US company, because we first met at Cambridge when we were both thinking a lot about issues with migration, and one of the companies that of course always comes up in this space as a kind of particular bad actor is Palantir, which plays a very, very significant role with ICE and kind of US border enforcement.
So I wanted to close the podcast by asking you about a recent announcement from Palantir. Unfortunately, I think just about last week, Palantir said they're doing this big pivot to AI-powered technologies, and they released an open letter on their blog saying they're planning to release their new artificial intelligence platform.
So what are the major risks you see arising from Palantir's deployment of AI?
MATT MAHMOUDI:
So what is staggering to me about Palantir is how companies like them continue to persist. Various product tests, and even what we've seen in terms of the kinds of negative outcomes and real ineffectiveness of their tools in the past, show not just a scary, you know, technology company that is bringing together data sets that wouldn't be brought together under other circumstances to perform wide-scale surveillance, but also the ways in which these products really don't work.
These products don't work, and yet the company continues to persist and exist. And, you know, the same goes for other glitches of AI that we've witnessed in the past few months alone, this sort of ramping up of ChatGPT and other GPT-based apps that have emerged, where really what you're seeing is, yes, incredible things that can be done if they're basic enough and small enough, but when you start paying attention to places where it matters, along the lines of representation, identification, transparency, really these systems fail. They fail to work in the way that they were perhaps intended, or they work exactly how they were intended. All to say, they're snake oil for the most part.
However, the reason why I see Palantir persisting, and the reason why I'm worried about the latest announcement in particular, is because Palantir and companies like them, including the sort of OpenAIs of the world, provide their technologies oftentimes for near free, right? Uh, we're talking about providing contracts to governments for sometimes just a pound, sometimes for nothing. Clearview AI did the same, rolling out their advanced facial recognition suites effectively as a testing or a pilot suite to various law enforcement agencies, under the sort of, uh, prediction that if enough agencies buy into the usage of these tools, they'll at least have the kind of symbolic or sort of techno-fantastical capital, what my friend Alex has previously referred to as magic capital, to be able to sell their products on for larger amounts of money in future, even though they're speculative and even though they don't work. It's just that enough agencies have used them, and there's nothing currently to suggest that any of their products have worked in any way, uh, according to the sort of mission set out by the various agencies that have used their tools.
Yes, Palantir was used to devastating effect to, amongst other things, arrest some 800 factory workers in Mississippi, in broad daylight, whilst their kids were in school, for detention and deportation. Yes, Palantir claims that its tools can be used to find trafficked individuals, even though they have no data to prove in any way whatsoever that their technologies have been used in any positive way towards finding these particular individuals in question.
But the far more concerning element of this is that, despite these failures, because of their ability to provide a sort of affirmative vision for what the technology will be able to do in future along the lines of AI, there is a very, very high likelihood that governments will buy into this, that governments will contract with Palantir, that Palantir will provide them with initially cheap contracts and later on very lucrative deals, for products that are known to either not work or to have devastating outcomes on targeted and historically marginalized communities. The Palantirs of the world should be dying out. If the technology sector operates by its own rules and logics, then technologies that don't work, that don't serve the goals they have been imbued with, should go extinct, right? That's one of the rules of the so-called free market logic under which they operate. But they don't. They stick.
ELEANOR DRAGE:
Yeah, it's tragic. And what I find particularly interesting about Palantir is that I often meet people who work in different divisions who have no idea that this stuff is going on. And so they do a very good job of decentralizing their business strategy, and that makes it hard for there even to be dissension from within.
I'll meet Palantir people and they'll often just have no idea, you know, and you are in a drinks thing or whatever, and they seem quite sweet. And you think how?
MATT MAHMOUDI:
Precisely. I think, like, to that point, if I may, Eleanor: um, we, as a part of the No Tech for Tyrants collective, confronted, uh, the UK government lead of Palantir at the university with the numbers of, um, Mississippi factory workers, considered to be undocumented immigrants, who had been deported, uh, off the back of the use of the Falcon tool, which is made up of various Palantir tools. And they didn't know about it, and had just spent about 15 minutes allegedly talking about privacy and civil liberties, only to then veer into why safety really matters to Palantir.
And they do take those considerations into account, they said, but when prompted with these numbers, they very quickly pivoted to saying that the safety of the laws matters to them, not the safety of the people. And so again, it's sort of this loophole culture of: as long as the law allows it, it doesn't matter what's violent, it doesn't matter what's eroding human rights, as those international standards and norms of ethics don't really apply.
ELEANOR DRAGE:
And if you want to read more on how the free market logic is performative, then there's a great book called Do Economists Make Markets? The "market decides" logic irks me so terribly.
Thank you so much for coming on today. It was such a pleasure.
MATT MAHMOUDI:
My absolute pleasure. Thank you for having me. I am a huge fan of the pod.
ELEANOR DRAGE:
This episode was made possible thanks to the generosity of Christina Gaw and the Mercator Foundation. It was produced by Eleanor Drage and Kerry McInerney, and edited by Eleanor Drage.