
Neema Iyer on Afrofeminist Approaches to AI Governance and Policy

In this episode, we chat to Neema Iyer, a technologist, artist and founder of Pollicy, a civic technology organisation based in Kampala, Uganda. We discuss feminism and building AI for the world's fastest growing population, what feminism means in African contexts, and the challenges of working with different governments and regional bodies like the African Union.


Neema Iyer is a technologist, based between Frankfurt, Germany and Kampala, Uganda. She works at the intersection of data, design and technology. Through feminist research, she tackles subjects such as digital inclusion, digital rights and civic technology. She runs a podcast called Terms and Conditions, together with Berhan Taye.


Reading List:



Iyer, N. and Achieng, G., 'Afrofeminist Data Futures Report', https://pollicy.org/resource/afro-feminist-data-futures-report/


Borokini, F. and Saturday, B. 'Exploring the Future of Data Governance in Africa', https://pollicy.org/resource/exploring-the-future-of-data-governance-in-africa/


Borokini, F., Nabulega, S. and Achieng, G., 'Engendering Artificial Intelligence', https://pollicy.org/resource/engendering-artificial-intelligence/


KERRY MACKERETH:

Hi! We're Eleanor and Kerry, the hosts of The Good Robot podcast. Join us as we ask the experts: what is good technology? Is it even possible? And what does feminism have to bring to this conversation? If you wanna learn more about today's topic, head over to our website, where we've got a full transcript of the episode and a specially curated reading list with work by, or picked by, our experts. But until then, sit back, relax, and enjoy the episode.


ELEANOR DRAGE:

Today we’re talking to Neema Iyer, a technologist, artist and founder of Pollicy, a civic technology organisation based in Kampala, Uganda. We discuss feminism and building AI for the world's fastest growing population, what feminism means in African contexts, and the challenges of working with different governments and regional bodies like the African Union. I hope you enjoy the show.


KERRY MACKERETH:

Thank you so much for being with us. So could you tell us a bit about what you do, and what brings you to the topic of feminism, gender and technology?


NEEMA IYER:

Sure. So as you know, my name is Neema Iyer, and I'm a technologist and an artist. I am the founder of Pollicy, that's Pollicy with two L's, which is a civic technology organisation based in Kampala, Uganda. We work at the intersection of data, design and technology. We're very interested in things like the feminist internet, looking at the intersection of gender and technology, looking at how data is used or not used, looking at digital rights and data governance: really anything along the whole spectrum of data use, from conceptualising how you collect the data, to collecting it, how it's used, how it's governed, and eventually how it's used in innovation in digital spaces to shape our lives.


ELEANOR DRAGE:

Fantastic, thank you. So we'd love to know your and Pollicy's take on our million dollar question, or really our billion dollar question: what is good technology? Is it even possible? And if it is, how do we work towards it?


NEEMA IYER:

That is such a good question. So what is good technology? I think it's anything that improves our lives without harming people. And you could really think of anything as good technology, like the wheel. Or the bicycle: the bicycle is apparently one of the most empowering inventions for women across the world, and I would say it has very few downsides. Yes, you could get in an accident on a bicycle, but overall it has all the positive effects of getting you from one place to another, improving your life, improving logistics. I think generally that's something positive. You could think about vaccines, too.

I know that this conversation is a bit more in the direction of digital technology, and I think things become very tricky in that space, because all of us bring our own biases to our work and to what we think good is. I like to imagine that, for the most part, when people build digital technology they have good intentions, but we have so many blind spots. And this is one thing that we're trying to do at Pollicy: to make these blind spots more obvious. The very common take is that many software developers, for example, tend to be located in the Global North. Maybe you're in Silicon Valley, maybe you've never been anywhere in Africa, and so it's very difficult for you to understand what our context is like. Even when you make digital platforms, you're basing them on the internet speeds of where you were born and raised, not knowing all the gaps that we have: when your programme takes a lot of data, it's very expensive, it doesn't load properly. A lot of people have older technology, a lot of people have lower digital skills. So maybe when you do your user testing, you're doing it for a certain type of people, and that leaves out a big proportion of the world. You build all your content, perhaps, in English or other white colonial languages that many people can't use. So I like to believe, as I said, that everyone comes in from a positive angle, but we have to be aware that the world is such a diverse place, and ask how we can include as many diverse opinions and life experiences in what we're aiming to build, so that we can all have better experiences in our lives using these technologies.

And of course, there are some very clear-cut bad technologies, right? Where people come in with bad intentions: you're building something for surveillance, or you're building something that's going to cut off people's social benefits. You're building tools that you somehow know can have a negative repercussion on marginalised people. And that's a clear case, where you do know that these algorithms you're building and your biased data sets can definitely target some groups of people. So keeping in mind that there are some very obvious good ones and some very obvious bad ones, we constantly have to strive to make sure that we're moving in the direction of building technology that imparts the least amount of harm.


KERRY MACKERETH:

Absolutely, thank you. And I want to expand a bit more on that topic of harm by asking: what kinds of gendered harms resulting from technology is your organisation Pollicy specifically attempting to counter?


NEEMA IYER:

Right, so for me, one of the big ones, which is not the sexiest of topics when we're talking about more advanced technologies, is generally access. So for example, in Uganda, where we're based, the statistics say anywhere between 30 and 50% of the population has access to the internet. And there is a very large digital gender divide as well, which is very prominent in Sub-Saharan Africa; I think GSMA puts it somewhere between 30 and 40%. So if you look at it, something like 30% of people have access to the internet, and then the gap between men and women is 30%. So from the get-go, many women do not have access to the internet, do not have access to technology. And then what are the harms that happen because you do not have access? You do not have access to information, you don't have access to educational opportunities, to employment opportunities. Who is really getting left behind in these conversations?

So it's kind of tricky where I am, where on one day I'm trying to look at what the benefits and harms of AI on women are, and on the other hand it's like, okay, but we actually don't really have any data on most of these people. How can we build anything that really works for them? How do we even know how they feel about anything? Because in some ways they're really disconnected from very basic conversations, and definitely from these more complex conversations on gender and technology. So it's a very tricky and complex issue.

And then, if we move from that, there's looking at how technologies will impact something like jobs, for example. Women tend to be less skilled, they tend to have less digital literacy. So how does that impact them in terms of how they can earn an income if, as they say, tech is going to take our jobs, and those are the less skilled jobs that might be taken first? So that's the example on the other end, where technology can impact households in those ways: in terms of your economic, your social, your political wellbeing, for example.
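For anyone wanting to see how those two numbers relate, here is a minimal worked sketch of the relative gender gap calculation that GSMA-style reports commonly use. The usage rates below are illustrative assumptions echoing the rough figures mentioned above, not official statistics.

```python
# A minimal sketch of a GSMA-style relative gender gap calculation.
# The usage rates are illustrative assumptions, not official statistics.

def gender_gap(male_rate: float, female_rate: float) -> float:
    """How much lower female internet use is, as a share of male use."""
    return (male_rate - female_rate) / male_rate

male_internet_use = 0.35     # assumed share of men who are online
female_internet_use = 0.245  # assumed share of women who are online

gap = gender_gap(male_internet_use, female_internet_use)
print(f"Relative gender gap: {gap:.0%}")  # -> Relative gender gap: 30%
```

So a population where roughly a third are online overall can still hide a 30% relative gap between men and women, which is the compounding exclusion Neema describes.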


ELEANOR DRAGE:

So what makes these problems so difficult to solve? And going forwards, even with these huge challenges in mind, how can we go about trying to tackle them?


NEEMA IYER:

I think, just generally looking at the benefits and harms of technology, it's so difficult to solve these problems because humans are very, very complex creatures. You could build the most beautiful technology for humans to come together and share their brilliant ideas, and then you just end up with a platform full of trolls. And that's because human beings suck: as amazing as they are in their capacity to create beautiful things, there are also many negative people. And you cannot use technology as a tool to fix what is inherently wrong with society. If people are racist, you cannot fix technology so that they're not racist; you're just putting a band-aid over it. In general, you need to see this change in society, and then see that reflected on platforms.

So, you know, there was an incident just two days ago, and many people blamed technology platforms for not addressing the racism that was very rampant on social media, but you can also only do so much. I do feel for technology companies; I do feel like the task they have been given is very, very immense. And just because of the complexity of humans and of the problem, so many of these issues are so difficult to fix. I don't have the answers, obviously, and I think many people don't have the answers when it comes to things like content moderation, especially. It's interesting that so many of the jobs that exist today did not exist 20 years ago, because these are new problems, and I think they require a lot of collaboration, a lot of working with different communities to understand: what are the root causes? What are the problems? What are the solutions? So going back to what I said earlier, when you think about content moderation, for example, you have to take the perspective of different countries. You cannot do this job alone sitting in Silicon Valley, or in London, or wherever. So it's a difficult problem with no clear solution.


KERRY MACKERETH:

Absolutely. And this actually links really nicely to something we discussed in one of our previous episodes with Anita Williams, who used to work at Google, specifically on these issues of content moderation, but also various kinds of platform abuse. And I wanted to pivot slightly and ask you specifically about feminism and your own feminist approaches. We are a feminist podcast, but feminism also means really different things to different people. So we want to ask: what does feminism mean to you?


NEEMA IYER:

Great question. To me, feminism really means freedom. I think it means the freedom to be who you want to be, the freedom to seek the opportunities that you want, to pursue your dreams. And when you are in different types of patriarchal systems, many, many women across the world do not have any of these freedoms. When you think about the usual definition of equality, I think that's a bit tricky, because you could say that you want to be equal to men, for example, but there are so many differences among men. Just taking Uganda again, there are so many class differences, so many ethnic differences. So when you say something like you want to be equal: equal to which man? Because a man from a lower socioeconomic class has a very different set of opportunities and rights and freedoms compared to a man of a higher socioeconomic status. So I do feel like that definition could be a little bit problematic, because there isn't really equality even among the male gender. So I really look at it as just having this freedom to really thrive and succeed in life.


ELEANOR DRAGE:

I like that definition. And you're completely right: it's so contextual, and it should keep on shifting. One of the things that we love about the show is that we can hear these different, shifting definitions of feminism and equality, what they mean in different contexts, and the work that they do as terms. I wanted to talk about your reports, because you've written these beautiful reports with Pollicy, and one of them is on the use of AI in Africa and how it impacts women in particular. You talked a little bit about that at the beginning of the show. So we wanted to know how you want this important research to be used to make positive change going forwards.


NEEMA IYER:

Right. So we really wanted to do this research because, on the one hand, we feel like a lot of the discourse is dominated by the Global North, which is quite obvious, because there are significantly more resources and significantly more talent there. Across Africa, we do have an issue of getting enough people into STEM, getting enough women into STEM. For example, if you wanted to find a feminist software developer who is also a woman, I think you would have a very, very difficult time.

We're also at a pivotal point where there is not much AI across Africa. And I think this is important, because we've already started to see the negative repercussions of certain AI technologies used in the Global North, whether they're used to discriminate against black and brown bodies, or used for surveillance purposes. We're not yet seeing this in the African context. So I feel like we're at a great point where we can start to reimagine, or just imagine, what we want the applications of AI in our context to look like, before we get into a system of having technologies that do more harm than good. So we really want to look at: if AI, big data, whatever, is being used in our context, how do we implement it in a way that really benefits women, that really benefits marginalised groups?

On the one hand, a lot of governments and civil society are very reactive. So for example, a government will procure a certain type of technology, and then we all react to it: oh, that's very negative. But my question was, how can we be more proactive in saying 'this is what we want', rather than saying 'this is what we don't want'? And it's also a matter of, as I said earlier, we really don't have good, large data sets in the African context to even make use of AI in any productive way. I think the biggest data sets you'll ever come across are telco data, for example, and most of the telcos across Africa belong to foreign companies, whether they're in India or France, some in South Africa. So at this point we can decide: okay, this is the kind of technology we want, this is the kind of data we need, these are the kinds of systems we need to put in place. And we need to constantly audit this whole process. So for example, can we lead with gender audits, where whatever AI technology we bring, we constantly audit it every year to see the positive and the negative impacts on women, on marginalised groups, on minority groups, for example?

So that's really where we want to go with it. And we hope that we can work closely with different governments and regional bodies like the AU, and continue to give these suggestions. Because the fact is, many, many technologies are procured because of deals with foreign countries, and there's a lot of money to be made there in procurement. So can a very small segment of civil society influence these decisions? Maybe, maybe not, but at the very least we need to try. So that's the angle that we're coming in from.
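To make the gender audit idea concrete, here is a minimal, hypothetical sketch of one ingredient such an audit might include: comparing a model's error rates across gender groups in a table of predictions. The column names and the choice of metric are illustrative assumptions, not a method prescribed in the episode or in Pollicy's reports.

```python
# A hypothetical sketch of one piece of a recurring gender audit:
# disaggregating a model's error rates by gender to surface disparate impact.
# Column names ('gender', 'label', 'prediction') are illustrative assumptions.

import pandas as pd

def disaggregated_error_rates(df: pd.DataFrame, group_col: str = "gender") -> pd.DataFrame:
    """False positive and false negative rates per group."""
    def rates(g: pd.DataFrame) -> pd.Series:
        fp = ((g["prediction"] == 1) & (g["label"] == 0)).sum()
        fn = ((g["prediction"] == 0) & (g["label"] == 1)).sum()
        negatives = (g["label"] == 0).sum()
        positives = (g["label"] == 1).sum()
        return pd.Series({
            "false_positive_rate": fp / negatives if negatives else float("nan"),
            "false_negative_rate": fn / positives if positives else float("nan"),
            "n": len(g),
        })
    return df.groupby(group_col).apply(rates)

# Usage: rerun this on each year's predictions and compare groups over time.
# audit = disaggregated_error_rates(predictions_df)
# print(audit)
```

Rerunning a check like this on every annual audit, and publishing the gaps between groups, is one way the 'constantly audit' step could become routine rather than reactive.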


KERRY MACKERETH:

This leads on really nicely to something else, which is another report that you did with Pollicy around Afrofeminist data futures. And we'd love to hear a bit more about this. So could you tell us a bit about what Afrofeminism is, what it's specifically claiming in the context of technology? And what your report on Afrofeminist data futures would love to see or bring about or achieve?


NEEMA IYER:

Right. Okay, another great question. So Afrofeminism, African feminism, basically centres our feminism on our lived experiences as African women. Because if you look at mainstream feminism, I would be referred to as a 'woman of colour', which implies that there is a colourless woman somewhere who is the focus of this movement, right? So from the very words that are used, there's very much this sense of othering of the majority of the world. And as African women, there are some different experiences that we have felt. A very big one would be colonialism: how has that impacted our culture? How has that impacted who we are to this day? For example, maybe in the past we had a culture where, in some places, men and women were on equal footing, or there were different norms. And then we had a very big influence from Christianity, from Islam, which brought a very, very different worldview. So now we're kind of a mix of all these different cultures, there's a lot of rejection of our original culture, and to this day we feel the effects of colonialism in very real ways. And if you then look at things like being judged for your colour, racism, class issues, there's just this different intersectionality based on who you are as an African woman. African feminism really grounds that and focuses specifically on what our needs are as African women.

And this was really something that I wanted to bring to technology, because, as we've been discussing through this podcast, our unique needs need to be taken into account, and historically they have not. That's really where all our research is coming from, where we're trying to say: yes, on one hand, we don't have the amount of funding you have, we don't have the number of technologists that you have. But we're here, we're a large proportion of the world. For example, Africa has 1.5 billion people; we have the fastest growing population. But we're constantly judged by economy: of course, we have very small markets, we have very small spending capacities, so we're often overlooked. But we're still human beings, we still have a very important place, we still have a lot to offer. And it's important for us to give out our perspectives and make sure that they're available, make sure that people can't say there's a lack of information. We really want to put this out there, put our perspectives, put our needs. So that's really what grounds all our work.

And so, similarly, this work on Afrofeminist data futures is about understanding how feminist movements across Africa make use of data in terms of their movement building. Generally, we looked at: how do they use data? What are the challenges? What are the opportunities? And what are the recommendations? It was really fun to put together, because we got to map out many feminist movements across the continent, we got to chat with feminists doing such important, amazing work, and we got to understand what their pains, their frustrations and their hopes are on how data can be used, how it can be used ethically and responsibly, and how you could work with governments to make sure our needs are met. So I really encourage you all to read the report.
It also has some illustrations from me; as I said, I'm also a part-time artist, so I really enjoyed putting my art into the work that we do. And yeah, that's basically what it's trying to achieve: we're trying to put our viewpoints out there, and we're trying to make sure that people see us, that people know our needs, and that people address them.


ELEANOR DRAGE:

So what does it mean to do Afrofeminist work in a civil society space?


NEEMA IYER:

Oh, that's such a loaded question. I feel like you could do a whole podcast on this. So it's interesting: I think it's a mix of waking up one day feeling very optimistic, and then the next day feeling like screaming. Because, for example, civil society is often funded by donors, and as I already mentioned, in Africa we tend to have fewer resources, for the historical reasons already mentioned. So a lot of the funding in this space, especially for work on things like digital rights and data governance, comes from foreign countries, from the Global North. And I think this is the issue with development work all across the continent: oftentimes you don't get to set your own agenda, because you have to follow where the donor funding is. It's very replicative of neocolonial structures, where you have little say in what you do; the funding comes from abroad, and they dictate what you can work on. For example, there was a call for proposals on disinformation, but it was very much grounded in attacking, you know, China and Russia. And it's very strange: that's your agenda, you want to do that, but why are you coming to African countries and asking us to make it our agenda? Of course, it's a very political issue, but it just feels like they bring their proxy fights to third countries. We're struggling with our own issues; we don't have time to fight your fights. So sometimes it's just so infuriating that we're so dependent on these kinds of funding structures.

But at other times, I get to work on projects that I find so, so exciting, and I get to work with amazing people. Every day I learn more about feminism, I learn more about the struggle. And there, again, it's a combination of: wow, we've come this far. I was just in a Twitter Space this weekend. Twitter Spaces are where a number of people can get on and have a conversation, and it's really amazing. I didn't hop onto Clubhouse when it was rising, but I'm totally into Twitter Spaces. And it was amazing to see people having these conversations and really shutting down the misogynists in such a powerful way. And then, on the other hand, you read certain threads about how women are so mistreated to this day, and how we're so chained by patriarchy, by misogyny. Every day is just such an amazing learning experience. I feel like there's so much work that we could do in this space, and that is very exciting for me; there's so much that needs to be done, and that's what keeps me going. So all this to say: it's definitely a mix of a lot of frustration and a lot of excitement that I'm able to do this work.


KERRY MACKERETH:

Yes, definitely. And we're very, very lucky to have you doing this work as well. So we just want to say thank you so much for appearing on our podcast. And for our listeners, we will definitely link to these reports that we've been discussing, so that you can access them easily; they're fantastic. So yes, thank you so much again for coming on the show.


NEEMA IYER:

Thank you so much for having me.

