
Resisting Mental Health Ward Surveillance with Stop Oxevision

Kerry Mackereth

In this episode we talk to two activists, Hat and Nell, from the organisation Stop Oxevision, who are fighting against the rollout of surveillance technologies used on mental health wards in the United Kingdom (UK). We explore how surveillance on mental health wards affects patients who never know exactly when they're being watched, and how surveillance technologies in mental health wards are implemented within a much wider context of unequal power relationships. We also reflect on resistance, solidarity, and friendship, as well as the power of activism to share information and combat oppressive technologies. Please note that this episode does contain distressing content, including references to self-harm.


Stop Oxevision is a campaign to highlight the use of Oxevision, a patient monitoring system consisting of an infrared sensor and camera, which can be used to take vital signs and observe patients remotely. A growing number of psychiatric hospitals in England now use Oxevision in all patient bedrooms, enforcing blanket, 24-hour surveillance without ongoing informed consent or individualised risk assessments.


They are calling on NHS England and individual mental health trusts to halt the rollout of Oxevision whilst an independent review is conducted into the legality and potential risks associated with use of surveillance technology within psychiatric inpatient settings.



Transcript:


Kerry: Hi, I'm Dr. Kerry McInerney. Dr. Eleanor Drage and I are the hosts of the Good Robot podcast. Join us as we ask the experts, what is good technology? Is it even possible? And how can feminism help us work towards it? If you want to learn more about today's topic, head over to our website, www.thegoodrobot.co.uk, where we've got a full transcript of the episode and a sample. We love hearing from listeners, so feel free to tweet or email us. And we'd so appreciate you leaving us a review on the podcast app. But until then, sit back, relax, and enjoy the episode.


Eleanor: In this episode the amazing Hat and Nell tell us about a surveillance technology used in mental health wards. This is also a story about resistance, solidarity, and friendship, as well as the power of activism to share information and combat really gruesome technologies. Just to give you a heads up: in this episode there are mentions of self-harm. We hope you enjoy meeting Hat and Nell as much as we did.


Kerry: Brilliant. So thank you so much for being with us today. We're really looking forward to talking to you. So just to kick us off, could you tell us a little bit about who you are, what you do and what's brought you to thinking about technology and surveillance? So we'll start with Hat.


Hat: Hi, really nice to meet you guys. My name is Hat, my pronouns are they/them. I am here as a member of Stop Oxevision, and I have experience of treatment in quite a few different types of mental health hospitals. And I'm also a [00:02:00] survivor, researcher and campaigner.


Kerry: Fantastic. Thank you so much for joining us and Nell.


Nell: Thank you. Yeah. Hi, I'm Nell, my pronouns are she/her, and similar to Hat I'm here as part of Stop Oxevision, which is a campaign about camera-based surveillance in psychiatric wards. I have my own experience of that, and so my campaigning and work around this is informed by that.


Hat: We started, I don't know, maybe nine months ago now. We're called Stop Oxevision, and the campaign has been about challenging a specific technology, called Oxevision, which is a kind of camera-based monitoring system located in patients' bedrooms.


But although that's the name of our group, our concerns are slightly broader as well, about a wider context of surveillance in psychiatric hospitals. So that [00:03:00] includes the use of CCTV in patient bedrooms and bathrooms, and also the use of body-worn cameras.


So the reason that we've focused a little bit more specifically on Oxevision is that they're the current market leaders.


Eleanor: Thank you so much for coming on. This is a super important theme. It's interesting how there tends to be one technology that's like a hook, and then that draws you into this much bigger, broader context that you need to explore in order to understand the harms wrought. So an odd question comes next, which is: what is good technology?


And I think we can obviously think about that through what is not good technology, but the question is supposed to be a provocation, something quite pithy that draws us towards this possible futurity that we don't yet have. So perhaps, what would be the flip side, the opposite of the technologies we're here to talk about today? What's the [00:04:00] opposite of those things?


What is good technology? Is it even possible? And how can feminism help us work towards it?


Nell: Yeah, I think for me, thinking about that question, it's really important to think about good for whom. Obviously these technologies make a lot of big claims about being efficient and good for patients, good for staff.


But it's really interesting in terms of technology marketing, and it's important to ask who is saying what's good and why. Like, our campaign has relied almost entirely on technology. We are in different parts of the country; we've had to use messaging apps and video call apps to organize, and have needed to connect with people who have different experiences of different technologies across the country.


So I think it's hard in a way, because the system the technology is being implemented into is so [00:05:00] broken, it can feel like we have no choice but to be really reactive. We're campaigning against bad practice rather than even having the time or the space to dream up what good technology would look like for us.


Hat: Yeah, so I think one of the main things that we've been concerned about, in the context specifically of mental health technologies, is how they're used within a wider context of power relationships. One of the things that's really interesting about Oxevision is that it allows sections of video to be saved and used in the prosecution of patients.


So if there was an incident within a patient's bedroom, that can be saved and used against them. But the way that it works means that staff control that power. So patients wouldn't necessarily be able to save footage if they have been harmed by staff. And we've heard that video footage has gone missing when someone's [00:06:00] reported a concern, because it has to be saved within 24 hours.


So often quite a lot of time might have elapsed before someone felt able to report something. And I think that's quite a clear example of how the problem is how the technology is used within those existing power relationships. And we've heard it said that it would be good to have cameras in hospitals so that you can evidence harm from staff.

But whilst it's the staff who have the camera, it's still used over patients and removes power from us. And again, we've heard from people, and experienced ourselves, that if you try to use your own phone to film an interaction, your phone will be removed from you.


But when it's staff who decide that they want to put the cameras in, they have all of the power. So in terms of what makes good technology, we need to consider it within the wider context. And we don't believe that introducing a technology into a harmful environment is going [00:07:00] to do anything other than mean that the technology becomes part of those harms.


Kerry: That's really helpful. And I think these questions around context, but also around being able to be proactive versus having to be reactive to certain technologies, are really indicative of who holds a lot of the power in making and producing these kinds of technologies, and who ends up just being subject to them. But it was also really interesting to hear about your work as activists and how technology has brought you together and made this campaign possible, which I'm incredibly grateful for.


So I wanted to ask you a little bit more about Oxevision, with the understanding that I think this technology is really emblematic of this wider issue of carcerality and surveillance in mental health wards. So one thing, Nell, that you talked about was marketing: how products are marketed and who's saying something is good.


And so I was wondering if you could provide us with a bit of context. What do the people who make Oxevision say that this product does, or how [00:08:00] is Oxevision sold to mental health units? And to what extent do you think that the kinds of claims being made about those products are as foolproof or as sensible as they originally seem?


Nell: Yeah, Oxevision is claimed to make observations more efficient in psychiatric wards. And it might be helpful just to explain a bit: in a hospital, you have to have obs done at least once an hour. So that's a lot of time that staff will take up doing observations.


And obviously some people might be on a higher level of observations. So one-to-one, where they have to have someone with them at all times, or it might be every 15 minutes. But the base level is usually hourly, checking every hour that someone's still there and is okay. Oxehealth, the company that markets Oxevision, basically [00:09:00] say that it allows staff to get on with other things.


So it supposedly allows staff to have more human interaction, more meaningful interaction with patients, because it takes out this element. There's a camera, and there are tablets that staff can walk around the ward with; they can click on a room, see a 15-second live feed of it, and take that person's vital signs remotely. And they can do that more than once. One of our concerns is the 15-seconds thing: it's marketed to patients as, oh, it's only 15 seconds at a time that staff can see this live feed, but they can click that button as much as they want.


As far as we know, there's no system to stop them doing that.


Hat: The main part of it is a camera and an infrared sensor, and so one of the core things that it does is take a video of a patient. The infrared sensor also enables it to [00:10:00] take pulse and respiration rate measurements.


So those are taken when staff remotely take a spot check from the office. The other thing it can do is alert staff when somebody's been in the bathroom for a certain amount of time, or, for example, in a doorway, which might be an area of the room that carries a risk of self-harm or a suicide attempt.


But beyond that, it collects a lot of information. It's got an activity tracker, which will create a graph of the amount of time that someone has spent in bed, and the number and duration of their visits to the bathroom. And that's one of the things people often aren't informed of: that so much data about them is collected.


Eleanor: People don't know which 15 seconds it is that's being taken of them. So you constantly feel like it could be the 15 seconds, right? [00:11:00]


Nell: Yeah, so it's really that you don't know when someone's watching you and when someone's not.


And that's the difference from regular observations, where a human either comes in the door or looks through the window and opens the blinds, and you can see them; or if someone's with you at all times, you know who it is. And you can make decisions based on that, like in terms of when to get dressed and things like this.


So it's obviously quite concerning that people don't know. It's also concerning that the red light is always on, regardless of whether the system is turned on or off. We know that some people do manage to get the system turned off, and then it's turned back on again without their knowledge; but because the box looks exactly the same either way, you wouldn't know, basically.


Eleanor: When someone's in the room with you, you know that they're in the room with you. The problem with these kinds of surveillance technologies is that you never know, so you're watching yourself at all times, regardless of [00:12:00] whether someone is actually watching you. I think it's really important to track how these things affect how you feel.


What do people experience when these technologies are used on them?


Hat: Yeah, so we've had a lot of quite awful stories from patients across the country who've experienced Oxevision, or similar CCTV or other surveillance. And on our website we've got what we've called our 'Tales from the Ward', where a lot of people have shared their experiences.


But what we've heard is that people have found it extremely distressing, that it has made them feel very frightened, very self-conscious, and in some cases has increased their distress. I think for people who've also experienced previous trauma, the use of the system can add to fears around that. And we've heard that people have been too scared to sleep in their bedrooms because of the camera watching them, so people have slept in the [00:13:00] garden, on the bathroom floor, in communal areas, doing whatever they possibly could to get out of view of the camera. And really concerningly, we've heard from people that where this system has been used on them, it's had really long-term, lasting impacts.


Even nine or so months after the admission, it's still impacting them. I think this links to how we're quite concerned that there'll be additional and specific concerns for particular groups. So, for example, patients who are experiencing psychosis, which can often manifest in fears around surveillance anyway.


So what wards will be doing is actually legitimizing their psychosis and likely contributing more to their distress. Again, as I mentioned, the same goes for people who've had traumatic experiences, particularly if those relate to experiences of surveillance or things like that.


We have also seen documents [00:14:00] where it's been clear that people have been observed when they're praying in their room. Or obviously if somebody wears a hijab and then wanted to take that off in their room, they could be seen by somebody else, which could really add to people's distress.


So one of the main things that we've done as part of our campaign is make freedom of information requests to all of the mental health trusts in England. And quite a lot of the policies that we've received from those requests mention cases where the document will say, if you turn the camera on and a patient is masturbating, just turn the camera off again. There's no safeguard there really to know whether the staff member has done that, or what the protocol following that would be. So if somebody was undressed, or they'd been masturbating, or a staff member had otherwise seen them compromised.


We don't know whether the protocol would be for the staff member to go and inform them so they'd know, or whether that would just be left. [00:15:00] Yeah, I think this is the real fear: there are so many different ways that it will impact people based on their personal circumstances.


Kerry: I think this is obviously really horrible to hear and understand. And I can't imagine the huge amount of work it was to put in all these freedom of information requests and also just how emotionally draining and difficult it must have been for you to gather these stories and hear these experiences.


So yeah, thank you so much for doing this work; I think it's so crucial. I was also quite struck by what you were saying about the way that these institutions themselves, and the use of technologies that surveil, can reproduce previous forms of trauma. And this is something I used to think about a lot in my research on immigration detention.


So I did my PhD thesis on Yarl's Wood Immigration Detention Centre in Bedford. And one of the striking themes that would come up [00:16:00] again and again from detainees' testimonies was the way that being in this very carceral environment, even if it was not technically a prison, really brought up huge amounts of trauma for those who had experienced forms of political violence or persecution, or had previously experienced imprisonment.


And so it was quite striking to hear you describe an almost similar phenomenon of people being retraumatized just by an environment that is ostensibly meant to be protecting or helping them, but actually may be bringing up alternative forms of trauma. So I wanted to ask you about Oxevision in that context, because I would be interested to know a little bit more about the origins of these kinds of monitoring technologies.


Do they, for example, come from carceral institutions previously? Do they have any links, with forms of imprisonment and detention? And I guess, what do you think Oxevision itself says more broadly about the role of surveillance in mental health institutions?


Nell: I [00:17:00] think it's really interesting, because Oxehealth themselves used to market the product towards custody suites and prisons. And going back to this idea of how the same technology can be marketed to say something completely different.

In those settings it's exactly the same technology, but they would be talking about it in a completely different way. So, for example, they would give a stat about how much a death in custody would cost the taxpayer per year. Just a completely dehumanizing statement about somebody's life.


But interestingly, since they've tried to pivot towards mental health, it's now much more about co-production, and how patients are really happy to be surveilled in this way, because they say that it improves their privacy and dignity because they're not disturbed by observations.


So I think [00:18:00] it's these kinds of systems. And similarly, obviously, body-worn cameras are used by police, and they've now entered psychiatric hospitals, and it's that kind of carceral system still within a hospital. It's incredibly distressing. A body-worn camera is not on all the time, but it's usually activated, for example, before a restraint.


So a patient will know they're about to be restrained because the staff member has pressed that button on their chest to say that they're activating their body-worn camera. And it's things like this that are incredibly concerning to us: how that technology is being used.

So I don't know whether you want to talk a bit about the health innovation network here, because I think it's really interesting how these private companies are accessing the NHS, and actually accessing public funds, in order to get into the NHS.


Hat: Yeah. So Oxehealth, who make [00:19:00] and sell Oxevision, are a spin-out company from Oxford University. And if you go back to the beginning of the product, it's quite clear that they designed a product and then retrospectively fitted it to a target population. So at the beginning, there were trials of using the technology in neonatal intensive care and cancer services. Then they used it in a police custody cell, and then went to psychiatric hospitals, and seem to have found that was where it stuck the most.


And yeah, I think our concern with that is that a lot of people, though not everyone, who are in hospital will be detained under the Mental Health Act, which by definition removes a lot of your rights. And that has meant that the product has been able to continue. In the majority of the documents that we've reviewed, mental health trusts [00:20:00] have no consent policy in place. Oxevision is used as standard care; patients aren't required to consent to it. And in a lot of cases, as we've heard from people, when they object, they have no right to object.


It's just kept on. Yeah, I think our fear is that by nature of being a group whose rights are removed, psychiatric patients make quite a convenient target population, and we've seen that it has enabled the private company to develop their product and their algorithm further.


They're now marketing this within the US, boasting that it's used in the NHS. They've also done other things with the technology, looking at space tourism. And all of this comes from having such extensive access to patients' data without their consent and sometimes without their knowledge.


As Nell was saying before, the product was initially designed and then won some funding from something called the NHS [00:21:00] Innovation Accelerator, which is a sort of fellowship program that gives money to health technology startups to support them to expand their product and roll it out across the NHS.


But what we have observed, in the case of Oxevision and also in the case of SIM, which was a model that involved putting police officers into mental health teams and all sorts of other concerning things, is that in each of these cases the product appears to have been rolled out without there being evidence, without there being scrutiny, and without looking at the legal implications. In the case of Oxevision, there are significant implications around data protection and human rights.


And in all of this, it seems the time hasn't been taken to consider those implications, or to work with patients to get patient input. Yeah, I think it's part of a broader picture of [00:22:00] how technology is being pushed and promoted but not necessarily built from a starting point of looking at what the issues are and what patients want. Most often, a technology company designs a product and then fits it to possible uses.


Eleanor: Retrofit technologies, and retrofit use cases, are the absolute worst. I bored Kerry about this so much today, but I met a journalist who was working at a company called Dataminr, which was using its tracking software to help journalists identify news trends.


And then the company pivoted towards using that same technology to surveil protests in the USA, specifically BLM. And a number of people were like, we're not happy with this technology being used in this way. These companies are such fuckers; they really don't give a shit about how their technologies are used. And it was interesting, 'cause in some ways I wanted to give these people the benefit of the doubt. Like, maybe they didn't know, didn't really understand the [00:23:00] kinds of contracts that the company was entering into. But I wonder what people like the workers of that company experienced, whether people left, whether people issued complaints.


Like, I really feel strongly about coalition building, so it's not just company vs. activists. There will be people within the companies who will be so useful going forward to help us break these things before the contracts are signed. There's a question I have to ask about gender dynamics.


So how does gender affect these kinds of technologies?


Nell: Yeah, I think gender hugely affects surveillance technologies in specific ways. I think we've already spoken about some of the specific ways that it affects, for example, women who want to be in their room with their hijab off, or women who have experienced domestic abuse and those kinds of relationships.


But there's also a very gendered way that the mental health system works anyway, in terms of diagnosis. For example, a lot of [00:24:00] women and queer people tend to get personality disorder diagnoses, particularly BPD, borderline personality disorder. And we've seen things with Oxevision specifically where it's very much embedding that practice into the ward.


So Oxehealth have these stories from the ward where they talk about how their technology is being used in the best possible way, and some of these stories are really disturbing. There's a case where a woman has self-harmed in a specific way, and the nurse observes her and goes and takes away the implement that she's self-harmed with.

And then, as she's coming round, she seems to have a seizure. Instead of speaking with her, the nurse decides that because this woman has a personality disorder diagnosis, and in the nurse's view has exaggerated illness in the past, she will just remove herself from the room and observe this woman through Oxevision, checking her [00:25:00] vital signs and things.

Rather than being alongside this person. And obviously there are just so many levels to it. It's horrible that someone is having medical treatment withheld, regardless of whether staff think that it's not necessary. But also just that missed opportunity for relational care, for being with someone in their distress.


And it's those kinds of behaviorist approaches that it just seems to really embed in wards.


Hat: Can I add something? I just wanted to talk a bit about the risk of sexual abuse through the technology. So basically, Oxevision is used by half of the mental health trusts, and is also used in child and adolescent wards, so patients on those wards will be from around 11 to 18. Now, the Oxevision technology can be used [00:26:00] by a staff member from the office to view a patient in their bedroom, but the technology doesn't have anything built in which prevents that being misused.

So for example, staff members don't have individual logins. Although there appears to be some way of seeing when the video has been accessed, there is no system at all which would allow identification of which staff member was doing that. So there is a huge potential there, especially within the context of children's wards, for people to use this maliciously, to deliberately view children or vulnerable adults in their bedrooms. Then there's the function of the technology to alert when somebody's been in the bathroom. The alerts are set differently on different wards, but what we've heard is that quite often it will be around three minutes. So when someone has spent three minutes in a bathroom, that could indicate that they are at risk, that they're self-harming in the bathroom and need some support, but it could also just be [00:27:00] that they're having a shower.

So what the technology does is actually build in a function where a staff member could be alerted to the fact that somebody's in the bathroom. And this exists within a wider context of really widespread issues of sexual harassment and violence in psychiatric wards, particularly staff on patients.


So we're really concerned about how the technology itself could be used as a vehicle for those sexual harms, and how that fits within a wider context of the gendered harms of psychiatric wards.


Kerry: And thank you for sharing that with us, because this is, I think, one of the really awful sides of many of these technologies, which is that people often market them as being about safety, when what they really introduce are many kinds of opportunities for abuse and for unsafety.


I know Meredith Whittaker had this argument over [00:28:00] the Online Safety Bill here, around the installation of backdoors into people's databases in order to ostensibly look for child sexual abuse imagery, which is a huge problem. But as Meredith Whittaker, I think, very rightly pointed out, you can't have a one-use backdoor: as soon as you create that opening, all sorts of people and institutions can take advantage of it. And so you have to ask, are you ready for those kinds of harms and possibilities to open up? And unfortunately, I think what you've described around these surveillance technologies being used for the purposes of sexual abuse is a pattern of gender-based harm that we've seen across many different technologies, where they are almost immediately exploited in order to control and harass women, gender minorities, and queer people, because of the existing power relations that we're in. But just to round up, something I would love for our listeners to know, and for us to know, is: what can we do to support [00:29:00] Stop Oxevision? How can we bolster your campaign? What is most useful to the two of you right now?


Nell: I think really it's just: get in touch. Our campaign works on the basis of people sharing knowledge, people sharing skills, people giving a bit of their time or a lot of their time. Just let us know; you might have a bit of information that would be really useful for our campaign. And yeah, we'd love to speak to people.


And you can stay in touch: email us at stopoxevision@gmail.com, visit our website at stopOxevision.com, and follow us on social media as well.


Eleanor: Awesome. You guys are amazing. It was a real honor to speak to you and yeah, we're looking forward to getting back in touch soon. Thanks a lot.


Eleanor: This episode was made possible thanks to the generosity of Christina Gaw and the Mercator Foundation. [00:30:00] It was produced by Eleanor Drage and Kerry McInerney and edited by Eleanor Drage.
