
K Allado-McDowell on technology, psychedelics and healing

In this episode, we speak to K Allado-McDowell, a writer, speaker, and musician. They've written three books and an opera libretto, and they established the Artists + Machine Intelligence program at Google AI. We talk about good technology as healing, the relationship between psychedelics and technology, utopianism and the counter-cultural movements in the Bay Area, and the economics of Silicon Valley.


K Allado-McDowell is a writer, speaker, and musician. They are the author, with GPT-3, of the books Pharmako-AI, Amor Cringe, and Air Age Blueprint, and are co-editor of The Atlas of Anomalous AI. They created the neuro-opera Song of the Ambassadors, and record and release music under the name Qenric.


K established the Artists + Machine Intelligence program at Google AI. They are a conference speaker, educator and consultant to think-tanks and institutions seeking to align their work with deeper traditions of human understanding.



K has spoken at TED, New Museum, Tate, Serpentine Gallery, HKW, Moderna Museet, Christie’s, MacArthur Foundation, MfN Berlin, Ars Electronica, Sónar, and many other venues, and has taught at SCI-Arc, Strelka, and IAAC.


Reading List:





Transcript:


KERRY MCINERNEY:

Hi! I’m Dr Kerry McInerney. Dr Eleanor Drage and I are the hosts of The Good Robot podcast. Join us as we ask the experts: what is good technology? Is it even possible? And how can feminism help us work towards it? If you want to learn more about today's topic, head over to our website, www.thegoodrobot.co.uk, where we've got a full transcript of the episode and a specially curated reading list by every guest. We love hearing from listeners, so feel free to tweet or email us, and we’d also so appreciate you leaving us a review on the podcast app. But until then, sit back, relax, and enjoy the episode! 


ELEANOR DRAGE:

In this episode, we speak to K Allado-McDowell, a writer, speaker, and musician. They've written three books and an opera libretto, and they established the Artists + Machine Intelligence program at Google AI. We talk about good technology as healing, the relationship between psychedelics and technology, utopianism and the counter-cultural movements in the Bay Area, and the economics of Silicon Valley. I hope you enjoy the show.

ELEANOR DRAGE:

Well, thank you so much for joining us today. It's really exciting to have you on the show. What we'd like to know first is: who are you? What do you do? And what brings you to thinking about feminism, gender, ecology, and technology?


K ALLADO-MCDOWELL:

Yeah, well, thank you for inviting me onto the show. It's great to be here. To answer your question, my name is K Allado-McDowell. I'm a writer, a speaker, and a musician. I've written three books with GPT-3 and an opera libretto, and I established the Artists + Machine Intelligence program at Google AI. I've been in creative and generative AI for about eight years.


ELEANOR DRAGE:

I mean, what an incredible selection of stuff... opera, literature, Google. I read one of your books a while ago and it changed everything for me. I found it really mind-opening. So before we get into that, can you answer our big three Good Robot questions, which are: what is good technology? Is it even possible? And how can feminism help us work towards it?


K ALLADO-MCDOWELL:

I think good technology is possible. It's an interesting question, and an interesting way of thinking about technology, to frame it in terms of: is good technology even possible? I think it really reveals where our attitudes are that it feels like a very normal question to ask right now. You know, technology is often not seen as a good thing, yet we're surrounded by it. When it comes to figuring out what good technology looks like, for me the question has always driven me towards healing as a paradigm. So, you know, there are a lot of things technology provides that have a mix of outcomes or a mix of value. There's sometimes good, there's sometimes bad; social media is a great example.


It enables lots of sharing of information and connecting of people. But of course, there are lots of downsides that we've been really living through, especially recently. But medicine, on the other hand, in general is essential and something that humans have had forever. And the idea of eliminating suffering, or reducing pain, or enabling people's bodies and minds to heal naturally, is what drives medicine. So for me, this has been a really important framing: how can I think about technology as something that provides for healing? Because I generally think it's hard to say that healing somebody is a bad thing.

You know, there's complexity, there's a need for a Hippocratic oath, but it's hard to say that healing is bad. So for me, this is a general good. And if I imagine what good technology would be, I would say it would hopefully be healing in the sense that it would be aimed at the things that we already know are wrong, at addressing the things that we already know are not working, or that are making us sick or harming us.


So yeah, I guess I've always tried to approach the work of being a technologist as: how can I apply this in a healing way? That doesn't necessarily mean just medicine or, you know, healing the body; it means social healing or creative healing, or finding ways of taking something that could be producing harm and making it helpful.


So I think harm reduction in general in technology is part of that process. But for me, creativity, self-awareness, these types of things are also really healing, and these are the things that we need, especially right now, given the widely acknowledged mental health crisis. And the way that feminism, ecological thinking, prefigurative social justice thinking, these ways of looking at problems, can help is in providing, at a very basic level, systems for being critical and for imagining. You know, they give us frameworks for taking things that are givens, or things that we're expected to accept at face value, and understanding the motivations behind them.


The big takeaway for me when it comes to thinking about technology critically is: how can I think about it outside of the frame of myself as a plastic-wrapped, hermetically sealed consumer endpoint? You know, how can I be more than that in my life and in my thinking around technology? Because this is, I think, the fundamental flaw in our thinking.

And it does cascade up to patriarchy and systems of power. But the reason it does is because established power and patriarchy, hegemonic structures are benefiting from us thinking of ourselves as alienated, isolated individuals and consumer endpoints rather than as networks of being or participants in an ecology or interrelational subjectivity.


ELEANOR DRAGE:

Yeah, you're totally right. And humans are not hermetically sealed; we're very porous. And so, for that matter, is technology. And I love that when you think about technology, you're also thinking about everything else except for technology. So when we think about good tech here at the center I work for in Cambridge, we also think about what else is good, you know: rest, the opportunity to think ideas through, the environment, all those other things we bring to the fore. And sometimes we actually just push technology away for a little bit so that we can think about those things. I guess that a lot of your ideas are influenced by growing up in the Bay Area in the nineties, when ideas about psychedelics and postmodernism and art were in full flow, but also the tech industry was beginning to radically change the region's politics and its economy. So can you tell us what that was like and how this affects your work?


K ALLADO-MCDOWELL:

Yeah, I have to admit the more distance I have from my own writing and the more time I have to see what I've written, the more and more I come to see myself as a product of the Bay Area, which is, you know, something that I guess I'm proud of, but also has its cliche elements as well.


I mean, the major ideas that have really come out of, or been expressed through, the Bay Area in the last several decades do have to do with technology. They have to do with psychedelics and new paradigms for spirituality. And a lot of this stuff goes back to, you know, the counterculture there.


Fred Turner's done a really amazing job of synthesizing that with the history of Silicon Valley. But what I find really fascinating about that place is the intermingling of these different ideas and the way that, for example, psychedelics as a force for social good gets mixed into technology work, or the way that the classic countercultural pitch of, hey, think about things the way we do and the world will be better...


You know, we're thinking we're creating a utopian society, we're creating a better way of living. This gets mixed into commercial pitches for technology. And even the focus on consciousness-raising, thinking about consciousness: obviously, these are things that have a lot to do with psychedelics and the countercultural movements that they influenced.


But they also have a lot to do with AI. And so, you know, right now we're seeing lots and lots of activity around AI in the Bay Area in particular, and the way that that's getting mixed into the economics of Silicon Valley and the kind of snake-oil, you know, VC pitches, as well as in really interesting and deep philosophical ways.

You know, where people are looking at what AI is doing and asking: how does this reflect on what I am as a conscious being? Is this conscious? Is my use of language conscious? If a chatbot can make sense and answer questions, what does that say about myself? And, you know, there is an emphasis on healing in a lot of these communities that I'm referring to.


And I do think it's an important aspect of Bay Area culture. I guess you might say that these things have all blended together, for better or worse in some situations. So to go back to the nineties, I was in college and trying to find an apartment for the first time, right when the dot-com bubble was blowing up and rents were getting really expensive. I got to experience what gentrification was like, how neighborhoods and cities could rapidly change, and the kind of social tensions that introduced, but also the sense of energy that a new wave of innovation and business created. I worked in a coffee shop, and a bunch of people I knew were getting jobs they never thought they'd have, turning into new careers that were really exciting. And I was in music then too. And it was simultaneously exhilarating for people to be able to share music digitally.


But also I watched the music industry bottom out, and artists go from being able to make a living in an indie rock band that sold, you know, 10,000 records at a time, to not being able to make money as a musician at all unless you were in the very top 1%. So there was that aspect of it. And then there was also, I guess, a convergence of different types of concerns around theorizing technology, or understanding what was happening socially. There were a lot of really heady things going on in the conversation around psychedelics and technology. I mean, Timothy Leary, Terence McKenna, Jaron Lanier, those people were trying to fuse conversations around virtual reality and, you know, AI and the internet with ideas around social organization and a kind of psychedelic utopianism that really does resonate deeply with the ethos of the Bay Area.


And I think it still kind of carries forward, even as a sort of echo or a cast of light over the whole space. You know, big tech is not utopian. But its branding has been for so long, and it's such a deep part of the cultural DNA of the place that it comes from, that it's very hard for us to imagine it any other way.


ELEANOR DRAGE:

I wonder then, you know, you've been talking about subjectivity and the mind, and what impact AI, psychedelics, and the environment have on it. Can you tell us a little bit about that?



K ALLADO-MCDOWELL:

Yeah, the ways that psychedelics influence thinking: there are so many, but there are common patterns, right? One common pattern is realizing that you are one with everything. That's, you know, the classic joke, which we can adapt: what did the psychedelic user say to the hot dog vendor on the streets of New York? Make me one with everything. That is kind of the cliché of psychedelic experiences: you could experience ego death, you might see deeper parts of yourself, depending on how strong an experience you're having. And that idea of becoming aware of yourself as part of a larger system is very cybernetic on a certain level. It has ecological implications, and we can't forget that these psychoactive substances come historically from plants, and even synthetics are often derived from plants, and so they come out of an ecosystem and in many ways represent ecological interactions.

In a sense, you know, you're consuming a plant or a fungus, it has an effect on your mind; this is your mind, in a very material way, becoming one with its environment. It also affects perception and, you know, increases neuroplasticity, allows you to imagine new ways of being. It can often make you want to change how you think about yourself, and this flexibility, these states of revelation or gnosis or perception at higher orders of connectivity within the brain, can be really profound. And an important aspect of all these experiences, whether coming from a therapeutic point of view (we're now having more and more opportunities to frame psychedelic experiences through therapeutic lenses) or coming from a more traditional point of view based in indigenous practice, such as the types of ceremonial practice you would find in South and Central America, is that it's very important to integrate your experience into your own sense of self and your relationship with your community and your environment. And this is where I think a lot of the intersection between the cultural forces we're talking about can go strange or go a little awry. You know, you might have a really profound insight or an idea.

And in the process of integration, if your cultural value system says, well, when you have a good idea you should try to monetize it, and the way that you monetize it is by pitching it to a venture capitalist, then if you go down that road, you may be taking something that was really profound, sacred gnosis or a revelatory experience, and trying to transform it into a capitalist structure.


And this is something that happens a lot, to be honest. So I think there's no guarantee that psychedelic experiences, or even non-psychedelic experiences in meditation or what have you, that produce these inner states will necessarily produce social change that is not capitalist. You know, we live inside these systems.

And so when we integrate, we're integrating our experience within those systems. And I think it's quite helpful for people that are undergoing psychedelic experiences and dealing with the profound changes that can result to have critical frameworks for thinking through, you know, what they're experiencing.


So, you know, right now what we're seeing is the medicalization, legalization, and commercialization of what was essentially a drug war. So the war is being translated into something else. And it may still be a war, or it may also just be a bunch of businesses, but those have different incentive structures; they will use psychedelics differently. And the idea of productivity hacking, microdosing in order to be more effective at work, you know, it's a cliché that people do that in Silicon Valley. And sometimes you might come across somebody who just seems like they're high, and you're like, why can't you remember anything after every meeting?


And it might be because they're microdosing at too high a dose or something like that. But there are also people who are getting really positive mental health benefits out of it within a productivity framework. So, is it good? Is it bad? Is it taking something away from the substance, and perhaps from traditional use? If these substances have any kind of agency as actors within our own minds, how would that become part of it? There are so many questions that we can explore and so many ways of taking the situation apart, and a lot of those I have written about in my new book, Air Age Blueprint, where I really wanted to explore some of that within the context of ecology and AI.


ELEANOR DRAGE:

Yeah, that's super cool and I really recommend the book. I always wonder, you know, about the tech bros who go spend time with shamans but don't actually change the infrastructure of technology; they're not interested in engaging with the kinds of inequalities that they're contributing to with their companies or whatever.


To take psychedelics, to microdose, and just to think about yourself just feels like a waste of what you're doing. But yeah, the jury's out for me about what the benefits are. On that note, you talked about biosemiotics, which seems to be something that I should know about and completely don't.


So can you explain what biosemiotics is and how it influences your writing on ecology and AI?


K ALLADO-MCDOWELL:

Yeah, so biosemiotics is the field that looks at the production of meaning within biological systems and between species. A lot of it comes from Jakob Johann von Uexküll, who in the early 20th century wrote about the relationship between different animals, and how meaning was produced within that relationship and within their own sensoria and their own bodies.


I was in the middle of writing Pharmako-AI, my first book, and thinking a lot about intelligence, how evolution embeds intelligence within an animal form. So, a hummingbird can reach its nose into a flower and extract nectar. And it's evolved to do that. It has a certain amount of intelligence about its environment, which has been embedded into it through the iterative processes of evolution.


And I shared that with a friend and she was like, well, you really need to get into biosemiotics, this is what this is all about. So I read some of von Uexküll's work, and he builds on this idea of the umwelt, which is the inner world, or just the world of a being based on its sensory capacities, and those interact with the capacities of other animals.


So a classic example is a moth with what appear to be owl eyes on its wings. When a snake sees the moth with its wings open, it perceives that there's an owl there, gets afraid, and leaves, and the moth is protected. Now, what's happening there is the moth has evolved to appear to the snake like an owl.


Does the moth know that it looks like an owl? The meaning may reside only in the perceptions of the snake, but it doesn't matter; for the purposes of the moth, it's protected. And this idea of meaning and sense-making happening between different species, in these networks of relation within an ecosystem, seemed really important to me when thinking about AI, because AI is a new kind of sense-making intelligence. You know, to focus on chatbots and large language models, language is embedded inside the system.


We're struggling to understand: does it have a theory of mind? Does it understand the world? And it clearly is able, through language, to produce meaning within us. Does that matter, if it doesn't have it? You know, when you start to look at it relationally and ecologically and think, okay, meaning is being produced in a network.


It's being produced through language. Am I the center of this, or am I just one piece that language is moving through? And, you know, even the question of where the intelligence lives: that question does have to do with, yes, of course, the language model and the network and what it's capable of, but there's also the training data.


And that is a history of thought expressed in language, and there are structures within that, and that happens across hundreds of generations. It's not even something that one person did. So language, in a sense, transcends all of these structures; it's bigger than all of these things. And this is what's so fascinating to me about trying to locate meaning in the relational network of the human-AI connection. And of course, right now we're the only species that's interacting with AI, but that's not the only possibility. Other species could be, and we could be sensing and understanding other species through AI. So in a sense, if you already see the world as relational and ecological, the addition of AI only complexifies, enriches, and entangles all of those existing relationships.


ELEANOR DRAGE:

Yeah, that's really beautifully put. I want to ask you, what does the human mean in the context of AI? Because we get asked this all the time. You know, I was in a meeting with my grant funder last week, and really all they wanted to know is: what does it mean to be human in the era of AI? Or what does writing mean in the era of GPT? Or what does art mean in the era of DALL·E? And actually I like what you do, which is, rather than attempt to answer this question outright, you look at how writing with computational devices puts you at the center of this debate and allows you to, in your words, sense or feel through the change that's happening. So what does that mean? Can you tell us a bit about that process?


K ALLADO-MCDOWELL:

Yeah, I mean, first of all, there's no way we're going to figure it out unless we start doing it, right? So this is, in my opinion, the most practical way to try to answer the question is just to start doing it. And lots of people are having that experience now.

For me, there was a moment of ontological shock, essentially, when you realize this thing is doing something that you associate with human behavior, which is producing meaningful language, or coherent language, even if it's not coherent all the time. This is where maybe the correlation with psychedelics is actually the strongest.


You don't know what it's like until you try it. And it's the same with this kind of writing. You don't know what it's like until you start to interact with it. And it has psychological effects. It has meaning producing effects. It feels different. And that experience changes how you think about writing and about yourself.


And there's no predetermined way that happens, but it is potent enough that it does have to happen. And on some level you will have to engage with this idea of, well, what does it mean for a system that is not a human to produce language? You know, what does it mean that it produces meaningful words?


Does that make it sentient or conscious? Does that make me not sentient and not conscious, if I base my sense of sentience and consciousness on language? And, you know, there are modes of meditative and intellectual inquiry that try to get us to the point beyond language, where we can experience our own consciousness without language.

And there are, you know, ecstatic techniques, there are psychedelic techniques, that will allow you to experience consciousness even without a subject/object division. I know it sounds completely unbelievable to say that, but it actually can happen. So one thing I think it does is it breaks open your relationship with language, at least if you're allowing it to enter your mind deeply.


And this is another way that it is a kind of psychedelic experience. I named my first book Pharmako-AI partly after Dale Pendell's trilogy of books that began with the word Pharmako, which is an encyclopedia of plants, because to me it felt like a psychoactive plant, or it felt like something that was having a psychoactive effect on my mind. And the framework that Pendell uses when he talks about a large number of different psychoactive plants is the Greek word phármakon, which is the idea that something can be a poison, a cure and a scapegoat simultaneously.


And so this is how I think about AI, especially when it comes to writing and thinking with it, that it's something you do let into your mind and it changes your cognition because you've allowed it in. And so it's a bit like consuming a psychoactive substance. It changes your cognition and you have to work with that.


You have to develop new strategies internally, or new ways of thinking, that allow you to process that experience and not get lost in it, to help you understand better what's coming from you, what's coming from it, and to understand also what's coming from neither and is coming kind of from the whole.


ELEANOR DRAGE:

Yeah, completely, it's both poison and remedy. And I wonder to what extent that contributes to both the fear and the fetishization that we have around AI: that we see it both as this terrible, terrible thing, but also something that can really change us for the better, that can fix everything, that can fix the climate crisis at the same time as also contributing to the climate crisis. What's this kind of scapegoat element?


K ALLADO-MCDOWELL:

Well, I mean, the most obvious, basic thing is 'the AI did it', we're not responsible. You know, the most obvious scapegoat is passing the buck to an automated system that can make decisions for you, and then not taking responsibility for that.

But I think that is one of those things we have to figure out. It's why I put my own name on the cover of Pharmako-AI and didn't include GPT-3 in the name on the front. I mean, partly because my name is long and both wouldn't fit elegantly, but also because I really felt it was important to take responsibility for the words as the writer of the words, even if they were generated and I edited and selected which ones to include. The obfuscation of the system, the passing of responsibility onto a faceless system, you know, is treating it as a scapegoat, and it's also dishonest. You know, these are human-made systems. There's this whole entire messy problem of how to make them ethically and how to make them socially beneficial. And, you know, there are so many layers of mediation between the things coming in and the things coming out that I think there is a degree of scapegoating that is quite possible as a user.


And as a designer of the technology, you know, you can simply say, well, this is just what it came out with. It's a black box, we don't really know how it works, it just did this. And to some extent that's true, but there's also a responsibility on the part of people that are putting these things into the world, you know, to put your name on it.

ELEANOR DRAGE:

It's just another step in the 'computer says no' direction. And Kerry and I have been looking at how this is happening with hiring managers using AI tools to de-bias the workforce, and, you know, there's no proof of that. But the point is that you can be like, oh, well, it's an algorithm, it's maths.


So we don't need to do diversity training anymore, or we can kind of throw away those things that we know are really hard work in changing the culture of the organization because we can just get AI to do it instead. Or algorithms used in the States to predict how likely it is that someone that's incarcerated is going to commit another crime.


ELEANOR DRAGE:

It's a really beautiful way that you designed the book. And actually, talking to authors like you about the process of making those decisions when writing with AI, I think, is going to be so important in the future as people increasingly write with GPT, learning the ethics of writing with a technology and collaborating.


You were talking about these moments when the human can have an ego death, or can change their way of thinking so that they're not wrapped up in their thoughts in the same way. You've talked to me before about how the gong is used in meditation. It's used in many ways, in lots of different parts of the world: an incredible piece of technology, the gong, that when struck, you said, gives off a cloud of frequencies and shuts down a certain kind of brain processing. Can you explain a bit about that?


K ALLADO-MCDOWELL:

Yeah. So the gong plays a big part in the opera that I created with the composer Derrick Skye, who wrote the music. When we were planning it... I guess I should back up and say that this opera is a neuroscience study. So we've done a few different versions of it; the most recent was at Lincoln Center in New York in October 2022. And during each one, there are neuroscientists working with subjects in the audience to record their brainwaves and to try to understand what happens during different parts of the music.


And so the music is structured in alternating sections that include, on the one hand, complex rhythm, harmony, melody, voice, and then on the other hand, long sections of gong. And we found that there are significant differences in brainwaves between the sections of music, within the data from each subject.


So, the way I understand what is happening with the gong is that, for example, if you look at the music in the non-gong sections of the opera in a spectrogram, which is a visual representation of the frequencies over time, you can see shapes and patterns, and they generally show the harmonic series of overtones.


The harmonic series is what gives a sound its timbre, its texture. It's based on the physics of vibrating columns of air and vibrating strings: you have a certain set of frequencies that occur in a kind of step-ladder formation that is regular, and you can see those almost like comb structures in the visualization of the waveforms.

But then when you get to the gong sections, it doesn't have that. You just see a kind of noise field or cloud over the frequency range, and what's happening there is you're getting hit as a listener with all these different frequencies at once, and no one frequency is taking center stage.


There might be a deep fundamental that everything sits around, but you don't have the typical overtone series, which is an octave, a fifth, a fourth, a third, a sixth, a second. And what happens when you experience sounds is that your body and mind entrain to specific frequencies. With so many frequencies present, you can't entrain in the typical way.
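For readers who want to see the contrast K describes, here is a minimal sketch in Python, not part of the production or the neuroscience study itself: it compares a tone built from a regular overtone series with a gong-like cluster of inharmonic partials, then computes spectrograms of both. The specific partial ratios used for the "gong" are invented purely for illustration.

```python
# A minimal sketch contrasting a harmonic overtone series with a gong-like
# inharmonic cluster. The "gong" partial ratios below are invented for
# illustration; real gong spectra are far denser and more complex.
import numpy as np
from scipy.signal import spectrogram

sr = 22050                                     # sample rate in Hz
t = np.linspace(0, 2.0, int(sr * 2.0), endpoint=False)
f0 = 110.0                                     # fundamental frequency in Hz

# Harmonic sound: integer multiples of the fundamental (110, 220, 330, ... Hz),
# the regular "step ladder" that shows up as comb-like bands in a spectrogram.
harmonic = sum(np.sin(2 * np.pi * f0 * n * t) / n for n in range(1, 9))

# Gong-like sound: many partials at non-integer ratios, so no overtone series
# lines up and the spectrum reads as a diffuse cloud of frequencies.
rng = np.random.default_rng(0)
ratios = np.array([1.0, 1.47, 2.09, 2.56, 3.31, 4.18, 5.43, 6.79])
gong = sum(np.sin(2 * np.pi * f0 * r * t + rng.uniform(0, 2 * np.pi))
           for r in ratios)

# Spectrograms: frequency content over time, which is what you would plot to
# "see" the comb structure versus the noise-like cloud.
freqs, times, S_harm = spectrogram(harmonic, fs=sr, nperseg=2048)
_, _, S_gong = spectrogram(gong, fs=sr, nperseg=2048)

def loudest_bins(S, n=8):
    """Return the n frequency bins with the most average energy."""
    return np.sort(freqs[np.argsort(S.mean(axis=1))[-n:]])

# The strongest harmonic bins sit near integer multiples of 110 Hz;
# the strongest gong bins do not fall on any such series.
print("loudest harmonic-tone frequencies (Hz):", loudest_bins(S_harm).round(1))
print("loudest gong-like frequencies (Hz):    ", loudest_bins(S_gong).round(1))
```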


So you're either not entraining, or you're entraining differently, and what that produces is a different set of brainwaves, and we're seeing a shift coming out of those sections. So it seems that there's some time that it takes to kind of de-entrain from what was there before and enter into this different type of entrainment.


And then when we come back into the vocal sections, there's a shift in the brainwaves. So this is kind of a rough idea of what we've seen so far. But the gong, you know, as an ancient sound-healing technology, essentially... people figured out how to make those so that it does something so different from what other instruments do.

And that's why I wanted it in this opera.


ELEANOR DRAGE:

That's just fantastic. And I love music so much and it has a really profound effect on me in a way that, you know, like if there's something on in the supermarket, it'll like really affect my mood, but it doesn't happen to everyone. It still signals something, you know, when you go into meditation or yoga or some practice and you hear the gong, you know what that means. So there's this really important symbolic presence of those kinds of instruments as well. And I wonder whether that also has a kind of combined effect?


K ALLADO-MCDOWELL:

Yeah, that's really interesting. That was a big piece of the thinking behind the opera: the idea of ritual. How does ritual influence or play a part in science, in music performance, and in healing? So, you know, I've been very interested in ritual structures and art since I went to grad school.


You know, my thesis was around the idea that the white cube is a kind of vestigial ritual space, where you carve out a special period of time within a specific space, and within that, objects have the ability to transform our subjectivity. So I've been thinking about this for a long time: why does making one space different from another allow certain things to happen? It is a fundamental piece of all of the disciplines that came together for the opera.


You know, scientists have their rituals of eliminating variables, of focusing their hypothesis and defining their process and kind of entering into an experiment. Performers are famously, you know, superstitious with their rituals like I have to turn three times and drink a glass of water and then I do this and then the show is going to go great.


And, you know, who knows why that works. And then, of course, healing rituals exist all over. So the gong is something that denotes ritual. It's actually an insightful read on the use of the instrument; I hadn't thought of it as explicitly a ritualistic instrument, I just love them. But clearly this is something that people associate with that.


And even what we have in the opera, where we alternate between more structured sections of sound and the more diffuse cloud of frequencies of the gong, does create a sort of entry into and out of these zones of entrainment, or the different kinds of entrainment. So that also, I think, is part of rituals: the creation of a magic circle, or a zone, or a place where there is more plasticity, where more transformation is possible, you know, by entering into that.


ELEANOR DRAGE:

Yeah. That's so interesting. Thank you. I think I also just love the low sounds, you know, I've always, my favorite big cat is the leopard, the baritones of the desert, you can hear them growl in a totally different way. It's great.


Well, we've definitely come to the end of our time. I have so many other things I want to ask you. Thank you so much for coming on. For our listeners, Kerry is in America. It's a time zone catastrophe and I'm under a big duvet coat. So thank you so much for having to look at me like this for the last 40 minutes. And I hope we'll see you again soon.


K ALLADO-MCDOWELL:

Thank you so much. It's been a pleasure.




