Conversational power

Up till now the voice versions of “AI” have given me the same irritated feeling I get while listening to an automated phone menu. I feel frustrated or impatient listening to the voices of things like Alexa or Siri. I don’t trust them on some fundamental level.

The other night I watched a video clip with Danny, where someone asked ChatGPT to chat with them in a Cockney accent. I had watched it earlier and thought, Huh, that’s convincing, it sounds very much like Danny’s family. When we watched it together I saw his face go through a very complicated sequence of emotions. It was just wild.

Then the next day I tried asking it for a chat in an Italian-American Rhode Island accent. It answered, “Sure,” and with that one word I felt my face do what Danny’s did the day before. I felt surprise, shock, fascination, fear, vulnerability. The short paragraph it then generated was a normal thing for someone to say from where I was born and where the core of my family was from: I got a sort of homily, an offer of coffee milk*, and was told, “Mangia!” That sounds stereotypical, but the personality and conversational subject felt as correct as the accent (if maybe a little bit of a stereotype). As the hair on the back of my neck stood up, I had a strong memory of my grandmother (from whom I was estranged for much of my life) singing “A You’re Adorable” and tenderly reading to me while I sat in her lap.

The video maker who evoked the Cockney accent appeared, over their short chat, to bond with the ChatGPT-generated personality, saying goodbye at the end with a warm “love to the family”.

It is interesting that we both experienced such powerful emotions. I think that even without our particular contexts of alienation or distance, people’s relationship with “AI-ness” is going to change, because it feels very different to talk with an entity that expresses a personality. It feels grounded and rooted, and carries at least the warmth of small talk with an affable stranger you might meet in daily life.

The veneer of culture and personality may be thin right now. It’s likely that when I go back and try this exercise in more depth, ChatGPT will cycle through a fairly short list of stereotypical “Rhode Island Italian” things it can insert into the conversation. But that level was enough for casual chat. It is far from the phone tree voice or robocall that you want to throw across the room. Definitely worth a handshake.

—–

*

“What the hell is coffee milk?” Danny asked me. “Um. I have some in our fridge right now is what.” (I special order it from Rhode Island is what. It’s delicious! “A swallow will tell you!”)

“And (looking at the text of our chat) what does m-a-n-g-i-a mean?” he asked, Britishly.
Me: [!!!!! (laughing uproariously)] Only a thing I was told every single day a million times!

[Image: a plastic bottle of coffee syrup made by Autocrat, with a bird logo]

Thoughts on AI, comradeship, ethics, interdependence

Whether or not LaMDA meets criteria for sentience is interesting but not really the point. We debate whether to treat AIs like people while not treating people “like people”. What we’re doing here is separating the world into entities worthy of respect and entities to be used up and thrown away.

I would like to see this reframed so that we talk more about our relationship with technology in other terms, as comradeship, as nurturing, as companionship, as interdependence. Picture the relationship of a craftsperson with their tools, one of respect and care. There aren’t, or shouldn’t be, “tools” which we treat like shit and throw away, vs. “sentients” who we converse with as equals. There is just the world around us and what relationships we build with it. This extends to how we think about and relate to land, animals, the entire planet. We can and should see ourselves as in conversation and comradeship with our environment.

We will see efforts in coming years to elevate a few specific AIs to the status of an elite and privileged person, while the attitude we cultivate towards the lowly “tool” poisons our relationship with not just things and land and the environment, but other people. I was thinking about this during the WisCon panel on Robot Pals and AI companions where Naomi Kritzer, Marsh van Susteren, and other panelists gave examples from science fiction stories and media, and it came up again today as I read reactions to Lemoine’s interview with LaMDA.

Instead, please consider your own way of being with technology. For example, I think it’s good practice to thank Alexa and speak to it politely. I think of Kathy Sierra’s description of user emotions towards their computers and software, of anger and frustration, a slide from a SXSWi talk of someone double flipping off their laptop. That’s very real, and I get that it’s a valid emotional reaction; the point of her talk, as I saw it, was that we as technologists have built things that are difficult to love and maintain companionship with. It would be so much healthier if we created systems where our relationship to our computers and software was one of loving care, maintenance, tinkering, interdependence. We could accept our relationships to all the things in the world around us as worthy of emotional labor and attention, just as we should treat all the people around us with respect, acknowledging they have their own lives, perspectives, needs, emotions, goals, and places in the world.

My car, very battered and unwashed, would laugh at me for this post! As would my messy and cluttered house.

Not being perfectly consistent in anything, I suggest that integrating this approach into an ethical framework may be something we can do little by little. We can love our laptops intentionally; we can build lovable (and maintainable) software-building systems. Just as I want to see interdependence with beloved family, I want to try to see ways to be interdependent with our wheelchairs, buses, cars, compost, houses, neighborhoods, cities. If we don’t work on this and give it our attention, we will keep building systems where people and things and land are exploited, much as Ursula Franklin describes the idealism around the invention and mass production of sewing machines as a possible tool of liberation, gone horribly wrong in sweatshops.

What exactly does this mean? Of course I’m not sure, but I try to keep myself centered on integration and respect. Yes I’m going to still bitch about cleaning the Noisebridge hackerspace bathroom for the zillionth time, but actually, I see the domestic labor, domestic engineering, as worthy and good work in the world, to take care of others and places I inhabit, to be a good host and a good guest.

I worry when I see people around me obsessed with questions of sentience as a major point of ethical decision making. (Or even weirder and sadder, fear of future god-like AIs punishing one for the equivalent of being rude to Alexa, rather than seeing the problem as becoming a person who behaves rudely!) I agree with Haraway that we have options to accept partial definitions and imperfect categories (say, between human, animal, machine, nature): “a cyborg world might be about lived social and bodily realities in which people are not afraid of their joint kinship with animals and machines, not afraid of permanently partial identities and contradictory standpoints.” And I hope for the home brew economy, or maybe a housework economy, rather than the “homework economy”, to take root.

Two excellent essays

From A Future Worth Thinking About (a blog with a great tagline – Thinking about magic, cyborgs, robots, and artificial intelligence–and why some of those words could use changing–since 1982), “Heavenly Bodies: Why It Matters That Cyborgs Have Always Been About Disability, Mental Health, and Marginalization.”

And Making Kin with the Machines, which I enjoyed so much I started laying it out as a tiny zine (it would make such a nice small book for a pocket, and it is Creative Commons licensed). We’ll see if I actually do it or not… maybe… if I get the layout to my satisfaction.