Whether or not LaMDA meets criteria for sentience is interesting but not really the point. We debate whether to treat AIs like people while not treating people “like people”. What we’re doing here is separating the world into entities worthy of respect and entities to be used up and thrown away.
I would like to see this reframed so that we talk more about our relationship with technology in other terms, as comradeship, as nurturing, as companionship, as interdependence. Picture the relationship of a craftsperson with their tools, one of respect and care. There aren’t, or shouldn’t be, “tools” which we treat like shit and throw away, vs. “sentients” who we converse with as equals. There is just the world around us and what relationships we build with it. This extends to how we think about and relate to land, animals, the entire planet. We can and should see ourselves as in conversation and comradeship with our environment.
We will see efforts in coming years to elevate a few specific AIs to the status of elite and privileged persons, while the attitude we cultivate towards the lowly “tool” poisons our relationship with not just things and land and the environment, but other people. I was thinking about this during the WisCon panel on Robot Pals and AI companions, where Naomi Kritzer, Marsh van Susteren, and other panelists gave examples from science fiction stories and media, and it came up again today as I read reactions to Lemoine’s interview with LaMDA.
Instead, please consider your own way of being with technology. For example, I think it’s good practice to thank Alexa and speak to it politely. I think of Kathy Sierra’s description of user emotions towards their computers and software, of anger and frustration, a slide from a SXSWi talk of someone double flipping off their laptop. That’s very real, and I get that it’s a valid emotional reaction – the point of her talk, as I saw it, was that we as technologists have built things that are difficult to love and maintain companionship with. It would be so much healthier if we created systems where our relationship to our computers and software was one of loving care, maintenance, tinkering, interdependence. We could accept our relationships to all the things in the world around us as worthy of emotional labor and attention. Just as we should treat all the people around us with respect, acknowledging that they have their own lives, perspectives, needs, emotions, goals, and places in the world.
My car, very battered and unwashed, would laugh at me for this post! As would my messy and cluttered house.
Not being perfectly consistent in anything, I suggest that integrating this approach into an ethical framework may be something we can do little by little. We can love our laptops intentionally; we can build lovable (and maintainable) software-building systems. Just as I want interdependence with beloved family, I want to find ways to be interdependent with our wheelchairs, buses, cars, compost, houses, neighborhoods, cities. If we don’t work on this and give it our attention, we will keep building systems where people and things and land are exploited – much as Ursula Franklin describes with the idealism around the invention and mass production of sewing machines as a possible tool of liberation, gone horribly wrong in sweatshops.
What exactly does this mean? Of course I’m not sure, but I try to keep myself centered on integration and respect. Yes, I’m still going to bitch about cleaning the Noisebridge hackerspace bathroom for the zillionth time, but actually, I see domestic labor, domestic engineering, as worthy and good work in the world: to take care of others and the places I inhabit, to be a good host and a good guest.
I worry when I see people around me obsessed with questions of sentience as a major point of ethical decision making. (Or, even weirder and sadder, fearing future god-like AIs punishing one for the equivalent of being rude to Alexa, rather than seeing becoming a person who behaves rudely as the problem!) I agree with Haraway that we have the option to accept partial definitions and imperfect categories (say, between human, animal, machine, nature): “a cyborg world might be about lived social and bodily realities in which people are not afraid of their joint kinship with animals and machines, not afraid of permanently partial identities and contradictory standpoints.” And I hope for the home brew economy, or maybe a housework economy, rather than the “homework economy”, to take root.