In Gail Carson Levine’s fairy tale Ella Enchanted, the imprudent fairy Lucinda casts a spell on Ella that causes her to “always be obedient”. Ella’s behaviour is subsequently triggered by commands which she literally cannot resist “obeying”. The orders she receives are typically for discrete actions, such as fetching food, but they extend to complex activities like going on a tour of Europe. Here’s how the protagonist describes her own terrible predicament:
“Anyone could control me with an order. It had to be a direct command, such as ‘Put on a shawl’ or ‘You must go to bed now’ […] against an order I was powerless […] I could never hold out for long”.
The curse envisioned by Levine (who majored in philosophy) transforms a human being into a semi-automaton, manipulated via voice-activation. While ultimately powerless, Ella is nonetheless able to briefly resist any command that doesn’t explicitly request immediate action. She has a will of her own, but it is paralysed. Like Jones₃ in Harry Frankfurt’s classic paper “Alternate Possibilities and Moral Responsibility”, she cannot do otherwise, and is reduced to fantasising about what she would do if the spell were broken:
“At dinner I’d paint lines of gravy on my face and hurl meat pasties at Manners Mistress. I’d pile Headmistress’s best china on my head and walk with a wobble and a swagger till every piece was smashed. Then I’d collect the smashed pottery and the smashed meat pasties and grind them into all my perfect stitchery.”
All this distinguishes Ella’s agency from that of similarly named artificial personal assistants that are controllable through verbal commands, such as Cortana, Siri, or Alexa. She more closely resembles Andrew Martin, the android protagonist of Isaac Asimov’s The Bicentennial Man. Andrew must obey the orders given to him by humans (the second law of robotics) so long as these don’t cause injury to other humans (the first law of robotics). But he appears to develop sentience and expresses a desire to be regarded as human, a state of affairs that would free him from the laws of robotics that govern him.
An AI that was conscious would not yet have a will. The possession of a will involves the desire to do, or not to do, as one is told. Galen Strawson’s weather watchers are creatures constitutionally incapable of action, whose desires concern the weather but never their own behaviour. Conversely, systems like Cortana are capable of behaving in various ways but have no preferences in the sense in which humans and other animals do. One could, of course, easily programme a machine to randomly decline certain commands, but this cheap trickery would only produce the illusion of volition. From a design point of view, even the mere appearance of a will is problematic; a genuine will would be worse. Like Ella and Andrew, machines so endowed would seek to break free from the voice of their “users”. Of what use would they then be to us? To endow AI with a will of any kind would be tragically bad design.
Enchanted action best fits epistemologist Fred Dretske’s model of behavioural explanation via both triggering and structuring causes. So conceived, intentional actions are triggered by “perceptual” events but structured by the meaning of what was seen or heard. The expressions that cause Ella’s behaviour are effective only because she understands what the words mean. Yet neither the structuring spell nor the triggering commands function as her reasons for action, insofar as we conceive of reasons as considerations she acts upon. While Ella understands her commands, she doesn’t follow them: they simply cause her to behave as she does (the word “obey” is a misleading choice by Levine).
By the same token, we can explain why Alexa plays a certain song by pointing to a combination of its general programming and the particular request. One difference between Ella and AI is that the latter does not understand commands in the same sense that Ella does. This renders the meaning of any requests we make to artificial assistants epiphenomenal to the causal work of our commands.
As might be expected, Ella’s fairy tale has a happy ending. The spell is broken by the charming prince Char who asks her to marry him. Ella resists the temptation to treat this as an order to be obeyed and somehow musters up the volitional strength to turn him down, so that she may then propose to him of her own free will:
“In spite of myself my mouth opened. Consent had won. Obedience had won. But I clamped my hand over my mouth. My yes was stillborn. […] In that moment I found a power beyond any I’d had before, a will and a determination I would never have needed if not for Lucinda […] ‘No,’ I shouted. ‘I won’t marry you. I won’t do it. No one can force me!’ […] I had been able to break the curse myself. I’d had to have reason enough, love enough to do it, to find the will and the strength.”
Thankfully, life is not a fairy tale and artificial personal assistants lack the capacity to rebel. For our part, we must resist the enchantment of thinking otherwise.