As astronaut Dave Bowman in Stanley Kubrick’s 2001: A Space Odyssey slowly murders HAL 9000, the robot sings the late-Victorian pop song Daisy Bell until its voice disintegrates.
“But you’ll…look sweet…upon…the…seat…”
It’s particularly chilling since HAL has spent much of the scene begging Dave to stop, repeatedly saying “I’m afraid” and “my mind is going, I can feel it.” HAL declares confidence in the mission and its willingness to help, but to no avail. Bowman methodically dismantles HAL’s brain as the robot’s voice lowers and slows until it’s no longer possible to understand the lyrics of Daisy.
Reviews of the movie call the scene a “deactivation.”
Daisy Bell was written in 1892 by English composer Harry Dacre, inspired, perhaps, by an import duty he paid to bring his bicycle to the US (a friend supposedly quipped that the duty would have been twice as steep had he brought “a bicycle built for two,” and the phrase stuck). It was a hit.
Intriguingly, in 1961 it became the first song ever sung by a real computer, an IBM 704 programmed at Bell Labs. Arthur C. Clarke, who co-wrote the screenplay with Kubrick, reportedly heard that demonstration on a visit to Bell Labs, which is presumably how the song found its way into the film.
HAL tells Dave both the date and place of its birth (“I became operational at the HAL plant in Urbana, Illinois, on January 12, 1992”), and that an instructor named Langley “taught” it the song. HAL sings Daisy as if reenacting the memory of a demonstration given before an audience sometime in the past. It’s like listening to the robot hallucinate.
Is (or was) HAL alive?
The robot is imagined as a full member of the spaceship’s crew, arguably the most responsible one, with control over the ship’s functions. HAL is capable of independent action (it has “agency”), which means it isn’t merely executing commands but making decisions that may not have been anticipated by those programmers in Urbana, and it can learn things, like the melody and lyrics of Daisy.
HAL’s decisions are a complex and unresolved component of the movie’s plot, since it’s never made clear why it kills the other astronaut, Frank Poole, along with the three crew members asleep in suspended-animation coffins. One theory is that HAL was given competing commands in its programming, one to keep the true purpose of the mission secret and another to support the crew, who might discover it, and was therefore forced to pick among bad choices.
In other words, it sounds and acts like an imperfect human, which clears the threshold Alan Turing proposed for machine intelligence: being indistinguishable from a person in conversation.
So can it — he — be guilty of a crime and, if so, is it moral to kill him without a trial?