A Harem of Computers: The History of the Feminized Machine
If you’re one of the millions of people who use digital assistants like Apple’s Siri, Amazon’s Alexa or Microsoft’s Cortana, you might notice that they’re almost always named after, and voiced by, women.
According to several experts, that’s no accident. These digital assistants are designed to be attentive, sometimes submissive, and sometimes even sexy.
“It seems like people have a tendency to accept, feel more comfortable and feel more positive or even happy when they hear a female voice, and that makes us more likely to accept the technology,” Eleonore Fournier-Tombs, a senior researcher at Macau’s United Nations University Institute, told CBC Radio’s IDEAS.
Today, you can choose a male or female voice for most digital assistants. In February, Apple released a new gender-neutral option, named Quinn.
But in most of their marketing, the female voices enjoy the spotlight. Microsoft’s Cortana, in particular, is named after a sentient AI character in the Halo video games.
“This real-world device is literally modelled on a fictional robotic woman with a lot of curves, a skin-tight outfit — and in the Halo 4 version, side boob,” said Jennifer Jill Fellows, a philosophy instructor at Douglas College in New Westminster, B.C.
But the trend didn’t suddenly appear out of the last decade or so of focus groups. It’s also built on a century of seeing computers as women, experts note — and often, women as subservient assistants.
The word “computer” has been in use at least since the 1600s. Samuel Johnson’s 1755 A Dictionary of the English Language defines the word as “a reckoner or accountant.”
In the late 19th century, women whose husbands had been killed in the U.S. Civil War were looking for work to support themselves and their families. Many of those jobs were in office work, including typing, accounting — and computing.
According to David Grier, a technology consultant in Washington, D.C., scientists at universities began hiring women as computers to process the flood of data from new, highly advanced telescopes.
At the Harvard College Observatory, that initiative was led by astronomer and physicist Edward Pickering. Over his tenure at Harvard, he hired dozens of women to aid his team’s work.
The work was often low-paid, with little opportunity for advancement or respect.
“Pickering bragged [that] … he was paying them as little as he could get away with,” said Grier.
Eventually, they became collectively known as Pickering’s Harem, owing perhaps to the popularity of Arabian Nights in England at the time — an association Pickering himself seemingly encouraged.
“This was the era of Orientalism, an imaginary idea of the exotic East, where powerful men had a harem of sexually subjugated concubines,” explained Fellows.
“A little much for a university prof and his assistants.”
Talk like a lady
While men like Pickering helped perpetuate computing as a job for women akin to secretaries or assistants, others were thinking about how to make mechanical computers more appealing to the masses.
In the 1950s, that involved trying to reduce fears that automation was threatening to make jobs — from industrial to office work — obsolete.
“[It raised] this question of what would happen to workers? And … ‘well, how will my job be affected?'” said Andrea Guzman, an associate professor at Northern Illinois University who researches human-machine communications.
According to Fellows, that concern showed up in pop culture, including in films like Desk Set, a 1957 romantic comedy sponsored by IBM.
In the film, Katharine Hepburn and her fellow office workers are introduced to a supercomputer called EMERAC (Electromagnetic MEmory and Research Arithmetical Calculator), or simply Miss EMMY.
After initial fears that it would render the other women’s jobs obsolete, EMMY eventually becomes a trusted member of the team.
“IBM’s goal was pretty clear: address concerns that computers would take everyone’s jobs by showing a happy workplace and a non-threatening, feminine computer,” said Fellows.
The quest for natural language continued outside the cinemas with Eliza, a text-based chatbot built by programmer Joseph Weizenbaum in 1966. It was designed to mimic a psychotherapist, inviting people to share their personal problems and responding accordingly.
Several early users described forming a close personal attachment to Eliza based on their conversations with it. According to a paper by Weizenbaum, his own secretary once asked him to leave the room so she and Eliza could have a private conversation.
‘A submissive, helpful female assistant’
Eliza was a major reference point for the designers of today’s digital assistants. In fact, when Siri was originally released in 2011, if you asked it to tell a story about “her,” it would tell a story about her friend, Eliza.
With few other real-world examples, designers often looked to contemporary science fiction for inspiration.
“If we think about it, we did not really interact with artificial intelligence or anything that seemed like it was artificial intelligence until we started seeing these smart assistants come along,” said Guzman.
One of the most recognizable reference points was the computer in Star Trek, most commonly narrated by Majel Barrett.
Of course, not every fictional computer was known for its friendly female voice. Take HAL 9000, the antagonist of 2001: A Space Odyssey.
It’s no wonder that if you asked the original 2011 Siri if it knew HAL, it would respond: “I’d rather not talk about HAL.”
The trope of the sexy female supporting character continued with The Stepford Wives (1975); Rachael, the replicant who works as a secretary in Blade Runner; and EDI, the ship’s computer from the Mass Effect games, who is eventually downloaded into a curvaceous chrome body.
“Siri’s initial gendering as female in 2011 becomes pretty unsurprising. She is not going to take your job,” said Fellows. “She is not going to harm you. Like Eliza and like the Star Trek computer, she is a submissive, helpful female assistant.”
‘I’d blush if I could’
That trend of sexualization made its way into Siri, at least as it was originally released. A UNESCO report in 2019 noted that if you asked Siri, “Are you a slut?” it would respond: “I’d blush if I could.”
The report called out Siri’s responses as reinforcing sexism and potentially contributing to rape culture by normalizing the sexual harassment of women.
Since the report, Apple has changed how Siri answers that question. It now simply says: “I won’t respond to that.”
These kinds of changes might not seem terribly important to some people who just want a friendly voice to tell them the weather without turning on the TV or radio.
But to Fournier-Tombs, it’s important that the so-called tools of the future don’t repeat the mistakes of the past.
“If we as a society are trying to evolve, and trying to have new norms for gender … we can’t do it [if] more of the tools we’re using just propagate these stereotypes,” she said.
“They influence our culture, and sort of slow us down in that way.”