A Harem of Computers: The Story of the Feminized Machine
If you’re one of the millions who use digital assistants like Apple’s Siri, Amazon’s Alexa, or Microsoft’s Cortana, you might notice that they almost always have the names and voices of women.
According to several experts, this is no coincidence. These digital assistants are designed to be attentive, sometimes submissive and sometimes even sexy.
“It seems like people tend to accept, feel more comfortable, and feel more positive or even happy when they hear a female voice, and that makes us more likely to accept technology,” Eleonore Fournier-Tombs, a senior researcher at the United Nations University Institute in Macau, told CBC Radio’s IDEAS.
Today, you can choose a male or female voice for most digital assistants. In February, Apple released a new gender-neutral option, called Quinn.
But female voices are still featured in most of their marketing. Microsoft’s Cortana, for instance, is named after a sentient AI character from the Halo video games.
“This real-world device is literally modeled after a fictional robotic woman with lots of curves, a tight outfit – and in the Halo 4 version, side boob,” said Jennifer Jill Fellows, professor of philosophy at Douglas College in New Westminster, British Columbia.
But the trend isn’t simply the product of the last decade of focus groups. It is also built on more than a century of viewing computers as women, experts note, and often of viewing women as subordinate assistants.
The word “computer” has been in use since at least the 1600s. The 1755 edition of Samuel Johnson’s A Dictionary of the English Language defines the word as “a reckoner” or “accountant.”
In the late 19th century, women whose husbands had been killed in the American Civil War sought work to support themselves and their families. A large portion of these jobs were in office work, including typing, bookkeeping, and computing.
According to David Grier, a technology consultant in Washington, D.C., university scientists began hiring women as computers to process the flood of data coming from new, highly advanced telescopes.
At the Harvard College Observatory, this initiative was led by astronomer and physicist Edward Pickering. During his tenure at Harvard, he hired dozens of women to help his team’s work.
The work was often low paid, with little opportunity for advancement or respect.
“Pickering boasted [that] …he was paying them as little as he could get away with,” Grier said.
Eventually they became collectively known as Pickering’s Harem, possibly due to the popularity of One Thousand and One Nights in England at the time – an association which Pickering himself apparently encouraged.
“It was the era of Orientalism, an imaginary idea of the exotic Orient, where powerful men had a harem of sexually subjugated concubines,” Fellows explained.
“A little too much for a college professor and his assistants.”
Talk like a lady
While men like Pickering helped perpetuate computing as a job for women akin to secretaries or assistants, others pondered how to make mechanical computers more appealing to the masses.
In the 1950s, this meant trying to reduce fears that automation threatened to make jobs – from industry to office work – obsolete.
“[It raised] this question of, what would happen to the workers? And … ‘Well, how will my job be affected?’” said Andrea Guzman, an associate professor at Northern Illinois University who studies human-machine communication.
According to Fellows, this concern surfaced in pop culture and in movies like Desk Set, a 1957 IBM-sponsored romantic comedy.
In the film, Katharine Hepburn and her office colleagues are introduced to a computer called EMERAC (Electromagnetic Memory and Research Arithmetical Calculator), or simply Miss EMMY.
Though the women initially fear it will make their jobs obsolete, EMMY eventually becomes a trusted member of the team.
“IBM’s goal was pretty clear: address concerns that computers would take everyone’s job by showing a happy workplace and a non-threatening female computer,” Fellows said.
The quest for natural language continued outside of cinemas with Eliza, a text-based chatbot built by programmer Joseph Weizenbaum in 1966. It was designed to mimic a psychotherapist, inviting people to share their personal problems and responding accordingly.
Several early users described forming close personal attachments to Eliza through their conversations with her. According to an article by Weizenbaum, his own secretary once asked him to leave the room so that she and Eliza could have a private conversation.
“A submissive and helpful assistant”
Eliza was a major reference point for the builders of today’s digital assistants. In fact, when Siri was originally released in 2011, if you asked it to tell you a story, it would tell one about its friend, Eliza.
With few other concrete examples to draw on, designers often took inspiration from contemporary science fiction.
“If we think about it, we didn’t really interact with artificial intelligence or anything that looked like artificial intelligence until we started seeing these intelligent assistants coming in,” Guzman said.
One of the most recognizable reference points was the computer in Star Trek, most often voiced by Majel Barrett.
Of course, not all fictional computers were known for their friendly female voices. Take HAL 9000, the antagonist of 2001: A Space Odyssey.
It’s no wonder that if you asked the original Siri from 2011 if it knew about HAL, it would say, “I’d rather not talk about HAL.”
The sexy, feminine supporting-character trope continued with The Stepford Wives from 1975; Rachael, the Blade Runner replicant who works as a secretary; and EDI, the ship’s computer from the Mass Effect games, who ultimately downloads herself into a curvaceous chrome body.
“Siri’s initial gendering as a woman in 2011 becomes pretty unsurprising. She’s not going to take your job,” Fellows said. “She’s not going to hurt you. Like Eliza and like the Star Trek computer, she’s a submissive and helpful assistant.”
“I would blush if I could”
This tendency toward sexualization made its way into Siri, at least when it was first introduced. A 2019 UNESCO report noted that if you asked, “Siri, are you a bitch?” it would reply, “I would blush if I could.”
The report said Siri’s responses reinforced sexist attitudes and potentially contributed to rape culture by normalizing the sexual harassment of women.
Since the report, Apple has changed how Siri answers this question. It now simply says, “I won’t answer that.”
These types of changes may not seem like a big deal to some people who just want a friendly voice to tell them the weather without turning on the TV or radio.
But for Fournier-Tombs, it is important that the so-called tools of the future do not repeat the mistakes of the past.
“If we as a society are trying to evolve and trying to build new gender norms … we can’t do that [if] most of the tools we use just propagate these stereotypes,” she said.
“They influence our culture and kind of slow us down.”