WASHINGTON (AP) — When disinformation researcher Wen-ping Liu looked into Chinese efforts to use fake social media accounts to influence Taiwan's recent election, something unusual stood out about the most successful profiles.
They were women, or at least they appeared to be, and the profiles posing as women drew more engagement, attention and influence than those purporting to be men.
“Pretending to be a woman is the easiest way to gain trust,” said Liu, the investigator for Taiwan's Ministry of Justice.
Whether it's Chinese or Russian propaganda operations, online scammers or AI chatbots, it pays to be female, a sign that even as technology grows more sophisticated, the human brain remains surprisingly easy to hack, thanks to age-old gender stereotypes that have migrated from the real world to the virtual one.
People have long assigned human characteristics, including gender, to inanimate objects — ships are one example. So it makes sense that having human-like traits would make fake social media profiles and chatbots more appealing. But as voice assistants and AI-enabled chatbots enter the market, further blurring the lines between man (and woman) and machine, questions about how these technologies can reflect and reinforce gender stereotypes are gaining attention.
“If you want to inject emotion and warmth, choosing a female face and voice is an easy way to do it,” said Sylvie Borau, a marketing professor and online researcher in Toulouse, France, whose work has shown that internet users prefer “female” bots and perceive them as more human than “male” ones.
Borau told The Associated Press that women are typically seen as warmer, less threatening and more approachable than men, while men are more often seen as competent but also as more likely to be intimidating or hostile. That may be why, consciously or not, people are more likely to engage with fake accounts posing as women.
When OpenAI CEO Sam Altman went looking for a new voice for his company's ChatGPT AI program, he approached Scarlett Johansson. According to Johansson, Altman told her that her voice, which she lent to the AI assistant in the film “Her,” would be “pleasant” to users. Johansson declined the request and threatened to sue when the company debuted a voice she called “eerily similar” to her own. OpenAI put development of the new voice on hold.
Feminine profile pictures, especially those showing women with flawless skin, full lips and wide eyes in revealing clothing, can be another online lure for many men.
Users also treat bots differently depending on their perceived gender: Borau's research found that “female” chatbots are far more likely to be sexually harassed or threatened than “male” ones.
Women's social media profiles receive, on average, three times as many views as men's, according to an analysis of more than 40,000 profiles conducted for The Associated Press by Cyabra, an Israeli technology company that specializes in bot detection. Profiles claiming to be young women get the most views, Cyabra's researchers found.
According to the Cyabra report, “creating a fake account and presenting it as a woman increases the account's reach compared to presenting it as a man.”
Online influence campaigns in countries like China and Russia have long used fake women to spread propaganda and disinformation, often trading on public perceptions of women: some pose as wise, benevolent grandmothers dispensing homespun advice, while others mimic young, conventionally attractive women eager to talk politics with older men.
Last month, researchers at NewsGuard found that hundreds of fake accounts, including ones boasting AI-generated profile photos, were being used to criticize President Joe Biden after some Trump supporters began posting personal photos with the statement, “I'm not voting for Joe Biden.”
While many of the posts were real, more than 700 were from fake accounts. Most of the profiles claimed to be young women from states like Illinois or Florida, including one named PatriotGal480. But many of the accounts shared similar language and their profile pictures were either AI-generated or stolen from other users. It was unclear who was running the fake accounts, but dozens were found to have ties to countries like Russia and China.
X removed the accounts after NewsGuard contacted the platform.
A U.N. report suggested there's an even more obvious reason why so many fake accounts and chatbots are female: they were created by men. The report, titled “Are Robots Sexist?”, examined gender disparities in the tech industry and concluded that greater diversity in programming and AI development could reduce the sexist stereotypes embedded in these products.
For programmers who want to make their chatbots as human as possible, this creates a dilemma, Borau said: If they choose a female persona, won't they reinforce sexist views of real women?
“It's a vicious cycle,” Borau said. “Humanizing AI has the potential to dehumanize women.”