Whether due to structural sexism or hostile work environments, the tech industry remains a male-dominated space. Take a look at the future that tech companies are building for us, though, and you’ll find women everywhere. Start your virtual assistant and the voice that greets you will almost certainly be female. Use Microsoft’s Bing search engine and you might get a pop-up message from “Xiaoice,” a blushing artificial intelligence chatbot developed by the company, asking if you missed her. And when the prestigious Tsinghua University announced that it had enrolled its first-ever “virtual student,” Hua Zhibing, the AI came complete with a photo of a sweet-faced freshman.
The latest example of this trend in China is Wen Yaoyao. Created by Baidu AI Cloud in partnership with Art Exhibitions China and officially unveiled on Wednesday, Wen is a virtual museum tour guide. Represented as a young woman with a high-pitched voice and tight-fitting traditional clothing, she promises to introduce visitors to the collections of China’s best museums. The press release announcing her creation called her a “national style girl”; the media hailed her as the perfect blend of “beauty and real skill.”
Although sometimes presented as antidotes to real-world misfortunes, our new virtual worlds reflect old stereotypes. Whether the proliferation of female-coded virtual assistants is the product of male fantasies or simply unconscious bias, the consequences are very real. As the UNESCO report “I’d Blush If I Could” notes, virtual assistants like Siri don’t just reflect gender bias; they actively propagate it, modeling tolerance of sexual harassment and female obedience.
The irony here is that programming was once considered a woman’s job. From the 19th-century mathematician Ada Lovelace to the women who programmed the first programmable electronic computer, ENIAC, in the 1940s, women laid the theoretical and practical foundations of computer programming.
In her book Technologies of the Gendered Body, Anne Balsamo draws on her own mother’s experience doing computational work in the 1930s to trace how programming evolved from stereotypically female work into a male-dominated industry. As computers grew more powerful, they required ever more complex algorithms and finer divisions of labor to operate. Men, believing themselves endowed with superior intelligence and greater powers of reason, gradually drove women out of the industry they had helped build. This new paradigm required new role models, and the industry lionized hard-driving engineer-entrepreneurs like Bill Gates and Steve Jobs while relegating women to the fringes.
This division of labor continued in China during the period of reform and opening up. Even today, women constitute a tiny minority of the country’s programmers, and the public image of the coder is that of a male geek. To fit in, Chinese female programmers have little choice but to adapt to a work culture obsessed with plaid shirts and male-pattern baldness; they call themselves “female losers” and “programmer-ettes” (both plays on self-deprecating terms widely used by their male counterparts) while trying to avoid being seen as eye candy by their colleagues.
They are rarely able to push back as they witness gender bias being encoded into new technologies. There’s a tendency to think of things like the metaverse as acts of meta-creation, the formation of something completely new. But the development and application of technology is neither pure nor innocent, and what we call meta-creation is often just another layer of cloning.
This does not mean that no one is trying to solve these problems. For example, in 2019, Stanford University professor Fei-Fei Li analyzed 14 million images in the ImageNet database, which is frequently used to train deep learning algorithms. Her team discovered that 45.4% of the images in the database came from the United States and that many images labeled as “work scenes” centered on men, a skew that led algorithms trained on the dataset to overlook professional women.
As a fix, Li and her team deliberately added female-centric images to the dataset to ensure that computers would learn to recognize work scenes featuring women as well as men. Their goal was to make the data more diverse, better suited to different contexts, and more reflective of the racial, gender, and age diversity of modern workplaces. Admirable as they are, such scattered efforts will need to be expanded and generalized if they are ever to fully counterbalance the biases of the rest of the industry.
Translator: Matt Turner; editor: Wu Haiyun.
(Header image: A promotional image for Wen Yaoyao. From @我是文晓燕 on Weibo)