AI poses disproportionate risks to women

Darrell M. West, Senior Fellow - Center for Technology Innovation, Douglas Dillon Chair in Governmental Studies

November 20, 2023

  • Recent studies have found that women are uniquely concerned about the risks of artificial intelligence.
  • This is primarily driven by fears about economic security, personal security, and “megachange.”
  • AI-induced job losses are expected to disproportionately affect women, and AI-powered “deepfakes” prey mainly on women.
  • We must ensure fairer representation of women in STEM fields, crack down on digital crimes that target women, and develop workforce policies to promote gender equity.
A man leans in to speak to a cashier while checking out at a Best Buy store in Flushing, New York March 27, 2010. REUTERS/Jessica Rinaldi (UNITED STATES - Tags: BUSINESS)

Women are worried about artificial intelligence (AI) and fear that it will harm their children and their own personal well-being. According to a recent Axios/Morning Consult survey, 53% of women say they won’t let their children use AI products, compared with 26% of men. In addition, only 4% of women indicated they would allow their children to use AI chatbots for any purpose, versus 31% of men. These results echo Pew Research Center findings that women are more worried than men about AI being used to diagnose and treat medical illnesses.

In this article, I argue there are three reasons for this large AI gender gap: fears about economic security, personal security, and “megachange.” Business leaders, tech innovators, and policymakers need to pay close attention to women’s concerns, given the distinctiveness of their fears about this and other emerging technologies. They need to improve women’s representation in science and technology fields, enact laws that protect women from gender-specific digital crimes, and make sure firms have equitable workforce policies in place to help with tech disruptions.

Fears about economic security

Much has been written about AI’s potential impact on future job losses. While the estimates vary widely, a number of researchers worry there could be significant job ramifications and that much of the negative impact will fall disproportionately on women. For example, research from the McKinsey Global Institute finds that the bulk of AI-induced job losses will affect women without college degrees, because those women disproportionately hold the entry-level jobs most likely to be affected by automation. Occupations such as administrative assistants, retail clerks, and finance personnel are already seeing job cuts, and this trend could accelerate as AI is deployed more widely across many sectors.

Economic fears have long been part of the gender gap in candidate preferences. It is well established that women on average earn less than men, suffer higher unemployment, and worry more about their economic security. It is therefore not surprising that these general concerns are surfacing with regard to AI and other emerging technologies, especially given the possible job losses and job transformations those innovations could bring. In general, women are wary of AI’s economic ramifications and fear it will make their current plight even harder.

Worries over personal security

Women also worry about the impact of AI and emerging technologies on personal security. “Fake nude” and “revenge porn” problems have already emerged, and they prey mainly on women. Prominent women, such as Representative Alexandria Ocasio-Cortez, have had their heads placed on naked bodies in order to shame them. And it is not just women in leadership positions who have suffered from this odious behavior. Several schools have reported cases of teenage girls being subjected to the same treatment, which is horrible for anyone but particularly upsetting to young girls at a crucial stage in their personal development.

As technologies become more widely deployed, crime has moved into the digital arena, and law enforcement has reported substantial increases in online financial fraud, identity theft, and harassment. Women’s long-standing fears about crime now have a digital component because women are often the targets of criminals, with some of the worst infractions linked specifically to gender. As criminals use AI to create fraudulent content, target specific victims, and explicitly attack women, women are going to worry about their online security and the world their children will inhabit.

Concerns about the disruptions of megachange

A few years ago, I wrote a book entitled “Megachange” that analyzed the large-scale transformation and disruption taking place in the contemporary period. AI is one such development, but there are also major trends in business models, organizational structures, geopolitics, and political polarization that worry women and make them cautious about how these shifts could harm their well-being.

When major disruptions take place with the economy, society, and community in general, women fear there will be negative consequences for them and their families. There already is abundant evidence that technology increases inequality and reduces economic opportunity for working-class people. If AI widens the gap between the haves and have-nots, women fear its growing deployment will be detrimental to them and keep them in a disadvantaged position.

The need for preemptive action

Given widespread fears among women about AI’s economic and social ramifications, policymakers, business leaders, and tech innovators need to be sensitive to both the fears and the reality of technology’s impact on marginalized individuals. There is a real risk that certain segments of the population will be left behind during the digital revolution and that women and their families will suffer disproportionate harms from AI-induced change. As technology advances, we need to make sure there are appropriate guardrails and protections so that workforce changes don’t hurt women or other vulnerable people. We need to take their concerns seriously and make sure both their economic and personal security are protected.

We should make sure there is fairer representation of women in science and technology fields so that these and other concerns are incorporated in product development and deployment. Right now, men are overrepresented in STEM areas, to the detriment of women. Creating better pipelines is crucial for the entire tech sector, especially given its importance for future economic development.

In addition, we need tougher rules on digital crimes that target women in particular. There is no national statute outlawing fake nudes, for example, and that needs to be remedied so that women have recourse when egregious actions take place. Both California and Virginia have laws prohibiting the distribution of non-consensual sexually explicit images of people. Those harmed in this way can sue for damages if they can demonstrate actual malice in the distribution, and that will help slow the dissemination of these kinds of materials.

Finally, we need to make sure that workforce policies promote equity for women in the age of AI. Women often work while also caring for young children or aging parents. A lack of guaranteed paid family leave is challenging for female workers, and this can be especially problematic during times of workforce upheaval. Some experts have predicted increases in job churn (i.e., moving from position to position) as the digital revolution accelerates, so inadequate child or elder care, or social benefits tied to specific jobs, is particularly detrimental to women. We should make sure that AI deployment and the transition to a digital economy do not increase economic insecurity. Focusing on equity should be a high priority for all those who work in government and the technology sector.