Artificial Intelligence 'could deepen gender divide' in law

Linklaters-backed survey tests how women in law are currently experiencing the impacts of AI

Women must be fully included in the development and use of AI in the law or the profession's gender gap could be at risk of widening, according to a new report. 

The report, published by Linklaters, the Next 100 Years project and women's network She Breaks the Law, notes a persistent gender gap in AI across all professions, with women universally less likely to use ChatGPT and other generative AI tools than men. 

Hypotheses as to why this is the case include women not feeling as confident in engaging with the technology and the 'good girl' syndrome "where women, unless given explicit approval, feel the need to do the work themselves and not 'cheat' and take short-cuts".

This could have significant implications for the legal profession, given that women make up the majority of lawyers – 53% in law firms and 61% in in-house roles, according to the report. 

Linklaters partner Shilpa Bhandarkar, who co-chairs the firm's Gen AI steering group, said the findings underlined "the genuine risk that AI could further deepen the gender divide".

She added: "However, each of us has the ability to influence the outcome: whether by actively using technology to expose and challenge ingrained biases, giving our teams the time and space to experiment and learn, or by choosing vendors who share our values. If we champion these efforts collectively, we could reverse the path we're currently on and instead accelerate the progress toward gender equality in law."

The report was based on what it described as a first-of-its-kind survey to understand how women in law are currently experiencing the impacts of AI. 

It found a strong consensus about AI's potential to enhance efficiency and productivity in the legal sector, with 77% of respondents saying the technology was having an "extremely significant" impact on the profession. 

More than half (52%) considered themselves very well-informed about the tech, though 31% felt only 'somewhat informed' and another 17% acknowledged they had a knowledge gap. Nearly one in five (18%) did not, or were not able to, identify where AI would impact their organisation.

The report identified industry lag as a potential contributor to this. While the majority of respondents felt their organisation had embraced AI, 37% said their organisation had not. A further 27% chose ‘None/Not applicable’ or gave no answer when asked what benefits of AI they had observed. 

"These findings, if a fair depiction of the workforce, may point to organisational – or indeed, industry-wide – resistance to adopting AI, a trend that some traced back to client and stakeholder hesitancy," it noted. 

Bias and inaccuracy were also pinpointed as barriers to adoption. Despite enthusiasm for AI’s capabilities, concerns about accuracy and the need for human oversight were identified, with 37% of respondents citing fears around AI’s reliability. Bias appears to be a core driver of this mistrust, with 43% saying they have observed bias in AI and legal tech, including biased tools, reports of biased outcomes and qualitative impacts.

The survey asked participants to consider what factors may be slowing down the adoption of AI by women in particular. Encouragingly, 32% answered ‘None/NA’ or gave no answer and many reported they had not observed a slower adoption rate amongst women; instead, they reported seeing female colleagues actively engaging with and driving AI initiatives within their organisations and leading innovations in AI implementation. 

However, 18% of respondents still cited fear and lack of confidence as factors that thwart women's engagement with AI. The report cited one junior legal practitioner at a law firm who discussed how women may be more affected by AI inaccuracies due to societal factors impacting confidence, fear of harsher judgement for errors, and documented biases in AI responses based on gender. In other words, women have more to lose when AI mistakes are made.

Just over a quarter (26%) of respondents said a lack of training and resources was also a blocker to adoption. Respondents pointed to practical experience (24%), comprehensive training (20%) and mentorship opportunities (20%) as ways to boost AI adoption amongst women. However, feedback highlighted that women who are out of the office for long periods to fulfil caregiving roles, occupy positions with more 'organisational dusting' responsibilities, or work for leaders who are not supportive of AI adoption have less time to upskill and could therefore be at a disadvantage.

Some 30% of respondents could not point to a role for women to play in shaping the use of AI in the legal profession, despite agreeing that they should play one. The report suggested this could attest to a fundamental issue with how women across the sector are included in conversations around AI development and deployment.

This was supported by the fact that only one respondent said that AI adoption was 'fully inclusive', with 29% of respondents perceiving a lack of inclusivity and 23% saying they were unsure.

The perception of the wider AI landscape as a biased or exclusive space, and the lack of equal representation in tech roles, further complicate the picture: women represent only 14% of AI paper authors and 22% of AI professionals globally.

This suggests a critical need to target AI communication strategies to women and recruit women of influence to drum up enthusiasm for engaging with the new technologies, the report said. 

If women in the sector do not participate equally in the technology, a critical skills gap could emerge, the report found, limiting their career advancement in an increasingly tech-driven legal sector. This in turn could widen the existing gender gap in leadership in the profession.

The report makes a number of recommendations to women, including to actively seek out AI training programmes for legal professionals. It also encourages women to take leadership roles in AI-related projects within their organisation, and emphasises the need to regularly update knowledge of AI developments. 

It also recommends organisations proactively include women, including by reviewing how they recruit talent for AI-related roles to ensure there is no gender bias, giving staff dedicated time to experiment and practise using the technology, and road-testing communication and engagement strategies with their women’s network. 

