Friday, April 8, 2022

Even when all explicit gender-identifying language was stripped from the recommendation letters, a machine learning algorithm was able to predict applicant gender at a rate better than chance

Text Mining for Bias: A Recommendation Letter Experiment. Charlotte S. Alexander. American Business Law Journal, April 6, 2022. https://doi.org/10.1111/ablj.12198

Abstract: This article uses computational text analysis to study the form and content of more than 3,000 recommendation letters submitted on behalf of applicants to a major U.S. anesthesiology residency program. The article finds small differences in form and larger differences in content. Women applicants' letters were more likely to contain references to acts of service, for example, whereas men were more likely to be described in terms of their professionalism and technical skills. Some differences persisted when controlling for standardized aptitude test scores, on which women and men scored equally on average, and other applicant and letter-writer characteristics. Even when all explicit gender-identifying language was stripped from the letters, a machine learning algorithm was able to predict applicant gender at a rate better than chance. Gender stereotyped language in recommendation letters may infect the entirety of an employer's hiring or selection process, implicating Title VII of the Civil Rights Act of 1964. Not all gendered language differences were large, however, suggesting that small changes may remedy the problem. The article closes by proposing a computationally driven system that may help employers identify and eradicate bias, while also prompting a rethinking of our gendered, racialized, ableist, ageist, and otherwise stereotyped occupational archetypes.
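The abstract does not say which model the author used, but the general technique is straightforward to sketch: strip explicit gender markers from the text, then test whether a bag-of-words classifier still predicts applicant gender above a chance baseline. The sketch below assumes scikit-learn; the GENDER_MARKERS lexicon, the strip_gender_markers helper, and the placeholder letters/labels data are all hypothetical stand-ins, not the paper's actual pipeline.

```python
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical lexicon of explicit gender markers (pronouns,
# honorifics, gendered nouns); the study's real list, which would
# also cover applicant names, is not reproduced here.
GENDER_MARKERS = re.compile(
    r"\b(he|him|his|she|her|hers|mr|mrs|ms|man|woman|male|female)\b\.?",
    flags=re.IGNORECASE,
)

def strip_gender_markers(text: str) -> str:
    """Remove explicit gender-identifying tokens before modeling."""
    return GENDER_MARKERS.sub("", text)

# Toy placeholders standing in for the ~3,000 letters in the study:
# one text per letter, one 0/1 gender label per applicant.
letters = ["She is a superb clinician and a generous mentor.",
           "He shows strong technical skill in the operating room."] * 50
labels = [0, 1] * 50

cleaned = [strip_gender_markers(t) for t in letters]

# Bag-of-words classifier: TF-IDF features feeding logistic regression.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=2),
                    LogisticRegression(max_iter=1000))

# Cross-validated accuracy, compared against the majority-class
# baseline to operationalize "better than chance".
scores = cross_val_score(clf, cleaned, labels, cv=5, scoring="accuracy")
baseline = max(labels.count(0), labels.count(1)) / len(labels)
print(f"mean CV accuracy: {scores.mean():.3f}  chance baseline: {baseline:.3f}")
```

On real data, a permutation test (shuffling the labels and re-scoring) would give a more defensible "better than chance" claim than eyeballing the gap over the majority baseline; accuracy well above the shuffled distribution indicates that implicitly gendered language survives the stripping step.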

