What's in a job posting? Apparently a lot.
Keywords used to describe open job positions can significantly affect who ends up applying, according to recent research by Textio, an artificial intelligence software startup based in Seattle.
The company found that the language in job postings perpetuated stereotypes about what kinds of jobs are better suited for women and what kinds should go to men.
The research found, for example, that words like "sympathetic," "caring," "empathetic" and "families" appealed more to women, while words such as "manage," "proven" and "superior" appealed more to men. The power of such feminine-skewing words can be seen in the health care industry, where women hold the vast majority of positions such as nurse and home health aide, even though organizations try to recruit men for those roles.
David Weisbeck, chief strategy officer for Visier, a technology company with headquarters in San Jose, Calif., and Vancouver, British Columbia, Canada, said it's worth considering what lies at the heart of the language biases in some job listings.
"There's a predisposition in language," he said. "We have a [societal] expectation that nurses are women even though we all know from a hiring perspective that's not necessarily true. Society reinforces that description. What happens is, we put that language in the job description."
Still, companies have made great strides in recent decades in shifting the public consciousness regarding jobs and who should fill them, Weisbeck added. Some of that shift has been driven by the very language in job postings.
"In a lot of jobs, we've changed the language," he said. "When I first started flying, there were stewardesses. Now, they are called flight attendants."
When Leela Srinivasan, chief marketing officer for Lever, a San Francisco-based recruiting technology company, was hired in 2015, the company's sales force was 21 percent female. Within a year, that number jumped to 42 percent. Among its 100 employees, Lever has achieved something relatively unusual for a technology company: a 50-50 ratio of women to men. Additionally, the management team is 53 percent female, the board is 40 percent female, and the company is 40 percent nonwhite, according to Srinivasan.
She credits the rapid turnaround, in part, to a basic recruitment software tool, developed by Lever, that checks whether the language in a job description is biased or balanced. The company also stopped describing what an applicant's background should look like and moved instead toward what Srinivasan calls "impact descriptions."
"Every posting that we have is now based on the same philosophy: what that person will accomplish in that role," she explained. "We'll say the person coming into this role in one month will accomplish X, 3 months will accomplish Y, 6 months will accomplish Z. Our goal is not to screen people out. As long as you can do the job, we don't really care what you look like and where you come from. We hire exceptional people, but we try not to rule people out."
Srinivasan advises recruiters and hiring managers to re-examine their job postings, paying particular attention to titles and opening paragraphs, and to make sure the language reflects the position in terms that both men and women will find appealing.
Dawn Onley is a freelance writer based in Washington, D.C.