Letter: A hiring algo will fail to notice male braggadocio
Pilita Clark’s column and Khyati Sundaram’s response both miss the most crucial flaws in algorithmic hiring (“Let’s Keep Humans at the Heart of Hiring Practices”, Business Life, September 6, and Letters, September 7).
First, algorithms of this kind learn from data sets of previous examples, and so inevitably reinforce existing biases: they just hire "more of the same". This is why Amazon scrapped its experimental recruiting algorithm in 2018, once it emerged that the tool penalised women's CVs. Second, even if an algorithm somehow avoided these human biases, it still couldn't explain its decisions, if only because it has no "self" to explain. A human HR person can have a conversation and adjust their view. An algorithm is an unfathomable black box of digits.
Most crucially, as Mary Ann Sieghart discusses in her recent book The Authority Gap: Why Women Are Still Taken Less Seriously Than Men, and What We Can Do About It, a human will understand, from experience, the way culture and temperament condition our responses.
An HR manager will know that when a female candidate says, "I'm not sure I'm up to this, but I'd love to try", she means: "I know all about this, just let me at it!" A man who says, "I've been doing this for years, and I'm confident I'll ace it", probably means he once chatted to somebody with a similar job.
Women consistently downplay their achievements and qualifications in interviews, and men, on the whole, overstate theirs. An algorithm, faced with the woman and the man above, would take both at face value and hire the man. His human colleagues, faced with an unqualified bullshitter on day one, would not be so happy.
Sheila Hayman
Director’s Fellow, MIT Media Lab
London NW1, UK