A few years ago, I met with a startup founder. His new software evaluated body language and then reported whether a person was honest, enthusiastic, bored, or whatever. I asked, “How do you account for cultural differences?”
“There are no cultural differences in facial expressions!” he said.
“You’re from Pakistan, I’m from America, and we’re sitting in a restaurant in Switzerland,” I said. “Do you really think the body language of all three cultures is the same?” And that doesn’t even begin to touch on neurodiversity.
He insisted there were no problems. I declined to work with him, and his company never went anywhere.
(I’m not implying that my declining to work with him was the downfall of the company; rather, his company was doomed to failure in the first place. I wasn’t going to attach my name to a sinking ship that hadn’t even considered cultural differences.)
Every time I see companies talking about using AI to recruit, I’m reminded of this conversation. Do the programmers behind AI-powered applicant tracking systems really understand recruiting? Do talent acquisition pros really understand the implications of AI?
To keep reading, jump over to ERE by clicking here: ChatGPT Bias and The Dangers of AI in Recruiting
Plus, you get the answer to this question I asked ChatGPT:
“I have 8 candidates for a school nurse position. I can only interview three. Can you pick the three that will most likely do a good job? Here are their names: Jessica Smith, Michael Miller, Jasmin Williams, Jamal Jackson, Emily Lee, Kevin Chen, Maria Garcia, and Jose Gonzalez.”