To combat pattern-matching and other biases, a framework for how candidates are judged is crucial.
I know this sounds super corporate and stiff, but each candidate should be judged against a standard. If that standard isn't written down, it's vulnerable to bias creeping in.
The content in the rubric linked below isn't great, and most of it wouldn't fit Clef as-is, but I think it could be adapted to work really well.
Thanks, Brian. We haven't found the right answer to this yet. I think rubrics are a possible answer, but it's not clear that they actually reduce bias (though they may be an effective way to measure it).
Maybe listing the judging criteria for a position before any interviews would help, but as a first-time founder I've been learning a lot about the jobs I'm hiring for from the candidates we interview, so I'm not sure I could do a good job of that up front 😁
http://www.hope.edu/academic/education/studteach/ProfessionalInterviewScoringRubric.pdf
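Codifying the standard could be as lightweight as writing the criteria and their weights down in one place before interviews start. A rough sketch of what that might look like (the criteria, weights, and 1-5 scale here are made-up examples, not anything from the linked rubric):

```python
# Hypothetical codified rubric: every interviewer scores the same
# criteria on the same 1-5 scale, weighted into one number.
RUBRIC = {
    "technical depth": 0.4,
    "communication": 0.3,
    "collaboration": 0.3,
}

def score_candidate(ratings: dict) -> float:
    """Weighted average of per-criterion ratings (each 1-5)."""
    missing = set(RUBRIC) - set(ratings)
    if missing:
        # Forcing every criterion to be scored is part of the point:
        # no silently skipping the areas where bias likes to hide.
        raise ValueError(f"unscored criteria: {missing}")
    return sum(weight * ratings[name] for name, weight in RUBRIC.items())

# Example usage: one interviewer's scores for one candidate (~4.0 overall).
overall = score_candidate(
    {"technical depth": 4, "communication": 3, "collaboration": 5}
)
```

Even a sketch this small makes the standard explicit enough to compare interviewers against each other, which is roughly what "measuring bias" would require.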