Many companies are running high-visibility campaigns to recruit more women in line with their gender diversity policies. TCS, for instance, recently announced the initiative 'Rebegin' to attract women with work experience to rejoin after a break. Similarly, a number of tech companies such as Dell, Microsoft, Accenture and IBM have been wooing experienced women to return to work. At the same time, there is a concern that when women apply for jobs, an unconscious bias against them blocks their entry into the workforce. The entire traditional recruitment process, beginning with job descriptions and running through screening, interviewing and selection of candidates, is often coloured by traditional mindsets, especially against women.
Recognising the need to overcome gender bias in the recruitment process, firms such as Mozilla and the BBC have begun using blind hiring, that is, without requiring the candidate to specify gender or name while sending in the application. An industry survey has found that companies are increasingly using automated tools at almost every stage of the hiring lifecycle. Estimates indicate that almost 55% of HR leaders in the US are using such tools for hiring. The recruitment process has thus diversified to include tech-based assessments that help in understanding the capability of candidates from test outcomes. Lately, some have begun using AI and algorithms to remove biases and bring objectivity into the decision-making process. AI-supported language tools are enabling managers to word job descriptions with more care and remove gender-coded words.
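A minimal sketch of how such a language tool might flag gender-coded words in a job description. The word lists below are short illustrative samples of my own, not the research-derived lists that real tools use:

```python
# Minimal sketch of a gender-coded-language checker for job descriptions.
# The word lists are illustrative assumptions; production tools rely on
# much larger, research-derived vocabularies.

MASCULINE_CODED = {"aggressive", "dominant", "competitive", "ninja", "rockstar"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "interpersonal"}

def flag_gendered_words(job_description: str) -> dict:
    """Return the masculine- and feminine-coded words found in the text."""
    words = {w.strip(".,;:!?").lower() for w in job_description.split()}
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

ad = "We want an aggressive, competitive ninja to join our collaborative team."
print(flag_gendered_words(ad))
# {'masculine': ['aggressive', 'competitive', 'ninja'], 'feminine': ['collaborative']}
```

A manager could use such a report to reword the flagged terms before the advertisement goes out.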
Yet there have also been indications that AI tools may end up reinforcing biases, leading to worse outcomes for women in the context of hiring. The classic example is that of Amazon, which had to scrap its automated recruitment programme after it was found that the system had an in-built bias against women. The hiring tool was supposed to rate candidates on a scale of one to five stars, thus highlighting the best candidates that could be considered for hiring. It was trained to rate and identify candidates using an algorithm whose intelligence was drawn from the hiring patterns of the previous 10 years, when mainly male candidates had been hired. As a result, the system did not recognise women's talent, and female candidates were not considered.
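The mechanism can be seen in a toy example. This is not Amazon's actual system; it is a deliberately naive word-weight scorer, with made-up resumes, that shows how a model trained on a male-dominated hiring history penalises words correlated with women even though they say nothing about competence:

```python
# Toy illustration (not Amazon's actual system) of how a model trained on
# historical hiring decisions absorbs gender bias. The past "hired" resumes
# come from a male-dominated period, so words associated with women pick up
# negative weights.

from collections import Counter

history = [
    ("java algorithms leadership", 1),          # hired
    ("c++ systems captain chess club", 1),      # hired
    ("java women's chess club algorithms", 0),  # rejected
    ("python women's coding society", 0),       # rejected
]

hired_words = Counter()
rejected_words = Counter()
for resume, hired in history:
    (hired_words if hired else rejected_words).update(resume.split())

def learned_weight(word: str) -> int:
    # Naive weight: how much more often the word appeared in hired resumes.
    return hired_words[word] - rejected_words[word]

def score(resume: str) -> int:
    return sum(learned_weight(w) for w in resume.split())

# Identical skills; the second resume merely mentions a women's society.
print(score("java algorithms leadership"))                  # 1
print(score("java algorithms leadership women's society"))  # -2
```

The second candidate is rated lower purely because of a biographical phrase the biased history has learned to penalise, which is essentially what was reported about the scrapped tool.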
It is possible to address such concerns and build AI tools that are less dependent on human behaviour and more focused on objective, factual outcomes. In his HBR article, Tomas Chamorro-Premuzic suggests three ways to get the most out of AI tools while minimising or eliminating gender discrimination. First, by automating all unstructured interviews and eliminating human ratings, it would be possible to reduce bias and encourage meritocracy. Second, he says, AI tools can be trained to ignore attributes such as gender and focus solely on specific competencies.
Third, these tools could be trained to identify the actual performance drivers that would assess the human potential important for the organisation.
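The second suggestion, scoring on competencies while excluding gender, can be sketched as follows. The field names and weights here are illustrative assumptions, not a real scoring rubric:

```python
# Minimal sketch of competency-only scoring: protected attributes such as
# name and gender are dropped before any scoring logic can see them.
# Field names and weights are illustrative assumptions, not a real rubric.

PROTECTED = {"name", "gender", "age", "photo"}
COMPETENCY_WEIGHTS = {
    "coding_test": 0.5,
    "problem_solving": 0.3,
    "domain_knowledge": 0.2,
}

def competency_score(application: dict) -> float:
    # Remove protected attributes first, then weight the remaining scores.
    features = {k: v for k, v in application.items() if k not in PROTECTED}
    return sum(COMPETENCY_WEIGHTS[k] * features[k]
               for k in COMPETENCY_WEIGHTS if k in features)

applicant = {
    "name": "A. Candidate",
    "gender": "F",
    "coding_test": 0.9,
    "problem_solving": 0.8,
    "domain_knowledge": 0.7,
}
print(round(competency_score(applicant), 2))  # 0.83
```

The point of filtering before scoring is that two applicants with identical test results receive identical scores regardless of what the protected fields contain.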
Even with the best of attempts to make algorithms bias-free and enable fairer recruitment processes, biases can still creep into the algorithms. Hence, algorithmic audits are now proposed to ensure coding standards are followed and thereby minimise bias.
The writer is chairperson, Global Talent Track, a corporate training solutions company