Yup, that was the question, and it all started with an article by the CBC. I had to read it twice because I could not believe my eyes. But no, I had not read it wrong, and that is where the howling began. Let's start at the beginning. It all started with ‘Want a job? You’ll have to convince our AI bot first’, the story (at https://www.cbc.ca/news/business/recruitment-ai-tools-risk-bias-hidden-workers-keywords-1.6718151) gives us “Ever carefully crafted a job application for a role you’re certain that you’re perfect for, only to never hear back? There’s a good chance no one ever saw your application — even if you took the internet’s advice to copy-paste all of the skills from the job description”. This gives us a problem on several fronts, but the two I am focussing on are IT and recruiters. IT is the first. AI does not exist, not yet at least. What you see are all kinds of data-driven tools, primarily built on Machine Learning and Deep Learning. First off, these tools are awesome. In their proper setting they can reduce workloads and automate CERTAIN processes.
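And that ‘copy-paste the skills’ advice exists for a reason. Most of these filters boil down to little more than keyword overlap. Here is a minimal sketch of the idea (plain Python, invented names and keywords, not any vendor's actual product) showing why stuffing the job description into your resume ‘works’:

```python
# Hypothetical sketch of a naive ATS keyword filter - not any vendor's
# actual product. The score is just keyword overlap, nothing more,
# which is exactly why copy-pasting the job description games it.

def keyword_score(resume_text: str, job_keywords: set[str]) -> float:
    """Fraction of required keywords that literally appear in the resume."""
    words = set(resume_text.lower().split())
    hits = sum(1 for kw in job_keywords if kw in words)
    return hits / len(job_keywords)

job_keywords = {"sql", "excel", "python", "stakeholder", "agile"}

honest = "Ten years of database reporting, advanced spreadsheets, scripting"
stuffed = "sql excel python stakeholder agile sql excel python"

print(keyword_score(honest, job_keywords))   # 0.0 - never seen by a human
print(keyword_score(stuffed, job_keywords))  # 1.0 - straight to the top
```

The honest candidate describes the same skills in their own words and scores zero; the stuffer scores a perfect match.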
But these machines cannot build, they cannot construct and they cannot deconstruct. To see whether a resume and a position actually match, you need the second tier, the recruiter (or your own HR department). There are skills involved, and at times this skill is more of an art. Seeing how well a person fits the position is an art. You can test via a resume whether the minimum skills are there. Yes, at times it takes a certain Excel level, it might take SQL skills or perhaps a good telephone voice. A good HR person (or recruiter) can see this. Machine Learning will not ever get it right. It might get close.
So whilst we laugh at these experts, the story is less nice, the dangers are decently severe. You see, this is one side of cost reduction, all whilst too many recruiters have no clue what they are doing, and I have met a boatload of them. They will brush it off with “This is what the client wants”, but it is already too late, they were clueless from the start and it is getting worse. The article also gives us a nice handle: “They found more than 90 per cent of companies were using tools like ATS to initially filter and rank candidates. But they often weren’t using it well. Sometimes, candidates were scored against bloated job descriptions filled with unnecessary and inflexible criteria, which left some qualified candidates “hidden” below others the software deemed a more perfect fit.” It is the “they often weren’t using it well” part. You see, any machine learning is based on a precise setting; if the setting does not fit, the presented solution is close to useless. And it goes from bad to worse. You see it in “even when the AI claims to be “bias-free.”” EVERY Machine Learning solution is biased. Bias through data conversion (the programmer), bias through miscommunication (HR, executive and programmer misalignment), and that list goes on. If the data is not presented correctly, it goes wrong and there is no turning back. As such we could speculate that well over 50% of firms using ATS are not getting the best applicant; they are optionally leaving them to real recruiters, and as such handing them to their competitors. Wouldn’t that be fun?
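That “hidden” effect is easy to demonstrate. A small sketch (invented criteria and candidates, plain Python, not any real system) of hard-criteria filtering against a bloated job description: one padded must-have and the qualified candidate never reaches a human:

```python
# Hypothetical sketch of hard-criteria filtering - invented data, no real
# vendor. A bloated job description adds inflexible "must-haves"; anyone
# missing even one is dropped before a human ever ranks the rest.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    skills: set[str]

# The job actually needs SQL and Excel. The posting also demands a
# certification and "agile" because someone padded the description.
must_haves = {"sql", "excel", "cert_xyz", "agile"}

candidates = [
    Candidate("veteran analyst", {"sql", "excel", "reporting", "training"}),
    Candidate("keyword stuffer", {"sql", "excel", "cert_xyz", "agile"}),
]

survivors = [c for c in candidates if must_haves <= c.skills]
print([c.name for c in survivors])  # ['keyword stuffer'] - the analyst is hidden
```

The veteran analyst is fully capable of the actual job, yet the inflexible filter throws them out before ranking even begins.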
So then we get to “So for now, it’s up to employers and their hiring teams to understand how their AI software works — and any potential downsides”, which is a certain way to piss your pants laughing. It is a more personal view, but hiring teams tend to be decently clueless on Machine Learning (what they call AI). That is not their fault, they were never trained for this, yet consider what they are losing out on. Consider a person who never had military training; you now push them into a war zone with a rifle. So how long will this person stay alive? And when this person was a scribe, how will he wield his weapon? Consider the man was a trumpeter and the fun starts.
The data mismatch keeps this person alive by stating he is not a good soldier. Lucky bastard.
The foundation is data, and filling jobs is the need of an HR department. Yes, machine learning could optionally reduce the time spent going through the resumes. Yet bias sets in at age: ageism is real in Australia, and they cannot find people? How quaint, especially in an ageing population. Now consider what an executive knows about a job (mostly any job) and what HR knows, and consider how most jobs get lost in translation in any machine learning environment.
Oh, and I haven’t even considered some of these ‘tests’ that recruiters have. Utterly hilarious, and we are told this is now up to what they call AI? Oh, the tears are rolling down my cheeks, what fun today is, Christmas day no less. I haven’t had this much fun since my father’s funeral.
So if you wonder how stupid it can get, see how recruiters are destroying a market all by themselves. They had to change gears and approach at least 3 years ago. The only thing I see is more and more clueless recruiters, and they are ALL trying to fill the same position. And the CBC article also gives us this gem: “it’s also important to question who built the AI and whose data it was trained on, pointing to the example of Amazon, which in 2018 scrapped its internal recruiting AI tool after discovering it was biased against female job applicants.” So this is a flaw of the lowest level, mere gender. Now consider that recruiters are telling people to copy LinkedIn texts for their resume. How many more biases and wrong filters will pop up? Because that is the result of a recruiter too, they want their bonus and will get it any way they can. So how many wrong hires have firms made in the last year alone? Amazon might be the visible one, but that list is a lot larger than you think, and it goes to the global corporate top.
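For those wondering how a recruiting tool ends up biased against women in the first place: a toy illustration (invented data, assuming scikit-learn is available; this is not Amazon’s actual system) of a model trained on historically skewed hires learning to penalize a gendered word, the reported failure mode:

```python
# Illustrative sketch of how a resume model inherits bias from its
# training data - invented toy data, not Amazon's actual system.
# Historical hires skew male, so a gendered token ends up correlating
# with "rejected" even though it says nothing about skill.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "sql excel leadership",            # hired
    "python sql reporting",            # hired
    "excel python leadership",         # hired
    "womens chess club captain sql",   # rejected
    "womens coding society python",    # rejected
    "excel reporting womens network",  # rejected
]
hired = [1, 1, 1, 0, 0, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(round(weights["womens"], 2))  # negative: the "bias-free" filter
                                    # now penalizes the word itself
```

Nobody told the model to look at gender; it found the proxy on its own, because the historical data handed it one.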
So consider what you are facing, consider what these people face and laugh, it’s Christmas.
Enjoy today.

