As a software developer I’m always bemused by the leaders of industry who lament the lack of skilled workers to fill open positions in my field. It seems that too many forget one of the most basic rules of capitalism: If demand is low, you’re not offering enough value. As it fits this scenario: If developers aren’t applying for your position, you’re not offering enough compensation.
This is rather cynical, but I imagine that these pronouncements aren’t for my ears anyway: They’re made to justify keeping “information technology professionals” on the overtime-exempt list, or to raise immigration caps for technology workers. (Which is often well justified, but I wonder at the distortion it introduces to the domestic labour pool.)
The first thing that makes me wonder about the supposed “skills gap” is that, when pressed for more evidence, roughly 10% of employers admit that the problem is really that the candidates they want won’t accept the positions at the wage level being offered. That’s not a skill shortage, it’s simply being unwilling to pay the going price.
But the heart of the real story about employer difficulties in hiring can be seen in the Manpower data showing that only 15% of employers who say they see a skill shortage say that the issue is a lack of candidate knowledge, which is what we’d normally think of as skill. Instead, by far the most important shortfall they see in candidates is a lack of experience doing similar jobs.
Employers are not looking to hire entry-level applicants right out of school. They want experienced candidates who can contribute immediately with no training or start-up time. That’s certainly understandable, but the only people who can do that are those who have done virtually the same job before, and that often requires a skill set that, in a rapidly changing world, may die out soon after it is perfected.
Another way to describe the above situation is that employers don’t want to provide any training for new hires, or even any time for candidates to get up to speed. A 2011 Accenture survey found that only 21% of U.S. employees had received any employer-provided formal training in the past five years. Does it make sense to keep vacancies unfilled for months to avoid having to give new hires with less-than-perfect skills time to get up to speed?
Employers further complicate the hiring process by piling on more and more job requirements, expecting that in a down market a perfect candidate will turn up if they just keep looking. One job seeker I interviewed in my own research described her experience trying to land “one post that has gone unfilled for nearly a year, asking the candidate to be not only the human resources expert but the marketing, publishing, project manager, accounting and finance expert. When I asked the employer if it was difficult to fill the position, the response was ‘yes, but we want the right fit.’”
Another factor that contributes to the perception of a skills gap is that most employers now use software to handle job applications, adding rigidity to the process that screens out all but the theoretically perfect candidate. Most systems, for example, now ask potential applicants what wage they are seeking, and toss out those who put down a figure higher than the employer wants. That’s hardly a skill problem.
Meanwhile, applicants are typically assessed almost entirely on prior experience and credentials, and a failure to meet any one of the requirements leads to elimination. One manager told me that in his company 25,000 applicants had applied for a standard engineering job, yet none were rated as qualified. How could that be? Just put in enough of these yes/no requirements and it becomes mathematically unlikely that anyone will get through.
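The arithmetic here is worth making explicit: independent pass/fail screens compound multiplicatively, so even generous individual pass rates can eliminate everyone. A minimal sketch, with entirely hypothetical numbers (the 25,000 pool is from the anecdote above; the per-screen pass rate and screen count are assumptions for illustration):

```python
# Hypothetical illustration: how stacked yes/no requirements can
# mathematically eliminate an entire applicant pool.
applicants = 25_000   # pool size from the anecdote above
pass_rate = 0.6       # assumed share of candidates passing any one screen
screens = 20          # assumed number of independent yes/no requirements

# Probability of clearing every screen shrinks multiplicatively.
p_all = pass_rate ** screens
survivors = applicants * p_all

print(f"Chance one candidate clears all screens: {p_all:.6f}")
print(f"Expected qualified candidates out of {applicants:,}: {survivors:.2f}")
```

With these assumed figures the expected number of survivors drops below one, which is the "none of 25,000 were qualified" outcome without any shortage of skills at all.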