The hiring process is inherently wrong
Managers regularly ask me questions about the hiring process. All of them complain about three things:
- They have a lot of bad candidates—candidates who clearly don't qualify for the job.
- They have few good candidates.
- They often hire people who end up being average or below average.
Many of those managers complain that they've read all my articles on the subject, and while that improved their understanding of the hiring process and its outcome, the change wasn't that radical. This upsets me a lot. Years ago, I thought that slight changes to the focus during the interview and minor modifications of the process would lead to drastic improvements in the quality of the hires. Practice shows that I couldn't be more wrong.
Back in 2014, I wrote an article about the tags shown on Stack Overflow Jobs. My point was that the relevance of those tags to a given person's profile is extremely low. In February 2021, my profile shows that I'm top 1 for
The same year, I started writing a series of four articles (1, 2, 3, and 4) about the hiring process. In the series, I explain how to formalize the need, how to stop writing fluff in job postings, what the actual goal of a job posting is and how to write one, when to take initiative, who should be interacting with the candidates, how to conduct an interview, how long an interview should be, and how to behave after the interview.
Still in 2014, I also wrote an article about the dumbest hiring practice: self-evaluation, that is, when candidates are asked to rate themselves on a scale of zero to five for a given skill. Judging from the feedback I received on this article, I contributed to making a few dozen companies a bit less dumb about the questions they ask. That's a good thing.
Finally, the year ended with another article, about the low quality of the websites which list interview questions.
In 2017, I wrote about the key error of the hiring process: the focus on technologies, as well as about the focus on something which looks like metrics, but is just irrelevant fluff.
The next year, things started to change. A discussion with a manager showed me that my vision of the hiring process was extremely optimistic and unrealistic. Most companies are not looking for skillful people, because actual skills are not the qualities valued in the corporate world. What matters more is a person who's good at politics, who is able to work despite crappy working conditions, and who gives the impression of making progress.
This article was followed in 2019 by another, explaining the reasons why a company can't test candidates based on the things that actually matter: self-deception and misinformation.
While those articles show that the hiring process currently used by most companies is primitive, the latest ones—as well as the actual experience of the managers I mentioned above—also hint that improving the process is not that easy. In retrospect, I'm inclined to believe that the process is unrecoverable: either it is so flawed that no single change could make it work, or my approach to the problem itself is completely wrong.
What's the goal of the hiring process, anyway?
There is not one goal, but two. The first is to attract people who are qualified. The second is to filter out people who are not.
Finding qualified people
There are companies which are great. Well-known. Popular. Reputable. Many people want to work for them.
And then there are the others. Maybe they are not that popular. Or maybe they are just not that great. Still, they have to compete with others in order to find people who will work for them rather than for their competitors.
This is not very different from selling products or services to customers. Here, the customer is the candidate, and the service is the monthly wage and the opportunity to work for a given company, with the actual work and the person's time given in exchange (whereas it's usually the person's money and time when products and services are sold to a customer).
The practices are therefore quite similar too. They may include advertising, direct solicitation, or just one guy telling another that his workplace is not so bad, and that the other guy should consider a position there. That's not particularly interesting; let's talk instead about the second goal of the hiring process.
Filtering out unqualified people
If there were no hiring process, what would the alternatives be? Let's imagine a company where anyone could walk in and decide to work there. You don't have to ask permission: you just come in and say: “hey, I'm an employee now.”
It sounds fanciful, but there is a practical example of such a system actually working: open source projects. If I want to contribute to the source code of Linux, or go and fix an issue in Visual Studio Code, I don't have to call a human resources department and go through several hours of interviews to check that I have the necessary skills and mindset to work on those projects.
And it works.
Open source projects, however, are different. If I'm an employee of a given company, I want this company to pay me on a regular basis. Moreover, if I want to work on a new system that helps land F-16s on an aircraft carrier, I probably need a few clearances to access information that other people shouldn't be able to access. In other words, the hiring process is a gate to a monthly salary and a series of clearances. Put differently, it's a safety mechanism which protects the company from having to share its secrets and money with anyone who shows up asking for them.
Interesting.
How does evaluating my C++ skills or asking how good I am at communicating with other people serve this purpose? Why wouldn't a bank ask me questions about Python string formatting or the basics of HTTPS encryption before lending me money?
You can't know what you are looking for
One of the things that has been bothering me since 2014 is the number of companies which live in the self-inflicted fallacy that they know what they need in terms of people. I'm not even talking about those dumb postings looking for a PHP programmer—I'm talking about companies which are a bit less primitive, but still look for a software developer, or a project manager, or a data scientist. They are missing the point.
When people ask me what my job is, I don't know what to tell them. I really don't know. There is no official name for it. My expertise is quality and productivity—that is, what a given team needs to do in order to achieve better quality while being more productive. But I also do programming and software development, interaction design, data presentation, system administration, process automation, security, training, and a dozen other things. Heck, my photography skills and the things I've learnt while studying psychology are crucial in many of the projects I do, and they can bring very concrete value to a company hiring me. But the last time I was asked about my photography skills during an interview was in another life.
One company benefited from the fact that I'm fluent in Russian to understand how their system should be internationalized. Did they look for this particular skill when hiring me? Nope, they didn't. They didn't even know they needed it, until I stepped in and explained that the way they do internationalization is all wrong.
In another company, I had to develop a piece of software from scratch. It's a piece of Python software, if you ask, but that's not important. While my Python skills were valuable, far more valuable were my professional experience in interaction design (the company has a legacy of making particularly unusable software, and the thing I made may convince them that it's about time to start investing more in design if they want all their other software to look like it), my skills in requirements elicitation, and my organizational skills. Funny thing: during the interview, I had to answer questions about... C# and SQL.
Every job had a name. Sometimes I was a project manager, while I brought the most value in unit testing, software quality, and programming. Sometimes I was called an analyst in an R&D department: nobody there did any development or research, and there was no analysis involved either. Sometimes I was an architect, but the actual job didn't involve architecture; my value there was in software design, communication, training, interaction design, security, application design, and a few other things.
Now, what value does a labeling system have when the labels are completely useless? Why would anyone create an artificial categorization system which serves no purpose, and which is, by its nature, meaningless at best and often misleading?
One may say that I have an exceptional profile, ten years of experience, blah blah. I don't. Everyone working in IT is in the exact same situation. That's a characteristic of human beings: no one is interested in one and only one thing. No one is born to be a project manager, or a Ruby programmer. You can't look for people the way you look for a new fridge.
The core problem is that it's the system itself which categorizes IT people. He's a web developer. She's a security expert. Fresh out of college, you look at the job offerings, and all you see are categories. Here's a bunch of companies looking for a Python programmer. There, they're looking for Node.js guys. And here's a posting for a visual designer.
CVs follow the same categorization. If you've been a PHP programmer for the last decade, there is a strong chance that you'll have to respond to the postings looking for PHP programmers. Switching from one category to another is possible, but difficult.
More importantly, categories are for stuff nobody should care about. I don't care that a guy worked on Java projects for the past ten years if my company uses Ruby exclusively. If the person is able and willing to learn, the lack of any prior expertise in Ruby would be my least concern. On the other hand, there are outstanding qualities that don't have any category. Being a good guy is one of them. You know, the person who is the social catalyst of a team—when he's around, everybody is happy and things get done. As a productivity consultant, I can assure you that people like that are invaluable: they are sometimes the most valuable asset you can have in your team. But I've never seen a job posting saying that a company is looking for a good guy.
Try to remember the people you were working with five years ago. Do you remember which technologies they put on their CVs? Or do you instead remember that this guy was the one who could make progress no matter what—and who dealt with the most problematic issues in the company, saving a huge amount of money; or this gal who strove for excellence in everything she did; or that one, who had amazing creativity, imagining solutions no one else was thinking of?
When you decide to read a book, do you interview its author to check that it matches your expectations of what you'll learn from it? Do you even know what you need to learn in the first place? When you meet someone, do you run a hiring interview to decide whether the person can be your friend, and whether they match your expectations of what they will bring to your vision of friendship and your current needs?
Possible alternatives
I have explained why the hiring process is inherently wrong, beyond repair, but I haven't suggested anything else. Indeed, I don't have a magical solution right now, although I'm thinking about alternatives.
For jobs involving a lot of programming, I can imagine a company going open source and deciding to hire (that is, start paying) the people who contribute a lot. I wouldn't be surprised to learn that Microsoft ends up hiring developers who have made solid contributions to .NET Core or other Microsoft open source products. The problem with this approach is that while a person can be a great contributor to an open source project, that doesn't say anything about his qualities when he has to work in an office.
Or maybe, somewhere in the future, a company would simply take a person in whenever he wants to be part of the company, and postpone the decision of whether to keep him for a few weeks. During those weeks, the person's progress would grant him clearances that a newcomer doesn't have, as well as a better financial reward: if the person has to leave on his first day of work, he doesn't get any money; if he leaves a week later, he gets a small reward; if he leaves after a month, he gets a much larger reward; and if he stays longer than that, he can expect a usual salary.
Naturally, such a system would be subject to some controls: first, the company needs to have enough money to pay the newcomers, which limits the number of people who can join at a given moment. Second, while the newcomer can decide for himself what value he can bring to the company, it's up to his peers—the people who are already part of the company—to evaluate his actual impact. How to build such a system, preventing fraud while fostering innovation, is a complex subject.
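To make the idea a bit more concrete, here is a minimal sketch of what the tiered reward rule above could look like. The week thresholds and ratios are my own assumptions for illustration; the scheme only distinguishes between no money on day one, a small reward, a much larger reward, and a usual salary.

```python
def trial_reward(weeks_worked: int, weekly_salary: float) -> float:
    """Payout for a newcomer who leaves after `weeks_worked` weeks.

    The thresholds and ratios below are illustrative assumptions; the
    only idea is that the reward grows with the time spent, from
    nothing on the first day up to a regular salary.
    """
    if weeks_worked < 1:
        return 0.0  # leaves on the first day: no money
    if weeks_worked < 4:
        return 0.25 * weekly_salary * weeks_worked  # small reward
    if weeks_worked < 8:
        return 0.75 * weekly_salary * weeks_worked  # much larger reward
    return weekly_salary * weeks_worked  # usual salary from then on


if __name__ == "__main__":
    for weeks in (0, 2, 6, 12):
        print(f"{weeks} week(s): {trial_reward(weeks, weekly_salary=1000):.0f}")
```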
I'm not at all sure how such an organization could be implemented, nor whether it would work. Anyway, this can serve as inspiration for people who own companies, if they want to try something new instead of the broken hiring process we have now. And if you do try something new, let me know where it got you.
As for the others, accept that you're working with a terribly flawed process, and try to minimize its impact. Yes, the hiring process will lead to bad results, because it is itself bad. By applying the few practices I've described in my previous articles on the subject, you can make it less bad than it is in most companies, which is in itself a rather positive thing. One day, hiring as we know it will be over, and that will be great.