Measurements as a precursor of a culture of quality
When auditing IT companies, I've noticed a pattern: companies which perform badly have problems in every imaginable area. I've never seen a company with outstanding work quality but severe security problems, or a company which does everything very well but has serious issues with its programming tools.
This is understandable: if you hire skillful, talented people, they will perform well in many aspects of their jobs. A team of talented developers simply can't ignore that they are losing productivity to the lack of version control or the absence of unit tests. On the other hand, if you hire the least desirable coders, who don't know what they are doing, they will fail at whatever they do.
In this context, a question like “How do we introduce a culture of quality into the work environment?” and comments like this one:
The way you convince management of anything is by making it a value proposition. How much money will it save them by implementing your suggestions? Show them the money.
are worth discussing, especially since this opinion is very popular among IT professionals. What could be easier? Just show the CEO how profitable his company will become once everybody starts promoting a culture of quality!
In practice, this doesn't work.
The basic logic is flawed
It doesn't matter how many examples you can list of companies which found unit tests, DevOps, version control or pair programming to be precious practices that bring in money: there will always be companies which screwed up unit tests, DevOps, version control and pair programming, and lost money in the process.
As IT professionals, we often believe that some practices bring money with 100% certainty. Take version control: any developer with a bit of self-esteem will flee a team which doesn't use it. There is absolutely no technical reason not to use version control.
However, from a business point of view, moving a team from “just put your source on FTP” to a version control system may present substantial costs and risks:
How do we set up a version control server? Using a third-party service may be easier, but it comes with a monthly cost.
How do we back up the repository? (A minimal sketch follows this list.)
What happens if the server goes down? Is it still possible for the team to work during maintenance?
How much would it cost to train the team members to use version control properly?
What are branches and how do we use them?
Our previous branching scheme appears completely stupid. How do we change it now?
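The backup question alone shows how quickly “just use version control” turns into real work. Here is a minimal sketch, assuming a Git server (Git stands in for whatever system the team picks; the URL and backup path are hypothetical):

```python
# Minimal sketch of a repository backup, assuming a Git server.
# REPO_URL and BACKUP_DIR are hypothetical values.
import subprocess
from pathlib import Path

REPO_URL = "git@example.com:team/project.git"
BACKUP_DIR = Path("/var/backups/project.git")

def backup_repository() -> None:
    """Create or refresh a bare mirror of the repository."""
    if BACKUP_DIR.exists():
        # The mirror already exists: fetch all refs, pruning deleted ones.
        subprocess.run(["git", "remote", "update", "--prune"],
                       cwd=BACKUP_DIR, check=True)
    else:
        # First run: clone a bare mirror containing every branch and tag.
        subprocess.run(["git", "clone", "--mirror", REPO_URL, str(BACKUP_DIR)],
                       check=True)

if __name__ == "__main__":
    backup_repository()  # typically invoked nightly from a scheduler
```

And someone still has to schedule this, monitor it, and regularly test that the mirror can actually be restored.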
A practical example from my last employer's workplace:
Coders who worked there lacked a basic understanding of version control, which wasted one to four hours per week per developer.
The guy in charge of the TFS server was a moron, unable to manage anything. The server was often down, wasting another one to two hours per week per developer across the whole company.
The same guy considered it funny to play with branches, despite a deep misunderstanding of version control in general and TFS in particular. This often wasted two to ten hours per month per developer.
Add to this the cost of hosting TFS in-house and the original price of the server, and using version control no longer looks like such an attractive choice. Obviously, it would be different if the company hired people who knew what they were doing.
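To put rough numbers on those hours, here is a back-of-the-envelope calculation; the team size and hourly rate are assumptions, and the midpoints are taken from the ranges above:

```python
# Back-of-the-envelope cost of the version control problems described
# above. Team size and hourly rate are hypothetical assumptions.
DEVELOPERS = 10          # assumed team size
HOURLY_RATE = 50         # assumed fully loaded cost per hour, in dollars

misuse = 2.5 * 52        # 1-4 h/week of version control misuse, midpoint
downtime = 1.5 * 52      # 1-2 h/week of server downtime, midpoint
branch_games = 6 * 12    # 2-10 h/month of branch mishandling, midpoint

hours_per_dev = misuse + downtime + branch_games
annual_cost = hours_per_dev * DEVELOPERS * HOURLY_RATE

print(f"{hours_per_dev:.0f} hours lost per developer per year")
print(f"${annual_cost:,.0f} lost per year for the team")
```

Even with conservative midpoints, that is 280 hours per developer per year, a waste that easily exceeds the hosting and license costs. Which is exactly the kind of number that should be measured rather than guessed, as the next section argues.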
Tools, patterns and practices are great when they are used right. When they are not, the best tool drags team members down instead of empowering them.
Measure the effect of a change before making the change
The bad news is that companies in a difficult position usually don't have metrics. Either they measure nothing, or, when they actually measure something, they do it wrong or don't use the results.
This makes any improvement proposal problematic. Say you want to introduce unit testing, and you convince management that unit testing makes the company more profitable by reducing the losses related to bug fixing. The problem is that nobody in this company knows how much, exactly, the company spends on debugging, or how many bugs are reported and solved.
So two months later, another developer tells management that unit tests actually harm productivity and cost money. He has no data to support his claim, but you have no data either, which means that the winner is not the one who is right, but the one who is closer to management.
Measure the improvement brought by a change, and make sure the results are public and taken seriously.
The hard part is making it possible to gather the statistics in the first place. Often, this requires a change in the process, a change that a team may not be ready to accept.
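Even a baseline can be cheap to obtain, though. As a minimal sketch, assuming the bug tracker can export its history as a CSV file with ISO-formatted "opened" dates and an "hours_spent" column (a hypothetical format):

```python
# Minimal sketch of establishing a baseline metric before a change,
# assuming the bug tracker exports issues as CSV with "opened" (ISO date)
# and "hours_spent" columns; both the file and columns are hypothetical.
import csv
from collections import defaultdict
from datetime import datetime

def weekly_bug_stats(csv_path: str) -> dict:
    """Count bugs opened per ISO week and sum the hours spent fixing them."""
    stats = defaultdict(lambda: {"opened": 0, "hours": 0.0})
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            opened = datetime.fromisoformat(row["opened"])
            week = opened.strftime("%G-W%V")     # e.g. "2024-W07"
            stats[week]["opened"] += 1
            stats[week]["hours"] += float(row["hours_spent"] or 0)
    return dict(stats)

if __name__ == "__main__":
    for week, s in sorted(weekly_bug_stats("bugs.csv").items()):
        print(f'{week}: {s["opened"]} bugs opened, {s["hours"]:.1f} h spent')
```

Run for a few months before unit tests are introduced, a script like this turns “unit testing saves money” from an opinion into a comparison between two sets of numbers.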
People don't change, and neither does the culture
In the end, what matters most is the team and the culture. There are teams of inexperienced coders who are willing to make progress and are encouraged by their management; they will ultimately find a way to improve. But there are also teams which lack basic communication skills, are managed by people who resist change, or whose members simply don't care about their job.
Part of my job as a consultant is to determine whether the team and its management want change. If either of them couldn't care less, there is not much I can do.