Packages, dependencies and interaction between teams
Recently, a question on Programmers.SE went unnoticed because of its length. That's unfortunate, because it highlights a number of misunderstandings common to many teams, on subjects ranging from Continuous Integration to dependency management to collaboration between teams within a company.
This is one of those subjects which, once you understand the idea, appear extremely simple to implement, but which, until you do, are very complicated even to grasp. My answer on Programmers.SE is not particularly well written and I don't want to go into more detail there, so I'll try to do a better job here.
The story of a team
It all starts with a team developing a software product in C#. They thought a lot about the architecture and decided to have a main application, a console variant, and three libraries:
- The first one deals with a common problem encountered by many C# developers across many projects,
- The second one deals with the specifics of the company, given that other teams within the same company could benefit from it,
- The third one is specific to the product itself and has very limited value outside the scope of the project.
The project grows, and so does the team. A few months later, a dedicated team is created to maintain the first two libraries. That team is itself split in two a few months after that. This leads to numerous misunderstandings and conflicts between the teams. Since they all commit to the same repository, merges are painful, and the teams spend more time blaming each other for breaking the build than doing actual work. A few attempts to set up versioning fail because of the difficulty of handling dependencies.
This happens a bit too often. From what I've seen when auditing companies, every time multiple teams within a company try to interact, bad things happen. Bad managers will start blaming people: in their view, lazy programmers take advantage of the difficult situation to shift responsibilities onto others and spend their time doing nothing. We won't go that way; we'll blame the process instead.
Blaming the process
In this case, blaming the process is easy, because we have an excellent example where, in a very similar context, a different process is applied and works pretty well: external dependencies.
What happens when different teams are not working together, don't even work for the same company, and may not know each other personally, having never had a meeting or any other interaction whatsoever?
Some will assert that they won't be able to build a product. Well, think again.
How many times have you used Entity Framework? How often have you blamed Microsoft's developers for breaking your build after they released a new version of their ORM?
The interaction we have with teams we don't know much about is actually very easy to reason about, whether the package is Entity Framework, Json.NET, ASP.NET MVC or any other third-party dependency. When the third party releases a new version of the package, we have a choice between moving to the new version and staying with the one we were using before. Whatever they did affects us only through our explicit choice to migrate to the new version, which also means testing that everything still works with it. The developers of Json.NET, in turn, don't have to know who uses their package. They simply push a new version, and it's up to you whether to move to it.
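To make that explicit choice concrete, here is a minimal sketch of what the consuming side might look like with NuGet's packages.config; the version numbers are purely illustrative:

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- The version is pinned explicitly. When Json.NET publishes a new
       release, nothing changes here until the team edits this line,
       rebuilds and re-runs its tests. -->
  <package id="Newtonsoft.Json" version="9.0.1" targetFramework="net452" />
</packages>
```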
The change in the process is crucial, both in terms of thinking about the workflow and in terms of abstraction level. You don't have to know anything about the team behind the third-party product. You don't actually care whether the developers of jQuery use Git, SVN or TFS, and you don't care about the structure of their team or the way they work. The package becomes the product itself.
Back to our teams, who are still shouting at and blaming each other: we can't help noticing how needlessly complicated their workflow and organization are. The problem is that every team knows too much about the others, which, instead of making things easier, makes it practically impossible to develop anything. If they stop hurting each other and interact only in terms of packages, each team can pick its own version control and work in its own way, while remaining unobtrusive to the other teams:
- The first project can be moved to GitHub and receive contributions from other companies, while being distributed through NuGet,
- The second project can reside on an internal version control server and be deployed through the corporate NuGet server, accessible to staff and partners only (a consumer-side configuration is sketched after this list),
- The third project can still be developed together with the two applications, with no changes in the workflow or the infrastructure.
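As an illustration of how the first two could coexist from a consumer's perspective, here is a minimal sketch of a NuGet.Config listing both the public gallery and a corporate feed; the feed name and URL are hypothetical:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- The public gallery, hosting the first library once it is published. -->
    <add key="nuget.org" value="https://www.nuget.org/api/v2/" />
    <!-- Hypothetical corporate feed, reachable by staff and partners only,
         hosting the second library. -->
    <add key="corporate" value="https://nuget.example.corp/api/v2/" />
  </packageSources>
</configuration>
```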
This makes it easy to approach a company as a collection of services. Hosting a version control server, for instance, turns from a core element of the company into a simple service with a set of characteristics such as reliability, backups, performance and additional features. One part of the company may deploy an SVN server, another part may provide Git, and every team can freely choose among those services, depending on the quality of service, possibly on the internal cost (if the company is large enough), and, essentially, on the preferences of the team's members.
The best part is that it makes complete sense to approach the problem in terms of services. Classically, companies tend to centralize things like version control in order to reduce the cost of maintenance and support. This might work, but it may also create an internal monopoly: an unskilled system administrator may cause every developer in the company to suffer from bad service, for no good reason. Competition at this level may lead to lower costs and better service, in the same way an open market does at a larger scale.
As a side note, I suffered from that for a year at my previous employer. We had a moron in charge of TFS, so you can imagine the number of issues we got from a badly configured server which would randomly reboot or simply freeze from time to time. Using this TFS server was mandatory for every developer, which meant that when the server was down, nobody could actually work (especially given the intrusive nature of TFS, which practically requires you to be online permanently).
So we were fifty guys wasting our time with no other choice? WTF?! Why? Wouldn't it be much easier for a dedicated person annoyed by this problem to set up another TFS server, or even a free Git or SVN alternative on Linux, and encourage everyone to move to it? Why would any CEO decide to throw money away and lower the productivity of his employees?
What about bidirectional interactions?
A natural objection is that the unidirectional provider interaction (the one we seem to have with the developers of jQuery or Entity Framework) doesn't work well for internal projects within a company, where a given package may be used by two or three “customers”, not by hundreds of thousands.
The mistake here is to think that our interactions with third-party package authors are purely unidirectional. It's more subtle than that. If you encounter a bug, you can open a bug report. If you need a feature, you can ask for it. The relationship is, despite appearances, genuinely bidirectional.
What might happen is that the third party doesn't care about the bug you found in their product, or finds your feature request useless and stupid. But isn't that exactly what happens between teams in the same company? When interacting, those teams will often fight over whether a bug is really a bug, or whether a feature is really that useful and is, believe me, very, very difficult to implement. If teams disagree, the usual way to resolve the disagreement is to escalate to a common supervisor. With a packaging system, the very same escalation can happen as well. The only difference is that there would be a bias towards the provider, which may not be that bad after all, since it reduces feature creep and overall functional instability.
Microservices
The idea behind microservices is very similar when it comes to building abstractions in order to focus on a small part of the system, but many packages cannot be turned into microservices. For instance, a library such as Json.NET makes no sense as a microservice: if I delegate serialization and deserialization to a remote service, how would I send it my objects and read its results without doing serialization and deserialization all over again myself?
Packages have the tremendous benefit of being very low level. This makes their integration into the code very straightforward and free from additional layers of indirection. With a microservice, you have to go through a high-level communication channel such as HTTP, which makes it a bad choice for a whole class of situations (such as a library which serializes and deserializes JSON).
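To make the difference tangible, here is a rough sketch comparing the two approaches in C#; the serialization endpoint is entirely hypothetical, and the point is only to show the extra layer a remote service forces on the caller:

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

public static class SerializationComparison
{
    // With the package: a plain in-process call, no extra moving parts.
    public static string WithPackage(Person person)
    {
        return JsonConvert.SerializeObject(person);
    }

    // With a hypothetical "serialization microservice": the object still has
    // to be serialized just to reach the service over HTTP, which defeats the
    // purpose of delegating serialization in the first place.
    public static async Task<string> WithMicroservice(Person person, HttpClient client)
    {
        var payload = new StringContent(
            JsonConvert.SerializeObject(person), Encoding.UTF8, "application/json");

        var response = await client.PostAsync(
            "https://serializer.example.corp/serialize", payload);

        return await response.Content.ReadAsStringAsync();
    }
}
```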
This is also the drawback of packages: since they are low level, it's up to you, the developer, to make sure they have the environment they need to work properly. You can't simply drop in the URI of a service and expect everything to work magically; instead, you have to know some of the internals of the package in order to use it properly.