The typical local build that developers run on their machines works by building the subproject they're working on, but also all the dependent subprojects it requires. Since building all dependent subprojects takes a lot of time, the developer usually checks out the other subprojects' sources infrequently and builds them on demand. Their focus is on the subproject they're modifying (and rightly so!). This strategy has the following drawbacks:
Because of all these problems, I have been using a different approach on my current project for the past 2 years. This was mostly motivated by the fact that the project is big (close to 100 developers) and we were hitting the issues mentioned above. I have called this strategy the "Binary Dependencies Build". If you're interested, this is an approach I have presented at both TSSS2004 and Javapolis 2004.
Here is how it works (click on image for a larger picture):
Imagine that you have a "trading" subproject that depends on 2 other subprojects ("partners" and "referenceData"). The idea is that your local build will NOT build them from sources but will instead download their latest working versions from a remote artifact repository (a location where the results of the subproject builds are stored). To accelerate the build even further, the downloaded versions are cached locally. In our example, the latest "partners" jar is already available locally and is thus not downloaded, but the "referenceData" one is not: it is downloaded and then stored locally. The "trading" subproject is built using these binary dependencies.
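To make the resolution logic concrete, here is a minimal sketch of the "download on first use, then reuse the local cache" behavior described above. The function name and directory layout are illustrative, not Maven's actual repository layout, and a plain file copy stands in for an HTTP download:

```python
import os
import shutil

def resolve(artifact, local_repo, remote_repo):
    """Return the local path of the artifact, downloading it on first use."""
    local_path = os.path.join(local_repo, artifact)
    if not os.path.exists(local_path):
        # Not cached yet (like the "referenceData" jar in the example):
        # fetch it from the remote artifact repository and store it locally.
        remote_path = os.path.join(remote_repo, artifact)
        shutil.copy(remote_path, local_path)  # stand-in for an HTTP download
    # Already cached (like the "partners" jar): reuse it without downloading.
    return local_path
```

The second time a build asks for the same artifact, the local copy is returned immediately, which is what makes the local builds fast.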
This is all fine, but there is a burning question: how do I do continuous integration with such a system? Won't the binary dependencies be old versions when I get them? The solution is to have a continuous build server that continuously builds the subprojects and puts their artifacts in the remote repository. Note that artifacts are put in the repository only if their build passes with no errors. This ensures that fresh versions are always available and that they are as "good" as they can get.
Doing it with Maven
The good news is that this feature is built into Maven. Maven implements artifact repositories (local and remote) and automatically downloads artifacts that are not available in the local repository. Usually Maven first checks whether the artifact's version exists in the local repository and, if so, uses it. However, if an artifact's version contains the "SNAPSHOT" keyword, Maven will always check whether there is a more up-to-date artifact in the remote repository. This makes it easy to implement the strategy described above.
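For example, the "trading" subproject's project descriptor (project.xml in Maven 1.x) could declare its two binary dependencies with SNAPSHOT versions. The groupId "bigproject" is purely illustrative; only the artifact names come from the example above:

```xml
<!-- Excerpt from the "trading" subproject's project.xml (Maven 1.x). -->
<dependencies>
  <dependency>
    <groupId>bigproject</groupId>
    <artifactId>partners</artifactId>
    <version>SNAPSHOT</version>
  </dependency>
  <dependency>
    <groupId>bigproject</groupId>
    <artifactId>referenceData</artifactId>
    <version>SNAPSHOT</version>
  </dependency>
</dependencies>
```

Because both versions contain the SNAPSHOT keyword, every local build of "trading" checks the remote repository for fresher "partners" and "referenceData" jars before falling back to the locally cached copies.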
We've been very happy with this solution so far. I think there are 2 key points to making this work:
So on the CI build, you perform a jar:deploy AND jar:deploy-snapshot so the local builds are just dependent on SNAPSHOT? How does the 1.0-SNAPSHOT version fit into this? When a new version is released, do you change all deps to 1.1-SNAPSHOT?
--Jeff Brekke, January 13, 2005 08:22 PM
What happens if we want to build the 'Big' project from sources, without binary dependencies? And what about circular dependencies, when one sub-project depends on another sub-project?
--Zachi Hazan, May 30, 2005 05:33 AM