Recently I have been reviewing our current test suite, looking for ways to improve our tests. As we use Bitbucket, we have spent some time investigating Pipelines to improve how we run them. Whilst Pipelines is a very useful tool, we have found limitations that make it impractical for us. However, that may not be the case for you. What follows is a rundown of my experience setting up and using Pipelines.
What is it?
Pipelines is a service provided by Atlassian that allows you to integrate tests and/or build processes into your repository. A pipeline is added to a repository and lets you load a Docker image, clone your repository into it and perform whatever tests or build processes you require. As you can imagine, this is a very useful tool, allowing you to run unit tests on any and all changes in a repository. This makes it much easier to ensure your code is safe and meets your requirements.
How can I use it?
Pipelines are easy to set up and configure, but there are a lot of options to choose from. I don’t want to go into specifics here, especially when Atlassian already has documentation covering the subject, such as this page to get you started and this one for further configuring your pipeline.
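To give you an idea of what the configuration involves, here is a minimal bitbucket-pipelines.yml sketch; the image and script lines are placeholders you would swap for your own:

# Docker image the pipeline runs in
image: atlassian/default-image:2

pipelines:
  default:            # runs on every push
    - step:
        script:       # commands executed inside the container
          - echo "run your build or tests here"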
What are the advantages of using Pipelines?
As the pipeline executes on a Docker image, the code cloned to it is untouched by further commits whilst it is running. This means you can guarantee that the tests are run against each commit you make to your repositories. On top of this, the status of the pipeline is displayed for each commit in the repository, making it easy to see exactly which commits have passed or failed, and exactly which code caused a failure, all from a single location.
Using Pipelines makes scaling your tests easier. Your pipeline executes for each commit in a new Docker image, and should a new commit come in whilst the pipeline is running, another Docker image is created for it. As requirements grow, the pipelines grow with them, rather than leaving you restricted by the hardware you have available.
Ease of set-up is another advantage of using Pipelines. It takes one click (and admin rights) to turn on Pipelines for a repository. After that single click you find yourself on a page with template config files already created for different tasks; for instance, selecting the Maven template gives you a pipeline ready to run mvn -B verify. There is even a validator to help you check for issues before you commit a config file. Atlassian have made it incredibly easy to get a pipeline set up and running.
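For example, the Maven template produces a config along these lines (the exact image tag Atlassian offers may differ from the one I have used here):

# Maven build environment
image: maven:3.6.3

pipelines:
  default:
    - step:
        caches:
          - maven            # predefined cache for the local Maven repository
        script:
          - mvn -B verify    # batch-mode build and test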
What are the disadvantages of using Pipelines?
A big disadvantage for me is the way multi-module Maven projects work with Pipelines. In many cases they don’t work out of the box unless your project is set up to store module artefacts on a remote server where they can be accessed. Pipelines has an option to cache directories to help speed up the build process by storing data between builds. There are some predefined caches, including one for the pipeline’s local Maven repository. However, these are not shared across different pipelines, so if you were hoping you could share artefacts across pipelines, I am afraid I have to disappoint you.
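For reference, caches are declared in the config file, and you can define your own alongside the predefined ones; in this sketch the custom cache name and path are made up for illustration:

definitions:
  caches:
    module-artefacts: target/shared-artefacts   # hypothetical custom cache

pipelines:
  default:
    - step:
        caches:
          - maven              # predefined Maven cache
          - module-artefacts   # custom cache defined above
        script:
          - mvn -B verify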
Another disadvantage is that you can’t store data for your pipeline reliably. With the pipeline working off a Docker image, you get a fresh image each time, so any data created on the previous run is lost unless it is cached. However, caches expire after 7 days and may be cleared at any time, so you need to ensure the pipeline can work with or without them. This means any test that outputs files or relies on baseline files will only work if you keep those files in a remote location and bring them in when needed. Unfortunately, doing this does not work well for tests that involve large numbers of input or baseline files.
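One way to cope is to treat the baseline files as a cache and fall back to fetching them when the cache is empty, roughly as sketched below; the directory, archive URL and cache name are hypothetical, and it assumes curl is available in the image:

definitions:
  caches:
    baselines: baselines       # hypothetical cache holding the baseline files

pipelines:
  default:
    - step:
        caches:
          - baselines
        script:
          # re-download the baselines if the cache was missing or has expired
          - if [ ! -d baselines ]; then curl -sSL https://example.com/baselines.tar.gz | tar xz; fi
          - mvn -B verify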
–EDIT–
Since the article went live, we have received the following from Matt Ryall via our Twitter profile.
I’d add that AWS S3 (us-east-1) is a good option for ‘large numbers of input or baseline files’. We’re hosted there, so it’s very fast to pull down the bits you need. Keeping state on the build server can cause non-reproducible failures, so we actually recommend this approach.
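Following that suggestion, pulling the files down from S3 might look something like this; the bucket name is made up, and it assumes the AWS CLI is available in the image with credentials supplied as secured repository variables:

pipelines:
  default:
    - step:
        script:
          # AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY set as secured repository variables
          - aws s3 cp s3://example-baselines-bucket/baselines ./baselines --recursive
          - mvn -B verify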
Final Thought
Pipelines looks to be a very useful tool in the right circumstances. If you are looking for a solution for running unit tests on individual methods or classes in a given code base, then it could be the solution for you. If you are looking for a solution that allows you to run higher-level tests, then you will need to keep looking.