FOSS CI/CD with GitHub Actions

Simon Zambrovski
Holisticon Consultants
10 min read · Aug 15, 2020


Cascate del Varone, © Simon Zambrovski

If you are a passionate open-source developer, you are probably not only contributing to existing free and open source software (FOSS) projects started by others, but have already started your own. If not, the time will come and you will start one.

A bunch of activities arise around such a project, from project management and release planning to managing the source code and publishing the deliverable artifacts. In the last decade, producing those artifacts from the source code has usually been automated through a continuous delivery (CD) pipeline: a set of tools integrated into a single chain responsible for building, testing and delivering the software. If your software project is not an application running somewhere but a library used by others, the delivery pipeline aims at publishing an artifact into an artifact repository.

There are a number of source code hosting platforms available on the Internet, but I’ll focus on github.com as the most popular one. In the following article, I’ll summarize the steps needed to set up a continuous integration/delivery (CI/CD) pipeline with GitHub Actions that builds a Java library and publishes it to the Sonatype Maven Central repository, one of the largest public repositories for Java artifacts.

For demonstration purposes, I created a sample reference project on GitHub https://github.com/toolisticon/foss-github-actions-java

Build pipeline

First of all, let us set up a small build pipeline. Its purpose is to build the software, run all tests, and run any kind of code analysis on every change committed by a developer. I assume you are using Apache Maven as a build tool and will focus on that (I’m aware of Gradle, but this article is about old-school Maven).

The steps you want to invoke are:

  • compile production code
  • compile test code
  • run unit tests

Apache Maven provides so-called Maven plugins for all of those steps. What is more, since these steps are very common, you don’t even need to declare the usage of the plugins in your pom.xml.

Writing unit tests for the relevant places in the code is an advanced skill. The so-called unit test coverage can be an indicator that helps to find untested code. In this tutorial, I’m using the free library JaCoCo, which has become the de-facto standard free solution for coverage metering. JaCoCo is shipped as a Maven plugin; it instruments the code before execution and creates a report after the test run, indicating which places in the code have been invoked during the tests.

Here is how the setup of the plugin works:

JaCoCo configuration to prepare the agent
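A minimal sketch of such a plugin setup (the version number is illustrative):

<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.5</version>
  <executions>
    <!-- binds the JaCoCo agent to the test JVM and stores the -javaagent flag in the argLine property -->
    <execution>
      <id>pre-unit-test</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
    <!-- creates the coverage report after the unit tests have run -->
    <execution>
      <id>post-unit-test</id>
      <phase>test</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>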

By default, JaCoCo binds the Java agent to the unit test run and stores the corresponding JVM argument in the argLine property. To use it during the JUnit test run, you need to reference it in the configuration of the Surefire plugin:

Surefire configuration to meter test coverage
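A sketch of the corresponding Surefire configuration, picking up the argLine property prepared by JaCoCo (the version number is illustrative):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.22.2</version>
  <configuration>
    <!-- pass the JaCoCo agent arguments prepared in the argLine property to the test JVM -->
    <argLine>${argLine}</argLine>
  </configuration>
</plugin>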

Apache Maven itself is required to be present on the build machine, and ideally it should be a pinned version in order to produce repeatable results. For this purpose, the so-called Maven Wrapper can be used: a small tool for jump-starting the correct version of Maven inside your project.
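If the wrapper is not part of your project yet, it can be generated once, for example with the Takari wrapper plugin (the exact command is shown as an assumption; newer Maven versions ship their own wrapper plugin):

mvn -N io.takari:maven:wrapper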

If you already have a recent version of Java on your machine, all you need to do is run ./mvnw clean verify from your command line. GitHub Actions, however, runs the build inside a Docker container, so we will need to prepare it first.

Usually, your library depends on other libraries and frameworks, which need to be available for compilation from an artifact repository. In large projects we often joke about “downloading the Internet”, since the total time consumed by this process becomes a problem. For this purpose, Apache Maven uses a local file cache on the build node. If the build node is set up from scratch every time (a fresh Docker container), we cannot directly benefit from this cache. Luckily, there is a GitHub Action for saving and restoring files used as a cache to and from the build node.
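A sketch of such a cache step using the official actions/cache action, keyed on the checksums of all pom.xml files (the action version and key names are illustrative):

- name: Cache Maven repository
  uses: actions/cache@v2
  with:
    path: ~/.m2/repository
    key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
    restore-keys: |
      ${{ runner.os }}-maven-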

Since we are collecting test coverage metrics, it is a good idea to track them along the development history of the projects. There are many tools on the Internet offering free access for FOSS projects. I’ll demonstrate the use of the CodeCov platform.

Now that we have discussed all the required ingredients, the first build pipeline can be created. The steps to be executed are:

  • checkout
  • setup JDK
  • restore cache
  • prepare Maven Wrapper
  • run build
  • upload CodeCov metrics

The steps look as follows in the GitHub Actions pipeline syntax:

Default build pipeline steps
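A minimal sketch of such a job (the action versions, the Java version and the step names are assumptions; the wrapper preparation here only makes the script executable):

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # get the sources of the current commit
      - name: Checkout
        uses: actions/checkout@v2
      # install the JDK used for compilation
      - name: Set up JDK
        uses: actions/setup-java@v1
        with:
          java-version: 11
      # restore the local Maven repository to avoid "downloading the Internet"
      - name: Restore Maven cache
        uses: actions/cache@v2
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
      # make sure the Maven Wrapper script is executable
      - name: Prepare Maven Wrapper
        run: chmod +x ./mvnw
      # compile, run the tests and create the JaCoCo report
      - name: Run build
        run: ./mvnw clean verify -B
      # upload the coverage metrics to CodeCov
      - name: Upload CodeCov metrics
        uses: codecov/codecov-action@v1
        with:
          token: ${{ secrets.CODECOV_TOKEN }}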

We want to make sure that the build pipeline runs on every commit on every branch. To do so, add the following trigger:

on:
  push:
    branches:
      - '*'
      - '**/*'

Secrets in a public repository

You probably noticed the expression ${{ secrets.CODECOV_TOKEN }} in the steps listing above. GitHub Actions provides a convenient mechanism to store secrets (like credentials, access codes and others) and use them inside the build scripts and pipelines. Much effort is spent to make sure that a secret remains a secret and can’t be revealed by a user. I don’t want to focus on how safe they really are; my assumption here is that they are safe enough, as long as you control the code base itself (for example, by reviewing pull requests from others).

A secret has a publicly visible name and a hidden value, which can be entered once and not viewed again later. There are two scopes of secrets, per repository and per organization, and a repository secret overrides an organization secret with the same name.

As you will see later, several credentials are required during the release pipeline and all of them will be stored in GitHub Secrets.

Sonatype OSS Requirements

After the creation of the build pipeline, let us collect all the requirements Sonatype imposes for publication to the Maven Central repository.

  • You need a Sonatype account
  • Your account needs permissions to publish using the specified Group Id
  • For every Java artifact (jar file), there must be sources and javadoc files
  • Every artifact must be supplied with an additional signature file (this includes pom.xml, any binary jar, sources jar and javadoc jar).
  • Additional requirements on the pom.xml (artifact name, artifact description, url, scm, distribution management, license)

That’s it, sounds easy, right? In the following sections I’ll explain how to set all this up.

Sonatype account and Group Id permissions

As described in the OSSRH Guide, you need to create a JIRA account and file a ticket to get permission to publish under a certain group id. I usually verify the group id by creating a DNS TXT record in the domain used. This works very fast, and you should get the response from Joel (the Sonatype employee) almost immediately.

Once you have permission to publish the artifact, you need to configure Apache Maven to use your credentials. To do so, the pom.xml must declare a repository in the distribution management section and specify the server id of the repository to use, and the credentials for this server must be configured in the settings.xml used during the build. There is a GitHub Action for doing so:

Setup settings.xml with credentials for the server OSSRH
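A sketch of such a step, here using the s4u/maven-settings-action as an example (the action, its version and the secret names are assumptions; the server id ossrh must match the one referenced in the pom.xml):

- name: Prepare settings.xml with OSSRH credentials
  uses: s4u/maven-settings-action@v2
  with:
    servers: '[{"id": "ossrh", "username": "${{ secrets.OSSRH_USERNAME }}", "password": "${{ secrets.OSSRH_PASSWORD }}"}]'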

I’ve seen an example of configuring the settings.xml during the setup of the JDK, but I never got it running because of Maven security, so I stick to the example above in my projects.

To publish an artifact to a Maven repository, Maven provides a default deploy plugin (enabled by default). I’m not using it, but rely on the Sonatype nexus-staging plugin instead, because it automates the deployment and the release process better. Here is the configuration in the pom.xml:

Disabled deploy plugin and nexus-staging plugin used instead
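A sketch of the corresponding pom.xml snippet (the plugin versions are illustrative; the serverId must match the credentials configured above):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <configuration>
    <!-- the default deploy is replaced by the nexus-staging plugin below -->
    <skip>true</skip>
  </configuration>
</plugin>
<plugin>
  <groupId>org.sonatype.plugins</groupId>
  <artifactId>nexus-staging-maven-plugin</artifactId>
  <version>1.6.8</version>
  <extensions>true</extensions>
  <configuration>
    <serverId>ossrh</serverId>
    <nexusUrl>https://oss.sonatype.org/</nexusUrl>
    <!-- release the staging repository automatically after all Sonatype checks pass -->
    <autoReleaseAfterClose>true</autoReleaseAfterClose>
  </configuration>
</plugin>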

Create JavaDoc and Sources archives

As we use Apache Maven as the build tool, the corresponding Maven plugins are responsible for creating the JavaDoc and sources archives. Here is the relevant part of the configuration in your pom.xml:

JavaDoc and Sources plugin configurations
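A sketch of the two plugin configurations, attaching the sources and JavaDoc jars to the build (version numbers are illustrative):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-source-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <id>attach-sources</id>
      <goals>
        <goal>jar-no-fork</goal>
      </goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>3.2.0</version>
  <executions>
    <execution>
      <id>attach-javadocs</id>
      <goals>
        <goal>jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>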

Signing artifacts

Signing artifacts is performed using the GPG (GPG2) tool. Again, a special Maven plugin is used to integrate the signing into your build process, but in contrast to the other plugins it just passes the configuration to the native program. This means that GPG must be installed on your machine (standard on *nix and macOS) if you want to test the signing locally. For the run in GitHub Actions, GPG is already installed in the Docker container.

In order to sign the artifacts, GPG requires your private key. Your public key should be publicly available on a PGP key server, so everyone can verify that the artifact has been created by you. I won’t spend time discussing how to generate a private/public key pair and how to distribute the public key; I just assume you managed to generate a key pair and that your key is protected by a passphrase.

Here are the options of the corresponding Maven plugin:

Configuration of GPG plugin
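A sketch of the plugin configuration; the key name and passphrase come from properties passed on the command line, and the pinentry arguments (an assumption) allow GPG2 to read the passphrase non-interactively:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-gpg-plugin</artifactId>
  <version>1.6</version>
  <executions>
    <execution>
      <id>sign-artifacts</id>
      <phase>verify</phase>
      <goals>
        <goal>sign</goal>
      </goals>
      <configuration>
        <keyname>${gpg.keyname}</keyname>
        <passphrase>${gpg.passphrase}</passphrase>
        <gpgArguments>
          <arg>--batch</arg>
          <arg>--pinentry-mode</arg>
          <arg>loopback</arg>
        </gpgArguments>
      </configuration>
    </execution>
  </executions>
</plugin>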

If you now run your build (remember the batch mode), it will fail, since the plugin requires gpg.keyname and gpg.passphrase to be set. If you supply the required parameters, the plugin will invoke GPG, which will then try to find the key by the given name. This works locally, but is challenging to achieve in a freshly created Docker container.

The idea is to import the key(s) from a secret prior to invoking Maven during the build of the artifact. First, let us export the key from your local machine. Since the key is provided in a binary format, we need to encode it using base64 encoding. In addition, GPG will reject a key generated on another machine and will only accept it if the so-called owner trust is present as well.

To export the owner trust, run:

gpg --export-ownertrust | base64 - | tr -d '\n' | tr -d ' '

To export the key with the name A1B2C3D4, run:

gpg --export-secret-keys A1B2C3D4 | base64 - | tr -d '\n' | tr -d ' '

Both commands produce base64-encoded strings which you need to store in your repository secrets (GPG_OWNERTRUST and GPG_SECRET_KEYS). Make sure the strings don’t contain newlines or line feeds produced by your console (join them manually into a single line if needed).

Now let’s see how GPG can be prepared for signing:

GitHub Actions steps to import the keys from secrets
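A sketch of such steps, mirroring the export commands above (the secret names match the ones created earlier):

# decode the owner trust from the secret and import it into GPG
- name: Import GPG owner trust
  run: echo ${{ secrets.GPG_OWNERTRUST }} | base64 --decode | gpg --import-ownertrust

# decode and import the secret key used for signing
- name: Import GPG secret keys
  run: echo ${{ secrets.GPG_SECRET_KEYS }} | base64 --decode | gpg --batch --import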

After the execution of these steps, GPG in the GitHub Actions container will have the key needed to sign the artifacts. The key name and the passphrase are passed as command line parameters.

Move it to a profile

We have now defined a bunch of plugins which are only needed during the artifact release, but which drastically slow down the execution of a standard build, locally or in the build pipeline. To avoid this waste of time, we should move all the release-relevant plugins into a Maven profile (or move their definitions into a pluginManagement section and reference them from the profile only). Here is how the default build block looks:

Plugins relevant for the default build only
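As a sketch, the default build block then only contains the plugins needed for every build, with the full configuration kept in pluginManagement as described above:

<build>
  <plugins>
    <!-- coverage agent and test runner, needed on every build -->
    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
    </plugin>
  </plugins>
</build>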

and here is the block relevant for the publication, in a separate profile executed in addition to the default build block:
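A sketch of such a release profile, again with the detailed plugin configurations kept in pluginManagement:

<profiles>
  <profile>
    <id>release</id>
    <build>
      <plugins>
        <!-- only executed when -Prelease is passed: sources, JavaDoc, signature and staging -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-source-plugin</artifactId>
        </plugin>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-javadoc-plugin</artifactId>
        </plugin>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-gpg-plugin</artifactId>
        </plugin>
        <plugin>
          <groupId>org.sonatype.plugins</groupId>
          <artifactId>nexus-staging-maven-plugin</artifactId>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>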

Putting it all together

Here are the relevant GitHub Actions to release on Maven Central:

OSSRH Nexus publication with GitHub Actions and Maven
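A sketch of the publication step (the secret names for the GPG key and passphrase are assumptions; the GPG import and settings.xml steps shown above precede it):

- name: Release to OSSRH
  run: >
    ./mvnw clean deploy -B
    -DskipTests -DskipExamples
    -Prelease
    -Dgpg.keyname=${{ secrets.GPG_KEYNAME }}
    -Dgpg.passphrase=${{ secrets.GPG_PASSPHRASE }}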

Please note that we use -DskipTests and -DskipExamples during the build and pass the -Prelease profile along with the credentials for GPG signing. The target phase is deploy, which activates the publication plugins.

We can skip the tests because the run with tests has passed just before (we release after the build, so the beginning of the release pipeline is a copy of your build pipeline).

I usually skip the example projects, because I don’t want to include them in the publication. That is why the example Maven module is not included directly in the parent POM, but inside a Maven profile, which can be deactivated by the flag above.

Finally, the release profile activates all the required plugins; the key name and passphrase used for signing are stored in secrets.

Some words on code / release management

After getting it technically working, the question arises how to trigger the release and how the versioning of the source code plays together with release management.

There are different approaches to this, but I like to use Gitflow as the Git branching model. With it, features are developed on feature branches and merged via pull requests (reviewed by other developers) into the develop branch. When a release needs to be created, the changes go (via a release branch) to the master branch. This way, every commit to the master branch can be used as the trigger for the creation of a release and is manifested in the artifact version.

Of course, you can execute the steps of Gitflow manually (with Git and Maven versioning), but there is a nice plugin called gitflow-maven-plugin which automates the steps for you. Its configuration is pretty straightforward:
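A minimal sketch of the configuration, assuming the gitflow-maven-plugin by amashchenko (version and branch names are illustrative):

<plugin>
  <groupId>com.amashchenko.maven.plugin</groupId>
  <artifactId>gitflow-maven-plugin</artifactId>
  <version>1.14.0</version>
  <configuration>
    <gitFlowConfig>
      <productionBranch>master</productionBranch>
      <developmentBranch>develop</developmentBranch>
    </gitFlowConfig>
    <!-- push the release commits and tags to the remote automatically -->
    <pushRemote>true</pushRemote>
  </configuration>
</plugin>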

To create a new release, update the version in the pom and finally merge everything to master, you will need to run the following command:

./mvnw -B gitflow:release-start gitflow:release-finish

You can do it manually from your developer machine, or even go a step further and trigger it from an additional GitHub Action (for example, if a release is announced and published from a milestone via the GitHub user interface).

In the end, the plugin pushes to the master branch of your repository, so you can use this as the trigger for the release pipeline:

on:
  push:
    branches:
      - master

The trigger will first build your software, then create all artifacts and finally upload them to Sonatype Nexus (OSSRH). If the upload was successful and all checks executed by Sonatype pass, the staging repository will be closed and all artifacts will be released to the release repository automatically.

The last steps of the output of your GitHub Action should look like this:

[INFO]  * Upload of locally staged artifacts finished.
[INFO] * Closing staging repository with ID "iotoolisticon-1070".

Waiting for operation to complete...
..........

[INFO] Remote staged 1 repositories, finished with success.
[INFO] Remote staging repositories are being released...

Waiting for operation to complete...
..........

[INFO] Remote staging repositories released.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for github-actions-java-parent 0.0.2:
[INFO]
[INFO] github-actions-java-parent ......................... SUCCESS
[INFO] github-actions-java ................................ SUCCESS
[INFO] -------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] -------------------------------------------------------------
[INFO] Total time: 01:31 min
[INFO] Finished at: 2020-08-13T15:25:10Z
[INFO] -------------------------------------------------------------

Please note that it can take several hours until the artifact is accessible and searchable (and findable in the artifact index). If any errors occur during the publication, have a look at the Sonatype Nexus report displayed in the Staging Repositories section.


Simon Zambrovski
Holisticon Consultants

Senior IT-Consultant, BPM-Craftsman, Architect, Developer, Scrum Master, Writer, Coach