Debian builds with Jenkins

Goals

The aim of this work was to ease and automate the way debian packages are created for the supported middleware security software. Just like the Koji Testbed for automated RPM packaging, a similar solution was proposed with the use of Jenkins. A Jenkins job can be configured for every component, to be run on a debian build node. This job can then be used to create packages for multiple distributions and architectures in a clean environment with the use of cowbuilder. Much of this work was built on top of the debian building procedure already outlined before.

Prerequisites

There are a couple of prerequisites assumed to be already in place:

Note: The use of the Debian Package Builder plugin was discarded because it does not provide a clean build environment

  • The latest stable debian image (jessie at the time) configured as a Jenkins slave (the debian build node)
  • Software installed on the debian build node:
apt-get install dh-make autotools-dev dh-autoreconf build-essential devscripts cdbs quilt \
                debhelper fakeroot lintian pbuilder cowbuilder svn-buildpackage maven-debian-helper

Package building jobs

The list of available jobs for debian packaging can be found in our local Jenkins instance, under the DEBIAN-BUILDS tab. We started out creating Jenkins jobs based on the recommendations of jenkins-debian-glue, but soon started deviating from it as more and more customization was needed to fit specific use cases. For every package that needs to be built for debian we dedicate two separate Jenkins jobs, as suggested in the setup guide of jenkins-debian-glue. The <package-name>.source job builds the source package, and the <package-name>.binaries job builds all binary packages for the different architectures and distributions. These two can be executed independently of each other. The creation of these jobs is outlined below.

Building source packages

The steps taken by a <package-name>.source job are:

  1. Restrict where this project can be run: must be a debian slave
  2. Delete previous workspace
  3. Source Code Management: svn or git checkout of the debian subdirectory containing the relevant files [1] into a directory called 'source'
  4. Execute source building script

The first 3 steps of the job are straightforward, while the last step is the one that does the actual work. The source building script differs for svn checkouts (local projects) and for git checkouts (adopted projects). For local projects coming from our local svn repository we execute the following build script:

ORIG_DIR=$(pwd)

cd source
dch --distribution unstable --release ""

svn upgrade

if [ -f debian/orig-tar.sh ]; then
    chmod +x debian/orig-tar.sh
fi
mkdir -p ../tarballs
uscan --download-current-version --destdir ../tarballs
cp ../*.tar.gz ../tarballs || true

svn-buildpackage -S --svn-builder dpkg-buildpackage -d --svn-move-to=${ORIG_DIR} --svn-dont-purge -uc -us --svn-ignore-new -rfakeroot

lintian -IiE --pedantic `find . -type f -name '*.changes'` || true

First, the changelog is modified to reflect a new build via the dch command. After an svn upgrade, the tarballs containing the sources are fetched with the use of uscan and/or the aid of the debian/orig-tar.sh script (hence it is important for it to have the executable flag set). In case the custom debian/orig-tar.sh downloads the tarballs into the parent directory, we make sure they are copied into the tarballs directory, where svn-buildpackage expects them to be. Finally, the source package is built with svn-buildpackage, and lintian checks are executed on the results.
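
For reference, a debian/orig-tar.sh helper of this kind could look roughly like the sketch below; the package name, version handling and download URL are purely illustrative and not taken from any of our components.

#!/bin/sh
# Illustrative orig-tar.sh sketch: fetch the upstream tarball and place it
# where svn-buildpackage expects the .orig tarball to be (package name and
# download URL are placeholders).
set -e

PACKAGE=example-package
VERSION=$(dpkg-parsechangelog | sed -n 's/^Version: //p' | cut -d- -f1)
TARBALL=../tarballs/${PACKAGE}_${VERSION}.orig.tar.gz

mkdir -p ../tarballs
wget -O "${TARBALL}" "http://example.org/source/${PACKAGE}-${VERSION}.tar.gz"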

At the time of this writing the only adopted projects we have encountered are the Argus C components, for which the debian subdirectory is checked out from GitHub. These packages come with a predefined Makefile, so building them boils down to executing:

dch --distribution unstable --release ""
make deb-src
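
Under the hood such a deb-src target typically wraps a source-only package build; a rough shell equivalent (illustrative, not the actual recipe from the Argus Makefile) would be:

# Roughly what a deb-src target boils down to: build an unsigned
# source package without checking build dependencies (illustrative).
dpkg-buildpackage -S -uc -us -d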

To avoid having to execute every build separately, you can chain them into a single job which calls every <package-name>.source job as a downstream project, using the Parameterized Trigger Plugin. On our Jenkins instance this job is called build-add.source.

Building binary packages

The steps taken by a <package-name>.binaries job are:

  1. Define ${architecture}, ${distribution} and debian slave from the matrix configuration
  2. Delete previous workspace
  3. Source Code Management: svn or git checkout of the debian subdirectory containing the relevant files [2] into a directory called 'source'
  4. Execute binary building script
    1. add package suffix
    2. build source package
    3. build binary packages
    4. execute lintian checks

Every <package-name>.binaries job is a multi-configuration job with the following axes defined:

  • User-defined Axis: architecture=amd64 i386
  • User-defined Axis: distribution=jessie wheezy squeeze sid
  • Label expression: label_exp=debian8

The two user-defined axes will create 8 sub-jobs, one for each combination of architecture and distribution, while the label expression will restrict the jobs' execution to debian8 nodes (see Jenkins Setup on how to set up nodes with labels).

After clearing the workspace and checking out the debian subdirectory, the following script is executed:

###############################################################
#  Building source with modified changelog                    #
###############################################################
ORIG_DIR=$(pwd)

cd source
if [ "${distribution}" = "sid" ]; then
    dch --distribution unstable --release ""
else
    if [ "${distribution}" = "squeeze" ]; then
       bptag=bpo60+1
    elif [ "${distribution}" = "wheezy" ]; then
       bptag=bpo70+1
    elif [ "${distribution}" = "jessie" ]; then
       bptag=bpo80+1
    fi

    version=`dpkg-parsechangelog | sed -n 's/^Version: //p'`
    dch --force-distribution --distribution ${distribution}-backports -b -v ${version}~${bptag} "Rebuild for ${distribution}"
fi

svn upgrade

if [ -f debian/orig-tar.sh ]; then
    chmod +x debian/orig-tar.sh
fi
mkdir -p ../tarballs
uscan --download-current-version --destdir ../tarballs
cp ../*.tar.gz ../tarballs || true

svn-buildpackage -S --svn-builder dpkg-buildpackage -d --svn-move-to=${ORIG_DIR} --svn-dont-purge -uc -us --svn-ignore-new -rfakeroot

cd ../

###############################################################
#  Building binaries                                          #
###############################################################

USE_LOCAL_REPOSITORY=true

if [ -n "${USE_LOCAL_REPOSITORY}" ]; then
    export release=${distribution}
    export REMOVE_FROM_RELEASE=true
else
    export REPOSITORY_EXTRA="deb http://software.nikhef.nl/dist/debian/ ${distribution} main"
    export REPOSITORY_EXTRA_KEYS='http://software.nikhef.nl/dist/debian/DEB-GPG-KEY-MWSEC.asc'
fi

/usr/bin/build-and-provide-package

###############################################################
#  Lintian reports                                            #
###############################################################


/usr/bin/lintian-junit-report `find . -type f -name '*.dsc'`
cat lintian.txt

The first part of the script builds a source package with the relevant name suffix according to backporting conventions. This part is similar to the script executed in a <package-name>.source job, but with a modified dch behaviour. Because a different suffix is appended for each distribution's backports (for example, version 1.2-1 becomes 1.2-1~bpo70+1 when rebuilt for wheezy), it is necessary to rebuild the source with the appropriate name. This means that we cannot rely on the output of <package-name>.source being reused in <package-name>.binaries, as suggested by the jenkins-debian-glue guide.

The second part of the script defines where dependencies should be taken from, and executes the build-and-provide-package script. The build-and-provide-package script, provided by jenkins-debian-glue, will use an existing cowbuilder base (or create a new one) for every distribution-architecture pair to build the package. Once the packages are built, the script uploads them into the local repository.
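
build-and-provide-package takes care of these cowbuilder bases itself; for reference, creating and refreshing such a base by hand would look roughly like the commands below (the base path is illustrative).

# Illustrative: create a cowbuilder chroot for one distribution/architecture pair
sudo cowbuilder --create --distribution wheezy --architecture i386 \
    --basepath /var/cache/pbuilder/base-wheezy-i386.cow

# ...and keep it up to date before building
sudo cowbuilder --update --basepath /var/cache/pbuilder/base-wheezy-i386.cow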

As an extra step, lintian checks are executed at the end of the script.

Problems and Solutions