
This page collects ideas for shaping Kieker's delivery pipeline.

 

Current Situation (for the Kieker core)

  1. Jenkins is used for the following use cases:
    1. Regular build of commits
      1. job: http://kieker.uni-kiel.de/jenkins/job/kieker/
      • runs

        ./gradlew.sh jenkinsBuild --stacktrace
    2. Nightly release creation
      1. job: http://kieker.uni-kiel.de/jenkins/job/kieker-nightly-release/
      2. runs

        ./gradlew.sh jenkinsNightlyBuild # includes the automatic test of the release archives (bin/dev/check-release-archives.sh)
        
        chmod +x kieker-examples/OverheadEvaluationMicrobenchmark/executeRemoteMicroBenchmark.sh
        kieker-examples/OverheadEvaluationMicrobenchmark/executeRemoteMicroBenchmark.sh
    3. Release creation
      1. Operates only on the respective release branch
      2. job 1: http://kieker.uni-kiel.de/jenkins/job/kieker-RC/
        1. runs

          ./gradlew.sh jenkinsBuild --stacktrace
      3. job 2: http://kieker.uni-kiel.de/jenkins/job/kieker-RC-release/
        1. runs

          ./gradlew.sh jenkinsNightlyBuild

Target Scenario

  • Pipeline template not only for the Kieker core but also for other projects in the "Kieker orbit"
  • Test with different versions of Java (6?, 7, 8); see the sketch after this list
  • No artifact is rebuilt: each stage after the build stage reuses the build artifacts produced by the build stage.
  • Does it make sense to have a stable branch in addition to master into which only those changesets are merged that passed the entire pipeline?
  • Do we want to have a gatekeeper condition (threshold) on the regression benchmark?
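
A minimal sketch of how the multi-JDK tests could be run, reusing the Docker images described at the bottom of this page; the loop and the command override are assumptions, not an existing job definition:

    # Hypothetical JDK matrix; assumes the :openjdk* images accept a command override
    for jdk in openjdk6 openjdk7 openjdk8; do
        sudo docker run -t -i -v /path/to/kieker/repo:/opt/kieker "kieker/kieker-build:${jdk}" \
            bash -c "cd /opt/kieker && ./gradlew.sh jenkinsBuild --stacktrace"
    done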

Stages

  • Build
    • Compiles, creates the release artifacts, runs the (fast) unit tests, maybe some smoke tests + static code analysis
  • Integration Test
    • Runs the more expensive integration tests
  • User Acceptance Test

How do we map the stages to Gradle tasks?

  • Example pipeline from a Gradle presentation ( https://www.youtube.com/watch?v=L4ZgTCq6dLQ )
    • Compile / Unit Tests
    • Integration Tests
    • Code analysis
    • Assemble Distribution
    • Publish Binaries (using these binaries in the following steps)
    • Acceptance Tests
    • User Acceptance Tests
    • Canary Deployment
    • Production Deployment
  • Further tools mentioned throughout the presentation:
    • SonarQube (historical data of build results)
    • JaCoCo (code coverage)

To adapt this example pipeline to Kieker, we would have to:

  • Separate the unit tests from the integration tests (see the sketch below)
  • Insert the call of the check script (in the stage around the acceptance tests?)
  • Merge the Gradle threshold code into master to be able to have quality gates for the code analysis
  • Think about a way of continuous build numbering without breaking Sonatype/Maven conventions (we could use the build numbering in Snap CI to do so)
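
A minimal sketch of how the stages could then map to command-line invocations; the integrationTest task is an assumption and does not exist in the build yet:

    # Hypothetical stage-to-task mapping (integrationTest is an assumed task name)
    ./gradlew.sh jenkinsBuild --stacktrace    # Build: compile, unit tests, static analysis
    ./gradlew.sh integrationTest              # Integration Test: the more expensive tests
    bin/dev/check-release-archives.sh         # acceptance-level check of the release archives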


 

Pipelines

Pull Request Check

  • Goal: Make sure that a pull request can be merged without conflicts, can be compiled, and passes the unit tests and the static code analysis checks (more? we probably want to exclude the benchmarking here); a sketch follows below
  • Branch: any?
  • Trigger: Pull request
  • Executed stages: build
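
A minimal sketch of what this pipeline could execute, assuming GitHub-style pull request refs; the pr-check branch name is hypothetical and <PR_NUMBER> would be filled in by the trigger:

    # Hypothetical pull request check
    git fetch origin "pull/<PR_NUMBER>/head:pr-check"
    git checkout master
    git merge --no-commit --no-ff pr-check    # fails if the pull request does not merge cleanly
    ./gradlew.sh jenkinsBuild --stacktrace    # build stage: compile, unit tests, static analysis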

Integration

  • Goal: Extensive quality control on the master branch. Automatically merge to a stable branch (see the sketch below)?
  • Branch: master
  • Trigger: Commit to master
  • Executed stages: all except for UAT and performance regression benchmark
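
If we adopt the stable branch idea, the automatic merge could look like the following sketch; the branch name stable is an assumption:

    # Hypothetical auto-merge, run only after all executed stages have passed
    git checkout stable
    git merge --ff-only master    # refuses the merge if stable has diverged
    git push origin stable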

Nightly Release

  • Goal: Execute all stages of the pipeline except for the UAT and upload a SNAPSHOT release to Maven Central (automatically available there); a sketch follows below
  • Branch: master
  • Trigger: Cron (once a day)
  • Executed stages: all except for UAT.
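
A minimal sketch of the nightly sequence; the uploadArchives publishing task is an assumption about the build script and may differ from the actual task name:

    # Hypothetical nightly sequence (the publishing task name is an assumption)
    ./gradlew.sh jenkinsNightlyBuild    # build, tests, and check of the release archives
    ./gradlew.sh uploadArchives         # assumed task that would push the -SNAPSHOT artifacts to the Sonatype snapshot repository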

Release

  • Goal: Create and test the artifacts for a release candidate and upload the candidate to Maven Central (manual trigger for the actual release)
  • Trigger: Manual
  • Executed stages: all (particularly including manual UAT), maybe except for performance benchmark?

Possible Implementations

Jenkins

  • Problems:
    • No explicit pipeline concept
    • Re-use of artifacts between stages not explicitly supported
    • Configuration via VCS not possible

Travis CI

https://travis-ci.org/kieker-monitoring/kieker

  • Benefits:
    • Configuration file is inside the repository
    • Docker compatibility
  • Problems:
    • Very strict resource limits that currently lead to failing builds (more tweaking would be needed to get this running)
    • No pipeline concept
    • Unclear how artifacts are released (no access to the workspace)
    • Re-use of artifacts between stages not explicitly supported

Snap CI

https://orca.snap-ci.com/kieker-monitoring/kieker

  • Benefits:
    • Explicit pipelines
    • Docker compatibility ( https://orca.snap-ci.com )
    • Re-use of artifacts among stages
    • Possibility to mark files/folders as artifacts and access them from the website
  • Problems:
    • Configured directly using the snap-ci website (not configuration-file based in the repository)
    • Cost model?

https://snap-ci.com/plans:

Do I have to pay for my public/open-source projects?
No! All plan limits as mentioned here are specifically for your private repositories alone! If you are working on a public repository or on an Open Source project, we will not count any builds you set up for those. So in essence, you get unlimited public project builds in addition to what your plan provides.
Can I really set up builds for unlimited public repositories?
Well, we clearly have physical limits, but yes, in essence we do not intend to stop you from setting up builds for as many public repositories as you need. Currently, we let everyone set up around 20 public repositories. However, as we keep adding capacity to our build grid, this number will go up!

 

Kieker Build Docker Containers

As Snap CI (Orca) and Travis CI support Docker, we created Docker containers to provide a reproducible build environment.

The repository containing all Dockerfiles: https://github.com/kieker-monitoring-docker/kieker-build

The Docker Hub repository containing the automatically built images: https://hub.docker.com/r/kieker/kieker-build/

The most important tags right now are:

  • kieker/kieker-build:base (base image for all other containers; based on Ubuntu 15.10 with the additional R dependencies)
  • kieker/kieker-build:openjdk6 (Based on :base with openjdk-6 installed)
  • kieker/kieker-build:openjdk7 (Based on :base with openjdk-7 installed)
  • kieker/kieker-build:openjdk8 (Based on :base with openjdk-8 installed)
  • kieker/kieker-build:snap-ci-build (Based on :openjdk6; assumes that the Kieker repository to build is mapped to /opt/kieker inside the container and executes the Gradle wrapper with the build task by default)
    To execute it locally:

    sudo docker run -t -i -v /path/to/kieker/repo:/opt/kieker kieker/kieker-build:snap-ci-build

  • kieker/kieker-build:snap-ci-nightly (Based on :openjdk6; assumes that the Kieker repository to build is mapped to /opt/kieker inside the container and executes the Gradle wrapper with the jenkinsNightlyBuild task by default. In addition to the :snap-ci-build container, this container has all packages installed that are needed to run the nightly build, e.g., pdflatex, bibtex, and fonts for the documentation generation)
    To execute it locally:

    sudo docker run -t -i -v /path/to/kieker/repo:/opt/kieker kieker/kieker-build:snap-ci-nightly
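
For debugging the build environment itself, the default command can presumably be overridden; a minimal sketch, assuming the image accepts a shell as command:

    # Hypothetical: open an interactive shell in the build environment
    sudo docker run -t -i -v /path/to/kieker/repo:/opt/kieker kieker/kieker-build:snap-ci-build bash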
