Pyromaniac

CI Examples

CI components

A few open source components have been updated, or newly created, to use the JFPatch-as-a-service system to build themselves on push. Hopefully these will serve as a useful guide for anyone else wishing to use the facility.

Examples can also be found on GitHub by searching for the topic 'riscos-ci'. Full details of how the CI systems can be configured can be found on the RISC OS Build service site.

LineEditor

Source type: BASIC assembler
LineEditor is a BASIC assembler module, which builds itself as part of the automation.
This was actually the first project I built using the API myself.
The CI build uses a GitHub workflow to perform the build on an Ubuntu machine. The workflow has a few steps (a sketch follows the list):
  • It zips the source, and then submits it to the JFPatch-as-a-service JSON API interface.
    The result is extracted into different files, and if there was any output, it is decoded from Base64 into a file.
  • We check whether the build was successful, and if not, we report this and fail.
  • We extract the version number from the 'VersionNum' file, and apply this to the Zip that was built.
  • We upload the artifact to GitHub.
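As a rough illustration of the shape of that workflow, the sketch below drives a JSON build API with curl and jq. The endpoint URL, the response field names ('rc' and 'output') and the action versions are assumptions for illustration, not the exact values the LineEditor workflow uses, and the version-stamping step is omitted.

    # Hedged sketch of a GitHub workflow submitting zipped source to the
    # build service's JSON API; endpoint and field names are assumptions.
    name: CI
    on: [push]
    jobs:
      build:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v2
          - name: Zip the source and submit it for building
            run: |
              zip -q -9 -r /tmp/source.zip .
              curl -s -F 'source=@/tmp/source.zip' \
                   https://json.build.riscos.online/build > /tmp/result.json
          - name: Check the result and decode the built output
            run: |
              # Fail the job if the build did not succeed (field name assumed).
              test "$(jq -r '.rc' /tmp/result.json)" = "0"
              # Decode the Base64 output into a file we can publish.
              jq -r '.output' /tmp/result.json | base64 -d > LineEditor.zip
          - name: Upload the built artifact
            uses: actions/upload-artifact@v2
            with:
              name: LineEditor
              path: LineEditor.zip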
The .robuild.yml file doesn't do anything particularly clever. It runs the existing '!!Release' Obey file to do the build. The build was changed slightly to be more friendly for use on non-RISC OS systems. In particular, the BASIC files were converted to BASIC Text format, which is diff-able in git.
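For reference, a minimal .robuild.yml along those lines might look like the sketch below; the layout of the keys follows the build service documentation as I understand it, and the artifact path is an assumption rather than LineEditor's actual file.

    # Hedged sketch of a .robuild.yml that just runs the existing Obey file.
    jobs:
      build:
        # Run the release Obey file to perform the build on RISC OS.
        script:
          - Obey !!Release
        # Collect the built output as the result of the job (path assumed).
        artifacts:
          path: LineEditor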

Nettle

Source type: C application
The Nettle build is exactly the same style of build as LineEditor, from GitHub's point of view.
The .robuild.yaml file runs the build with 'amu', and then performs the necessary copying of files with the existing '!Release' Obey file. The Obey file was modified slightly to make the '*Wipe' operations non-fatal - Pyromaniac hasn't got a working '*Wipe' yet.
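A sketch of that kind of .robuild.yaml is below; the makefile name and artifact path are assumptions for illustration, not Nettle's actual configuration.

    # Hedged sketch of a .robuild.yaml for a C application built with amu.
    jobs:
      build:
        script:
          # Build the application with the Acorn Make Utility.
          - amu -f MakeFile
          # Copy the built binary and resources into the application directory.
          - Obey !Release
        artifacts:
          path: "!Nettle"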

ErrorCancel

Source type: ObjAsm
The ErrorCancel build, which I've placed on GitHub with Rick Murray's permission, is again the same GitHub workflow as the previous two builds.
However, there's also a '.gitlab-ci.yml' automation file, which GitLab uses to run jobs under its CI system.
The .robuild.yaml builds the ObjAsm source. There's only one file, and we're not doing anything complicated, so the script just contains the 4 commands that we need - create the directories, assemble to an AOF file, and link into a module.
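That script might look something like the sketch below; the option spellings and filenames are assumptions for illustration, not the real ErrorCancel commands.

    # Hedged sketch of a .robuild.yaml assembling a single ObjAsm source file.
    jobs:
      build:
        script:
          # Create the output directories.
          - cdir o
          - cdir rm
          # Assemble the source to an AOF object file.
          - objasm -o o.ErrorCancel s.ErrorCancel
          # Link the object file into a relocatable module.
          - link -rmf -o rm.ErrorCancel o.ErrorCancel
        artifacts:
          path: rm.ErrorCancel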

Pico

Source type: C command line tool
The pico/pilot builds were created back in the early 2000s, and weren't especially impressive. They were among the first C components to run on the 32-bit RISC OS I created, because without a desktop they were useful for editing files.
Both GitHub and GitLab scripts are included. The master copy of Pico is in my GitLab repository and is tested automatically there; when pushed to master, the branches are synchronised to GitHub. Thus it tests both types of workflow.

CObey

Source type: C module
Julie Stamp's CObey module was interesting; it needed small changes to be C89 compatible, as I've not got later versions of the compilers built, but otherwise wasn't too complicated. However, instead of using the JSON API, this build uses the riscos-build-online tool which I had created.
This simplifies the process of submitting the build, and makes the build more responsive - the output from the build is sent to GitHub as it happens. Consequently the CI file is a little different (a sketch of the submission step follows the list):
  • First we zip up the sources.
    Then we download the 'riscos-build-online' tool from GitHub, and make it runnable.
    And we use the tool to submit the source to the service.
    Because the 'riscos-build-online' tool reports output and errors, and writes files to a known location, a lot of the boilerplate that was in the other workflows is removed.
  • We work out the version number from 'VersionNum' and rename the file so that it has a good name in the artifacts.
    Some variables are set for the version and leafname that we'll use later.
  • If this is a release - that is, we pushed a tag starting with a 'v' - we do a bunch of extra steps:
    • We obtain the built artifact we created.
    • We 'create a release' in GitHub - this is the thing you will see as a downloadable artifact off the main repository's homepage, currently on the right of the page.
    • We push the built artifact to the release we created, so that it's attached. This means that users will be able to download the built binary, as well as the source for this release.
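As an illustration of the submission step, the excerpt below downloads the tool and sends the sources to the service; the release URL, option names and output location are assumptions rather than the exact ones the CObey workflow uses.

    # Hedged sketch of the workflow step that submits the build.
      - name: Build the module on the build service
        run: |
          # Zip up the sources to send to the service.
          zip -q -9 -r /tmp/source.zip .
          # Fetch the riscos-build-online client and make it runnable
          # (download URL assumed for illustration).
          wget -q https://github.com/gerph/riscos-build-online/releases/latest/download/riscos-build-online
          chmod +x riscos-build-online
          # Submit the source; output is streamed as the build runs, and
          # the results are written to a known directory (options assumed).
          ./riscos-build-online -i /tmp/source.zip -o output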

DDEUtilsJF

Source type: JFPatch module
My version of DDEUtils is a JFPatch module. It uses exactly the same pattern as CObey, but it additionally has a GitLab CI file which uses the 'riscos-build-online' tool.
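The GitLab side might look something like the sketch below; the image, download URL and tool options are assumptions for illustration, not the actual DDEUtilsJF configuration.

    # Hedged sketch of a .gitlab-ci.yml job using the riscos-build-online tool.
    riscos-build:
      image: ubuntu:20.04
      script:
        # Install the tools we need on the stock Ubuntu image.
        - apt-get update && apt-get install -y zip wget ca-certificates
        # Zip the sources and submit them to the service.
        - zip -q -9 -r /tmp/source.zip .
        - wget -q https://github.com/gerph/riscos-build-online/releases/latest/download/riscos-build-online
        - chmod +x riscos-build-online
        - ./riscos-build-online -i /tmp/source.zip -o output
      artifacts:
        paths:
          - output/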

riscos-build-online

Source type: C command line tool
The 'riscos-build-online' tool is a tool to make it easier to submit builds to the JFPatch-as-a-service system for building. It uses the WebSockets interface so it's a bit more involved than the JSON API version - but this also means that it can run for much longer, and give better feedback.
The CI build is more complex than the others, because it's building for other platforms. There are actually 3 major parts to the build:
  1. The first build is for Linux - we build the tool so that it will run on the Ubuntu system we're running on.
    • We check out the sources - including the two submodules.
      The client uses two open source libraries: a JSON parser and a WebSockets client. These are submodules so that we're not duplicating the code in this repository. As submodules, we know the code should not go stale, because we will always get the same pinned version of the source.
    • The tool is built - this is just a simple 'make' which builds the tool and the deb archive.
    • We run the tests - this just uses the tool to run a simple command on the service and check we get a response.
    • We work out what the built files are and store them in some variables.
    • We upload the tool, and then the deb, so that they are available as artifacts.
  2. The second build, which happens after the first has completed, is to build the RISC OS version of the tool.
    • We check out the sources. This is a fresh system, so we need to get the sources to be able to work with them.
    • We download the Linux binary that we built in the earlier stage.
    • We zip up the sources, and we use the linux tool to send them to the service.
    • We work out the name to give the output.
    • We upload the tool as an artifact.
  3. If we were doing a release - a tag was pushed that starts with a 'v' - we do this final stage.
    • We download the 3 binaries that were built - the Ubuntu deb, the Linux tool and the RISC OS tool.
    • We create a new release using the name with the version that was released.
    • We attach the 3 binaries to the release.
This means that once a change is pushed with a tag starting with a 'v', a release is prepared which contains the 3 binaries. These are left in the pending state, so that as the owner of the repository you can confirm that they are good, and update any release notes you might want to add. No human is involved in the actual building, other than to push the tag.
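Condensed a little, the overall job structure might look like the sketch below; the job names, paths and upload steps are illustrative assumptions, and the deb packaging is omitted for brevity.

    # Hedged sketch of the three-stage workflow: Linux build, RISC OS build,
    # then a draft release when a 'v' tag is pushed.
    jobs:
      build-linux:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v2
            with:
              submodules: true            # pull in the JSON and WebSockets libraries
          - run: make                     # builds the Linux tool
          - uses: actions/upload-artifact@v2
            with:
              name: riscos-build-online-linux
              path: riscos-build-online

      build-riscos:
        needs: build-linux                # only runs after the Linux build completes
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v2
          - uses: actions/download-artifact@v2
            with:
              name: riscos-build-online-linux
          - run: |
              chmod +x riscos-build-online
              zip -q -9 -r /tmp/source.zip .
              ./riscos-build-online -i /tmp/source.zip -o output
          - uses: actions/upload-artifact@v2
            with:
              name: riscos-build-online-riscos
              path: output

      release:
        if: startsWith(github.ref, 'refs/tags/v')   # only when a 'v' tag was pushed
        needs: [build-linux, build-riscos]
        runs-on: ubuntu-latest
        steps:
          - uses: actions/download-artifact@v2      # fetch all the built artifacts
          - run: gh release create "$GITHUB_REF_NAME" --draft -R "$GITHUB_REPOSITORY" */*
            env:
              GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}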