Azure Pipelines for GitHub can build your plugin

Hi Folks

———————-
An edit at the top of the thread with updates since I wrote the first post:

An Azure Pipelines setup as described here can build your plugin on all three platforms and automatically upload a working zip to a GitHub release on every push to master, as well as build all your pull requests before you merge them.

The directions in this thread are complete, but the short version is:

  1. Basically, you need to invite the Azure pipeline into your repo (as documented at https://docs.microsoft.com/en-us/azure/devops/pipelines/repos/github?view=azure-devops) and then have a command that builds your plugin.

  2. Add an 'azure-pipelines.yml' file which runs those commands. https://github.com/baconpaul/BaconPlugs/blob/master/azure-pipelines.yml is a good starting point (a minimal sketch also follows this list).

  3. If you use that one directly, copy the few supporting scripts from BaconPlugs and edit them accordingly (and if you are on Windows, note the information below about chmod-ing scripts at check-in).

  4. Add a 'nightly' release with no contents, which will be the target for the upload.

  5. Push to master; resolve errors
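For orientation, here's a minimal sketch of the shape such a pipeline file takes. This is not the BaconPlugs file; the image names, SDK URL, and build command are illustrative and will need adjusting for your plugin:

# Minimal azure-pipelines.yml sketch: one job per OS, each doing a
# clean build against a freshly downloaded Rack SDK.
trigger:
  - master

strategy:
  matrix:
    linux:
      imageName: 'ubuntu-16.04'
    mac:
      imageName: 'macOS-10.14'
    windows:
      imageName: 'vs2017-win2016'

pool:
  vmImage: $(imageName)

steps:
  - bash: |
      curl -L -o sdk.zip https://vcvrack.com/downloads/Rack-SDK-1.1.6.zip
      unzip sdk.zip
      export RACK_DIR="$PWD/Rack-SDK"
      make dist
    displayName: Build against a fresh Rack SDK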

Others have found this takes 1-2 hours from scratch, and then it just works from there on out.

Hope this helps

————————

I just wanted to share something that might be useful - and sorry if it has already been covered.

Over at the Surge Synth open source project (https://github.com/surge-synthesizer/surge) we've had great success using Azure Pipelines to do builds on pull requests and the like. Microsoft makes Azure build pools for Linux, Mac, and Windows available for free to open source projects.

Basically, you need to invite the Azure pipeline into your repo (as documented at https://docs.microsoft.com/en-us/azure/devops/pipelines/repos/github?view=azure-devops) and then have a command that builds your plugin.

The azure-pipelines.yml file in BaconPlugs (https://github.com/baconpaul/BaconPlugs/blob/master/azure-pipelines.yml) is set up so that my commits trigger an automatic clean build against a fresh copy of the Rack SDK on Mac, Linux, and Windows. It even makes the Mac and Linux zips available! (The pipeline machine doesn't seem to have zip in its path on Windows, and I haven't debugged that yet.)

I thought this might be useful to other folks if you want auto builds on your commits. Again, on the Surge project we've found that handy. Of course, feel free to just copy that pipeline file. It only has a couple of things which are specific to my plugs (the trigger fires on my candidate and release branch names as well as master, and the asset names contain BaconPlugs names); the trigger section is sketched just below.
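Those trigger lines have roughly this shape (the branch names here are mine; swap in your own):

trigger:
  branches:
    include:
      - master
      - candidate*
      - release*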

I haven’t done any thinking about how to build against 1.0 or anything like that. Just wanted to get my plugins building again and figured I would add the auto build.

Sorry if this isn’t the sort of thing to share here or if it is covered elsewhere. Hope it is useful.

Best

Paul


Oh and to see what it does:

https://github.com/baconpaul/BaconPlugs/commits/

if you take a look at the most recent commit there's a green checkmark. Click on it and you can see the build output for all 3 platforms. That check means my change built on the new VMs. I should have included that in my original note, sorry!

Super cool! Can you download the build products? There are a few devs who don't put their stuff in the plugin manager but do try to post binaries for people.

Yeah, if you go to the build result and click on assets you will see the Mac and Linux zips. Once I debug why zip isn't in the path on the Windows VM, it will be there too.

Over in Surge we actually build GitHub releases from the pipelines, but that is way more involved, mostly due to GitHub bot permissions.

To reach more devs, maybe put the explanation of how to use it on your GitHub as well? See if you could get @Vortico to link to it? Post a link to your GitHub any time someone on FB asks for a build?

Might be worth considering for the Mac builds.

It's easy enough to do this for Linux builds (e.g. Laminar is easy, Drone CI less easy but still workable). Windows you can work around with a VM given enough elbow grease (or, hell, cross-compilation is a possibility too). Mac builds are the worst because you can't legally run macOS in a VM unless your build server is a Mac, and Apple discontinued their server line of Mac hardware, which means you need a Mac Mini or something sitting around (and since those have soldered RAM now, yeah, VMs are "fun" on that platform :expressionless:)

Well, sure, we all have ways of dealing with platforms. I use native Windows, Ubuntu in a VM, and borrow my wife's Mac. But this gives one simple way to do all three. And the continuous integration is useful. I'm always breaking my Linux build by messing up the case of file names. I'm going to guess this setup would also run unit test suites?

Yeah, you can make it run unit tests easily if you have them. (I have not done that for my tests yet, but plan to one day.)
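It would just be one more step in the pipeline yml after the build, something like this (assuming your Makefile has a test target; the target name here is illustrative):

  - bash: make test
    displayName: Run unit tests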

I'm just grappling with the fact that plugin.mk hardcodes "zip" as the command to make a zip file. I have a workaround for that, and soon should have assets for all 3 platforms being built by my pipeline. I'll also add a section to my GitHub readme on how it works.
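(If you hit the same thing, one possible stopgap, sketch only, is to create the zip from the pipeline step itself: PowerShell's built-in Compress-Archive cmdlet is available on the hosted Windows images. Paths here are illustrative.)

# Sketch: zip the dist folder on Windows without relying on `zip`
# being in PATH on the agent. Adjust paths to your plugin layout.
- powershell: |
    Compress-Archive -Path dist/MyPlugin -DestinationPath dist/MyPlugin-win.zip
  displayName: Zip plugin (Windows only)
  condition: eq(variables['Agent.OS'], 'Windows_NT')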

I always use lowercase dash-separated file names (foo-bar.hh) and this works on every platform, always. (Albeit some languages that demand the filename to equal the module name, like Free Pascal, sadly require underscores.)

Amusingly, you can unit-test this as well, with a test_filenames.py in the source directory and running nose2 on it:


import unittest
import glob
import re

# File names must be lowercase alphanumerics, with words separated by dashes.
# fullmatch plus the escaped dot ensure the whole name, including the
# extension, follows the convention (so 'foo_bar.hh' or 'foo.hhh' fail).
header_regex = re.compile(r'([a-z0-9]+-)*[a-z0-9]+\.hh')
source_regex = re.compile(r'([a-z0-9]+-)*[a-z0-9]+\.cc')

class TestFilenameConvention(unittest.TestCase):
    def test_headers(self):
        for f in glob.glob('*.hh'):
            self.assertTrue(header_regex.fullmatch(f), f)

    def test_sources(self):
        for f in glob.glob('*.cc'):
            self.assertTrue(source_regex.fullmatch(f), f)

Any filename outside the convention will make the test fail, and it doesn't even require spooling up a VM.

I'm just saying continuous integration on all platforms is useful, even if one is able to build manually on all platforms.


Yup.

Well, anyway, I got the CI builds making zip files on all 3 platforms. I had to copy the dist make rule from plugin.mk into my own Makefile and name it win-dist, but it all works. If you look at the candidate_v0.6.3 branch in my GitHub you can see everything.

Basically, if you copy that azure-pipelines.yml to your project and invite the bot after you set up the account, you should be good to go (once you change BACON to SQUINKY or whatever in a few spots). Adding extra stages is pretty straightforward.

Hope it helps!


Oh and here’s an example of a build from my latest push

https://dev.azure.com/paul0179/BaconPlugs/_build/results?buildId=16

You can see "ARTIFACTS" in the top right, which lets you download the zips.


I have had success using Travis CI for this as well. They have experimental support for Windows builds now, and I have more experience with Travis than with the Azure suite, so I gave it a try.

Here's the .travis.yml for one of my in-progress plugins. It has a couple of things that may be superfluous for a more basic plugin, as mine has a bunch of dependencies to install and compile first, but removing or changing the apt/brew package lists should be all that's needed to make it more generic.
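For a more basic plugin, the skeleton is roughly this (a sketch only; the SDK URL is illustrative, and Windows support is experimental, so expect to tweak):

# Minimal .travis.yml sketch: build a Rack plugin on all three OSes,
# fetching a fresh Rack SDK first, as in the Azure sketch above.
language: cpp
os:
  - linux
  - osx
  - windows
before_script:
  - curl -L -o sdk.zip https://vcvrack.com/downloads/Rack-SDK-1.1.6.zip
  - unzip sdk.zip
script:
  - RACK_DIR=$PWD/Rack-SDK make dist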


I've set up Travis and connected it to my repository, with the .travis.yml copied from you and some text changed to match my project. It looks like it compiled for Mac and Linux, but I have no idea where to download the created zips. Am I missing something? :confused:

The Travis config I have only saves the zips for tagged releases, and uploads them to GitHub releases. You'll need to update the deploy section of the YAML file to fit your needs; Travis has a ton of deploy providers (list here), so you can upload the zips to pretty much anything that accepts file uploads. If you just want to use GitHub releases like me, you will need to update the file pattern, the repo name, and the api_key value (guide here on API token management).
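For reference, that deploy section has roughly this shape (the file pattern and token are placeholders you'd replace with your own):

# Sketch: upload built zips to GitHub releases for tagged builds only.
deploy:
  provider: releases
  api_key: $GITHUB_TOKEN      # your encrypted GitHub API token
  file_glob: true
  file: dist/*.zip
  skip_cleanup: true          # keep the build products around for upload
  on:
    tags: true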

So, if folks are interested, over at https://github.com/surge-synthesizer/surge-rack I extended this technique so that:

  1. it builds 062 and v1
  2. it uploads the assets to a GitHub release

This means that with every master push of surge-rack we get newly built zips in the release, on all platforms, within about 15 minutes.


@baconpaul - could you maybe point me to some tutorial for this Azure artifact publishing to GitHub release pages? I want to integrate it into my work as well (described here - Build VCVRack with Conan and CMake + add CI). I understand what is defined in the azure-pipelines.yml, but I don't know whether something must also be set up on Azure itself. Otherwise I'll just try.

Thank you.

Sure. It's really easy actually - just look at how I set up the surge-rack Azure pipelines. There's a matrix of jobs, one per OS; at the end each publishes a pipeline artifact, then another job downloads them all and pushes them to a GitHub release. I just followed the API docs at Microsoft - didn't really use a tutorial.

A couple of things:

  1. For surge-rack I am using one release I made by hand and updating it. The create action works fine too, but then you need the logic to decide whether you want to create a new release or update an existing one.

  2. I do rename my assets in my download job. Handy? Maybe. I also whack the version with jq at the start.

Lemme know if anything’s not clear

Oh, and you need a GitHub connection in your Azure account. I made a dummy user with write access for this, so I didn't have to use my own user ID, and set it up securely in the admin console using OAuth. The Microsoft docs on that were super clear once I knew I needed it.
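For orientation, the release-push step in that final job looks roughly like this (the connection and repository names are placeholders; GitHubRelease is the Microsoft-provided task, and its full input list is in their docs):

# Sketch: update an existing hand-made release, replacing its assets
# with the zips downloaded from the build jobs.
- task: GitHubRelease@0
  inputs:
    gitHubConnection: my-github-connection   # the service connection described above
    repositoryName: myuser/MyPlugin
    action: edit                             # update, rather than create, the release
    tag: Nightly
    assets: $(System.DefaultWorkingDirectory)/**/*.zip
    assetUploadMode: replace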

@baconpaul, thank you so very much for that!
It took me 2 hours yesterday to set it up, and now that SDK v1 is available, my plugin builds automatically for every platform on every commit :slight_smile: I'm quite familiar with Microsoft Azure from my day job and already a big fan, but these Azure Pipelines are just amazing!

One note: I had to adjust the execute permissions of the script files (chmod +x ...sh), but that's nothing that can't be fixed in a few minutes.
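(If you're committing from Windows, where chmod doesn't stick, git can record the execute bit directly with git update-index --chmod=+x <script>, so the permission survives checkout on the build machine.)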