Azure Pipelines for GitHub enables automated plugin builds

(Paul) #1

Hi Folks

I just wanted to share something that might be useful - and sorry if it has already been covered.

Over at the Surge Synth open source project (https://github.com/surge-synthesizer/surge) we’ve had great success using Azure Pipelines to run builds on pull requests and pushes. Microsoft makes Azure build pools for Linux, Mac, and Windows available for free to open source projects.

Basically you need to invite the Azure Pipelines app into your repo (as documented at https://docs.microsoft.com/en-us/azure/devops/pipelines/repos/github?view=azure-devops) and then have a command to build the plugin.

The azure-pipelines.yml file in BaconPlugs (https://github.com/baconpaul/BaconPlugs/blob/master/azure-pipelines.yml) is set up so that my commits trigger an automatic clean build against a fresh copy of the Rack SDK on Mac, Linux, and Windows. It even makes the Mac and Linux zips available! (The pipeline machine doesn’t seem to have zip in its path on Windows and I haven’t debugged that yet.)
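For anyone who hasn’t looked at one of these files before, the overall shape is roughly this. This is a hand-trimmed sketch rather than the actual BaconPlugs file: the VM image names and SDK URL are illustrative and may have changed, and the Windows job in the real file needs extra toolchain setup that I’ve left out here.

trigger:
  - master

jobs:
  - job: Build
    strategy:
      matrix:
        linux:
          imageName: 'ubuntu-16.04'
        mac:
          imageName: 'macOS-10.13'
    pool:
      vmImage: $(imageName)
    steps:
      # Grab a fresh copy of the Rack SDK (URL illustrative; check vcvrack.com)
      - bash: |
          curl -L -o Rack-SDK.zip https://vcvrack.com/downloads/Rack-SDK-0.6.2.zip
          unzip Rack-SDK.zip
        displayName: Fetch Rack SDK
      # Clean build of the plugin dist zip against that SDK
      - bash: |
          export RACK_DIR=$PWD/Rack-SDK
          make dist
        displayName: Build plugin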

I thought this might be useful to other folks if you want auto builds on your commits. Again, on the Surge project we’ve found that handy. Of course, feel free to just copy that pipeline file. It only has a couple of things that are specific to my plugins (the trigger is on my candidate and release branch names as well as master, and the asset names contain BaconPlugs names).

I haven’t done any thinking about how to build against 1.0 or anything like that. Just wanted to get my plugins building again and figured I would add the auto build.

Sorry if this isn’t the sort of thing to share here or if it is covered elsewhere. Hope it is useful.

Best

Paul

5 Likes
(Paul) #2

Oh and to see what it does:

https://github.com/baconpaul/BaconPlugs/commits/

If you take a look at the most recent commit there’s a green checkmark. Click on that and you can see the build output for all 3 platforms. That check means my change built on the new VMs. I should have included that in my original note, sorry!

#3

Super cool! Can you download the build products? There are a few devs who don’t put their stuff in the Plugin Manager but do try to post binaries for people.

(Paul) #4

Yeah, if you go to the build result and click on assets you will see the Mac and Linux zips. If I debug why zip isn’t in the path on the Windows VM, it would be there too.
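For reference, the bit that makes the zips show up under assets is just a publish step at the end of each job, something like this (paths and artifact name are illustrative):

- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: 'dist'                      # where make dist leaves the zip
    artifactName: 'BaconPlugs-$(Agent.OS)'     # one artifact per platform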

Over in Surge we actually build GitHub releases from the pipelines, but that is way more involved, mostly due to GitHub bot permissions.
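Roughly, that means adding a release task along these lines. The service connection name is whatever you set up, the tag is illustrative, and getting that connection authorized against the repo is where the bot-permission pain lives:

- task: GitHubRelease@0
  inputs:
    gitHubConnection: 'my-github-connection'   # assumed service connection name
    repositoryName: '$(Build.Repository.Name)'
    action: 'edit'                             # update an existing release
    tag: 'Nightly'                             # illustrative tag
    assets: 'dist/*.zip'
    assetUploadMode: 'replace'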

#5

To reach more devs, maybe put the explanation of how to use it on your GitHub also? See if you could get @Vortico to link to it? Post a link to your GitHub any time someone on Facebook asks for a build?

(Skrylar) #6

Might be worth considering for the mac builds.

It’s easy enough to do this for Linux builds (e.g. Laminar is easy, Drone CI less easy but still workable). Windows you can work around with a VM and enough elbow grease (or, hell, cross-compilation is a possibility too). Mac builds are the worst, because you can’t legally virtualize macOS unless your build server is a Mac, and they discontinued the server line of Mac hardware, which means you need a Mac mini or something sitting around (and since those have soldered RAM now, yeah, VMs are “fun” on that platform :expressionless:)

#7

Well, sure, we all have ways of dealing with platforms. I use native Win, Ubuntu in a VM, and borrow my wife’s mac. But this gives one simple way to do all three. And the continuous integration is useful. I’m always breaking my Linux build by messing up the case of file names. I’m going to guess this setup would also run unit test suites?

(Paul) #8

Yeah you can make it run unit tests easily if you have them. (I have not done that for my tests yet but plan to one day).
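If you did have a Makefile target for tests (mine doesn’t yet, so this step is hypothetical), it would just be one more step in each job:

- bash: |
    export RACK_DIR=$PWD/Rack-SDK
    make test    # hypothetical target; put whatever runs your suite here
  displayName: Run unit tests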

I’m just grappling with the fact that plugin.mk hardcodes “zip” as the command to make a zip file. I have a workaround for that and should soon have assets for all 3 platforms being built by my pipeline. I’ll also add a section to my GitHub readme on how it works.
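One possible workaround, sketched here and not necessarily what I’ll end up committing, is to skip plugin.mk’s zip rule on the Windows agent and let PowerShell do the zipping:

- powershell: Compress-Archive -Path dist\* -DestinationPath dist\BaconPlugs-win.zip
  displayName: Zip dist on Windows
  condition: eq(variables['Agent.OS'], 'Windows_NT')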

(Skrylar) #9

I always use lowercase dash-separated names for files (foo-bar.hh) and this works on every platform, always. (Albeit some languages that demand the filename match the module name, like FreePascal, sadly require underscores.)

You can, amusingly, unit-test this as well by dropping a test_filenames.py in the source directory and running nose2 on it:


import unittest
import glob
import re

# Accept only lowercase, dash-separated names, e.g. foo-bar.hh / foo-bar.cc.
# The dot is escaped and fullmatch() is used so FooBar.hh or foo_bar.hh fails.
header_regex = re.compile(r'([a-z0-9]+-)*[a-z0-9]+\.hh')
source_regex = re.compile(r'([a-z0-9]+-)*[a-z0-9]+\.cc')

class TestFilenameConvention(unittest.TestCase):
    def test_headers(self):
        for f in glob.glob('*.hh'):
            self.assertTrue(header_regex.fullmatch(f), f)

    def test_sources(self):
        for f in glob.glob('*.cc'):
            self.assertTrue(source_regex.fullmatch(f), f)

Any filename outside of convention will make the test fail, and it doesn’t even require spooling up the VM.

#10

I’m just saying continuous integration on all platforms is useful, even if one is able to build manually on all of them.

2 Likes
(Paul) #11

Yup.

Well, anyway, I got it so the CI builds now make zip files on all 3 platforms. I had to copy a make rule from plugin.mk into my makefile, name it win-dist, and use it for the dist step, but it all works. If you go look at the candidate_v0.6.3 branch on my GitHub you can see everything.

Basically, if you copy that azure-pipelines.yml into your project and invite the bot after you set up the account, you should be good to go (once you change BACON to SQUINKY or whatever in a few spots). Adding extra stages is pretty obvious.
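The spots to change are roughly these (branch names here are illustrative, not my exact ones):

trigger:
  branches:
    include:
      - master
      - candidate_*        # my candidate/release branch names; swap in your own
      - release_*

…plus the artifactName values in the publish steps, which is where the BACON-ish asset names live.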

Hope it helps!

1 Like
(Paul) #12

Oh and here’s an example of a build from my latest push

https://dev.azure.com/paul0179/BaconPlugs/_build/results?buildId=16

You can see the “Artifacts” button in the top right, which lets you download the zips.

1 Like
(Korfuri) #13

I have had success using Travis-CI for this as well. They have experimental support for Windows builds now, and I have more experience with Travis than with the Azure suite so I gave it a try.

Here’s the .travis.yml for one of my in-progress plugins. It has a couple of things that may be superfluous for a more basic plugin, since mine has a bunch of dependencies to install and compile first, but removing or changing the apt/brew package lists should be all that’s needed to make it more generic.
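Stripped of the dependency bits, the skeleton looks something like this (a sketch, not my actual file; the SDK URL is illustrative and the Windows job needs more care than this shows):

language: cpp
os:
  - linux
  - osx
  - windows
script:
  # Fetch a fresh Rack SDK (URL illustrative; check vcvrack.com for the current one)
  - curl -L -o Rack-SDK.zip https://vcvrack.com/downloads/Rack-SDK-0.6.2.zip
  - unzip Rack-SDK.zip
  # Build the plugin dist zip against it
  - RACK_DIR=$PWD/Rack-SDK make dist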

1 Like
(Catronomix) #14

I’ve set up Travis and connected it to my repository, copied the .travis.yml from you, and changed some text to match my project. It shows that it compiled for Mac and Linux, but I have no idea where to download the created zips. Am I missing something? :confused:

(Korfuri) #15

The Travis config I have only saves the zips for tagged releases and uploads them to GitHub releases. You’ll need to update the deploy section of the yaml file to fit your needs; Travis has a ton of deploy providers (list here), so you can upload the zips to pretty much anything that accepts file uploads. If you just want to use GitHub releases like me, you will need to update the file pattern, repo name, and api_key value (guide here on API token management).
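As a rough sketch (the file pattern is illustrative, and $GITHUB_TOKEN would be an encrypted environment variable you create yourself), the deploy section looks like:

deploy:
  provider: releases          # GitHub Releases provider
  api_key: $GITHUB_TOKEN      # personal access token, stored encrypted in Travis
  file_glob: true
  file: dist/*.zip            # pattern for the files to upload; adjust to your plugin
  skip_cleanup: true
  on:
    tags: true                # only upload for tagged builds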

(Paul) #16

So if folks are interested: over at https://github.com/surge-synthesizer/surge-rack I extended this technique so that

1: it builds both 0.6.2 and v1
2: it uploads the assets to a GitHub release

This means that with every master push of surge-rack we get newly built zips in the release, for all platforms, within about 15 minutes.
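The two-SDK part is basically a build matrix keyed on the SDK download, roughly like this (URLs and names illustrative; the real surge-rack file has quite a bit more to it):

jobs:
  - job: Build
    pool:
      vmImage: 'ubuntu-16.04'     # plus Mac and Windows variants in the real file
    strategy:
      matrix:
        sdk_062:
          rackSdkUrl: 'https://vcvrack.com/downloads/Rack-SDK-0.6.2.zip'
        sdk_v1:
          rackSdkUrl: 'https://vcvrack.com/downloads/Rack-SDK-1.0.0.zip'
    steps:
      - bash: |
          curl -L -o Rack-SDK.zip "$(rackSdkUrl)"
          unzip Rack-SDK.zip
          RACK_DIR=$PWD/Rack-SDK make dist
        displayName: Fetch SDK and build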

3 Likes