At Ballard Chalmers, we have become huge fans of Microsoft Flow from Office 365, using it among other things to create our internal client sites, send out our monthly support updates and automate other digital workplace processes. This automation has saved several hours per month with relatively little work, but there has previously been an issue with having to manually export your workflow every time you make a change. Now, thanks to John Liu, we have a way to export these to SharePoint. However, the usual place for source code for us is Visual Studio Online (VSO) and it would have been nice to store the content there. Inspired by some of the chat on the Microsoft Cloud Show podcast, I thought that I would look into whether this could be done.
TL;DR
Creating a Flow that uses the Flow Management actions to export all the Flows edited in the last day and upload them to Visual Studio Online or GitHub, allowing changes to be tracked in source control.
Connecting to VSO
I started off by re-creating John Liu’s workflow and confirming that I could write the flows to SharePoint, which worked very well. So the next step was to work out how to do this with Visual Studio Online. The natural place was to use REST APIs and a quick search flagged up https://www.visualstudio.com/en-us/docs/integrate/api/git/overview. Using the browser first, I confirmed that I could get back a list of projects using https://ballardchalmers.visualstudio.com/DefaultCollection/_apis/projects. Once this worked, I needed to work out the best way to get authentication going. I had tried Azure AD OAuth authentication, which was the only option supported in Flow at the time, but found that it didn’t work well with VSO. So for now, I went with Donovan Brown’s recommended approach of using personal tokens – https://donovanbrown.com/post/how-to-call-team-services-rest-api-from-powershell.
- Log in to VSO, click on your profile image and select Security:
- Select personal access tokens from the left and then click on Add:
- Then give it a name, duration and select the scopes you need – in this case just code and projects:
Once you have this token, you can use Basic Authentication to connect with this token as the password and anything as the username. I used Postman to test the calls and confirm they would work with this authentication:
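As a quick sketch of that authentication step, the Basic header is just a base64 encoding of `username:token`, where the username can be anything (the token value and username below are placeholders):

```python
import base64

def basic_auth_header(personal_access_token: str, username: str = "flow") -> str:
    """Build a Basic Authentication header value for a VSO personal access token.

    VSO ignores the username when a PAT is supplied as the password,
    so any non-empty value works.
    """
    raw = f"{username}:{personal_access_token}".encode("utf-8")
    return "Basic " + base64.b64encode(raw).decode("ascii")

# Placeholder token – this value goes into the Authorization header of each call
header = basic_auth_header("abc123")
```

This is the same header Postman generates for you when you fill in the Basic Auth fields.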
The next step was to use this API to add the files to VSO.
Adding Files to VSO
As mentioned above, I first needed the list of projects (https://www.visualstudio.com/en-us/docs/integrate/api/tfs/projects) to get the project to write the file to:
https://ballardchalmers.visualstudio.com/DefaultCollection/_apis/projects
Then, selecting the ID of the project from the response and using that to get the list of repositories (https://www.visualstudio.com/en-us/docs/integrate/api/git/repositories):
Returning the specific repository then gives access to all the other calls we need:
To add the Flow file (more on how to get that later), you make a POST to the pushes service at https://ballardchalmersltd.visualstudio.com/DefaultCollection/00000000-0000-0000-0000-000000000000/_apis/git/repositories/00000000-0000-0000-0000-000000000000/pushes with the authentication in the header (see image above) and the body as below:
{
  "refUpdates": [
    {
      "name": "refs/heads/flow-branch",                          -> the branch to push the file to
      "oldObjectId": "0000000000000000000000000000000000000000"  -> set to zero string for a new file or the last commit ID to update
    }
  ],
  "commits": [
    {
      "comment": "Initial commit.",      -> commit comment to show in VSO
      "changes": [
        {
          "changeType": "add",           -> add for a new file and edit to update
          "item": {
            "path": "/flow.md"           -> file path from repository root
          },
          "newContent": {
            "content": "My first file!", -> content of the file
            "contentType": "rawtext"     -> the type of the content, which is usually rawtext
          }
        }
      ]
    }
  ]
}
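Stripped of the annotations, the same push body can be assembled programmatically. This is a sketch with placeholder branch, path and content values, not the exact Flow implementation:

```python
import json

def build_push_body(branch: str, old_object_id: str, path: str,
                    content: str, change_type: str = "add",
                    comment: str = "Automated flow export") -> str:
    """Build the JSON body for the VSO Git pushes endpoint.

    old_object_id is the 40-character zero string for a brand-new file,
    or the ID of the last commit when updating.
    """
    body = {
        "refUpdates": [
            {"name": f"refs/heads/{branch}", "oldObjectId": old_object_id}
        ],
        "commits": [
            {
                "comment": comment,
                "changes": [
                    {
                        "changeType": change_type,
                        "item": {"path": path},
                        "newContent": {"content": content,
                                       "contentType": "rawtext"},
                    }
                ],
            }
        ],
    }
    return json.dumps(body, indent=2)

# New file on a new branch: oldObjectId is the zero string
push_body = build_push_body("flow-branch", "0" * 40, "/flow.md", "My first file!")
```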
The steps were to push the files to a branch, create a pull request for those files to be merged into master, and then approve that pull request. The pull request was created with a POST to /pullrequests and the body below:
{
  "sourceRefName": "refs/heads/flow-branch",
  "targetRefName": "refs/heads/master",
  "title": "Merge in flow changes",
  "description": "Merge em",
  "reviewers": []
}
And then approved with a PATCH request to /pullrequests/id, where the id comes from the response to the create request, with the following body:
{
  "autoCompleteSetBy": {
    "id": "14570dd6-1798-6d17-a83f-cad99959b185"        -> ID of the person to mark the request as completed
  },
  "completionOptions": {
    "deleteSourceBranch": "true",                        -> remove the branch once merged
    "mergeCommitMessage": "Added known issues document", -> commit message
    "squashMerge": "false"                               -> whether to squash all the commits into a single merge commit – see https://docs.microsoft.com/en-us/vsts/git/merging-with-squash for more info
  }
}
There it is, a way to create and update files. I tested this and all looked great. Ran it again and it still looked great. Made a change to a file and it all fell apart. The issue I encountered is that there is currently no way via the API or the web interface to merge conflicting changes in a branch. When you merge branches in Git, files that differ are treated as a conflict, and in Visual Studio you can select the changes you want to keep. Unfortunately, there is no way yet to do this without Visual Studio.
Originally, I thought that I had to update the file on a separate branch, as whenever I tried updating the file on the master branch I received an error. With this not working, I delved a little further and found that it was the oldObjectId I was passing that was wrong. This needs to be the ID of the last commit that has taken place. To get this, I called GET on /commits with a parameter of $top=1 to get the latest commit and used its ID as the oldObjectId. Now I was able to commit a file directly to the master branch with one call and it was time to integrate this back into Flow.
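The fix above can be sketched as follows. The sample payload is illustrative of the shape a GET /commits?$top=1 call returns (a "value" list of commits), with a made-up commit ID rather than a real response:

```python
def latest_commit_id(commits_response: dict) -> str:
    """Pull the most recent commit ID out of a GET /commits?$top=1 response.

    The "value" list shape follows the VSO Git commits API; the sample
    payload below is illustrative, not a real response.
    """
    return commits_response["value"][0]["commitId"]

def master_ref_update(commits_response: dict) -> dict:
    """Build the refUpdates entry for committing straight to master,
    using the latest commit as the oldObjectId."""
    return {
        "name": "refs/heads/master",
        "oldObjectId": latest_commit_id(commits_response),
    }

# Illustrative /commits?$top=1 response with a made-up commit ID
sample = {"count": 1,
          "value": [{"commitId": "0123456789abcdef0123456789abcdef01234567"}]}
ref_update = master_ref_update(sample)
```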
Integrating into Flow
For testing Flows, I usually start with a Flow button trigger that allows it to be started at any point. As John Liu’s post pointed out, you can now list all Flows using the new Flow actions and these can then be filtered to only show Flows edited today:
Listing the flows only returns a little information for each one, so you need to retrieve the additional detail by calling Get Flow for each one. I then created a single Compose block for the filename as it is used a few times:
Once you have the Flow contents, it is time to put this in a file. However, there are two different calls to make depending on whether it is a new file or updating an existing file. To determine which it is, the Flow makes an HTTP call to get the file. If that fails, then the file does not yet exist and should be added.
You can see from the image that the file is retrieved using /items with the parameter scopePath containing the FlowFileName output. The authentication is in the advanced options, shown expanded here, and I have used two variables to hold the username and password. If this step fails, then the next step is skipped by changing the settings in “Configure run after” for the “Set type to add” action. On “Set type to edit”, this is set to run only if the previous step is skipped. This then sets the FileCreateType to add or edit depending on whether the file exists or not.
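In code, that “Configure run after” branching boils down to a status-code check. A minimal sketch (the function name is my own, not a Flow action):

```python
def file_create_type(get_file_status: int) -> str:
    """Mirror the Flow's "Configure run after" branching: if the GET /items
    call for the file returned 404 the file is new ("add"); any success
    means it already exists ("edit")."""
    return "add" if get_file_status == 404 else "edit"

print(file_create_type(404))  # -> add
print(file_create_type(200))  # -> edit
```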
Following the same pattern, another HTTP action retrieves the last commit ID for the repository and then parses the response using the Parse JSON action. I used the response from Postman to generate the schema for this.
The last step is then to create the files using that Commit ID:
The call makes use of the Commit ID, the FileCreateType that we have determined as being add or edit, the FilePath Output and the Body of the Flow which contains the JSON definition of the Flow itself. Once this step has been called, you can see the files in VSO:
What About GitHub?
Once the process was in place for VSO, it was a simple job to get it working for GitHub. The API documentation at https://developer.github.com/v3/repos/contents/#create-a-file outlines how to create a file and I again used a personal access token. There are slight changes to the process, in that the SHA of the existing file is required when updating documents, but the largest issue is that the personal token expires at midnight on each day and also expires when too many requests take place at the same time. It would be good to have an OAuth option in Flow but this isn’t available yet.
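For comparison, a sketch of the GitHub body for that create/update call (PUT /repos/:owner/:repo/contents/:path). GitHub wants the content base64-encoded, and the existing file's blob SHA when updating; the message and content values here are placeholders:

```python
import base64
import json

def github_contents_body(message: str, text: str, sha: str = None) -> str:
    """Build the body for GitHub's create/update file endpoint
    (PUT /repos/:owner/:repo/contents/:path). Content must be
    base64-encoded; sha is the blob SHA of the existing file and
    is only needed when updating."""
    body = {
        "message": message,
        "content": base64.b64encode(text.encode("utf-8")).decode("ascii"),
    }
    if sha is not None:
        body["sha"] = sha
    return json.dumps(body)

# Creating a new file needs no sha; updating an existing one requires it
new_file = github_contents_body("Add flow export", "flow json here")
```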
Summary
This is a great option for those who would like more control over changes being made for Flow and to be able to better track changes as they are made. It also demonstrates the power that Microsoft Flow offers for automating tasks. If you would like to try these workflows, you can read them on GitHub here, updating the variables at the top with the relevant values.
By Kevin McDonnell, Senior Technical Architect at Ballard Chalmers