We are now at part 6 of this series and we have a working application with secure login, so it is time to deploy to Azure with Continuous Integration and Continuous Deployment (CI/CD) using Azure DevOps. Part 7 will cover building the creation of the required services into the deployment itself using Azure Resource Manager (ARM) templates; this part assumes that the Functions and Web App have already been created. The focus here is on deploying the latest code changes, with tests along the way, using Azure DevOps.
What is Azure DevOps?
Azure DevOps is the latest umbrella term for the set of services related to managing change in your application to ensure the best balance between keeping the service running and being able to make improvements. It includes:
- Azure Boards – for planning and managing work including user stories and bug management
- Azure Repos – for source control
- Azure Pipelines – for building, testing and deploying
- Azure Test Plans – for running tests and tracking results
- Azure Artifacts – for managing packages for the CI/CD pipelines
This is an evolution of Visual Studio Online with a shiny new user interface that makes things far easier to navigate, and better separation of the services so that you can use one without the others – for example, it has great support for using GitHub for source control while planning work in Boards and deploying with Pipelines.
The benefits are that you can work well as a team: the Product Owner defines the roadmap of work in Boards, developers work on the code in Repos (linking each check-in to a user story), the QA team runs tests against a deployed version, and the DevOps team manages deployments from dev to test to live at the click of a button.
What About Pricing?
The great news is that you can get started for free. If you are working on an open-source project, it is completely free with unlimited build time. Even small teams of up to 5 users get a free license with one hosted job, work item tracking and unlimited private Git repos. Larger teams pay per user, and you can add additional pipelines to deploy more than one project at a time.
Note that the details above are correct at the time of writing, but check out this link for the latest details.
So How Does It Work?
Let me show you…
I will be using the sample application built in the earlier parts of the series which you can see at:
- Data and Functions – https://github.com/BallardChalmers/BCServerlessDemo.DataAndFunctions
- Client – https://github.com/BallardChalmers/BCServerlessDemo.Client
The first step is to create a build by navigating to Azure Pipelines, selecting the Build section and clicking New. In the next window, choose your source control provider and the repository. You can then select a template or start from blank (note that you can also search, which becomes ever more important as the number of templates grows).
Once you have selected your template (in this case, I have selected the C# Function template), you will receive a set of tasks that can run. As you can see below, this template restores the NuGet packages used, builds the solution, creates an archive file, runs the tests and then publishes the artifacts for use in your release. On the right you can see an example of the properties that can be defined for each task.
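To make those steps concrete, here is a sketch of what an equivalent YAML build definition (azure-pipelines.yml) might look like. The task versions, paths and the `drop` artifact name are illustrative assumptions, not taken from the sample repositories.

```yaml
# Sketch of a build pipeline matching the C# Function template described above.
# Solution paths and artifact names are assumptions for illustration.
trigger:
- master

pool:
  vmImage: 'vs2017-win2016'       # hosted agent with Visual Studio 2017

variables:
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@0

- task: NuGetCommand@2            # restore the NuGet packages
  inputs:
    restoreSolution: '**/*.sln'

- task: VSBuild@1                 # build the solution
  inputs:
    solution: '**/*.sln'
    configuration: '$(buildConfiguration)'

- task: VSTest@2                  # run the unit tests
  inputs:
    testAssemblyVer2: '**/*Tests*.dll'

- task: ArchiveFiles@2            # create the zip archive for deployment
  inputs:
    rootFolderOrFile: '$(Build.BinariesDirectory)'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'

- task: PublishBuildArtifacts@1   # publish the artifacts for the release
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'drop'
```

Whether you build this in the visual designer or as YAML in the repository, the tasks and properties are the same; the YAML version has the advantage of being versioned alongside the code.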
It should be made clear that this is the build process that will create a set of files that can be deployed to Azure. It does not do the actual publishing to Azure itself – that takes place in the Release step next.
Once you have filled out the details, you can save and queue the build. This will give you options around agents to run the build on, with the default being a set of hosted agents with Visual Studio 2017 running. If you have specific requirements such as a third-party product to be installed like SharePoint or BizTalk then you can deploy your own agent and configure that agent to be available as part of your own private queue.
The next step is one of my favourites as you can see the live status as the build progresses including the detailed logs. This not only gives you a clear indication of any problems, but it makes you feel better that something is actually happening.
Being the new Microsoft, these processes work not only for traditional Visual Studio based development but also with Node-based solutions such as our Angular client. There are tasks for npm install and others, as you can see below. The final publish step pushes the artifacts to a location where the release process can use the files.
You will notice above that the tests for the client code are run and the results published, along with the code coverage results. Some additional steps were required to implement this, outlined below:
- Add the packages karma-cli, karma-phantomjs-launcher and karma-junit-reporter
- Set up the tests to run using headless Chrome (i.e. running the tests without a browser window opening on the desktop)
- Ensure the test results are pushed to the folder root
- Ensure the coverage results are pushed to the folder root
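The pipeline steps that consume those outputs can be sketched in YAML as below. The npm script name and the result file paths are assumptions that would need to match your karma reporter configuration.

```yaml
# Sketch of the Node/Angular build steps with test and coverage publishing.
# Script names and file patterns are assumptions for illustration.
steps:
- task: Npm@1                     # npm install
  inputs:
    command: 'install'

- task: Npm@1                     # run the karma tests headlessly via an npm script
  inputs:
    command: 'custom'
    customCommand: 'run test -- --watch=false --code-coverage'

- task: PublishTestResults@2      # pick up the JUnit XML from karma-junit-reporter
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/TESTS-*.xml'

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/**/cobertura-coverage.xml'
```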
Once done, you can see the test and coverage results in the build details.
So, you now have some deployable files that need to be pushed to the Azure service. That is where the Release process kicks in.
Creating a Release Process
The release process can either be triggered automatically at the end of a build or be manually triggered. Creating a new release will allow you to define which artifacts you require from builds and which tasks you will run against these.
Adding the Azure App Service deployment template will allow you to easily deploy the Azure Functions. Once selected, you need to define which artifacts are needed from the builds.
There are then plenty of tasks that you can add, from the standard App Service Deploy through to PowerShell and various copy tasks. The PowerShell task is hugely useful as it lets you update properties in your configuration. Custom PowerShell scripts update values for each environment, ensuring that environment-specific settings are applied from a single build. This also means there is no need to store insecure values in your source repository, and you can remove developer access to the live and UAT environments. The parameters for the script are built into the PowerShell task and are therefore only available to those with access to the task.
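As a sketch of that approach, a release-time Azure PowerShell step could overwrite an app setting using pipeline variables defined per environment. The service connection, variable and setting names here are all assumptions for illustration.

```yaml
# Sketch of a release step that writes an environment-specific app setting,
# so the secret value never lives in source control.
# 'MyServiceConnection' and the variable names are assumed, not real values.
- task: AzurePowerShell@3
  inputs:
    azureSubscription: 'MyServiceConnection'
    scriptType: 'InlineScript'
    inline: |
      # $(ResourceGroup), $(FunctionAppName) and $(ApiKey) are release
      # pipeline variables, scoped to each environment (dev/test/live).
      Set-AzureRmWebApp -ResourceGroupName "$(ResourceGroup)" `
                        -Name "$(FunctionAppName)" `
                        -AppSettings @{ "ApiKey" = "$(ApiKey)" }
```

Because the secret values are held as pipeline variables (which can be marked as secret), only those with access to edit the release definition can see or change them.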
For the Functions deploy, most of the default values can be used unless you have a more complex scenario. The Package should point to the zip file created by your build process.
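A minimal sketch of that deploy step in YAML form is below; the service connection name, app name variable and artifact path are assumptions.

```yaml
# Sketch of the App Service deploy step for the Functions app.
# The 'drop' artifact alias and variable names are assumptions.
- task: AzureRmWebAppDeployment@4
  inputs:
    azureSubscription: 'MyServiceConnection'
    appType: 'functionApp'
    WebAppName: '$(FunctionAppName)'
    packageForLinux: '$(System.DefaultArtifactsDirectory)/drop/$(Build.BuildId).zip'
```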
Finally, I have used the Azure File Copy task to copy the Angular build output to the Azure Blob Storage Static Website. This feature is still in preview but is a great way to host a basic Single Page Application built in Angular.
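That copy step might look something like the sketch below; the source path and storage account variable are assumptions, while `$web` is the container name the Static Website feature uses.

```yaml
# Sketch of the Azure File Copy step pushing the Angular bundle to the
# '$web' container used by the Static Website feature. Paths are assumptions.
- task: AzureFileCopy@2
  inputs:
    SourcePath: '$(System.DefaultArtifactsDirectory)/client/dist'
    azureSubscription: 'MyServiceConnection'
    Destination: 'AzureBlob'
    storage: '$(StorageAccountName)'
    ContainerName: '$web'
```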
In other projects, we have also included a set of UI tests for the integration environment and a script to clear the database and import a set of known sample data. This allows a clear run of UI tests with known data which can flag failures before deploying to a User Acceptance Test or Live environment.
Having worked with the earlier Microsoft incarnations of Team Foundation Server and Visual Studio Online, Azure DevOps is the culmination of an evolving product that has become essential for any Azure developer. The ability to use it in a simple or complex way depending on your needs, all in one place, removes the need to spend days on each project trying to stitch other products together. There may be better solutions for individual elements (although not many), but as a whole toolbox that works well with Azure it cannot be beaten at the moment. You will curse it many times (especially when your build fails after 30 minutes because of a simple typo), but far less than you would have in the past, and that is easily balanced out by the speed of getting a CI/CD process running.
When this is combined with Azure Resource Manager templates, you can build a complete new environment from source code to fully working service at the click of a button – but that will be covered in the next and penultimate part of the series.
By Kevin McDonnell, Senior Technical Architect at Ballard Chalmers
About the author
Kevin McDonnell is a respected Senior Technical Architect at Ballard Chalmers. With a Master of Engineering (MEng), Engineering Science degree from the University of Oxford he specialises in .NET & Azure development and has a broad understanding of the wider Microsoft stack. He listens to what clients are looking to achieve and helps identify the best platform and solution to deliver on that. Kevin regularly blogs on Digital Workplace topics and is a regular contributor to the monthly #CollabTalk discussions on Twitter.