This is the third and last article telling the tale of our own DevOps for Microsoft Dynamics 365, and the technology behind it.
Part I – Background and how our DevOps tools evolved before we knew about it
Part II – Automation of the build and deploy process using custom VSTS Build Tasks
Part III – Demo of complete build and release definitions taking you from A to Z
After the first two articles, we now have a handful of custom VSTS Build Tasks to help us take the build and deployment automation all the way. This final article demonstrates how we do that with VSTS builds and releases, and finally raises the questions of why we did all this and where to go from here.
A complete VSTS Build for Microsoft Dynamics 365
Below is a sample of a full build process that not only builds and packs a new CRM solution, but also updates the individual assemblies and web resources in the DEV environment, exports solutions and data, and then publishes the files exported from DEV together with the Shuffle Definitions and Package Definition as the resulting build artifact.
The task Apply Build Version to assemblies is actually the classic sample build task published by Microsoft in a tutorial on how to create build tasks. We have just added one thing: the semantic version number extracted from the build number is also saved to a file in the root working folder on the build machine.
This file is picked up by the Shuffle Export task if the “Set version” option has been checked on that step. This way, the same version that the VSTS Build composes is set on all solutions before export.
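The version hand-off between the two tasks can be sketched in Python. The build-number format, the file name `version.txt`, and both helper names are assumptions for illustration, not the actual task implementation:

```python
import re
from pathlib import Path


def extract_semver(build_number: str) -> str:
    """Pull the semantic version (major.minor.patch) out of a build number.

    Assumes the build number embeds a version, e.g. "MMS.MUA_1.2.3.45".
    """
    match = re.search(r"(\d+)\.(\d+)\.(\d+)", build_number)
    if not match:
        raise ValueError(f"No version found in build number: {build_number}")
    return ".".join(match.groups())


def save_version(build_number: str, working_folder: str) -> Path:
    """Write the extracted version to a file in the root working folder,
    where a later step (such as the export task) can pick it up."""
    version_file = Path(working_folder) / "version.txt"
    version_file.write_text(extract_semver(build_number))
    return version_file
```

The export step would then read that file and stamp the version on each solution before exporting it.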
The published artifacts include files matching the following patterns:
The result is that the Shuffle Definitions, the exported solution files, the Shuffle data files, and the Package Definition for the CRM Deployer are included as artifacts.
No binaries are included, except for the ones zipped within the solution files. The artifacts we are interested in are the files we can use to deploy the project to new or existing environments.
Combining builds to composite VSTS Releases
A Release is typically composed of the result of several builds. In this demo case we want to create a release to be deployed to test environments for a project called “MMS MUA”. This project has two of our products as prerequisites: Cinteros Utils and CEM.
I will create three Environments as targets for this release: one to create the cdzip package that is the distributable for deployment, and two for automated deployment to other dev environments, where managed MMS MUA is a prerequisite.
So the Release Definition consumes three artifacts, as shown in the picture below. All of these are the results of builds set up like the example above.
The cdzip packages are simply zip files containing the Package Definition (cdpkg), the Shuffle Definitions (xml), and the data (xml) and solution (zip) files. Composing these packages can therefore easily be done using existing utility tasks in VSTS.
First, the files to pack are copied to a “pack folder”.
This is done in the same way for all three artifacts.
After that, the files are packed using the built-in Archive Files task.
Finally, the cdzip file is uploaded to an FTP site that has been defined under Services in VSTS. From this FTP folder, the distributable can be picked up and deployed using the CRM Deployer.
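These three steps (copy to a pack folder, archive, FTP upload) can be roughly sketched in Python. The file patterns follow the file types listed above, but the function names, folder layout, and pattern details are assumptions; the real release uses the built-in VSTS utility tasks:

```python
import shutil
import zipfile
from ftplib import FTP
from pathlib import Path

# File types that go into a cdzip: package definition, shuffle
# definitions and data, and exported solution zips (assumed patterns).
PATTERNS = ["*.cdpkg", "*.xml", "*.zip"]


def pack_cdzip(artifact_dir: str, pack_dir: str, cdzip_path: str) -> Path:
    """Copy matching files into a pack folder, then archive the folder."""
    pack = Path(pack_dir)
    pack.mkdir(parents=True, exist_ok=True)
    for pattern in PATTERNS:
        for f in Path(artifact_dir).rglob(pattern):
            shutil.copy(f, pack / f.name)
    target = Path(cdzip_path)
    with zipfile.ZipFile(target, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in pack.iterdir():
            zf.write(f, f.name)
    return target


def upload_cdzip(cdzip_path: str, host: str, user: str, password: str) -> None:
    """Upload the finished cdzip to the FTP site."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(cdzip_path, "rb") as fh:
            ftp.storbinary(f"STOR {Path(cdzip_path).name}", fh)
```

In the actual Release Definition, the Copy Files and Archive Files tasks do the first function's job, and the FTP Upload task does the second's.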
When performing an automated deploy from VSTS to a CRM environment, we simply run the same Shuffles as we did during the build, but reversed, and for all included artifacts.
To simplify setting up several environments, a Task Group has been created for each source artifact/product. The Task Groups accept only one parameter: the connection string to CRM.
This connection string is defined as an Environment-specific variable, and the password is marked as Secret so that it is not visible in any logs or other output from the deploy.
Using Task Groups is an easy way to combine several Tasks to be executed in the same way for several different Environments in the Release Definition. Task Groups can have a set of variables defined, which can be filled with different information for each environment. In our case, these Task Groups have the CrmConnection parameter to specify how to connect to each CRM environment.
The tasks for the task groups are very simple, just two instances of the Shuffle Import task. The first one imports solutions in the artifact, and the second one the data.
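As an illustration only, the task group can be modeled as a function with a single CrmConnection parameter that runs the two import steps in order. Here `run_shuffle_import` is a stand-in stub (the real work is done by the Shuffle Import build task), and the Shuffle Definition file names are made up:

```python
def run_shuffle_import(definition: str, crm_connection: str) -> str:
    """Stand-in for the Shuffle Import task; a real implementation would
    invoke the Shuffle engine with this definition against the given CRM."""
    return f"imported {definition}"


def deploy_task_group(crm_connection: str) -> list:
    """The task group as a function: one parameter, two import steps.
    First the solutions in the artifact are imported, then the data."""
    # A CRM connection string typically looks something like
    # "AuthType=Office365;Url=https://org.crm.dynamics.com;Username=user@org.com;Password=***"
    # with the password supplied from the Secret environment variable.
    results = []
    for definition in ("ShuffleDefinition_Solutions.xml",
                       "ShuffleDefinition_Data.xml"):
        results.append(run_shuffle_import(definition, crm_connection))
    return results
```

Each Environment in the Release Definition then calls the same task group with its own CrmConnection value.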
Why go our own way?
This is of course a question we have asked ourselves more than once, and a question we should continue to ask regularly.
Using the functionality released by Microsoft is always the first option, as it keeps us focused on what we really do (delivering awesome enterprise scale systems empowering our customers’ users and improving their processes and quality, etc yada yada sales talk) instead of spending time on our own support systems.
But most of our tools were created when there was no alternative from Microsoft, and when we still thought that creating our own tools and having full control over them was better than jumping on either open-sourced toolkits or paid solutions.
Having worked out our processes and implemented our own flavors of solution management, configurability, products, and toolkits for CRM, we have been able to tailor our support systems to do what we need, in the way we need it.
Given all the benefits of the “not our responsibility” attitude you can assume when using official tools from Microsoft, and the “sharing is caring” possibilities of open source, you really have to be confident that the time you spend on internal tools is worthwhile.
So far, the time has been well spent.
Where do we go from here?
Having said that our time has been well spent, going further and offering our version of ALM and CI for Microsoft Dynamics 365 on VS Marketplace is definitely an option today. Another option is going completely open source on GitHub, which I currently feel is more probable.
That of course depends on whether there is any interest from other parties out there. I am fully aware that our methods might just raise an eyebrow or two, and then fade into oblivion.
If you are interested in more details about our methods and tools, don’t hesitate to contact me on Twitter @rappen, in the comments below, or send me an e-mail.
EDIT: These DevOps tools have now been published as “Public Preview”! Read the article here: http://jonasrapp.innofactor.se/2017/06/DevOps-Preview.html