
Implementing the ERP system Microsoft Dynamics 365 F&O is a genuinely challenging task. It requires adjusting to a new approach to managing the application lifecycle, including the administrative tasks this involves.

A lot has changed compared to the previous versions of the system. For instance, the methodology of moving code between environments has changed in the newest version of the popular AX.

In this series of three articles, I will present the most important technical matters regarding the automation of Dynamics 365 F&O. In this article, I will introduce and explain the basic concepts. I also hope that the entire series will inspire AX "old birds" to look for an alternative, more optimal way of configuring automation.

In this first article, I will describe the basic components of automation, their roles, and the way they work in Dynamics 365 F&O.

Task automation

In Dynamics 365 F&O, many administrative tasks are, in a sense, forced on IT employees. Of course, the previous versions of the system also required proper maintenance, but there was greater latitude in how to accomplish these tasks.

Personally, I consider this enforcement of good practices to be a big plus of the Dynamics 365 F&O system. The new methodologies give a greater sense of comfort and stability when operating the system. The workflow for tasks such as transferring code between environments or backing up the database forces the administrator to apply good practices while performing them.

To explain the need for automation, I will first discuss what a Dynamics 365 F&O system administrator must face daily. The following diagrams illustrate two scenarios that cannot be avoided during the Dynamics 365 F&O implementation phase, and later during the maintenance phase.

Transferring code between environments

Pic. 1 Workflow for transferring code between environments

The code workflow looks like this:

  1. The programmer creates the source code on the development machine.
  2. The source code is put into the code repository (in this case the repository on Azure DevOps).
  3. On the build machine, the code is compiled, and the output files are created (.dll files).
  4. The object code is transferred from the build machine to the Tier 2+ environment.
  5. From the Tier 2+ environment, the object code is transferred to the production environment.
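
To make step 3 more tangible, below is a minimal PowerShell sketch of how the compile could be reproduced manually with the open-source d365fo.tools library (listed in the sources at the end of this article). The model name is a hypothetical placeholder, and a real build Pipeline would typically wrap this step in dedicated build tasks:

```powershell
# Install and load the community d365fo.tools library if it is missing
if (-not (Get-Module -ListAvailable -Name d365fo.tools)) {
    Install-Module -Name d365fo.tools -Scope CurrentUser -Force
}
Import-Module d365fo.tools

# Run a full compile of one model; the produced binaries (.dll files)
# are what later gets packaged and moved to the Tier 2+ environment.
# "MyCustomizations" is a hypothetical model name - substitute your own.
Invoke-D365ModuleFullCompile -Module "MyCustomizations"
```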

Preparing a database backup

Pic. 2 Workflow for preparing a database backup

The workflow for preparing a database backup is as follows:

  1. A database copy from the production environment to the Tier 2+ environment is initiated.
  2. From the Tier 2+ environment, the database backup is transferred to the LCS Asset Library in the form of a .bacpac file.
  3. The last step is to download the database file from the LCS Asset Library and prepare it for use on development machines. During this operation, the database file is converted from the .bacpac to the .bak format. For this, you can use the popular PowerShell library created for Dynamics 365 F&O called "d365fo.tools", specifically its "Import-D365Bacpac" function.
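
As an illustration of the last step, here is a minimal sketch using the "Import-D365Bacpac" function mentioned above, as it could be run on a Tier 1 development machine. The file path and database name are hypothetical placeholders:

```powershell
Import-Module d365fo.tools

# Import the .bacpac downloaded from the LCS Asset Library into the local
# SQL Server; on a Tier 1 machine this produces a regular SQL Server database
Import-D365Bacpac -ImportModeTier1 `
    -BacpacFile "C:\Temp\ProdBackup.bacpac" `
    -NewDatabaseName "AxDB_ProdCopy"

# Point the local application at the freshly imported database
Switch-D365ActiveDatabase -NewDatabaseName "AxDB_ProdCopy"
```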

Without automation, all the above tasks must be initiated and supervised manually by the administrator, most often at times far outside standard business hours.

Fortunately, not only the dynamic development of the Microsoft Dynamics 365 ecosystem but also the embedding of the newest AX's infrastructure in the Azure cloud yields innovative and interesting possibilities. While enjoying the benefits of cloud services, we can automate a lot of time-consuming work.

The tasks that are easiest to automate are those concerning strictly administrative work.

The experience of our Competency Centre shows that implementing automation in the processes of:

  • building,
  • releasing,
  • and database backups

gives measurable benefits in terms of:

  • costs,
  • saved time,
  • and, what is more, reduced stress.

Automation components

In the automation system of the newest AX, it is worth understanding four main elements:

Runtime environment

It is a virtual machine where the actual automation work is carried out. In the case of D365 F&O, it is most often a Tier 1 virtual machine (a developer or build machine). The alternative is a runtime environment hosted in the Azure cloud using a Microsoft Hosted Agent. The crucial property of that environment is that it is less tailored to Dynamics 365 F&O: it lacks many components of a "classic" developer/build machine, such as SQL Server, and therefore database synchronization cannot be performed there.

Azure Pipeline

Limiting the definition to the context of automation in D365 F&O, it is the Azure DevOps service in which we define the tasks of the automation process. Here we choose and parametrize the exact steps which, with the use of an Agent and execution scripts, will be performed in a particular runtime environment.
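
For orientation only, a Pipeline defined in YAML could look more or less like the sketch below. The pool name and script path are hypothetical placeholders; a real D365 F&O Pipeline will contain its own, build-specific steps:

```yaml
# Illustrative azure-pipelines.yml; all names are hypothetical placeholders
trigger:
  - main                      # run the Pipeline on every push to the main branch

pool:
  name: 'D365-BuildPool'      # a self-hosted Agent Pool (see the next sections)

steps:
  - task: PowerShell@2        # one parametrized step, executed by the Agent
    displayName: 'Run execution script'
    inputs:
      filePath: 'scripts/Backup-Database.ps1'  # an execution script from the repo
```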

Azure Pipeline Agent

It is a program working as an operating system service in the runtime environment. For D365 F&O, it comes preinstalled on the disk of the developer/build machine. Alternatively, a Microsoft Hosted Agent can be used: a specific type of Azure Pipeline Agent embedded in an Azure cloud runtime environment provided by Microsoft and dedicated to quick builds.

The Agent is responsible for downloading the steps of the Pipeline definition saved in Azure Pipelines and ordering their execution in the runtime environment. To process a given Pipeline step, the Agent uses execution scripts.
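
On a developer/build machine, you can quickly verify that the Agent is indeed running as an operating system service. A small sketch, assuming the Windows naming convention in which Azure Pipeline Agent services have names starting with "vstsagent":

```powershell
# List the Azure Pipeline Agent services and their state on this machine
Get-Service -Name "vstsagent*" |
    Select-Object -Property Name, Status, StartType
```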

Execution scripts

The Agent needs instructions to perform the individual steps of the Pipeline. Execution scripts are such instructions. These files store the code with which a Pipeline step is performed. Most often, execution scripts come in the form of PowerShell files. They are often stored and modified permanently in the runtime environment; however, it is good to know that this is not the only way to work with these scripts.

A much better approach (in many cases) is to store the scripts in a code repository. With the appropriate (quite simple) configuration, the Pipeline Agent will download the execution scripts for execution, and their content will be processed only in the context of the current Pipeline run. This allows for convenient and flexible modification of execution scripts between individual Pipeline runs.
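
As an example, a repository-stored execution script could be a small, parametrized PowerShell file like the sketch below (e.g., the scripts/Backup-Database.ps1 referenced in the earlier YAML sketch). All names and parameters here are hypothetical; the values are supplied by the Pipeline step, so the script itself stays environment-agnostic:

```powershell
# Backup-Database.ps1 - a hypothetical execution script kept in the repository
param(
    [Parameter(Mandatory = $true)]
    [string] $DatabaseName,

    [string] $BackupFolder = "D:\Backups"
)

Import-Module d365fo.tools

# Export the given database to a .bacpac file that can be uploaded to LCS
New-D365Bacpac -ExportModeTier1 `
    -DatabaseName $DatabaseName `
    -BacpacFile (Join-Path $BackupFolder "$DatabaseName.bacpac")
```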

Another important feature is that execution scripts can be defined in many forms. This gives enormous possibilities in terms of what can be automated and how. The only limitations are the capabilities of the runtime environment and of the selected scripting language, but honestly… the only limitation is our imagination.

Interactions between components

The above components work together in the automation process. The diagram below shows the way these components interact (Pic. 3).

Pic. 3 Azure pipelines diagram

Within the Azure Pipelines service, the particular tasks of a Pipeline are defined. In the runtime environment, there are Azure Pipeline Agents, grouped within a defined Azure Agent Pool. The Azure Agent Pool determines which Agents, on which environments, a given Pipeline can use. The Pool conventionally groups Agents and is defined within the Azure Pipelines service.

A single Agent communicates with the Azure Pipelines service and downloads the Pipeline's tasks sequentially. Each task is then performed in the runtime environment, using its resources and the execution scripts, until the last task from the Pipeline definition has been downloaded and performed.

Attention! In this setup, communication is initiated only from the Azure Pipeline Agent's side. This opens up a lot of possibilities in terms of runtime environment configuration. In my opinion, the communication pattern that Microsoft used here is simply an architectural masterpiece. As this article is only an introduction, I will expand on the topic in the next articles of the series.

Ending

The article presents the basic concepts of automation for Dynamics 365 F&O using Azure services. Its form is condensed: it omits some of the less significant details, because I assumed that the crucial thing is to understand the high-level logic of automation. The detailed definitions are easy to find, read, and understand in the commonly available documentation.

However, I hope that the article has allowed you to understand the fundamentals of automation for the D365 F&O system: its idea and its scheme of operation. This is the key to implementing it with full understanding.

In the next article, I will compare the methods of hosting runtime environments. I would really like to hear your opinion on this content, and I encourage you to follow the blog.

Sources of valuable materials

  • Ariste.info – a blog authored by Adrià Ariste Santacreu. It contains a lot of material on the automation of administrative tasks in D365 F&O, and more. The author also shares his knowledge and ideas about the use of Dynamics-related tools and services that cooperate with the system and can improve the experience of operating it. The content is written in the form of tutorials, so you can learn through practice. I highly recommend it.
  • d365fo.tools – an open-source PowerShell library that can significantly improve work with Dynamics 365 F&O. The idea was developed by Mötz Jensen, but at the moment several people are working on it. It is an especially useful tool in daily administrative work with Dynamics 365 F&O.
  • MsDyn365FO – a blog authored by Paul Heisterkamp on Dynamics 365 F&O. Although there is not much content on configuring automation in Dynamics 365 F&O itself, the author shares interesting ideas for solving common problems, e.g., automating the preparation of an environment after restoring a database backup from PROD. Lots of interesting and useful content.
  • Microsoft documentation on automation using Azure Pipelines.
  • Microsoft documentation on automatic builds using Microsoft Hosted Agents.

***

If you are interested in other articles in the area of Dynamics 365, we encourage you to read: Integracja Dynamics 365 Supply Chain Management z przewoźnikami and Komponenty niestandardowe w Power Apps.

