On a recent project, we were faced with the challenge of keeping data synchronized between an external data source that was constantly being updated and Dynamics 365, which was serving primarily as the database for an externally facing Dynamics Portal website.
We had two goals in mind:
- Keep the strain on the source system to a minimum
- Synchronize the data as close to real time as possible
The solution we came up with involves three main Azure components:
- Queue storage
- Logic Apps
- Function Apps
The first step is to get the data we need into Azure. In this case, the source database (Epicor) can perform custom actions and/or run custom code when a record is modified in any way (created, updated, or deleted). Using this functionality, we wrote small pieces of code that identify when a record we need is changed and push the relevant data, as JSON, into an Azure Storage Queue dedicated to that record type. The workflow is simple, and the code is very lightweight, minimizing any strain on Epicor itself.
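As a sketch of what such a trigger might emit, the snippet below builds a JSON queue message for a record change. The field names (`record_type`, `operation`, `data`) and the sample customer record are illustrative assumptions, not Epicor's actual schema.

```python
import json

def build_queue_message(record_type, operation, fields):
    """Serialize a record change as a JSON queue message.

    Hypothetical payload shape -- not Epicor's real schema.
    """
    return json.dumps({
        "record_type": record_type,   # e.g. "Customer"; one queue per record type
        "operation": operation,       # "created", "updated", or "deleted"
        "data": fields,               # the relevant columns for this record
    })

# Example: a customer record was updated in the source system.
message = build_queue_message(
    "Customer", "updated", {"CustID": "C-1001", "Name": "Contoso Ltd"}
)
```

In the real solution this string would be pushed to the record type's Azure Storage Queue (for instance via the `azure-storage-queue` SDK or the REST API); the serialization is the part the source-system code has to get right.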
The next step is to process the data now sitting in the Azure Queue, which is done via an Azure Logic App. The Logic App is set up to watch a specific queue; once a message appears there, the Logic App picks it up for processing. The main processing step is to POST the JSON from the queue message to an Azure Function App we’ve created to handle the record. If the POST to the Function App returns a 200 OK HTTP response, the message is deleted from the queue. If any other response is received, the message is put back in the queue to be retried.
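The Logic App's delete-on-200, requeue-otherwise behavior can be sketched in a few lines. This is a simulation with an in-memory queue and a stand-in for the HTTP call, not the Logic App designer's actual configuration.

```python
from collections import deque

def process_queue(queue, post_to_function):
    """Drain the queue once, mimicking the Logic App's behavior.

    `post_to_function` stands in for the real HTTP POST to the Function App
    and returns an HTTP status code.
    """
    processed, requeued = [], []
    while queue:
        message = queue.popleft()            # Logic App picks up the message
        status = post_to_function(message)   # POST the JSON to the Function App
        if status == 200:
            processed.append(message)        # 200 OK: delete from the queue
        else:
            requeued.append(message)         # any other response: retry later
    queue.extend(requeued)                   # failed messages go back on the queue
    return processed

# Example: the second message fails processing and stays on the queue.
queue = deque(['{"id": 1}', '{"id": 2}'])
fake_post = lambda msg: 200 if '"id": 1' in msg else 500
done = process_queue(queue, fake_post)
```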
The Azure Function App, mentioned in the previous step, is the last piece of our data sync puzzle. It is code written to take in JSON and process it as an entity that belongs in Dynamics 365, and it can be called like any RESTful web service. Based on the incoming data, it determines whether the record is new or an update to an existing record, then performs the work needed to translate the JSON into the appropriate entity in Dynamics. If all is successful, it returns a 200 OK HTTP response; if not, it returns a response relevant to the problem that occurred.
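The Function App's create-or-update decision can be sketched as below. A plain dictionary stands in for Dynamics 365, and the key field name is an assumption; the real code would call the Dynamics 365 Web API instead.

```python
import json

def handle_record(body, store, key_field="CustID"):
    """Upsert an entity from a JSON payload; return an HTTP status code.

    `store` is an in-memory stand-in for Dynamics 365, and `key_field` is a
    hypothetical unique key -- the real implementation targets the Web API.
    """
    try:
        record = json.loads(body)
        key = record["data"][key_field]
    except (ValueError, KeyError):
        return 400  # malformed payload: return a response relevant to the problem
    if key in store:
        store[key].update(record["data"])   # existing record: update it
    else:
        store[key] = dict(record["data"])   # new record: create it
    return 200  # all successful

# Example: a new customer record arrives and is created.
dynamics = {}
status = handle_record(
    '{"data": {"CustID": "C-1001", "Name": "Contoso Ltd"}}', dynamics
)
```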
Configuring the Azure pieces
In the short description of the data sync flow above, I’ve only briefly touched on each Azure piece (queues, Logic Apps, and Function Apps), and for the most part they’re all being used in a very simple way. However, it’s worth noting that each has multiple features and, in some cases, can be configured differently depending on how you want it to behave in specific scenarios.
For example, Azure Queues have a concept of a “dequeue count”: the number of times a message has been retrieved from the queue for processing. If this number gets high, it generally indicates that whatever is processing the message is repeatedly failing to handle it. Monitoring this value allows for a proactive response to potential errors.
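One common way to act on the dequeue count is to divert repeatedly failing ("poison") messages instead of retrying them forever. The threshold and the poison-queue pattern below are common practice, not something taken from our configuration, and the names are illustrative.

```python
# Assumed threshold: after this many pickups, stop retrying the message.
MAX_DEQUEUE_COUNT = 5

def route_message(message, dequeue_count, poison):
    """Decide whether a message should be processed or set aside.

    In Azure, `dequeue_count` comes with each retrieved queue message;
    `poison` stands in for a separate queue used for investigation/alerting.
    """
    if dequeue_count > MAX_DEQUEUE_COUNT:
        poison.append(message)  # repeated failures: set aside and raise an alert
        return "poison"
    return "process"            # normal path: hand the message to the processor

poison = []
```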
For Logic Apps, when processing messages from an Azure Queue, the default configuration processes up to 25 messages at a time in parallel. In certain scenarios this may not be desirable; for example, if records need to be processed in order, you’d want to configure the Logic App to pick up only one message at a time. This can be done by altering the configuration of the trigger that monitors the queue and setting its concurrency value.
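In the Logic App's underlying workflow definition, this corresponds to a `runtimeConfiguration` block on the trigger. The trigger name and recurrence below are illustrative; `concurrency.runs` is the setting that caps how many runs execute in parallel.

```json
"triggers": {
  "When_there_are_messages_in_a_queue": {
    "type": "ApiConnection",
    "recurrence": { "frequency": "Second", "interval": 30 },
    "runtimeConfiguration": {
      "concurrency": { "runs": 1 }
    }
  }
}
```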
The approach laid out above is just one way to solve the problem that was posed to our team. It allowed us to meet our goals of near-real-time data synchronization between the two systems while minimizing the impact on the source system. The pieces configured in Azure were easy to set up, allowing us to get the solution in place in a reasonable timeframe, yet powerful enough to handle a large volume of data.