For those who have lived in a bubble until now, the Data Export Service (DES) is a Microsoft service (available on AppSource HERE) that exports data and schema from a Dynamics 365 Online instance (sorry old-timers, your existing on-premises environment should have kept up with the times) to your own Azure SQL database. Follow the link above for more details.
Configuring this service is quite trivial, requiring only a few inputs. One gotcha to keep in mind: in order to configure specific entities to synchronize, they must have “change tracking” enabled at the entity level, as seen in the screenshot below.
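Before configuring DES, it can help to check which of the entities you plan to sync already have the flag turned on. Here is a minimal sketch against the Dataverse Web API's `EntityDefinitions` metadata endpoint; the `org_url`, `token`, and the entity names in the usage example are placeholders you would supply yourself.

```python
import json
import urllib.request


def fetch_entity_flags(org_url: str, token: str) -> list:
    """Query the Web API for every entity's change-tracking flag.

    `org_url` is your environment URL (e.g. https://contoso.crm.dynamics.com)
    and `token` a pre-acquired OAuth bearer token -- both placeholders here.
    """
    url = (f"{org_url}/api/data/v9.2/EntityDefinitions"
           "?$select=LogicalName,ChangeTrackingEnabled")
    req = urllib.request.Request(url, headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]


def missing_change_tracking(entities: list, wanted: set) -> set:
    """Return the entities you plan to sync that still need the flag enabled."""
    enabled = {e["LogicalName"] for e in entities
               if e.get("ChangeTrackingEnabled")}
    return wanted - enabled
```

Anything returned by `missing_change_tracking` would still need change tracking switched on (via the customization UI) before DES can sync it.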
Anyway, this article is not about what DES is, or how to configure it. You will find literally a thousand articles about that by running a quick search.
Instead, I want to focus on some of the scenarios where this makes sense to use. Let’s keep it high level so we can cover more.
Building a DW/DM/ODS
Ok, maybe a little misleading here in the title, but you get the idea.
DES exports both the schema and the data. Consequently, on the other end (in your Azure SQL database) you get a structure that mirrors the internal Common Data Model (CDM) structure. This is in no way your DW. You will still need to transform this data and bring it into your final DW. DES lands your data in a staging area, and you can then put in place all the necessary processes, validations, and transformations before the data finally makes it into your DW. This way, you end up with exactly what you need, in the format you need it, and with minimal potential for future issues.
Supporting a complex BI Solution
Reporting is top of mind for all organizations. After all, on the path to Digital Transformation, we’ve planned and executed on the data collection strategies, and now we need to figure out how to make sense of all the data we are collecting.
The Dynamics 365 platform does allow for some simplistic reporting using the built-in tools, and for some scenarios, those are just enough.
For more complex scenarios, you need a tool like Power BI or Tableau to provide added value. I’ve mentioned these two as the top players right now, but things change every day. Of course, for a Microsoft-based solution, Power BI is top of mind, but I’m finding a lot of organizations using Tableau instead.
Using DES in this scenario allows complex queries with potential performance impact to run against a separate data source, with no impact on the actual production environment.
Furthermore, taking this approach allows you to blend in data from other external sources, enabling more comprehensive analysis and reporting that provides better overall value with decreased risk.
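As a toy example of that blending, the sketch below joins DES-exported account revenue against an external industry benchmark. The row shapes (`accountid`, `industrycode`, `revenue` on the DES side; `industry_avg_revenue` on the external side) are hypothetical stand-ins for whatever sources you actually combine.

```python
def blend_revenue(des_rows: list, market_rows: list) -> list:
    """Join exported account rows with external industry averages.

    `des_rows` stands in for data queried from the Azure SQL copy; the
    external benchmark rows and all column names are illustrative.
    """
    avg_by_industry = {m["industrycode"]: m["industry_avg_revenue"]
                       for m in market_rows}
    blended = []
    for row in des_rows:
        avg = avg_by_industry.get(row["industrycode"])
        blended.append({
            "accountid": row["accountid"],
            "revenue": row["revenue"],
            # None when no benchmark exists for that industry
            "vs_industry_avg": (row["revenue"] - avg) if avg is not None else None,
        })
    return blended
```

In a real solution this join would typically happen in the BI tool's data model or in a view over the Azure SQL database, but the idea is the same: the production system never sees the query.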
Supporting Advanced Analytics

Different from supporting a BI solution, the Advanced Analytics aspect covers support for Artificial Intelligence (AI) and Machine Learning (ML). Extracting data from your current production system and feeding it into an ML model allows for a whole new way of making informed and automated decisions. Similar to the BI support model, though, you can mash up data from various systems and correlate the analysis for better results.
This topic deserves a full post of its own at a future time.
Aside from the scenarios described in the previous topics, let’s consider a more complex implementation that leverages integrations and extensions built on Azure services. The Dynamics API has a set of limitations. For the Power Platform, these limits are described HERE. Business Central limits are described HERE. These are just a couple of examples, but the idea is that any platform exposing an API will enforce some limitations, for reasons including overall platform performance, security, and so on.
In such a situation, where the overall solution starts to push the limits of these APIs, you have to find more creative ways to design and implement it.
Each API call has an associated cost. I don’t necessarily mean a financial cost; I’m talking about a performance cost here.
Reducing or eliminating read operations, while acknowledging that a read has a lower cost than a write, could make the difference between a slow solution that results in a large number of failures and a top-performing one.
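One simple way to eliminate reads is to resolve lookups against the DES-exported copy first and only hit the live API on a miss. This is a minimal sketch of that idea; the snapshot shape, the `api_read` callable, and the record IDs are all assumptions for illustration.

```python
class LookupResolver:
    """Resolve record lookups from a local DES snapshot, falling back to a
    live API read only on a miss, and count how many API calls were needed.

    `snapshot` stands in for data loaded from the Azure SQL copy;
    `api_read` stands in for a real call against the production API.
    """

    def __init__(self, snapshot: dict, api_read):
        self.snapshot = snapshot
        self.api_read = api_read
        self.api_calls = 0

    def get(self, record_id: str) -> dict:
        if record_id in self.snapshot:
            return self.snapshot[record_id]   # no API cost at all
        self.api_calls += 1                   # only misses hit the API
        return self.api_read(record_id)
```

Every lookup served from the snapshot is an API request that never counts against your limits.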
Along the same lines, for read-only operations this could be the best approach. You will have systems that require only read access to your data, and exposing it from an Azure SQL database rather than directly from the production system is the way to go.
We’ve all worked on that integration project where a lot of Data Transformation (DT) is needed before the data can be ingested into another system. That can be costly, and takes a toll not only on the source platform but also on the integration project itself. Your overall integration time is extended, resulting in runs that can push the limits of the time window available for that particular integration scenario.
Leveraging DES to export the relevant data to an Azure SQL database, and layering a custom API on top that serves the data in the modified format the target needs, allows for better overall performance. I’ve seen cases where this approach, despite an increased upfront cost, improves performance and reduces the total run time substantially. This frees up space for other jobs to run, with no performance impact on the source production system.
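The read path of such a custom API can be sketched in a few lines. Here `store` stands in for a query against the exported tables, and the response shape (`fullName`, `email`) is whatever the target system expects; the column names and record IDs are hypothetical.

```python
import json


def get_contact_payload(store: dict, contact_id: str) -> str:
    """Serve one contact from the Azure SQL copy, reshaped for the target.

    `store` is a stand-in for a lookup against the DES-exported contact
    table; the output field names are an assumed target-system contract.
    """
    row = store.get(contact_id)
    if row is None:
        return json.dumps({"error": "not found"})
    # Reshape once here, instead of during every integration run.
    payload = {
        "id": contact_id,
        "fullName": f'{row["firstname"]} {row["lastname"]}'.strip(),
        "email": row.get("emailaddress1"),
    }
    return json.dumps(payload)
```

Wrapped in your web framework of choice (an Azure Function or App Service endpoint, say), this gives consumers the shape they need while the production Dynamics instance stays untouched.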
There’s so much more to talk about, but in closing, there are some specific scenarios where this service is essential.
For more details on DES you can look at the documentation on docs HERE.
Do you have other scenarios where DES was a life saver? Leave a note below and let me know!