Dynamics CRM Data Migration Using KingswaySoft: Tips

I will divide the tips into CRM/SQL and KingswaySoft parts:

CRM & SQL Optimizations:

  • Disable CRM plugins, auditing and/or workflows during your initial load if you can, as they all have an impact on your data integration or migration performance.
  • Ensure that no real-time workflows or synchronous plugins are registered when migrating data into CRM, as they will badly affect migration speed; try to convert them to asynchronous workflows and plugins.
  • CRM plugins or workflows usually have some degree of performance impact on your CRM data integration, and poorly designed ones can severely affect it. Compare the performance before and after enabling them; in some cases you might have to revisit their design to make sure that best practices are applied in your custom code.
  • Ensure that the CRM maintenance jobs are not running at the same time as your migration packages, especially the re-index job. You can use the CRMJobEditor tool to modify the schedule of these system jobs; in general, it is advised to have them run outside core business hours.
  • Make sure that the “Reindex All” CRM maintenance job is configured and running properly; otherwise, create DB maintenance jobs to REBUILD or REORGANIZE the indexes of your CRM database on a regular basis.
  • Monitor your database server to see if there are any excessive db locks.
  • Schedule the jobs to run from within SQL Server Agent as described here. Set the SSIS package ProtectionLevel property to EncryptSensitiveWithPassword in case the connection passwords are stored locally and not passed as parameters to the package, as described here. It is advised to create a package configurations file as described here. Make sure that the packages are executed in 32-bit run-time mode to allow BDD to run; you will need to do the same in Visual Studio for debugging purposes when setting TargetServerVersion to SQL Server 2014, as described here.
  • Two components that impact the speed of your data migration are network latency and concurrency. Latency is the time that it takes for an information packet to travel through a network from its source to destination. Concurrency refers to processes that are executing simultaneously, working together to achieve the end result.

Kingswaysoft Optimizations:

  • To use CRM Bulk Data Load API, you just need to enter a batch size greater than 1 in the CRM destination component.
  • Avoid using the Duplicate Detection option if you can.
  • Make sure that you always pass in a valid lookup reference for all lookup fields, and avoid using the “Remove Unresolvable References” option. That option is designed for special scenarios, and it involves checking each lookup field value, which can sometimes be very expensive.
  • The Upsert action (except when the Alternate Key matching option is used) involves an extra service call that queries the target system to check the existence of the incoming record, which has a cost in terms of its impact on your overall integration performance. If you have a way to do a straight Update or Create, it will typically perform better.
  • For CRM On-premise, you would typically use 5 BDD branches in each data flow with each CRM destination component using a batch size of 200 or 250. You can have multiple data flow tasks in the same SSIS package that write to CRM server simultaneously.
  • If you have a multi-node cluster for your on-premise deployment, you can use the CRM connection manager’s CrmServerUrl property in its ConnectionString to target a particular node within the cluster. This way, you can have multiple connection managers in the same package or project that target different nodes of the cluster, and write to multiple destination components of the same configuration with different connection managers, so that you are effectively writing to multiple cluster nodes in parallel, which provides additional performance improvement on top of BDD.





Best Practices when Writing Dynamics CRM Plugins

  • For improved performance, Microsoft Dynamics 365 caches plug-in instances. The plug-in’s Execute method should be written to be stateless, because the constructor is not called for every invocation of the plug-in. Also, multiple system threads could execute the plug-in at the same time. All per-invocation state information is stored in the context, so you should not use global variables or attempt to store any data in member variables for use during the next plug-in invocation, unless that data was obtained from the configuration parameter provided to the constructor. Changes to a plug-in’s registration will cause the plug-in to be re-initialized.
  • It is best practice to check the target entity name and message name at the beginning of the plug-in’s Execute method, to avoid running the plug-in unintentionally.
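A minimal sketch of this guard, assuming the plug-in is registered on the Update message of the account entity (both names are illustrative):

```csharp
using System;
using Microsoft.Xrm.Sdk;

public class AccountUpdatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // Exit early if the plug-in fired for an unexpected message or entity
        // (for example, because of a misconfigured step registration).
        if (!string.Equals(context.MessageName, "Update", StringComparison.OrdinalIgnoreCase))
            return;

        if (!(context.InputParameters.Contains("Target") &&
              context.InputParameters["Target"] is Entity target &&
              target.LogicalName == "account"))
            return;

        // ... actual plug-in logic goes here ...
    }
}
```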
  • When you want to update fields on a record, it is good practice to create a new entity (or early-bound type) for the record and add only the fields you want to update. By updating only the fields you are changing, you avoid needlessly triggering other plug-ins.
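As a sketch, assuming an IOrganizationService instance named service and a known record id accountId (both are assumptions for illustration):

```csharp
// Build a fresh Entity carrying only the attributes being changed,
// rather than re-sending a fully populated record.
var update = new Entity("account", accountId);
update["telephone1"] = "555-0100"; // the only field we intend to change
service.Update(update); // untouched attributes are not sent, so plug-in
                        // steps filtered on other attributes do not fire
```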
  • When retrieving an entity using the SDK, make sure you instantiate a new object (not just assign a reference) and assign it the object returned from the Retrieve SDK message, for better performance.
  • Do not call Update on the retrieved Target entity, because it will update all fields included in the Target entity. The primary entity targeted by a platform Create or Update event should not be updated within the context of plug-in execution; developers should instead design their plug-in to execute in a stage prior to the core operation and manipulate the target object in InputParameters.
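A sketch of the pre-operation approach, assuming context is the step's IPluginExecutionContext and the description field is illustrative:

```csharp
// Pre-operation stage: modify the Target in InputParameters instead of
// issuing a separate Update call. The change is saved as part of the
// same core create/update operation, with no extra service round trip.
var target = (Entity)context.InputParameters["Target"];
target["description"] = "Set by pre-operation plug-in";
```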
  • The common method to avoid a recursive plug-in is to check whether the plug-in’s execution depth is greater than 1. This stops the plug-in from running if it was triggered by any other plug-in; it would only run if triggered from the CRM form. This can resolve the problem of plug-ins firing more than once, but it also stops plug-ins from being triggered by other plug-ins, which might not be the functionality you require.
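The depth guard itself is a one-liner, sketched here with the caveat from the bullet above (context is the IPluginExecutionContext):

```csharp
// Skip execution when this plug-in was triggered by another plug-in or
// workflow. Note: this also blocks legitimate plug-in-to-plug-in chains.
if (context.Depth > 1)
    return;
```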
  • Never save the Organization service from the CRM execution context in a static variable, as this will lead to many OpenDataReader errors thrown by the CRM platform; the plug-in/custom step must be kept stateless, as per the MSDN plug-in best practices.
  • You need proper exception handling in your plug-ins to better troubleshoot any unexpected behavior. For synchronous plug-ins, you can optionally display a custom error message in the error dialog of the web application by having your plug-in throw an InvalidPluginExecutionException with the custom message string as the exception’s Message property value. Before throwing the exception, it is good practice to log the error, its origin, and any other helpful information to your custom logging location.
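A sketch of that pattern inside Execute, assuming serviceProvider is the IServiceProvider passed to the plug-in; the tracing service stands in for a custom logging location, and the message text is illustrative:

```csharp
var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
try
{
    // ... plug-in logic ...
}
catch (Exception ex)
{
    // Log first, then surface a friendly message to the user.
    tracing.Trace("Plug-in failed: {0}", ex.ToString());
    throw new InvalidPluginExecutionException(
        "The record could not be processed. Please contact your administrator.", ex);
}
```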
  • If you throw InvalidPluginExecutionException and do not provide a custom message, a generic default message is displayed in the error dialog. It is recommended that plug-ins only pass an InvalidPluginExecutionException back to the platform.
  • Plug-ins should exist with others in a project and not be isolated. An example of an exception to this recommendation would be if a plug-in needed to be selectively deployed to an environment, whereas the others are not to be deployed.  There are two areas of impact for this observed pattern of a single plug-in per assembly:
    1. Performance – each plug-in assembly has a lifecycle that is managed by the CRM deployment, which includes loading, caching, and unloading. Having more than one assembly containing plug-ins causes more work to be done on the server and can affect the time it takes for a plug-in to execute.
    2. Maintainability – having more than one project in Visual Studio can make it more difficult to manage.  It also adds additional steps when packaging a solution and managing deployments.

    Consider merging isolated plug-ins into a single Visual Studio project and assembly.

  • Use the NOLOCK hint in Microsoft Dynamics CRM QueryExpression and FetchXML requests for CRM entities that do not change frequently (such as configuration entities), for better query execution performance.
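A sketch with QueryExpression, assuming service is an IOrganizationService and new_configuration is a hypothetical configuration entity with made-up attribute names; the FetchXML equivalent is the no-lock="true" attribute on the fetch element:

```csharp
using Microsoft.Xrm.Sdk.Query;

// NoLock adds the NOLOCK hint to the generated SQL, which is safe here
// because configuration records change rarely.
var query = new QueryExpression("new_configuration")
{
    ColumnSet = new ColumnSet("new_name", "new_value"),
    NoLock = true
};
var results = service.RetrieveMultiple(query);
```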
  • An update plug-in’s Target entity contains only the updated attributes. However, the plug-in will often require information from other attributes as well. Instead of issuing a retrieve query, the best practice is to have the platform push the required data in a pre-image.
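Sketch, assuming a pre-image was registered on the step under the name PreImage and that telephone1 is the attribute of interest (both are assumptions):

```csharp
// Read a value that is not part of the update Target from the pre-image,
// avoiding an extra Retrieve round trip to the platform.
Entity preImage = context.PreEntityImages["PreImage"];
string previousPhone = preImage.GetAttributeValue<string>("telephone1");
```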
  • For logging purposes, it is advised to use ready-made libraries such as log4net or NLog, as they are optimized for concurrency and high-workload scenarios and come with many logging options and providers.
  • Avoid the use of batch request types such as ExecuteMultipleRequest in plug-ins and workflow activities. Use these batch messages where code executes outside the platform execution pipeline, such as integration scenarios where network latency would otherwise reduce the throughput and increase the duration of larger bulk operations.
  • ExecuteMultiple and ExecuteTransaction messages are considered batch request messages. Their purpose is to minimize round trips between client and server over high-latency connections. Plug-ins either execute directly within the application process or in close proximity when sandbox-isolated, meaning latency is rarely an issue. Plug-in code should be very focused operations that execute quickly and minimize blocking to avoid exceeding timeout thresholds and ensure a responsive system for synchronous scenarios.
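For the integration-side scenario where batch messages are appropriate, a hedged sketch (service is an assumed IOrganizationService, and accountsToCreate an assumed IEnumerable of Entity):

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

// Batch many creates into a single round trip to amortize network latency.
var batch = new ExecuteMultipleRequest
{
    Settings = new ExecuteMultipleSettings
    {
        ContinueOnError = true,   // keep processing after individual failures
        ReturnResponses = false   // skip per-request responses for throughput
    },
    Requests = new OrganizationRequestCollection()
};

foreach (var account in accountsToCreate)
    batch.Requests.Add(new CreateRequest { Target = account });

var response = (ExecuteMultipleResponse)service.Execute(batch);
```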






I will try to keep this post updated with further tips and tricks as soon as I know them.