Dynamics CRM Data Migration Using KingswaySoft – Tips

I will divide the tips into CRM/SQL and KingswaySoft parts:

CRM & SQL Optimizations:

  • Disable CRM plugins, auditing and/or workflows during your initial load if you can, as they all have a certain impact on your data integration or migration performance.
  • Ensure that no real-time workflows or synchronous plugins are registered when migrating data into CRM, as they will badly affect the migration speed; try to convert them to asynchronous workflows and plugins.
  • CRM plugins or workflows usually have a certain degree of performance impact on your CRM data integration. Poorly designed CRM plugins or workflows could severely affect your integration performance. Try to compare the performance before and after enabling them, in some cases you might have to revisit their design to make sure that best practices are applied in your custom code.
  • Ensure that the CRM maintenance jobs are not running at the same time as your migration packages, especially the re-index job. You can use the CRMJobEditor tool to modify the schedule of these system jobs; in general it is advised to have them run outside core business hours.
  • Make sure that the “Reindex All” CRM maintenance job is configured and running properly, or otherwise create database maintenance jobs that REBUILD or REORGANIZE the indexes of your CRM database on a regular basis.
  • Monitor your database server to see if there are any excessive db locks.
  • Schedule the jobs to run from within SQL Server Agent. Set the SSIS package ProtectionLevel property to EncryptSensitiveWithPassword if the connection passwords are stored locally and not passed as parameters to the package, and consider creating a package configuration file. Make sure the packages are executed in 32-bit runtime mode so that BDD (Balanced Data Distributor) can run; you will need to do the same in Visual Studio for debugging purposes when setting TargetServerVersion to SQL Server 2014.
  • Two factors that impact the speed of your data migration are network latency and concurrency. Latency is the time it takes for a packet of information to travel through the network from source to destination. Concurrency refers to processes executing simultaneously, working together to achieve the end result.

KingswaySoft Optimizations:

  • To use the CRM Bulk Data Load API, you just need to enter a batch size greater than 1 in the CRM destination component.
  • Avoid using the Duplicate Detection option if you can.
  • Make sure that you are always passing in a valid lookup reference for all lookup fields, and avoid using the “Remove Unresolvable References” option. That option is designed for special scenarios, and it involves checking each lookup field value, which can be very expensive.
  • The Upsert action (except when the Alternate Key matching option is used) involves an extra service call that queries the target system to check whether the incoming record exists, which has a cost in terms of overall integration performance. If you have a way to do a straight Update or Create instead, it will typically perform better.
  • For CRM On-premise, you would typically use 5 BDD branches in each data flow with each CRM destination component using a batch size of 200 or 250. You can have multiple data flow tasks in the same SSIS package that write to CRM server simultaneously.
  • If you have a multi-node cluster for your on-premise deployment, you can use the CRM connection manager’s CrmServerUrl property in its ConnectionString to target a particular node within the cluster. This way you can have multiple connection managers in the same package or project, each targeting a different node, and write through multiple destination components of the same configuration but with different connection managers, so that you are effectively writing to multiple cluster nodes in parallel, which provides additional performance improvement on top of BDD.

References:

https://mbs.microsoft.com/customersource/northamerica/CRM/learning/documentation/user-guides/DataMigrationCRMOnlineOnboardingSuccess

http://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-365/faq


Best Practices when Writing Dynamics CRM Plugins

  • For improved performance, Microsoft Dynamics 365 caches plug-in instances. The plug-in’s Execute method should be written to be stateless because the constructor is not called for every invocation of the plug-in. Also, multiple system threads could execute the plug-in at the same time. All per-invocation state information is stored in the context, so you should not use global variables or attempt to store any data in member variables for use during the next plug-in invocation, unless that data was obtained from the configuration parameter provided to the constructor. Changes to a plug-in’s registration will cause the plug-in to be re-initialized.
  • It’s a best practice to check the target entity name and message name at the beginning of the plugin’s Execute method to avoid running the plugin unintentionally.
  • When you want to update fields on a record, it’s good practice to create a new entity (or early-bound type) for that record and add only the fields you want to update. By updating only the fields you are changing, you avoid needlessly triggering other plugins.
  • When you retrieve an entity using the SDK and later want to update it, instantiate a new object for the update (rather than just reusing a reference to the object returned from the Retrieve call) for better performance.
  • Do not send the retrieved Target entity back in an Update, because doing so will update every field it contains.
  • The common method to avoid a recursive plugin is to check whether the plugin context’s Depth is greater than 1 and exit if it is. This stops the plugin from running when it was triggered from any other plugin; it will only run when triggered directly from the CRM form. This can resolve the problem of plugins firing more than once, but it also stops your plugin being triggered from other plugins, which might not be the functionality you require (see the sketch after this list).

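To tie several of these recommendations together, here is a minimal C# sketch of a plugin skeleton that is stateless, checks the message name, target entity and depth, and performs the update through a new, sparsely populated entity. The entity name “account” and the “description” attribute are placeholders for illustration only:

using System;
using Microsoft.Xrm.Sdk;

public class AccountPostUpdatePlugin : IPlugin
{
    // No member variables: the platform caches and reuses plugin instances
    // across threads, so Execute must be stateless.
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        // Guard against recursion: exit if this execution was triggered by
        // another plugin or workflow rather than directly by the user.
        if (context.Depth > 1)
            return;

        // Run only for the message and entity this plugin was written for.
        if (context.MessageName != "Update" || !context.InputParameters.Contains("Target"))
            return;

        var target = context.InputParameters["Target"] as Entity;
        if (target == null || target.LogicalName != "account")
            return;

        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        IOrganizationService service = factory.CreateOrganizationService(context.UserId);

        // Update through a new, sparsely populated entity instead of sending
        // back the Target or a fully retrieved record: only the attributes
        // set here are written, so other plugins and workflows are not
        // triggered needlessly.
        var update = new Entity(target.LogicalName) { Id = target.Id };
        update["description"] = "Updated by plugin on " + DateTime.UtcNow.ToString("u");
        service.Update(update);
    }
}

Register the skeleton for the message and entity it checks for; the same structure adapts to other messages and entities.
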
References:

https://msdn.microsoft.com/en-us/library/gg328263.aspx

https://crmbusiness.wordpress.com/2015/07/01/stopping-infinite-plugins-with-parameters-depth-and-parentcontext/

I will try to keep this post updated with further tips and tricks as soon as I know them.

Joining a New CRM Server to an Upgraded CRM Deployment

Sometimes we need to scale out a CRM 2015 deployment that has the latest update installed. When you try to join a new server to the deployment, you are faced with the below error on the installer checks page:

The Product key is not compatible with installed version of Microsoft Dynamics CRM.

I believe this is a bug in the installer. To work around it, add the IgnoreChecks registry value on the computer that is running the Microsoft Dynamics CRM installation, so that the installation can proceed even when an error is shown in the Environment Diagnostics Wizard (EDW):

Click Start, click Run, type regedit, and then click OK.

In the registry, locate the following subkey: 

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSCRM

Right-click MSCRM, point to New, click DWORD Value, and then type IgnoreChecks.

Double-click IgnoreChecks, and then type 1 in the Value data field.

Note: After adding this registry value you will still get the same error message in the installation wizard, but this time the Next > button will be enabled so you can complete the installation; just proceed, and install the latest CRM updates at a later step.
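
If you prefer to script the same change rather than edit the registry by hand (for example, across several servers), here is a minimal C# sketch using the Microsoft.Win32 registry API; the class name is just an illustration:

using Microsoft.Win32;

// Run elevated on the CRM server, and build as x64 (or AnyCPU on 64-bit
// Windows with “Prefer 32-bit” off) so the value lands in the native
// registry view rather than Wow6432Node.
class SetIgnoreChecks
{
    static void Main()
    {
        using (RegistryKey key = Registry.LocalMachine.CreateSubKey(@"SOFTWARE\Microsoft\MSCRM"))
        {
            // DWORD value IgnoreChecks = 1, as described in the steps above.
            key.SetValue("IgnoreChecks", 1, RegistryValueKind.DWord);
        }
    }
}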

CRM Indexing Management & Rebuild Index Scheduled Jobs

Reviewing index fragmentation on the CRM database is an important aspect of monitoring your CRM organizations, as high levels of fragmentation will badly impact performance. Fragmented indexes can be addressed by reorganizing or rebuilding them.

CRM provides two scheduled jobs for index management:

  1. Indexing Management: responsible for creating indexes for any quick find columns
  2. Rebuild Index: responsible for keeping index fragmentation under control by rebuilding the indexes

Although it is supported by Microsoft to manually create or rebuild indexes, it is a best practice to let the scheduled jobs mentioned above handle these two functions. If you need to create a new index for a searchable column, just add the column to the quick find view, and the next time the Indexing Management job runs it will automatically create the index for you.

Limitations to using Business Rules – CRM 2015

Here are the main limitations of using business rules, taken from the official Microsoft materials:

  • Business rules run only when the form loads and when field values change. They do not run when a record is saved, unless the scope for the rule is set at an entity level.
  • Business rules work only with fields. If you need to interact with other visible elements, such as tabs and sections, within the form, you need to use form scripts.
  • When you set a field value by using a business rule, any OnChange event handlers for that field will not run. This is to reduce the potential for a circular reference, which could lead to an infinite loop.
  • If a business rule references a field that is not present on a form, the rule will simply not run. There will be no error message.
  • Whole Number fields that use the formats for TimeZone, Duration, or Language will not appear in the rule editor for the conditions or actions, so they cannot be used with business rules.
  • For Microsoft Dynamics CRM for tablets, the definitions of business rules are downloaded and cached when CRM for tablets opens. Changes made to business rules are not applied until CRM for tablets is closed and re-opened.
  • When you set the value of a lookup field, the text of the primary field value that is set in the form will always match the text that is visible in the rule definition. If the text representing the primary field value of the record you are setting in the lookup changes, the value set by your rule will continue to use the text portion of the primary field value defined by the rule. To fix this, update the rule definition to use the current primary name field value.

I want to add that you can’t debug business rules the way you can when writing JavaScript code.

Moving Site Collection to Different Web Application – SharePoint

We created a site collection for storing CRM documents on the intranet web application. We then wanted to move that site collection to a new web application, both to avoid having a single point of failure for the two site collections and to allow the CRM administrators to control the maximum attachment size for files stored on SharePoint.

We achieved this by running the below two PowerShell cmdlets:

Backup-SPSite -Identity "http://intranet.domain.com/sites/CRMDocs" -Path "C:\CRMDocumentsSiteCollection.bak"

Restore-SPSite -Identity "http://intranet.domain.com:8888/sites/CRMDocs" -Path "C:\CRMDocumentsSiteCollection.bak"

The site collection will be created if it does not already exist; if we want to overwrite an existing site collection at the specified URL, we need to add the -Force parameter.

Then we re-configured the document management settings in CRM to point to the new site collection URL. The good thing about CRM is that it stores the document locations as relative URLs, so such a change in the base URL only requires updating the SharePoint site URL and everything keeps working.

Please note that if you face a SQL timeout while re-configuring the document management settings in CRM due to a large number of documents, you will need to increase the CRM SQL timeouts on the front-end servers.

CRM 2015 SLA New Features – Differences

CRM 2015 SLAs now come in two main types:

  • Standard SLA
  • Enhanced SLA

Below is a comparison between the two types:

  • First Response By / Resolve By values: a Standard SLA populates these values on the case entity itself, while an Enhanced SLA uses a new entity (SLA KPI Instance) to store this data.
  • Failure/warning times: with a Standard SLA the failure time is stamped on case entity attributes; with an Enhanced SLA the failure and warning times are stamped on the related SLA KPI Instance record, displayed via a sub-grid on the case form.
  • Timer control: for a Standard SLA the timer control is based on case entity fields and can be added directly to the case form; for an Enhanced SLA it is based on the related SLA KPI Instance fields and can be added using a Quick View form.
  • Pause/Resume: not available for a Standard SLA. For an Enhanced SLA, the SLA time calculation is automatically paused when a case is put on hold, and the amount of time on hold is tracked. There is a system setting that allows pausing case SLAs automatically for certain case status reasons, and the ability to pause can be enabled or disabled for each SLA.
  • Success actions: not available for a Standard SLA; an Enhanced SLA can trigger actions when the SLA is met successfully.

Important Considerations:

  • The SLA type cannot be changed once the SLA is created.
  • Case views cannot be sorted by Enhanced SLA fields, as those fields are now on another entity.
  • Queue Item views cannot display Enhanced SLA fields.