I will divide the tips into CRM/SQL and KingswaySoft parts:
CRM & SQL Optimizations:
- Disable CRM plugins, audit and/or workflows in your initial load if you can, as they all have a certain impact on your data integration or migration performance.
- Ensure that there are no real-time workflows or synchronous plugins registered when performing data migrations into CRM, as they will badly degrade the migration speed; try to convert them to asynchronous workflows and plugins.
- CRM plugins or workflows usually have a certain degree of performance impact on your CRM data integration, and poorly designed ones can severely degrade it. Compare the performance before and after enabling them; in some cases you may have to revisit their design to make sure that best practices are applied in your custom code.
- Ensure that the CRM maintenance jobs are not running at the same time as your migration packages, especially the re-index job. You can use the CRMJobEditor tool to modify the schedule of these system jobs; in general, it is advised to run them outside core business hours.
- Make sure that the “Reindex All” CRM maintenance job is configured and running properly, or otherwise create database maintenance jobs to REBUILD or REORGANIZE the indexes of your CRM database on a regular basis.
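If you go the route of your own database maintenance jobs, the logic can be sketched in T-SQL along these lines. A minimal sketch, assuming an on-premise CRM organization database; the index name is illustrative, and the 5%/30% fragmentation thresholds are common rules of thumb rather than values from this article:

```sql
-- Find fragmented indexes in the current CRM organization database.
SELECT OBJECT_NAME(s.object_id) AS table_name,
       i.name                   AS index_name,
       s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.avg_fragmentation_in_percent > 5;

-- Then, per index: REORGANIZE for moderate fragmentation (roughly 5-30%),
-- REBUILD for heavy fragmentation (above roughly 30%).
ALTER INDEX [IX_Example] ON [dbo].[AccountBase] REORGANIZE;
ALTER INDEX [IX_Example] ON [dbo].[AccountBase] REBUILD;
```

Schedule this as a SQL Server Agent job, again outside core business hours so it does not compete with your migration packages.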
- Monitor your database server to see if there are any excessive db locks.
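A quick way to spot excessive locking is to query SQL Server's locking DMVs while the migration is running; a minimal sketch (the DMV names are standard SQL Server, the grouping is illustrative):

```sql
-- Sessions that are currently blocked, and the statement they are waiting on.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;

-- A raw inventory of current locks, grouped by resource type and lock mode.
SELECT resource_type, request_mode, COUNT(*) AS lock_count
FROM sys.dm_tran_locks
GROUP BY resource_type, request_mode
ORDER BY lock_count DESC;
```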
- Schedule the jobs to run from within SQL Server Agent as described here. Set the SSIS package ProtectionLevel property to EncryptSensitiveWithPassword if the connection passwords are stored locally and not passed to the package as parameters, as described here. It is advised to create a package configuration file as described here. Make sure that the packages are executed in 32-bit runtime mode to allow BDD to run; you will need to do the same in Visual Studio for debugging purposes when setting the TargetServerVersion to SQL Server 2014, as described here.
- Two components that impact the speed of your data migration are network latency and concurrency. Latency is the time that it takes for an information packet to travel through a network from its source to destination. Concurrency refers to processes that are executing simultaneously, working together to achieve the end result.
- To use the CRM Bulk Data Load API, you just need to enter a batch size greater than 1 in the CRM destination component.
- Avoid using the Duplicate Detection option if you can.
- Make sure that you are always passing in a valid lookup reference for all lookup fields, and avoid using the “Remove Unresolvable References” option. The option is designed for special scenarios, and it involves checking each lookup field value, which can be very expensive at times.
- The Upsert action (except when the Alternate Key matching option is used) involves an extra service call that queries the target system to check for the existence of each incoming record, which carries a cost in terms of your overall integration performance. If you have a way to do a straight Update or Create instead, it will typically perform better.
- For CRM On-premise, you would typically use 5 BDD branches in each data flow with each CRM destination component using a batch size of 200 or 250. You can have multiple data flow tasks in the same SSIS package that write to CRM server simultaneously.
- If you have a multi-node cluster for your on-premise deployment, you can use the CRM connection manager’s CrmServerUrl property in its ConnectionString to specifically target a particular node within the cluster. This way, you can have multiple connection managers in the same package or project, each targeting a different node of the cluster, and write through multiple destination components of the same configuration using different connection managers. You are then technically writing to multiple cluster nodes in parallel, which provides some additional performance improvement on top of BDD.
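As a sketch, two connection managers overriding the server URL might look roughly like this; the node host names are hypothetical and the rest of each connection string is elided:

```
Connection manager A: ...;CrmServerUrl=https://crm-node1.contoso.local;...
Connection manager B: ...;CrmServerUrl=https://crm-node2.contoso.local;...
```

Each CRM destination component then keeps the same configuration but is bound to a different connection manager, so the writes land on different cluster nodes.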