Universal Containers (UC) has a very large and complex Salesforce org with hundreds of validation rules and triggers. The triggers are responsible for system updates and data manipulation as records are created or updated by users. A majority of the automation tools within UC's org were not designed to run during a data load. UC is importing 100,000 records into Salesforce across several objects over the weekend.
What should a data architect do to mitigate any unwanted results during the import?
Ensuring validation rules, triggers, and other automation tools are disabled is the best way to mitigate unwanted results during the import, as it prevents the errors, conflicts, and unintended record changes that the existing logic could produce against data it was never designed to process. The other options fall short: ensuring duplication and matching rules are defined does not address automation that fires on insert; importing the data in smaller batches over a 24-hour period may not be necessary or efficient, since the automation still runs on every batch; and bulkifying the triggers to handle the imported records may not be possible or desirable if the triggers were not designed to run during a data load.
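In practice, rather than hand-deactivating each trigger before the load and reactivating it afterward, a common pattern is to gate trigger logic behind a hierarchy Custom Setting that acts as a bypass flag the data architect can flip for the integration user. The sketch below is a minimal illustration of that pattern; the setting name (Automation_Bypass__c), its checkbox field (Bypass_Triggers__c), and the Account trigger are hypothetical names for illustration, not part of the scenario.

```apex
// Minimal sketch of a bypass-flag pattern, assuming a hierarchy
// Custom Setting named Automation_Bypass__c with a checkbox field
// Bypass_Triggers__c. These names are illustrative assumptions.
trigger AccountTrigger on Account (before insert, before update) {
    // getInstance() returns the setting for the running user,
    // falling back to profile and then org-wide defaults.
    Automation_Bypass__c settings = Automation_Bypass__c.getInstance();
    if (settings != null && settings.Bypass_Triggers__c) {
        return; // Skip all trigger logic during the bulk load.
    }
    // ... normal trigger logic for user-driven record changes ...
}
```

The same flag can be referenced in validation rule formulas, for example by wrapping the existing condition in NOT($Setup.Automation_Bypass__c.Bypass_Triggers__c) && ..., so both triggers and validation rules can be switched off for the load user and re-enabled once the weekend import completes.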