Universal Containers (UC) has a very large and complex Salesforce org with hundreds of validation rules and triggers. The triggers are responsible for system updates and data manipulation as records are created or updated by users. A majority of the automation tools within UC's org were not designed to run during a data load. UC is importing 100,000 records into Salesforce across several objects over the weekend.
What should a data architect do to mitigate any unwanted results during the import?
Ensuring validation rules, triggers, and other automation tools are disabled is the best way to mitigate unwanted results during the import, as it prevents errors or side effects caused by logic that was never designed to fire during a bulk load. Defining duplication and matching rules may not be sufficient or relevant for preventing unwanted results. Importing the data in smaller batches over a 24-hour period may not be necessary or efficient. Bulkifying the triggers to handle the import load may not be possible or desirable if the triggers were not designed to run during a data load.
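As a practical illustration of how automation can be disabled without deleting it, one widely used pattern is a hierarchy custom setting that every trigger checks before running. The setting, object, and handler names below are hypothetical; this is a sketch of the pattern, not UC's actual implementation:

```apex
// Hypothetical hierarchy custom setting "Bypass_Automation__c" with a
// checkbox field "Is_Active__c". Before the weekend load, an admin enables
// the bypass for the integration user only; hierarchy settings resolve
// user > profile > org defaults automatically.
trigger AccountTrigger on Account (before insert, before update) {
    Bypass_Automation__c bypass = Bypass_Automation__c.getInstance();
    if (bypass != null && bypass.Is_Active__c) {
        return; // skip all trigger automation during the data load
    }
    AccountTriggerHandler.run(Trigger.new, Trigger.oldMap);
}
```

Validation rules can honor the same switch by adding `NOT($Setup.Bypass_Automation__c.Is_Active__c)` to their error condition, so the rules only evaluate when the bypass is off. After the import completes, unchecking the setting re-enables all automation at once.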