
Salesforce Certified Platform Data Architect (Plat-Arch-201) Exam - Topic 2 Question 68 Discussion

Actual exam question from the Salesforce Certified Platform Data Architect (Plat-Arch-201) exam
Question #: 68
Topic #: 2

Universal Containers (UC) has a data model as shown in the image. The Project object has a private sharing model and Roll-Up Summary fields that calculate the number of resources assigned to the project, the total hours for the project, and the number of work items associated with the project. What should the architect consider, knowing that a large volume of Time Entry records will be loaded regularly from an external system into Salesforce?
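A key risk in the scenario above is row-lock contention: every Time Entry insert forces Salesforce to lock the parent Project record while its Roll-Up Summary fields recalculate, so a large unordered load can repeatedly collide on the same parents. A common mitigation, sketched here in plain Python, is to sort the incoming rows by their parent reference before slicing them into bulk batches, so each batch touches as few distinct parents as possible. The field names (e.g. `Project_External_Id__c`, `Hours__c`) are illustrative assumptions, not part of the question.

```python
from itertools import chain
from operator import itemgetter

def batch_by_parent(records, parent_key, batch_size=200):
    """Sort incoming rows by their parent reference, then slice them into
    fixed-size batches. Sorting keeps rows that share a parent Project
    adjacent, so a bulk batch re-locks each parent once rather than
    bouncing between parents and colliding with other batches."""
    ordered = sorted(records, key=itemgetter(parent_key))
    return [ordered[i:i + batch_size] for i in range(0, len(ordered), batch_size)]

# Three hypothetical Time Entry rows referencing two Projects
# via an external ID field.
rows = [
    {"Project_External_Id__c": "P-002", "Hours__c": 4},
    {"Project_External_Id__c": "P-001", "Hours__c": 2},
    {"Project_External_Id__c": "P-002", "Hours__c": 6},
]
batches = batch_by_parent(rows, "Project_External_Id__c", batch_size=2)
```

This pre-sorting step happens entirely in the loading tool before any call to Salesforce, so it costs nothing on the platform side.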


Contribute your Thoughts:

Lawrence
15 days ago
I like A as well. It simplifies the loading process with external IDs.
upvoted 0 times
Twana
20 days ago
I feel like D is the safest route. We need to manage performance with large data.
upvoted 0 times
Alida
25 days ago
Option B seems less effective. Workflows might not handle large data loads well.
upvoted 0 times
Brendan
1 month ago
I'm leaning towards option C. Triggers can handle complex calculations efficiently.
upvoted 0 times
Patria
1 month ago
I agree, but option A sounds good too. External IDs can help maintain relationships.
upvoted 0 times
Beckie
2 months ago
C) might be overkill for summary calculations.
upvoted 0 times
Margurite
2 months ago
D) sounds like a smart move to avoid sharing issues.
upvoted 0 times
Callie
2 months ago
Wait, can triggers really handle that much data?
upvoted 0 times
Christiane
2 months ago
I disagree, B) could simplify the process.
upvoted 0 times
Caren
3 months ago
This question is a real head-scratcher. I'm just going to go with the one that involves the least amount of work - option A.
upvoted 0 times
Alba
3 months ago
Haha, using workflow to calculate summary fields? That's like trying to put out a fire with gasoline. Option B is a no-go.
upvoted 0 times
Malcom
3 months ago
Triggers? Really? Option C is just asking for trouble. I'd steer clear of that one.
upvoted 0 times
Julianna
3 months ago
I'd go with option D. Deferring sharing calculations is a smart move when dealing with large data loads.
upvoted 0 times
Edison
3 months ago
Option A seems like the way to go. Using external IDs to link data is a solid approach.
upvoted 0 times
Tula
3 months ago
I feel like using external IDs to link records is a good approach, but I wonder if it would complicate the data loading process. I need to think more about that.
upvoted 0 times
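On the concern above that external IDs might complicate the load: they usually do the opposite. Bulk loads can reference the parent record through a relationship-plus-external-ID column header (for example `Project__r.Project_External_Id__c`), so the loader never has to query Salesforce for parent record IDs first. A minimal sketch of building such a CSV payload follows; all object and field API names here are hypothetical examples, not from the question.

```python
import csv
import io

def time_entry_csv(rows):
    """Render Time Entry rows as a Bulk-API-style CSV. The column header
    'Project__r.Project_External_Id__c' tells Salesforce to resolve each
    row's parent Project by its external ID during the load, so no
    pre-load ID lookup or mapping table is needed.
    (All API names here are illustrative.)"""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["Project__r.Project_External_Id__c", "Hours__c"]
    )
    writer.writeheader()
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()

csv_text = time_entry_csv([
    {"Project__r.Project_External_Id__c": "P-001", "Hours__c": 2},
])
```

The external system only needs to stamp its own keys on the rows; Salesforce performs the ID resolution server-side.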
Rebecka
4 months ago
I think option D sounds familiar from our practice questions. Deferring sharing calculations could help manage the load better, but I'm not entirely sure how it works in this context.
upvoted 0 times
Yaeko
4 months ago
Okay, let's think this through. The question mentions that the Project object has a private sharing model, so I'm guessing that's an important factor to consider. Using triggers or workflow to calculate the summary values could be a good way to optimize performance, but I'm not sure which one would be better. I'll have to think about this one a bit more.
upvoted 0 times
Margot
4 months ago
I think option D is the best choice. Deferring sharing calculations can save processing time.
upvoted 0 times
Erasmo
4 months ago
I remember we discussed the implications of using Roll-Up summary fields, especially with large data volumes. It might be risky if they can't handle the load efficiently.
upvoted 0 times
Ronna
4 months ago
A) is the best option for linking records efficiently.
upvoted 0 times
Alesia
5 months ago
I’m a bit confused about whether using triggers would be better than Roll-Ups. We practiced a similar question, but I can't recall the pros and cons clearly.
upvoted 0 times
Rodrigo
5 months ago
I feel pretty confident about this one. Since there's a large amount of time entry records to be loaded, I think the best approach would be to load all the data first, and then defer the sharing calculations. That way, you can get the data into Salesforce quickly and then worry about the sharing model later. Option D seems like the way to go.
upvoted 0 times
Daren
5 months ago
Hmm, this is a tricky one. I think the key is to consider the performance implications of the different options. Loading all the data using external IDs seems like it could be efficient, but I'm not sure about the sharing model. Maybe option D would be the best approach?
upvoted 0 times
Sueann
5 months ago
I'm not sure about this one. The question seems to be asking about the best way to handle a large amount of time entry records being loaded into Salesforce, but I'm not sure which approach would be the most efficient.
upvoted 0 times
Tora
5 days ago
I agree, but what about using triggers? Option C could be efficient too.
upvoted 0 times
Mollie
10 days ago
I think option D makes sense. Deferring calculations could save time.
upvoted 0 times
