
Salesforce Certified Platform Data Architect (Plat-Arch-201) Exam - Topic 5 Question 35 Discussion

Actual exam question from the Salesforce Certified Platform Data Architect (Plat-Arch-201) exam
Question #: 35
Topic #: 5

UC has migrated its back-office data into an on-premise database with REST API access. UC recently implemented Sales Cloud for its sales organization, but users are complaining about a lack of order data inside Salesforce.

UC is concerned about Salesforce storage limits but would still like Sales Cloud to have access to the data.

Which design pattern should a data architect select to satisfy the requirement?

Suggested Answer: B

Using Salesforce Connect to virtualize the order data in Salesforce is the best fit for this requirement. Salesforce Connect maps data held in the on-premise system to external objects, so Sales Cloud users can view, search, and relate the order data in real time without copying it into Salesforce. Because records are fetched on demand through the existing REST API (via an OData or custom Apex adapter) rather than stored, the data does not count against Salesforce storage limits. By contrast, migrating the data into Salesforce (option A) would consume storage, a bidirectional integration (option C) adds sync complexity while still storing copies, and iframing the on-premise application (option D) gives a disconnected user experience with no native reporting or relationships.
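As a rough sketch of what the virtualized pattern looks like from the consuming side: external objects created by Salesforce Connect carry the `__x` suffix and are queried with ordinary SOQL, while the platform fetches the rows from the external system at query time. The object and field names below (`Order__x`, `AccountId__c`, etc.) are illustrative assumptions, not taken from the question.

```python
# Minimal sketch, assuming a hypothetical Order__x external object
# exposed through Salesforce Connect. We only build the SOQL string
# here; in a real org it would be submitted via the REST query API,
# and Salesforce Connect would pull the rows from the on-premise
# system on demand (no org data storage consumed).

def build_order_query(account_id: str, limit: int = 50) -> str:
    """Build a SOQL query against a hypothetical Order__x external object."""
    return (
        "SELECT ExternalId, OrderNumber__c, TotalAmount__c "
        "FROM Order__x "
        f"WHERE AccountId__c = '{account_id}' "
        f"LIMIT {limit}"
    )

print(build_order_query("001xx000003DGb2"))
```

From the sales user's point of view the external object behaves like any other object in list views and related lists, which is why this pattern satisfies both the access and the storage-limit requirements.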


Contribute your Thoughts:

Shanice
3 months ago
Option C could be a solid long-term solution too!
upvoted 0 times
...
Annabelle
3 months ago
I’m not sure about option D, seems like a workaround.
upvoted 0 times
...
Leonie
3 months ago
Wait, can SF Connect really handle all that data?
upvoted 0 times
...
Carman
4 months ago
Totally agree, virtualizing data sounds smart!
upvoted 0 times
...
Leontine
4 months ago
I think option B is the best choice to avoid storage issues.
upvoted 0 times
...
Amie
4 months ago
I vaguely recall something about iframes, but I'm not convinced that option D would provide a good user experience. It feels like a workaround.
upvoted 0 times
...
Deonna
4 months ago
I practiced a similar question about data access in Salesforce, and I think using SF Connect could really help with the storage issue.
upvoted 0 times
...
Jonelle
4 months ago
I'm not entirely sure, but I feel like developing a bidirectional integration could be overkill for this situation. Option C seems complicated.
upvoted 0 times
...
Maryln
5 months ago
I remember we discussed the pros and cons of migrating data versus virtualizing it. I think option B might be the best choice to avoid hitting storage limits.
upvoted 0 times
...
Lennie
5 months ago
The virtualization approach with Salesforce Connect sounds promising, but I'll need to make sure it meets all the requirements.
upvoted 0 times
...
Jules
5 months ago
I'm feeling pretty confident about this one. I'll weigh the pros and cons of each approach and select the best option.
upvoted 0 times
...
Edmond
5 months ago
Okay, I think I've got a strategy. I'll focus on the key requirements - access to the data in Salesforce and avoiding storage limits.
upvoted 0 times
...
Kanisha
5 months ago
Hmm, I'm a bit confused about the different design patterns here. I'll need to review the details of each option.
upvoted 0 times
...
Lavelle
5 months ago
This seems like a tricky one. I'll need to think through the storage and integration options carefully.
upvoted 0 times
...
Sylvie
10 months ago
Option F: Develop a secret handshake protocol to transfer the data between the systems. It'll be like a spy thriller, but with more spreadsheets.
upvoted 0 times
...
Maynard
10 months ago
Option E: Teach the sales team to use carrier pigeons to fetch the data from the on-premise system. No storage limits, and it's a great team-building exercise!
upvoted 0 times
Dulce
8 months ago
C) Develop a bidirectional integration between the on-premise system and Salesforce.
upvoted 0 times
...
Vanna
9 months ago
B) Use SF Connect to virtualize the data in SF and avoid storage limits.
upvoted 0 times
...
...
Kenneth
10 months ago
Option A might work, but I'm not sure I'd want to migrate all that data into Salesforce just to take advantage of the native functionality. Seems like a lot of overhead for the storage limits.
upvoted 0 times
...
Eric
10 months ago
Hmm, Option D seems interesting. Iframing the on-premise system in Salesforce could be a creative way to give the sales team the data they need without too much integration complexity.
upvoted 0 times
Alease
8 months ago
It's a creative approach to address the data access issue.
upvoted 0 times
...
Eileen
9 months ago
I agree, it could simplify things for the sales team.
upvoted 0 times
...
Daniela
10 months ago
Option D seems like a good solution.
upvoted 0 times
...
...
Yolande
10 months ago
I would go with Option C. A bidirectional integration between the systems ensures that the data is always in sync, and it gives the sales team the access they need without compromising the on-premise system.
upvoted 0 times
Lisha
8 months ago
True, a bidirectional integration would provide the best of both worlds for UC.
upvoted 0 times
...
Blair
9 months ago
I see your point, but with Option C, we can avoid storage limits and keep both systems updated.
upvoted 0 times
...
Doretha
10 months ago
But wouldn't Option A be easier to implement? Just migrate the data into SF.
upvoted 0 times
...
Sheridan
10 months ago
I think Option C is the best choice. It keeps the data in sync between systems.
upvoted 0 times
...
...
Marica
10 months ago
Option B looks like the way to go. Virtualizing the data in Salesforce seems like the most efficient solution to avoid storage limits. Plus, it keeps the data secure in the on-premise system.
upvoted 0 times
...
Eve
10 months ago
That's a good point, but I still think option B is more efficient in this scenario.
upvoted 0 times
...
Jaime
10 months ago
I disagree, I believe option C is the way to go as it ensures a seamless integration between the systems.
upvoted 0 times
...
Eve
11 months ago
I think option B is the best choice because it allows us to access the data without worrying about storage limits.
upvoted 0 times
...
Shizue
11 months ago
I'm not sure about option B). I think option C) might be better as it allows for bidirectional integration between systems.
upvoted 0 times
...
Ethan
11 months ago
I agree with Deeanna. Using SF Connect to virtualize the data seems like the most efficient solution.
upvoted 0 times
...
Deeanna
11 months ago
I think option B) sounds like a good idea. It would allow Sales cloud to access the data without worrying about storage limits.
upvoted 0 times
...
