
Salesforce Certified Platform Data Architect (Plat-Arch-201) Exam - Topic 5 Question 38 Discussion

Actual exam question for the Salesforce Certified Platform Data Architect (Plat-Arch-201) exam
Question #: 38
Topic #: 5

Get Cloudy Consulting monitors 15,000 servers, and these servers automatically record their status every 10 minutes. Because of company policy, these status reports must be maintained for 5 years. Managers at Get Cloudy Consulting need access to up to one week's worth of these status reports with all of their details.

An Architect is recommending which data should be integrated into Salesforce and how long it should be stored there.

Which two limits should the Architect be aware of? (Choose two.)

A) Data storage limits
B) Workflow rule limits
C) API Request limits
D) Webservice callout limits

Suggested Answer: A, C

Data storage limits and API request limits are the two limits the Architect must account for. 15,000 servers reporting every 10 minutes generate 15,000 × 144 = 2,160,000 records per day, so the one week of detail managers need amounts to roughly 15 million records held in Salesforce. Salesforce counts most records as about 2 KB against data storage, so that single week consumes on the order of 30 GB, far beyond a typical org's storage allocation. Importing 2.16 million records per day also consumes API requests, so the integration must be designed (for example, around Bulk API batching) to stay within the org's daily API request limit. Workflow rule limits and webservice callout limits are not the binding constraints for storing and loading this data.
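A quick back-of-envelope calculation makes the scale concrete. The sketch below assumes Salesforce's standard 2 KB-per-record data storage sizing and the Bulk API's 10,000-records-per-batch ceiling; the exact figures for a given org and object may differ:

```python
# Back-of-envelope estimate of why data storage and API request limits
# are the binding constraints in this scenario.

SERVERS = 15_000
REPORTS_PER_DAY = 24 * 60 // 10        # one status report every 10 minutes = 144/day
DAYS_IN_SALESFORCE = 7                 # managers need one week of detail in Salesforce

records_per_day = SERVERS * REPORTS_PER_DAY            # 2,160,000
records_in_org = records_per_day * DAYS_IN_SALESFORCE  # 15,120,000

# Assumption: Salesforce counts most records as 2 KB against data storage
# (standard sizing for a typical custom object).
STORAGE_PER_RECORD_KB = 2
storage_gb = records_in_org * STORAGE_PER_RECORD_KB / 1024 / 1024  # ~28.8 GB

# Assumption: Bulk API accepts up to 10,000 records per batch, so ingesting
# one day's reports consumes at least this many calls against the API limit.
BULK_BATCH_SIZE = 10_000
batches_per_day = -(-records_per_day // BULK_BATCH_SIZE)  # ceiling division = 216

print(f"Records per day:       {records_per_day:,}")
print(f"Records held (1 week): {records_in_org:,}")
print(f"Approx. data storage:  {storage_gb:.1f} GB")
print(f"Bulk API batches/day:  {batches_per_day}")
```

At roughly 29 GB for a single week of full-detail reports, a sensible design keeps the 5-year history off-platform and integrates only the week of detail managers actually need, which is exactly why these two limits drive the recommendation.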


Contribute your Thoughts:

Florinda
3 months ago
Are we sure 5 years of data is really necessary? Seems excessive!
upvoted 0 times
...
Laurel
3 months ago
I disagree, workflow rule limits aren't as relevant here.
upvoted 0 times
...
Genevive
3 months ago
Wait, 15,000 servers? That's a lot to manage!
upvoted 0 times
...
Skye
4 months ago
I think API request limits are also crucial to consider.
upvoted 0 times
...
Sheldon
4 months ago
Definitely data storage limits! That's a big one.
upvoted 0 times
...
Juliann
4 months ago
I recall a practice question about web service callout limits, but I'm not sure if that's applicable here since it's more about data storage.
upvoted 0 times
...
Annita
4 months ago
I feel like workflow rule limits might not be as important for this scenario, but I could be wrong.
upvoted 0 times
...
Kattie
4 months ago
I'm not entirely sure, but I think API request limits could be relevant here too, given the frequency of status updates.
upvoted 0 times
...
Marge
5 months ago
I remember studying data storage limits, especially since we discussed how they can impact large datasets like this one.
upvoted 0 times
...
Berry
5 months ago
I feel pretty confident about this one. The data storage limit and API request limit are the two key things the Architect needs to be aware of. The data storage limit will determine how much historical data can be kept in Salesforce, and the API request limit will impact how quickly that data can be pulled in.
upvoted 0 times
...
Jessenia
5 months ago
Okay, let's think this through step-by-step. We have 15,000 servers reporting status every 10 minutes, and we need to store 1 week's worth of data in Salesforce. That's a lot of data, so the data storage limit is definitely something I need to consider. And since we'll be pulling all that data into Salesforce, the API request limit is also important.
upvoted 0 times
...
Nakisha
5 months ago
Hmm, I'm a bit unsure about this one. I know Salesforce has limits, but I can't remember exactly which ones are most relevant here. I'll need to review the Salesforce documentation to make sure I don't miss anything.
upvoted 0 times
...
Fatima
5 months ago
This seems like a pretty straightforward question. The key things I need to focus on are the data storage limits and API request limits in Salesforce.
upvoted 0 times
...
Eric
5 months ago
Hmm, I'm not sure if I should just follow the solution or try to do it from memory first. Let me think this through.
upvoted 0 times
...
Tonja
10 months ago
The webservice callout limits? That's the real wild card here. Imagine trying to fetch all those status reports at once. The Salesforce server might just throw in the towel and go on vacation!
upvoted 0 times
Rashad
8 months ago
Absolutely, we don't want to risk losing any important status reports due to exceeding limits.
upvoted 0 times
...
Gwen
9 months ago
The webservice callout limits could really impact how efficiently those status reports are fetched.
upvoted 0 times
...
Elliot
9 months ago
I agree, we need to make sure we don't exceed those limits when integrating the data into Salesforce.
upvoted 0 times
...
Cheryll
9 months ago
Yeah, the webservice callout limits could definitely be a problem with that many servers.
upvoted 0 times
...
Dong
9 months ago
Agreed, that's crucial for storing all those status reports for 5 years.
upvoted 0 times
...
Derick
10 months ago
I think the Architect should definitely be aware of the data storage limits.
upvoted 0 times
...
...
Cassi
10 months ago
You know, the Architect could just ask the servers to report their status every hour instead of every 10 minutes. That way, they'd have less data to store and process. But hey, what do I know, I'm just a candidate!
upvoted 0 times
...
Alberta
10 months ago
Haha, the Architect better not forget about the workflow rule limits. Imagine trying to set up workflows for 15,000 servers, that's a recipe for chaos!
upvoted 0 times
Ty
10 months ago
I agree, it's important for the Architect to be aware of both data storage limits and workflow rule limits when integrating data into Salesforce.
upvoted 0 times
...
Ty
10 months ago
True, workflow rule limits are crucial to consider when setting up workflows for a large number of servers.
upvoted 0 times
...
...
Latricia
11 months ago
I agree, the data storage limits are crucial. But don't forget about the API request limits - gotta make sure those status reports can be accessed efficiently.
upvoted 0 times
...
Jimmie
11 months ago
I also think API Request limits are crucial to consider for integrating data into Salesforce.
upvoted 0 times
...
Jill
11 months ago
The Architect should definitely be aware of the data storage limits in Salesforce. With 15,000 servers reporting every 10 minutes, that's a lot of data to handle!
upvoted 0 times
Dyan
10 months ago
Definitely, API Request limits are important for ensuring smooth data retrieval.
upvoted 0 times
...
Hector
10 months ago
Yes, the data storage limits are crucial for handling all that server data.
upvoted 0 times
...
Colette
10 months ago
C) API Request limits
upvoted 0 times
...
Narcisa
10 months ago
A) Data storage limits
upvoted 0 times
...
...
Josphine
11 months ago
Yes, I agree. It's important to know how much data can be stored in Salesforce.
upvoted 0 times
...
Micah
11 months ago
I think the Architect should be aware of data storage limits.
upvoted 0 times
...
