
Google Professional Cloud Architect (PR000213) Exam - Topic 3 Question 71 Discussion

Actual exam question for Google's Professional Cloud Architect (PR000213) exam
Question #: 71
Topic #: 3
[All Professional Cloud Architect (PR000213) Questions]

Your company wants to migrate their 10-TB on-premises database export into Cloud Storage. You want to minimize the time it takes to complete this activity, the overall cost, and the database load. The bandwidth between the on-premises environment and Google Cloud is 1 Gbps. You want to follow Google-recommended practices. What should you do?

Suggested Answer: A

The Transfer Appliance is a Google-provided hardware device used to move large amounts of data from on-premises environments into Cloud Storage. It suits scenarios where the bandwidth between the on-premises environment and Google Cloud is low or insufficient and the data size is large. By avoiding network bottlenecks and bandwidth consumption entirely, the appliance minimizes migration time, overall cost, and load on the source database. It also encrypts the data at rest and in transit, ensuring security and privacy. The other options are not optimal for this scenario: they either depend on a high-bandwidth network connection (B, C, D) or add cost and complexity (B, C). Reference:

https://cloud.google.com/data-transfer-appliance/docs/overview

https://cloud.google.com/blog/products/storage-data-transfer/introducing-storage-transfer-service-for-on-premises-data
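As a rough sanity check on the bandwidth argument, the ideal transfer time for 10 TB over a 1 Gbps link can be estimated. This is a minimal back-of-the-envelope sketch: it assumes decimal units (1 TB = 10^12 bytes), a fully dedicated link, and ignores protocol overhead and database export load.

```python
def transfer_time_hours(data_tb: float, bandwidth_gbps: float) -> float:
    """Ideal transfer time, ignoring protocol overhead and contention."""
    bits = data_tb * 1e12 * 8              # TB -> bits (decimal units)
    seconds = bits / (bandwidth_gbps * 1e9)  # Gbps -> bits per second
    return seconds / 3600

print(round(transfer_time_hours(10, 1), 1))  # -> 22.2 (hours at full line rate)
```

In practice, shared links, TCP overhead, and load on the source system stretch this well past the ideal figure, which is the trade-off the question is probing: a network upload is feasible in roughly a day under ideal conditions, while the appliance avoids the link entirely.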


Contribute your Thoughts:

Justine
5 months ago
1 Gbps bandwidth means A will save a lot of time!
upvoted 0 times
...
Sharika
5 months ago
D seems too complicated for a simple migration.
upvoted 0 times
...
Ernest
5 months ago
C sounds interesting, but is it really the best option?
upvoted 0 times
...
Troy
5 months ago
I think B could be more efficient with the right tools.
upvoted 0 times
...
Felice
5 months ago
A is definitely the way to go for large data!
upvoted 0 times
...
Emiko
6 months ago
Developing a Dataflow job sounds appealing, but I recall it might not be the fastest method for such a large dataset. I’m torn between that and the Data Transfer appliance.
upvoted 0 times
...
Derick
6 months ago
I practiced a similar question where we had to optimize data transfer. I think compressing the data could help, but I’m not confident if it’s the best choice here.
upvoted 0 times
...
Jannette
6 months ago
I'm not entirely sure, but I think using a commercial ETL solution could add complexity and might not be the most efficient way to handle this migration.
upvoted 0 times
...
Valentin
6 months ago
I remember studying about the Data Transfer appliance and how it can help with large data migrations. It seems like a good option for minimizing time and cost.
upvoted 0 times
...
Winfred
6 months ago
This is a tricky one. I'm torn between options B and C. Using a commercial partner ETL solution might be more straightforward, but developing a Dataflow job could give me more control and flexibility. I'll need to weigh the pros and cons of each approach before making a decision.
upvoted 0 times
...
Wilda
6 months ago
Okay, I think I've got a good strategy for this. The 1 Gbps bandwidth between the on-premises environment and Google Cloud is pretty fast, so I'm leaning towards option D and compressing the data before uploading it with gsutil -m. That should help maximize the transfer speed and minimize the overall cost.
upvoted 0 times
...
Aretha
6 months ago
Hmm, I'm a bit unsure about this one. The question mentions following Google-recommended practices, so I'm not sure if developing a custom Dataflow job is the best approach. Maybe I should consider one of the other options that might be more in line with Google's recommendations.
upvoted 0 times
...
Belen
6 months ago
This looks like a straightforward data migration question. I think I'll go with option C and develop a Dataflow job to read directly from the database and write to Cloud Storage. That should minimize the load on the on-premises system.
upvoted 0 times
...
Lashandra
6 months ago
Hmm, this looks like a tricky one. I'll need to carefully consider the options and think through the potential causes.
upvoted 0 times
...
Gaston
2 years ago
Ooh, good point! Compression could definitely help. And with multi-threaded copy, we might be able to really maximize that bandwidth. Although, I'm not sure how much the compression would impact the overall upload time. Might be worth testing both options.
upvoted 0 times
...
Evelynn
2 years ago
Definitely, the Dataflow job sounds like the most efficient and cost-effective option. And with a 1Gbps connection, we should be able to get the data moved pretty quickly. Although, I do wonder if compressing the data first and using gsutil might be a bit faster?
upvoted 0 times
...
Janella
2 years ago
Yeah, I agree. The appliance is probably overkill for this size. A commercial ETL solution could work, but that might be more expensive than we need. I'm thinking the Dataflow job might be the way to go - it can handle the data transfer directly and leverage Google's infrastructure.
upvoted 0 times
...
Christa
2 years ago
Hmm, this is an interesting question. I think the key here is to minimize the time and cost while following Google-recommended practices. The Data Transfer appliance could work, but that's more for large-scale migrations, not a 10TB database.
upvoted 0 times
...
