Google Professional Cloud Database Engineer Exam - Topic 14 Question 24 Discussion

Actual exam question for Google's Professional Cloud Database Engineer exam
Question #: 24
Topic #: 14

You are building a data warehouse on BigQuery. Sources of data include several MySQL databases located on-premises.

You need to transfer data from these databases into BigQuery for analytics. You want to use a managed solution that has low latency and is easy to set up. What should you do?

A. Create extracts from your on-premises databases periodically, and push these extracts to Cloud Storage. Upload the changes into BigQuery, and merge them with existing tables.

B. Use Cloud Data Fusion and scheduled workflows to extract data from MySQL. Transform this data into the appropriate schema, and load it into the BigQuery dataset.

C. Use Datastream to connect to your on-premises database and create a stream. Have Datastream write to Cloud Storage. Then use Dataflow to process the data into BigQuery.

D. Use Database Migration Service to replicate data to a Cloud SQL for MySQL instance. Create federated tables in BigQuery on top of the replicated instances.

Suggested Answer: C
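The suggested answer points at Datastream as the managed, low-latency option. As a rough sketch only, here is what a Datastream setup can look like with the gcloud CLI, using Datastream's newer direct-to-BigQuery destination rather than the Cloud Storage plus Dataflow hop some commenters describe. All names, the region, the host address, and the JSON config files are placeholders, and the flag spellings should be verified against the current Datastream documentation before running anything:

```shell
# Hypothetical Datastream setup -- all identifiers are illustrative.

# 1. Connection profile for the on-premises MySQL source.
gcloud datastream connection-profiles create onprem-mysql \
    --location=us-central1 --type=mysql \
    --mysql-hostname=10.0.0.5 --mysql-port=3306 \
    --mysql-username=datastream --mysql-prompt-for-password \
    --display-name="on-prem MySQL" --static-ip-connectivity

# 2. Connection profile for the BigQuery destination.
gcloud datastream connection-profiles create bq-dest \
    --location=us-central1 --type=bigquery \
    --display-name="BigQuery destination"

# 3. Stream tying source to destination, with an initial backfill.
#    The two JSON files (table allow-lists, dataset settings) are
#    assumed to exist and are not shown here.
gcloud datastream streams create mysql-to-bq \
    --location=us-central1 --display-name="MySQL to BigQuery" \
    --source=onprem-mysql --mysql-source-config=mysql_source.json \
    --destination=bq-dest --bigquery-destination-config=bq_dest.json \
    --backfill-all
```

The appeal for this question is that Datastream is fully managed and streams change data continuously, which matches the "low latency and easy to set up" requirement.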

Contribute your Thoughts:

Alexis
4 months ago
D is solid if you want to keep things in Cloud SQL.
Stevie
4 months ago
Wait, can Datastream really handle that much data? Sounds risky!
Anjelica
4 months ago
A is too manual, not ideal for a managed solution.
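The "merge them with existing tables" step is the manual part of option A being criticized here. As a plain-Python stand-in (illustrative data, no GCP calls; a real pipeline would run a BigQuery MERGE over a staging table loaded from Cloud Storage), the periodic upsert looks like this:

```python
# Minimal local sketch of option A's extract-and-merge pattern:
# periodic extracts are merged into an existing table keyed by
# primary key. Table contents below are made up for illustration.

def merge_extract(existing, extract, key="id"):
    """Upsert rows from `extract` into `existing`, matching on `key`."""
    merged = {row[key]: row for row in existing}
    for row in extract:
        merged[row[key]] = row  # insert new rows, overwrite changed ones
    return sorted(merged.values(), key=lambda r: r[key])

warehouse = [{"id": 1, "qty": 5}, {"id": 2, "qty": 7}]
nightly_extract = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
print(merge_extract(warehouse, nightly_extract))
```

Because each cycle only sees data as fresh as the last extract, this approach trades latency for simplicity, which is why it fails the question's low-latency requirement.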
Ruthann
4 months ago
I disagree, I think C is more efficient with real-time streaming.
Carissa
5 months ago
Option B seems like the best choice for low latency.
Frank
5 months ago
I’m a bit confused about the differences between using Cloud Storage and directly streaming to BigQuery. I need to review that part again.
Veronika
5 months ago
I practiced a similar question where we had to choose between different Google Cloud services for data transfer, and I feel like Database Migration Service could be a solid option too.
Hortencia
5 months ago
I think Datastream might be the right choice since it allows for real-time data streaming, but I can't recall the exact setup steps.
Helaine
5 months ago
I remember discussing the benefits of using Cloud Data Fusion for ETL processes, but I'm not sure if it's the best fit for low latency.
Kimberely
5 months ago
I'm leaning towards option D with Database Migration Service. Replicating the data to Cloud SQL first and then using federated tables in BigQuery could be a neat solution. I'll have to research how easy the setup is, though.
Cecil
5 months ago
Hmm, I'm a bit unsure about this one. Option A with Cloud Storage seems simple, but I'm not sure about the latency and ease of setup. Maybe I should look into the other options a bit more.
Johana
5 months ago
This seems like a straightforward data migration problem. I'd probably go with option B - Cloud Data Fusion looks like a good managed solution that can handle the ETL process.
Loreta
5 months ago
Option C with Datastream and Dataflow seems interesting, but it might be overkill for this use case. I'd want to make sure the setup isn't too complex. Option B with Cloud Data Fusion sounds like a good balance of features.
Ashanti
6 months ago
This question seems straightforward, I think I can handle it.
Valentine
2 years ago
Haha, Norah, you and your streams! But in all seriousness, I think you two are onto something. Datastream does seem like the way to go here. The fact that it handles the transfer and processing is a big plus. Plus, I like the idea of not having to worry about setting up a bunch of separate components.
Norah
2 years ago
Ooh, Datastream, huh? That does sound pretty slick. I was also considering Cloud Data Fusion (option B), but you make a good point about the latency and ease of setup. And let's be real, who doesn't love a good 'stream' these days? *winks*
Claribel
2 years ago
Great choice! Datastream should work well for your data warehouse needs.
Jacob
2 years ago
Thank you, I'll check out Datastream for my data transfer solution!
Stephane
2 years ago
For sure, Datastream can handle the stream well for you.
Thomasena
2 years ago
Definitely! Plus, it's always nice to have a 'stream' flowing smoothly.
Maybelle
2 years ago
Oh yeah, Datastream can definitely help with low latency and easy setup.
Sophia
2 years ago
Datastream sounds like a solid option for your data transfer needs.
Frank
2 years ago
C
Gladis
2 years ago
I agree, Kimberlie. Option A seems a bit too manual and prone to latency issues. And option D, while interesting, feels like overkill for this use case. I'm leaning towards Datastream (option C) since that seems to handle the data transfer and processing all in one managed solution.
Kimberlie
2 years ago
Hmm, this question seems to be testing our knowledge of different data transfer solutions for BigQuery. I'm a bit torn between options B and C. Both seem like they could work, but I'm not sure which one would be the most 'low latency and easy to set up' as the question asks for.
