Welcome to Pass4Success


Google Exam Professional-Data-Engineer Topic 1 Question 78 Discussion

Actual exam question for Google's Google Cloud Certified Professional Data Engineer exam
Question #: 78
Topic #: 1

You have a data processing application that runs on Google Kubernetes Engine (GKE). Containers need to be launched with their latest available configurations from a container registry. Your GKE nodes need to have GPUs, local SSDs, and 8 Gbps bandwidth. You want to efficiently provision the data processing infrastructure and manage the deployment process. What should you do?


Contribute your Thoughts:

Amos
8 days ago
Absolutely, and Dataflow and Cloud Scheduler seem more focused on data processing pipelines, which isn't the core requirement here. Cloud Build with Terraform is the way to go.
upvoted 0 times
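For the deployment half of that answer, a Cloud Build pipeline could build the image, push it to Artifact Registry, and hand the new tag to Terraform. This is only an illustrative sketch: the repository name `data-proc`, the image name `app`, and the `image_tag` variable are hypothetical, not taken from the question.

```yaml
# cloudbuild.yaml — hypothetical pipeline triggered on each commit.
steps:
  # Build the container image, tagged with the commit SHA.
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'us-docker.pkg.dev/$PROJECT_ID/data-proc/app:$SHORT_SHA', '.']
  # Push it so GKE can pull the latest configuration.
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'us-docker.pkg.dev/$PROJECT_ID/data-proc/app:$SHORT_SHA']
  # Apply the Terraform config, passing the fresh tag in as a variable.
  - name: 'hashicorp/terraform'
    args: ['apply', '-auto-approve', '-var=image_tag=$SHORT_SHA']
images:
  - 'us-docker.pkg.dev/$PROJECT_ID/data-proc/app:$SHORT_SHA'
```

`$PROJECT_ID` and `$SHORT_SHA` are built-in Cloud Build substitutions, so each run deploys whatever was just committed.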
Portia
9 days ago
Yeah, that's a good point. The other options like using Compute Engine startup scripts or GKE autoscaling don't seem to address the need for specialized hardware like GPUs and local SSDs.
upvoted 0 times
Ariel
10 days ago
I agree, the Cloud Build and Terraform option seems like the most comprehensive and efficient solution. It allows us to manage the entire provisioning and deployment process programmatically.
upvoted 0 times
Theresia
11 days ago
Hmm, this question seems to be testing our knowledge of infrastructure provisioning and deployment strategies for GKE. I'm thinking the best approach would be to use Cloud Build and Terraform to provision the infrastructure and deploy the latest container images.
upvoted 0 times
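On the provisioning side, the specialized hardware the question asks for (GPUs, local SSDs, higher network bandwidth) maps onto a Terraform node pool definition. A minimal sketch, assuming an existing cluster resource named `google_container_cluster.primary`; the pool name, machine type, and accelerator choice are illustrative:

```hcl
# Hypothetical GKE node pool meeting the hardware requirements.
resource "google_container_node_pool" "data_proc" {
  name       = "data-proc-pool"
  cluster    = google_container_cluster.primary.id
  node_count = 2

  node_config {
    # N1 machine types attach NVIDIA T4 GPUs; network egress scales
    # with vCPU count, so size this toward the 8 Gbps target.
    machine_type    = "n1-standard-8"
    local_ssd_count = 2 # local SSDs

    guest_accelerator {
      type  = "nvidia-tesla-t4" # GPUs
      count = 1
    }

    gvnic {
      enabled = true # gVNIC for higher network bandwidth
    }

    oauth_scopes = ["https://www.googleapis.com/auth/cloud-platform"]
  }
}
```

Because the whole node pool is declared in code, Cloud Build can re-apply it alongside each image deployment, which is what makes the combination efficient for both provisioning and deployment.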
