Google Professional Cloud DevOps Engineer Exam - Topic 10 Question 7 Discussion

Actual exam question for Google's Professional Cloud DevOps Engineer exam
Question #: 7
Topic #: 10
[All Professional Cloud DevOps Engineer Questions]

You currently store the virtual machine (VM) utilization logs in Stackdriver. You need to provide an easy-to-share interactive VM utilization dashboard that is updated in real time and contains information aggregated on a quarterly basis. You want to use Google Cloud Platform solutions. What should you do?

Suggested Answer: A
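
The answer choices are not reproduced on this page, but the comments below suggest option A exports the Stackdriver (now Cloud Logging) VM utilization logs to BigQuery and visualizes them with Data Studio (now Looker Studio), which supports shareable, live-refreshing dashboards. As a rough illustration of the quarterly roll-up such a BigQuery view would compute (BigQuery itself would use `TIMESTAMP_TRUNC(timestamp, QUARTER)` with `GROUP BY`), here is a minimal Python sketch; all record fields, VM names, and values are hypothetical:

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical VM utilization log records, shaped like rows a Cloud Logging
# sink might export into a BigQuery table (field names are illustrative).
logs = [
    {"vm": "web-1", "timestamp": datetime(2025, 1, 15), "cpu_util": 0.40},
    {"vm": "web-1", "timestamp": datetime(2025, 2, 10), "cpu_util": 0.60},
    {"vm": "web-1", "timestamp": datetime(2025, 4, 5),  "cpu_util": 0.80},
    {"vm": "db-1",  "timestamp": datetime(2025, 3, 1),  "cpu_util": 0.50},
]

def quarter_key(ts):
    """Bucket a timestamp into (year, quarter), mirroring what
    TIMESTAMP_TRUNC(ts, QUARTER) does in BigQuery SQL."""
    return (ts.year, (ts.month - 1) // 3 + 1)

def quarterly_avg(records):
    """Average utilization per VM per quarter."""
    sums = defaultdict(lambda: [0.0, 0])
    for r in records:
        key = (r["vm"], quarter_key(r["timestamp"]))
        sums[key][0] += r["cpu_util"]
        sums[key][1] += 1
    return {k: total / count for k, (total, count) in sums.items()}

report = quarterly_avg(logs)
# web-1 in Q1 2025 averages (0.40 + 0.60) / 2 = 0.50
print(report[("web-1", (2025, 1))])
```

In the actual GCP setup, this aggregation would live in a BigQuery scheduled query or view, and the Data Studio dashboard would read from it directly, so the chart stays current without manual exports.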

Contribute your Thoughts:

Janessa
4 months ago
C is too many steps, just use BigQuery directly!
upvoted 0 times
Tomas
4 months ago
A is definitely the easiest way to share with stakeholders.
upvoted 0 times
Tatum
4 months ago
Surprised that people still use Google Sheets for dashboards. Isn't that outdated?
upvoted 0 times
Denny
4 months ago
I disagree, B could be better for security needs. SIEM systems are powerful!
upvoted 0 times
Gerald
5 months ago
Option A seems the most straightforward. BigQuery and Data Studio work well together.
upvoted 0 times
Carmen
5 months ago
Option D seems complicated with the custom application. I’m not sure if we need that level of customization for a simple dashboard.
upvoted 0 times
Temeka
5 months ago
I feel like exporting to a CSV in option C is a bit outdated. We should aim for something more interactive, right?
upvoted 0 times
Lilli
5 months ago
I think option A makes the most sense since it directly uses BigQuery and Data Studio, which we practiced in class.
upvoted 0 times
Brock
5 months ago
I'm not entirely sure, but I remember something about using Cloud Pub/Sub for real-time data. Maybe option B could work?
upvoted 0 times
Cordie
5 months ago
Okay, let's see. The portal role and profile records need to be set up properly, so that's definitely something to check. And the user and contact records need to be in place as well.
upvoted 0 times
Alaine
5 months ago
The Trusted Subsystem pattern is all about isolating sensitive resources and controlling access to them. Based on that, I think the answer is D - all of the above. A database, a legacy system, and a file with predefined permissions would all be good candidates for the Trusted Subsystem approach.
upvoted 0 times
Shaniqua
5 months ago
I'm not entirely sure about this one. Is it the fabric WLCs that update the HTDB as new clients connect, or is it the border nodes that first register the endpoints and then update the HTDB? I'll have to review this topic again.
upvoted 0 times
Javier
5 months ago
Okay, I think I've got a plan. I'll focus on using a managed instance group with Cloud Filestore to handle the file system needs, and then use a load balancer to ensure high availability.
upvoted 0 times
