
Google Professional Cloud Architect Exam: Topic 6, Question 100 Discussion

Actual exam question for Google's Professional Cloud Architect exam
Question #: 100
Topic #: 6

Your company has a Google Cloud project that uses BigQuery for data warehousing on a pay-per-use basis. You want to monitor queries in real time to discover the most costly queries and which users spend the most. What should you do?

A. Create a Cloud Logging sink to export BigQuery data access logs to Cloud Storage. Develop a Dataflow pipeline to compute the cost of queries split by users.
B. Create a Cloud Logging sink to export BigQuery data access logs to BigQuery. Perform a BigQuery query on the generated table to extract the information you need.
C. Activate billing export into BigQuery. Perform a BigQuery query on the billing table to extract the information you need.
D. In the BigQuery dataset that contains all the tables to be queried, add a label for each user who can launch a query. Then open the Billing page of the project, select Reports, and filter by the BigQuery product and the user you want to check.

Suggested Answer: C

https://cloud.google.com/blog/products/data-analytics/taking-a-practical-approach-to-bigquery-cost-monitoring
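
In the spirit of the cost-monitoring post linked above, here is a minimal sketch (not taken from the post itself) of one near-real-time way to surface the most expensive recent queries and the users behind them, via the INFORMATION_SCHEMA jobs view. The region qualifier and the assumed on-demand rate of $6.25 per TiB are placeholders to adjust for your setup.

```python
# Hedged sketch: rank recent queries by estimated on-demand cost per user.
# Assumptions: jobs run in the US multi-region, and an on-demand rate of
# $6.25 per TiB billed (check current BigQuery pricing for your region).
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  user_email,
  job_id,
  IFNULL(total_bytes_billed, 0) / POW(1024, 4) * 6.25 AS est_cost_usd
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  AND creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY est_cost_usd DESC
LIMIT 20
"""

for row in client.query(sql).result():
    print(f"{row.user_email}  {row.job_id}  ${row.est_cost_usd:.2f}")
```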


Contribute your Thoughts:

Marvel
3 days ago
Option A gets my vote. Dataflow can crunch the numbers and give us the insights we need. Plus, Cloud Storage is much cooler than BigQuery, right?
Eric
5 days ago
Option D is way too complicated. Who has time to mess with all those labels and Billing reports? No thanks!
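
For context on the labeling step Eric is reacting to: option D only works if labels are attached to every query before it runs, which is where the overhead comes from. A rough sketch of that step (the label keys and values are made-up examples):

```python
# Hedged sketch of option D's label step: attach labels to a query job so
# the charge is attributable in Billing reports and billing export later.
# The label keys and values below are illustrative only.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig(
    labels={"team": "analytics", "requester": "alice"}  # made-up labels
)
job = client.query("SELECT 1", job_config=job_config)
job.result()  # wait for completion; the labels travel with the job's cost
```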
Ria
9 days ago
Option C looks good to me. Tapping into the billing data should give us the details we need on query costs and user activity.
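
To make "tapping into the billing data" concrete, a minimal sketch of querying the billing export table, assuming export into BigQuery is already enabled. The project, dataset, and table suffix are placeholders; note that the export lands with some delay and is not broken down per user out of the box, which is worth weighing against the real-time requirement.

```python
# Hedged sketch of option C, step 2: sum BigQuery charges from the billing
# export table. `my-project.billing.gcp_billing_export_v1_XXXXXX` is a
# placeholder; the real suffix is your billing account ID.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  sku.description AS sku,
  SUM(cost) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
WHERE service.description = 'BigQuery'
GROUP BY sku
ORDER BY total_cost DESC
"""

for row in client.query(sql).result():
    print(f"{row.sku}: {row.total_cost:.2f}")
```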
Deeanna
9 days ago
I'm not sure; I think option B could also work. Exporting BigQuery data access logs to BigQuery and querying the generated table might provide the information we need.
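
A sketch of what that "query on the generated table" could look like, assuming the sink writes entries in the BigQueryAuditMetadata format; the dataset name and JSON field paths are assumptions to check against your sink's actual schema.

```python
# Hedged sketch of option B, step 2: rank users by bytes billed using the
# data-access table a Cloud Logging sink generates in BigQuery. The dataset
# name and JSON paths are assumptions (BigQueryAuditMetadata format).
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user_email,
  SUM(CAST(JSON_VALUE(protopayload_auditlog.metadataJson,
        '$.jobChange.job.jobStats.queryStats.totalBilledBytes') AS INT64))
    AS total_billed_bytes
FROM `my-project.bq_audit.cloudaudit_googleapis_com_data_access`
GROUP BY user_email
ORDER BY total_billed_bytes DESC
"""

for row in client.query(sql).result():
    print(f"{row.user_email}: {row.total_billed_bytes} bytes billed")
```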
Charolette
13 days ago
I think Option B is the way to go. Exporting the logs directly to BigQuery makes the data more accessible for querying.
Katlyn
5 days ago
I agree, it seems like the most efficient way to monitor queries and track user activity.
Paulina
7 days ago
Option B is a good choice. Exporting logs to BigQuery will make it easier to extract the information we need.
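
For the export step this sub-thread describes, a minimal sketch using the google-cloud-logging client. The sink name, project, dataset, and exact filter are assumptions to adapt:

```python
# Hedged sketch of option B, step 1: route BigQuery data-access audit logs
# into a BigQuery dataset via a Cloud Logging sink. Names and the filter
# are assumptions; the sink's writer identity also needs the
# BigQuery Data Editor role on the destination dataset.
from google.cloud import logging

client = logging.Client(project="my-project")

sink = client.sink(
    "bq-data-access-to-bq",
    filter_='protoPayload.metadata."@type"='
            '"type.googleapis.com/google.cloud.audit.BigQueryAuditMetadata"',
    destination="bigquery.googleapis.com/projects/my-project/datasets/bq_audit",
)
sink.create()
```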
Sabine
17 days ago
I agree with Lacey. Option A seems like the most efficient way to monitor queries in real time and identify the most costly queries and users.
Lacey
19 days ago
I think option A is the best choice because it involves exporting BigQuery data access logs to Cloud Storage and developing a Dataflow pipeline to compute the cost of queries split by users.
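
A rough sketch of the Dataflow (Apache Beam) step Lacey describes, reading exported JSON log lines from Cloud Storage and summing estimated cost per user. The bucket path, field paths, and $/TiB rate are all assumptions, and a truly real-time variant would read from Pub/Sub rather than batch files.

```python
# Hedged sketch of option A, step 2: an Apache Beam pipeline that sums
# estimated query cost per user from audit-log JSON lines in Cloud Storage.
# Bucket path, JSON field paths, and the $6.25/TiB rate are assumptions.
import json
import apache_beam as beam

PRICE_PER_TIB_USD = 6.25  # assumed on-demand rate; check current pricing

def user_and_cost(line):
    entry = json.loads(line)
    payload = entry.get("protoPayload", {})
    user = payload.get("authenticationInfo", {}).get("principalEmail", "unknown")
    stats = (payload.get("metadata", {}).get("jobChange", {})
             .get("job", {}).get("jobStats", {}).get("queryStats", {}))
    billed_bytes = int(stats.get("totalBilledBytes", 0) or 0)
    return user, billed_bytes / 2**40 * PRICE_PER_TIB_USD

with beam.Pipeline() as pipeline:  # DirectRunner locally; Dataflow in prod
    (pipeline
     | "ReadLogs" >> beam.io.ReadFromText("gs://my-log-bucket/bq-audit/*.json")
     | "ToUserCost" >> beam.Map(user_and_cost)
     | "SumPerUser" >> beam.CombinePerKey(sum)
     | "Print" >> beam.Map(lambda kv: print(f"{kv[0]}: ${kv[1]:.2f}")))
```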
Lavonna
26 days ago
Option A seems like the most comprehensive approach to monitoring BigQuery queries and costs in real time.
Daisy
3 days ago
Yes, it would definitely help in identifying the most costly queries and which users are spending the most.
Oren
4 days ago
I agree, creating a Cloud Logging sink to export BigQuery data access logs to Cloud Storage and developing a Dataflow pipeline sounds efficient.
