
Google Professional Cloud Architect Exam - Topic 6 Question 100 Discussion

Actual exam question for Google's Professional Cloud Architect exam
Question #: 100
Topic #: 6

Your company has a Google Cloud project that uses BigQuery for data warehousing on a pay-per-use basis. You want to monitor queries in real time to discover the most costly queries and which users spend the most. What should you do?

A) Create a Cloud Logging sink to export BigQuery data access logs to Cloud Storage. Develop a Dataflow pipeline to compute the cost of queries split by users.
B) Create a Cloud Logging sink to export BigQuery data access logs to BigQuery. Perform a BigQuery query on the generated table to extract the information you need.
C) Activate billing export into BigQuery. Perform a BigQuery query on the billing table to extract the information you need.
D) In the BigQuery dataset that contains all the tables to be queried, add a label for each user that can launch a query. Open the Billing page of the project. Select Reports. Select BigQuery as the product and filter by the user you want to check.

Suggested Answer: C

https://cloud.google.com/blog/products/data-analytics/taking-a-practical-approach-to-bigquery-cost-monitoring
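For reference, the second step of the suggested answer ("Perform a BigQuery query on the billing table") usually looks something like the sketch below. The dataset name `billing` and the table suffix are placeholders; the real table follows the standard `gcp_billing_export_v1_<BILLING_ACCOUNT_ID>` naming of the Cloud Billing export.

```sql
-- Minimal sketch: daily BigQuery spend from the standard Cloud Billing export.
-- `my-project.billing` and the table suffix are placeholders for your own setup.
SELECT
  DATE(usage_start_time) AS usage_day,
  SUM(cost) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
WHERE service.description = 'BigQuery'
GROUP BY usage_day
ORDER BY usage_day DESC;
```

One caveat several commenters raise below: the billing export attributes cost to projects, SKUs, and labels rather than to individual users, and it lands with some delay, so per-user, near-real-time attribution typically leans on the data access logs from option B as a complement.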


Contribute your Thoughts:

Vilma
3 months ago
Wait, can you really track costs just by adding labels? That seems too easy!
upvoted 0 times
...
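Vilma's skepticism is partly warranted: labels on their own compute nothing, but labels attached to BigQuery resources and query jobs do propagate into the Cloud Billing export, so spend can be grouped by them. A minimal sketch, assuming a hypothetical label key `team` and the same placeholder export table as the sketch above:

```sql
-- Minimal sketch: group BigQuery spend by a label from the billing export.
-- The label key `team` and the table name are placeholders.
SELECT
  (SELECT l.value FROM UNNEST(labels) AS l WHERE l.key = 'team') AS team,
  SUM(cost) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
WHERE service.description = 'BigQuery'
GROUP BY team
ORDER BY total_cost DESC;
```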
Silvana
3 months ago
I agree with B, querying the logs directly is straightforward.
upvoted 0 times
...
Avery
4 months ago
D seems a bit manual, not sure if it's the most efficient way.
upvoted 0 times
...
Haydee
4 months ago
I think C is the best choice for tracking costs effectively.
upvoted 0 times
...
Oren
4 months ago
Option B sounds solid, exporting logs to BigQuery is smart.
upvoted 0 times
...
Chau
4 months ago
I think option A might be overcomplicating things with Dataflow. I remember we practiced simpler methods, but I can't remember if they were as effective for tracking costs.
upvoted 0 times
...
Lenna
4 months ago
I feel like option D is a bit manual with all those steps. I can't recall if we covered how effective it is for real-time monitoring, though.
upvoted 0 times
...
Lajuana
5 months ago
I remember discussing option C in our study group. Activating billing export to BigQuery seems like a straightforward way to analyze costs, but I wonder if it gives real-time insights.
upvoted 0 times
...
Lang
5 months ago
I think option B sounds familiar because it involves exporting logs to BigQuery, which we practiced in class. But I'm not entirely sure if that's the best way to monitor costs in real time.
upvoted 0 times
...
Tanja
5 months ago
I like the idea of using the billing export in option C. That seems like a straightforward way to get the cost information I need. I'll have to make sure I understand how to filter the data properly.
upvoted 0 times
...
Johanna
5 months ago
Option A looks good to me. Exporting the logs to Cloud Storage and then using Dataflow to process them seems like a robust solution that will give me the real-time monitoring I need.
upvoted 0 times
...
Nobuko
5 months ago
Hmm, I'm a bit confused. There are a few different options here, and I'm not sure which one is the best approach. I'll need to think this through carefully.
upvoted 0 times
...
Altha
5 months ago
This seems like a straightforward question. I think option B is the way to go - exporting the BigQuery logs to a BigQuery table and then querying that table to get the information I need.
upvoted 0 times
...
Jacki
1 year ago
Option C, huh? I guess they're really trying to squeeze every penny out of our BigQuery usage. Better keep those queries lean and mean!
upvoted 0 times
...
Vallie
1 year ago
Option D? More like Option 'Doh!' Am I right? Someone's been drinking too much cloud Kool-Aid.
upvoted 0 times
Art
11 months ago
D) In the BigQuery dataset that contains all the tables to be queried, add a label for each user that can launch a query. Open the Billing page of the project. Select Reports. Select BigQuery as the product and filter by the user you want to check.
upvoted 0 times
...
Bernardo
11 months ago
C) Activate billing export into BigQuery. Perform a BigQuery query on the billing table to extract the information you need.
upvoted 0 times
...
Shawnda
12 months ago
B) Create a Cloud Logging sink to export BigQuery data access logs to BigQuery. Perform a BigQuery query on the generated table to extract the information you need.
upvoted 0 times
...
Jaime
1 year ago
A) Create a Cloud Logging sink to export BigQuery data access logs to Cloud Storage. Develop a Dataflow pipeline to compute the cost of queries split by users.
upvoted 0 times
...
...
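For completeness, given option B's full text quoted above: once a Cloud Logging sink is writing BigQuery data access logs into a dataset, a query along the following lines surfaces spend per user. This is a hedged sketch: it assumes the legacy AuditData log format and a sink dataset named `bq_logs` (both placeholders; the field paths differ under the newer BigQueryAuditMetadata format), and the $6.25-per-TiB on-demand rate is an assumption to replace with your region's current price.

```sql
-- Minimal sketch: per-user query cost from exported data access logs.
-- Assumes a Cloud Logging sink into dataset `bq_logs` and the legacy
-- AuditData schema; names, paths, and the $/TiB rate are placeholders.
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user_email,
  SUM(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalBilledBytes) / POW(2, 40) AS tib_billed,
  SUM(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalBilledBytes) / POW(2, 40) * 6.25 AS approx_cost_usd  -- assumed on-demand rate
FROM `my-project.bq_logs.cloudaudit_googleapis_com_data_access`
WHERE protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.eventName = 'query_job_completed'
GROUP BY user_email
ORDER BY approx_cost_usd DESC;
```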
Arlene
1 year ago
I'd go with Option B. Keeping the logs in BigQuery just makes sense - no need to export them anywhere else. Efficient and straightforward.
upvoted 0 times
...
Marvel
1 year ago
Option A gets my vote. Dataflow can crunch the numbers and give us the insights we need. Plus, Storage is much cooler than BigQuery, right?
upvoted 0 times
...
Eric
1 year ago
Option D is way too complicated. Who has time to mess with all those labels and Billing reports? No thanks!
upvoted 0 times
Quentin
1 year ago
B) 1. Create a Cloud Logging sink to export BigQuery data access logs to BigQuery. 2. Perform a BigQuery query on the generated table to extract the information you need.
upvoted 0 times
...
Rosalind
1 year ago
A) 1. Create a Cloud Logging sink to export BigQuery data access logs to Cloud Storage. 2. Develop a Dataflow pipeline to compute the cost of queries split by users.
upvoted 0 times
...
...
Ria
1 year ago
Option C looks good to me. Tapping into the billing data should give us the details we need on query costs and user activity.
upvoted 0 times
Rasheeda
1 year ago
2. Perform a BigQuery query on the billing table to extract the information you need.
upvoted 0 times
...
Beckie
1 year ago
C) 1. Activate billing export into BigQuery.
upvoted 0 times
...
...
Deeanna
1 year ago
I'm not sure, I think option B could also work. Exporting BigQuery data access logs to BigQuery and performing a query on the generated table might provide the information we need.
upvoted 0 times
...
Charolette
1 year ago
I think Option B is the way to go. Exporting the logs directly to BigQuery makes the data more accessible for querying.
upvoted 0 times
Joesph
1 year ago
Let's go with Option B then. It's a straightforward solution to analyze costly queries and user spending.
upvoted 0 times
...
Katlyn
1 year ago
I agree, it seems like the most efficient way to monitor queries and track user activity.
upvoted 0 times
...
Paulina
1 year ago
Option B is a good choice. Exporting logs to BigQuery will make it easier to extract the information we need.
upvoted 0 times
...
...
Sabine
1 year ago
I agree with Lacey. Option A seems like the most efficient way to monitor queries in real time and identify the most costly queries and users.
upvoted 0 times
...
Lacey
1 year ago
I think option A is the best choice because it involves exporting BigQuery data access logs to Cloud Storage and developing a Dataflow pipeline to compute the cost of queries split by users.
upvoted 0 times
...
Lavonna
1 year ago
Option A seems the most comprehensive approach to monitoring BigQuery queries and costs in real time.
upvoted 0 times
Daisy
1 year ago
Yes, it would definitely help in identifying the most costly queries and which users are spending the most.
upvoted 0 times
...
Oren
1 year ago
I agree, creating a Cloud Logging sink to export BigQuery data access logs to Cloud Storage and developing a Dataflow pipeline sounds efficient.
upvoted 0 times
...
...
