Welcome to Pass4Success


Google Associate Cloud Engineer Exam - Topic 4 Question 83 Discussion

Actual exam question for Google's Associate Cloud Engineer exam
Question #: 83
Topic #: 4

For analysis purposes, you need to send all the logs from all of your Compute Engine instances to a BigQuery dataset called platform-logs. You have already installed the Stackdriver Logging agent on all the instances. You want to minimize cost. What should you do?

Suggested Answer: C

A logs export (sink) in Stackdriver Logging routes every log entry that matches a filter to a destination such as a BigQuery dataset, with no intermediate services. Creating a filter that matches only Compute Engine logs, clicking Create Export, and choosing BigQuery as the Sink Service with the platform-logs dataset as the Sink Destination streams the logs directly into the dataset. This is the lowest-cost approach: log sinks carry no charge of their own, so you pay only for BigQuery storage of the exported entries. Routing the logs through Cloud Pub/Sub and a Cloud Function instead adds per-message and per-invocation charges, and a Cloud Scheduler-driven BigQuery job adds query costs while only copying logs in daily batches. Updating instance metadata does not route logs anywhere; the Logging agent does not read a logs-destination metadata key.Reference:

Overview of logs exports

Exporting logs to BigQuery
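As a sketch, the export described above can also be created from the command line with gcloud instead of the Logging UI. The sink name and MY_PROJECT below are placeholders, not values from the question:

```shell
# Create a Cloud Logging sink that routes only Compute Engine instance logs
# to the existing BigQuery dataset "platform-logs".
# Replace MY_PROJECT with your project ID; "platform-logs-sink" is an
# arbitrary sink name chosen for this example.
gcloud logging sinks create platform-logs-sink \
  bigquery.googleapis.com/projects/MY_PROJECT/datasets/platform-logs \
  --log-filter='resource.type="gce_instance"'

# The command prints the sink's writer service account; grant that account
# the BigQuery Data Editor role on the dataset so the sink can write to it.
```

The filter on resource.type="gce_instance" is what keeps non-Compute Engine logs out of the dataset, which is the cost-saving step the question is probing for.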


Contribute your Thoughts:

Gerri
4 months ago
Wait, can we really drop logs like that? Seems risky!
upvoted 0 times
...
Noble
4 months ago
D sounds complicated, not sure if it's worth the effort.
upvoted 0 times
...
Madelyn
4 months ago
C is too simple, it won't handle filtering efficiently.
upvoted 0 times
...
Ronny
4 months ago
I think B is better, using Cloud Pub/Sub adds flexibility.
upvoted 0 times
...
Staci
4 months ago
Option A seems straightforward, just give the right permissions.
upvoted 0 times
...
Sunshine
5 months ago
Option D seems complicated with the Cloud Scheduler and all, but I wonder if it would actually minimize costs compared to the others.
upvoted 0 times
...
Dorothy
5 months ago
I’m a bit confused about option B. Creating a Cloud Function seems like an extra step, but I guess it could help filter the logs better?
upvoted 0 times
...
Jamika
5 months ago
I remember practicing a similar question where we had to set up log exports, and I feel like option C might be the most straightforward way to get Compute Engine logs into BigQuery.
upvoted 0 times
...
Marylyn
5 months ago
I think option A sounds familiar, but I'm not entirely sure if just updating the metadata is enough to send logs to BigQuery.
upvoted 0 times
...
Leatha
5 months ago
The key here is to minimize cost, so I'm leaning towards Option A. Updating the instance metadata seems like the simplest and most efficient way to get the logs into BigQuery.
upvoted 0 times
...
Julio
5 months ago
Hmm, I'm a bit confused about the different options. I'll need to carefully read through each one to understand the tradeoffs and decide which is the most cost-effective solution.
upvoted 0 times
...
Quentin
5 months ago
This looks like a straightforward question, I think I can handle it. Option C seems the most direct approach to get the logs into BigQuery.
upvoted 0 times
...
Elouise
5 months ago
I'm not sure I fully understand the difference between the Pub/Sub and Stackdriver Logging export options. I'll need to research those a bit more to figure out the best approach.
upvoted 0 times
...
Rosann
5 months ago
I've used ADP before, so I'm pretty confident I know the right answer here. I'd go with the SP prompt to set it up.
upvoted 0 times
...
Tonette
5 months ago
Hmm, this is a tricky one. I'll need to be really careful to stay objective and not let any personal connections influence my review. Gotta make sure I document everything thoroughly and get an outside opinion on this.
upvoted 0 times
...
Nettie
6 months ago
I'm a bit confused about the difference between implicit and explicit acceptance. I'll need to re-read that part.
upvoted 0 times
...
Aide
10 months ago
I'd go with Option B, it's like a three-course meal of cloud services - Pub/Sub, Cloud Functions, and BigQuery. Yum, yum! Just hope the chef doesn't overcook the logs.
upvoted 0 times
Dottie
8 months ago
Haha, let's hope for some well-done logs then!
upvoted 0 times
...
Willard
9 months ago
I hope the chef doesn't burn the logs while cooking.
upvoted 0 times
...
Ashley
9 months ago
Yeah, it's like a full meal with Pub/Sub, Cloud Functions, and BigQuery.
upvoted 0 times
...
Letha
9 months ago
Option B sounds like a feast of cloud services!
upvoted 0 times
...
...
Pauline
10 months ago
Option A seems like the easiest solution, but I'm not sure it's the most cost-effective. Granting permissions and updating instance metadata sounds simple enough, but I wonder if there might be some hidden gotchas.
upvoted 0 times
...
Avery
10 months ago
Hmm, Option D sounds pretty interesting. Automating the BigQuery job with a Cloud Function and Cloud Scheduler could be a clever way to handle this. Though, I'm a bit concerned about the potential for extra costs with all the moving parts.
upvoted 0 times
Lili
9 months ago
Maybe we should weigh the benefits of automation against the potential costs before deciding.
upvoted 0 times
...
Stefany
9 months ago
I agree, but the potential for extra costs is something to consider.
upvoted 0 times
...
Sol
9 months ago
Option D does sound interesting. Automating the BigQuery job could be efficient.
upvoted 0 times
...
...
Michel
10 months ago
I'm leaning towards Option C. It's a bit more straightforward, and I like the ability to create a filter to only export the Compute Engine logs. Plus, BigQuery integration is built-in, so it's a simple one-stop solution.
upvoted 0 times
...
Raylene
10 months ago
Option B looks like the most efficient solution to me. Using Cloud Pub/Sub and a Cloud Function to filter and insert the logs directly into BigQuery seems like a great way to automate the process and minimize costs.
upvoted 0 times
Nana
9 months ago
I think Option B is the way to go. It streamlines the process and ensures that only the necessary logs are stored in the platform-logs dataset.
upvoted 0 times
...
Lennie
9 months ago
I think so too. It's a streamlined approach that ensures only relevant logs are sent to BigQuery.
upvoted 0 times
...
Francesco
10 months ago
Yes, it definitely seems like the best way to automate the process and keep costs down.
upvoted 0 times
...
Nichelle
10 months ago
Yeah, Option B is definitely the most efficient. It saves time and resources by only sending relevant Compute Engine logs to BigQuery.
upvoted 0 times
...
Thora
10 months ago
I agree, option B seems like the most efficient solution. Using Cloud Pub/Sub and a Cloud Function to filter and insert logs into BigQuery is a smart move.
upvoted 0 times
...
Zoila
10 months ago
I agree, Option B seems like the best choice. It automates the process and filters out unnecessary logs before inserting them into BigQuery.
upvoted 0 times
...
...
Ashton
11 months ago
Hmm, that's a good point. I see the benefits of option D now. Thanks for sharing your perspective, Fletcher.
upvoted 0 times
...
Fletcher
11 months ago
I disagree, I believe option D is more cost-effective in the long run. It automates the process efficiently.
upvoted 0 times
...
Ashton
11 months ago
I think option A is the best choice. It seems like the most straightforward solution.
upvoted 0 times
...
