
Google Professional Cloud Developer Exam - Topic 1 Question 79 Discussion

Actual exam question for Google's Professional Cloud Developer exam
Question #: 79
Topic #: 1

You need to load-test a set of REST API endpoints that are deployed to Cloud Run. The API responds to HTTP POST requests. Your load tests must meet the following requirements:

* Load is initiated from multiple parallel threads.

* User traffic to the API originates from multiple source IP addresses.

* Load can be scaled up using additional test instances.

You want to follow Google-recommended best practices. How should you configure the load testing?

Suggested Answer: C
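For intuition on the question's "multiple parallel threads" requirement, here is a minimal single-machine sketch using only the Python standard library. It is not the Google-recommended tooling (that would be a distributed load testing framework running on GKE, per the suggested answer); the URL, payload, and function names below are purely illustrative.

```python
import concurrent.futures
import json
import urllib.request

def post_once(url, payload):
    """Send one HTTP POST with a JSON body and return the status code."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def run_load(url, payload, workers, total_requests):
    """Fire total_requests POSTs from `workers` parallel threads."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(post_once, url, payload)
                   for _ in range(total_requests)]
        return [f.result() for f in concurrent.futures.as_completed(futures)]
```

A sketch like this covers parallel threads but not the other two requirements: all traffic comes from one source IP, and scaling means provisioning more machines by hand, which is exactly what running worker Pods on a GKE cluster automates.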

Contribute your Thoughts:

Junita
4 months ago
Definitely going with C, it's the most flexible approach!
upvoted 0 times
...
Brianne
4 months ago
Wait, can Cloud Shell really handle that much load?
upvoted 0 times
...
Sherill
4 months ago
Unmanaged instance groups? Not sure that's the best choice.
upvoted 0 times
...
Twana
4 months ago
I think A is more straightforward though.
upvoted 0 times
...
Lindsey
5 months ago
Option C sounds solid for scaling up traffic.
upvoted 0 times
...
Emerson
5 months ago
I practiced a similar question where we had to consider multiple IP addresses for traffic. I wonder if that impacts which option is better here.
upvoted 0 times
...
Cortney
5 months ago
I feel like option D might not scale well since it's using Cloud Shell, but I can't remember the specifics of why that might be a problem.
upvoted 0 times
...
Carmen
5 months ago
I think using cURL is a common practice for load testing, but I can't recall the difference between managed and unmanaged instance groups.
upvoted 0 times
...
Olene
5 months ago
I remember we discussed using Kubernetes for load testing in class, so option C sounds familiar, but I'm not sure if it's the best choice for this scenario.
upvoted 0 times
...
Dorthy
5 months ago
Okay, I think I've got a handle on this. The distributed load testing framework in option C sounds like the way to go. That way I can easily scale up the number of concurrent users by deploying more Pods. Plus, it's running on a private GKE cluster, which should give me more control and flexibility.
upvoted 0 times
...
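Dorthy's description matches the pattern in Google's "Distributed load testing using GKE" tutorial, which runs a load generator such as Locust as a master plus worker Pods. A rough, illustrative Deployment sketch (the name, image path, and arguments are placeholders, not from the question):

```yaml
# Illustrative sketch only -- name, image, and args are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: locust-worker
spec:
  replicas: 5              # deploy additional Pods to initiate more traffic
  selector:
    matchLabels:
      app: locust-worker
  template:
    metadata:
      labels:
        app: locust-worker
    spec:
      containers:
      - name: locust-worker
        image: gcr.io/PROJECT_ID/locust-tasks:latest  # placeholder image
        args: ["--worker", "--master-host=locust-master"]
```

With this shape, scaling the load up is a single command, e.g. `kubectl scale deployment locust-worker --replicas=20`, which is how option C satisfies the "additional test instances" requirement.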
Theodora
5 months ago
Hmm, I'm a bit unsure about this one. The requirements mention scaling up the load using additional test instances, but I'm not sure if the other options fully address that. I might need to do some more research on the different load testing frameworks and how they can be deployed.
upvoted 0 times
...
Laurel
5 months ago
This looks like a pretty straightforward load testing scenario. I think the key is to follow Google's best practices, so I'm leaning towards option C - deploying a distributed load testing framework on a private GKE cluster.
upvoted 0 times
...
Paris
6 months ago
I'm a little confused by the difference between managed and unmanaged instance groups in options A and B. I'm not sure which one would be better for this use case. Maybe I should look into the pros and cons of each approach before deciding.
upvoted 0 times
...
Suzan
6 months ago
I'm a bit lost on this one. The details in the answer choices are pretty specific, and I'm not super familiar with the terminology. I'll have to guess and hope for the best.
upvoted 0 times
...
Tamala
2 years ago
You're right, Tequila. Option C does seem to provide more flexibility in scaling up the load. Let's stick with that.
upvoted 0 times
...
Tequila
2 years ago
But doesn't option D limit our ability to scale up the load using additional test instances?
upvoted 0 times
...
Hui
2 years ago
I would go with option D instead. Downloading a container image on Cloud Shell seems more efficient to me.
upvoted 0 times
...
Tamala
2 years ago
I agree with Tequila. Option C seems to be the best choice for following Google-recommended best practices.
upvoted 0 times
...
Tequila
2 years ago
I think we should go with option C because it allows us to deploy additional Pods as needed and support more concurrent users.
upvoted 0 times
...
Ilona
2 years ago
I'm leaning towards option A. Using cURL and deploying the image in a managed instance group for each VM seems like a robust solution to me.
upvoted 0 times
...
Fletcher
2 years ago
I agree with Emily. Option D sounds simpler and easier to manage. Plus, it allows for easily increasing the load on the API.
upvoted 0 times
...
Sharee
2 years ago
I disagree, Alex. Option D seems like a more straightforward approach. Just download the container image and start multiple instances on Cloud Shell.
upvoted 0 times
...
Maia
2 years ago
I think option C is the best choice. Using a distributed load testing framework on a private GKE cluster seems like a scalable and efficient solution.
upvoted 0 times
...
Jamika
2 years ago
Hmm, I see what you mean about option D being simpler, but I'm not sure it would really meet all the requirements. Sequentially starting instances on Cloud Shell doesn't seem like it would give us the parallel threads and multiple IP addresses that the question calls for. I think C is still the way to go, even if it's a bit more complex.
upvoted 0 times
...
Iraida
2 years ago
I'm with you on the GKE approach. That seems like the most flexible and scalable option here. Plus, we can leverage all the built-in monitoring and autoscaling features of Kubernetes to really dial in the load testing.
upvoted 0 times
...
Krystal
2 years ago
I'm a little hesitant about option C, though. Setting up a private GKE cluster just for load testing seems a bit overkill, don't you think? I'm wondering if option D might be a simpler solution - just use the distributed load testing framework container on Cloud Shell to get the job done.
upvoted 0 times
...
Lelia
2 years ago
Yeah, I agree that C seems like the best option. The requirements specifically mention needing to load test from multiple parallel threads and multiple source IP addresses. A managed instance group with cURL doesn't sound like it would give us that kind of flexibility.
upvoted 0 times
...
Madalyn
2 years ago
Haha, can you imagine trying to scale up the load by just starting more instances of the container in Cloud Shell? That's like trying to put out a forest fire with a squirt gun!
upvoted 0 times
Lili
2 years ago
C) Deploy a distributed load testing framework on a private Google Kubernetes Engine cluster. Deploy additional Pods as needed to initiate more traffic and support the number of concurrent users.
upvoted 0 times
...
Susana
2 years ago
A) Create an image that has cURL installed and configure cURL to run a test plan. Deploy the image in a managed instance group, and run one instance of the image for each VM.
upvoted 0 times
...
...
Reyes
2 years ago
The Cloud Shell option sounds kind of janky to me. I can't imagine that would be a very reliable or robust way to load-test the API. I think we need to go with a more enterprise-grade solution here.
upvoted 0 times
...
Tresa
2 years ago
Whoa, this question is really tricky! I'm not sure exactly how to approach it, but I think option C might be the way to go. Using a distributed load testing framework on a private GKE cluster seems like it would give us the ability to scale up the load and simulate traffic from multiple IP addresses.
upvoted 0 times
...
Joye
2 years ago
Hmm, this is tricky. I'm not sure if I'd go with the cURL approach, since that seems a bit manual and not very scalable. I'm leaning towards the distributed load testing framework on GKE, but I'd need to do some more research to make sure that's the right approach.
upvoted 0 times
...
Nelida
2 years ago
I agree, this is a solid question. I think it's important to follow Google's best practices here, since they're the experts on Cloud Run and have a lot of experience with it.
upvoted 0 times
...
Oneida
2 years ago
This is a great question! I'm really interested to see how we can properly load-test these REST API endpoints on Cloud Run. The requirements around parallel threads, multiple source IP addresses, and scalability are all really important considerations.
upvoted 0 times
...
