Welcome to Pass4Success


Google Professional Data Engineer Exam - Topic 2 Question 105 Discussion

Actual exam question for Google's Professional Data Engineer exam
Question #: 105
Topic #: 2
[All Professional Data Engineer Questions]

The Development and External teams have the Project Viewer Identity and Access Management (IAM) role in a folder named Visualization. You want the Development Team to be able to read data from both Cloud Storage and BigQuery, but the External Team should only be able to read data from BigQuery. What should you do?
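Although the answer options aren't reproduced above, the intent of the question — BigQuery read access for both teams, Cloud Storage read access for the Development Team only — can be sketched with gcloud. This is a minimal sketch, not the official answer; the folder ID and group emails are placeholders:

```shell
# Placeholder folder ID and group identities for illustration.
FOLDER_ID="123456789"
DEV_GROUP="group:dev-team@example.com"
EXT_GROUP="group:external-team@example.com"

# Grant both teams read access to BigQuery data in the folder.
gcloud resource-manager folders add-iam-policy-binding "$FOLDER_ID" \
  --member="$DEV_GROUP" --role="roles/bigquery.dataViewer"
gcloud resource-manager folders add-iam-policy-binding "$FOLDER_ID" \
  --member="$EXT_GROUP" --role="roles/bigquery.dataViewer"

# Grant Cloud Storage read access to the Development Team only.
gcloud resource-manager folders add-iam-policy-binding "$FOLDER_ID" \
  --member="$DEV_GROUP" --role="roles/storage.objectViewer"
```

The key idea is to replace the broad Project Viewer grant with service-specific read roles, so the External Team's binding simply never includes Cloud Storage.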

Suggested Answer: B

To ensure that the advertising department receives messages within 30 seconds of the click occurrence, and given the current system lag and data freshness metrics, the issue likely lies in the processing capacity of the Dataflow job. Here's why option B is the best choice:

System Lag and Data Freshness:

The system lag of 5 seconds indicates that Dataflow itself is processing messages relatively quickly.

However, the data freshness of 40 seconds suggests a significant delay before processing begins, indicating a backlog.

Backlog in Pub/Sub Subscription:

A backlog occurs when the rate of incoming messages exceeds the rate at which the Dataflow job can process them, causing delays.
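The lag-versus-freshness reasoning above can be checked with a quick back-of-the-envelope calculation. The message rates below are hypothetical, chosen only to show how a backlog grows when inflow exceeds processing capacity:

```python
import math

# Hypothetical throughput numbers for illustration.
incoming_rate = 1200   # messages/second arriving on the Pub/Sub topic
per_worker_rate = 200  # messages/second one Dataflow worker can process
current_workers = 4

# Total processing capacity vs. inflow.
processing_rate = per_worker_rate * current_workers   # 800 msg/s
backlog_growth = incoming_rate - processing_rate      # backlog grows 400 msg/s

# Minimum workers needed just to keep pace with the inflow.
min_workers = math.ceil(incoming_rate / per_worker_rate)

print(backlog_growth)  # 400
print(min_workers)     # 6
```

As long as `backlog_growth` is positive, messages wait in the subscription before processing starts, which is exactly what a high data-freshness metric combined with a low system lag indicates.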

Optimizing the Dataflow Job:

To handle the incoming message rate, the Dataflow job needs to be optimized or scaled up by increasing the number of workers, ensuring it can keep up with the message inflow.

Steps to implement:

1. Analyze the Dataflow job: inspect the job metrics to identify bottlenecks and inefficiencies.

2. Optimize the processing logic: tune the transformations and operations within the pipeline to improve processing efficiency.

3. Increase the number of workers: scale the job out so it can handle the higher load and drain the backlog.
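The scaling step above can be sketched as a Beam pipeline launch with throughput-based autoscaling enabled. The pipeline file, region, and worker cap below are placeholders:

```shell
# Re-launch the streaming pipeline with autoscaling and a higher worker cap
# (pipeline module, region, and worker count are illustrative).
python my_pipeline.py \
  --runner=DataflowRunner \
  --region=us-central1 \
  --streaming \
  --autoscaling_algorithm=THROUGHPUT_BASED \
  --max_num_workers=20
```

Raising `--max_num_workers` gives the autoscaler headroom to add workers when the Pub/Sub backlog grows, which is what brings data freshness back under the 30-second target.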


References:

- Dataflow Monitoring

- Scaling Dataflow Jobs

Contribute your Thoughts:

Wenona
4 months ago
Not sure about that, what if they need Cloud Storage later?
upvoted 0 times
...
Gayla
4 months ago
Wait, can we really restrict access like that? Sounds tricky.
upvoted 0 times
...
Kenneth
4 months ago
I agree, option A makes the most sense here.
upvoted 0 times
...
Carin
4 months ago
The External Team only needs BigQuery access.
upvoted 0 times
...
Esteban
4 months ago
Removing permissions is definitely the way to go!
upvoted 0 times
...
Dallas
5 months ago
I’m leaning towards option C, but I’m not clear on how the Access Level works in that context. It’s a bit confusing!
upvoted 0 times
...
Cecilia
5 months ago
I remember something about VPC firewall rules, but I don't think they would help with IAM permissions directly. It feels like a trick option.
upvoted 0 times
...
Markus
5 months ago
This question reminds me of a practice scenario where we had to manage access levels for different teams. I wonder if creating a VPC Service Controls perimeter would be more effective.
upvoted 0 times
...
Tamera
5 months ago
I think we need to remove the Cloud Storage IAM permissions for the External Team, but I'm not entirely sure if that's the best approach.
upvoted 0 times
...
Yolando
6 months ago
Hmm, this is a tricky balancing act between granting the right access levels. I'm leaning towards option C, as it seems to provide the most granular control over the access levels for the two teams.
upvoted 0 times
...
Lauran
6 months ago
I've got a good handle on IAM permissions and VPC, so I think I can tackle this one. Option A seems like the most straightforward solution to restrict the External Team's access to Cloud Storage.
upvoted 0 times
...
Alexia
6 months ago
Whoa, this is a complex question. I'm a bit confused about the differences between VPC firewall rules and VPC Service Controls. I'll need to review those concepts before deciding on the best approach.
upvoted 0 times
...
Ashton
6 months ago
Okay, let's see. The key is restricting the External Team's access to Cloud Storage while still allowing the Development Team to access both Cloud Storage and BigQuery. I think option C might be the way to go.
upvoted 0 times
...
Pok
6 months ago
Hmm, this looks like a tricky one. I'll need to carefully read through the options and think about the IAM permissions and VPC concepts involved.
upvoted 0 times
...
Marti
11 months ago
Hey, I heard the External Team's been eyeing that BigQuery data like a hawk. Better lock it down with Option C, or they'll be trying to hack their way in!
upvoted 0 times
...
Brittani
11 months ago
Option A seems a bit too simple, and Option B doesn't really address the access control requirements. I think Option C is the most comprehensive solution here.
upvoted 0 times
Lenna
10 months ago
I think Option C is the most secure and comprehensive solution for this scenario.
upvoted 0 times
...
Lina
10 months ago
Yeah, Option C seems like the best choice to ensure the right level of access for each team.
upvoted 0 times
...
Jacqueline
10 months ago
I agree, Option C covers all the access control requirements effectively.
upvoted 0 times
...
...
Delsie
11 months ago
Haha, the External Team's not getting anywhere near my precious BigQuery data! Option C is the way to go, no doubt about it.
upvoted 0 times
Nelida
10 months ago
We need to make sure our data is protected, and Option C provides the necessary restrictions for the External Team.
upvoted 0 times
...
Arlette
10 months ago
Definitely, creating a VPC Service Controls perimeter is the most secure way to manage access to both Cloud Storage and BigQuery.
upvoted 0 times
...
Lacresha
11 months ago
I agree, Option C is the best choice to restrict access to BigQuery for the External Team.
upvoted 0 times
...
...
Magdalene
11 months ago
I'm leaning towards Option D. Putting the Cloud Storage project inside the perimeter and adding the Development Team to the perimeter's Access Level should do the trick.
upvoted 0 times
Jaime
10 months ago
I'm not sure, but Option D does sound like it would provide the necessary restrictions for each team.
upvoted 0 times
...
Kristel
11 months ago
Agreed, Option D seems like the most secure option for this scenario.
upvoted 0 times
...
Alexia
11 months ago
I think Option D is the best choice. It restricts Cloud Storage access to the Development Team.
upvoted 0 times
...
...
Nichelle
12 months ago
I'm not sure, option D also sounds like a valid solution to control access.
upvoted 0 times
...
Zona
12 months ago
I agree with Mohammad, option C seems like the best choice to restrict access.
upvoted 0 times
...
Aleisha
1 year ago
Option C seems like the way to go. Restricting the External Team's access to BigQuery while allowing the Development Team to access both Cloud Storage and BigQuery sounds like the perfect solution.
upvoted 0 times
Virgilio
11 months ago
I agree, Option C is the most secure way to manage access for both teams. It's important to ensure that the right permissions are set up to protect our data.
upvoted 0 times
...
Zana
11 months ago
Option C seems like the best choice. It allows us to restrict access for the External Team while still giving the Development Team access to both Cloud Storage and BigQuery.
upvoted 0 times
...
...
Mohammad
1 year ago
I think we should go with option C.
upvoted 0 times
...
