Google Professional Data Engineer Exam - Topic 3 Question 110 Discussion

Actual exam question for Google's Professional Data Engineer exam
Question #: 110
Topic #: 3
[All Professional Data Engineer Questions]

You are developing an Apache Beam pipeline to extract data from a Cloud SQL instance by using JdbcIO. You have two projects running in Google Cloud. The pipeline will be deployed and executed on Dataflow in Project A. The Cloud SQL instance is running in Project B and does not have a public IP address. After deploying the pipeline, you notice that it fails to extract data from the Cloud SQL instance due to a connection failure. You have verified that VPC Service Controls and Shared VPC are not in use in these projects. You want to resolve this error while ensuring that the data does not go through the public internet. What should you do?
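For context, here is a minimal sketch (Beam Java SDK) of what the extraction stage of such a pipeline might look like; the driver class, JDBC URL, credentials, and query are illustrative placeholders rather than values from the question:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class CloudSqlExtract {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline p = Pipeline.create(options);

    // Placeholder connection details. With a private-IP-only Cloud SQL instance in
    // Project B and no network path from the Dataflow workers in Project A,
    // this read fails with a connection error.
    PCollection<String> rows =
        p.apply(
            "ReadFromCloudSql",
            JdbcIO.<String>read()
                .withDataSourceConfiguration(
                    JdbcIO.DataSourceConfiguration.create(
                            "org.postgresql.Driver",
                            "jdbc:postgresql://10.20.0.3:5432/exampledb") // assumed private IP
                        .withUsername("beam_user")
                        .withPassword("example-password")) // use Secret Manager in practice
                .withQuery("SELECT id FROM example_table")
                .withRowMapper(rs -> rs.getString(1))
                .withCoder(StringUtf8Coder.of()));
    // Downstream transforms and sinks are omitted for brevity.

    p.run().waitUntilFinish();
  }
}
```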

Suggested Answer: C

Option A is incorrect because VPC Network Peering alone does not enable connectivity to Cloud SQL instances with private IP addresses. You also need to configure private services access and allocate an IP address range for the service producer network.

Option B is incorrect because Cloud NAT does not provide connectivity to Cloud SQL instances with private IP addresses. Cloud NAT only provides outbound connectivity to the internet for resources that do not have public IP addresses, such as VMs, GKE clusters, and serverless instances.

Option C is correct because it allows you to use a Compute Engine instance as a proxy server to connect to the Cloud SQL database over the peered network. The proxy server does not need an external IP address because it can communicate with the Dataflow workers and the Cloud SQL instance using internal IP addresses. You need to install the Cloud SQL Auth proxy on the proxy server and configure it to use a service account that has the Cloud SQL Client role.
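As an illustration of the worker-side configuration this option implies, the sketch below sets hypothetical Dataflow pipeline options so that workers use only internal IPs on a Project A subnetwork that is peered with Project B; the project ID, region, subnetwork path, and proxy address are all assumptions, not values from the question:

```java
import org.apache.beam.runners.dataflow.DataflowRunner;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class LaunchOptions {
  public static void main(String[] args) {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

    options.setRunner(DataflowRunner.class);
    options.setProject("project-a");   // assumed project ID
    options.setRegion("us-central1");  // assumed region

    // Keep workers on internal IPs only, on the Project A subnetwork that is
    // peered with Project B, so traffic never traverses the public internet.
    options.setUsePublicIps(false);
    options.setSubnetwork(
        "https://www.googleapis.com/compute/v1/projects/project-a"
            + "/regions/us-central1/subnetworks/dataflow-subnet"); // assumed subnet

    // The JDBC URL in the pipeline would then point at the proxy VM's internal
    // IP in Project B, where the Cloud SQL Auth Proxy listens, e.g.
    // jdbc:postgresql://10.30.0.4:5432/exampledb (assumed address).
  }
}
```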

Option D is incorrect because it requires you to assign public IP addresses to the Dataflow workers, which exposes the data to the public internet. This violates the requirement of ensuring that the data does not go through the public internet. Moreover, adding authorized networks does not work for Cloud SQL instances with private IP addresses.


Contribute your Thoughts:

6 days ago
I think option C sounds familiar because we practiced a similar question about using a proxy server for database access.
upvoted 0 times
...
Cyndy
12 days ago
I remember we discussed VPC peering in class, but I'm not entirely sure if it's the best option here since it might complicate things.
upvoted 0 times
...
Catalina
17 days ago
I'm feeling pretty confident about this one. Connecting the two projects through VPC Network Peering and using a proxy server in Project B is the best way to securely access the Cloud SQL instance.
upvoted 0 times
...
Sean
23 days ago
The Cloud NAT option seems promising to avoid the public internet, but I'm not sure if that would fully address the connection issue with the Cloud SQL instance. I'll need to double-check the requirements.
upvoted 0 times
...
Tamesha
28 days ago
Okay, I think I've got a good handle on this. Setting up VPC Network Peering between the two projects and then using a proxy server in Project B seems like the way to go to keep the data off the public internet.
upvoted 0 times
...
Tanesha
1 month ago
Hmm, I'm a bit confused about the network peering and proxy server options. I'll need to review the details on those to make sure I understand them properly.
upvoted 0 times
...
Luisa
1 month ago
This looks like a tricky one. I'll need to carefully consider the network setup and security requirements to find the best solution.
upvoted 0 times
...
