A data scientist has created a BigQuery ML model and asks you to create an ML pipeline to serve predictions. You have a REST API application with the requirement to serve predictions for an individual user ID with latency under 100 milliseconds. You use the following query to generate predictions: SELECT predicted_label, user_id FROM ML.PREDICT(MODEL `dataset.model`, TABLE user_features). How should you create the ML pipeline?
Your company produces 20,000 files every hour. Each data file is formatted as a comma-separated values (CSV) file that is less than 4 KB. All files must be ingested on Google Cloud Platform before they can be processed. Your company site has a 200 ms latency to Google Cloud, and your Internet connection bandwidth is limited to 50 Mbps. You currently deploy a secure FTP (SFTP) server on a virtual machine in Google Compute Engine as the data ingestion point. A local SFTP client runs on a dedicated machine to transmit the CSV files as-is. The goal is to make reports with data from the previous day available to the executives by 10:00 a.m. each day. This design is barely able to keep up with the current volume, even though the bandwidth utilization is rather low.
You are told that due to seasonality, your company expects the number of files to double for the next three months. Which two actions should you take? (Choose two.)
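The scenario's own numbers show why the link sits nearly idle yet the pipeline falls behind. A quick sanity check (all figures come from the question; the one assumption is that a sequential SFTP stream pays the 200 ms round trip once per file):

```python
# Back-of-the-envelope check of the scenario's numbers.
files_per_hour = 20_000
file_size_kb = 4
link_mbps = 50
rtt_seconds = 0.2  # 200 ms site-to-cloud latency

# Raw data volume: about 78 MB per hour -- tiny relative to the link.
data_mb_per_hour = files_per_hour * file_size_kb / 1024

# Link capacity: 50 Mbps = 6.25 MB/s = 22,500 MB per hour.
capacity_mb_per_hour = link_mbps / 8 * 3600

# Bandwidth utilization comes out well under 1%...
utilization = data_mb_per_hour / capacity_mb_per_hour

# ...but a single sequential stream spends 20,000 x 0.2 s = 4,000 s per
# hour's worth of files on latency alone -- more than the 3,600 s in an
# hour. The design is latency-bound, not bandwidth-bound.
latency_seconds = files_per_hour * rtt_seconds

print(f"data: {data_mb_per_hour:.1f} MB/h, "
      f"utilization: {utilization:.2%}, "
      f"latency cost: {latency_seconds:.0f} s/h")
```

This is why doubling the file count breaks the design outright: the latency cost doubles to 8,000 s per hour of files, while bandwidth remains a non-issue.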
You are developing an Apache Beam pipeline to extract data from a Cloud SQL instance by using JdbcIO. You have two projects running in Google Cloud. The pipeline will be deployed and executed on Dataflow in Project A. The Cloud SQL instance is running in Project B and does not have a public IP address. After deploying the pipeline, you noticed that the pipeline failed to extract data from the Cloud SQL instance due to connection failure. You verified that VPC Service Controls and Shared VPC are not in use in these projects. You want to resolve this error while ensuring that the data does not go through the public internet. What should you do?
Option C is correct because it allows you to use a Compute Engine instance as a proxy server to connect to the Cloud SQL database over the peered network. The proxy server does not need an external IP address because it can communicate with the Dataflow workers and the Cloud SQL instance using internal IP addresses. You need to install the Cloud SQL Auth proxy on the proxy server and configure it to use a service account that has the Cloud SQL Client role.
Option D is incorrect because it requires you to assign public IP addresses to the Dataflow workers, which exposes the data to the public internet. This violates the requirement of ensuring that the data does not go through the public internet. Moreover, adding authorized networks does not work for Cloud SQL instances with private IP addresses.
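As a sketch of the resulting connection path for option C: the JdbcIO connection string used by the Dataflow workers points at the proxy VM's internal IP, not at the Cloud SQL instance itself, so traffic stays on internal addresses end to end. The helper, IP address, database name, and port below are hypothetical, shown for a MySQL instance:

```python
def jdbc_url_via_proxy(proxy_internal_ip: str, db: str, port: int = 3306) -> str:
    """Build the JDBC URL the Dataflow workers would use.

    proxy_internal_ip is the internal (RFC 1918) address of the Compute
    Engine VM running the Cloud SQL Auth proxy. The proxy forwards this
    port to the Cloud SQL instance's private IP over the peered network.
    Hypothetical helper for illustration, not a Beam API.
    """
    return f"jdbc:mysql://{proxy_internal_ip}:{port}/{db}"

# Example with a made-up internal address and database name:
url = jdbc_url_via_proxy("10.128.0.7", "sales")
# -> "jdbc:mysql://10.128.0.7:3306/sales"
```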
You need to connect multiple applications with dynamic public IP addresses to a Cloud SQL instance. You configured users with strong passwords and enforced the SSL connection to your Cloud SQL instance. You want to use Cloud SQL public IP and ensure that you have secured connections. What should you do?
To securely connect multiple applications with dynamic public IP addresses to a Cloud SQL instance using public IP, the Cloud SQL Auth proxy is the best solution. This proxy provides secure, authorized connections to Cloud SQL instances without the need to configure authorized networks or deal with IP whitelisting complexities.
Cloud SQL Auth Proxy:
The Cloud SQL Auth proxy provides secure, encrypted connections to Cloud SQL.
It uses IAM permissions and SSL to authenticate and encrypt the connection, ensuring data security in transit.
By using the proxy, you avoid the need to constantly update authorized networks as the proxy handles dynamic IP addresses seamlessly.
Authorized Network Configuration:
Leaving the authorized network empty means no IP addresses are explicitly whitelisted, relying solely on the Auth proxy for secure connections.
This approach simplifies network management and enhances security by not exposing the Cloud SQL instance to public IP ranges.
Dynamic IP Handling:
Applications with dynamic IP addresses can securely connect through the proxy without the need to modify authorized networks.
The proxy authenticates connections using IAM, making it ideal for environments where application IPs change frequently.
Google Data Engineer Reference:
Using Cloud SQL Auth Proxy
Cloud SQL Security Overview
Setting up the Cloud SQL Auth Proxy
By using the Cloud SQL Auth proxy, you ensure secure, authorized connections for applications with dynamic public IPs without the need for complex network configurations.
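Concretely, each application runs the Auth proxy alongside itself and connects to it on localhost; the proxy, not the application, holds the IAM credentials and the encrypted session to the instance, which is why the application's changing public IP never matters. A minimal sketch, assuming a PostgreSQL instance and the proxy's default local port; the helper and all names are hypothetical:

```python
def local_proxy_dsn(user: str, password: str, db: str, port: int = 5432) -> str:
    """DSN for a Cloud SQL connection through a locally running Auth proxy.

    The application always dials 127.0.0.1 -- its own public IP can change
    freely, because the proxy authenticates to Cloud SQL with IAM and
    encrypts the tunnel. Hypothetical helper for illustration only.
    """
    return f"postgresql://{user}:{password}@127.0.0.1:{port}/{db}"

# Example with made-up credentials:
dsn = local_proxy_dsn("app_user", "s3cret", "orders")
# -> "postgresql://app_user:s3cret@127.0.0.1:5432/orders"
```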
You currently have a single on-premises Kafka cluster in a data center in the us-east region that is responsible for ingesting messages from IoT devices globally. Because large parts of the globe have poor internet connectivity, messages sometimes batch at the edge, come in all at once, and cause a spike in load on your Kafka cluster. This is becoming difficult to manage and prohibitively expensive. What is the Google-recommended cloud-native architecture for this scenario?