You are implementing a batch inference ML pipeline in Google Cloud. The model was developed by using TensorFlow and is stored in SavedModel format in Cloud Storage. You need to apply the model to a historical dataset that is stored in a BigQuery table. You want to perform inference with minimal effort. What should you do?
Vertex AI batch prediction is the most appropriate and efficient way to apply a pre-trained model, such as a TensorFlow SavedModel, to a large historical dataset: the service provisions the serving resources, runs inference over the whole dataset, and releases the resources when the job finishes.
The workflow is to export the historical data from BigQuery to Cloud Storage in a suitable file format (such as Avro or CSV), register the SavedModel that is already in Cloud Storage as a Vertex AI model, and then point a batch prediction job at the exported files.
Avro is recommended for large datasets because it is a compact binary format that preserves the table schema and types and exports efficiently from BigQuery, which is why option B is correct.
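For concreteness, here is a minimal sketch of that workflow using the BigQuery and Vertex AI Python clients. The project, bucket, table, and container image names are hypothetical placeholders, and whether a batch prediction job accepts Avro input directly depends on the model type and serving container, so treat the `instances_format` value as an assumption to verify (fall back to a CSV export if needed).

```python
# Minimal sketch, not a drop-in implementation. Project, bucket, table,
# and image names below are hypothetical placeholders.
from google.cloud import aiplatform, bigquery

PROJECT = "my-project"                                      # placeholder
REGION = "us-central1"
SOURCE_TABLE = "my-project.analytics.historical_events"     # placeholder BigQuery table
EXPORT_URIS = "gs://my-bucket/exports/history-*.avro"       # sharded export target
SAVED_MODEL_DIR = "gs://my-bucket/models/my_saved_model/"   # SavedModel location

# 1) Export the historical BigQuery table to Cloud Storage as Avro.
bq = bigquery.Client(project=PROJECT)
extract_job = bq.extract_table(
    SOURCE_TABLE,
    EXPORT_URIS,
    job_config=bigquery.ExtractJobConfig(
        destination_format=bigquery.DestinationFormat.AVRO
    ),
)
extract_job.result()  # block until the export finishes

# 2) Register the SavedModel as a Vertex AI model using a prebuilt
#    TensorFlow serving image (check the current list of prebuilt images
#    for the runtime version that matches your model).
aiplatform.init(project=PROJECT, location=REGION)
model = aiplatform.Model.upload(
    display_name="historical-inference-model",
    artifact_uri=SAVED_MODEL_DIR,
    serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-12:latest",
)

# 3) Run a batch prediction job over the exported files.
#    ASSUMPTION: "avro" as an instances_format is not guaranteed for every
#    model type; verify against the current Vertex AI documentation and
#    fall back to a CSV export plus instances_format="csv" if rejected.
batch_job = model.batch_predict(
    job_display_name="historical-batch-inference",
    gcs_source=EXPORT_URIS,
    gcs_destination_prefix="gs://my-bucket/predictions/",
    instances_format="avro",
    machine_type="n1-standard-4",
)
# batch_predict is synchronous by default, so the job has completed here;
# prediction files land under gs://my-bucket/predictions/.
```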
Option A suggests using BigQuery ML for inference. BigQuery ML can import TensorFlow SavedModels with CREATE MODEL, but imported models are subject to restrictions (for example, limits on model size and on the operations and input/output types they may use), so it cannot run arbitrary TensorFlow models directly within BigQuery. Hence, BigQuery ML is not the right option for this particular task.
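For comparison, the BigQuery ML path that option A implies would look roughly like the sketch below (dataset, table, and bucket names are hypothetical); it only succeeds when the SavedModel stays within BigQuery ML's imported-model limits.

```python
# Rough sketch of the BigQuery ML import path, for comparison only.
# Dataset, table, and bucket names are hypothetical placeholders.
from google.cloud import bigquery

bq = bigquery.Client(project="my-project")

# Import the SavedModel into BigQuery ML. This fails if the model exceeds
# the imported-model restrictions (e.g. unsupported ops or size limits).
bq.query(
    """
    CREATE OR REPLACE MODEL `analytics.imported_tf_model`
    OPTIONS (MODEL_TYPE = 'TENSORFLOW',
             MODEL_PATH = 'gs://my-bucket/models/my_saved_model/*')
    """
).result()

# Score the historical table in place with ML.PREDICT.
bq.query(
    """
    SELECT *
    FROM ML.PREDICT(MODEL `analytics.imported_tf_model`,
                    TABLE `analytics.historical_events`)
    """
).result()
```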
Option C (exporting to CSV) would also work, but CSV is a plain-text format that drops type information and produces larger, slower exports, so it is less efficient than Avro.