Google Professional Machine Learning Engineer Exam - Topic 1 Question 94 Discussion

Actual exam question for Google's Professional Machine Learning Engineer exam
Question #: 94
Topic #: 1

You are implementing a batch inference ML pipeline in Google Cloud. The model was developed by using TensorFlow and is stored in SavedModel format in Cloud Storage. You need to apply the model to a historical dataset that is stored in a BigQuery table. You want to perform inference with minimal effort. What should you do?

Suggested Answer: B (Export the historical data to Cloud Storage in Avro format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.)

Vertex AI batch prediction is the most appropriate and efficient way to apply a pre-trained TensorFlow model stored in SavedModel format to a large historical dataset in batch.

A Vertex AI batch prediction job reads its input from Cloud Storage: you first export the historical BigQuery data in a suitable format (such as Avro or CSV), and the job then applies the SavedModel, which already resides in Cloud Storage, to the exported files.

Avro is recommended for large datasets because it is a compact, schema-aware binary format that is efficient to read and write in Google Cloud, which is why option B is correct.

Option A suggests using BigQuery ML for inference. BigQuery ML can import TensorFlow SavedModel models, but the import is subject to restrictions (for example, limits on model size and on the TensorFlow operations supported), so it cannot be relied on for an arbitrary model. That makes option A the weaker choice for this task.

Option C (exporting to CSV) would also work, but CSV is less efficient than Avro for a large dataset.
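The export-then-predict flow described above can be sketched as follows. This is a minimal, hypothetical illustration, not an official solution: the project, bucket, and model IDs are placeholders, the request body follows the REST shape of the Vertex AI `BatchPredictionJob` resource, and whether `avro` is accepted as `instancesFormat` depends on the model's serving container, so verify the supported values in the Vertex AI documentation before relying on it.

```python
# Sketch of the Vertex AI batch prediction setup for option B.
# Export step (shell, placeholder names):
#   bq extract --destination_format=AVRO \
#       'my-project:dataset.history' 'gs://my-bucket/exports/history-*.avro'

def batch_prediction_job_body(project: str, region: str, model_id: str,
                              input_uris: list[str], output_prefix: str,
                              instances_format: str = "avro") -> dict:
    """Assemble a BatchPredictionJob request body (REST resource shape).

    instances_format: check the Vertex AI docs for the formats your
    model's serving container actually accepts.
    """
    return {
        "displayName": "historical-batch-inference",
        "model": f"projects/{project}/locations/{region}/models/{model_id}",
        "inputConfig": {
            "instancesFormat": instances_format,
            "gcsSource": {"uris": input_uris},
        },
        "outputConfig": {
            "predictionsFormat": "jsonl",
            "gcsDestination": {"outputUriPrefix": output_prefix},
        },
    }

# Hypothetical usage: all resource names below are placeholders.
body = batch_prediction_job_body(
    project="my-project", region="us-central1", model_id="1234567890",
    input_uris=["gs://my-bucket/exports/history-*.avro"],
    output_prefix="gs://my-bucket/predictions/",
)
```

The resulting `body` would be posted to the `batchPredictionJobs.create` endpoint (or passed through the `google-cloud-aiplatform` SDK); the job runs to completion without you managing any serving infrastructure, which is what makes this the minimal-effort path.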


Contribute your Thoughts:

Rozella
6 days ago
I think B is better for handling larger datasets.
Letha
12 days ago
A is the easiest way to integrate TensorFlow with BigQuery ML!
Mozelle
17 days ago
I feel like option D might be overkill for this scenario since we just need batch inference, but I could be wrong about that.
Vanesa
23 days ago
I’m a bit confused about the formats. Is Avro better than CSV for batch predictions in Vertex AI? I can't recall the specifics.
Solange
28 days ago
I remember practicing with Vertex AI, and I feel like options B and C are similar. Exporting data to Cloud Storage seems like a common step.
Delbert
1 month ago
I think option A sounds familiar, but I'm not entirely sure if BigQuery ML can directly import a TensorFlow model like that.
Dyan
1 month ago
I'm leaning towards option C - exporting the data to Cloud Storage in CSV format and using a Vertex AI batch prediction job. That seems like a good balance of leveraging the power of Vertex AI while keeping the setup relatively simple. I feel pretty confident about this approach.
Genevive
1 month ago
Option D looks interesting - using a Vertex AI endpoint to get predictions directly from the BigQuery data. That could be a really efficient way to do this. I'll have to research how to set up a Vertex AI endpoint, but it might be worth the effort.
Rodney
1 month ago
Hmm, I'm a bit unsure about this one. I'm not super familiar with Vertex AI, so I'm not sure if that's the easiest approach. Maybe I should consider the BigQuery ML option (A) since I've used that before. I'll have to think this through a bit more.
Alline
1 month ago
This seems like a straightforward question. I think I'll go with option B - exporting the data to Cloud Storage and using a Vertex AI batch prediction job. That way, I can leverage the power of Vertex AI without having to do too much manual setup.
Truman
11 months ago
I prefer option C. Exporting the data to Cloud Storage in CSV format and configuring a Vertex AI batch prediction job seems like a simple solution.
Lavonne
11 months ago
I don't know, options A and D both sound like they involve a lot of moving parts. Why not just go with the straightforward Cloud Storage export and Vertex AI batch prediction? Can't beat the classics!
Vallie
10 months ago
I agree, keeping it simple with Cloud Storage export and Vertex AI batch prediction is the way to go.
Denae
10 months ago
Yeah, that does sound like a straightforward approach. Option C could work too with CSV format.
Keneth
10 months ago
Option B does seem like a simpler solution. Just export the data to Cloud Storage and use Vertex AI batch prediction.
Deeann
10 months ago
Let's go with the classic approach of exporting to Cloud Storage and using Vertex AI batch prediction. It's reliable.
Willetta
10 months ago
Yeah, Option C also involves exporting to Cloud Storage and using Vertex AI batch prediction. It's a solid choice.
Melynda
10 months ago
I agree, keeping it simple with Cloud Storage export and Vertex AI batch prediction is the way to go.
Tonja
10 months ago
Option B seems like the best choice. Export data to Cloud Storage and use Vertex AI batch prediction.
Gene
11 months ago
I'm not sure about option D. I think option B could also work well if we export the historical data to Cloud Storage in Avro format.
Stefania
11 months ago
Ha, BigQuery ML and TensorFlow in the same sentence? That's a recipe for a headache if I ever saw one. Option B or C for me, keep it simple!
Anissa
11 months ago
Hmm, I'm not sure about option A. Trying to import the TensorFlow model into BigQuery ML seems like it might be more trouble than it's worth. I'd probably go with option C or D.
Jesusita
10 months ago
Yeah, I think option C or D would be easier to implement for the batch inference ML pipeline.
Shasta
11 months ago
I agree, option A does seem like it could be complicated.
Maile
11 months ago
I'm leaning towards option D. Deploying a Vertex AI endpoint and using it to get predictions directly from the BigQuery data sounds like the easiest and most streamlined approach.
Jaime
10 months ago
Let's go with option D then. Deploying a Vertex AI endpoint seems like the easiest way to get predictions.
Sabine
10 months ago
It definitely sounds like the most streamlined approach. Option D is the way to go.
Theron
11 months ago
I agree, deploying an endpoint and getting predictions directly from BigQuery is the way to go.
Elliott
11 months ago
Option D seems like the best choice. Using a Vertex AI endpoint for predictions is efficient.
Maybelle
11 months ago
I agree with Whitney. Option D sounds like the most straightforward approach to apply the model to the historical dataset.
Simona
11 months ago
Option B seems like the most efficient choice here. Exporting the data to Cloud Storage in Avro format and then using Vertex AI batch prediction is a straightforward way to apply the TensorFlow model without having to do too much manual setup.
Cristina
10 months ago
I agree. It's always best to choose the most efficient option when working with ML pipelines.
Graham
11 months ago
Definitely, using Vertex AI batch prediction will save us a lot of time and effort.
Jamika
11 months ago
That sounds like a good plan. It should make the process easier.
Staci
11 months ago
B) Export the historical data to Cloud Storage in Avro format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.
Whitney
11 months ago
I think option D is the best choice. It seems like the most efficient way to get predictions from the historical data.
