
Google Professional Machine Learning Engineer Exam - Topic 1 Question 76 Discussion

Actual exam question for Google's Professional Machine Learning Engineer exam
Question #: 76
Topic #: 1

You work for a company that sells corporate electronic products to thousands of businesses worldwide. Your company stores historical customer data in BigQuery. You need to build a model that predicts customer lifetime value over the next three years. You want to use the simplest approach to build the model. What should you do?

A) Access BigQuery Studio in the Google Cloud console. Run the create model statement in the SQL editor to create an ARIMA model.
B) Create a Vertex AI Workbench notebook. Use IPython magic to run the create model statement to create an ARIMA model.
C) Access BigQuery Studio in the Google Cloud console. Run the create model statement in the SQL editor to create an AutoML regression model.
D) Create a Vertex AI Workbench notebook. Use IPython magic to run the create model statement to create an AutoML regression model.

Suggested Answer: D

Predicting customer lifetime value over the next three years from historical customer data is a regression problem: the model must learn a continuous target from per-customer tabular features. The simplest way to build such a model on data that already lives in BigQuery is to train it in place with BigQuery ML, which creates an AutoML regression model from a single CREATE MODEL statement (model_type = 'AUTOML_REGRESSOR'). AutoML automatically handles feature preprocessing, model selection, and hyperparameter tuning, so no custom training code and no data movement out of BigQuery are required. Options A and B are incorrect because ARIMA is a univariate time-series forecasting technique: it extrapolates a single metric from its own history and cannot learn from per-customer attributes, so it is a poor fit for lifetime-value prediction. Option C would train the same kind of model; the suggested answer instead runs the CREATE MODEL statement from a Vertex AI Workbench notebook with IPython magic, which keeps data exploration, training, evaluation (ML.EVALUATE), and prediction (ML.PREDICT) together in one environment.

Reference:

The CREATE MODEL statement (BigQuery ML documentation)

ML.PREDICT function (BigQuery ML documentation)
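As a concrete illustration of the AutoML regression approach discussed in this question, the following BigQuery ML sketch trains a lifetime-value model and then queries it. All project, dataset, table, and column names below are hypothetical placeholders, not details given in the question.

```sql
-- Train an AutoML regression model in place on BigQuery data.
-- (All identifiers below are hypothetical examples.)
CREATE OR REPLACE MODEL `my_project.crm.clv_model`
OPTIONS (
  model_type = 'AUTOML_REGRESSOR',       -- AutoML picks and tunes the model
  input_label_cols = ['three_year_clv'],
  budget_hours = 1.0                     -- cap on AutoML training time
) AS
SELECT
  customer_region,
  industry,
  orders_last_year,
  avg_order_value,
  three_year_clv                         -- label: observed 3-year lifetime value
FROM `my_project.crm.customer_history`;

-- Batch predictions for current customers.
SELECT *
FROM ML.PREDICT(
  MODEL `my_project.crm.clv_model`,
  (SELECT * FROM `my_project.crm.current_customers`));
```

In BigQuery Studio (option C) these statements run as-is in the SQL editor; in a Vertex AI Workbench notebook (option D) the same statements can be run in a cell prefixed with the %%bigquery IPython magic after loading the extension with %load_ext google.cloud.bigquery.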


Contribute your Thoughts:

Lisbeth
3 months ago
Surprised that ARIMA is even mentioned here!
upvoted 0 times
...
Annita
3 months ago
D seems more complex than necessary.
upvoted 0 times
...
Kayleigh
3 months ago
Wait, why not just use AutoML directly?
upvoted 0 times
...
Jesusa
4 months ago
Totally agree, C makes the most sense!
upvoted 0 times
...
Quinn
4 months ago
I think C is the simplest option.
upvoted 0 times
...
Geoffrey
4 months ago
I’m leaning towards option C, but I have a nagging feeling that I should double-check the differences between ARIMA and AutoML for this scenario.
upvoted 0 times
...
Tom
4 months ago
I feel like the ARIMA model might be more complex than what we need for predicting customer lifetime value. Maybe the AutoML regression is the way to go?
upvoted 0 times
...
Rozella
4 months ago
I remember practicing with Vertex AI Workbench, but I can't recall if it was better for ARIMA or AutoML models.
upvoted 0 times
...
Cecily
5 months ago
I think using BigQuery Studio for the AutoML regression model sounds like a straightforward approach, but I'm not entirely sure if it's the simplest option.
upvoted 0 times
...
Nada
5 months ago
Okay, this is a good one. I'm leaning towards option A - using BigQuery Studio to create an ARIMA model. That seems like the simplest approach, and ARIMA is a pretty standard time series forecasting technique, so it should work well for predicting customer lifetime value.
upvoted 0 times
...
Juliann
5 months ago
I think option D might be the way to go. Using a Vertex AI Workbench notebook would give me more flexibility and control over the model-building process. Plus, I'm more comfortable working in a notebook environment than just using the SQL editor.
upvoted 0 times
...
Margot
5 months ago
Hmm, I'm a bit unsure about this one. I'm not super familiar with the different model types, so I'll need to do some research to figure out which one is the best fit for predicting customer lifetime value. Maybe I'll start by looking into the differences between ARIMA and AutoML regression.
upvoted 0 times
...
Tabetha
5 months ago
This seems pretty straightforward. I'd go with option C - using BigQuery Studio to create an AutoML regression model. That seems like the simplest approach to build the model.
upvoted 0 times
...
Lon
9 months ago
I bet the company's CTO is going to be impressed when they see 'AutoML' in the answer. Might as well throw in some 'Vertex AI' for good measure, right?
upvoted 0 times
Bernardine
8 months ago
D) Create a Vertex AI Workbench notebook. Use IPython magic to run the create model statement to create an AutoML regression model.
upvoted 0 times
...
Carey
8 months ago
C) Access BigQuery Studio in the Google Cloud console. Run the create model statement in the SQL editor to create an AutoML regression model.
upvoted 0 times
...
Shelia
9 months ago
B) Create a Vertex AI Workbench notebook. Use IPython magic to run the create model statement to create an ARIMA model.
upvoted 0 times
...
Yong
9 months ago
A) Access BigQuery Studio in the Google Cloud console. Run the create model statement in the SQL editor to create an ARIMA model.
upvoted 0 times
...
...
Huey
9 months ago
D) Ooh, a combination of Vertex AI and AutoML regression! Now we're talking some high-tech stuff. I better brush up on my Python magic before tackling this one.
upvoted 0 times
Francine
8 months ago
D) Create a Vertex AI Workbench notebook. Use IPython magic to run the create model statement to create an AutoML regression model.
upvoted 0 times
...
Lynelle
8 months ago
B) Create a Vertex AI Workbench notebook. Use IPython magic to run the create model statement to create an ARIMA model.
upvoted 0 times
...
Lovetta
9 months ago
A) Access BigQuery Studio in the Google Cloud console. Run the create model statement in the SQL editor to create an ARIMA model.
upvoted 0 times
...
...
Susana
10 months ago
C) AutoML regression, huh? Sounds like a good compromise between simplicity and potential complexity. I'm intrigued, but I hope it doesn't overfit the data.
upvoted 0 times
Shaun
9 months ago
B) Create a Vertex AI Workbench notebook. Use IPython magic to run the create model statement to create an ARIMA model.
upvoted 0 times
...
Yoko
9 months ago
A) Access BigQuery Studio in the Google Cloud console. Run the create model statement in the SQL editor to create an ARIMA model.
upvoted 0 times
...
...
Pa
10 months ago
B) Vertex AI Workbench, eh? Looks like we're going for a bit of a fancier approach. But hey, if it gets the job done, I'm all for it.
upvoted 0 times
Brianne
10 months ago
D) Create a Vertex AI Workbench notebook. Use IPython magic to run the create model statement to create an AutoML regression model.
upvoted 0 times
...
Jeff
10 months ago
A) Access BigQuery Studio in the Google Cloud console. Run the create model statement in the SQL editor to create an ARIMA model.
upvoted 0 times
...
Carylon
10 months ago
A) Access BigQuery Studio in the Google Cloud console. Run the create model statement in the SQL editor to create an ARIMA model.
upvoted 0 times
...
...
Caitlin
10 months ago
That's a good point, Dong. AutoML regression models might provide more accurate predictions in this case.
upvoted 0 times
...
Dong
10 months ago
I disagree, I believe option D is better. AutoML regression models can handle complex relationships in the data.
upvoted 0 times
...
Gaston
10 months ago
A) Ah, the classic ARIMA approach! Efficient and straightforward, just the way I like it. Although, I wonder if the customer data might have some hidden complexities that could trip up the model.
upvoted 0 times
Olive
9 months ago
B) I agree, the ARIMA model is a good starting point. We can always adjust it if needed.
upvoted 0 times
...
Annette
9 months ago
A) Access BigQuery Studio in the Google Cloud console. Run the create model statement in the SQL editor to create an ARIMA model.
upvoted 0 times
...
...
Caitlin
11 months ago
I think option A is the best choice. ARIMA models are simple and effective for time series data.
upvoted 0 times
...
