Welcome to Pass4Success


Google Exam Professional-Machine-Learning-Engineer Topic 5 Question 75 Discussion

Actual exam question from Google's Professional Machine Learning Engineer exam
Question #: 75
Topic #: 5
[All Google Professional Machine Learning Engineer Questions]

You have a custom job that runs on Vertex AI on a weekly basis. The job is implemented using a proprietary ML workflow that produces datasets, models, and custom artifacts, and sends them to a Cloud Storage bucket. Many different versions of the datasets and models have been created. Due to compliance requirements, your company needs to track which model was used for making a particular prediction and needs access to the artifacts for each model. How should you configure your workflows to meet these requirements?

Suggested Answer: D

Contribute your Thoughts:

Karma
22 hours ago
That seems like a comprehensive way to ensure compliance requirements are met.
upvoted 0 times
...
Major
2 days ago
C) Use the Vertex AI Metadata API inside the custom job to create context, execution, and artifacts for each model, and use events to link them together.
upvoted 0 times
...
Marica
3 days ago
That could work too, but it might not provide as much detailed tracking.
upvoted 0 times
...
Rose
4 days ago
B) Create a Vertex AI experiment, and enable autologging inside the custom job.
upvoted 0 times
...
Karma
5 days ago
That sounds like a good option for tracking the models and datasets.
upvoted 0 times
Mabel
19 hours ago
A) Configure a TensorFlow Extended (TFX) ML Metadata database, and use the ML Metadata API.
upvoted 0 times
...
...
Marica
6 days ago
A) Configure a TensorFlow Extended (TFX) ML Metadata database, and use the ML Metadata API.
upvoted 0 times
...
