Welcome to Pass4Success


CertNexus Exam AIP-210 Topic 6 Question 24 Discussion

Actual exam question for CertNexus's AIP-210 exam
Question #: 24
Topic #: 6

A company is developing a merchandise sales application. The product team uses training data to teach the AI model that predicts sales, and discovers emergent bias. What caused the biased results?

Suggested Answer: B

Workflow design patterns for machine learning pipelines are common solutions to recurring problems in building and managing machine learning workflows. One of these patterns is to represent a pipeline with a directed acyclic graph (DAG), which is a graph that consists of nodes and edges, where each node represents a step or task in the pipeline, and each edge represents a dependency or order between the tasks. A DAG has no cycles, meaning there is no way to start at one node and return to it by following the edges. A DAG can help visualize and organize the pipeline, as well as facilitate parallel execution, fault tolerance, and reproducibility.
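The DAG pattern described above can be sketched as a minimal pipeline runner. This is an illustrative example only; the task names (`load_data`, `clean_data`, etc.) are hypothetical, and Python's standard-library `graphlib` handles the topological ordering and cycle detection:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task name maps to the set of tasks it depends on.
# Edges point from a dependency to the task that needs it.
pipeline = {
    "load_data": set(),
    "clean_data": {"load_data"},
    "train_model": {"clean_data"},
    "evaluate": {"train_model"},
}

def run_pipeline(dag):
    """Execute tasks in dependency order.

    TopologicalSorter raises graphlib.CycleError if the graph contains
    a cycle, which is exactly the acyclicity guarantee a DAG pipeline needs.
    """
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")  # a real runner would invoke the task's function here
    return order

run_pipeline(pipeline)
```

Because `static_order()` only guarantees that every task appears after its dependencies, independent branches of a larger DAG could also be dispatched in parallel, which is the fault-tolerance and parallelism benefit mentioned above.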


Contribute your Thoughts:

Dottie
1 month ago
I'm going to have to go with C. Flawed expectations? Sounds like the team was playing a game of 'Guess the Bias' instead of 'Predict the Sales'.
upvoted 0 times
Nguyet
7 days ago
C) The team set flawed expectations when training the model.
upvoted 0 times
Jenelle
8 days ago
B) The application was migrated from on-premise to a public cloud.
upvoted 0 times
Bettyann
17 days ago
A) The AI model was trained in winter and applied in summer.
upvoted 0 times
Kelvin
1 month ago
Nah, I'm sticking with option A. Training in winter and applying in summer? That's a recipe for disaster. Looks like the team needed to invest in a seasonal wardrobe for their AI model.
upvoted 0 times
Janessa
1 month ago
Oh, I'm feeling lucky with B. Migrating to the cloud? That's bound to introduce all kinds of unexpected biases. Gotta love technology, am I right?
upvoted 0 times
Garry
2 days ago
B) The application was migrated from on-premise to a public cloud.
upvoted 0 times
Brandon
20 days ago
A) The AI model was trained in winter and applied in summer.
upvoted 0 times
Lashanda
1 month ago
I don't know, D seems like the obvious choice to me. Inaccurate training data is a surefire way to get biased predictions. Maybe the team should have used a crystal ball instead?
upvoted 0 times
Valentine
1 month ago
Hmm, I'm gonna go with option C. Flawed expectations when training the model could definitely lead to biased results. Rookie mistake, but it happens.
upvoted 0 times
Gilma
2 months ago
Maybe the team should have set better expectations during training to avoid bias.
upvoted 0 times
Lera
2 months ago
I agree with Ernest, using inaccurate data can definitely lead to biased results.
upvoted 0 times
Ernest
2 months ago
I think the biased results were caused by inaccurate training data.
upvoted 0 times
