
Microsoft AB-730 Exam - Topic 3 Question 4 Discussion

Actual exam question for Microsoft's AB-730 exam
Question #: 4
Topic #: 3
[All AB-730 Questions]

You ask Microsoft 365 Copilot to create a report based on information from the web. You verify the response and discover that some information is fictional.

What is this an example of?

Suggested Answer: B

This scenario is an example of fabrication, which is commonly referred to in generative AI contexts as a hallucination. Fabrication occurs when an AI system generates information that appears credible but is factually incorrect, invented, or unsupported by verifiable sources.

According to Microsoft AI Business Professional guidance, large language models predict text based on patterns learned during training. They do not "know" facts in the human sense. As a result, when asked to generate reports using web-based information, the model may produce plausible-sounding but fictional details if sufficient grounding or reliable sources are not provided.
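The idea of "grounding" can be sketched with a toy check: compare each sentence of a generated report against the source passages it was supposed to draw from, and flag anything with no close match as potentially fabricated. This is purely illustrative and not part of Copilot or any Microsoft API; the threshold and similarity measure are arbitrary assumptions.

```python
# Toy grounding check (illustrative only, not a Copilot feature):
# flag report sentences that have no close match in the source material.
from difflib import SequenceMatcher

def is_grounded(sentence: str, sources: list[str], threshold: float = 0.6) -> bool:
    """Return True if the sentence closely resembles any source passage."""
    return any(
        SequenceMatcher(None, sentence.lower(), src.lower()).ratio() >= threshold
        for src in sources
    )

# Hypothetical source material and generated report.
sources = ["Contoso reported revenue of $1.2M in Q3."]
report = [
    "Contoso reported revenue of $1.2M in Q3.",  # supported by a source
    "Contoso was founded on Mars in 1802.",      # fabricated detail
]

# Sentences with no supporting source are candidates for fabrication.
suspect = [s for s in report if not is_grounded(s, sources)]
```

A real system would use semantic retrieval rather than string similarity, but the principle is the same: claims without supporting evidence are the ones a reviewer should verify, exactly as the scenario in this question requires.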

The other answer choices describe different AI risks. A deepfake is synthetic media, such as manipulated images, audio, or video. Overreliance is a human-behavior risk in which users trust AI outputs without verification. Prompt injection is a malicious technique for manipulating model behavior. Bias refers to systematic unfairness in outputs.

In this case, the presence of fictional information in the generated report directly aligns with fabrication, making option B the correct answer.

