Microsoft AI Business Professional AB-730 Question # 8 Topic 1 Discussion
You ask Microsoft 365 Copilot to create a report based on information from the web. You verify the response and discover that some information is fictional.
This scenario is an example of fabrication, which is commonly referred to in generative AI contexts as a hallucination. Fabrication occurs when an AI system generates information that appears credible but is factually incorrect, invented, or unsupported by verifiable sources.
According to Microsoft AI Business Professional guidance, large language models predict text based on patterns learned during training. They do not “know” facts in the human sense. As a result, when asked to generate reports using web-based information, the model may produce plausible-sounding but fictional details if sufficient grounding or reliable sources are not provided.
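The grounding idea described above can be illustrated with a toy sketch. This is not how Copilot actually works internally; the function names below are hypothetical, and the check is a deliberately naive keyword test, shown only to make the concept of "flagging claims unsupported by sources" concrete.

```python
# Toy illustration of grounding: flag generated claims that have no
# support in the retrieved source passages. Helper names are hypothetical;
# a real system would use semantic matching, not keyword overlap.

def is_grounded(claim: str, sources: list[str]) -> bool:
    """Naive grounding test: a claim counts as supported if every
    significant word in it appears somewhere in the source text."""
    words = {w for w in claim.lower().split() if len(w) > 3}
    source_text = " ".join(sources).lower()
    return all(w in source_text for w in words)

def flag_fabrications(claims: list[str], sources: list[str]) -> list[str]:
    """Return the claims that no source passage supports."""
    return [c for c in claims if not is_grounded(c, sources)]

sources = ["Contoso reported quarterly revenue of 4.2 million dollars."]
claims = [
    "Contoso reported quarterly revenue of 4.2 million dollars.",
    "Contoso opened offices in twelve countries.",  # unsupported claim
]
print(flag_fabrications(claims, sources))
```

Running this flags only the second claim, mirroring the verification step the scenario describes: checking generated statements against reliable sources before trusting them.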
The other answer choices describe different risks. Deepfake refers specifically to synthetic media such as manipulated images, audio, or video. Overreliance describes a human-behavior risk in which users trust AI outputs without verification. Prompt injection is a malicious technique designed to manipulate model behavior. Bias refers to systematic unfairness in outputs.
In this case, the presence of fictional information in the generated report aligns directly with fabrication, making option B (fabrication) the correct answer.