A technician uses AI to draft a document about the benefits of new software. While reviewing the draft, the technician notices factually incorrect information. Which term best describes this?
In AI terminology, hallucination refers to generated output that appears plausible but is factually incorrect.
From Quentin Docter – CompTIA A+ Complete Study Guide:
“AI hallucination describes output that is fluent and coherent but includes information that is entirely fabricated or inaccurate.”