To prevent data poisoning, especially in systems that rely on publicly submitted content such as product reviews, authenticating input sources is critical. The AAIA™ Study Guide specifies that authenticated input sources help ensure data integrity and traceability, reducing the likelihood of adversarial or malicious contributions.
“Limiting review input to authenticated users restricts unauthorized actors—such as competitors or bots—from submitting biased or harmful data. This control protects model training and outputs from being manipulated.”
Options A and C address technical data handling but not source authenticity. Option B may inadvertently incentivize biased reviews. Option D provides a robust control against poisoning.
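The control described above can be illustrated with a minimal sketch. All names here (`VALID_SESSIONS`, `submit_review`) are hypothetical and for illustration only; a real system would use a proper identity provider, rate limiting, and audit logging rather than an in-memory token map.

```python
# Minimal sketch (hypothetical names): accept product reviews only from
# authenticated users, so anonymous bots or competitors cannot inject
# poisoned data into the review corpus used for model training.

VALID_SESSIONS = {"tok-123": "alice"}  # toy session store: token -> user id

def submit_review(session_token, review_text, accepted_reviews):
    """Append a review only when the session token maps to a known user."""
    user = VALID_SESSIONS.get(session_token)
    if user is None:
        # Unauthenticated submission: reject it. In practice this event
        # would also be logged for adversarial-activity monitoring.
        return False
    accepted_reviews.append({"user": user, "text": review_text})
    return True
```

For example, a submission with a valid token is accepted, while one with an unknown token is dropped before it can reach the training pipeline.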
[Reference: ISACA Advanced in AI Audit™ (AAIA™) Study Guide, Section: “AI Fundamentals and Technologies,” Subsection: “Input Data Security and Adversarial Risk Mitigation”]