The correct answer is D because Amazon OpenSearch Service supports k-Nearest Neighbor (k-NN) vector similarity search, which is required for semantic search tasks such as matching natural language queries to image embeddings.
From AWS documentation:
"Amazon OpenSearch Service supports k-NN search, which allows you to run efficient similarity searches on large-scale datasets using vector embeddings generated by models. This enables applications like natural language-based image search and personalized recommendations."
In this use case, images can be encoded into vector embeddings using foundation models (for example, via Amazon Bedrock or Amazon SageMaker), and OpenSearch Service can index those embeddings and retrieve results based on vector similarity, as sketched below.
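As a rough illustration only, the sketch below uses the opensearch-py Python client to create a k-NN-enabled index, store an image embedding, and run a vector similarity query. The endpoint, credentials, index name, field names, and the 1024-dimension vector size are placeholder assumptions, and the embeddings themselves are assumed to come from a separate multimodal embedding model call (for example, through Amazon Bedrock or SageMaker), which is not shown here.

```python
# Minimal sketch (not an official solution): index an image embedding in an
# OpenSearch k-NN index and query it with a text-query embedding.
from opensearchpy import OpenSearch

client = OpenSearch(
    hosts=[{"host": "my-domain.us-east-1.es.amazonaws.com", "port": 443}],  # placeholder endpoint
    http_auth=("user", "password"),  # placeholder credentials
    use_ssl=True,
)

# Create an index with the k-NN plugin enabled and a knn_vector field.
client.indices.create(
    index="image-search",
    body={
        "settings": {"index": {"knn": True}},
        "mappings": {
            "properties": {
                "image_id": {"type": "keyword"},
                "image_vector": {
                    "type": "knn_vector",
                    "dimension": 1024,  # must match the embedding model's output size
                    "method": {"name": "hnsw", "space_type": "cosinesimil", "engine": "nmslib"},
                },
            }
        },
    },
)

# Index one document; in practice image_embedding would come from a
# multimodal embedding model (e.g., invoked via Amazon Bedrock or SageMaker).
image_embedding = [0.0] * 1024  # placeholder vector
client.index(
    index="image-search",
    body={"image_id": "img-001", "image_vector": image_embedding},
    refresh=True,
)

# Query: embed the natural language query with the same model, then run a
# k-NN similarity search against the stored image vectors.
query_embedding = [0.0] * 1024  # placeholder vector
results = client.search(
    index="image-search",
    body={
        "size": 5,
        "query": {"knn": {"image_vector": {"vector": query_embedding, "k": 5}}},
    },
)
for hit in results["hits"]["hits"]:
    print(hit["_source"]["image_id"], hit["_score"])
```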
Explanation of other options:
A. Amazon Comprehend is for text-based NLP tasks (entity recognition, sentiment analysis, etc.) and does not provide vector similarity search functionality.
B. Amazon Personalize is for user-item recommendations and personalization, not vector-based semantic search.
C. Amazon Polly is a text-to-speech service and not related to image search or vector databases.
Referenced AWS AI/ML Documents and Study Guides:
Amazon OpenSearch Service Documentation – k-NN and Vector Search
AWS ML Specialty Study Guide – Semantic Search and Vector Indexing
AWS Generative AI Best Practices – Embeddings and Vector Databases