
Amazon Web Services AWS Certified AI Practitioner Exam AIF-C01 Question # 70 Topic 8 Discussion

AIF-C01 Exam Topic 8 Question 70 Discussion:
Question #: 70
Topic #: 8

A company has created a custom model by fine-tuning an existing large language model (LLM) from Amazon Bedrock. The company wants to deploy the model to production and use it to handle a steady rate of requests each minute.

Which solution meets these requirements MOST cost-effectively?


A. Deploy the model by using an Amazon EC2 compute optimized instance.

B. Use the model with on-demand throughput on Amazon Bedrock.

C. Store the model in Amazon S3 and host the model by using AWS Lambda.

D. Purchase Provisioned Throughput for the model on Amazon Bedrock.



Contribute your Thoughts:


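A custom model produced by fine-tuning in Amazon Bedrock generally has to be served through Provisioned Throughput, since on-demand throughput is not offered for fine-tuned models, and a steady, predictable request rate is exactly the workload that Provisioned Throughput pricing is designed for. That is why option D is the usual answer here. Below is a minimal boto3 sketch of what purchasing and using Provisioned Throughput looks like; the region, model ARN, provisioned-model name, and request body are hypothetical placeholders, not values from the question.

```python
# Sketch only: purchase Provisioned Throughput for a fine-tuned Bedrock model
# and invoke it. All names and ARNs below are hypothetical placeholders.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

# Purchase provisioned capacity for the custom model. Omitting
# commitmentDuration selects no-commitment (hourly) pricing; a one- or
# six-month commitment lowers the hourly rate.
response = bedrock.create_provisioned_model_throughput(
    provisionedModelName="my-fine-tuned-model-pt",  # hypothetical name
    modelId="arn:aws:bedrock:us-east-1:123456789012:custom-model/example",  # placeholder ARN
    modelUnits=1,
    commitmentDuration="OneMonth",
)
provisioned_model_arn = response["provisionedModelArn"]

# Invoke the custom model through the runtime client, passing the
# provisioned model ARN as the modelId.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")
result = runtime.invoke_model(
    modelId=provisioned_model_arn,
    contentType="application/json",
    accept="application/json",
    body='{"inputText": "Hello"}',  # request format depends on the base model
)
print(result["body"].read())
```

For a steady production load, a committed term is typically the most cost-effective choice, whereas the no-commitment hourly option suits shorter evaluations of the provisioned deployment.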