
Pass the Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) questions and answers with CertsForce

Viewing page 3 out of 14 pages
Viewing questions 41-60
Question # 41:

A company needs to design a solution to process videos that users upload to an Amazon S3 bucket. Each video file is approximately 1 GB in size and takes approximately 20 minutes to process. During peak hours, the company expects to process approximately 100 simultaneous uploads. The video file processing is stateless and can run in parallel as soon as the video files arrive in the S3 bucket.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Use an AWS Lambda function to process each video. Split the video files into chunks, and use AWS Step Functions to orchestrate multiple processing steps.


B.

Use an Amazon EKS cluster with AWS Fargate profiles to deploy one container for each uploaded video. Configure an Amazon EventBridge rule to invoke the cluster when a user uploads a video.


C.

Use Amazon EC2 On-Demand Instances in an Auto Scaling group to process each file. Configure the Auto Scaling policy to increase the number of instances based on the number of files that the application needs to process.


D.

Use an Amazon ECS cluster with the AWS Fargate launch type. Use Fargate Spot capacity to run one container task for each uploaded video. Configure an Amazon EventBridge rule to invoke the cluster when a user uploads a video.
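As a study aid, a Fargate Spot task launch of the kind option D describes is requested by setting a capacity provider strategy when running the task. The fragment below shows the shape of the parameters accepted by the ECS RunTask API; the cluster and task definition names are hypothetical:

```json
{
  "cluster": "video-processing",
  "taskDefinition": "video-processor:1",
  "count": 1,
  "capacityProviderStrategy": [
    { "capacityProvider": "FARGATE_SPOT", "weight": 1 }
  ]
}
```

Because Fargate Spot capacity can be reclaimed with a two-minute warning, it suits stateless, interruptible jobs like the one in this scenario.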


Expert Solution
Question # 42:

A company is building an application that runs on several Linux-based containers in Amazon ECS. The containers must have shared access to log files and configuration data. The application requires a POSIX-compliant file system that provides high availability and scalability.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Configure an Amazon EFS file system with elastic throughput.


B.

Deploy an Amazon S3 bucket and configure shared access and object-level permissions.


C.

Configure an Amazon FSx for Lustre file system with the Persistent 2 deployment type.


D.

Attach an Amazon EBS volume to an Amazon EC2 instance to share access across containers.


Expert Solution
Question # 43:

A company needs a solution to integrate transaction data from several Amazon DynamoDB tables into an existing Amazon Redshift data warehouse. The solution must maintain the provisioned throughput of DynamoDB.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an Amazon S3 bucket. Configure DynamoDB to export to the bucket on a regular schedule. Use an Amazon Redshift COPY command to read from the S3 bucket.


B.

Use an Amazon Redshift COPY command to read directly from each DynamoDB table.


C.

Create an Amazon S3 bucket. Configure an AWS Lambda function to read from the DynamoDB tables and write to the S3 bucket on a regular schedule. Use Amazon Redshift Spectrum to access the data in the S3 bucket.


D.

Use Amazon Athena Federated Query with a DynamoDB connector and an Amazon Redshift connector to read directly from the DynamoDB tables.


Expert Solution
Question # 44:

A company uses Amazon API Gateway to manage its REST APIs that third-party service providers access. The company must protect the REST APIs from SQL injection and cross-site scripting attacks.

What is the MOST operationally efficient solution that meets these requirements?

Options:

A.

Configure AWS Shield.


B.

Configure AWS WAF.


C.

Set up API Gateway with an Amazon CloudFront distribution. Configure AWS Shield in CloudFront.


D.

Set up API Gateway with an Amazon CloudFront distribution. Configure AWS WAF in CloudFront.
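For context, AWS WAF addresses SQL injection and cross-site scripting through managed rule groups attached to a web ACL. A minimal rule referencing the SQLi managed rule set might look like the fragment below (XSS protections come from AWSManagedRulesCommonRuleSet, added the same way; the web ACL and metric names are illustrative):

```json
{
  "Name": "api-protection",
  "Scope": "REGIONAL",
  "DefaultAction": { "Allow": {} },
  "Rules": [
    {
      "Name": "sqli-managed-rules",
      "Priority": 1,
      "Statement": {
        "ManagedRuleGroupStatement": {
          "VendorName": "AWS",
          "Name": "AWSManagedRulesSQLiRuleSet"
        }
      },
      "OverrideAction": { "None": {} },
      "VisibilityConfig": {
        "SampledRequestsEnabled": true,
        "CloudWatchMetricsEnabled": true,
        "MetricName": "sqli"
      }
    }
  ],
  "VisibilityConfig": {
    "SampledRequestsEnabled": true,
    "CloudWatchMetricsEnabled": true,
    "MetricName": "api-protection"
  }
}
```

A REGIONAL-scope web ACL can be associated directly with a REST API stage, which is why a CloudFront distribution is not strictly required to use WAF with API Gateway.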


Expert Solution
Question # 45:

A company is developing a new online gaming application. The application will run on Amazon EC2 instances in multiple AWS Regions and will have a high number of globally distributed users. A solutions architect must design the application to optimize network latency for the users.

Which actions should the solutions architect take to meet these requirements? (Select TWO.)

Options:

A.

Configure AWS Global Accelerator. Create Regional endpoint groups in each Region where an EC2 fleet is hosted.


B.

Create a content delivery network (CDN) by using Amazon CloudFront. Enable caching for static and dynamic content, and specify a high expiration period.


C.

Integrate AWS Client VPN into the application. Instruct users to select which Region is closest to them after they launch the application. Establish a VPN connection to that Region.


D.

Create an Amazon Route 53 weighted routing policy. Configure the routing policy to give the highest weight to the EC2 instances in the Region that has the largest number of users.


E.

Configure an Amazon API Gateway endpoint in each Region where an EC2 fleet is hosted. Instruct users to select which Region is closest to them after they launch the application. Use the API Gateway endpoint that is closest to them.


Expert Solution
Question # 46:

A company stores 5 PB of archived data on physical tapes in an on-premises data center. The company needs to retain the data for 10 years. The company does not want to change an existing backup workflow. The data center that stores the tapes has a 10 Gbps AWS Direct Connect connection to an AWS Region. The company wants to migrate the data to AWS as soon as possible.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Use an on-premises backup application to read the data from the tapes. Use local storage to stage the data temporarily. Use AWS DataSync to migrate the data to Amazon S3 Glacier Flexible Retrieval storage.


B.

Use an on-premises backup application to read the data from the tapes. Use the backup application to write directly to Amazon S3 Glacier Deep Archive storage.


C.

Order multiple AWS Snowball Edge devices. Copy the physical tapes to virtual tapes on the Snowball Edge devices. Ship the Snowball Edge devices to AWS. Create an S3 Lifecycle policy to move the tapes to Amazon S3 Glacier Instant Retrieval storage.


D.

Configure an on-premises AWS Storage Gateway Tape Gateway. Create virtual tapes on AWS. Use backup software to copy the physical tapes to the virtual tapes. Move the virtual tapes to Amazon S3 Glacier Deep Archive storage.


Expert Solution
Question # 47:

A company is developing a photo-hosting application in the us-east-1 Region. The application gives users across multiple countries the ability to upload and view photos. Some photos are heavily viewed for months, while other photos are viewed for less than a week. The application allows users to upload photos that are up to 20 MB in size. The application uses photo metadata to determine which photos to display to each user.

The company needs a cost-effective storage solution to support the application.

Which solution will meet these requirements?

Options:

A.

Store the photos in Amazon DynamoDB. Turn on DynamoDB Accelerator (DAX).


B.

Store the photos in the Amazon S3 Intelligent-Tiering storage class. Store the photo metadata and the S3 location URLs in Amazon DynamoDB.


C.

Store the photos in the Amazon S3 Standard storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Use object tags to keep track of metadata.


D.

Store the photos in an Amazon DynamoDB table. Use the DynamoDB Standard-Infrequent Access (DynamoDB Standard-IA) storage class. Store the photo metadata in Amazon ElastiCache.


Expert Solution
Question # 48:

A company discovers that an Amazon DynamoDB Accelerator (DAX) cluster for the company's web application workload is not encrypting data at rest. The company needs to resolve the security issue.

Which solution will meet this requirement?

Options:

A.

Stop the existing DAX cluster. Enable encryption at rest for the existing DAX cluster, and start the cluster again.


B.

Delete the existing DAX cluster. Recreate the DAX cluster, and configure the new cluster to encrypt the data at rest.


C.

Update the configuration of the existing DAX cluster to encrypt the data at rest.


D.

Integrate the existing DAX cluster with AWS Security Hub to automatically enable encryption at rest.


Expert Solution
Question # 49:

A company uses an organization in AWS Organizations to manage multiple AWS accounts. The company is building a product that spans multiple accounts. Developers at the company who work in multiple accounts need to give AWS Lambda functions access to write logs to an Amazon S3 bucket that is in a central logging account.

Which solution will meet this requirement in the MOST secure way?

Options:

A.

Create an IAM role in the central logging account that has write access to the S3 bucket. Create a trust policy that allows AWS Lambda functions in accounts within the organization to assume the IAM role.


B.

Create an IAM user in the central logging account that has full access to the S3 bucket. Create an S3 bucket policy that allows the IAM user to write to the S3 bucket. Use the IAM user access key and secret key credentials as environment variables.


C.

Create an S3 bucket policy for the S3 bucket in the central logging account. Configure the bucket policy to allow full access for AWS Lambda.


D.

Create an IAM user for each developer in the central logging account. Create an S3 bucket policy for the S3 bucket in the central logging account that allows full access for each IAM user.
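A trust policy of the kind option A describes is commonly scoped to the organization with the aws:PrincipalOrgID condition key, so that only principals from member accounts can assume the role. The organization ID below is a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "aws:PrincipalOrgID": "o-exampleorgid" }
      }
    }
  ]
}
```

This keeps credentials out of the Lambda functions entirely: each function's execution role assumes the central role at runtime and receives temporary credentials.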


Expert Solution
Question # 50:

A company is migrating a data processing application to AWS. The application processes several short-lived batch jobs that cannot be disrupted. The process generates data after each batch job finishes running. The company accesses the data for 30 days following data generation. After 30 days, the company stores the data for 2 years.

The company wants to optimize costs for the application and data storage. Which solution will meet these requirements?

Options:

A.

Use Amazon EC2 Spot Instances to run the application. Store the data in Amazon S3 Standard. Move the data to S3 Glacier Instant Retrieval after 30 days. Configure a bucket policy to delete the data after 2 years.


B.

Use Amazon EC2 On-Demand Instances to run the application. Store the data in Amazon S3 Glacier Instant Retrieval. Move the data to S3 Glacier Deep Archive after 30 days. Configure an S3 Lifecycle configuration to delete the data after 2 years.


C.

Use Amazon EC2 Spot Instances to run the application. Store the data in Amazon S3 Standard. Move the data to S3 Glacier Flexible Retrieval after 30 days. Configure a bucket policy to delete the data after 2 years.


D.

Use Amazon EC2 On-Demand Instances to run the application. Store the data in Amazon S3 Standard. Move the data to S3 Glacier Deep Archive after 30 days. Configure an S3 Lifecycle configuration to delete the data after 2 years.
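The lifecycle behavior that several of these options describe can be expressed as an S3 Lifecycle configuration: transition to Glacier Deep Archive after 30 days and expire after roughly 2 years (730 days). The rule ID and prefix below are hypothetical:

```json
{
  "Rules": [
    {
      "ID": "archive-then-expire",
      "Status": "Enabled",
      "Filter": { "Prefix": "batch-output/" },
      "Transitions": [
        { "Days": 30, "StorageClass": "DEEP_ARCHIVE" }
      ],
      "Expiration": { "Days": 730 }
    }
  ]
}
```

Note that transitions and expirations are lifecycle features, not bucket policy features, which is a distinction this question turns on.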


Expert Solution
Question # 51:

A company has an application that runs on Amazon EC2 instances within a private subnet in a VPC. The instances access data in an Amazon S3 bucket in the same AWS Region. The VPC contains a NAT gateway in a public subnet to access the S3 bucket. The company wants to reduce costs by replacing the NAT gateway without compromising security or redundancy.

Which solution meets these requirements?

Options:

A.

Replace the NAT gateway with a NAT instance.


B.

Replace the NAT gateway with an internet gateway.


C.

Replace the NAT gateway with a gateway VPC endpoint.


D.

Replace the NAT gateway with an AWS Direct Connect connection.
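For reference, a gateway VPC endpoint of the kind option C describes adds an entry to the private subnet's route table and can carry an endpoint policy that limits which S3 actions and resources are reachable. The bucket name below is a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
```

Gateway endpoints for S3 incur no hourly or data processing charges, which is the cost advantage over a NAT gateway for same-Region S3 traffic.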


Expert Solution
Question # 52:

A company hosts its multi-tier, public web application in the AWS Cloud. The web application runs on Amazon EC2 instances, and its database runs on Amazon RDS. The company is anticipating a large increase in sales during an upcoming holiday weekend. A solutions architect needs to build a solution to analyze the performance of the web application with a granularity of no more than 2 minutes.

What should the solutions architect do to meet this requirement?

Options:

A.

Send Amazon CloudWatch logs to Amazon Redshift. Use Amazon QuickSight to perform further analysis.


B.

Enable detailed monitoring on all EC2 instances. Use Amazon CloudWatch metrics to perform further analysis.


C.

Create an AWS Lambda function to fetch EC2 logs from Amazon CloudWatch Logs. Use Amazon CloudWatch metrics to perform further analysis.


D.

Send EC2 logs to Amazon S3. Use Amazon Redshift to fetch logs from the S3 bucket to process raw data for further analysis with Amazon QuickSight.


Expert Solution
Question # 53:

A social media company allows users to upload images to its website. The website runs on Amazon EC2 instances. During upload requests, the website resizes the images to a standard size and stores the resized images in Amazon S3. Users are experiencing slow upload requests to the website.

The company needs to reduce coupling within the application and improve website performance. A solutions architect must design the most operationally efficient process for image uploads.

Which combination of actions should the solutions architect take to meet these requirements? (Select TWO.)

Options:

A.

Configure the application to upload images to S3 Glacier Flexible Retrieval.


B.

Configure the web server to upload the original images to Amazon S3.


C.

Configure the application to upload images directly from each user's browser to Amazon S3 by using a presigned URL.


D.

Configure S3 Event Notifications to invoke an AWS Lambda function when an image is uploaded. Use the function to resize the image.


E.

Create an Amazon EventBridge rule that invokes an AWS Lambda function on a schedule to resize uploaded images.
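To make the presigned-URL idea in option C concrete, the sketch below shows the general mechanism with a toy HMAC signature: the server signs the method, bucket, key, and expiry so the browser can upload directly without holding credentials. This is an illustration of the concept only; real S3 presigned URLs use AWS Signature Version 4, generated by an AWS SDK such as boto3's generate_presigned_url.

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SECRET = b"demo-secret"  # stand-in for a real signing key


def presign(bucket: str, key: str, expires_in: int = 900, now=None) -> str:
    """Toy presigned upload URL: sign the PUT target and expiry so the
    client can upload directly. NOT real AWS SigV4; illustrative only."""
    expires = (now if now is not None else int(time.time())) + expires_in
    payload = f"PUT\n{bucket}\n{key}\n{expires}".encode()
    signature = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    query = urlencode({"Expires": expires, "Signature": signature})
    return f"https://{bucket}.s3.amazonaws.com/{key}?{query}"
```

The server hands this URL to the browser; anyone holding it can PUT to that one key until the expiry passes, after which the signature no longer validates.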


Expert Solution
Question # 54:

An insurance company wants to migrate an application that calculates insurance premiums to AWS. The company must run calculations immediately when a customer submits information through the application. The application usually takes 10 seconds to process a calculation.

Which solution will meet this requirement?

Options:

A.

Set up an Amazon API Gateway HTTP API to receive the data. Use an AWS Lambda function to process the data immediately.


B.

Upload the customer data to an Amazon S3 bucket. Start an Amazon EC2 Spot Instance to process every data upload.


C.

Set up AWS Transfer Family to receive the customer data. Configure an Amazon EKS job to process the customer data on a schedule.


D.

Upload the data to an Amazon S3 bucket. Invoke an AWS Batch job to process every customer data upload.


Expert Solution
Question # 55:

A company is developing a highly available natural language processing (NLP) application. The application handles large volumes of concurrent requests. The application performs NLP tasks such as entity recognition, sentiment analysis, and key phrase extraction on text data.

The company needs to store data that the application processes in a highly available and scalable database.

Which solution will meet these requirements?

Options:

A.

Create an Amazon API Gateway REST API endpoint to handle incoming requests. Configure the REST API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Comprehend to perform NLP tasks on the text data. Store the processed data in Amazon DynamoDB.


B.

Create an Amazon API Gateway HTTP API endpoint to handle incoming requests. Configure the HTTP API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Translate to perform NLP tasks on the text data. Store the processed data in Amazon ElastiCache.


C.

Create an Amazon SQS queue to buffer incoming requests. Deploy the NLP application on Amazon EC2 instances in an Auto Scaling group. Use Amazon Comprehend to perform NLP tasks. Store the processed data in an Amazon RDS database.


D.

Create an Amazon API Gateway WebSocket API endpoint to handle incoming requests. Configure the WebSocket API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Textract to perform NLP tasks on the text data. Store the processed data in Amazon ElastiCache.


Expert Solution
Question # 56:

A company is creating a mobile financial app that gives users the ability to sign up and store personal information. The app uses an Amazon DynamoDB table to store user details and preferences.

The app generates a credit score report by using the data that is stored in DynamoDB. The app sends credit score reports to users once every month.

The company needs to provide users with an option to remove their data and preferences. The app must delete customer data within one month of receiving a request to delete the data.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an AWS Lambda function to delete user information. Create an Amazon EventBridge rule that runs when a specified TTL expires. Configure the EventBridge rule to invoke the Lambda function.


B.

Create a DynamoDB stream. Create an AWS Lambda function to delete user information. When a specified TTL expires, write user information to the DynamoDB stream from the DynamoDB table. Configure the DynamoDB stream to invoke the Lambda function to delete user information.


C.

Enable TTL in DynamoDB. Set the expiration date as an attribute. Create an AWS Lambda function to set the TTL based on the expiration date value. Invoke the Lambda function when a user requests to delete personal data.


D.

Enable TTL in DynamoDB. Create an AWS Lambda function to delete user information. Configure AWS Config to detect the DynamoDB state change when TTL expires and to invoke the Lambda function.
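For background on the TTL mechanism these options rely on: DynamoDB TTL expects a Number attribute holding a Unix epoch timestamp in seconds, and items whose timestamp is in the past become eligible for background deletion (typically within a few days of expiry). A helper that computes such a value might look like this; the attribute name "ttl" in the comment is an assumption:

```python
from datetime import datetime, timedelta, timezone


def deletion_ttl(requested_at: datetime, grace_days: int = 30) -> int:
    """Return the Unix epoch timestamp (seconds) at which an item should
    expire, `grace_days` after the deletion request was received."""
    return int((requested_at + timedelta(days=grace_days)).timestamp())


# Example usage against a table (attribute name 'ttl' is hypothetical):
# table.update_item(
#     Key={"user_id": user_id},
#     UpdateExpression="SET #t = :v",
#     ExpressionAttributeNames={"#t": "ttl"},
#     ExpressionAttributeValues={":v": deletion_ttl(request_time)},
# )
```

Because TTL deletion is best-effort and asynchronous, it fits a "delete within one month" requirement but not a hard real-time deletion deadline.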


Expert Solution
Question # 57:

A company needs to store confidential files on AWS. The company accesses the files every week. The company must encrypt the files by using envelope encryption, and the encryption keys must be rotated automatically. The company must have an audit trail to monitor encryption key usage.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Store the confidential files in Amazon S3.


B.

Store the confidential files in Amazon S3 Glacier Deep Archive.


C.

Use server-side encryption with customer-provided keys (SSE-C).


D.

Use server-side encryption with Amazon S3 managed keys (SSE-S3).


E.

Use server-side encryption with AWS KMS managed keys (SSE-KMS).
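For reference, SSE-KMS of the kind option E describes can be set as a bucket's default encryption; KMS keys support automatic rotation, and every use of the key is recorded in AWS CloudTrail, which supplies the audit trail this question asks about. The key ARN below is a placeholder:

```json
{
  "ServerSideEncryptionConfiguration": {
    "Rules": [
      {
        "ApplyServerSideEncryptionByDefault": {
          "SSEAlgorithm": "aws:kms",
          "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/example-key-id"
        },
        "BucketKeyEnabled": true
      }
    ]
  }
}
```

SSE-S3 and SSE-C are also envelope encryption, but neither provides per-key CloudTrail usage auditing the way KMS-managed keys do.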


Expert Solution
Question # 58:

An ecommerce company experiences a surge in mobile application traffic every Monday at 8 AM during the company's weekly sales events. The application's backend uses an Amazon API Gateway HTTP API and AWS Lambda functions to process user requests. During peak sales periods, users report encountering TooManyRequestsException errors from the Lambda functions. The errors result in a degraded user experience. A solutions architect needs to design a scalable and resilient solution that minimizes the errors and ensures that the application's overall functionality remains unaffected.

Which solution will meet these requirements?

Options:

A.

Create an Amazon Simple Queue Service (Amazon SQS) queue. Send user requests to the SQS queue. Configure the Lambda function with provisioned concurrency. Set the SQS queue as the event source trigger.


B.

Use AWS Step Functions to orchestrate and process user requests. Configure Step Functions to invoke the Lambda functions and to manage the request flow.


C.

Create an Amazon Simple Notification Service (Amazon SNS) topic. Send user requests to the SNS topic. Configure the Lambda functions with provisioned concurrency. Subscribe the functions to the SNS topic.


D.

Create an Amazon Simple Queue Service (Amazon SQS) queue. Send user requests to the SQS queue. Configure the Lambda functions with reserved concurrency. Set the SQS queue as the event source trigger for the functions.
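The queue-buffering pattern that options A and D describe can be sketched as a CloudFormation fragment: an SQS queue absorbs the burst, an event source mapping drains it, and a concurrency setting on the function bounds how many invocations run at once. Resource names, runtime, and the concurrency value are illustrative, and the IAM role is omitted:

```yaml
Resources:
  RequestQueue:
    Type: AWS::SQS::Queue

  Worker:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: python3.12
      Handler: index.handler
      Role: !GetAtt WorkerRole.Arn   # role definition omitted for brevity
      # Caps concurrent executions so bursts wait in SQS instead of
      # failing with TooManyRequestsException.
      ReservedConcurrentExecutions: 100
      Code:
        ZipFile: "def handler(event, context): ..."

  QueueTrigger:
    Type: AWS::Lambda::EventSourceMapping
    Properties:
      EventSourceArn: !GetAtt RequestQueue.Arn
      FunctionName: !Ref Worker
```

The design trade-off between the two SQS options is provisioned concurrency (pre-warmed capacity, extra cost) versus reserved concurrency (a hard cap that lets the queue do the smoothing).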


Expert Solution
Question # 59:

A company needs to set up a centralized solution to audit API calls to AWS for workloads that run on AWS services and non-AWS services. The company must store logs of the audits for 7 years.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Set up a data lake in Amazon S3. Incorporate AWS CloudTrail logs and logs from non-AWS services into the data lake. Use CloudTrail to store the logs for 7 years.


B.

Configure custom integrations for AWS CloudTrail Lake to collect and store CloudTrail events from AWS services and non-AWS services. Use CloudTrail to store the logs for 7 years.


C.

Enable AWS CloudTrail for AWS services. Ingest logs from non-AWS services into CloudTrail to store the logs for 7 years.


D.

Create new Amazon CloudWatch Logs groups. Send the audit data from non-AWS services to the CloudWatch Logs groups. Enable AWS CloudTrail for workloads that run on AWS. Use CloudTrail to store the logs for 7 years.


Expert Solution
Question # 60:

A company is using AWS Identity and Access Management (IAM) Access Analyzer to refine IAM permissions for employee users. The company uses an organization in AWS Organizations and AWS Control Tower to manage its AWS accounts. The company has designated a specific member account as an audit account.

A solutions architect needs to set up IAM Access Analyzer to aggregate findings from all member accounts in the audit account.

What is the first step the solutions architect should take?

Options:

A.

Use AWS CloudTrail to configure one trail for all accounts. Create an Amazon S3 bucket in the audit account. Configure the trail to send logs related to access activity to the new S3 bucket in the audit account.


B.

Configure a delegated administrator account for IAM Access Analyzer in the AWS Control Tower management account. In the delegated administrator account for IAM Access Analyzer, specify the AWS account ID of the audit account.


C.

Create an Amazon S3 bucket in the audit account. Generate a new permissions policy, and add a service role to the policy to give IAM Access Analyzer access to AWS CloudTrail and the S3 bucket in the audit account.


D.

Add a new trust policy that includes permissions to allow IAM Access Analyzer to perform sts:AssumeRole actions. Modify the permissions policy to allow IAM Access Analyzer to generate policies.


Expert Solution