
Pass the Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) questions and answers with CertsForce

Question # 16:

A company plans to use an Amazon S3 bucket to archive backup data. Regulations require the company to retain the backup data for 7 years.

During the retention period, the company must prevent users, including administrators, from deleting the data. The company can delete the data after 7 years.

Which solution will meet these requirements?

Options:

A.

Create an S3 bucket policy that denies delete operations for 7 years. Create an S3 Lifecycle policy to delete the data after 7 years.


B.

Create an S3 Object Lock default retention policy that retains data for 7 years in governance mode. Create an S3 Lifecycle policy to delete the data after 7 years.


C.

Create an S3 Object Lock default retention policy that retains data for 7 years in compliance mode. Create an S3 Lifecycle policy to delete the data after 7 years.


D.

Create an S3 Batch Operations job to set a legal hold on each object for 7 years. Create an S3 Lifecycle policy to delete the data after 7 years.
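
For illustration only, a minimal boto3 sketch of the configuration described in option C; the bucket name, retention values, and lifecycle rule ID are placeholders, and Object Lock can only be enabled on a bucket at creation time.

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-backup-archive"  # placeholder bucket name

# Object Lock can only be used on buckets created with it enabled.
s3.create_bucket(Bucket=BUCKET, ObjectLockEnabledForBucket=True)

# Default retention in compliance mode: no user, including administrators,
# can delete or overwrite protected object versions for 7 years.
s3.put_object_lock_configuration(
    Bucket=BUCKET,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 7}},
    },
)

# Lifecycle rule to expire the objects once the retention period has passed.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "delete-after-7-years",
                "Status": "Enabled",
                "Filter": {},
                "Expiration": {"Days": 2557},  # roughly 7 years incl. leap days
            }
        ]
    },
)
```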


Question # 17:

A company has a production Amazon RDS for MySQL database. The company needs to create a new application that will read frequently changing data from the database with minimal impact on the database's overall performance. The application will rarely perform the same query more than once.

What should a solutions architect do to meet these requirements?

Options:

A.

Set up an Amazon ElastiCache cluster. Query the results in the cluster.


B.

Set up an Application Load Balancer (ALB). Query the results in the ALB.


C.

Set up a read replica for the database. Query the read replica.


D.

Set up querying of database snapshots. Query the database snapshots.
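
As a rough sketch of option C, the read replica can be created with a single boto3 call; the instance identifiers and instance class below are placeholders.

```python
import boto3

rds = boto3.client("rds")

# Create a read replica of the production instance.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="app-read-replica",
    SourceDBInstanceIdentifier="production-mysql",
    DBInstanceClass="db.r6g.large",
)

# The new application then points its read-only connection string at the
# replica's endpoint, keeping its queries off the primary instance.
```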


Question # 18:

A company uses AWS Organizations to manage multiple AWS accounts. The company needs a secure, event-driven architecture in which specific Amazon SNS topics in Account A can publish messages to specific Amazon SQS queues in Account B.

Which solution meets these requirements while maintaining least privilege?

Options:

A.

Create a new IAM role in Account A that can publish to any SQS queue. Share the role ARN with Account B.


B.

Add SNS topic ARNs to SQS queue policies in Account B. Configure SNS topics to publish to any queue. Encrypt the queue with an AWS KMS key.


C.

Modify the SQS queue policies in Account B to allow only specific SNS topic ARNs from Account A to publish messages. Ensure the SNS topics have publish permissions for the specific queue ARN.


D.

Create a shared IAM role across both accounts with permission to publish to all SQS queues. Enable cross-account access.
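
A minimal sketch of the cross-account wiring that option C describes, assuming placeholder ARNs and queue URL; in practice the SQS call runs with Account B credentials and the SNS subscription with Account A credentials.

```python
import json
import boto3

sqs = boto3.client("sqs")   # would run under Account B credentials
sns = boto3.client("sns")   # would run under Account A credentials

# Placeholder ARNs and URL for illustration.
TOPIC_ARN = "arn:aws:sns:us-east-1:111111111111:orders-topic"       # Account A
QUEUE_ARN = "arn:aws:sqs:us-east-1:222222222222:orders-queue"       # Account B
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/222222222222/orders-queue"

# Queue policy in Account B: only the specific topic in Account A may send.
queue_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": QUEUE_ARN,
            "Condition": {"ArnEquals": {"aws:SourceArn": TOPIC_ARN}},
        }
    ],
}
sqs.set_queue_attributes(
    QueueUrl=QUEUE_URL,
    Attributes={"Policy": json.dumps(queue_policy)},
)

# Subscribe the queue in Account B to the topic in Account A.
sns.subscribe(TopicArn=TOPIC_ARN, Protocol="sqs", Endpoint=QUEUE_ARN)
```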


Question # 19:

A company runs a MySQL database on a single Amazon EC2 instance.

The company needs to improve availability of the database to prepare for power outages.

Which solution will meet this requirement?

Options:

A.

Add an Application Load Balancer (ALB) in front of the EC2 instance.


B.

Configure EC2 automatic instance recovery to move the instance to another Availability Zone.


C.

Migrate the MySQL database to Amazon RDS and enable Multi-AZ deployment.


D.

Enable termination protection for the EC2 instance.
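
A hedged sketch of the target state in option C: an RDS for MySQL instance with Multi-AZ enabled. Identifiers, sizing, and credentials are placeholders, and the data migration itself (for example mysqldump or AWS DMS) is not shown.

```python
import boto3

rds = boto3.client("rds")

# Create the RDS for MySQL instance with a synchronous standby in a second
# Availability Zone, so the database survives a single-AZ power outage.
rds.create_db_instance(
    DBInstanceIdentifier="app-mysql",
    Engine="mysql",
    DBInstanceClass="db.m6g.large",
    AllocatedStorage=100,
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",
    MultiAZ=True,
)
```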


Question # 20:

A company uses an Amazon EC2 instance to run a script to poll for and process messages in an Amazon Simple Queue Service (Amazon SQS) queue. The company wants to reduce operational overhead while maintaining its ability to process an increasing number of messages that are added to the queue. Which solution will meet these requirements?

Options:

A.

Increase the size of the EC2 instance to process messages in the SQS queue faster.


B.

Configure an Amazon EventBridge rule to turn off the EC2 instance when the SQS queue is empty.


C.

Migrate the script on the EC2 instance to an AWS Lambda function with an event source of the SQS queue.


D.

Configure an AWS Systems Manager Run Command to run the script on demand.
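
A minimal sketch of the event source mapping described in option C, assuming a placeholder function name and queue ARN; the Lambda service then polls the queue and scales concurrency with queue depth.

```python
import boto3

lambda_client = boto3.client("lambda")

# Connect the existing SQS queue to a Lambda function that now contains the
# processing logic from the EC2 script.
lambda_client.create_event_source_mapping(
    FunctionName="process-queue-messages",
    EventSourceArn="arn:aws:sqs:us-east-1:111111111111:work-queue",
    BatchSize=10,
    MaximumBatchingWindowInSeconds=5,
    Enabled=True,
)
```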


Question # 21:

A company wants to visualize its AWS spend and resource usage. The company wants to use an AWS managed service to provide visual dashboards.

Which solution will meet these requirements?

Options:

A.

Configure an export in AWS Data Exports. Use Amazon QuickSight to create a cost and usage dashboard. View the data in QuickSight.


B.

Configure one custom budget in AWS Budgets for costs. Configure a second custom budget for usage. Schedule daily AWS Budgets reports by using the two budgets as sources.


C.

Configure AWS Cost Explorer to use user-defined cost allocation tags with hourly granularity to generate detailed data.


D.

Configure an export in AWS Data Exports. Use the standard export option. View the data in Amazon Athena.


Question # 22:

A company recently migrated a large amount of research data to an Amazon S3 bucket. The company needs an automated solution to identify sensitive data in the bucket. A security team also needs to monitor access patterns for the data 24 hours a day, 7 days a week to identify suspicious activities or evidence of tampering with security controls.

Which solution will meet these requirements?

Options:

A.

Set up AWS CloudTrail reporting, and grant the security team read-only access to the CloudTrail reports. Set up an Amazon S3 Inventory report to identify sensitive data. Review the findings with the security team.


B.

Enable Amazon Macie and Amazon GuardDuty on the account. Grant the security team access to Macie and GuardDuty. Review the findings with the security team.


C.

Set up an Amazon S3 Inventory report. Use Amazon Athena and Amazon QuickSight to identify sensitive data. Create a dashboard for the security team to review findings.


D.

Use AWS Identity and Access Management (IAM) Access Advisor to monitor for suspicious activity and tampering. Create a dashboard for the security team. Set up an Amazon S3 Inventory report to identify sensitive data. Review the findings with the security team.
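
A minimal sketch of turning on the two managed services from option B; both calls assume administrative permissions in the account and may fail if Macie or a GuardDuty detector is already enabled.

```python
import boto3

# Amazon Macie discovers and reports sensitive data stored in S3.
boto3.client("macie2").enable_macie()

# Amazon GuardDuty provides continuous threat detection, including suspicious
# access patterns and tampering with security controls.
boto3.client("guardduty").create_detector(
    Enable=True,
    FindingPublishingFrequency="FIFTEEN_MINUTES",
)
```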


Question # 23:

A company provides a trading platform to customers. The platform uses an Amazon API Gateway REST API, AWS Lambda functions, and an Amazon DynamoDB table. Each trade that the platform processes invokes a Lambda function that stores the trade data in Amazon DynamoDB. The company wants to ingest trade data into a data lake in Amazon S3 for near real-time analysis. Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon S3.


B.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.


C.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure Kinesis Data Streams to invoke a Lambda function that writes the data to Amazon S3.


D.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure a data stream to be the input for Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.
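
A rough sketch of the pipeline in option D, assuming the Kinesis data stream, IAM delivery role, and S3 bucket (placeholder ARNs below) already exist.

```python
import boto3

dynamodb = boto3.client("dynamodb")
firehose = boto3.client("firehose")

# Placeholder names and ARNs for illustration.
STREAM_ARN = "arn:aws:kinesis:us-east-1:111111111111:stream/trades-stream"
FIREHOSE_ROLE_ARN = "arn:aws:iam::111111111111:role/firehose-delivery-role"
LAKE_BUCKET_ARN = "arn:aws:s3:::trades-data-lake"

# Stream item-level changes from the DynamoDB table into Kinesis Data Streams.
dynamodb.enable_kinesis_streaming_destination(
    TableName="trades",
    StreamArn=STREAM_ARN,
)

# Deliver the stream to S3 with a fully managed Firehose delivery stream.
firehose.create_delivery_stream(
    DeliveryStreamName="trades-to-s3",
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": STREAM_ARN,
        "RoleARN": FIREHOSE_ROLE_ARN,
    },
    ExtendedS3DestinationConfiguration={
        "RoleARN": FIREHOSE_ROLE_ARN,
        "BucketARN": LAKE_BUCKET_ARN,
    },
)
```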


Question # 24:

A healthcare company is developing an AWS Lambda function that publishes notifications to an encrypted Amazon Simple Notification Service (Amazon SNS) topic. The notifications contain protected health information (PHI).

The SNS topic uses AWS Key Management Service (AWS KMS) customer-managed keys for encryption. The company must ensure that the application has the necessary permissions to publish messages securely to the SNS topic.

Which combination of steps will meet these requirements? (Select THREE.)

Options:

A.

Create a resource policy for the SNS topic that allows the Lambda function to publish messages to the topic.


B.

Use server-side encryption with AWS KMS keys (SSE-KMS) for the SNS topic instead of customer-managed keys.


C.

Create a resource policy for the encryption key that the SNS topic uses that has the necessary AWS KMS permissions.


D.

Specify the Lambda function's Amazon Resource Name (ARN) in the SNS topic's resource policy.


E.

Associate an Amazon API Gateway HTTP API with the SNS topic to control access to the topic by using API Gateway resource policies.


F.

Configure a Lambda execution role that has the necessary IAM permissions to use a customer-managed key in AWS KMS.
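
To make the moving parts concrete, a hedged sketch of the policy statements involved in options A, C, and F; all ARNs are placeholders, and each statement would be merged into the corresponding SNS topic policy, KMS key policy, or Lambda execution role.

```python
# Placeholder ARNs for illustration.
LAMBDA_ROLE_ARN = "arn:aws:iam::111111111111:role/phi-publisher-role"
TOPIC_ARN = "arn:aws:sns:us-east-1:111111111111:phi-notifications"
KEY_ARN = "arn:aws:kms:us-east-1:111111111111:key/REPLACE-WITH-KEY-ID"

# (A) SNS topic resource policy: let the function's execution role publish.
topic_policy_statement = {
    "Effect": "Allow",
    "Principal": {"AWS": LAMBDA_ROLE_ARN},
    "Action": "sns:Publish",
    "Resource": TOPIC_ARN,
}

# (C) Statement added to the customer-managed key's policy so the role may
# use the key that encrypts the topic.
key_policy_statement = {
    "Effect": "Allow",
    "Principal": {"AWS": LAMBDA_ROLE_ARN},
    "Action": ["kms:GenerateDataKey", "kms:Decrypt"],
    "Resource": "*",
}

# (F) Identity policy on the Lambda execution role: allow publishing to the
# topic and using the customer-managed key.
execution_role_statements = [
    {"Effect": "Allow", "Action": "sns:Publish", "Resource": TOPIC_ARN},
    {
        "Effect": "Allow",
        "Action": ["kms:GenerateDataKey", "kms:Decrypt"],
        "Resource": KEY_ARN,
    },
]
```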


Question # 25:

A company must follow strict regulations for the management of data encryption keys. The company manages its own key externally and imports the key into AWS Key Management Service (AWS KMS). The company must control the imported key material and must rotate the key material on a regular schedule.

A solutions architect needs to import the key material into AWS KMS and rotate the key without interrupting applications that use the key.

Which solution will meet these requirements?

Options:

A.

Create a new AWS KMS key that has the same key ID as the existing key. Import new key material into the key.


B.

Schedule the existing AWS KMS key for deletion. Create a new KMS key that has new key material.


C.

Import new key material into the existing AWS KMS key. Set an expiration time for the old key material.


D.

Enable automatic key rotation for the existing AWS KMS key.
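
A rough sketch of re-importing externally managed key material as described in option C, with placeholder key ID, wrapped key bytes, and expiration date; wrapping the key material with the returned public key happens in the external key-management tooling and is not shown.

```python
from datetime import datetime, timezone

import boto3

kms = boto3.client("kms")
KEY_ID = "REPLACE-WITH-KEY-ID"  # existing key created with Origin=EXTERNAL

# Obtain a wrapping public key and an import token for this import.
params = kms.get_parameters_for_import(
    KeyId=KEY_ID,
    WrappingAlgorithm="RSAES_OAEP_SHA_256",
    WrappingKeySpec="RSA_2048",
)

# The externally managed key material, wrapped with params["PublicKey"];
# the bytes below are a placeholder.
wrapped_key_material = b"..."

# Import the key material and give it an expiration time.
kms.import_key_material(
    KeyId=KEY_ID,
    ImportToken=params["ImportToken"],
    EncryptedKeyMaterial=wrapped_key_material,
    ExpirationModel="KEY_MATERIAL_EXPIRES",
    ValidTo=datetime(2026, 1, 1, tzinfo=timezone.utc),
)
```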


Question # 26:

A company runs its application by using Amazon EC2 instances and AWS Lambda functions. The EC2 instances run in private subnets of a VPC. The Lambda functions need direct network access to the EC2 instances for the application to work.

The application will run for 1 year. The number of Lambda functions that the application uses will increase during the 1-year period. The company must minimize costs on all application resources.

Which solution will meet these requirements?

Options:

A.

Purchase an EC2 Instance Savings Plan. Connect the Lambda functions to the private subnets that contain the EC2 instances.


B.

Purchase an EC2 Instance Savings Plan. Connect the Lambda functions to new public subnets in the same VPC where the EC2 instances run.


C.

Purchase a Compute Savings Plan. Connect the Lambda functions to the private subnets that contain the EC2 instances.


D.

Purchase a Compute Savings Plan. Keep the Lambda functions in the Lambda service VPC.
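
A minimal sketch of the VPC attachment that options A and C describe; the function name, subnet IDs, and security group ID are placeholders, and the Savings Plan purchase itself is a billing action noted only in the comment.

```python
import boto3

lambda_client = boto3.client("lambda")

# Attach the function to the private subnets that contain the EC2 instances
# so it has direct network access to them.
lambda_client.update_function_configuration(
    FunctionName="app-worker",
    VpcConfig={
        "SubnetIds": ["subnet-0abc1234def567890", "subnet-0fed9876cba543210"],
        "SecurityGroupIds": ["sg-0123456789abcdef0"],
    },
)

# A Compute Savings Plan (purchased in the Billing console) applies to both
# the EC2 and Lambda usage, unlike an EC2 Instance Savings Plan, which
# covers EC2 instance usage only.
```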


Question # 27:

A company has an application that runs on an Amazon Elastic Kubernetes Service (Amazon EKS) cluster on Amazon EC2 instances. The application has a UI that uses Amazon DynamoDB and data services that use Amazon S3 as part of the application deployment.

The company must ensure that the EKS Pods for the UI can access only Amazon DynamoDB and that the EKS Pods for the data services can access only Amazon S3. The company uses AWS Identity and Access Management (IAM).

Which solution meets these requirements?

Options:

A.

Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach both IAM policies to the EC2 instance profile. Use role-based access control (RBAC) to control access to Amazon S3 or DynamoDB for the respective EKS Pods.


B.

Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach the Amazon S3 IAM policy directly to the EKS Pods for the data services and the DynamoDB policy to the EKS Pods for the UI.


C.

Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Attach the AmazonS3FullAccess policy to the data services service account and the AmazonDynamoDBFullAccess policy to the UI service account.


D.

Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Use IAM Roles for Service Accounts (IRSA) to provide the EKS Pods for the UI access to DynamoDB and the EKS Pods for the data services access to Amazon S3.
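
For illustration, a hedged sketch of the IRSA trust policy that option D relies on; the account ID, OIDC provider, namespace, and service account names are placeholders.

```python
import json

# Placeholder account and OIDC provider values for illustration.
ACCOUNT_ID = "111111111111"
OIDC_PROVIDER = "oidc.eks.us-east-1.amazonaws.com/id/EXAMPLED539D4633E53DE1B71EXAMPLE"

# Trust policy for the IAM role assumed by the UI Pods via IRSA: only the
# "ui" service account in the "app" namespace can assume it. The data-services
# role uses the same shape with its own service account and an S3 policy.
ui_role_trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": f"arn:aws:iam::{ACCOUNT_ID}:oidc-provider/{OIDC_PROVIDER}"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {
                    f"{OIDC_PROVIDER}:sub": "system:serviceaccount:app:ui",
                    f"{OIDC_PROVIDER}:aud": "sts.amazonaws.com",
                }
            },
        }
    ],
}
print(json.dumps(ui_role_trust_policy, indent=2))

# The Kubernetes ServiceAccount is then annotated with
# eks.amazonaws.com/role-arn: arn:aws:iam::<account>:role/<ui-role>
```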


Question # 28:

A company is preparing to store confidential data in Amazon S3. For compliance reasons, the data must be encrypted at rest. Encryption key usage must be logged for auditing purposes. Keys must be rotated every year.

Which solution meets these requirements and is the MOST operationally efficient?

Options:

A.

Server-side encryption with customer-provided keys (SSE-C)


B.

Server-side encryption with Amazon S3 managed keys (SSE-S3)


C.

Server-side encryption with AWS KMS keys (SSE-KMS) with manual rotation


D.

Server-side encryption with AWS KMS keys (SSE-KMS) with automatic rotation
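
A minimal boto3 sketch of option D, with a placeholder bucket name: a customer-managed KMS key with automatic rotation, set as the bucket's default encryption.

```python
import boto3

kms = boto3.client("kms")
s3 = boto3.client("s3")
BUCKET = "confidential-data-bucket"  # placeholder

# Customer-managed KMS key: every use is logged to AWS CloudTrail, and
# automatic rotation renews the key material yearly.
key = kms.create_key(Description="S3 confidential data key")
key_id = key["KeyMetadata"]["KeyId"]
kms.enable_key_rotation(KeyId=key_id)

# Set SSE-KMS with that key as the bucket's default encryption.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": key_id,
                },
                "BucketKeyEnabled": True,
            }
        ]
    },
)
```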


Question # 29:

A solutions architect is investigating compute options for a critical analytics application. The application uses long-running processes to prepare and aggregate data. The processes cannot be interrupted. The application has a known baseline load. The application needs to handle occasional usage surges.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create an Amazon EC2 Auto Scaling group. Set the Min capacity and Desired capacity parameters to the number of instances required to handle the baseline load. Purchase Reserved Instances for the Auto Scaling group.


B.

Create an Amazon EC2 Auto Scaling group. Set the Min capacity, Max capacity, and Desired capacity parameters to the number of instances required to handle the baseline load. Use On-Demand Instances to address occasional usage surges.


C.

Create an Amazon EC2 Auto Scaling group. Set the Min capacity and Desired capacity parameters to the number of instances required to handle the baseline load. Purchase Reserved Instances for the Auto Scaling group. Use the OnDemandPercentageAboveBaseCapacity parameter to configure the launch template to launch Spot Instances.


D.

Re-architect the application to use AWS Lambda functions instead of Amazon EC2 instances. Purchase a one-year Compute Savings Plan to reduce the cost of Lambda usage.
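
A rough sketch of the Auto Scaling group sizing described in option A; the group name, launch template, subnets, and baseline count are placeholders, and the Reserved Instances covering the baseline are purchased separately.

```python
import boto3

autoscaling = boto3.client("autoscaling")

BASELINE = 4  # placeholder: number of instances needed for the baseline load

# Min and Desired capacity hold the baseline running continuously (covered by
# Reserved Instances), while Max capacity leaves On-Demand headroom for
# occasional surges without interrupting the long-running processes.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="analytics-asg",
    MinSize=BASELINE,
    DesiredCapacity=BASELINE,
    MaxSize=BASELINE * 2,
    LaunchTemplate={"LaunchTemplateName": "analytics-lt", "Version": "$Latest"},
    VPCZoneIdentifier="subnet-0abc1234def567890,subnet-0fed9876cba543210",
)
```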


Question # 30:

A company has an application that runs on a single Amazon EC2 instance. The application uses a MySQL database that runs on the same EC2 instance. The company needs a highly available and automatically scalable solution to handle increased traffic.

Which solution will meet these requirements?

Options:

A.

Deploy the application to EC2 instances that run in an Auto Scaling group behind an Application Load Balancer. Create an Amazon Redshift cluster that has multiple MySQL-compatible nodes.


B.

Deploy the application to EC2 instances that are configured as a target group behind an Application Load Balancer. Create an Amazon RDS for MySQL cluster that has multiple instances.


C.

Deploy the application to EC2 instances that run in an Auto Scaling group behind an Application Load Balancer. Create an Amazon Aurora Serverless MySQL cluster for the database layer.


D.

Deploy the application to EC2 instances that are configured as a target group behind an Application Load Balancer. Create an Amazon ElastiCache (Redis OSS) cluster that uses the MySQL connector.
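
A hedged sketch of the database layer in option C, using Aurora Serverless v2 for MySQL; identifiers, credentials, and capacity limits are placeholders, and the Auto Scaling group and ALB for the application tier are not shown.

```python
import boto3

rds = boto3.client("rds")

# Aurora MySQL cluster with Serverless v2 scaling for the database layer.
rds.create_db_cluster(
    DBClusterIdentifier="app-aurora",
    Engine="aurora-mysql",
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",
    ServerlessV2ScalingConfiguration={"MinCapacity": 0.5, "MaxCapacity": 16},
)

# Serverless v2 instance in the cluster; capacity scales within the limits above.
rds.create_db_instance(
    DBInstanceIdentifier="app-aurora-writer",
    DBClusterIdentifier="app-aurora",
    Engine="aurora-mysql",
    DBInstanceClass="db.serverless",
)
```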

