
Pass the Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) questions and answers with CertsForce

Viewing page 5 out of 12 pages
Viewing questions 61-75
Question # 61:

A company has stored millions of objects across multiple prefixes in an Amazon S3 bucket by using the Amazon S3 Glacier Deep Archive storage class. The company needs to delete all data older than 3 years except for a subset of data that must be retained. The company has identified the data that must be retained and wants to implement a serverless solution.

Which solution will meet these requirements?

Options:

A.

Use S3 Inventory to list all objects. Use the AWS CLI to create a script that runs on an Amazon EC2 instance that deletes objects from the inventory list.


B.

Use AWS Batch to delete objects older than 3 years except for the data that must be retained


C.

Provision an AWS Glue crawler to query objects older than 3 years. Save the manifest file of old objects. Create a script to delete objects in the manifest.


D.

Enable S3 Inventory. Create an AWS Lambda function to filter and delete objects. Invoke the Lambda function with S3 Batch Operations to delete objects by using the inventory reports.


Expert Solution
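
Study note: to make the pattern in option D concrete, below is a minimal sketch of an AWS Lambda handler invoked by an S3 Batch Operations job whose manifest is an S3 Inventory report. The bucket names, the RETAIN_KEYS set, and the assumption that the inventory has already been filtered to objects older than 3 years are illustrative placeholders, not part of the question.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical: object keys the company has identified as must-retain.
RETAIN_KEYS = {"legal-hold/report-2021.csv"}

def lambda_handler(event, context):
    """Handler for an S3 Batch Operations 'Invoke AWS Lambda function' job.

    The job manifest is assumed to be an S3 Inventory report pre-filtered
    to objects whose LastModified date is older than 3 years.
    """
    results = []
    for task in event["tasks"]:
        bucket = task["s3BucketArn"].split(":::")[-1]
        key = task["s3Key"]
        if key in RETAIN_KEYS:
            result_code, result_string = "Succeeded", "Skipped (retained)"
        else:
            s3.delete_object(Bucket=bucket, Key=key)
            result_code, result_string = "Succeeded", "Deleted"
        results.append({"taskId": task["taskId"],
                        "resultCode": result_code,
                        "resultString": result_string})

    # Response format expected by S3 Batch Operations.
    return {
        "invocationSchemaVersion": event["invocationSchemaVersion"],
        "treatMissingKeysAs": "PermanentFailure",
        "invocationId": event["invocationId"],
        "results": results,
    }
```
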
Question # 62:

A company stores user data in AWS. The data is used continuously with peak usage during business hours. Access patterns vary, with some data not being used for months at a time. A solutions architect must choose a cost-effective solution that provides the highest level of durability while maintaining high availability.

Which storage solution meets these requirements?

Options:

A.

Amazon S3 Standard


B.

Amazon S3 Intelligent-Tiering


C.

Amazon S3 Glacier Deep Archive


D.

Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)


Expert Solution
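
Study note: as a quick illustration of the storage class named in option B, objects can be written directly to S3 Intelligent-Tiering; the bucket name, key, and body below are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Intelligent-Tiering moves objects between access tiers automatically
# based on observed access patterns, with no retrieval fees.
s3.put_object(
    Bucket="example-user-data-bucket",   # placeholder
    Key="users/12345/profile.json",      # placeholder
    Body=b'{"name": "example"}',
    StorageClass="INTELLIGENT_TIERING",
)
```
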
Question # 63:

An analytics application runs on multiple Amazon EC2 Linux instances that use Amazon Elastic File System (Amazon EFS) Standard storage. The files vary in size and access frequency. The company accesses the files infrequently after 30 days. However, users sometimes request older files to generate reports.

The company wants to reduce storage costs for files that are accessed infrequently. The company also wants throughput to adjust based on the size of the file system. The company wants to use the TransitionToIA Amazon EFS lifecycle policy to transition files to Infrequent Access (IA) storage after 30 days.

Which solution will meet these requirements?

Options:

A.

Configure files to transition back to Standard storage when a user accesses the files again. Specify the provisioned throughput mode.


B.

Specify the provisioned throughput mode only.


C.

Configure files to transition back to Standard storage when a user accesses the files again. Specify the bursting throughput mode.


D.

Specify the bursting throughput mode only.


Expert Solution
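
Study note: a hedged boto3 sketch of the EFS settings the options refer to. The file system ID is a placeholder, and which throughput mode is appropriate is exactly what the question asks you to decide; bursting is shown only as an example of the API call.

```python
import boto3

efs = boto3.client("efs")
FILE_SYSTEM_ID = "fs-0123456789abcdef0"  # placeholder

# Lifecycle management: move files to IA after 30 days and, optionally,
# move them back to Standard storage on their next access.
efs.put_lifecycle_configuration(
    FileSystemId=FILE_SYSTEM_ID,
    LifecyclePolicies=[
        {"TransitionToIA": "AFTER_30_DAYS"},
        {"TransitionToPrimaryStorageClass": "AFTER_1_ACCESS"},
    ],
)

# Throughput mode is a file system setting; bursting throughput scales
# with the size of the file system, provisioned throughput does not.
efs.update_file_system(
    FileSystemId=FILE_SYSTEM_ID,
    ThroughputMode="bursting",
)
```
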
Question # 64:

A gaming company has a web application that displays game scores. The application runs on Amazon EC2 instances behind an Application Load Balancer (ALB). The application stores data in an Amazon RDS for MySQL database.

Users are experiencing long delays and interruptions caused by degraded database read performance. The company wants to improve the user experience.

Which solution will meet this requirement?

Options:

A.

Use an Amazon ElastiCache (Redis OSS) cache in front of the database.


B.

Use Amazon RDS Proxy between the application and the database.


C.

Migrate the application from EC2 instances to AWS Lambda functions.


D.

Use an Amazon Aurora Global Database to create multiple read replicas across multiple AWS Regions.


Expert Solution
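
Study note: a minimal cache-aside sketch for the caching approach in option A, using the redis-py client. The ElastiCache endpoint, the 60-second TTL, and the query_scores_from_mysql helper are all hypothetical stand-ins for the application's existing code.

```python
import json
import redis

# Placeholder ElastiCache (Redis OSS) endpoint.
cache = redis.Redis(host="scores-cache.example.use1.cache.amazonaws.com", port=6379)

def query_scores_from_mysql(game_id):
    """Placeholder for the existing RDS for MySQL read query."""
    return {"game_id": game_id, "scores": []}

def get_scores(game_id):
    key = f"scores:{game_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                 # cache hit: no database read
    scores = query_scores_from_mysql(game_id)     # cache miss: read from MySQL
    cache.setex(key, 60, json.dumps(scores))      # store for 60 seconds
    return scores
```
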
Question # 65:

A company stores customer data in a multitenant Amazon S3 bucket. Each customer's data is stored in a prefix that is unique to the customer. The company needs to migrate data for specific customers to a new, dedicated S3 bucket that is in the same AWS Region as the source bucket. The company must preserve object metadata such as creation date and version IDs.

After the migration is finished, the company must delete the source data for the migrated customers from the original multitenant S3 bucket.

Which combination of solutions will meet these requirements with the LEAST overhead? (Select THREE.)

Options:

A.

Create a new S3 bucket as a destination bucket. Enable versioning on the new bucket.


B.

Use S3 batch operations to copy objects from the specified prefixes to the destination bucket.


C.

Use the S3 CopyObject API, and create a script to copy data to the destination S3 bucket.


D.

Configure S3 Same-Region Replication (SRR) to replicate existing data from the specified prefixes in the source bucket to the destination bucket.


E.

Configure AWS DataSync to migrate data from the specified prefixes in the source bucket to the destination bucket.


F.

Use an S3 Lifecycle policy to delete objects from the source bucket after the data is migrated to the destination bucket.


Expert Solution
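
Study note: whichever copy mechanism is chosen, the cleanup step described in option F can be expressed as an S3 Lifecycle rule scoped to a migrated customer's prefix. The bucket name, prefix, and 1-day expiration below are assumptions for illustration.

```python
import boto3

s3 = boto3.client("s3")

# Expire the migrated customer's objects (current and noncurrent versions)
# from the multitenant source bucket once the copy has been verified.
s3.put_bucket_lifecycle_configuration(
    Bucket="multitenant-source-bucket",           # placeholder
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-migrated-customer-a",
                "Filter": {"Prefix": "customer-a/"},   # placeholder prefix
                "Status": "Enabled",
                "Expiration": {"Days": 1},
                "NoncurrentVersionExpiration": {"NoncurrentDays": 1},
            }
        ]
    },
)
```
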
Question # 66:

A company is building an application in the AWS Cloud. The application is hosted on Amazon EC2 instances behind an Application Load Balancer (ALB). The company uses Amazon Route 53 for the DNS.

The company needs a managed solution with proactive engagement to protect against DDoS attacks.

Which solution will meet these requirements?

Options:

A.

Enable AWS Config. Configure an AWS Config managed rule that detects DDoS attacks.


B.

Enable AWS WAF on the ALB. Create an AWS WAF web ACL with rules to detect and prevent DDoS attacks. Associate the web ACL with the ALB.


C.

Store the ALB access logs in an Amazon S3 bucket. Configure Amazon GuardDuty to detect and take automated preventative actions for DDoS attacks.


D.

Subscribe to AWS Shield Advanced. Configure hosted zones in Route 53. Add ALB resources as protected resources.


Expert Solution
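
Study note: proactive engagement is a Shield Advanced feature, so here is a hedged boto3 sketch of subscribing, protecting the ALB, and turning proactive engagement on. The ALB ARN and contact email are placeholders, and create_subscription is a one-time, account-level action.

```python
import boto3

shield = boto3.client("shield")

# One-time subscription to Shield Advanced for the account.
shield.create_subscription()

# Register the ALB as a protected resource (placeholder ARN).
shield.create_protection(
    Name="web-alb-protection",
    ResourceArn="arn:aws:elasticloadbalancing:us-east-1:111122223333:"
                "loadbalancer/app/web-alb/0123456789abcdef",
)

# Proactive engagement needs emergency contacts for the Shield Response Team.
shield.associate_proactive_engagement_details(
    EmergencyContactList=[{"EmailAddress": "security@example.com"}]  # placeholder
)
shield.enable_proactive_engagement()
```
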
Question # 67:

A law firm needs to make hundreds of files readable for the general public. The law firm must prevent members of the public from modifying or deleting the files before a specified future date.

Which solution will meet these requirements MOST securely?

Options:

A.

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Grant read-only IAM permissions to any AWS principals that access the S3 bucket until the specified date.


B.

Create a new Amazon S3 bucket. Enable S3 Versioning. Use S3 Object Lock and set a retention period based on the specified date. Create an Amazon CloudFront distribution to serve content from the bucket. Use an S3 bucket policy to restrict access to the CloudFront origin access control (OAC).


C.

Create a new Amazon S3 bucket. Enable S3 Versioning. Configure an event trigger to run an AWS Lambda function if a user modifies or deletes an object. Configure the Lambda function to replace the modified or deleted objects with the original versions of the objects from a private S3 bucket.


D.

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Select the folder that contains the files. Use S3 Object Lock with a retention period based on the specified date. Grant read-only IAM permissions to any AWS principals that access the S3 bucket.


Expert Solution
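
Study note: several options mention S3 Object Lock. As a hedged sketch, Object Lock must be enabled when the bucket is created, and a default retention rule can then enforce the future date. The bucket name, retention mode, and one-year period below are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Object Lock can only be turned on at bucket creation (us-east-1 assumed).
s3.create_bucket(
    Bucket="public-legal-documents",     # placeholder name
    ObjectLockEnabledForBucket=True,
)

# Default retention: objects cannot be modified or deleted until the period ends.
s3.put_object_lock_configuration(
    Bucket="public-legal-documents",
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 1}},
    },
)
```
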
Question # 68:

A company is preparing to store confidential data in Amazon S3. For compliance reasons, the data must be encrypted at rest. Encryption key usage must be logged for auditing purposes. Keys must be rotated every year.

Which solution meets these requirements and is the MOST operationally efficient?

Options:

A.

Server-side encryption with customer-provided keys (SSE-C)


B.

Server-side encryption with Amazon S3 managed keys (SSE-S3)


C.

Server-side encryption with AWS KMS keys (SSE-KMS) with manual rotation


D.

Server-side encryption with AWS KMS keys (SSE-KMS) with automatic rotation


Expert Solution
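
Study note: a short sketch of the SSE-KMS-with-automatic-rotation setup named in option D. The bucket name and key description are placeholders; automatic rotation of a customer managed KMS key defaults to once per year, and key usage is logged to AWS CloudTrail.

```python
import boto3

kms = boto3.client("kms")
s3 = boto3.client("s3")

# Customer managed key with automatic (yearly) rotation.
key_id = kms.create_key(Description="S3 confidential data key")["KeyMetadata"]["KeyId"]
kms.enable_key_rotation(KeyId=key_id)

# Default SSE-KMS encryption for the bucket (placeholder name).
s3.put_bucket_encryption(
    Bucket="confidential-data-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": key_id,
            }
        }]
    },
)
```
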
Question # 69:

A company runs an environment where data is stored in an Amazon S3 bucket. The objects are accessed frequently throughout the day. The company has strict data encryption requirements for data that is stored in the S3 bucket. The company currently uses AWS Key Management Service (AWS KMS) for encryption.

The company wants to optimize costs associated with encrypting S3 objects without making additional calls to AWS KMS.

Which solution will meet these requirements?

Options:

A.

Use server-side encryption with Amazon S3 managed keys (SSE-S3).


B.

Use an S3 Bucket Key for server-side encryption with AWS KMS keys (SSE-KMS) on the new objects.


C.

Use client-side encryption with AWS KMS customer managed keys.


D.

Use server-side encryption with customer-provided keys (SSE-C) stored in AWS KMS.


Expert Solution
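
Study note: the S3 Bucket Key in option B is a per-bucket setting on the default encryption configuration. A minimal sketch follows; the bucket name and KMS key alias are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Enabling the Bucket Key reduces the number of requests S3 makes to AWS KMS,
# which lowers SSE-KMS request costs for frequently accessed objects.
s3.put_bucket_encryption(
    Bucket="frequently-accessed-data",               # placeholder
    ServerSideEncryptionConfiguration={
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": "alias/app-data-key",  # placeholder alias
            },
            "BucketKeyEnabled": True,
        }]
    },
)
```
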
Question # 70:

A company uses Amazon EC2 instances behind an Application Load Balancer (ALB) to serve content to users. The company uses Amazon Elastic Block Store (Amazon EBS) volumes to store data.

The company needs to encrypt data in transit and at rest.

Which combination of services will meet these requirements? (Select TWO.)

Options:

A.

Amazon GuardDuty


B.

AWS Shield


C.

AWS Certificate Manager (ACM)


D.

AWS Secrets Manager


E.

AWS Key Management Service (AWS KMS)


Expert Solution
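
Study note: to connect the options to the two requirements, ACM supplies the TLS certificate for encryption in transit at the ALB, and KMS keys encrypt EBS volumes at rest. A hedged sketch with placeholder names follows.

```python
import boto3

acm = boto3.client("acm")
ec2 = boto3.client("ec2")

# TLS certificate to attach to the ALB's HTTPS listener (encryption in transit).
cert_arn = acm.request_certificate(
    DomainName="www.example.com",   # placeholder domain
    ValidationMethod="DNS",
)["CertificateArn"]

# EBS volume encrypted at rest with a KMS key.
ec2.create_volume(
    AvailabilityZone="us-east-1a",
    Size=100,
    VolumeType="gp3",
    Encrypted=True,
    KmsKeyId="alias/aws/ebs",       # or a customer managed KMS key
)
```
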
Question # 71:

A company needs to design a resilient web application to process customer orders. The web application must automatically handle increases in web traffic and application usage without affecting the customer experience or losing customer orders.

Which solution will meet these requirements?

Options:

A.

Use a NAT gateway to manage web traffic. Use Amazon EC2 Auto Scaling groups to receive, process, and store processed customer orders. Use an AWS Lambda function to capture and store unprocessed orders.


B.

Use a Network Load Balancer (NLB) to manage web traffic. Use an Application Load Balancer to receive customer orders from the NLB. Use Amazon Redshift with a Multi-AZ deployment to store unprocessed and processed customer orders.


C.

Use a Gateway Load Balancer (GWLB) to manage web traffic. Use Amazon Elastic Container Service (Amazon ECS) to receive and process customer orders. Use the GWLB to capture and store unprocessed orders. Use Amazon DynamoDB to store processed customer orders.


D.

Use an Application Load Balancer to manage web traffic. Use Amazon EC2 Auto Scaling groups to receive and process customer orders. Use Amazon Simple Queue Service (Amazon SQS) to store unprocessed orders. Use Amazon RDS with a Multi-AZ deployment to store processed customer orders.


Expert Solution
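
Study note: to make the buffering idea in option D concrete, here is a minimal sketch of producing unprocessed orders to SQS and draining them from a worker in the Auto Scaling group. The queue URL and the save_order_to_rds helper are hypothetical.

```python
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/111122223333/unprocessed-orders"  # placeholder

def save_order_to_rds(order):
    """Placeholder for writing a processed order to RDS (Multi-AZ)."""

def enqueue_order(order):
    # Web tier: buffer the order in SQS so a traffic spike never drops it.
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(order))

def drain_orders():
    # Worker tier: process orders at its own pace, then delete them.
    resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                               MaxNumberOfMessages=10,
                               WaitTimeSeconds=20)
    for msg in resp.get("Messages", []):
        save_order_to_rds(json.loads(msg["Body"]))
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```
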
Question # 72:

A company has an application that uses an Amazon RDS for PostgreSQL database. The company is developing an application feature that will store sensitive information for an individual in the database.

During a security review of the environment, the company discovers that the RDS DB instance is not encrypting data at rest. The company needs a solution that will provide encryption at rest for all the existing data and for any new data that is entered for an individual.

Which combination of steps should the company take to meet these requirements? (Select TWO.)

Options:

A.

Create a snapshot of the DB instance. Enable encryption on the snapshot. Use the encrypted snapshot to create a new DB instance. Adjust the application configuration to use the new DB instance.


B.

Create a snapshot of the DB instance. Create an encrypted copy of the snapshot. Use the encrypted snapshot to create a new DB instance. Adjust the application configuration to use the new DB instance.


C.

Modify the configuration of the DB instance by enabling encryption. Create a snapshot of the DB instance. Use the snapshot to create a new DB instance. Adjust the application configuration to use the new DB instance.


D.

Use AWS Key Management Service (AWS KMS) to create a new default AWS managed aws/rds key. Select this key as the encryption key for operations with Amazon RDS.


E.

Use AWS Key Management Service (AWS KMS) to create a new customer managed key. Select this key as the encryption key for operations with Amazon RDS.


Expert Solution
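
Study note: the snapshot-copy path described in option B maps to a small sequence of RDS API calls. The instance identifiers and KMS key alias below are placeholders.

```python
import boto3

rds = boto3.client("rds")

# 1. Snapshot the existing (unencrypted) DB instance.
rds.create_db_snapshot(
    DBInstanceIdentifier="app-postgres",
    DBSnapshotIdentifier="app-postgres-plain",
)
rds.get_waiter("db_snapshot_available").wait(DBSnapshotIdentifier="app-postgres-plain")

# 2. Copy the snapshot with encryption, using a customer managed KMS key.
rds.copy_db_snapshot(
    SourceDBSnapshotIdentifier="app-postgres-plain",
    TargetDBSnapshotIdentifier="app-postgres-encrypted",
    KmsKeyId="alias/rds-data-key",   # placeholder alias
)
rds.get_waiter("db_snapshot_available").wait(DBSnapshotIdentifier="app-postgres-encrypted")

# 3. Restore a new, encrypted DB instance and point the application at it.
rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="app-postgres-encrypted",
    DBSnapshotIdentifier="app-postgres-encrypted",
)
```
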
Question # 73:

A company wants to send data from its on-premises systems to Amazon S3 buckets. The company created the S3 buckets in three different accounts. The company must send the data privately without traveling across the internet. The company has no existing dedicated connectivity to AWS.

Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

Options:

A.

Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a private VIF between the on-premises environment and the private VPC.


B.

Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Set up an AWS Direct Connect connection with a public VIF between the on-premises environment and the private VPC.


C.

Create an Amazon S3 interface endpoint in the networking account.


D.

Create an Amazon S3 gateway endpoint in the networking account.


E.

Establish a networking account in the AWS Cloud. Create a private VPC in the networking account. Peer VPCs from the accounts that host the S3 buckets with the VPC in the network account.


Expert Solution
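
Study note: the S3 interface endpoint mentioned in option C can be created in the networking account's VPC as sketched below. The VPC, subnet, and security group IDs and the Region are placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Interface endpoint for S3: private IPs inside the VPC, reachable from
# on premises over the Direct Connect private VIF.
ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",                 # placeholder
    ServiceName="com.amazonaws.us-east-1.s3",
    SubnetIds=["subnet-0123456789abcdef0"],        # placeholder
    SecurityGroupIds=["sg-0123456789abcdef0"],     # placeholder
)
```
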
Question # 74:

A solutions architect is building an Amazon S3 data lake for a company. The company uses Amazon Kinesis Data Firehose to ingest customer personally identifiable information (PII) and transactional data in near real-time to an S3 bucket. The company needs to mask all PII data before storing the data in the data lake.

Which solution will meet these requirements?

Options:

A.

Create an AWS Lambda function to detect and mask PII. Invoke the function from Kinesis Data Firehose.


B.

Use Amazon Macie to scan the S3 bucket. Configure Macie to detect and mask PII.


C.

Enable server-side encryption (SSE) on the S3 bucket.


D.

Create an AWS Lambda function that integrates with AWS CloudHSM. Configure the function to detect and mask PII.


Expert Solution
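
Study note: option A relies on Kinesis Data Firehose data transformation, which invokes a Lambda function with base64-encoded records and expects them back with recordId, result, and data fields. The email-masking rule below is an assumption used purely for illustration.

```python
import base64
import re

# Hypothetical PII rule: mask anything that looks like an email address.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        payload = base64.b64decode(record["data"]).decode("utf-8")
        masked = EMAIL_RE.sub("***MASKED***", payload)
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",   # Firehose delivers the transformed record to S3
            "data": base64.b64encode(masked.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}
```
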
Question # 75:

A company has an e-commerce site. The site is designed as a distributed web application hosted in multiple AWS accounts under one AWS Organizations organization. The web application is composed of multiple microservices. All microservices expose their AWS services either through Amazon CloudFront distributions or public Application Load Balancers (ALBs). The company wants to protect public endpoints from malicious attacks and monitor security configurations.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use AWS WAF to protect the public endpoints. Use AWS Firewall Manager from a dedicated security account to manage rules in AWS WAF. Use AWS Config rules to monitor the Regional and global WAF configurations.


B.

Use AWS WAF to protect the public endpoints. Apply AWS WAF rules in each account. Use AWS Config rules and AWS Security Hub to monitor the WAF configurations of the ALBs and the CloudFront distributions.


C.

Use AWS WAF to protect the public endpoints. Use AWS Firewall Manager from a dedicated security account to manage the rules in AWS WAF. Use Amazon Inspector and AWS Security Hub to monitor the WAF configurations of the ALBs and the CloudFront distributions.


D.

Use AWS Shield Advanced to protect the public endpoints. Use AWS Config rules to monitor the Shield Advanced configuration for each account.


Expert Solution
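
Study note: whichever management layer is chosen, the underlying protection in options A-C is an AWS WAF web ACL. Below is a hedged wafv2 sketch using an AWS managed rule group; the names are placeholders, and CLOUDFRONT-scoped web ACLs must be created through us-east-1. Distributing such a web ACL across the organization's accounts is what AWS Firewall Manager would add on top.

```python
import boto3

# CLOUDFRONT scope requires the us-east-1 endpoint; use Scope="REGIONAL" for ALBs.
wafv2 = boto3.client("wafv2", region_name="us-east-1")

wafv2.create_web_acl(
    Name="public-endpoints-acl",                  # placeholder
    Scope="CLOUDFRONT",
    DefaultAction={"Allow": {}},
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "public-endpoints-acl",
    },
    Rules=[{
        "Name": "aws-common-rule-set",
        "Priority": 0,
        "Statement": {
            "ManagedRuleGroupStatement": {
                "VendorName": "AWS",
                "Name": "AWSManagedRulesCommonRuleSet",
            }
        },
        "OverrideAction": {"None": {}},
        "VisibilityConfig": {
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": "aws-common-rule-set",
        },
    }],
)
```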