AWS Certified Solutions Architect - Associate (SAA-C03) Practice Questions and Answers | CertsForce

Question # 201:

A company needs the ability to analyze the log files of its proprietary application. The logs are stored in JSON format in an Amazon S3 bucket. Queries will be simple and will run on demand. A solutions architect needs to perform the analysis with minimal changes to the existing architecture.

What should the solutions architect do to meet these requirements with the LEAST amount of operational overhead?

Options:

A.

Use Amazon Redshift to load all the content into one place and run the SQL queries as needed


B.

Use Amazon CloudWatch Logs to store the logs. Run SQL queries as needed from the Amazon CloudWatch console.


C.

Use Amazon Athena directly with Amazon S3 to run the queries as needed


D.

Use AWS Glue to catalog the logs. Use a transient Apache Spark cluster on Amazon EMR to run the SQL queries as needed.


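For context, querying JSON logs in place with Amazon Athena requires no new infrastructure. A minimal boto3 sketch, assuming a table named app_logs already describes the JSON layout; the database, table, and bucket names are illustrative placeholders:

```python
import boto3

athena = boto3.client("athena")

# Run an ad hoc SQL query directly against the JSON logs in S3.
response = athena.start_query_execution(
    QueryString="SELECT level, COUNT(*) AS n FROM app_logs GROUP BY level",
    QueryExecutionContext={"Database": "logs_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(response["QueryExecutionId"])  # poll get_query_execution for status
```
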
Question # 202:

A solutions architect is using Amazon S3 to design the storage architecture of a new digital media application. The media files must be resilient to the loss of an Availability Zone. Some files are accessed frequently, while other files are rarely accessed in an unpredictable pattern. The solutions architect must minimize the costs of storing and retrieving the media files.

Which storage option meets these requirements?

Options:

A.

S3 Standard


B.

S3 Intelligent-Tiering


C.

S3 Standard-Infrequent Access (S3 Standard-IA)


D.

S3 One Zone-Infrequent Access (S3 One Zone-IA)


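For context, S3 Intelligent-Tiering moves objects between access tiers automatically, which suits unpredictable access patterns. A minimal boto3 sketch; the bucket, key, and file names are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# Upload a media file into the Intelligent-Tiering storage class.
with open("clip-0001.mp4", "rb") as f:
    s3.put_object(
        Bucket="example-media-bucket",
        Key="videos/clip-0001.mp4",
        Body=f,
        StorageClass="INTELLIGENT_TIERING",
    )
```
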
Question # 203:

A company is building an application in the AWS Cloud. The application will store data in Amazon S3 buckets in two AWS Regions. The company must use an AWS Key Management Service (AWS KMS) customer managed key to encrypt all data that is stored in the S3 buckets. The data in both S3 buckets must be encrypted and decrypted with the same KMS key. The data and the key must be stored in each of the two Regions.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure replication between the S3 buckets.


B.

Create a customer managed multi-Region KMS key. Create an S3 bucket in each Region. Configure replication between the S3 buckets. Configure the application to use the KMS key with client-side encryption.


C.

Create a customer managed KMS key and an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure replication between the S3 buckets.


D.

Create a customer managed KMS key and an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with AWS KMS keys (SSE-KMS). Configure replication between the S3 buckets.


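For context, a multi-Region KMS key is created once and then replicated, so both Regions share the same key material. A minimal boto3 sketch; the Regions and bucket name are placeholders:

```python
import boto3

kms = boto3.client("kms", region_name="us-east-1")

# Create the primary multi-Region key, then replicate it to the second Region.
key_id = kms.create_key(MultiRegion=True,
                        Description="app data key")["KeyMetadata"]["KeyId"]
kms.replicate_key(KeyId=key_id, ReplicaRegion="eu-west-1")

# Use the key for SSE-KMS default encryption on this Region's bucket.
s3 = boto3.client("s3", region_name="us-east-1")
s3.put_bucket_encryption(
    Bucket="example-bucket-use1",
    ServerSideEncryptionConfiguration={"Rules": [{
        "ApplyServerSideEncryptionByDefault": {
            "SSEAlgorithm": "aws:kms",
            "KMSMasterKeyID": key_id,
        },
    }]},
)
```
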
Question # 204:

A company observes an increase in Amazon EC2 costs in its most recent bill. The billing team notices unwanted vertical scaling of instance types for a couple of EC2 instances. A solutions architect needs to create a graph comparing the last 2 months of EC2 costs and perform an in-depth analysis to identify the root cause of the vertical scaling.

How should the solutions architect generate the information with the LEAST operational overhead?

Options:

A.

Use AWS Budgets to create a budget report and compare EC2 costs based on instance types


B.

Use Cost Explorer's granular filtering feature to perform an in-depth analysis of EC2 costs based on instance types


C.

Use graphs from the AWS Billing and Cost Management dashboard to compare EC2 costs based on instance types for the last 2 months


D.

Use AWS Cost and Usage Reports to create a report and send it to an Amazon S3 bucket. Use Amazon QuickSight with Amazon S3 as a source to generate an interactive graph based on instance types.


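For context, the Cost Explorer API exposes the same grouping and filtering that the console offers. A minimal boto3 sketch that groups two months of EC2 cost by instance type; the dates are placeholders:

```python
import boto3

ce = boto3.client("ce")

# Two months of EC2 cost, grouped by instance type.
result = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-01-01", "End": "2024-03-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    Filter={"Dimensions": {
        "Key": "SERVICE",
        "Values": ["Amazon Elastic Compute Cloud - Compute"],
    }},
    GroupBy=[{"Type": "DIMENSION", "Key": "INSTANCE_TYPE"}],
)
for group in result["ResultsByTime"][0]["Groups"]:
    print(group["Keys"], group["Metrics"]["UnblendedCost"]["Amount"])
```
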
Question # 205:

A company has a large Microsoft SharePoint deployment running on-premises that requires Microsoft Windows shared file storage. The company wants to migrate this workload to the AWS Cloud and is considering various storage options. The storage solution must be highly available and integrated with Active Directory for access control.

Which solution will satisfy these requirements?

Options:

A.

Configure Amazon EFS storage and set the Active Directory domain for authentication


B.

Create an SMB file share on an AWS Storage Gateway file gateway in two Availability Zones.


C.

Create an Amazon S3 bucket and configure Microsoft Windows Server to mount it as a volume


D.

Create an Amazon FSx for Windows File Server file system on AWS and set the Active Directory domain for authentication


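For context, Amazon FSx for Windows File Server exposes SMB shares and joins an Active Directory domain natively. A minimal boto3 sketch for a Multi-AZ file system; the subnet and directory IDs and capacity values are placeholders:

```python
import boto3

fsx = boto3.client("fsx")

# Multi-AZ SMB file system joined to an AWS Managed Microsoft AD directory.
fsx.create_file_system(
    FileSystemType="WINDOWS",
    StorageCapacity=1024,  # GiB
    SubnetIds=["subnet-aaa111", "subnet-bbb222"],
    WindowsConfiguration={
        "ActiveDirectoryId": "d-1234567890",
        "DeploymentType": "MULTI_AZ_1",
        "PreferredSubnetId": "subnet-aaa111",
        "ThroughputCapacity": 32,  # MB/s
    },
)
```
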
Question # 206:

A company has thousands of edge devices that collectively generate 1 TB of status alerts each day. Each alert is approximately 2 KB in size. A solutions architect needs to implement a solution to ingest and store the alerts for future analysis.

The company wants a highly available solution. However, the company needs to minimize costs and does not want to manage additional infrastructure. Additionally, the company wants to keep 14 days of data available for immediate analysis and archive any data older than 14 days.

What is the MOST operationally efficient solution that meets these requirements?

Options:

A.

Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.


B.

Launch Amazon EC2 instances across two Availability Zones and place them behind an Elastic Load Balancer to ingest the alerts. Create a script on the EC2 instances that will store the alerts in an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.


C.

Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon Elasticsearch Service (Amazon ES) cluster. Set up the Amazon ES cluster to take manual snapshots every day and delete data from the cluster that is older than 14 days.


D.

Create an Amazon Simple Queue Service (Amazon SQS) standard queue to ingest the alerts, and set the message retention period to 14 days. Configure consumers to poll the SQS queue, check the age of each message, and analyze the message data as needed. If a message is 14 days old, the consumer should copy the message to an Amazon S3 bucket and delete the message from the SQS queue.


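For context, the archival half of the Firehose-to-S3 option is a plain S3 Lifecycle rule. A minimal boto3 sketch; the bucket name is a placeholder:

```python
import boto3

s3 = boto3.client("s3")

# Move alert objects to S3 Glacier once they are 14 days old.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-alert-archive",
    LifecycleConfiguration={"Rules": [{
        "ID": "archive-after-14-days",
        "Status": "Enabled",
        "Filter": {"Prefix": ""},  # apply to every object
        "Transitions": [{"Days": 14, "StorageClass": "GLACIER"}],
    }]},
)
```
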
Question # 207:

A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects. According to the company's security regulations, no traffic from the applications is allowed to travel across the internet.

Which solution will meet these requirements?

Options:

A.

Configure an S3 interface endpoint.


B.

Configure an S3 gateway endpoint.


C.

Create an S3 bucket in a private subnet.


D.

Create an S3 bucket in the same Region as the EC2 instance.


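For context, a gateway endpoint for S3 adds a route so S3 API calls never traverse the internet. A minimal boto3 sketch; the VPC ID, route table ID, and Region are placeholders:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Route S3 traffic through a gateway endpoint instead of the internet.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0abc1234",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0def5678"],
)
```
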
Question # 208:

A development team runs monthly resource-intensive tests on its general purpose Amazon RDS for MySQL DB instance with Performance Insights enabled. The testing lasts for 48 hours once a month and is the only process that uses the database. The team wants to reduce the cost of running the tests without reducing the compute and memory attributes of the DB instance.

Which solution meets these requirements MOST cost-effectively?

Options:

A.

Stop the DB instance when tests are completed. Restart the DB instance when required.


B.

Use an Auto Scaling policy with the DB instance to automatically scale when tests are completed.


C.

Create a snapshot when tests are completed. Terminate the DB instance and restore the snapshot when required.


D.

Modify the DB instance to a low-capacity instance when tests are completed. Modify the DB instance again when required.


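For context, stopping an RDS instance pauses compute billing while keeping the instance configuration, Performance Insights settings, and storage intact. A minimal boto3 sketch; the identifier is a placeholder:

```python
import boto3

rds = boto3.client("rds")

# After the 48-hour test window ends:
rds.stop_db_instance(DBInstanceIdentifier="test-mysql-db")

# Before the next monthly test run (note that RDS automatically restarts
# a stopped instance after seven days, so a scheduled re-stop may be needed):
rds.start_db_instance(DBInstanceIdentifier="test-mysql-db")
```
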
Question # 209:

A solutions architect is developing a multiple-subnet VPC architecture. The solution will consist of six subnets in two Availability Zones. The subnets are defined as public, private and dedicated for databases. Only the Amazon EC2 instances running in the private subnets should be able to access a database.

Which solution meets these requirements?

Options:

A.

Create a new route table that excludes the route to the public subnets' CIDR blocks. Associate the route table with the database subnets.


B.

Create a security group that denies ingress from the security group used by instances in the public subnets. Attach the security group to an Amazon RDS DB instance.


C.

Create a security group that allows ingress from the security group used by instances in the private subnets. Attach the security group to an Amazon RDS DB instance.


D.

Create a new peering connection between the public subnets and the private subnets. Create a different peering connection between the private subnets and the database subnets.


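For context, a security group rule can reference another security group as its source, which scopes database access to the private-subnet instances regardless of their IP addresses. A minimal boto3 sketch; the group IDs and port are placeholders:

```python
import boto3

ec2 = boto3.client("ec2")

# Allow MySQL traffic to the database SG only from the private subnets' SG.
ec2.authorize_security_group_ingress(
    GroupId="sg-db111111",  # attached to the RDS DB instance
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 3306,
        "ToPort": 3306,
        "UserIdGroupPairs": [{"GroupId": "sg-app22222"}],  # private-subnet instances
    }],
)
```
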
Question # 210:

A company wants to improve its ability to clone large amounts of production data into a test environment in the same AWS Region. The data is stored in Amazon EC2 instances on Amazon Elastic Block Store (Amazon EBS) volumes. Modifications to the cloned data must not affect the production environment. The software that accesses this data requires consistently high I/O performance.

A solutions architect needs to minimize the time that is required to clone the production data into the test environment.

Which solution will meet these requirements?

Options:

A.

Take EBS snapshots of the production EBS volumes. Restore the snapshots onto EC2 instance store volumes in the test environment.


B.

Configure the production EBS volumes to use the EBS Multi-Attach feature. Take EBS snapshots of the production EBS volumes. Attach the production EBS volumes to the EC2 instances in the test environment.


C.

Take EBS snapshots of the production EBS volumes. Create and initialize new EBS volumes. Attach the new EBS volumes to EC2 instances in the test environment before restoring the volumes from the production EBS snapshots.


D.

Take EBS snapshots of the production EBS volumes. Turn on the EBS fast snapshot restore feature on the EBS snapshots. Restore the snapshots into new EBS volumes. Attach the new EBS volumes to EC2 instances in the test environment.


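For context, EBS fast snapshot restore removes the first-read initialization penalty for volumes created from a snapshot. A minimal boto3 sketch; the volume ID and Availability Zones are placeholders:

```python
import boto3

ec2 = boto3.client("ec2")

# Snapshot production, then enable fast snapshot restore per AZ.
snapshot_id = ec2.create_snapshot(VolumeId="vol-0abc1234")["SnapshotId"]
ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[snapshot_id])

ec2.enable_fast_snapshot_restores(
    AvailabilityZones=["us-east-1a", "us-east-1b"],
    SourceSnapshotIds=[snapshot_id],
)
# Volumes restored from this snapshot now deliver full performance immediately.
```
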
Question # 211:

A company wants to move a multi-tiered application from on premises to the AWS Cloud to improve the application's performance. The application consists of application tiers that communicate with each other by way of RESTful services. Transactions are dropped when one tier becomes overloaded. A solutions architect must design a solution that resolves these issues and modernizes the application.

Which solution meets these requirements and is the MOST operationally efficient?

Options:

A.

Use Amazon API Gateway and direct transactions to the AWS Lambda functions as the application layer. Use Amazon Simple Queue Service (Amazon SQS) as the communication layer between application services.


B.

Use Amazon CloudWatch metrics to analyze the application performance history to determine the server's peak utilization during the performance failures. Increase the size of the application server's Amazon EC2 instances to meet the peak requirements.


C.

Use Amazon Simple Notification Service (Amazon SNS) to handle the messaging between application servers running on Amazon EC2 in an Auto Scaling group. Use Amazon CloudWatch to monitor the SNS queue length and scale up and down as required.


D.

Use Amazon Simple Queue Service (Amazon SQS) to handle the messaging between application servers running on Amazon EC2 in an Auto Scaling group. Use Amazon CloudWatch to monitor the SQS queue length and scale up when communication failures are detected.


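For context, the common thread in the queue-based options is that a queue buffers bursts between tiers, so an overloaded consumer delays work instead of dropping it. A minimal boto3 sketch; the queue name and payload are placeholders:

```python
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.create_queue(QueueName="orders-tier-queue")["QueueUrl"]

# Producer tier: enqueue the transaction instead of calling the next tier directly.
sqs.send_message(QueueUrl=queue_url, MessageBody='{"transaction_id": "t-123"}')

# Consumer tier: poll at its own pace; unprocessed work waits in the queue.
messages = sqs.receive_message(QueueUrl=queue_url, WaitTimeSeconds=10)
```
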
Question # 212:

A company wants to run its critical applications in containers to meet requirements for scalability and availability. The company prefers to focus on maintenance of the critical applications. The company does not want to be responsible for provisioning and managing the underlying infrastructure that runs the containerized workload.

What should a solutions architect do to meet these requirements?

Options:

A.

Use Amazon EC2 instances, and install Docker on the instances.


B.

Use Amazon Elastic Container Service (Amazon ECS) on Amazon EC2 worker nodes


C.

Use Amazon Elastic Container Service (Amazon ECS) on AWS Fargate


D.

Use Amazon EC2 instances from an Amazon Elastic Container Service (Amazon ECS)-optimized Amazon Machine Image (AMI).


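For context, with the Fargate launch type there are no EC2 worker nodes to provision or patch; the task definition declares CPU and memory and AWS runs it. A minimal boto3 sketch; the family name and image URI are placeholders:

```python
import boto3

ecs = boto3.client("ecs")

# A task definition that can run on Fargate (no self-managed instances).
ecs.register_task_definition(
    family="critical-app",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="256",
    memory="512",
    containerDefinitions=[{
        "name": "app",
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/app:latest",
        "essential": True,
    }],
)
```
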
Question # 213:

An Amazon EC2 administrator created the following policy, which is associated with an IAM group that contains several users.

What is the effect of this policy?

Options:

A.

Users can terminate an EC2 instance in any AWS Region except us-east-1.


B.

Users can terminate an EC2 instance with the IP address 10.100.100.1 in the us-east-1 Region.


C.

Users can terminate an EC2 instance in the us-east-1 Region when the user's source IP is 10.100.100.254.


D.

Users cannot terminate an EC2 instance in the us-east-1 Region when the user's source IP is 10.100.100.254.


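The policy document this question refers to is not reproduced on this page. Purely as a hypothetical illustration of the shape such a policy usually takes, the sketch below combines a Region condition with a source-IP condition; every value in it is invented, not taken from the exam item:

```python
import json

# Hypothetical policy: both conditions must match for the Allow to apply.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "ec2:TerminateInstances",
        "Resource": "*",
        "Condition": {
            "StringEquals": {"aws:RequestedRegion": "us-east-1"},
            "IpAddress": {"aws:SourceIp": "10.100.100.0/24"},
        },
    }],
}
print(json.dumps(policy, indent=2))
```
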
Question # 214:

A company is hosting a static website on Amazon S3 and is using Amazon Route 53 for DNS. The website is experiencing increased demand from around the world. The company must decrease latency for users who access the website.

Which solution meets these requirements MOST cost-effectively?

Options:

A.

Replicate the S3 bucket that contains the website to all AWS Regions. Add Route 53 geolocation routing entries.


B.

Provision accelerators in AWS Global Accelerator. Associate the supplied IP addresses with the S3 bucket. Edit the Route 53 entries to point to the IP addresses of the accelerators.


C.

Add an Amazon CloudFront distribution in front of the S3 bucket. Edit the Route 53 entries to point to the CloudFront distribution.


D.

Enable S3 Transfer Acceleration on the bucket. Edit the Route 53 entries to point to the new endpoint.


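For context, placing a CloudFront distribution in front of the bucket caches the site at edge locations worldwide. A minimal boto3 sketch using the legacy cache settings; the bucket domain and comment are placeholders:

```python
import boto3, time

cf = boto3.client("cloudfront")

# Distribution with the S3 bucket as its single origin.
cf.create_distribution(DistributionConfig={
    "CallerReference": str(time.time()),  # any unique string
    "Comment": "static website",
    "Enabled": True,
    "Origins": {"Quantity": 1, "Items": [{
        "Id": "s3-origin",
        "DomainName": "example-site.s3.amazonaws.com",
        "S3OriginConfig": {"OriginAccessIdentity": ""},
    }]},
    "DefaultCacheBehavior": {
        "TargetOriginId": "s3-origin",
        "ViewerProtocolPolicy": "redirect-to-https",
        "MinTTL": 0,
        "ForwardedValues": {"QueryString": False,
                            "Cookies": {"Forward": "none"}},
    },
})
# Route 53 would then alias the site's record to the distribution's domain name.
```
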
Question # 215:

A company is building an ecommerce web application on AWS. The application sends information about new orders to an Amazon API Gateway REST API to process. The company wants to ensure that orders are processed in the order that they are received.

Which solution will meet these requirements?

Options:

A.

Use an API Gateway integration to publish a message to an Amazon Simple Notification Service (Amazon SNS) topic when the application receives an order. Subscribe an AWS Lambda function to the topic to perform processing.


B.

Use an API Gateway integration to send a message to an Amazon Simple Queue Service (Amazon SQS) FIFO queue when the application receives an order. Configure the SQS FIFO queue to invoke an AWS Lambda function for processing.


C.

Use an API Gateway authorizer to block any requests while the application processes an order.


D.

Use an API Gateway integration to send a message to an Amazon Simple Queue Service (Amazon SQS) standard queue when the application receives an order. Configure the SQS standard queue to invoke an AWS Lambda function for processing.


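For context, an SQS FIFO queue preserves ordering within a message group and deduplicates on a deduplication ID. A minimal boto3 sketch; the queue name and message values are placeholders:

```python
import boto3

sqs = boto3.client("sqs")

# FIFO queue names must end in ".fifo".
queue_url = sqs.create_queue(
    QueueName="orders.fifo",
    Attributes={"FifoQueue": "true"},
)["QueueUrl"]

sqs.send_message(
    QueueUrl=queue_url,
    MessageBody='{"order_id": "1001"}',
    MessageGroupId="orders",              # ordering is per message group
    MessageDeduplicationId="order-1001",  # or enable content-based dedup
)
```
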
Question # 216:

A company needs to store data in Amazon S3 and must prevent the data from being changed. The company wants new objects that are uploaded to Amazon S3 to remain unchangeable for a nonspecific amount of time until the company decides to modify the objects. Only specific users in the company’s AWS account can have the ability to delete the objects.

What should a solutions architect do to meet these requirements?

Options:

A.

Create an S3 Glacier vault. Apply a write-once, read-many (WORM) vault lock policy to the objects.


B.

Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Set a retention period of 100 years. Use governance mode as the S3 bucket's default retention mode for new objects.


C.

Create an S3 bucket. Use AWS CloudTrail to track any S3 API events that modify the objects. Upon notification, restore the modified objects from any backup versions that the company has.


D.

Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Add a legal hold to the objects. Add the s3:PutObjectLegalHold permission to the IAM policies of users who need to delete the objects.


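For context, an S3 Object Lock legal hold blocks object changes indefinitely until an authorized user removes the hold, which matches an open-ended retention requirement. A minimal boto3 sketch; the bucket and key names are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# Object Lock must be enabled when the bucket is created (this also
# enables versioning).
s3.create_bucket(Bucket="example-locked-bucket",
                 ObjectLockEnabledForBucket=True)

s3.put_object(Bucket="example-locked-bucket", Key="doc.pdf", Body=b"placeholder")

# The hold has no expiry; removing it requires s3:PutObjectLegalHold.
s3.put_object_legal_hold(
    Bucket="example-locked-bucket",
    Key="doc.pdf",
    LegalHold={"Status": "ON"},
)
```
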
Question # 217:

A company has an AWS Glue extract, transform, and load (ETL) job that runs every day at the same time. The job processes XML data that is in an Amazon S3 bucket.

New data is added to the S3 bucket every day. A solutions architect notices that AWS Glue is processing all the data during each run.

What should the solutions architect do to prevent AWS Glue from reprocessing old data?

Options:

A.

Edit the job to use job bookmarks.


B.

Edit the job to delete data after the data is processed.


C.

Edit the job by setting the NumberOfWorkers field to 1.


D.

Use a FindMatches machine learning (ML) transform.


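For context, AWS Glue job bookmarks persist state between runs so previously processed S3 data is skipped. A minimal boto3 sketch; the job name is a placeholder:

```python
import boto3

glue = boto3.client("glue")

# Enable bookmarks for this run; only data added since the last run is read.
glue.start_job_run(
    JobName="daily-xml-etl",
    Arguments={"--job-bookmark-option": "job-bookmark-enable"},
)
```
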
Question # 218:

A company performs monthly maintenance on its AWS infrastructure. During these maintenance activities, the company needs to rotate the credentials for its Amazon RDS for MySQL databases across multiple AWS Regions.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Store the credentials as secrets in AWS Secrets Manager. Use multi-Region secret replication for the required Regions. Configure Secrets Manager to rotate the secrets on a schedule.


B.

Store the credentials as secrets in AWS Systems Manager by creating a secure string parameter. Use multi-Region secret replication for the required Regions. Configure Systems Manager to rotate the secrets on a schedule.


C.

Store the credentials in an Amazon S3 bucket that has server-side encryption (SSE) enabled. Use Amazon EventBridge (Amazon CloudWatch Events) to invoke an AWS Lambda function to rotate the credentials.


D.

Encrypt the credentials as secrets by using AWS Key Management Service (AWS KMS) multi-Region customer managed keys. Store the secrets in an Amazon DynamoDB global table. Use an AWS Lambda function to retrieve the secrets from DynamoDB. Use the RDS API to rotate the secrets.


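For context, Secrets Manager can replicate a secret to other Regions and rotate it on a schedule through a rotation function. A minimal boto3 sketch; the secret name, Regions, credentials, and Lambda ARN are placeholders:

```python
import boto3

sm = boto3.client("secretsmanager", region_name="us-east-1")

# Create the secret and replicate it to a second Region.
sm.create_secret(
    Name="prod/mysql/credentials",
    SecretString='{"username": "admin", "password": "example"}',
    AddReplicaRegions=[{"Region": "eu-west-1"}],
)

# Rotate monthly via a rotation Lambda function.
sm.rotate_secret(
    SecretId="prod/mysql/credentials",
    RotationLambdaARN="arn:aws:lambda:us-east-1:123456789012:function:rotate-mysql",
    RotationRules={"AutomaticallyAfterDays": 30},
)
```
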
Question # 219:

A social media company allows users to upload images to its website. The website runs on Amazon EC2 instances. During upload requests, the website resizes the images to a standard size and stores the resized images in Amazon S3. Users are experiencing slow upload requests to the website.

The company needs to reduce coupling within the application and improve website performance. A solutions architect must design the most operationally efficient process for image uploads.

Which combination of actions should the solutions architect take to meet these requirements? (Choose two.)

Options:

A.

Configure the application to upload images to S3 Glacier.


B.

Configure the web server to upload the original images to Amazon S3.


C.

Configure the application to upload images directly from each user's browser to Amazon S3 through the use of a presigned URL.


D.

Configure S3 Event Notifications to invoke an AWS Lambda function when an image is uploaded. Use the function to resize the image.


E.

Create an Amazon EventBridge (Amazon CloudWatch Events) rule that invokes an AWS Lambda function on a schedule to resize uploaded images.


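For context, a presigned POST lets the browser send the original image straight to S3, taking the upload load off the web servers. A minimal boto3 sketch; the bucket name and expiry are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# The server generates this and returns it to the browser, which then
# submits the form fields plus the file directly to S3.
post = s3.generate_presigned_post(
    Bucket="example-uploads",
    Key="originals/${filename}",  # S3 substitutes the uploaded file name
    ExpiresIn=300,                # seconds
)
# "post" contains the URL and form fields for the browser's upload form.
# An S3 event notification on "originals/" can then invoke a resize Lambda.
```
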
Question # 220:

A company is implementing a new business application. The application runs on two Amazon EC2 instances and uses an Amazon S3 bucket for document storage. A solutions architect needs to ensure that the EC2 instances can access the S3 bucket.

What should the solutions architect do to meet this requirement?

Options:

A.

Create an IAM role that grants access to the S3 bucket. Attach the role to the EC2 instances.


B.

Create an IAM policy that grants access to the S3 bucket. Attach the policy to the EC2 instances.


C.

Create an IAM group that grants access to the S3 bucket. Attach the group to the EC2 instances.


D.

Create an IAM user that grants access to the S3 bucket. Attach the user account to the EC2 instances.


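For context, EC2 instances obtain AWS credentials through an IAM role that is attached via an instance profile. A minimal boto3 sketch; all names and the bucket ARN are placeholders:

```python
import boto3, json

iam = boto3.client("iam")

# Role that EC2 is allowed to assume.
trust = {"Version": "2012-10-17", "Statement": [{
    "Effect": "Allow",
    "Principal": {"Service": "ec2.amazonaws.com"},
    "Action": "sts:AssumeRole",
}]}
iam.create_role(RoleName="app-s3-role",
                AssumeRolePolicyDocument=json.dumps(trust))

# Inline policy granting access to the document bucket.
iam.put_role_policy(
    RoleName="app-s3-role",
    PolicyName="s3-docs-access",
    PolicyDocument=json.dumps({"Version": "2012-10-17", "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject"],
        "Resource": "arn:aws:s3:::example-docs-bucket/*",
    }]}),
)

# Instance profile wraps the role so it can be attached to the EC2 instances.
iam.create_instance_profile(InstanceProfileName="app-s3-profile")
iam.add_role_to_instance_profile(InstanceProfileName="app-s3-profile",
                                 RoleName="app-s3-role")
```
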