
Pass the Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) exam with CertsForce practice questions and answers

Question # 46:

A company is deploying an application that processes streaming data in near-real time. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to provide the lowest possible latency between nodes.

Which networking solution meets these requirements?

Options:

A.

Place the EC2 instances in multiple VPCs, and configure VPC peering.


B.

Attach an Elastic Fabric Adapter (EFA) to each EC2 instance.


C.

Run the EC2 instances in a spread placement group.


D.

Use Amazon Elastic Block Store (Amazon EBS) optimized instance types.


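For reference, a minimal boto3 sketch of how an Elastic Fabric Adapter (option B) can be requested at instance launch; the AMI, subnet, security group IDs, and instance type are placeholder assumptions:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch an instance with an Elastic Fabric Adapter (EFA) as its primary
# network interface. All IDs below are placeholders; the instance type
# must be one that supports EFA.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="c5n.18xlarge",
    MinCount=1,
    MaxCount=1,
    NetworkInterfaces=[
        {
            "DeviceIndex": 0,
            "SubnetId": "subnet-0123456789abcdef0",
            "Groups": ["sg-0123456789abcdef0"],
            "InterfaceType": "efa",  # request an EFA instead of a standard ENI
        }
    ],
)
print(response["Instances"][0]["InstanceId"])
```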
Question # 47:

A gaming company is building an application that uses a database to store user data. The company wants the database to have an active-active configuration that allows data writes to a secondary AWS Region. The database must achieve a sub-second recovery point objective (RPO).

Which solution will meet these requirements?

Options:

A.

Deploy an Amazon ElastiCache (Redis OSS) cluster. Configure a global data store for disaster recovery. Configure the ElastiCache cluster to cache data from an Amazon RDS database that is deployed in the primary Region.


B.

Deploy an Amazon DynamoDB table in the primary Region and the secondary Region. Configure Amazon DynamoDB Streams to invoke an AWS Lambda function to write changes from the table in the primary Region to the table in the secondary Region.


C.

Deploy an Amazon Aurora MySQL database in the primary Region. Configure a global database for the secondary Region.


D.

Deploy an Amazon DynamoDB table in the primary Region. Configure global tables for the secondary Region.


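For reference, a minimal boto3 sketch of adding a DynamoDB global table replica (option D); the table name and Region are placeholder assumptions, and the table must already have DynamoDB Streams enabled with NEW_AND_OLD_IMAGES:

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Add a replica in a secondary Region to an existing table (names and
# Regions are placeholders). With global tables, every replica accepts
# writes, giving an active-active configuration.
dynamodb.update_table(
    TableName="UserData",
    ReplicaUpdates=[
        {"Create": {"RegionName": "eu-west-1"}}
    ],
)
```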
Question # 48:

A company needs a data encryption solution for a machine learning (ML) process. The solution must use an AWS managed service. The ML process currently reads a large number of objects in Amazon S3 that are encrypted by a customer managed AWS KMS key. The current process incurs significant costs because of excessive calls to AWS Key Management Service (AWS KMS) to decrypt S3 objects. The company wants to reduce the costs of API calls to decrypt S3 objects.

Options:

A.

Switch from a customer managed KMS key to an AWS managed KMS key.


B.

Remove the AWS KMS encryption from the S3 bucket. Use a bucket policy to encrypt the data instead.


C.

Recreate the KMS key in AWS CloudHSM.


D.

Use S3 Bucket Keys to perform server-side encryption with AWS KMS keys (SSE-KMS) to encrypt and decrypt objects from Amazon S3.


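For reference, a minimal boto3 sketch of enabling an S3 Bucket Key with SSE-KMS (option D); the bucket name and KMS key ARN are placeholder assumptions:

```python
import boto3

s3 = boto3.client("s3")

# Enable SSE-KMS default encryption with an S3 Bucket Key (bucket name
# and key ARN are placeholders). The Bucket Key lets S3 reuse a
# bucket-level data key, reducing the number of calls to AWS KMS.
s3.put_bucket_encryption(
    Bucket="ml-training-data",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE",
                },
                "BucketKeyEnabled": True,
            }
        ]
    },
)
```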
Question # 49:

A company runs an application on several Amazon EC2 instances that store persistent data on an Amazon Elastic File System (Amazon EFS) file system. The company needs to replicate the data to another AWS Region by using an AWS managed service solution.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use the EFS-to-EFS backup solution to replicate the data to an EFS file system in another Region.


B.

Run a nightly script to copy data from the EFS file system to an Amazon S3 bucket. Enable S3 Cross-Region Replication on the S3 bucket.


C.

Create a VPC in another Region. Establish a cross-Region VPC peer. Run a nightly rsync to copy data from the original Region to the new Region.


D.

Use AWS Backup to create a backup plan with a rule that takes a daily backup and replicates it to another Region. Assign the EFS file system resource to the backup plan.


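For reference, a minimal boto3 sketch of an AWS Backup plan with a daily rule and a cross-Region copy action (option D); the vault names and ARNs are placeholder assumptions:

```python
import boto3

backup = boto3.client("backup", region_name="us-east-1")

# Daily backup rule that copies each recovery point to a vault in
# another Region (vault names and ARNs are placeholders). The EFS file
# system is then assigned to this plan with a backup selection.
plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "efs-daily-cross-region",
        "Rules": [
            {
                "RuleName": "daily",
                "TargetBackupVaultName": "primary-vault",
                "ScheduleExpression": "cron(0 5 * * ? *)",
                "CopyActions": [
                    {
                        "DestinationBackupVaultArn": (
                            "arn:aws:backup:eu-west-1:111122223333:backup-vault:dr-vault"
                        )
                    }
                ],
            }
        ],
    }
)
print(plan["BackupPlanId"])
```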
Question # 50:

A global ecommerce company runs its critical workloads on AWS. The workloads use an Amazon RDS for PostgreSQL DB instance that is configured for a Multi-AZ deployment.

Customers have reported application timeouts when the company undergoes database failovers. The company needs a resilient solution to reduce failover time.

Which solution will meet these requirements?

Options:

A.

Create an Amazon RDS Proxy. Assign the proxy to the DB instance.


B.

Create a read replica for the DB instance. Move the read traffic to the read replica.


C.

Enable Performance Insights. Monitor the CPU load to identify the timeouts.


D.

Take regular automatic snapshots. Copy the automatic snapshots to multiple AWS Regions.


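For reference, a minimal boto3 sketch of creating an RDS Proxy (option A); the secret ARN, role ARN, and subnet IDs are placeholder assumptions:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Create a proxy in front of the PostgreSQL DB instance (all ARNs and
# IDs are placeholders). The proxy pools connections and can shield
# clients from much of the failover interruption.
rds.create_db_proxy(
    DBProxyName="trading-db-proxy",
    EngineFamily="POSTGRESQL",
    Auth=[
        {
            "AuthScheme": "SECRETS",
            "SecretArn": "arn:aws:secretsmanager:us-east-1:111122223333:secret:db-EXAMPLE",
        }
    ],
    RoleArn="arn:aws:iam::111122223333:role/rds-proxy-role",
    VpcSubnetIds=["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
)
```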
Question # 51:

A company needs a solution to enforce data encryption at rest on Amazon EC2 instances. The solution must automatically identify noncompliant resources and enforce compliance policies on findings.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Use an IAM policy that allows users to create only encrypted Amazon Elastic Block Store (Amazon EBS) volumes. Use AWS Config and AWS Systems Manager to automate the detection and remediation of unencrypted EBS volumes.


B.

Use AWS Key Management Service (AWS KMS) to manage access to encrypted Amazon Elastic Block Store (Amazon EBS) volumes. Use AWS Lambda and Amazon EventBridge to automate the detection and remediation of unencrypted EBS volumes.


C.

Use Amazon Macie to detect unencrypted Amazon Elastic Block Store (Amazon EBS) volumes. Use AWS Systems Manager Automation rules to automatically encrypt existing and new EBS volumes.


D.

Use Amazon Inspector to detect unencrypted Amazon Elastic Block Store (Amazon EBS) volumes. Use AWS Systems Manager Automation rules to automatically encrypt existing and new EBS volumes.


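For reference, a minimal boto3 sketch of the AWS Config managed rule that flags unencrypted EBS volumes (the detection half of option A); the rule name is a placeholder assumption:

```python
import boto3

config = boto3.client("config", region_name="us-east-1")

# Managed rule that marks unencrypted EBS volumes as noncompliant.
# Remediation would then be wired up separately, for example with an
# AWS Systems Manager Automation runbook.
config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "ebs-volumes-encrypted",
        "Source": {
            "Owner": "AWS",
            "SourceIdentifier": "ENCRYPTED_VOLUMES",  # AWS managed rule
        },
    }
)
```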
Question # 52:

A solutions architect is designing the network architecture for an application that runs on Amazon EC2 instances in an Auto Scaling group. The application needs to access data that is in Amazon S3 buckets.

Traffic to the S3 buckets must not use public IP addresses. The solutions architect will deploy the application in a VPC that has public and private subnets.

Which solutions will meet these requirements? (Select TWO.)

Options:

A.

Deploy the EC2 instances in a private subnet. Configure a default route to an egress-only internet gateway.


B.

Deploy the EC2 instances in a public subnet. Create a gateway endpoint for Amazon S3. Associate the endpoint with the subnet's route table.


C.

Deploy the EC2 instances in a public subnet. Create an interface endpoint for Amazon S3. Configure DNS hostnames and DNS resolution for the VPC.


D.

Deploy the EC2 instances in a private subnet. Configure a default route to a NAT gateway in a public subnet.


E.

Deploy the EC2 instances in a private subnet. Configure a default route to a customer gateway.


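For reference, a minimal boto3 sketch of creating a gateway endpoint for Amazon S3 (as in option B); the VPC and route table IDs are placeholder assumptions:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Gateway endpoint for S3 attached to a subnet's route table (IDs are
# placeholders). Traffic to S3 then stays on the AWS network and does
# not require public IP addresses.
ec2.create_vpc_endpoint(
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    VpcEndpointType="Gateway",
    RouteTableIds=["rtb-0123456789abcdef0"],
)
```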
Question # 53:

A company runs an application in a VPC on AWS. The company's on-premises data center has a DNS server. The data center is connected to AWS through an AWS Direct Connect connection with a private virtual interface (VIF). The on-premises DNS server needs to resolve the DNS name of the application in the VPC.

Which solution will meet this requirement?

Options:

A.

Set up AWS Verified Access endpoints in the VPC. Configure DNS forwarding rules in Verified Access. Configure the on-premises DNS server to forward DNS queries through the Verified Access endpoints.


B.

Configure the Direct Connect connection to enable DNS resolution between the on-premises DNS server and the application in the VPC.


C.

Create an Amazon Route 53 Resolver outbound endpoint and a Resolver rule in the VPC. Configure the on-premises DNS server to send requests for the application to the outbound endpoint.


D.

Create an Amazon Route 53 Resolver inbound endpoint in the VPC. Configure the on-premises DNS server to send requests for the application to the inbound endpoint.


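For reference, a minimal boto3 sketch of creating a Route 53 Resolver inbound endpoint (option D); the subnet and security group IDs are placeholder assumptions:

```python
import boto3

resolver = boto3.client("route53resolver", region_name="us-east-1")

# Inbound endpoint that the on-premises DNS server can forward queries
# to over Direct Connect (IDs are placeholders; two IP addresses in
# different subnets are used for availability).
resolver.create_resolver_endpoint(
    CreatorRequestId="inbound-endpoint-2024",
    Name="vpc-inbound",
    Direction="INBOUND",
    SecurityGroupIds=["sg-0123456789abcdef0"],
    IpAddresses=[
        {"SubnetId": "subnet-0123456789abcdef0"},
        {"SubnetId": "subnet-0fedcba9876543210"},
    ],
)
```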
Question # 54:

A company provides a trading platform to customers. The platform uses an Amazon API Gateway REST API, AWS Lambda functions, and an Amazon DynamoDB table. Each trade that the platform processes invokes a Lambda function that stores the trade data in Amazon DynamoDB. The company wants to ingest trade data into a data lake in Amazon S3 for near real-time analysis.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon S3.


B.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.


C.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure Kinesis Data Streams to invoke a Lambda function that writes the data to Amazon S3.


D.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure a data stream to be the input for Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.


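For reference, a minimal boto3 sketch of wiring a DynamoDB table to Kinesis Data Streams and delivering the stream to Amazon S3 through Amazon Data Firehose (the pattern in option D); all names and ARNs are placeholder assumptions:

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")
firehose = boto3.client("firehose", region_name="us-east-1")

# Stream item-level changes from the table into an existing Kinesis
# data stream (ARNs are placeholders).
dynamodb.enable_kinesis_streaming_destination(
    TableName="Trades",
    StreamArn="arn:aws:kinesis:us-east-1:111122223333:stream/trades",
)

# Deliver the stream to S3 through Amazon Data Firehose.
firehose.create_delivery_stream(
    DeliveryStreamName="trades-to-s3",
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": "arn:aws:kinesis:us-east-1:111122223333:stream/trades",
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-role",
    },
    S3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-role",
        "BucketARN": "arn:aws:s3:::trades-data-lake",
    },
)
```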
Question # 55:

A marketing company receives a large amount of new clickstream data in Amazon S3 from a marketing campaign. The company needs to analyze the clickstream data in Amazon S3 quickly. Then the company needs to determine whether to process the data further in the data pipeline.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create external tables in a Spark catalog. Configure jobs in AWS Glue to query the data.


B.

Configure an AWS Glue crawler to crawl the data. Configure Amazon Athena to query the data.


C.

Create external tables in a Hive metastore. Configure Spark jobs in Amazon EMR to query the data.


D.

Configure an AWS Glue crawler to crawl the data. Configure Amazon Kinesis Data Analytics to use SQL to query the data.


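For reference, a minimal boto3 sketch of crawling the data with AWS Glue and querying it with Amazon Athena (the pattern in option B); the bucket, role, database, and table names are placeholder assumptions:

```python
import boto3

glue = boto3.client("glue", region_name="us-east-1")
athena = boto3.client("athena", region_name="us-east-1")

# Crawl the clickstream data to build a table in the Glue Data Catalog
# (bucket, role, and database names are placeholders).
glue.create_crawler(
    Name="clickstream-crawler",
    Role="arn:aws:iam::111122223333:role/glue-crawler-role",
    DatabaseName="clickstream",
    Targets={"S3Targets": [{"Path": "s3://clickstream-data/raw/"}]},
)
glue.start_crawler(Name="clickstream-crawler")

# Once the crawler has run, query the resulting table ad hoc with Athena.
athena.start_query_execution(
    QueryString="SELECT page, COUNT(*) FROM clicks GROUP BY page",
    QueryExecutionContext={"Database": "clickstream"},
    ResultConfiguration={"OutputLocation": "s3://clickstream-data/athena-results/"},
)
```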
Question # 56:

A company is designing an IPv6 application that is hosted on Amazon EC2 instances in a private subnet within a VPC. The application will store user-uploaded content in Amazon S3 buckets. The application will save each S3 object's URL link and metadata in Amazon DynamoDB.

The company must not use public internet connections to transmit user-uploaded content or metadata.

Which solution will meet these requirements?

Options:

A.

Implement a gateway VPC endpoint for Amazon S3 and an interface VPC endpoint for Amazon DynamoDB.


B.

Implement interface VPC endpoints for both Amazon S3 and Amazon DynamoDB.


C.

Implement gateway VPC endpoints for both Amazon S3 and Amazon DynamoDB.


D.

Implement a gateway VPC endpoint for Amazon DynamoDB and an interface VPC endpoint for Amazon S3.


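For reference, a minimal boto3 sketch of creating gateway endpoints for both services (the pattern in option C); the VPC and route table IDs are placeholder assumptions:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Gateway endpoints keep traffic to S3 and DynamoDB on the AWS network
# instead of the public internet (IDs are placeholders).
for service in ("s3", "dynamodb"):
    ec2.create_vpc_endpoint(
        VpcId="vpc-0123456789abcdef0",
        ServiceName=f"com.amazonaws.us-east-1.{service}",
        VpcEndpointType="Gateway",
        RouteTableIds=["rtb-0123456789abcdef0"],
    )
```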
Question # 57:

A company has a social media application that is experiencing rapid user growth. The current architecture uses t-family Amazon EC2 instances. The current architecture struggles to handle the increasing number of user posts and images. The application experiences performance slowdowns during peak usage times.

A solutions architect needs to design an updated architecture that will resolve the performance issues and scale as usage increases.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use the largest Amazon EC2 instance in the same family to host the application. Install a relational database on the instance to store all account information and to store posts and images.


B.

Use Amazon Simple Queue Service (Amazon SQS) to buffer incoming posts. Use a larger EC2 instance in the same family to host the application. Store account information in Amazon DynamoDB. Store posts and images in the local EC2 instance file system.


C.

Use an Amazon API Gateway REST API and AWS Lambda functions to process requests. Store account information in Amazon DynamoDB. Use Amazon S3 to store posts and images.


D.

Deploy multiple EC2 instances in the same family. Use an Application Load Balancer to distribute traffic. Use a shared file system to store account information and to store posts and images.


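For reference, a hypothetical Lambda handler behind an API Gateway REST API that stores the image in Amazon S3 and the post metadata in DynamoDB (the pattern in option C); the table name, bucket name, and event shape are placeholder assumptions:

```python
import base64
import json
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")

def handler(event, context):
    # Assumed request body: {"user": ..., "text": ..., "image": <base64>}.
    body = json.loads(event["body"])
    post_id = str(uuid.uuid4())

    # Store the image in S3 and the post metadata in DynamoDB
    # (bucket and table names are placeholders).
    s3.put_object(
        Bucket="social-app-media",
        Key=f"posts/{post_id}.jpg",
        Body=base64.b64decode(body["image"]),
    )
    dynamodb.Table("Posts").put_item(
        Item={"post_id": post_id, "user": body["user"], "text": body["text"]}
    )
    return {"statusCode": 201, "body": json.dumps({"post_id": post_id})}
```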
Question # 58:

A company runs game applications on AWS. The company needs to collect, visualize, and analyze telemetry data from the company's game servers. The company wants to gain insights into the behavior, performance, and health of game servers in near real time.

Which solution will meet these requirements?

Options:

A.

Use Amazon Kinesis Data Streams to collect telemetry data. Use Amazon Managed Service for Apache Flink to process the data in near real time and publish custom metrics to Amazon CloudWatch. Use Amazon CloudWatch to create dashboards and alarms from the custom metrics.


B.

Use Amazon Data Firehose to collect, process, and store telemetry data in near real time. Use AWS Glue to extract, transform, and load (ETL) data from Firehose into required formats for analysis. Use Amazon QuickSight to visualize and analyze the data.


C.

Use Amazon Kinesis Data Streams to collect, process, and store telemetry data. Use Amazon EMR to process the data in near real time into required formats for analysis. Use Amazon Athena to analyze and visualize the data.


D.

Use Amazon DynamoDB Streams to collect and store telemetry data. Configure DynamoDB Streams to invoke AWS Lambda functions to process the data in near real time. Use Amazon Managed Grafana to visualize and analyze the data.


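For reference, a minimal boto3 sketch of the ingestion and metrics ends of the pipeline described in option A; the stream, namespace, and metric names are placeholder assumptions, and in practice an Apache Flink application would aggregate the stream before publishing metrics:

```python
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# A game server pushes a telemetry record into the data stream
# (stream name and payload are placeholders).
kinesis.put_record(
    StreamName="game-telemetry",
    Data=json.dumps({"server": "eu-1", "tick_ms": 48, "players": 93}).encode(),
    PartitionKey="eu-1",
)

# Publish a custom metric that CloudWatch dashboards and alarms can use.
cloudwatch.put_metric_data(
    Namespace="GameServers",
    MetricData=[{"MetricName": "TickMillis", "Value": 48, "Unit": "Milliseconds"}],
)
```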
Question # 59:

A company is designing a website that displays stock market prices to users. The company wants to use Amazon ElastiCache (Redis OSS) for the data caching layer. The company needs to ensure that the website's data caching layer can automatically fail over to another node if necessary.

Which solution will meet these requirements?

Options:

A.

Enable read replicas in ElastiCache (Redis OSS). Promote the read replica when necessary.


B.

Enable Multi-AZ in ElastiCache (Redis OSS). Fail over to a second node when necessary.


C.

Export a backup of the ElastiCache (Redis OSS) cache to an Amazon S3 bucket. Restore the cache to a second cluster when necessary.


D.

Export a backup of the ElastiCache (Redis OSS) cache by using AWS Backup. Restore the cache to a second cluster when necessary.


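For reference, a minimal boto3 sketch of a Multi-AZ ElastiCache replication group with automatic failover (the configuration in option B); the IDs and node type are placeholder assumptions:

```python
import boto3

elasticache = boto3.client("elasticache", region_name="us-east-1")

# Replication group with a primary and one replica in different AZs;
# automatic failover promotes the replica if the primary fails
# (IDs and node type are placeholders).
elasticache.create_replication_group(
    ReplicationGroupId="stock-cache",
    ReplicationGroupDescription="Stock price cache",
    Engine="redis",
    CacheNodeType="cache.r6g.large",
    NumCacheClusters=2,
    AutomaticFailoverEnabled=True,
    MultiAZEnabled=True,
)
```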
Question # 60:

A global company runs a data lake application in the us-east-1 Region and the eu-west-1 Region in an active-passive configuration. Application data is stored locally in Amazon S3 buckets in each AWS Region. The bucket in us-east-1 is the primary active bucket that handles all writes. The company needs to ensure that the application has Regional fault tolerance. The company also needs the storage layer to provide a highly available active-active capability for reads across Regions. The storage layer must provide low latency access through a single global endpoint.

Which solution will meet these requirements?

Options:

A.

Create an Amazon CloudFront distribution in each Region. Set the S3 bucket within each Region as the origin for the CloudFront distribution in the same Region.


B.

Use S3 Transfer Acceleration for cross-Region data transfers to the S3 buckets.


C.

Configure AWS Backup to replicate S3 buckets across Regions. Set up a disaster recovery environment.


D.

Create an S3 Multi-Region Access Point. Configure cross-Region replication.


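For reference, a minimal boto3 sketch of creating an S3 Multi-Region Access Point (option D); the account ID and bucket names are placeholder assumptions, and cross-Region replication between the buckets is configured separately:

```python
import uuid

import boto3

# Multi-Region Access Point control-plane requests are made through
# the us-west-2 endpoint.
s3control = boto3.client("s3control", region_name="us-west-2")

# A Multi-Region Access Point provides a single global endpoint over
# buckets in several Regions (account ID and bucket names are placeholders).
s3control.create_multi_region_access_point(
    AccountId="111122223333",
    ClientToken=str(uuid.uuid4()),
    Details={
        "Name": "datalake-mrap",
        "Regions": [
            {"Bucket": "datalake-us-east-1"},
            {"Bucket": "datalake-eu-west-1"},
        ],
    },
)
```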