
Pass the Amazon Web Services AWS Certified Associate SAA-C03 Questions and answers with CertsForce

Viewing page 5 out of 16 pages
Viewing questions 61-75 out of questions
Questions # 61:

A company needs a data encryption solution for a machine learning (ML) process. The solution must use an AWS managed service. The ML process currently reads a large number of objects in Amazon S3 that are encrypted by a customer managed AWS KMS key. The current process incurs significant costs because of excessive calls to AWS Key Management Service (AWS KMS) to decrypt S3 objects. The company wants to reduce the costs of API calls to decrypt S3 objects.

Which solution will meet these requirements?

Options:

A.

Switch from a customer managed KMS key to an AWS managed KMS key.


B.

Remove the AWS KMS encryption from the S3 bucket. Use a bucket policy to encrypt the data instead.


C.

Recreate the KMS key in AWS CloudHSM.


D.

Use S3 Bucket Keys to perform server-side encryption with AWS KMS keys (SSE-KMS) to encrypt and decrypt objects from Amazon S3.


Expert Solution
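To make option D concrete, here is a minimal sketch of the bucket-encryption settings that enable S3 Bucket Keys. The bucket name and KMS key ARN are placeholders; with boto3 this dict would be passed to `s3.put_bucket_encryption()`, shown here as a plain parameter dict.

```python
# Sketch: enabling S3 Bucket Keys for SSE-KMS. S3 Bucket Keys use a
# bucket-level data key, which reduces per-object requests to AWS KMS.
encryption_config = {
    "Bucket": "example-ml-bucket",  # placeholder bucket name
    "ServerSideEncryptionConfiguration": {
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    # placeholder customer managed key ARN
                    "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/example-key-id",
                },
                # the setting that cuts KMS decrypt call volume
                "BucketKeyEnabled": True,
            }
        ]
    },
}
```

Note that Bucket Keys take effect for new objects; existing objects keep their original encryption until they are re-written.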
Questions # 62:

A solutions architect is designing a web application that will run on Amazon EC2 instances behind an Application Load Balancer (ALB). The company strictly requires that the application be resilient against malicious internet activity and attacks, and protect against new common vulnerabilities and exposures.

What should the solutions architect recommend?

Options:

A.

Leverage Amazon CloudFront with the ALB endpoint as the origin.


B.

Deploy an appropriate managed rule for AWS WAF and associate it with the ALB.


C.

Subscribe to AWS Shield Advanced and ensure common vulnerabilities and exposures are blocked.


D.

Configure network ACLs and security groups to allow only ports 80 and 443 to access the EC2 instances.


Expert Solution
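As an illustration of option B's mechanism, here is a sketch of a WAFv2 web ACL parameter dict that attaches the AWS-managed Common Rule Set; the ACL name and metric names are placeholders. With boto3 this shape maps to `wafv2.create_web_acl(...)`, after which the ACL is associated with the ALB.

```python
# Sketch: a REGIONAL-scope web ACL (required for ALB association) with the
# AWS managed rule group that covers common vulnerabilities and exposures.
web_acl_params = {
    "Name": "example-web-acl",  # placeholder name
    "Scope": "REGIONAL",
    "DefaultAction": {"Allow": {}},
    "Rules": [
        {
            "Name": "AWS-AWSManagedRulesCommonRuleSet",
            "Priority": 0,
            "Statement": {
                "ManagedRuleGroupStatement": {
                    "VendorName": "AWS",
                    "Name": "AWSManagedRulesCommonRuleSet",
                }
            },
            "OverrideAction": {"None": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,
                "MetricName": "CommonRuleSet",
            },
        }
    ],
    "VisibilityConfig": {
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "example-web-acl",
    },
}
```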
Questions # 63:

A company uses Amazon Route 53 as its DNS provider. The company hosts a website both on premises and in the AWS Cloud. The company's on-premises data center is near the us-west-1 Region. The company hosts the website on AWS in the eu-central-1 Region.

The company wants to optimize load times for the website as much as possible.

Which solution will meet these requirements?

Options:

A.

Create a DNS record with a failover routing policy that routes all primary traffic to eu-central-1. Configure the routing policy to use the on-premises data center as the secondary location.


B.

Create a DNS record with an IP-based routing policy. Configure specific IP ranges to return the value for the eu-central-1 website. Configure all other IP ranges to return the value for the on-premises website.


C.

Create a DNS record with a latency-based routing policy. Configure one latency record for the eu-central-1 website and one latency record for the on-premises data center. Associate the record for the on-premises data center with the us-west-1 Region.


D.

Create a DNS record with a weighted routing policy. Split the traffic evenly between eu-central-1 and the on-premises data center.


Expert Solution
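To show what the two latency records in option C look like, here is a sketch of a Route 53 change batch; the domain name, TTLs, and endpoint IP addresses are placeholders.

```python
# Sketch: latency-based routing with two records for the same name. The
# on-premises endpoint is associated with us-west-1, its nearest Region,
# so Route 53 can compare latencies for each resolver.
latency_change_batch = {
    "Changes": [
        {
            "Action": "CREATE",
            "ResourceRecordSet": {
                "Name": "www.example.com",  # placeholder domain
                "Type": "A",
                "SetIdentifier": "eu-central-1-site",
                "Region": "eu-central-1",  # AWS-hosted site
                "TTL": 60,
                "ResourceRecords": [{"Value": "203.0.113.10"}],
            },
        },
        {
            "Action": "CREATE",
            "ResourceRecordSet": {
                "Name": "www.example.com",
                "Type": "A",
                "SetIdentifier": "on-premises-site",
                "Region": "us-west-1",  # nearest Region to the data center
                "TTL": 60,
                "ResourceRecords": [{"Value": "198.51.100.20"}],
            },
        },
    ]
}
```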
Questions # 64:

A company runs an application on Amazon EC2 instances behind an Application Load Balancer (ALB). The company wants to create a public API for the application that uses JSON Web Tokens (JWT) for authentication. The company wants the API to integrate directly with the ALB.

Which solution will meet these requirements?

Options:

A.

Use Amazon API Gateway to create a REST API.


B.

Use Amazon API Gateway to create an HTTP API.


C.

Use Amazon API Gateway to create a WebSocket API.


D.

Use Amazon API Gateway to create a gRPC API.


Expert Solution
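For context on option B: API Gateway HTTP APIs support built-in JWT authorizers and private integrations with an ALB, which REST and WebSocket APIs do not offer natively. A sketch of the authorizer parameters follows; the API ID, issuer URL, and audience are placeholders.

```python
# Sketch: a JWT authorizer for an HTTP API (maps to the apigatewayv2
# create_authorizer call shape).
jwt_authorizer_params = {
    "ApiId": "example-api-id",  # placeholder API ID
    "AuthorizerType": "JWT",
    "Name": "example-jwt-authorizer",
    # the token is read from the Authorization request header
    "IdentitySource": ["$request.header.Authorization"],
    "JwtConfiguration": {
        "Issuer": "https://example-issuer.example.com",  # placeholder issuer
        "Audience": ["example-audience"],  # placeholder audience
    },
}
```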
Questions # 65:

An application uses an Amazon SQS queue and two AWS Lambda functions. One of the Lambda functions pushes messages to the queue, and the other function polls the queue and receives queued messages.

A solutions architect needs to ensure that only the two Lambda functions can write to or read from the queue.

Which solution will meet these requirements?

Options:

A.

Attach an IAM policy to the SQS queue that grants the Lambda function principals read and write access. Attach an IAM policy to the execution role of each Lambda function that denies all access to the SQS queue except for the principal of each function.


B.

Attach a resource-based policy to the SQS queue to deny read and write access to the queue for any entity except the principal of each Lambda function. Attach an IAM policy to the execution role of each Lambda function that allows read and write access to the queue.


C.

Attach a resource-based policy to the SQS queue that grants the Lambda function principals read and write access to the queue. Attach an IAM policy to the execution role of each Lambda function that allows read and write access to the queue.


D.

Attach a resource-based policy to the SQS queue to deny all access to the queue. Attach an IAM policy to the execution role of each Lambda function that grants read and write access to the queue.


Expert Solution
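To illustrate the resource-based policy idea from option C, here is a sketch of a queue policy that allows only the two Lambda execution-role principals to send and receive. The account ID, role names, and queue ARN are placeholders.

```python
import json

# Sketch: SQS queue policy granting the producer role SendMessage and the
# consumer role the receive-side actions; all other principals get no grant.
queue_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowProducerSend",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111122223333:role/producer-fn-role"},
            "Action": "sqs:SendMessage",
            "Resource": "arn:aws:sqs:us-east-1:111122223333:example-queue",
        },
        {
            "Sid": "AllowConsumerReceive",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111122223333:role/consumer-fn-role"},
            "Action": ["sqs:ReceiveMessage", "sqs:DeleteMessage", "sqs:GetQueueAttributes"],
            "Resource": "arn:aws:sqs:us-east-1:111122223333:example-queue",
        },
    ],
}
policy_json = json.dumps(queue_policy)
```

The matching IAM policies on each function's execution role grant the same actions, so both sides of the authorization agree.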
Questions # 66:

A company runs an application on several Amazon EC2 instances that store persistent data on an Amazon Elastic File System (Amazon EFS) file system. The company needs to replicate the data to another AWS Region by using an AWS managed service solution. Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use the EFS-to-EFS backup solution to replicate the data to an EFS file system in another Region.


B.

Run a nightly script to copy data from the EFS file system to an Amazon S3 bucket. Enable S3 Cross-Region Replication on the S3 bucket.


C.

Create a VPC in another Region. Establish a cross-Region VPC peer. Run a nightly rsync to copy data from the original Region to the new Region.


D.

Use AWS Backup to create a backup plan with a rule that takes a daily backup and replicates it to another Region. Assign the EFS file system resource to the backup plan.


Expert Solution
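Option D's cross-Region replication can be sketched as an AWS Backup plan rule with a copy action; the vault names, schedule, and destination ARN below are placeholders. With boto3 this dict would be passed to `backup.create_backup_plan()`.

```python
# Sketch: a daily backup rule whose CopyActions replicate each recovery
# point to a vault in another Region.
backup_plan = {
    "BackupPlan": {
        "BackupPlanName": "efs-daily-plan",  # placeholder plan name
        "Rules": [
            {
                "RuleName": "daily-with-cross-region-copy",
                "TargetBackupVaultName": "primary-vault",  # placeholder vault
                "ScheduleExpression": "cron(0 5 * * ? *)",  # daily at 05:00 UTC
                "CopyActions": [
                    {
                        # placeholder destination vault in another Region
                        "DestinationBackupVaultArn": (
                            "arn:aws:backup:eu-west-1:111122223333:backup-vault:replica-vault"
                        )
                    }
                ],
            }
        ],
    }
}
```

The EFS file system is then assigned to the plan as a resource selection, so no custom scripts or extra infrastructure are needed.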
Questions # 67:

A company is migrating its online shopping platform to AWS and wants to adopt a serverless architecture.

The platform has a user profile and preference service that does not have a defined schema. The platform allows user-defined fields.

Profile information is updated several times daily. The company must store profile information in a durable and highly available solution. The solution must capture modifications to profile data for future processing.

Which solution will meet these requirements?

Options:

A.

Use an Amazon RDS for PostgreSQL instance to store profile data. Use a log stream in Amazon CloudWatch Logs to capture modifications.


B.

Use an Amazon DynamoDB table to store profile data. Use Amazon DynamoDB Streams to capture modifications.


C.

Use an Amazon ElastiCache (Redis OSS) cluster to store profile data. Use Amazon Data Firehose to capture modifications.


D.

Use an Amazon Aurora Serverless v2 cluster to store the profile data. Use a log stream in Amazon CloudWatch Logs to capture modifications.


Expert Solution
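Option B's combination of schemaless storage plus change capture can be sketched as a DynamoDB table definition with Streams enabled; the table and attribute names are placeholders.

```python
# Sketch: a profile table keyed on userId. DynamoDB is schemaless beyond
# the key attributes, so user-defined fields need no migration. The stream
# emits each modification for downstream processing.
table_params = {
    "TableName": "UserProfiles",  # placeholder table name
    "AttributeDefinitions": [{"AttributeName": "userId", "AttributeType": "S"}],
    "KeySchema": [{"AttributeName": "userId", "KeyType": "HASH"}],
    "BillingMode": "PAY_PER_REQUEST",  # suits a serverless workload
    "StreamSpecification": {
        "StreamEnabled": True,
        # old and new images let consumers see exactly what changed
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
}
```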
Questions # 68:

A company needs a solution to process customer orders from a global ecommerce platform. The solution must automatically start processing new orders immediately and must maintain a history of all order processing attempts.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Create an Amazon EventBridge rule that invokes an AWS Lambda function once every minute to check for new orders. Configure the Lambda function to process orders and store results in Amazon Aurora.


B.

Create an Amazon EventBridge event pattern that monitors the ecommerce platform's order events. Configure an EventBridge rule to invoke an AWS Lambda function when the platform receives a new order. Configure the function to store the results in Amazon DynamoDB.


C.

Use an Amazon EC2 instance to poll the ecommerce platform for new orders. Configure the instance to invoke an AWS Lambda function to process new orders. Configure the function to log results to Amazon CloudWatch.


D.

Use an Amazon SQS queue to invoke an AWS Lambda function when the platform receives a new order. Configure the function to process batches of orders and to store results in an Amazon EFS file system.


Expert Solution
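For option B, an EventBridge rule matches events against a pattern rather than polling. The pattern below is hypothetical: the `source` and `detail-type` values are invented for illustration, since the platform's actual event shape is not given in the question.

```python
import json

# Hypothetical event pattern: match only order-placement events from the
# ecommerce platform, so the Lambda function runs immediately per order.
order_event_pattern = {
    "source": ["ecommerce.orders"],      # invented event source
    "detail-type": ["OrderPlaced"],      # invented detail type
}
pattern_json = json.dumps(order_event_pattern)
```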
Questions # 69:

A company uses a general-purpose instance class Amazon RDS for MySQL DB instance. The company has configured the DB instance in a Multi-AZ configuration across two Availability Zones as part of the company's production application.

The company's finance team needs to run SQL queries against the DB instance to generate reports. Customers have reported significant performance issues with the application during report generation.

A solutions architect needs to minimize the effect of the reporting job on the DB instance.

Which solution will meet these requirements?

Options:

A.

Create a proxy in Amazon RDS Proxy. Update the reporting job to query the proxy endpoint.


B.

Update the RDS DB instance configuration to use three Availability Zones.


C.

Add an RDS read replica. Update the reporting job to query the replica endpoint.


D.

Change the RDS configuration from a general-purpose instance class to a memory-optimized instance class.


Expert Solution
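Option C's read replica can be sketched as the parameters for creating one; the instance identifiers and class are placeholders. With boto3 this maps to `rds.create_db_instance_read_replica()`, and the reporting job is then pointed at the replica's endpoint so report queries stay off the primary.

```python
# Sketch: an asynchronous read replica dedicated to the reporting workload.
read_replica_params = {
    "DBInstanceIdentifier": "prod-mysql-reporting-replica",  # placeholder
    "SourceDBInstanceIdentifier": "prod-mysql",  # placeholder source instance
    "DBInstanceClass": "db.m6g.large",  # placeholder instance class
}
```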
Questions # 70:

A company runs an ecommerce application on Amazon EC2 instances behind an Application Load Balancer. The instances run in an Amazon EC2 Auto Scaling group across multiple Availability Zones. The Auto Scaling group scales based on CPU utilization metrics. The ecommerce application stores the transaction data in a MySQL 8.0 database that is hosted on a large EC2 instance.

The database's performance degrades quickly as application load increases. The application handles more read requests than write transactions. The company wants a solution that will automatically scale the database to meet the demand of unpredictable read workloads while maintaining high availability.

Which solution will meet these requirements?

Options:

A.

Use Amazon Redshift with a single node for leader and compute functionality.


B.

Use Amazon RDS with a Single-AZ deployment. Configure Amazon RDS to add reader instances in a different Availability Zone.


C.

Use Amazon Aurora with a Multi-AZ deployment. Configure Aurora Auto Scaling with Aurora Replicas.


D.

Use Amazon ElastiCache (Memcached) with EC2 Spot Instances.


Expert Solution
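Option C's automatic read scaling is driven by Application Auto Scaling on the Aurora cluster's replica count; the cluster name and capacity limits below are placeholders. With boto3 this maps to `application-autoscaling`'s `register_scalable_target()`.

```python
# Sketch: register the Aurora cluster's replica count as a scalable target
# so replicas are added or removed to track unpredictable read demand.
scalable_target_params = {
    "ServiceNamespace": "rds",
    "ResourceId": "cluster:example-aurora-cluster",  # placeholder cluster ID
    "ScalableDimension": "rds:cluster:ReadReplicaCount",
    "MinCapacity": 1,
    "MaxCapacity": 15,  # Aurora supports up to 15 replicas per cluster
}
```

A target-tracking scaling policy (for example, on average replica CPU) would then be attached to this target.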
Questions # 71:

An ecommerce company is migrating its on-premises workload to the AWS Cloud. The workload currently consists of a web application and a backend Microsoft SQL database for storage.

The company expects a high volume of customers during a promotional event. The new infrastructure in the AWS Cloud must be highly available and scalable.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Migrate the web application to two Amazon EC2 instances across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS for Microsoft SQL Server with read replicas in both Availability Zones.


B.

Migrate the web application to an Amazon EC2 instance that runs in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to two EC2 instances across separate AWS Regions with database replication.


C.

Migrate the web application to Amazon EC2 instances that run in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS with Multi-AZ deployment.


D.

Migrate the web application to three Amazon EC2 instances across three Availability Zones behind an Application Load Balancer. Migrate the database to three EC2 instances across three Availability Zones.


Expert Solution
Questions # 72:

A company has stored millions of objects across multiple prefixes in an Amazon S3 bucket by using the Amazon S3 Glacier Deep Archive storage class. The company needs to delete all data older than 3 years except for a subset of data that must be retained. The company has identified the data that must be retained and wants to implement a serverless solution.

Which solution will meet these requirements?

Options:

A.

Use S3 Inventory to list all objects. Use the AWS CLI to create a script that runs on an Amazon EC2 instance that deletes objects from the inventory list.


B.

Use AWS Batch to delete objects older than 3 years except for the data that must be retained.


C.

Provision an AWS Glue crawler to query objects older than 3 years. Save the manifest file of old objects. Create a script to delete objects in the manifest.


D.

Enable S3 Inventory. Create an AWS Lambda function to filter and delete objects. Invoke the Lambda function with S3 Batch Operations to delete objects by using the inventory reports.


Expert Solution
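Option D's Lambda function is invoked once per inventory entry by S3 Batch Operations. The handler below is a hypothetical sketch: the retention set is invented, and the actual `DeleteObject` call is omitted so the sketch stays self-contained; only the task/result plumbing follows the Batch Operations invocation shape.

```python
# Hypothetical S3 Batch Operations Lambda handler: keep retained keys,
# mark everything else for deletion (a real function would call
# s3.delete_object here).
RETAINED_KEYS = {"prefix-a/keep-me.parquet"}  # placeholder retained objects

def handler(event, context):
    results = []
    for task in event["tasks"]:
        if task["s3Key"] in RETAINED_KEYS:
            result_string = "retained"
        else:
            result_string = "deleted"  # real code would delete the object
        results.append({
            "taskId": task["taskId"],
            "resultCode": "Succeeded",
            "resultString": result_string,
        })
    return {
        "invocationSchemaVersion": "1.0",
        "treatMissingKeysAs": "PermanentFailure",
        "invocationId": event["invocationId"],
        "results": results,
    }
```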
Questions # 73:

A company is designing a secure solution to grant access to its Amazon RDS for PostgreSQL database. Applications that run on Amazon EC2 instances must be able to securely authenticate to the database without storing long-term credentials.

Which solution will meet these requirements?

Options:

A.

Enable RDS IAM authentication and configure AWS Secrets Manager to store database credentials. Configure applications to retrieve credentials at runtime.


B.

Configure a custom IAM policy for the database that allows access from the EC2 instances' IP addresses. Configure applications to use a static password to authenticate to the database.


C.

Set up an IAM user for each application. Store the access key ID and secret access key in the EC2 instances' environment variables. Grant the IAM users permission to the database.


D.

Use IAM roles to assign permissions to the EC2 instances. Configure the applications to obtain a token from the RDS database to authenticate by using IAM authentication.


Expert Solution
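Option D's token-based flow rests on an IAM permission for `rds-db:connect`; the statement below is a sketch with placeholder account, resource ID, and database user. The application then requests a short-lived auth token (for example, via boto3's `generate_db_auth_token`) instead of storing a password.

```python
# Sketch: IAM statement attached to the EC2 instance role that permits IAM
# database authentication for one database user.
iam_auth_statement = {
    "Effect": "Allow",
    "Action": "rds-db:connect",
    # placeholder dbi-resource-id and database user name
    "Resource": "arn:aws:rds-db:us-east-1:111122223333:dbuser:db-EXAMPLEID/app_user",
}
```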
Questions # 74:

A company wants to migrate an application that uses a microservice architecture to AWS. The services currently run on Docker containers on-premises. The application has an event-driven architecture that uses Apache Kafka. The company configured Kafka to use multiple queues to send and receive messages. Some messages must be processed by multiple services. Which solution will meet these requirements with the LEAST management overhead?

Options:

A.

Migrate the services to Amazon Elastic Container Service (Amazon ECS) with the Amazon EC2 launch type. Deploy a Kafka cluster on EC2 instances to handle service-to-service communication.


B.

Migrate the services to Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Create multiple Amazon Simple Queue Service (Amazon SQS) queues to handle service-to-service communication.


C.

Migrate the services to Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Deploy an Amazon Managed Streaming for Apache Kafka (Amazon MSK) cluster to handle service-to-service communication.


D.

Migrate the services to Amazon Elastic Container Service (Amazon ECS) with the Amazon EC2 launch type. Use Amazon EventBridge to handle service-to-service communication.


Expert Solution
Questions # 75:

A company has a large fleet of vehicles that are equipped with internet connectivity to send telemetry to the company. The company receives over 1 million data points every 5 minutes from the vehicles. The company uses the data in machine learning (ML) applications to predict vehicle maintenance needs and to preorder parts. The company produces visual reports based on the captured data. The company wants to migrate the telemetry ingestion, processing, and visualization workloads to AWS. Which solution will meet these requirements?

Options:

A.

Use Amazon Timestream for LiveAnalytics to store the data points. Grant Amazon SageMaker permission to access the data for processing. Use Amazon QuickSight to visualize the data.


B.

Use Amazon DynamoDB to store the data points. Use DynamoDB Connector to ingest data from DynamoDB into Amazon EMR for processing. Use Amazon QuickSight to visualize the data.


C.

Use Amazon Neptune to store the data points. Use Amazon Kinesis Data Streams to ingest data from Neptune into an AWS Lambda function for processing. Use Amazon QuickSight to visualize the data.


D.

Use Amazon Timestream for LiveAnalytics to store the data points. Grant Amazon SageMaker permission to access the data for processing. Use Amazon Athena to visualize the data.


Expert Solution