
Pass the AWS Certified Solutions Architect - Associate (SAA-C03) exam questions and answers with CertsForce

Viewing page 7 out of 16 pages
Viewing questions 91-105
Question # 91:

A company needs to run its external website on Amazon EC2 instances and on-premises virtualized servers. The AWS environment has a 1 Gbps AWS Direct Connect connection to the data center. The application has IP addresses that will not change. The on-premises and AWS servers are able to restart themselves while maintaining the same IP address if a failure occurs. Some website users have to add their vendors to an allow list, so the solution must have a fixed IP address. The company needs a solution with the lowest operational overhead to handle this split traffic.

What should a solutions architect do to meet these requirements?

Options:

A.

Deploy an Amazon Route 53 Resolver with rules pointing to the on-premises and AWS IP addresses.


B.

Deploy a Network Load Balancer on AWS. Create target groups for the on-premises and AWS IP addresses.


C.

Deploy an Application Load Balancer on AWS. Register the on-premises and AWS IP addresses with the target group.


D.

Deploy Amazon API Gateway to direct traffic to the on-premises and AWS IP addresses based on the header of the request.


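For illustration, a minimal boto3 (Python) sketch of the approach in option B, an internet-facing Network Load Balancer with Elastic IP addresses and an IP-type target group that can hold both the AWS and on-premises server IPs. All identifiers, subnets, and addresses below are placeholders, not part of the question:

```python
import boto3

elbv2 = boto3.client("elbv2")

# IP-type target group so both EC2 and on-premises (via Direct Connect) IPs can be registered
tg = elbv2.create_target_group(
    Name="website-targets",
    Protocol="TCP",
    Port=443,
    VpcId="vpc-0123456789abcdef0",
    TargetType="ip",
)["TargetGroups"][0]

elbv2.register_targets(
    TargetGroupArn=tg["TargetGroupArn"],
    Targets=[
        {"Id": "10.0.1.10", "Port": 443},                               # EC2 instance IP
        {"Id": "192.168.10.5", "Port": 443, "AvailabilityZone": "all"}, # on-premises IP
    ],
)

# An Elastic IP per subnet gives the NLB the fixed addresses needed for customer allow lists
nlb = elbv2.create_load_balancer(
    Name="website-nlb",
    Type="network",
    Scheme="internet-facing",
    SubnetMappings=[{"SubnetId": "subnet-0123456789abcdef0", "AllocationId": "eipalloc-0123456789abcdef0"}],
)["LoadBalancers"][0]

elbv2.create_listener(
    LoadBalancerArn=nlb["LoadBalancerArn"],
    Protocol="TCP",
    Port=443,
    DefaultActions=[{"Type": "forward", "TargetGroupArn": tg["TargetGroupArn"]}],
)
```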
Question # 92:

A company is developing a new application that uses a relational database to store user data and application configurations. The company expects the application to have steady user growth. The company expects the database usage to be variable and read-heavy, with occasional writes.

The company wants to cost-optimize the database solution. The company wants to use an AWS managed database solution that will provide the necessary performance.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Deploy the database on Amazon RDS. Use Provisioned IOPS SSD storage to ensure consistent performance for read and write operations.


B.

Deploy the database on Amazon Aurora Serverless to automatically scale the database capacity based on actual usage to accommodate the workload.


C.

Deploy the database on Amazon DynamoDB. Use on-demand capacity mode to automatically scale throughput to accommodate the workload.


D.

Deploy the database on Amazon RDS. Use magnetic storage and read replicas to accommodate the workload.


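As a reference point, option B's Aurora Serverless approach could be provisioned roughly as follows with boto3. The identifiers, engine choice, credentials, and capacity bounds are placeholder assumptions:

```python
import boto3

rds = boto3.client("rds")

# Aurora Serverless v2 cluster that scales capacity (ACUs) up and down with demand
rds.create_db_cluster(
    DBClusterIdentifier="app-config-cluster",
    Engine="aurora-postgresql",
    MasterUsername="appadmin",
    MasterUserPassword="REPLACE_ME",  # use AWS Secrets Manager in practice
    ServerlessV2ScalingConfiguration={"MinCapacity": 0.5, "MaxCapacity": 8},
)

# A serverless writer instance; reader instances could be added as the read load grows
rds.create_db_instance(
    DBInstanceIdentifier="app-config-writer",
    DBClusterIdentifier="app-config-cluster",
    DBInstanceClass="db.serverless",
    Engine="aurora-postgresql",
)
```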
Question # 93:

A solutions architect needs to design a system to process incoming work items immediately. Processing can take up to 30 minutes and involves calling external APIs, executing multiple states, and storing intermediate states.

The solution must scale with variable workloads and minimize operational overhead.

Which combination of steps meets these requirements? (Select TWO.)

Options:

A.

Invoke an AWS Lambda function for each incoming work item. Configure each function to handle the work item completely. Store states in DynamoDB.


B.

Invoke an AWS Step Functions workflow to process incoming work items. Use Lambda functions for business logic. Store work item states in DynamoDB.


C.

Set up an API Gateway REST API to receive work items. Configure the API to invoke a Lambda function for each work item.


D.

Deploy two EC2 Reserved Instances behind an ALB and send requests to an SQS queue.


E.

Set up an API Gateway REST API to receive work items. Send the work items to an SQS queue.


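To make the Step Functions option concrete, a small sketch that starts one workflow execution per incoming work item. The state machine ARN and item shape are hypothetical:

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical state machine whose Lambda-backed states call the external APIs
# and persist intermediate results to DynamoDB
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:111122223333:stateMachine:work-item-processor"


def submit_work_item(item: dict) -> str:
    """Start one workflow execution per incoming work item and return its ARN."""
    response = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps(item),
    )
    return response["executionArn"]


print(submit_work_item({"workItemId": "1234", "payload": {"action": "enrich"}}))
```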
Question # 94:

A company is moving a legacy data processing application to the AWS Cloud. The application needs to run on Amazon EC2 instances behind an Application Load Balancer (ALB).

The application must handle incoming traffic spikes and continue to work if an application fault occurs in one Availability Zone. The company requires that a web application firewall (WAF) be attached to the ALB.

Which solution will meet these requirements?

Options:

A.

Deploy the application to EC2 instances in an Auto Scaling group that is in a single Availability Zone. Use an ALB to distribute traffic. Use AWS WAF.


B.

Deploy the application to EC2 instances in an Auto Scaling group across multiple Availability Zones. Use an ALB to distribute traffic. Use AWS WAF.


C.

Deploy the application to EC2 instances in Auto Scaling groups across multiple AWS Regions. Use Route 53 latency routing. Attach AWS WAF to Route 53.


D.

Deploy the application to EC2 instances in an Auto Scaling group across multiple Availability Zones. Use a Network Load Balancer (NLB). Use AWS WAF.


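For orientation, the multi-AZ Auto Scaling group and the AWS WAF association described in the options could be wired up along these lines with boto3. Every ARN, subnet, and name is a placeholder:

```python
import boto3

autoscaling = boto3.client("autoscaling")
wafv2 = boto3.client("wafv2")

# Auto Scaling group spread across subnets in multiple Availability Zones,
# registered with the ALB's target group
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="legacy-app-asg",
    MinSize=2,
    MaxSize=6,
    DesiredCapacity=2,
    VPCZoneIdentifier="subnet-az1-example,subnet-az2-example",
    TargetGroupARNs=["arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/legacy-app/EXAMPLE"],
    LaunchTemplate={"LaunchTemplateId": "lt-0123456789abcdef0", "Version": "$Latest"},
)

# Attach an existing AWS WAF web ACL (REGIONAL scope) to the ALB
wafv2.associate_web_acl(
    WebACLArn="arn:aws:wafv2:us-east-1:111122223333:regional/webacl/legacy-app-acl/EXAMPLE",
    ResourceArn="arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/app/legacy-app-alb/EXAMPLE",
)
```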
Question # 95:

A company is developing a social media application. The company anticipates rapid and unpredictable growth in users and data volume. The application needs to handle a continuous high volume of user requests. User requests include long-running processes that store large amounts of user-generated content and user profiles in a relational format. The processes must run in a specific order.

The company requires an architecture that can scale resources to meet demand spikes without downtime or performance degradation. The company must ensure that the components of the application can evolve independently without affecting other parts of the system.

Which combination of AWS services will meet these requirements?

Options:

A.

Deploy the application on Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Use Amazon RDS as the database. Use Amazon Simple Queue Service (Amazon SQS) to decouple message processing between components.


B.

Deploy the application on Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Use Amazon RDS as the database. Use Amazon Simple Notification Service (Amazon SNS) to decouple message processing between components.


C.

Use Amazon DynamoDB as the database. Use AWS Lambda functions to implement the application. Configure Amazon DynamoDB Streams to invoke the Lambda functions. Use AWS Step Functions to manage workflows between services.


D.

Use an AWS Elastic Beanstalk environment with auto scaling to deploy the application. Use Amazon RDS as the database. Use Amazon Simple Notification Service (Amazon SNS) to decouple message processing between components.


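Where an option relies on Amazon SQS to decouple components, the producer and consumer interaction looks roughly like this in boto3. The queue name, message shape, and choice of a FIFO queue (to reflect the ordering requirement) are assumptions:

```python
import json
import boto3

sqs = boto3.client("sqs")

# FIFO queue because the scenario requires the processing steps to run in a specific order
queue_url = sqs.create_queue(
    QueueName="content-processing.fifo",
    Attributes={"FifoQueue": "true", "ContentBasedDeduplication": "true"},
)["QueueUrl"]

# Producer component: hand off the long-running work instead of blocking the request path
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"userId": "u-123", "task": "process-upload"}),
    MessageGroupId="u-123",  # preserves per-user ordering
)

# Worker component: poll, process, then delete
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20)
for msg in resp.get("Messages", []):
    job = json.loads(msg["Body"])
    # ... run the ordered processing steps here ...
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```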
Question # 96:

A company is planning to run an AI/ML workload on AWS. The company needs to train a model on a dataset that is in Amazon S3 Standard. A model training application requires multiple compute nodes and single-digit millisecond access to the data.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Move the data to S3 Intelligent-Tiering. Point the model training application to S3 Intelligent-Tiering as the data source.


B.

Add partitions to the S3 bucket by adding random prefixes. Reconfigure the model training application to point to the new prefixes as the data source.


C.

Move the data to S3 Express One Zone. Point the model training application to S3 Express One Zone as the data source.


D.

Move the data to a General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volume attached to an Amazon EC2 instance. Point the model training application to the gp3 volume as the data source.


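If the dataset were relocated as in option C, one way to copy the existing objects into a separately created S3 Express One Zone directory bucket is sketched below. Both bucket names are placeholders:

```python
import boto3

s3 = boto3.client("s3")

SOURCE_BUCKET = "training-data-standard"        # placeholder S3 Standard bucket
DEST_BUCKET = "training-data--use1-az4--x-s3"   # placeholder directory bucket (created separately)

# Copy every object from the Standard bucket to the destination bucket
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=SOURCE_BUCKET):
    for obj in page.get("Contents", []):
        s3.copy(
            CopySource={"Bucket": SOURCE_BUCKET, "Key": obj["Key"]},
            Bucket=DEST_BUCKET,
            Key=obj["Key"],
        )
```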
Question # 97:

An ecommerce company wants a disaster recovery solution for its Amazon RDS DB instances that run Microsoft SQL Server Enterprise Edition. The company's current recovery point objective (RPO) and recovery time objective (RTO) are 24 hours.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create a cross-Region read replica and promote the read replica to the primary instance


B.

Use AWS Database Migration Service (AWS DMS) to create RDS cross-Region replication.


C.

Use cross-Region replication every 24 hours to copy native backups to an Amazon S3 bucket


D.

Copy automatic snapshots to another Region every 24 hours.


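A rough boto3 sketch of option D's snapshot-copy approach, run on a daily schedule (for example from an EventBridge-invoked Lambda function). The Regions and snapshot identifiers are placeholders:

```python
import boto3

# Run the copy from the DR Region and point back at the source Region's snapshot
rds_dr = boto3.client("rds", region_name="us-west-2")

rds_dr.copy_db_snapshot(
    SourceDBSnapshotIdentifier="arn:aws:rds:us-east-1:111122223333:snapshot:rds:sqlserver-db-2025-01-01-00-00",
    TargetDBSnapshotIdentifier="sqlserver-db-dr-copy-2025-01-01",
    SourceRegion="us-east-1",
)
```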
Question # 98:

A company provides a trading platform to customers. The platform uses an Amazon API Gateway REST API, AWS Lambda functions, and an Amazon DynamoDB table. Each trade that the platform processes invokes a Lambda function that stores the trade data in Amazon DynamoDB. The company wants to ingest trade data into a data lake in Amazon S3 for near real-time analysis.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon S3.


B.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.


C.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure Kinesis Data Streams to invoke a Lambda function that writes the data to Amazon S3.


D.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure a data stream to be the input for Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.


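To ground the streaming options, a hypothetical Lambda handler that consumes DynamoDB stream records and batches them into an Amazon Data Firehose delivery stream. The delivery stream name and record shape are assumptions:

```python
import json
import boto3

firehose = boto3.client("firehose")
DELIVERY_STREAM = "trades-to-s3"  # placeholder Amazon Data Firehose delivery stream


def handler(event, context):
    """Triggered by a DynamoDB stream; forwards new or updated trade records to Firehose."""
    records = []
    for record in event.get("Records", []):
        if record["eventName"] in ("INSERT", "MODIFY"):
            new_image = record["dynamodb"].get("NewImage", {})
            records.append({"Data": (json.dumps(new_image) + "\n").encode("utf-8")})

    if records:
        firehose.put_record_batch(DeliveryStreamName=DELIVERY_STREAM, Records=records)
```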
Question # 99:

A company hosts an application on AWS. The application gives users the ability to upload photos and store the photos in an Amazon S3 bucket. The company wants to use Amazon CloudFront and a custom domain name to upload the photo files to the S3 bucket in the eu-west-1 Region.

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.

Use AWS Certificate Manager (ACM) to create a public certificate in the us-east-1 Region. Use the certificate in CloudFront


B.

Use AWS Certificate Manager (ACM) to create a public certificate in eu-west-1. Use the certificate in CloudFront.


C.

Configure Amazon S3 to allow uploads from CloudFront. Configure S3 Transfer Acceleration.


D.

Configure Amazon S3 to allow uploads from CloudFront origin access control (OAC).


E.

Configure Amazon S3 to allow uploads from CloudFront. Configure an Amazon S3 website endpoint.


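Relevant background for the certificate options: CloudFront only accepts ACM certificates issued in the us-east-1 Region, regardless of the S3 bucket's Region. Requesting one there might look like this, with the domain name as a placeholder:

```python
import boto3

# CloudFront requires the ACM certificate to come from us-east-1
acm = boto3.client("acm", region_name="us-east-1")

response = acm.request_certificate(
    DomainName="photos.example.com",  # placeholder custom domain
    ValidationMethod="DNS",
)
certificate_arn = response["CertificateArn"]
# After DNS validation, reference certificate_arn in the CloudFront distribution's viewer certificate
```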
Question # 100:

A company runs an application on Amazon EC2 instances that have instance store volumes attached. The application uses Amazon Elastic File System (Amazon EFS) to store files that are shared across a cluster of Linux servers. The shared files are at least 1 GB in size.

The company accesses the files often for the first 7 days after creation. The files must remain readily available after the first 7 days.

The company wants to optimize costs for the application.

Which solution will meet these requirements?

Options:

A.

Configure an Amazon S3 File Gateway (AWS Storage Gateway) to cache frequently accessed files locally. Store older files in Amazon S3.


B.

Move the files from Amazon EFS, and store the files locally on each EC2 instance.


C.

Configure a lifecycle policy to move the files to the EFS Infrequent Access (IA) storage class after 7 days.


D.

Deploy AWS DataSync to automatically move files older than 7 days to Amazon S3 Glacier Deep Archive.


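A minimal sketch of the EFS lifecycle approach in option C. The file system ID is a placeholder, and the second policy is an optional assumption:

```python
import boto3

efs = boto3.client("efs")

# Transition files to EFS Infrequent Access after 7 days without access
efs.put_lifecycle_configuration(
    FileSystemId="fs-0123456789abcdef0",
    LifecyclePolicies=[
        {"TransitionToIA": "AFTER_7_DAYS"},
        {"TransitionToPrimaryStorageClass": "AFTER_1_ACCESS"},  # optional: move back on first access
    ],
)
```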
Question # 101:

A company plans to deploy containerized microservices in the AWS Cloud. The containers must mount a persistent file store that the company can manage by using OS-level permissions. The company requires fully managed services to host the containers and file store.

Which solution will meet these requirements?

Options:

A.

Use AWS Lambda functions and an Amazon API Gateway REST API to handle the microservices. Use Amazon S3 buckets for storage.


B.

Use Amazon EC2 instances to host the microservices. Use Amazon Elastic Block Store (Amazon EBS) volumes for storage.


C.

Use Amazon Elastic Container Service (Amazon ECS) containers on AWS Fargate to handle the microservices. Use an Amazon Elastic File System (Amazon EFS) file system for storage.


D.

Use Amazon Elastic Container Service (Amazon ECS) containers on AWS Fargate to handle the microservices. Use an Amazon EC2 instance that runs a dedicated file store for storage.


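For reference, mounting an EFS file system into an ECS Fargate task (the combination in option C) can be expressed in a task definition like the following. The image, file system ID, and paths are placeholders:

```python
import boto3

ecs = boto3.client("ecs")

# Fargate task definition that mounts an EFS file system at /mnt/shared
ecs.register_task_definition(
    family="microservice-with-efs",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="256",
    memory="512",
    containerDefinitions=[
        {
            "name": "app",
            "image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/app:latest",
            "mountPoints": [{"sourceVolume": "shared-files", "containerPath": "/mnt/shared"}],
        }
    ],
    volumes=[
        {
            "name": "shared-files",
            "efsVolumeConfiguration": {
                "fileSystemId": "fs-0123456789abcdef0",
                "transitEncryption": "ENABLED",
            },
        }
    ],
)
```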
Question # 102:

A company has developed an API using an Amazon API Gateway REST API and AWS Lambda functions. The API serves static and dynamic content to users worldwide. The company wants to decrease the latency of transferring content for API requests.

Which solution will meet these requirements?

Options:

A.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.


B.

Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.


C.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.


D.

Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.


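A sketch of how an edge-optimized REST API with a stage cache and payload compression could be set up with boto3. The API name, compression threshold, and cache size are placeholder choices:

```python
import boto3

apigw = boto3.client("apigateway")

# Create an edge-optimized REST API (served through the CloudFront edge network)
api = apigw.create_rest_api(
    name="content-api",  # placeholder
    endpointConfiguration={"types": ["EDGE"]},
    minimumCompressionSize=1024,  # enable content encoding for payloads over ~1 KB
)

# ...define resources, methods, and integrations, then deploy with a cache cluster...
apigw.create_deployment(
    restApiId=api["id"],
    stageName="prod",
    cacheClusterEnabled=True,
    cacheClusterSize="0.5",
)
```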
Question # 103:

A company needs a solution to integrate transaction data from several Amazon DynamoDB tables into an existing Amazon Redshift data warehouse. The solution must maintain the provisioned throughput of DynamoDB.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an Amazon S3 bucket. Configure DynamoDB to export to the bucket on a regular schedule. Use an Amazon Redshift COPY command to read from the S3 bucket.


B.

Use an Amazon Redshift COPY command to read directly from each DynamoDB table.


C.

Create an Amazon S3 bucket. Configure an AWS Lambda function to read from the DynamoDB tables and write to the S3 bucket on a regular schedule. Use Amazon Redshift Spectrum to access the data in the S3 bucket.


D.

Use Amazon Athena Federated Query with a DynamoDB connector and an Amazon Redshift connector to read directly from the DynamoDB tables.


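To illustrate option A, a DynamoDB table export to S3 (which reads from the point-in-time recovery backup rather than the table's provisioned throughput) followed by an example Redshift COPY. The ARNs, bucket, and table names are placeholders, and the COPY statement is only a simplified example:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Export a table (point-in-time recovery must be enabled) to S3 without consuming read capacity
dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:111122223333:table/transactions",  # placeholder
    S3Bucket="transactions-export",                                          # placeholder
    ExportFormat="DYNAMODB_JSON",
)

# A Redshift COPY could then load the exported files, for example:
# COPY transactions FROM 's3://transactions-export/AWSDynamoDB/'
#     IAM_ROLE 'arn:aws:iam::111122223333:role/redshift-copy-role' FORMAT AS JSON 'auto';
```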
Question # 104:

A company has a non-production application that runs on an Amazon EC2 instance. The EC2 instance has an instance profile and an associated IAM role.

The company wants to automate patching for the EC2 instance.

Which solution will meet this requirement?

Options:

A.

Create a new IAM role. Attach the AmazonSSMManagedInstanceCore policy to the new IAM role. Attach the new IAM role to the EC2 instance profile. Use AWS Systems Manager to patch the instance.


B.

Create an IAM user. Attach the AmazonSSMManagedInstanceCore policy to the IAM user. Configure AWS Systems Manager to use the IAM user to patch the instance.


C.

Attach the AmazonSSMManagedInstanceCore policy to the existing IAM role. Use AWS Systems Manager to patch the EC2 instance.


D.

Attach the AmazonSSMManagedInstanceCore policy to an existing IAM user. Use EC2 Image Builder to patch the EC2 instance.


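For context, attaching the Systems Manager managed policy to an existing role and then triggering a patch operation could look like this. The role name and instance ID are placeholders:

```python
import boto3

iam = boto3.client("iam")
ssm = boto3.client("ssm")

# Attach the Systems Manager managed policy to the instance's existing IAM role
iam.attach_role_policy(
    RoleName="non-prod-app-ec2-role",  # placeholder existing role
    PolicyArn="arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore",
)

# Once the instance is managed by Systems Manager, run a patch operation against it
ssm.send_command(
    InstanceIds=["i-0123456789abcdef0"],  # placeholder
    DocumentName="AWS-RunPatchBaseline",
    Parameters={"Operation": ["Install"]},
)
```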
Question # 105:

A company runs a web application on Amazon EC2 instances in an Auto Scaling group that has a target group. The company designed the application to work with session affinity (sticky sessions) for a better user experience.

The application must be available publicly over the internet as an endpoint. A WAF must be applied to the endpoint for additional security. Session affinity (sticky sessions) must be configured on the endpoint.

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.

Create a public Network Load Balancer. Specify the application target group.


B.

Create a Gateway Load Balancer. Specify the application target group.


C.

Create a public Application Load Balancer. Specify the application target group.


D.

Create a second target group. Add Elastic IP addresses to the EC2 instances.


E.

Create a web ACL in AWS WAF. Associate the web ACL with the endpoint.


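To make the endpoint requirements concrete, enabling sticky sessions on the existing target group and associating a web ACL with a public ALB might be scripted as follows. Every ARN is a placeholder:

```python
import boto3

elbv2 = boto3.client("elbv2")
wafv2 = boto3.client("wafv2")

target_group_arn = "arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/app-tg/EXAMPLE"    # placeholder
alb_arn = "arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/app/public-alb/EXAMPLE"    # placeholder
web_acl_arn = "arn:aws:wafv2:us-east-1:111122223333:regional/webacl/app-acl/EXAMPLE"                   # placeholder

# Turn on load-balancer-generated cookie stickiness for the application target group
elbv2.modify_target_group_attributes(
    TargetGroupArn=target_group_arn,
    Attributes=[
        {"Key": "stickiness.enabled", "Value": "true"},
        {"Key": "stickiness.type", "Value": "lb_cookie"},
        {"Key": "stickiness.lb_cookie.duration_seconds", "Value": "86400"},
    ],
)

# Associate the AWS WAF web ACL with the public ALB endpoint
wafv2.associate_web_acl(WebACLArn=web_acl_arn, ResourceArn=alb_arn)
```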