Pass the Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) questions and answers with CertsForce

Viewing page 12 out of 18 pages
Viewing questions 221-240
Questions # 221:

A company stores confidential data in an Amazon Aurora PostgreSQL database in the ap-southeast-3 Region. The database is encrypted with an AWS Key Management Service (AWS KMS) customer managed key. The company was recently acquired and must securely share a backup of the database with the acquiring company's AWS account in ap-southeast-3.

What should a solutions architect do to meet these requirements?

Options:

A.

Create a database snapshot. Copy the snapshot to a new unencrypted snapshot. Share the new snapshot with the acquiring company's AWS account.


B.

Create a database snapshot. Add the acquiring company's AWS account to the KMS key policy. Share the snapshot with the acquiring company's AWS account.


C.

Create a database snapshot that uses a different AWS managed KMS key. Add the acquiring company's AWS account to the KMS key alias. Share the snapshot with the acquiring company's AWS account.


D.

Create a database snapshot. Download the database snapshot. Upload the database snapshot to an Amazon S3 bucket. Update the S3 bucket policy to allow access from the acquiring company's AWS account.


Expert Solution
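To make option B concrete, the cross-account sharing step hinges on the KMS key policy. The following is a minimal sketch of the policy statement that grants the acquiring account use of the customer managed key; the account ID and Sid are placeholders, not values from the question.

```python
import json

# Hypothetical account ID for illustration only.
ACQUIRING_ACCOUNT = "222222222222"

# Key policy statement letting the acquiring account use the shared,
# KMS-encrypted snapshot (the approach described in option B).
cross_account_statement = {
    "Sid": "AllowUseOfKeyByAcquiringAccount",
    "Effect": "Allow",
    "Principal": {"AWS": f"arn:aws:iam::{ACQUIRING_ACCOUNT}:root"},
    "Action": [
        "kms:Decrypt",        # read the encrypted snapshot
        "kms:DescribeKey",    # inspect the key
        "kms:CreateGrant",    # needed to copy/restore the snapshot
    ],
    "Resource": "*",
}

print(json.dumps(cross_account_statement, indent=2))
```

This statement is added alongside the existing key policy; the snapshot itself is then shared through the RDS snapshot-sharing settings.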
Questions # 222:

A data analytics company wants to migrate its batch processing system to AWS. The company receives thousands of small data files periodically during the day through FTP. An on-premises batch job processes the data files overnight. However, the batch job takes hours to finish running.

The company wants the AWS solution to process incoming data files as soon as possible with minimal changes to the FTP clients that send the files. The solution must delete the incoming data files after the files have been processed successfully. Processing for each file needs to take 3-8 minutes.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Use an Amazon EC2 instance that runs an FTP server to store incoming files as objects in Amazon S3 Glacier Flexible Retrieval. Configure a job queue in AWS Batch. Use Amazon EventBridge rules to invoke the job to process the objects nightly from S3 Glacier Flexible Retrieval. Delete the objects after the job has processed the objects.


B.

Use an Amazon EC2 instance that runs an FTP server to store incoming files on an Amazon Elastic Block Store (Amazon EBS) volume. Configure a job queue in AWS Batch. Use Amazon EventBridge rules to invoke the job to process the files nightly from the EBS volume. Delete the files after the job has processed the files.


C.

Use AWS Transfer Family to create an FTP server to store incoming files on an Amazon Elastic Block Store (Amazon EBS) volume. Configure a job queue in AWS Batch. Use an Amazon S3 event notification when each file arrives to invoke the job in AWS Batch. Delete the files after the job has processed the files.


D.

Use AWS Transfer Family to create an FTP server to store incoming files in Amazon S3 Standard. Create an AWS Lambda function to process the files and to delete the files after they are processed. Use an S3 event notification to invoke the Lambda function when the files arrive.


Expert Solution
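Option D's event-driven flow can be sketched as a Lambda handler that parses the S3 event notification, processes each object, and deletes it afterwards. This is a minimal, hypothetical sketch: the processing and the boto3 delete call are left as comments so the structure is runnable on its own.

```python
# Minimal sketch of option D's Lambda handler (hypothetical; a real
# handler would use boto3 to fetch, process, and delete the object).

def extract_s3_objects(event):
    """Return (bucket, key) pairs from an S3 event notification payload."""
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
    ]

def handler(event, context=None):
    processed = []
    for bucket, key in extract_s3_objects(event):
        # process_file(bucket, key)   # 3-8 minutes of work fits within
        #                             # Lambda's 15-minute maximum timeout
        # s3.delete_object(Bucket=bucket, Key=key)  # delete after success
        processed.append(key)
    return {"processed": processed}

# Shape of a typical S3 event notification (abbreviated).
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "incoming-files"},
                "object": {"key": "batch/file-001.csv"}}}
    ]
}
print(handler(sample_event))
```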
Questions # 223:

A company has a three-tier application on AWS that ingests sensor data from its users' devices. The traffic flows through a Network Load Balancer (NLB), then to Amazon EC2 instances for the web tier, and finally to EC2 instances for the application tier. The application tier makes calls to a database.

What should a solutions architect do to improve the security of the data in transit?

Options:

A.

Configure a TLS listener. Deploy the server certificate on the NLB.


B.

Configure AWS Shield Advanced. Enable AWS WAF on the NLB.


C.

Change the load balancer to an Application Load Balancer (ALB). Enable AWS WAF on the ALB.


D.

Encrypt the Amazon Elastic Block Store (Amazon EBS) volume on the EC2 instances by using AWS Key Management Service (AWS KMS)


Expert Solution
Questions # 224:

A company is using Amazon CloudFront with its website. The company has enabled logging on the CloudFront distribution, and logs are saved in one of the company's Amazon S3 buckets. The company needs to perform advanced analyses on the logs and build visualizations.

What should a solutions architect do to meet these requirements?

Options:

A.

Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.


B.

Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.


C.

Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.


D.

Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.


Expert Solution
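Option B pairs Athena's standard SQL with QuickSight for visualization. A query over the logs might look like the following sketch, assuming an external table named cloudfront_logs has already been defined over the S3 log prefix (the table and column names are assumptions, not given in the question).

```python
# Hypothetical Athena query over a table defined on the CloudFront log
# prefix in S3; "status" follows CloudFront's standard log fields, but
# treat the schema as an assumption.
query = """
SELECT status, COUNT(*) AS requests
FROM cloudfront_logs
GROUP BY status
ORDER BY requests DESC
""".strip()

print(query)
```

Athena runs such queries directly against the S3 bucket, and QuickSight can use the Athena table as a data source for dashboards.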
Questions # 225:

A company has migrated an application to Amazon EC2 Linux instances. One of these EC2 instances runs several 1-hour tasks on a schedule. These tasks were written by different teams and have no common programming language. The company is concerned about performance and scalability while these tasks run on a single instance. A solutions architect needs to implement a solution to resolve these concerns.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use AWS Batch to run the tasks as jobs. Schedule the jobs by using Amazon EventBridge (Amazon CloudWatch Events).


B.

Convert the EC2 instance to a container. Use AWS App Runner to create the container on demand to run the tasks as jobs.


C.

Copy the tasks into AWS Lambda functions. Schedule the Lambda functions by using Amazon EventBridge (Amazon CloudWatch Events).


D.

Create an Amazon Machine Image (AMI) of the EC2 instance that runs the tasks. Create an Auto Scaling group with the AMI to run multiple copies of the instance.


Expert Solution
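Option A's wiring can be sketched as boto3-style parameter dicts: an EventBridge rule with a cron schedule, and a target that submits a job to an AWS Batch queue. All names, ARNs, and the schedule are illustrative placeholders; no AWS call is made here.

```python
# EventBridge rule firing on a schedule (placeholder cron: top of every
# hour, UTC). In boto3 this would be passed to events.put_rule(...).
rule = {
    "Name": "hourly-task-schedule",
    "ScheduleExpression": "cron(0 * * * ? *)",
    "State": "ENABLED",
}

# Target submitting a job to a Batch queue (placeholder ARNs). Each
# team's task can ship as its own container image, so the mixed
# programming languages are not a problem.
target = {
    "Id": "batch-task",
    "Arn": "arn:aws:batch:us-east-1:111111111111:job-queue/scheduled-tasks",
    "RoleArn": "arn:aws:iam::111111111111:role/eventbridge-batch",
    "BatchParameters": {
        "JobDefinition": "task-runner:1",
        "JobName": "scheduled-task",
    },
}
```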
Questions # 226:

A company has a web application with sporadic usage patterns. There is heavy usage at the beginning of each month, moderate usage at the start of each week, and unpredictable usage during the week. The application consists of a web server and a MySQL database server running inside the data center. The company would like to move the application to the AWS Cloud and needs to select a cost-effective database platform that will not require database modifications.

Which solution will meet these requirements?

Options:

A.

Amazon DynamoDB


B.

Amazon RDS for MySQL


C.

MySQL-compatible Amazon Aurora Serverless


D.

MySQL deployed on Amazon EC2 in an Auto Scaling group


Expert Solution
Questions # 227:

An online learning company is migrating to the AWS Cloud. The company maintains its student records in a PostgreSQL database. The company needs a solution in which its data is available and online across multiple AWS Regions at all times.

Which solution will meet these requirements with the LEAST amount of operational overhead?

Options:

A.

Migrate the PostgreSQL database to a PostgreSQL cluster on Amazon EC2 instances.


B.

Migrate the PostgreSQL database to an Amazon RDS for PostgreSQL DB instance with the Multi-AZ feature turned on.


C.

Migrate the PostgreSQL database to an Amazon RDS for PostgreSQL DB instance. Create a read replica in another Region.


D.

Migrate the PostgreSQL database to an Amazon RDS for PostgreSQL DB instance. Set up DB snapshots to be copied to another Region.


Expert Solution
Questions # 228:

A company is experiencing sudden increases in demand. The company needs to provision large Amazon EC2 instances from an Amazon Machine Image (AMI). The instances will run in an Auto Scaling group. The company needs a solution that provides minimum initialization latency to meet the demand.

Which solution meets these requirements?

Options:

A.

Use the aws ec2 register-image command to create an AMI from a snapshot. Use AWS Step Functions to replace the AMI in the Auto Scaling group.


B.

Enable Amazon Elastic Block Store (Amazon EBS) fast snapshot restore on a snapshot. Provision an AMI by using the snapshot. Replace the AMI in the Auto Scaling group with the new AMI.


C.

Enable AMI creation and define lifecycle rules in Amazon Data Lifecycle Manager (Amazon DLM). Create an AWS Lambda function that modifies the AMI in the Auto Scaling group.


D.

Use Amazon EventBridge (Amazon CloudWatch Events) to invoke AWS Backup lifecycle policies that provision AMIs. Configure Auto Scaling group capacity limits as an event source in EventBridge.


Expert Solution
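The first step of option B can be sketched as the parameters for the EC2 enable_fast_snapshot_restores API. Fast snapshot restore pre-initializes the snapshot in the chosen Availability Zones so instances launched from the derived AMI avoid the usual lazy-load latency. The zone names and snapshot ID below are placeholders.

```python
# boto3-style parameters for ec2.enable_fast_snapshot_restores(...)
# (placeholder IDs; no AWS call is made here).
fsr_params = {
    "AvailabilityZones": ["us-east-1a", "us-east-1b"],  # AZs used by the ASG
    "SourceSnapshotIds": ["snap-0abc1234"],             # snapshot behind the AMI
}
```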
Questions # 229:

A company is planning to store data on Amazon RDS DB instances. The company must encrypt the data at rest.

What should a solutions architect do to meet this requirement?

Options:

A.

Create an encryption key and store the key in AWS Secrets Manager. Use the key to encrypt the DB instances.


B.

Generate a certificate in AWS Certificate Manager (ACM). Enable SSL/TLS on the DB instances by using the certificate.


C.

Create a customer master key (CMK) in AWS Key Management Service (AWS KMS). Enable encryption for the DB instances.


D.

Generate a certificate in AWS Identity and Access Management (IAM). Enable SSL/TLS on the DB instances by using the certificate.


Expert Solution
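Option C in practice is a pair of fields at instance creation time. This sketch shows boto3-style create_db_instance parameters with encryption at rest enabled via a KMS key; the identifiers, engine, and key ARN are illustrative placeholders.

```python
# boto3-style parameters for rds.create_db_instance(...) with encryption
# at rest (placeholder values; no AWS call is made here).
db_params = {
    "DBInstanceIdentifier": "app-db",
    "Engine": "mysql",
    "DBInstanceClass": "db.m5.large",
    "AllocatedStorage": 100,
    "StorageEncrypted": True,  # encrypts storage, automated backups,
                               # snapshots, and read replicas
    "KmsKeyId": "arn:aws:kms:us-east-1:111111111111:key/EXAMPLE-KEY-ID",
}
```

Note that encryption must be set when the instance is created; an existing unencrypted instance is typically migrated by restoring an encrypted copy of a snapshot.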
Questions # 230:

A company wants to create an application to store employee data in a hierarchical structured relationship. The company needs a minimum-latency response to high-traffic queries for the employee data and must protect any sensitive data. The company also needs to receive monthly email messages if any financial information is present in the employee data.

Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

Options:

A.

Use Amazon Redshift to store the employee data in hierarchies. Unload the data to Amazon S3 every month.


B.

Use Amazon DynamoDB to store the employee data in hierarchies. Export the data to Amazon S3 every month.


C.

Configure Amazon Macie for the AWS account. Integrate Macie with Amazon EventBridge to send monthly events to AWS Lambda.


D.

Use Amazon Athena to analyze the employee data in Amazon S3. Integrate Athena with Amazon QuickSight to publish analysis dashboards and share the dashboards with users.


E.

Configure Amazon Macie for the AWS account. Integrate Macie with Amazon EventBridge to send monthly notifications through an Amazon Simple Notification Service (Amazon SNS) subscription.


Expert Solution
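Option E's notification path can be sketched as an EventBridge rule that matches Macie findings and routes them to an SNS topic, whose subscribers receive email. The event pattern fields below match how Macie publishes findings to EventBridge; the rule name and ARNs are placeholders.

```python
import json

# Match Macie findings arriving on the default event bus.
event_pattern = {
    "source": ["aws.macie"],
    "detail-type": ["Macie Finding"],
}

# boto3-style events.put_rule(...) parameters (placeholder name).
rule = {
    "Name": "macie-financial-findings",
    "EventPattern": json.dumps(event_pattern),
    "State": "ENABLED",
}

# Target: an SNS topic with an email subscription (placeholder ARN).
sns_target = {
    "Id": "notify-team",
    "Arn": "arn:aws:sns:us-east-1:111111111111:macie-findings",
}
```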
Questions # 231:

A company wants to use high performance computing (HPC) infrastructure on AWS for financial risk modeling. The company's HPC workloads run on Linux. Each HPC workflow runs on hundreds of Amazon EC2 Spot Instances, is short-lived, and generates thousands of output files that are ultimately stored in persistent storage for analytics and long-term future use.

The company seeks a cloud storage solution that permits the copying of on-premises data to long-term persistent storage to make data available for processing by all EC2 instances. The solution should also be a high performance file system that is integrated with persistent storage to read and write datasets and output files.

Which combination of AWS services meets these requirements?

Options:

A.

Amazon FSx for Lustre integrated with Amazon S3


B.

Amazon FSx for Windows File Server integrated with Amazon S3


C.

Amazon S3 Glacier integrated with Amazon Elastic Block Store (Amazon EBS)


D.

Amazon S3 bucket with a VPC endpoint integrated with an Amazon Elastic Block Store (Amazon EBS) General Purpose SSD (gp2) volume


Expert Solution
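Option A's S3 integration is configured when the file system is created. This sketch shows boto3-style create_file_system parameters linking an FSx for Lustre file system to an S3 bucket; the bucket name, subnet ID, and capacity are placeholders.

```python
# boto3-style parameters for fsx.create_file_system(...) with S3
# integration (placeholder values; no AWS call is made here).
fsx_params = {
    "FileSystemType": "LUSTRE",
    "StorageCapacity": 1200,  # GiB
    "SubnetIds": ["subnet-0abc1234"],
    "LustreConfiguration": {
        "ImportPath": "s3://hpc-datasets",          # lazy-load input data from S3
        "ExportPath": "s3://hpc-datasets/results",  # write output files back to S3
    },
}
```

The EC2 Spot fleet mounts the Lustre file system for high-throughput reads and writes, while S3 serves as the durable long-term store for the output files.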
Questions # 232:

A company wants to use Amazon S3 for the secondary copy of its on-premises dataset. The company would rarely need to access this copy. The storage solution’s cost should be minimal.

Which storage solution meets these requirements?

Options:

A.

S3 Standard


B.

S3 Intelligent-Tiering


C.

S3 Standard-Infrequent Access (S3 Standard-IA)


D.

S3 One Zone-Infrequent Access (S3 One Zone-IA)


Expert Solution
Questions # 233:

An image-hosting company stores its objects in Amazon S3 buckets. The company wants to avoid accidental exposure of the objects in the S3 buckets to the public. All S3 objects in the entire AWS account need to remain private.

Which solution will meet these requirements?

Options:

A.

Use Amazon GuardDuty to monitor S3 bucket policies. Create an automatic remediation action rule that uses an AWS Lambda function to remediate any change that makes the objects public.


B.

Use AWS Trusted Advisor to find publicly accessible S3 buckets. Configure email notifications in Trusted Advisor when a change is detected. Manually change the S3 bucket policy if it allows public access.


C.

Use AWS Resource Access Manager to find publicly accessible S3 buckets. Use Amazon Simple Notification Service (Amazon SNS) to invoke an AWS Lambda function when a change is detected. Deploy a Lambda function that programmatically remediates the change.


D.

Use the S3 Block Public Access feature at the account level. Use AWS Organizations to create a service control policy (SCP) that prevents IAM users from changing the setting. Apply the SCP to the account.


Expert Solution
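Option D's SCP can be sketched as a deny statement on the actions that would weaken Block Public Access. The Sid is a placeholder; the two S3 actions are the account-level and bucket-level Block Public Access write operations.

```python
import json

# Sketch of option D's service control policy: deny anyone in the
# account from changing the S3 Block Public Access settings.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyBlockPublicAccessChanges",
            "Effect": "Deny",
            "Action": [
                "s3:PutAccountPublicAccessBlock",  # account-level setting
                "s3:PutBucketPublicAccessBlock",   # per-bucket setting
            ],
            "Resource": "*",
        }
    ],
}
print(json.dumps(scp, indent=2))
```

With Block Public Access enabled at the account level and this SCP attached, bucket policies and ACLs cannot make objects public, and the guardrail itself cannot be turned off from within the account.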
Questions # 234:

A company has a Microsoft .NET application that runs on an on-premises Windows Server. The application stores data by using an Oracle Database Standard Edition server. The company is planning a migration to AWS and wants to minimize development changes while moving the application. The AWS application environment should be highly available.

Which combination of actions should the company take to meet these requirements? (Select TWO.)

Options:

A.

Refactor the application as serverless with AWS Lambda functions running .NET Core.


B.

Rehost the application in AWS Elastic Beanstalk with the .NET platform in a Multi-AZ deployment.


C.

Replatform the application to run on Amazon EC2 with the Amazon Linux Amazon Machine Image (AMI).


D.

Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Amazon DynamoDB in a Multi-AZ deployment.


E.

Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Oracle on Amazon RDS in a Multi-AZ deployment.


Expert Solution
Questions # 235:

A company is moving its data management application to AWS. The company wants to transition to an event-driven architecture. The architecture needs to be more distributed and to use serverless concepts while performing the different aspects of the workflow. The company also wants to minimize operational overhead.

Which solution will meet these requirements?

Options:

A.

Build out the workflow in AWS Glue. Use AWS Glue to invoke AWS Lambda functions to process the workflow steps.


B.

Build out the workflow in AWS Step Functions. Deploy the application on Amazon EC2 instances. Use Step Functions to invoke the workflow steps on the EC2 instances.


C.

Build out the workflow in Amazon EventBridge. Use EventBridge to invoke AWS Lambda functions on a schedule to process the workflow steps.


D.

Build out the workflow in AWS Step Functions. Use Step Functions to create a state machine. Use the state machine to invoke AWS Lambda functions to process the workflow steps.


Expert Solution
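Option D's state machine is defined in Amazon States Language (ASL). The following is a minimal sketch of an ASL definition chaining two Lambda-backed Task states; the state names and function ARNs are illustrative placeholders.

```python
import json

# Minimal Amazon States Language definition: two Lambda-backed steps
# chained into a workflow (placeholder function ARNs).
definition = {
    "Comment": "Event-driven data management workflow (sketch)",
    "StartAt": "Validate",
    "States": {
        "Validate": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111111111111:function:validate",
            "Next": "Transform",
        },
        "Transform": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111111111111:function:transform",
            "End": True,
        },
    },
}
print(json.dumps(definition, indent=2))
```

Step Functions manages retries, state, and sequencing, so the Lambda functions stay small and the company avoids running any servers for orchestration.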
Questions # 236:

A company runs a web application that is backed by Amazon RDS. A new database administrator caused data loss by accidentally editing information in a database table. To help recover from this type of incident, the company wants the ability to restore the database to its state from 5 minutes before any change within the last 30 days.

Which feature should the solutions architect include in the design to meet this requirement?

Options:

A.

Read replicas


B.

Manual snapshots


C.

Automated backups


D.

Multi-AZ deployments


Expert Solution
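With automated backups (option C) and a 30-day retention period, recovery is a point-in-time restore to a new instance. This sketch shows boto3-style restore_db_instance_to_point_in_time parameters; the identifiers and timestamp are placeholders.

```python
from datetime import datetime, timezone

# boto3-style parameters for rds.restore_db_instance_to_point_in_time(...)
# (placeholder values; no AWS call is made here). The restore creates a
# new instance at the chosen moment, e.g. 5 minutes before the bad edit.
restore_params = {
    "SourceDBInstanceIdentifier": "app-db",
    "TargetDBInstanceIdentifier": "app-db-restored",
    "RestoreTime": datetime(2024, 5, 1, 11, 55, tzinfo=timezone.utc),
}
```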
Questions # 237:

A company hosts a web application on multiple Amazon EC2 instances. The EC2 instances are in an Auto Scaling group that scales in response to user demand. The company wants to optimize cost savings without making a long-term commitment.

Which EC2 instance purchasing option should a solutions architect recommend to meet these requirements?

Options:

A.

Dedicated Instances only


B.

On-Demand Instances only


C.

A mix of On-Demand Instances and Spot Instances


D.

A mix of On-Demand Instances and Reserved Instances


Expert Solution
Questions # 238:

A solutions architect needs to design a system to store client case files. The files are core company assets and are important. The number of files will grow over time.

The files must be simultaneously accessible from multiple application servers that run on Amazon EC2 instances. The solution must have built-in redundancy.

Which solution meets these requirements?

Options:

A.

Amazon Elastic File System (Amazon EFS)


B.

Amazon Elastic Block Store (Amazon EBS)


C.

Amazon S3 Glacier Deep Archive


D.

AWS Backup


Expert Solution
Questions # 239:

An application that is hosted on Amazon EC2 instances needs to access an Amazon S3 bucket. Traffic must not traverse the internet. How should a solutions architect configure access to meet these requirements?

Options:

A.

Create a private hosted zone by using Amazon Route 53


B.

Set up a gateway VPC endpoint for Amazon S3 in the VPC


C.

Configure the EC2 instances to use a NAT gateway to access the S3 bucket


D.

Establish an AWS Site-to-Site VPN connection between the VPC and the S3 bucket


Expert Solution
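Option B is a gateway VPC endpoint, which adds routes for S3 to the VPC route tables so traffic to the bucket stays on the AWS network. This sketch shows boto3-style create_vpc_endpoint parameters; the VPC ID, Region, and route table ID are placeholders.

```python
# boto3-style parameters for ec2.create_vpc_endpoint(...) (placeholder
# IDs; no AWS call is made here).
endpoint_params = {
    "VpcId": "vpc-0abc1234",
    "ServiceName": "com.amazonaws.us-east-1.s3",  # S3 service in the VPC's Region
    "VpcEndpointType": "Gateway",                 # gateway endpoints are free
    "RouteTableIds": ["rtb-0def5678"],            # tables used by the EC2 subnets
}
```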
Questions # 240:

A company runs an application on Amazon EC2 Linux instances across multiple Availability Zones. The application needs a storage layer that is highly available and Portable Operating System Interface (POSIX) compliant. The storage layer must provide maximum data durability and must be shareable across the EC2 instances. The data in the storage layer will be accessed frequently for the first 30 days and will be accessed infrequently after that time.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use the Amazon S3 Standard storage class. Create an S3 Lifecycle policy to move infrequently accessed data to S3 Glacier.


B.

Use the Amazon S3 Standard storage class. Create an S3 Lifecycle policy to move infrequently accessed data to S3 Standard-Infrequent Access (S3 Standard-IA).


C.

Use the Amazon Elastic File System (Amazon EFS) Standard storage class. Create a lifecycle management policy to move infrequently accessed data to EFS Standard-Infrequent Access (EFS Standard-IA).


D.

Use the Amazon Elastic File System (Amazon EFS) One Zone storage class. Create a Lifecycle management policy to move infrequently accessed data to EFS One Zone-Infrequent Access (EFS One Zone-IA).


Expert Solution
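Option C's lifecycle rule is a single setting on the file system. This sketch shows a boto3-style put_lifecycle_configuration payload moving files to EFS Standard-IA after 30 days without access; the file system ID is a placeholder.

```python
# boto3-style payload for efs.put_lifecycle_configuration(...)
# (placeholder file system ID; no AWS call is made here).
lifecycle = {
    "FileSystemId": "fs-0abc1234",
    "LifecyclePolicies": [
        # Transition files not accessed for 30 days to the cheaper IA class.
        {"TransitionToIA": "AFTER_30_DAYS"},
    ],
}
```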