
Pass the Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) questions and answers with CertsForce

Viewing page 3 out of 13 pages
Viewing questions 31-45 out of questions
Questions # 31:

A financial services company plans to launch a new application on AWS to handle sensitive financial transactions. The company will deploy the application on Amazon EC2 instances. The company will use Amazon RDS for MySQL as the database. The company's security policies mandate that data must be encrypted at rest and in transit.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Configure encryption at rest for Amazon RDS for MySQL by using AWS KMS managed keys. Configure AWS Certificate Manager (ACM) SSL/TLS certificates for encryption in transit.


B.

Configure encryption at rest for Amazon RDS for MySQL by using AWS KMS managed keys. Configure IPsec tunnels for encryption in transit.


C.

Implement third-party application-level data encryption before storing data in Amazon RDS for MySQL. Configure AWS Certificate Manager (ACM) SSL/TLS certificates for encryption in transit.


D.

Configure encryption at rest for Amazon RDS for MySQL by using AWS KMS managed keys. Configure a VPN connection to enable private connectivity to encrypt data in transit.
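
As a rough illustration of what "encrypted at rest with KMS, TLS in transit" looks like in practice for RDS for MySQL, the sketch below builds the relevant CreateDBInstance parameters and the MySQL parameter-group setting that forces TLS on every connection. The instance identifier and instance class are hypothetical placeholders.

```python
# Request parameters for the RDS CreateDBInstance API (shape only, no
# call is made). StorageEncrypted plus a KMS key covers encryption at
# rest; the instance identifier and class are hypothetical.
create_db_instance_params = {
    "DBInstanceIdentifier": "finance-db",     # hypothetical name
    "Engine": "mysql",
    "DBInstanceClass": "db.m6g.large",        # hypothetical class
    "StorageEncrypted": True,                 # encryption at rest
    "KmsKeyId": "alias/aws/rds",              # AWS managed KMS key
}

# For MySQL, encryption in transit is enforced at the engine level by
# requiring TLS on every client connection via a custom parameter group.
parameter_group_settings = {
    "require_secure_transport": "ON",
}
```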


Expert Solution
Questions # 32:

A data science team needs storage for nightly log processing. The size and number of the logs are unknown, and the logs persist for only 24 hours.

What is the MOST cost-effective solution?

Options:

A.

Amazon S3 Glacier Deep Archive


B.

Amazon S3 Standard


C.

Amazon S3 Intelligent-Tiering


D.

Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)
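
Because the logs live for only 24 hours, any storage class with a minimum storage duration charge (Glacier, One Zone-IA, Intelligent-Tiering's archive tiers) bills for time the data no longer exists. A minimal sketch of pairing S3 Standard with a one-day lifecycle expiration, the shortest S3 supports; the rule ID and prefix are hypothetical:

```python
# An S3 lifecycle configuration (shape follows PutBucketLifecycleConfiguration)
# that expires log objects one day after creation -- the shortest
# expiration S3 allows, matching the 24-hour retention window.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "expire-nightly-logs",      # hypothetical rule ID
            "Filter": {"Prefix": "logs/"},    # hypothetical prefix
            "Status": "Enabled",
            "Expiration": {"Days": 1},        # minimum lifecycle expiration
        }
    ]
}
```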


Expert Solution
Questions # 33:

A company hosts multiple applications on AWS for different product lines. The applications use different compute resources, including Amazon EC2 instances and Application Load Balancers. The applications run in different AWS accounts under the same organization in AWS Organizations across multiple AWS Regions. Teams for each product line have tagged each compute resource in the individual accounts.

The company wants more details about the cost for each product line from the consolidated billing feature in Organizations.

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.

Select a specific AWS generated tag in the AWS Billing console.


B.

Select a specific user-defined tag in the AWS Billing console.


C.

Select a specific user-defined tag in the AWS Resource Groups console.


D.

Activate the selected tag from each AWS account.


E.

Activate the selected tag from the Organizations management account.
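
User-defined tags only appear in consolidated billing after they are activated as cost allocation tags, and in an organization that activation happens from the management account. Assuming the Cost Explorer UpdateCostAllocationTagsStatus API (boto3: `ce.update_cost_allocation_tags_status`), the request body would look like the sketch below; the tag key is a hypothetical example.

```python
# Request body for activating a user-defined tag as a cost allocation
# tag from the Organizations management account (payload shape only;
# no API call is made). The tag key is hypothetical.
activation_request = {
    "CostAllocationTagsStatus": [
        {"TagKey": "product-line", "Status": "Active"},
    ]
}
```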


Expert Solution
Questions # 34:

A company is building an ecommerce application that uses a relational database to store customer data and order history. The company also needs a solution to store 100 GB of product images. The company expects the traffic flow for the application to be predictable.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use Amazon RDS for MySQL for the database. Store the product images in an Amazon S3 bucket.


B.

Use Amazon DynamoDB for the database. Store the product images in an Amazon S3 bucket.


C.

Use Amazon RDS for MySQL for the database. Store the product images in an Amazon Aurora MySQL database.


D.

Create three Amazon EC2 instances. Install MongoDB software on the instances to use as the database. Store the product images in an Amazon RDS for MySQL database with a Multi-AZ deployment.


Expert Solution
Questions # 35:

A company is designing a new internal web application in the AWS Cloud. The new application must securely retrieve and store multiple employee usernames and passwords from an AWS managed service. Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve usernames and passwords from Parameter Store.


B.

Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.


C.

Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Parameter Store.


D.

Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.


Expert Solution
Questions # 36:

A multinational company operates in multiple AWS Regions. The company must ensure that its developers and administrators have secure, role-based access to AWS resources.

The roles must be specific to each user's geographic location and job responsibilities.

The company wants to implement a solution to ensure that each team can access only resources within the team's Region. The company wants to use its existing directory service to manage user access. The existing directory service organizes users into roles based on location. The system must be capable of integrating seamlessly with multi-factor authentication (MFA).

Which solution will meet these requirements?

Options:

A.

Use AWS Security Token Service (AWS STS) to generate temporary access tokens. Integrate STS with the directory service. Assign Region-specific roles.


B.

Configure AWS IAM Identity Center with federated access. Integrate IAM Identity Center with the directory service to set up Region-specific IAM roles.


C.

Create IAM managed policies that restrict access by location. Apply policies based on group membership in the directory.


D.

Use custom Lambda functions to dynamically assign IAM policies based on login location and job function.


Expert Solution
Questions # 37:

A company wants to migrate an Oracle database to AWS. The database consists of a single table that contains millions of geographic information systems (GIS) images that are high resolution and are identified by a geographic code.

When a natural disaster occurs, tens of thousands of images get updated every few minutes. Each geographic code has a single image or row that is associated with it. The company wants a solution that is highly available and scalable during such events.

Which solution will meet these requirements?

Options:

A.

Store the images and geographic codes in a database table. Use Oracle running on an Amazon RDS Multi-AZ DB instance.


B.

Store the images in Amazon S3 buckets. Use Amazon DynamoDB with the geographic code as the key and the image S3 URL as the value.


C.

Store the images and geographic codes in an Amazon DynamoDB table. Configure DynamoDB Accelerator (DAX) during times of high load.


D.

Store the images in Amazon S3 buckets. Store geographic codes and image S3 URLs in a database table. Use Oracle running on an Amazon RDS Multi-AZ DB instance.
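
The pattern in option B keeps the large binaries out of the database entirely: the high-resolution image lives in S3, and DynamoDB holds a small item keyed on the geographic code that points at it. A minimal sketch of that item in DynamoDB's low-level attribute-value format; the attribute names, bucket, and geographic code are hypothetical.

```python
def build_image_item(geo_code: str, bucket: str) -> dict:
    """Build a DynamoDB item (low-level attribute-value format) that
    maps a geographic code to the S3 URL of its current image.
    Attribute names and the key format are hypothetical examples."""
    return {
        "GeoCode": {"S": geo_code},                          # partition key
        "ImageUrl": {"S": f"s3://{bucket}/{geo_code}.tif"},  # S3 pointer
    }

# During a disaster event, each update simply overwrites this one small
# item -- the image bytes are replaced in S3, not in the database.
item = build_image_item("US-CA-0042", "gis-images")
```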


Expert Solution
Questions # 38:

A company is migrating some workloads to AWS. However, many workloads will remain on premises. The on-premises workloads require secure and reliable connectivity to AWS with consistent, low-latency performance.

The company has deployed the AWS workloads across multiple AWS accounts and multiple VPCs. The company plans to scale to hundreds of VPCs within the next year.

The company must establish connectivity between each of the VPCs and from the on-premises environment to each VPC.

Which solution will meet these requirements?

Options:

A.

Use an AWS Direct Connect connection to connect the on-premises environment to AWS. Configure VPC peering to establish connectivity between VPCs.


B.

Use multiple AWS Site-to-Site VPN connections to connect the on-premises environment to AWS. Create a transit gateway to establish connectivity between VPCs.


C.

Use an AWS Direct Connect connection with a Direct Connect gateway to connect the on-premises environment to AWS. Create a transit gateway to establish connectivity between VPCs. Associate the transit gateway with the Direct Connect gateway.


D.

Use an AWS Site-to-Site VPN connection to connect the on-premises environment to AWS. Configure VPC peering to establish connectivity between VPCs.


Expert Solution
Questions # 39:

An insurance company runs an application on premises to process contracts. The application processes jobs that consist of many tasks. The individual tasks run for up to 5 minutes. Some jobs can take up to 24 hours in total to finish. If a task fails, the task must be reprocessed.

The company wants to migrate the application to AWS. The company will use Amazon S3 as part of the solution. The company wants to configure jobs to start automatically when a contract is uploaded to an S3 bucket.

Which solution will meet these requirements?

Options:

A.

Use AWS Lambda functions to process individual tasks. Create a primary Lambda function to handle the overall job processing by calling individual Lambda functions in sequence. Configure the S3 bucket to send an event notification to invoke the primary Lambda function to begin processing.


B.

Use a state machine in AWS Step Functions to handle the overall contract processing job. Configure the S3 bucket to send an event notification to Amazon EventBridge. Create a rule in Amazon EventBridge to target the state machine.


C.

Use an AWS Batch job to handle the overall contract processing job. Configure the S3 bucket to send an event notification to initiate the Batch job.


D.

Use an S3 event notification to notify an Amazon Simple Queue Service (Amazon SQS) queue when a contract is uploaded. Configure an AWS Lambda function to read messages from the queue and to run the contract processing job.
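
The S3-to-Step-Functions wiring described in option B hinges on an EventBridge rule that matches object uploads. Assuming EventBridge notifications are enabled on the bucket, the rule's event pattern would look roughly like the sketch below; the bucket name is hypothetical.

```python
# EventBridge event pattern (as a dict) matching S3 "Object Created"
# events for one bucket. A rule with this pattern can target a Step
# Functions state machine directly. The bucket name is hypothetical.
event_pattern = {
    "source": ["aws.s3"],
    "detail-type": ["Object Created"],
    "detail": {"bucket": {"name": ["contract-uploads"]}},
}
```

Step Functions suits this workload because a standard state machine can run for up to a year, tracks each task's state, and can retry a failed task without rerunning the whole 24-hour job.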


Expert Solution
Questions # 40:

A company needs to ensure that an IAM group that contains database administrators can perform operations only within Amazon RDS. The company must ensure that the members of the IAM group cannot access any other AWS services.

Which solution will meet these requirements?

Options:

A.

Create an IAM policy that includes a statement that has the Effect "Allow" and the Action "rds:*". Attach the IAM policy to the IAM group.


B.

Create an IAM policy that includes two statements. Configure the first statement to have the Effect "Allow" and the Action "rds:*". Configure the second statement to have the Effect "Deny" and the Action "*". Attach the IAM policy to the IAM group.


C.

Create an IAM policy that includes a statement that has the Effect "Deny" and the NotAction "rds:*". Attach the IAM policy to the IAM group.


D.

Create an IAM policy with a statement that includes the Effect "Allow" and the Action "rds:*". Include a permissions boundary that has the Effect "Allow" and the Action "rds:*". Attach the IAM policy to the IAM group.
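
The building blocks the options combine can be illustrated with one policy document: an Allow on every RDS action plus an explicit Deny on every action that is NOT an RDS action. An explicit deny overrides an allow from any other attached policy, which is what makes the NotAction-based deny a hard fence around RDS. A sketch, not tied to any single option:

```python
import json

# IAM policy document illustrating the wildcard and NotAction elements
# the options rely on. "rds:*" matches every RDS action; the Deny with
# NotAction "rds:*" blocks everything that is not an RDS action, even
# if another policy would allow it.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "rds:*", "Resource": "*"},
        {"Effect": "Deny", "NotAction": "rds:*", "Resource": "*"},
    ],
}
policy_json = json.dumps(policy)
```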


Expert Solution
Questions # 41:

A weather forecasting company needs to process hundreds of gigabytes of data with sub-millisecond latency. The company has a high performance computing (HPC) environment in its data center and wants to expand its forecasting capabilities.

A solutions architect must identify a highly available cloud storage solution that can handle large amounts of sustained throughput. Files that are stored in the solution should be accessible to thousands of compute instances that will simultaneously access and process the entire dataset.

What should the solutions architect do to meet these requirements?

Options:

A.

Use Amazon FSx for Lustre scratch file systems


B.

Use Amazon FSx for Lustre persistent file systems.


C.

Use Amazon Elastic File System (Amazon EFS) with Bursting Throughput mode.


D.

Use Amazon Elastic File System (Amazon EFS) with Provisioned Throughput mode.


Expert Solution
Questions # 42:

A company hosts an application on AWS that uses an Amazon S3 bucket and an Amazon Aurora database. The company wants to implement a multi-Region disaster recovery (DR) strategy that minimizes potential data loss.

Which solution will meet these requirements?

Options:

A.

Create an Aurora read replica in a second Availability Zone within the same AWS Region. Enable S3 Versioning for the bucket.


B.

Create an Aurora read replica in a second AWS Region. Configure AWS Backup to create continuous backups of the S3 bucket to a second bucket in a second Availability Zone.


C.

Enable Aurora native database backups across multiple AWS Regions. Use S3 cross-account backups within the company's local Region.


D.

Migrate the database to an Aurora global database. Create a second S3 bucket in a second Region. Configure Cross-Region Replication.
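
For the S3 half of option D, Cross-Region Replication is configured as a replication rule on the source bucket (both buckets need versioning enabled). A sketch of the configuration shape as used by the PutBucketReplication API; the role ARN, account ID, and bucket names are hypothetical.

```python
# S3 Cross-Region Replication configuration (payload shape only, no API
# call). The IAM role lets S3 replicate objects on the company's behalf;
# the role ARN and destination bucket are hypothetical.
replication_configuration = {
    "Role": "arn:aws:iam::123456789012:role/s3-crr-role",
    "Rules": [
        {
            "ID": "dr-replication",    # hypothetical rule ID
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},              # empty filter = replicate all objects
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {
                "Bucket": "arn:aws:s3:::dr-bucket-us-west-2",
            },
        }
    ],
}
```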


Expert Solution
Questions # 43:

A company runs a monolithic application in its on-premises data center. The company used Java/Tomcat to build the application. The application uses Microsoft SQL Server as a database.

The company wants to migrate the application to AWS.

Which solution will meet this requirement with the LEAST operational overhead?

Options:

A.

Use AWS App2Container to containerize the application. Deploy the application on Amazon Elastic Kubernetes Service (Amazon EKS). Deploy the database to Amazon RDS for SQL Server. Configure a Multi-AZ deployment.


B.

Containerize the application and deploy the application on a self-managed Kubernetes cluster on an Amazon EC2 instance. Deploy the database on a separate EC2 instance. Set up Microsoft SQL Server Always On availability groups.


C.

Deploy the frontend of the web application as a website on Amazon S3. Use Amazon DynamoDB for the database tier.


D.

Use AWS App2Container to containerize the application. Deploy the application on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon DynamoDB for the database tier.


Expert Solution
Questions # 44:

A company runs an HPC workload that uses a 200-TB file system on premises. The company needs to migrate this data to Amazon FSx for Lustre. Internet capacity is 10 Mbps, and all data must be migrated within 30 days.

Which solution will meet this requirement?

Options:

A.

Use AWS DMS to transfer data into S3 and link FSx for Lustre to the bucket.


B.

Deploy AWS DataSync on premises and transfer directly into FSx for Lustre.


C.

Use AWS Storage Gateway Volume Gateway to move data into FSx for Lustre.


D.

Use an AWS Snowball Edge storage-optimized device to transfer data into S3 and link FSx for Lustre to the bucket.
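
The 30-day constraint can be checked with simple arithmetic: at 10 Mbps line rate, 200 TB takes years to move, which is why an offline device is in play at all. A worked calculation using decimal units (1 TB = 10^12 bytes):

```python
# Time to push 200 TB over a 10 Mbps link at full line rate,
# ignoring protocol overhead (which would only make it worse).
data_bits = 200 * 10**12 * 8        # 200 TB expressed in bits
link_bps = 10 * 10**6               # 10 Mbps in bits per second
transfer_seconds = data_bits / link_bps
transfer_days = transfer_seconds / 86400   # ~1850 days, i.e. ~5 years

over_budget = transfer_days > 30    # far beyond the 30-day window
```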


Expert Solution
Questions # 45:

A company needs to implement a new data retention policy for regulatory compliance. As part of this policy, sensitive documents that are stored in an Amazon S3 bucket must be protected from deletion or modification for a fixed period of time.

Which solution will meet these requirements?

Options:

A.

Activate S3 Object Lock on the required objects and enable governance mode.


B.

Activate S3 Object Lock on the required objects and enable compliance mode.


C.

Enable versioning on the S3 bucket. Set a lifecycle policy to delete the objects after a specified period.


D.

Configure an S3 Lifecycle policy to transition objects to S3 Glacier Flexible Retrieval for the retention duration.
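
The distinction the options turn on is governance mode versus compliance mode: governance-mode retention can be bypassed by principals with `s3:BypassGovernanceRetention`, while compliance-mode retention cannot be shortened or removed by anyone, including the root user, until it expires. A sketch of the per-object settings as they would accompany an S3 PutObject request; the retain-until date is a hypothetical example.

```python
# Object Lock settings applied per object at upload time (parameter
# names mirror the S3 PutObject ObjectLock* fields; no API call is
# made). In COMPLIANCE mode the retention period is immutable for
# every principal. The date below is hypothetical.
object_lock_settings = {
    "ObjectLockMode": "COMPLIANCE",
    "ObjectLockRetainUntilDate": "2030-01-01T00:00:00Z",
}
```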


Expert Solution