
Pass the Amazon Web Services (AWS) Certified Solutions Architect - Associate (SAA-C03) exam with questions and answers from CertsForce

Viewing page 8 out of 12 pages
Viewing questions 106-120
Questions # 106:

A solutions architect needs to host a high performance computing (HPC) workload in the AWS Cloud. The workload will run on hundreds of Amazon EC2 instances and will require parallel access to a shared file system to enable distributed processing of large datasets. Datasets will be accessed across multiple instances simultaneously. The workload requires access latency within 1 ms. After processing has completed, engineers will need access to the dataset for manual postprocessing.

Which solution will meet these requirements?

Options:

A.

Use Amazon Elastic File System (Amazon EFS) as a shared file system. Access the dataset from Amazon EFS.


B.

Mount an Amazon S3 bucket to serve as the shared file system. Perform postprocessing directly from the S3 bucket.


C.

Use Amazon FSx for Lustre as a shared file system. Link the file system to an Amazon S3 bucket for postprocessing.


D.

Configure AWS Resource Access Manager to share an Amazon S3 bucket so that it can be mounted to all instances for processing and postprocessing.


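For context, a minimal boto3 sketch of the FSx for Lustre pattern referenced in option C: creating a Lustre file system that is linked to an S3 bucket through import/export paths. The bucket, subnet, and capacity values are placeholders, not details from the question.

```python
import boto3

fsx = boto3.client("fsx")

# Lustre file system linked to an S3 bucket (hypothetical names).
# SCRATCH_2 supports ImportPath/ExportPath for S3 linkage; persistent
# deployments use data repository associations instead.
response = fsx.create_file_system(
    FileSystemType="LUSTRE",
    StorageCapacity=1200,                        # GiB, minimum for SCRATCH_2
    SubnetIds=["subnet-0123456789abcdef0"],      # placeholder subnet
    LustreConfiguration={
        "DeploymentType": "SCRATCH_2",
        "ImportPath": "s3://example-hpc-datasets",           # dataset source
        "ExportPath": "s3://example-hpc-datasets/results",   # postprocessing output
    },
)
print(response["FileSystem"]["FileSystemId"])
```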
Questions # 107:

A company collects data from sensors. The company needs a cloud-based solution to store and transform the sensor data to make critical decisions. The solution must store the data for up to 2 days. After 2 days, the solution must delete the data. The company needs to use the transformed data in an automated workflow that has manual approval steps.

Which solution will meet these requirements?

Options:

A.

Load the data into an Amazon Simple Queue Service (Amazon SQS) queue that has a retention period of 2 days. Use an Amazon EventBridge pipe to retrieve data from the queue, transform the data, and pass the data to an AWS Step Functions workflow.


B.

Load the data into AWS DataSync. Delete the DataSync task after 2 days. Invoke an AWS Lambda function to retrieve the data, transform the data, and invoke a second Lambda function that performs the remaining workflow steps.


C.

Load the data into an Amazon Simple Notification Service (Amazon SNS) topic. Use an Amazon EventBridge pipe to retrieve the data from the topic, transform the data, and send the data to Amazon EC2 instances to perform the remaining workflow steps.


D.

Load the data into an Amazon Simple Notification Service (Amazon SNS) topic. Use an Amazon EventBridge pipe to retrieve the data from the topic and transform the data into an appropriate format for an Amazon SQS queue. Use an AWS Lambda function to poll the queue to perform the remaining workflow steps.


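For context, a minimal boto3 sketch of the pattern in option A: an SQS queue with a 2-day retention period and an EventBridge pipe that delivers queue messages to a Step Functions workflow. The queue name, role ARN, state machine ARN, and the FIRE_AND_FORGET invocation type are illustrative assumptions.

```python
import boto3

sqs = boto3.client("sqs")
pipes = boto3.client("pipes")

# Queue that automatically deletes messages after 2 days (172800 seconds).
queue_url = sqs.create_queue(
    QueueName="sensor-data",                        # hypothetical name
    Attributes={"MessageRetentionPeriod": "172800"},
)["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# EventBridge pipe: SQS source -> Step Functions workflow target.
pipes.create_pipe(
    Name="sensor-pipe",
    RoleArn="arn:aws:iam::123456789012:role/pipe-role",  # placeholder role
    Source=queue_arn,
    Target="arn:aws:states:us-east-1:123456789012:stateMachine:sensor-workflow",
    TargetParameters={
        "StepFunctionStateMachineParameters": {"InvocationType": "FIRE_AND_FORGET"}
    },
)
```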
Questions # 108:

A company is creating a new application that will store a large amount of data. The data will be analyzed hourly and will be modified by several Amazon EC2 Linux instances that are deployed across multiple Availability Zones. The amount of storage that is needed will continue to grow for the next 6 months.

Which storage solution should a solutions architect recommend to meet these requirements?

Options:

A.

Store the data in Amazon S3 Glacier. Update the S3 Glacier vault policy to allow access to the application instances.


B.

Store the data in an Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on the application instances.


C.

Store the data in an Amazon Elastic File System (Amazon EFS) file system. Mount the file system on the application instances.


D.

Store the data in an Amazon Elastic Block Store (Amazon EBS) Provisioned IOPS volume shared between the application instances.


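For context, a minimal boto3 sketch of the EFS pattern in option C: one file system with a mount target per Availability Zone so several Linux instances can mount it concurrently. Subnet IDs are placeholders, and in practice the file system must reach the available state before mount targets are created.

```python
import boto3

efs = boto3.client("efs")

# Elastic file system that grows with the data and can be mounted from
# instances in multiple Availability Zones (subnet IDs are placeholders).
fs_id = efs.create_file_system(
    CreationToken="analytics-data",
    PerformanceMode="generalPurpose",
)["FileSystemId"]

# In practice, poll describe_file_systems until LifeCycleState is
# "available" before creating mount targets.
for subnet_id in ["subnet-aaaa1111", "subnet-bbbb2222", "subnet-cccc3333"]:
    efs.create_mount_target(FileSystemId=fs_id, SubnetId=subnet_id)
```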
Questions # 109:

A company wants to use an API to translate text from one language to another. The API must receive an HTTP header value and pass the value to an embedded library. The API translates documents in 6 minutes. The API requires a custom authorization mechanism.

Which solution will meet these requirements?

Options:

A.

Configure an Amazon API Gateway REST API with AWS_PROXY integration to synchronously call an AWS Lambda function to perform translations.


B.

Configure an AWS Lambda function with a Lambda function URL to synchronously call a second function to perform translations.


C.

Configure an Amazon API Gateway REST API with AWS_PROXY integration to asynchronously call an AWS Lambda function to perform translations.


D.

Configure an Amazon API Gateway REST API with HTTP PROXY integration to synchronously call a web endpoint that is hosted on an EC2 instance.


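For context, a short boto3 sketch of the synchronous-versus-asynchronous distinction the options turn on: invoking a Lambda function with InvocationType="Event" returns immediately instead of waiting for the 6-minute translation, whereas a synchronous (RequestResponse) call behind API Gateway is bounded by the roughly 29-second integration timeout. The function name and payload are hypothetical.

```python
import boto3

lambda_client = boto3.client("lambda")

# Asynchronous invocation: the call returns as soon as Lambda accepts the
# event, so a long-running translation is not limited by the caller's timeout.
lambda_client.invoke(
    FunctionName="translate-document",      # hypothetical function name
    InvocationType="Event",                 # asynchronous invocation
    Payload=b'{"documentId": "doc-123", "targetLanguage": "de"}',
)
```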
Questions # 110:

A company is migrating a distributed application to AWS. The application serves variable workloads. The legacy platform consists of a primary server that coordinates jobs across multiple compute nodes. The company wants to modernize the application with a solution that maximizes resiliency and scalability.

How should a solutions architect design the architecture to meet these requirements?

Options:

A.

Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling to use scheduled scaling.


B.

Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling based on the size of the queue.


C.

Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure AWS CloudTrail as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the primary server.


D.

Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure Amazon EventBridge as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the compute nodes.


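For context, a minimal boto3 sketch of queue-based scaling as described in option B: a target tracking policy on a custom backlog-per-instance metric instead of a schedule. The group name, metric names, and target value are assumptions for illustration.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Scale the worker fleet on queue depth rather than on a schedule.
# A common pattern is target tracking on a "backlog per instance"
# custom CloudWatch metric published by the application.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="job-workers",              # placeholder group
    PolicyName="scale-on-queue-backlog",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "CustomizedMetricSpecification": {
            "MetricName": "BacklogPerInstance",      # hypothetical metric
            "Namespace": "JobQueue",
            "Statistic": "Average",
        },
        "TargetValue": 10.0,   # desired messages per instance
    },
)
```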
Questions # 111:

A company wants to standardize its Amazon Elastic Block Store (Amazon EBS) volume encryption strategy. The company also wants to minimize the cost and configuration effort required to operate the volume encryption check.

Which solution will meet these requirements?

Options:

A.

Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Use Amazon EventBridge to schedule an AWS Lambda function to run the API calls.


B.

Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Run the API calls on an AWS Fargate task.


C.

Create an AWS Identity and Access Management (IAM) policy that requires the use of tags on EBS volumes. Use AWS Cost Explorer to display resources that are not properly tagged. Encrypt the untagged resources manually.


D.

Create an AWS Config rule for Amazon EBS to evaluate if a volume is encrypted and to flag the volume if it is not encrypted.


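For context, a minimal boto3 sketch of the AWS Config approach in option D, using the AWS managed ENCRYPTED_VOLUMES rule to flag EBS volumes that are not encrypted. The rule name is a placeholder.

```python
import boto3

config = boto3.client("config")

# AWS managed Config rule that evaluates every EBS volume and marks
# unencrypted volumes as noncompliant.
config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "ebs-volumes-encrypted",   # placeholder name
        "Source": {
            "Owner": "AWS",
            "SourceIdentifier": "ENCRYPTED_VOLUMES", # AWS managed rule ID
        },
        "Scope": {"ComplianceResourceTypes": ["AWS::EC2::Volume"]},
    }
)
```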
Questions # 112:

A company runs several websites on AWS for its different brands. Each website generates tens of gigabytes of web traffic logs each day. A solutions architect needs to design a scalable solution to give the company's developers the ability to analyze traffic patterns across all the company's websites. This analysis by the developers will occur on demand once a week over the course of several months. The solution must support queries with standard SQL.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Store the logs in Amazon S3. Use Amazon Athena for analysis.


B.

Store the logs in Amazon RDS. Use a database client for analysis.


C.

Store the logs in Amazon OpenSearch Service. Use OpenSearch Service for analysis.


D.

Store the logs in an Amazon EMR cluster. Use a supported open-source framework for SQL-based analysis.


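For context, a minimal boto3 sketch of the S3-plus-Athena pattern in option A: an ad hoc SQL query over log files stored in S3. The database, table, partition columns, and results bucket are hypothetical.

```python
import boto3

athena = boto3.client("athena")

# Run a standard SQL query directly against log files in S3; Athena is
# serverless, so there is nothing to keep running between weekly analyses.
athena.start_query_execution(
    QueryString="""
        SELECT url, COUNT(*) AS hits
        FROM web_logs.access_logs
        WHERE year = '2024' AND month = '06'
        GROUP BY url
        ORDER BY hits DESC
        LIMIT 20
    """,
    QueryExecutionContext={"Database": "web_logs"},            # placeholder database
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
```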
Questions # 113:

A finance company uses an on-premises search application to collect streaming data from various producers. The application provides real-time updates to search and visualization features. The company is planning to migrate to AWS and wants to use an AWS native solution.

Which solution will meet these requirements?

Options:

A.

Use Amazon EC2 instances to ingest and process the data streams to Amazon S3 buckets for storage. Use Amazon Athena to search the data. Use Amazon Managed Grafana to create visualizations.


B.

Use Amazon EMR to ingest and process the data streams to Amazon Redshift for storage. Use Amazon Redshift Spectrum to search the data. Use Amazon QuickSight to create visualizations.


C.

Use Amazon Elastic Kubernetes Service (Amazon EKS) to ingest and process the data streams to Amazon DynamoDB for storage. Use Amazon CloudWatch to create graphical dashboards to search and visualize the data.


D.

Use Amazon Kinesis Data Streams to ingest and process the data streams to Amazon OpenSearch Service. Use OpenSearch Service to search the data. Use Amazon QuickSight to create visualizations.


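For context, a minimal boto3 sketch of the ingestion side of option D: producers write records to a Kinesis data stream, and a downstream consumer indexes them into OpenSearch Service for search and QuickSight visualization. The stream name and record fields are made up for illustration.

```python
import json

import boto3

kinesis = boto3.client("kinesis")

# A producer pushes one record into the stream; delivery into OpenSearch
# Service is handled downstream (for example, by a consumer application
# or an ingestion pipeline).
kinesis.put_record(
    StreamName="market-data",                                    # hypothetical stream
    Data=json.dumps({"symbol": "ABC", "price": 101.25}).encode(),
    PartitionKey="ABC",
)
```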
Questions # 114:

A company has a website that handles dynamic traffic loads. The website architecture is based on Amazon EC2 instances in an Auto Scaling group that is configured to use scheduled scaling. Each EC2 instance runs code from an Amazon Elastic File System (Amazon EFS) volume and stores shared data back to the same volume.

The company wants to optimize costs for the website.

Which solution will meet this requirement?

Options:

A.

Reconfigure the Auto Scaling group to set a desired number of instances. Turn off scheduled scaling.


B.

Create a new launch template version for the Auto Scaling group that uses larger EC2 instances.


C.

Reconfigure the Auto Scaling group to use a target tracking scaling policy.


D.

Replace the EFS volume with instance store volumes.


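For context, a minimal boto3 sketch of a target tracking policy as described in option C, scaling on average CPU utilization rather than on a fixed schedule. The group name and target value are placeholders.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Target tracking keeps the group's average CPU near the target, so
# capacity follows the actual dynamic traffic instead of a schedule.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="website-asg",          # placeholder group
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,                     # illustrative target
    },
)
```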
Questions # 115:

A company runs an application on Amazon EC2 instances. The application is deployed in private subnets in three Availability Zones of the us-east-1 Region. The instances must be able to connect to the internet to download files. The company wants a design that is highly available across the Region.

Which solution should be implemented to ensure that there are no disruptions to internet connectivity?

Options:

A.

Deploy a NAT instance in a private subnet of each Availability Zone.


B.

Deploy a NAT gateway in a public subnet of each Availability Zone.


C.

Deploy a transit gateway in a private subnet of each Availability Zone.


D.

Deploy an internet gateway in a public subnet of each Availability Zone.


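For context, a minimal boto3 sketch of the per-AZ NAT gateway layout in option B. Subnet and Elastic IP allocation IDs are placeholders; each private subnet's route table would then send 0.0.0.0/0 to the NAT gateway in its own Availability Zone.

```python
import boto3

ec2 = boto3.client("ec2")

# One NAT gateway in a public subnet of each Availability Zone, so the
# loss of a single AZ does not break outbound internet access elsewhere.
public_subnets = {
    "us-east-1a": ("subnet-pub-a", "eipalloc-aaaa"),   # placeholder IDs
    "us-east-1b": ("subnet-pub-b", "eipalloc-bbbb"),
    "us-east-1c": ("subnet-pub-c", "eipalloc-cccc"),
}
for az, (subnet_id, allocation_id) in public_subnets.items():
    ec2.create_nat_gateway(SubnetId=subnet_id, AllocationId=allocation_id)
```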
Questions # 116:

A company has an employee web portal. Employees log in to the portal to view payroll details. The company is developing a new system to give employees the ability to upload scanned documents for reimbursement. The company runs a program to extract text-based data from the documents and attach the extracted information to each employee's reimbursement IDs for processing.

The employee web portal requires 100% uptime. The document extract program runs infrequently throughout the day on an on-demand basis. The company wants to build a scalable and cost-effective new system that will require minimal changes to the existing web portal. The company does not want to make any code changes.

Which solution will meet these requirements with the LEAST implementation effort?

Options:

A.

Run Amazon EC2 On-Demand Instances in an Auto Scaling group for the web portal. Use an AWS Lambda function to run the document extract program. Invoke the Lambda function when an employee uploads a new reimbursement document.


B.

Run Amazon EC2 Spot Instances in an Auto Scaling group for the web portal. Run the document extract program on EC2 Spot Instances. Start document extract program instances when an employee uploads a new reimbursement document.


C.

Purchase a Savings Plan to run the web portal and the document extract program. Run the web portal and the document extract program in an Auto Scaling group.


D.

Create an Amazon S3 bucket to host the web portal. Use Amazon API Gateway and an AWS Lambda function for the existing functionalities. Use the Lambda function to run the document extract program. Invoke the Lambda function when the API that is associated with a new document upload is called.


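For context, a hedged sketch of the event-driven wiring in option A, assuming the uploaded reimbursement documents land in an S3 bucket: an S3 event notification invokes the document extract Lambda function on each upload, with no change to the web portal code. The bucket name and function ARN are hypothetical.

```python
import boto3

s3 = boto3.client("s3")

# Invoke the extract function whenever a new object is created in the
# uploads bucket (assumed location for the scanned documents).
s3.put_bucket_notification_configuration(
    Bucket="example-reimbursement-uploads",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:document-extract",
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```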
Questions # 117:

A company uses Amazon RDS for PostgreSQL to run its applications in the us-east-1 Region. The company also uses machine learning (ML) models to forecast annual revenue based on near real-time reports. The reports are generated by using the same RDS for PostgreSQL database. The database performance slows during business hours. The company needs to improve database performance.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create a cross-Region read replica. Configure the reports to be generated from the read replica.


B.

Activate Multi-AZ DB instance deployment for RDS for PostgreSQL. Configure the reports to be generated from the standby database.


C.

Use AWS Database Migration Service (AWS DMS) to logically replicate data to a new database. Configure the reports to be generated from the new database.


D.

Create a read replica in us-east-1. Configure the reports to be generated from the read replica.


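For context, a minimal boto3 sketch of an in-Region read replica as described in option D; the near real-time reports would then query the replica endpoint instead of the primary. The instance identifiers and instance class are placeholders.

```python
import boto3

rds = boto3.client("rds")

# Same-Region read replica of the existing PostgreSQL instance; read-heavy
# reporting traffic moves off the primary writer.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="app-postgres-replica",     # placeholder name
    SourceDBInstanceIdentifier="app-postgres",       # placeholder source
    DBInstanceClass="db.r6g.large",                  # illustrative size
)
```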
Questions # 118:

A solutions architect is investigating compute options for a critical analytics application. The application uses long-running processes to prepare and aggregate data. The processes cannot be interrupted. The application has a known baseline load. The application needs to handle occasional usage surges.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create an Amazon EC2 Auto Scaling group. Set the Min capacity and Desired capacity parameters to the number of instances required to handle the baseline load. Purchase Reserved Instances for the Auto Scaling group.


B.

Create an Amazon EC2 Auto Scaling group. Set the Min capacity, Max capacity, and Desired capacity parameters to the number of instances required to handle the baseline load. Use On-Demand Instances to address occasional usage surges.


C.

Create an Amazon EC2 Auto Scaling group. Set the Min capacity and Desired capacity parameters to the number of instances required to handle the baseline load. Purchase Reserved Instances for the Auto Scaling group. Use the OnDemandPercentageAboveBaseCapacity parameter to configure the launch template to launch Spot Instances.


D.

Re-architect the application to use AWS Lambda functions instead of Amazon EC2 instances. Purchase a one-year Compute Savings Plan to reduce the cost of Lambda usage.


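For context, a minimal boto3 sketch of one way the group in option A could be configured: Min and Desired capacity sized for the baseline (which Reserved Instance pricing can cover), with Max capacity providing On-Demand headroom for occasional surges. All names and sizes are assumptions.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Baseline capacity stays running (and is eligible for Reserved Instance
# pricing); scale-out above the baseline uses regular On-Demand Instances
# that are never interrupted mid-job.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="analytics-workers",        # placeholder group
    LaunchTemplate={"LaunchTemplateName": "analytics-worker", "Version": "$Latest"},
    MinSize=4,             # baseline load
    DesiredCapacity=4,
    MaxSize=12,            # headroom for occasional surges
    VPCZoneIdentifier="subnet-aaaa1111,subnet-bbbb2222",   # placeholder subnets
)
```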
Questions # 119:

A company needs to provide a team of contractors with temporary access to the company's AWS resources for a short-term project. The contractors need different levels of access to AWS services. The company needs to revoke permissions for all the contractors when the project is finished.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use AWS IAM to create a user account for each contractor. Attach policies that define access levels for the contractors to the user accounts. Manually deactivate the accounts when the project is finished.


B.

Use AWS Security Token Service (AWS STS) to generate temporary credentials for the contractors. Provide the contractors access based on predefined roles. Set the access to automatically expire when the project is finished.


C.

Configure AWS Config rules to monitor the contractors' access patterns. Use AWS Config rules to automatically revoke permissions that are not in use or that are too permissive.


D.

Use AWS CloudTrail and custom Amazon EventBridge triggers to audit the contractors' actions. Adjust the permissions for each contractor based on activity logs.


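For context, a minimal boto3 sketch of the STS approach in option B: assuming a predefined role yields temporary credentials that expire on their own. The role ARN, session name, and duration are placeholders.

```python
import boto3

sts = boto3.client("sts")

# Temporary, automatically expiring credentials for a contractor, scoped
# by whichever predefined role matches that contractor's access level.
creds = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/contractor-readonly",  # placeholder role
    RoleSessionName="contractor-jane",                             # placeholder session
    DurationSeconds=3600,          # credentials expire after one hour
)["Credentials"]
print(creds["Expiration"])         # no manual cleanup needed after expiry
```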
Questions # 120:

A company stores data in Amazon S3. According to regulations, the data must not contain personally identifiable information (PII). The company recently discovered that some objects in its S3 buckets contain PII. The company needs to automatically detect PII in S3 buckets and to notify the company's security team.

Which solution will meet these requirements?

Options:

A.

Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData event type from Macie findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.


B.

Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.


C.

Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData:S3Object/Personal event type from Macie findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.


D.

Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.


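For context, a minimal boto3 sketch of the EventBridge side of the Macie-based options: a rule that matches Macie findings whose type begins with SensitiveData and forwards them to an SNS topic for the security team. The rule name, event pattern details, and ARNs are illustrative assumptions.

```python
import json

import boto3

events = boto3.client("events")

# Match Amazon Macie findings whose type starts with "SensitiveData"
# (pattern shape is an assumption; verify against your Macie events).
events.put_rule(
    Name="macie-pii-findings",
    EventPattern=json.dumps({
        "source": ["aws.macie"],
        "detail-type": ["Macie Finding"],
        "detail": {"type": [{"prefix": "SensitiveData"}]},
    }),
)

# Forward matching findings to an SNS topic the security team subscribes to.
events.put_targets(
    Rule="macie-pii-findings",
    Targets=[{
        "Id": "security-team-sns",
        "Arn": "arn:aws:sns:us-east-1:123456789012:security-alerts",   # placeholder topic
    }],
)
```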