
Pass the Amazon Web Services AWS Certified Associate SAA-C03 Questions and answers with CertsForce

Viewing page 7 out of 14 pages
Viewing questions 121-140
Questions # 121:

A company is building a serverless application to process clickstream data from its website. The clickstream data is sent to an Amazon Kinesis Data Streams data stream from the application web servers.

The company wants to enrich the clickstream data by joining it with customer profile data from an Amazon Aurora Multi-AZ database. The company wants to use Amazon Redshift to analyze the enriched data. The solution must be highly available.

Which solution will meet these requirements?

Options:

A.

Use an AWS Lambda function to process and enrich the clickstream data. Use the same Lambda function to write the clickstream data to Amazon S3. Use Amazon Redshift Spectrum to query the enriched data in Amazon S3.


B.

Use an Amazon EC2 Spot Instance to poll the data stream and enrich the clickstream data. Configure the EC2 instance to use the COPY command to send the enriched results to Amazon Redshift.


C.

Use an Amazon Elastic Container Service (Amazon ECS) task with AWS Fargate Spot capacity to poll the data stream and enrich the clickstream data. Configure an Amazon EC2 instance to use the COPY command to send the enriched results to Amazon Redshift.


D.

Use Amazon Kinesis Data Firehose to load the clickstream data from Kinesis Data Streams to Amazon S3. Use AWS Glue crawlers to infer the schema and populate the AWS Glue Data Catalog. Use Amazon Athena to query the raw data in Amazon S3.
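As a rough sketch of the Lambda-based enrichment pattern described in option A (the field names, the in-memory profile table, and the event shape are illustrative assumptions; the Aurora lookup and the S3 write are reduced to placeholders):

```python
import base64
import json

# Stand-in for the customer profile lookup; option A's architecture would
# query the Aurora Multi-AZ database here instead (e.g. via a connection pool).
PROFILES = {"c1": {"segment": "premium"}}

def enrich_records(event, profiles=PROFILES):
    """Decode Kinesis records and join each click with its customer profile."""
    enriched = []
    for rec in event["Records"]:
        # Kinesis delivers each record's payload base64-encoded.
        click = json.loads(base64.b64decode(rec["kinesis"]["data"]))
        click["profile"] = profiles.get(click.get("customer_id"), {})
        enriched.append(click)
    return enriched

# In the Lambda handler, the enriched rows would then be written to Amazon S3
# (for example with s3.put_object) so Amazon Redshift Spectrum can query them.
```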


Questions # 122:

A company maintains a data lake in an Amazon S3 bucket. The company needs to onboard multiple vendors who need to access the data lake. Each vendor has its own AWS account and requires access to separate datasets in the data lake.

The company needs a secure and scalable solution to provide the vendors with access to the data that each vendor requires. The solution must log all vendor activities for auditing.

Which solution will meet these requirements in the MOST scalable way?

Options:

A.

Create an IAM role and IAM policy for each vendor. Use cross-account resource sharing to share the appropriate IAM role ARN with each vendor. Instruct each vendor to use the IAM role ARN to access the data lake from a resource in the vendor's AWS account. Set up S3 server access logging for the S3 bucket.


B.

Create an IAM user for each vendor. Use an IAM policy to grant access to the S3 data lake. Share the user credentials for each IAM user with each vendor. Set up S3 server access logging for the S3 bucket.


C.

Deploy AWS IAM Identity Center. Create a user account for each vendor. Create S3 Access Grants for each vendor that have the required permissions.


D.

Create an S3 presigned URL for each vendor that has the required permissions. Share the appropriate URL with each vendor to access the S3 bucket. Configure AWS CloudTrail logs to collect access logs for the S3 bucket.
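For reference, option A's per-vendor role reduces to two policy documents; the bucket name, account ID, and prefix below are hypothetical, and a production policy would typically also add a prefix condition on the ListBucket statement:

```python
def vendor_role_policies(bucket, vendor_account_id, prefix):
    """Build the trust policy and dataset-scoped access policy for one vendor role."""
    trust = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            # Lets principals in the vendor's own AWS account assume this role.
            "Principal": {"AWS": f"arn:aws:iam::{vendor_account_id}:root"},
            "Action": "sts:AssumeRole",
        }],
    }
    access = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            # Restricts the vendor to its own dataset prefix in the data lake.
            "Resource": [
                f"arn:aws:s3:::{bucket}/{prefix}/*",
                f"arn:aws:s3:::{bucket}",
            ],
        }],
    }
    return trust, access
```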


Questions # 123:

A company has a VPC with multiple private subnets that host multiple applications. The applications must not be accessible to the internet. However, the applications need to access multiple AWS services. The applications must not use public IP addresses to access the AWS services.

Which solution will meet these requirements?

Options:

A.

Configure interface VPC endpoints for the required AWS services. Route traffic from the private subnets through the interface VPC endpoints.


B.

Deploy a NAT gateway in each private subnet. Route traffic from the private subnets through the NAT gateways.


C.

Deploy internet gateways in each private subnet. Route traffic from the private subnets through the internet gateways.


D.

Set up an AWS Direct Connect connection between the private subnets. Route traffic from the private subnets through the Direct Connect connection.
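Option A's interface endpoints map to one EC2 API call per service; a sketch of the request parameters (the Region, IDs, and service name are placeholders):

```python
def interface_endpoint_request(vpc_id, service, subnet_ids, sg_id, region="us-east-1"):
    """Parameters for EC2's CreateVpcEndpoint call with the Interface type.

    The endpoint places ENIs with private IP addresses in the given subnets,
    so the applications reach the AWS service without any public IPs.
    """
    return {
        "VpcEndpointType": "Interface",
        "VpcId": vpc_id,
        "ServiceName": f"com.amazonaws.{region}.{service}",
        "SubnetIds": subnet_ids,
        "SecurityGroupIds": [sg_id],
        "PrivateDnsEnabled": True,  # the service's default DNS name resolves privately
    }
```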


Questions # 124:

A company runs an application on Amazon EC2 instances across multiple Availability Zones in the same AWS Region. The EC2 instances share an Amazon Elastic File System (Amazon EFS) volume that is mounted on all the instances. The EFS volume stores a variety of files such as installation media, third-party files, interface files, and other one-time files.

The company accesses some EFS files frequently and needs to retrieve the files quickly. The company accesses other files rarely. The EFS volume is multiple terabytes in size. The company needs to optimize storage costs for Amazon EFS.

Which solution will meet these requirements with the LEAST effort?

Options:

A.

Move the files to Amazon S3. Set up a lifecycle policy to move the files to S3 Glacier Flexible Retrieval.


B.

Apply a lifecycle policy to the EFS files to move the files to EFS Infrequent Access.


C.

Move the files to Amazon Elastic Block Store (Amazon EBS) Cold HDD Volumes (sc1).


D.

Move the files to Amazon S3. Set up a lifecycle policy to move the rarely-used files to S3 Glacier Deep Archive.
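Option B corresponds to a single EFS API call; a sketch of the payload (the 30-day threshold is an illustrative choice among the fixed values EFS accepts):

```python
def efs_lifecycle_policies(days=30):
    """LifecyclePolicies payload for EFS's PutLifecycleConfiguration call.

    Files not read for `days` days move to EFS Infrequent Access; a later
    read moves them back to the Standard class. `days` must be one of the
    fixed values EFS supports (e.g. 7, 14, 30, 60, 90).
    """
    return [
        {"TransitionToIA": f"AFTER_{days}_DAYS"},
        {"TransitionToPrimaryStorageClass": "AFTER_1_ACCESS"},
    ]
```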


Questions # 125:

A company's SAP application has a backend SQL Server database in an on-premises environment. The company wants to migrate its on-premises application and database server to AWS. The company needs an instance type that meets the high demands of its SAP database. On-premises performance data shows that both the SAP application and the database have high memory utilization.

Which solution will meet these requirements?

Options:

A.

Use the compute optimized instance family for the application. Use the memory optimized instance family for the database.


B.

Use the storage optimized instance family for both the application and the database.


C.

Use the memory optimized instance family for both the application and the database.


D.

Use the high performance computing (HPC) optimized instance family for the application. Use the memory optimized instance family for the database.


Questions # 126:

A company has a production Amazon RDS for MySQL database. The company needs to create a new application that will read frequently changing data from the database with minimal impact on the database's overall performance. The application will rarely perform the same query more than once.

What should a solutions architect do to meet these requirements?

Options:

A.

Set up an Amazon ElastiCache cluster. Query the results in the cluster.


B.

Set up an Application Load Balancer (ALB). Query the results in the ALB.


C.

Set up a read replica for the database. Query the read replica.


D.

Set up querying of database snapshots. Query the database snapshots.


Questions # 127:

A company uses Amazon Elastic Container Service (Amazon ECS) to run workloads that belong to service teams. Each service team uses an owner tag to specify the ECS containers that the team owns. The company wants to generate an AWS Cost Explorer report that shows how much each service team spends on ECS containers on a monthly basis.

Which combination of steps will meet these requirements in the MOST operationally efficient way? (Select TWO.)

Options:

A.

Create a custom report in Cost Explorer. Apply a filter for Amazon ECS.


B.

Create a custom report in Cost Explorer. Apply a filter for the owner resource tag.


C.

Set up AWS Compute Optimizer. Review the rightsizing recommendations.


D.

Activate the owner tag as a cost allocation tag. Group the Cost Explorer report by linked account.


E.

Activate the owner tag as a cost allocation tag. Group the Cost Explorer report by the owner cost allocation tag.
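Once the owner tag is activated as a cost allocation tag, the report described in options B and E maps to a single Cost Explorer request; a sketch (the dates and metric are illustrative):

```python
def team_cost_report_request(start, end):
    """GetCostAndUsage request: monthly ECS spend grouped by the owner tag."""
    return {
        "TimePeriod": {"Start": start, "End": end},  # e.g. "2025-01-01"
        "Granularity": "MONTHLY",
        "Metrics": ["UnblendedCost"],
        # Limit the report to ECS charges.
        "Filter": {"Dimensions": {"Key": "SERVICE",
                                  "Values": ["Amazon Elastic Container Service"]}},
        # One result group per service team.
        "GroupBy": [{"Type": "TAG", "Key": "owner"}],
    }
```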


Questions # 128:

A company runs an application on premises. The application needs to periodically upload large files to an Amazon S3 bucket. A solutions architect needs a solution to provide the application with short-lived authenticated access to the S3 bucket. The solution must not use long-term credentials. The solution needs to be secure and scalable.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an IAM user that has an access key and a secret key. Store the keys on the on-premises server in an environment variable. Attach a policy to the IAM user that restricts access to only the S3 bucket.


B.

Configure an AWS Site-to-Site VPN connection from the on-premises environment to the company's VPC. Launch an Amazon EC2 instance with an instance profile. Route all file uploads from the on-premises application through the EC2 instance to the S3 bucket.


C.

Configure an S3 bucket policy to allow access for the on-premises server's public IP address. Configure the policy to allow PUT operations only from the server's IP address.


D.

Configure a trust relationship between the on-premises server and AWS Security Token Service (AWS STS). Generate credentials by assuming an IAM role for each upload operation.
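The short-lived credentials in option D come from one STS call per upload; a sketch of the request (the role ARN and session name are placeholders, and 900 seconds is the minimum duration STS allows):

```python
def assume_role_request(role_arn, session_name, duration=900):
    """Parameters for STS's AssumeRole call.

    The response carries a temporary access key, secret key, and session
    token that the application uses for the S3 upload and then discards,
    so no long-term credentials are stored on the server.
    """
    return {
        "RoleArn": role_arn,
        "RoleSessionName": session_name,
        "DurationSeconds": duration,
    }
```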


Questions # 129:

A company is designing an IPv6 application that is hosted on Amazon EC2 instances in a private subnet within a VPC. The application will store user-uploaded content in Amazon S3 buckets. The application will save each S3 object's URL link and metadata in Amazon DynamoDB.

The company must not use public internet connections to transmit user-uploaded content or metadata.

Which solution will meet these requirements?

Options:

A.

Implement a gateway VPC endpoint for Amazon S3 and an interface VPC endpoint for Amazon DynamoDB.


B.

Implement interface VPC endpoints for both Amazon S3 and Amazon DynamoDB.


C.

Implement gateway VPC endpoints for both Amazon S3 and Amazon DynamoDB.


D.

Implement a gateway VPC endpoint for Amazon DynamoDB and an interface VPC endpoint for Amazon S3.
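Both S3 and DynamoDB support gateway endpoints, so option C is two calls with the same shape; a sketch of the request (the Region and IDs are placeholders):

```python
def gateway_endpoint_request(vpc_id, service, route_table_ids, region="us-east-1"):
    """Parameters for EC2's CreateVpcEndpoint call with the Gateway type.

    Unlike an interface endpoint, a gateway endpoint adds routes to the
    given route tables; traffic to the service stays on the AWS network
    with no public internet path. Only S3 and DynamoDB offer gateway
    endpoints.
    """
    return {
        "VpcEndpointType": "Gateway",
        "VpcId": vpc_id,
        "ServiceName": f"com.amazonaws.{region}.{service}",  # "s3" or "dynamodb"
        "RouteTableIds": route_table_ids,
    }
```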


Questions # 130:

A company runs a workload in an AWS Region. Users connect to the workload by using an Amazon API Gateway REST API.

The company uses Amazon Route 53 as its DNS provider and has created a Route 53 Hosted Zone. The company wants to provide unique and secure URLs for all workload users.

Which combination of steps will meet these requirements with the MOST operational efficiency? (Select THREE.)

Options:

A.

Create a wildcard custom domain name in the Route 53 hosted zone as an alias for the API Gateway endpoint.


B.

Use AWS Certificate Manager (ACM) to request a wildcard certificate that matches the custom domain in a second Region.


C.

Create a hosted zone for each user in Route 53. Create zone records that point to the API Gateway endpoint.


D.

Use AWS Certificate Manager (ACM) to request a wildcard certificate that matches the custom domain name in the same Region.


E.

Use API Gateway to create multiple API endpoints for each user.


F.

Create a custom domain name in API Gateway for the REST API. Import the certificate from AWS Certificate Manager (ACM).
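The wildcard record in option A is a single Route 53 change; a sketch of the change batch (the zone IDs and names are placeholders, and the alias target values come from the API Gateway custom domain's output):

```python
def wildcard_alias_change(zone_id, domain, target_dns, target_zone_id):
    """ChangeResourceRecordSets payload: alias *.<domain> to API Gateway."""
    return {
        "HostedZoneId": zone_id,
        "ChangeBatch": {"Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": f"*.{domain}",  # one record covers every user subdomain
                "Type": "A",
                "AliasTarget": {
                    "DNSName": target_dns,           # domain name of the API endpoint
                    "HostedZoneId": target_zone_id,  # API Gateway's hosted zone ID
                    "EvaluateTargetHealth": False,
                },
            },
        }]},
    }
```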


Questions # 131:

A retail company is building an order fulfillment system using a microservices architecture on AWS. The system must store incoming orders durably until processing completes successfully. Multiple teams’ services process orders according to a defined workflow. Services must be scalable, loosely coupled, and able to handle sudden surges in order volume. The processing steps of each order must be centrally tracked.

Which solution will meet these requirements?

Options:

A.

Send incoming orders to an Amazon Simple Notification Service (Amazon SNS) topic. Start an AWS Step Functions workflow for each order that orchestrates the microservices. Use AWS Lambda functions for each microservice.


B.

Send incoming orders to an Amazon Simple Queue Service (Amazon SQS) queue. Start an AWS Step Functions workflow for each order that orchestrates the microservices. Use AWS Lambda functions for each microservice.


C.

Send incoming orders to an Amazon Simple Queue Service (Amazon SQS) queue. Use Amazon EventBridge to distribute events among the microservices. Use AWS Lambda functions for each microservice.


D.

Send incoming orders to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe Amazon EventBridge to the topic to distribute events among the microservices. Use AWS Lambda functions for each microservice.
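Option B's queue-to-workflow hand-off can be sketched as a small Lambda handler; the state machine ARN and message fields are hypothetical, and `start_execution` stands in for the Step Functions client call:

```python
import json

# Placeholder ARN for the order-processing workflow.
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:orders"

def handler(event, start_execution=None):
    """For each SQS order message, start one Step Functions execution.

    Step Functions then orchestrates the per-team microservices and keeps
    a central record of each order's processing steps.
    """
    started = []
    for msg in event["Records"]:
        order = json.loads(msg["body"])
        params = {
            "stateMachineArn": STATE_MACHINE_ARN,
            "name": order["order_id"],  # execution names dedupe repeats within 90 days
            "input": json.dumps(order),
        }
        if start_execution is not None:
            start_execution(**params)
        started.append(params["name"])
    return started
```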


Questions # 132:

A marketing team wants to build a campaign for an upcoming multi-sport event. The team has news reports from the past five years in PDF format. The team needs a solution to extract insights about the content and the sentiment of the news reports. The solution must use Amazon Textract to process the news reports.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Provide the extracted insights to Amazon Athena for analysis. Store the extracted insights and analysis in an Amazon S3 bucket.


B.

Store the extracted insights in an Amazon DynamoDB table. Use Amazon SageMaker to build a sentiment model.


C.

Provide the extracted insights to Amazon Comprehend for analysis. Save the analysis to an Amazon S3 bucket.


D.

Store the extracted insights in an Amazon S3 bucket. Use Amazon QuickSight to visualize and analyze the data.
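The glue between Textract and Comprehend in option C is thin; a sketch, with `detect_sentiment` standing in for the Comprehend client call:

```python
def analyze_report(lines, detect_sentiment):
    """Join Textract LINE blocks into text and get its sentiment.

    Textract's DetectDocumentText returns a PDF page as LINE blocks;
    Comprehend's DetectSentiment accepts up to about 5 KB of UTF-8 text
    per synchronous call, so long reports would be chunked in practice.
    """
    text = " ".join(lines)[:5000]  # crude cap standing in for proper chunking
    return detect_sentiment(Text=text, LanguageCode="en")
```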


Questions # 133:

A company wants to standardize its Amazon Elastic Block Store (Amazon EBS) volume encryption strategy. The company also wants to minimize the cost and configuration effort required to operate the volume encryption check.

Which solution will meet these requirements?

Options:

A.

Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Use Amazon EventBridge to schedule an AWS Lambda function to run the API calls.


B.

Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Run the API calls on an AWS Fargate task.


C.

Create an AWS Identity and Access Management (IAM) policy that requires the use of tags on EBS volumes. Use AWS Cost Explorer to display resources that are not properly tagged. Encrypt the untagged resources manually.


D.

Create an AWS Config rule for Amazon EBS to evaluate if a volume is encrypted and to flag the volume if it is not encrypted.
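Option D needs no custom code at all: AWS Config ships a managed rule for exactly this check. A sketch of the ConfigRule payload passed to PutConfigRule:

```python
def encrypted_volumes_rule():
    """ConfigRule payload using the AWS managed rule ENCRYPTED_VOLUMES.

    Config evaluates every EBS volume and flags any that are unencrypted,
    with no polling code to write or schedule.
    """
    return {
        "ConfigRuleName": "ebs-volumes-encrypted",
        "Source": {"Owner": "AWS", "SourceIdentifier": "ENCRYPTED_VOLUMES"},
        "Scope": {"ComplianceResourceTypes": ["AWS::EC2::Volume"]},
    }
```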


Questions # 134:

A company hosts an application on Amazon EC2 On-Demand Instances in an Auto Scaling group. Application peak hours occur at the same time each day. Application users experience slow application performance at the start of peak hours. The application performs normally 2–3 hours after peak hours begin. The company wants to ensure that the application works properly at the start of peak hours.

Which solution will meet these requirements?

Options:

A.

Configure an Application Load Balancer to distribute traffic properly to the instances.


B.

Configure a dynamic scaling policy for the Auto Scaling group to launch new instances based on memory utilization.


C.

Configure a dynamic scaling policy for the Auto Scaling group to launch new instances based on CPU utilization.


D.

Configure a scheduled scaling policy for the Auto Scaling group to launch new instances before peak hours.
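Option D's scheduled policy is one Auto Scaling API call; a sketch of the payload (the time and sizes are illustrative, with the recurrence firing shortly before peak hours each day):

```python
def peak_hours_action(asg_name, min_size, max_size, desired):
    """PutScheduledUpdateGroupAction payload: pre-scale before the daily peak."""
    return {
        "AutoScalingGroupName": asg_name,
        "ScheduledActionName": "pre-peak-scale-out",
        "Recurrence": "30 8 * * *",  # cron: 08:30 UTC daily, ahead of peak start
        "MinSize": min_size,
        "MaxSize": max_size,
        "DesiredCapacity": desired,
    }
```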


Questions # 135:

A healthcare company is designing a system to store and manage logs in the AWS Cloud. The system ingests and stores logs in JSON format that contain sensitive patient information. The company must identify any sensitive data and must be able to search the log data by using SQL queries.

Which solution will meet these requirements?

Options:

A.

Store the logs in an Amazon S3 bucket. Configure Amazon Macie to discover sensitive data. Use Amazon Athena to query the logs.


B.

Store the logs in an Amazon EBS volume. Create an application that uses Amazon SageMaker AI to detect sensitive data. Use Amazon RDS to query the logs.


C.

Store the logs in Amazon DynamoDB. Use AWS KMS to discover sensitive data. Use Amazon Redshift Spectrum to query the logs.


D.

Store the logs in an Amazon S3 bucket. Use Amazon Inspector to discover sensitive data. Use Amazon Athena to query the logs.


Questions # 136:

A company's data platform uses an Amazon Aurora MySQL database. The database has multiple read replicas and multiple DB instances across different Availability Zones. Users have recently reported errors from the database that indicate that there are too many connections. The company wants to reduce the failover time by 20% when a read replica is promoted to primary writer.

Which solution will meet this requirement?

Options:

A.

Switch from Aurora to Amazon RDS with Multi-AZ cluster deployment.


B.

Use Amazon RDS Proxy in front of the Aurora database.


C.

Switch to Amazon DynamoDB with DynamoDB Accelerator (DAX) for read connections.


D.

Switch to Amazon Redshift with relocation capability.


Questions # 137:

A company runs a web application in an Amazon EC2 Auto Scaling group. The application runs during business hours only. The company cannot allow interruptions to the application during business hours.

The company wants to optimize compute costs for the application based on the application's usage pattern.

Which solution will meet this requirement with the LEAST operational overhead?

Options:

A.

Manually terminate the instances during non-business hours. Manually launch new instances during business hours.


B.

Create a scheduled scaling policy for the Auto Scaling group. Configure the policy to scale out during business hours and to scale in during non-business hours.


C.

Use Amazon EC2 Spot Instances in the Auto Scaling group.


D.

Purchase Amazon EC2 Reserved Instances on a 1-year term to handle the maximum expected load for the Auto Scaling group.


Questions # 138:

A company runs a web application that stores user-generated images. The application currently stores 500 GB of images. The average file size of the images is 2 MB. The company expects the total amount of images to grow to 2 TB within 6 months. The application needs to serve all stored images with low latency to users from around the world.

Which storage solution will meet these requirements MOST cost-effectively?

Options:

A.

Store images in Amazon EBS volumes that are attached to multiple Amazon EC2 instances across multiple AWS Regions. Serve the content locally based on each user's location.


B.

Store images in an Amazon S3 bucket. Integrate the S3 bucket with Amazon CloudFront. Integrate the web application with a CloudFront endpoint to provide global access.


C.

Store the images in an Amazon EFS file system. Use the Standard storage class with Regional access. Enable cross-Region replication to provide high availability and global access.


D.

Deploy Amazon FSx for Windows File Server in two AWS Regions. Set up Windows File Server Replication across Regions to provide global access.


Questions # 139:

A company stores data in Amazon S3. According to regulations, the data must not contain personally identifiable information (PII). The company recently discovered that S3 buckets have some objects that contain PII. The company needs to automatically detect PII in S3 buckets and to notify the company's security team.

Which solution will meet these requirements?

Options:

A.

Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData event type from Macie findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.


B.

Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.


C.

Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData:S3Object/Personal event type from Macie findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.


D.

Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.
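The EventBridge rule in option A matches on the Macie finding type; a sketch of the event pattern (prefix matching keeps it robust across the specific SensitiveData subtypes):

```python
def macie_pii_event_pattern():
    """EventBridge event pattern for Macie sensitive-data findings.

    Macie publishes findings with source "aws.macie" and detail-type
    "Macie Finding"; the prefix filter matches any SensitiveData type.
    The rule's target would be the security team's SNS topic.
    """
    return {
        "source": ["aws.macie"],
        "detail-type": ["Macie Finding"],
        "detail": {"type": [{"prefix": "SensitiveData"}]},
    }
```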


Questions # 140:

A weather forecasting company needs to process hundreds of gigabytes of data with sub-millisecond latency. The company has a high performance computing (HPC) environment in its data center and wants to expand its forecasting capabilities.

A solutions architect must identify a highly available cloud storage solution that can handle large amounts of sustained throughput. Files that are stored in the solution should be accessible to thousands of compute instances that will simultaneously access and process the entire dataset.

What should the solutions architect do to meet these requirements?

Options:

A.

Use Amazon FSx for Lustre scratch file systems.


B.

Use Amazon FSx for Lustre persistent file systems.


C.

Use Amazon Elastic File System (Amazon EFS) with Bursting Throughput mode.


D.

Use Amazon Elastic File System (Amazon EFS) with Provisioned Throughput mode.

