Pass the Amazon Web Services AWS Certified Solutions Architect - Associate (SAA-C03) exam with CertsForce questions and answers

Question #341:

A law firm needs to make hundreds of files readable for the general public. The law firm must prevent members of the public from modifying or deleting the files before a specified future date. Which solution will meet these requirements MOST securely?

Options:

A.

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Grant read-only IAM permissions to any AWS principals that access the S3 bucket until the specified date.


B.

Create a new Amazon S3 bucket. Enable S3 Versioning. Use S3 Object Lock and set a retention period based on the specified date. Create an Amazon CloudFront distribution to serve content from the bucket. Use an S3 bucket policy to restrict access to the CloudFront origin access control (OAC).


C.

Create a new Amazon S3 bucket. Enable S3 Versioning. Configure an event trigger to run an AWS Lambda function if a user modifies or deletes an object. Configure the Lambda function to replace the modified or deleted objects with the original versions of the objects from a private S3 bucket.


D.

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Select the folder that contains the files. Use S3 Object Lock with a retention period based on the specified date. Grant read-only IAM permissions to any AWS principals that access the S3 bucket.


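Option B relies on S3 Object Lock retention: once an object version carries a retain-until date, it cannot be overwritten or deleted before that date, regardless of IAM permissions. A minimal boto3 sketch of the locking step follows; the bucket name, object key, and retention date are placeholders, and using COMPLIANCE mode (which not even the root user can shorten) is an assumption about how strictly the firm wants to enforce the lock.

```python
import boto3
from datetime import datetime, timezone

s3 = boto3.client("s3")

# Object Lock can only be enabled at bucket creation; S3 Versioning is
# turned on automatically when Object Lock is enabled.
s3.create_bucket(
    Bucket="law-firm-public-records",          # placeholder bucket name
    ObjectLockEnabledForBucket=True,
)

# Upload a file and lock it until the specified future date.
with open("case-123.pdf", "rb") as f:          # placeholder file
    s3.put_object(
        Bucket="law-firm-public-records",
        Key="filings/case-123.pdf",
        Body=f,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime(2026, 1, 1, tzinfo=timezone.utc),
    )
```

Public reads would then go through the CloudFront distribution, with a bucket policy that only allows requests from the distribution's origin access control.
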
Question #342:

A company wants to run a hybrid workload for data processing. On-premises applications must be able to access the data over the NFS protocol for local processing, and the data must also be accessible from the AWS Cloud for further analytics and batch processing.

Which solution will meet these requirements?

Options:

A.

Use an AWS Storage Gateway file gateway to provide file storage to AWS, then perform analytics on this data in the AWS Cloud.


B.

Use an AWS Storage Gateway tape gateway to copy the backup of the local data to AWS, then perform analytics on this data in the AWS Cloud.


C.

Use an AWS Storage Gateway volume gateway in a stored volume configuration to regularly take snapshots of the local data, then copy the data to AWS.


D.

Use an AWS Storage Gateway volume gateway in a cached volume configuration to back up all the local storage in the AWS Cloud, then perform analytics on this data in the cloud.


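Option A works because an S3 File Gateway exposes an S3 bucket to on-premises applications as an NFS share while the same objects stay in S3 for cloud-side analytics. A rough boto3 sketch, assuming the gateway has already been activated; every ARN and the client CIDR below are placeholders.

```python
import uuid
import boto3

sgw = boto3.client("storagegateway")

# Create an NFS file share on an already-activated S3 File Gateway.
sgw.create_nfs_file_share(
    ClientToken=str(uuid.uuid4()),  # idempotency token
    GatewayARN="arn:aws:storagegateway:us-east-1:111122223333:gateway/sgw-EXAMPLE",
    Role="arn:aws:iam::111122223333:role/StorageGatewayS3AccessRole",
    LocationARN="arn:aws:s3:::hybrid-processing-data",
    ClientList=["10.0.0.0/16"],     # on-premises CIDR allowed to mount the share
)
```

On-premises hosts then mount the share with a standard NFS mount, and analytics or batch jobs in AWS read the same data directly from the S3 bucket.
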
Question #343:

A media company has an ecommerce website that sells music. Each music file is stored as an MP3 file. Premium users of the website purchase and download the music files. The company wants to store the music files on AWS and provide access only to premium users. The company wants to use the same URL for all premium users.

Which solution will meet these requirements?

Options:

A.

Store the MP3 files on a set of Amazon EC2 instances that have Amazon Elastic Block Store (Amazon EBS) volumes attached. Manage access to the files by creating an IAM user and an IAM policy for each premium user.


B.

Store all the MP3 files in an Amazon S3 bucket. Create a presigned URL for each MP3 file. Share the presigned URLs with the premium users.


C.

Store all the MP3 files in an Amazon S3 bucket. Create an Amazon CloudFront distribution that uses the S3 bucket as the origin. Generate CloudFront signed cookies for the music files. Share the signed cookies with the premium users.


D.

Store all the MP3 files in an Amazon S3 bucket. Create an Amazon CloudFront distribution that uses the S3 bucket as the origin. Use a CloudFront signed URL for each music file. Share the signed URLs with the premium users.


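Option D serves the MP3 files through CloudFront and protects each file with a signed URL, so one link per file can be shared with every premium user. A minimal sketch using botocore's CloudFrontSigner; the key-pair ID, private-key file, distribution domain, and expiry window are placeholders.

```python
from datetime import datetime, timedelta, timezone

import rsa  # third-party 'rsa' package, used here only to sign the policy
from botocore.signers import CloudFrontSigner


def rsa_signer(message: bytes) -> bytes:
    # Sign with the private key that matches the CloudFront public key.
    with open("cloudfront_private_key.pem", "rb") as f:
        private_key = rsa.PrivateKey.load_pkcs1(f.read())
    return rsa.sign(message, private_key, "SHA-1")


signer = CloudFrontSigner("K2JCJMDEHXQW5F", rsa_signer)  # key-pair ID is a placeholder

# One signed URL per MP3 file; the same URL can be shared with every premium user.
signed_url = signer.generate_presigned_url(
    "https://d111111abcdef8.cloudfront.net/albums/track-01.mp3",
    date_less_than=datetime.now(timezone.utc) + timedelta(days=7),
)
print(signed_url)
```
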
Question #344:

A company is using an Amazon Redshift cluster to run analytics queries for multiple sales teams. In addition to the typical workload, on the last Monday morning of each month, thousands of users run reports. Users have reported slow response times during the monthly surge.

The company must improve query performance without impacting the availability of the Redshift cluster.

Which solution will meet these requirements?

Options:

A.

Resize the Redshift cluster by using the classic resize capability of Amazon Redshift before every monthly surge. Reduce the cluster to its original size after each surge.


B.

Resize the Redshift cluster by using the elastic resize capability of Amazon Redshift before every monthly surge. Reduce the cluster to its original size after each surge.


C.

Enable the concurrency scaling feature for the Redshift cluster for specific workload management (WLM) queues.


D.

Enable Amazon Redshift Spectrum for the Redshift cluster before every monthly surge.


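Option C turns on concurrency scaling, which lets Redshift add transient capacity for queued queries during the monthly surge without resizing the main cluster. A hedged boto3 sketch; the parameter group name and queue layout are placeholders, and the WLM JSON shown is only one way to route a reporting queue to concurrency scaling.

```python
import json
import boto3

redshift = boto3.client("redshift")

# Route a reporting queue to concurrency scaling and cap the number of
# transient clusters that Redshift may add.
wlm_config = [
    {"query_group": ["monthly_reports"], "query_concurrency": 5,
     "concurrency_scaling": "auto"},
    {"query_concurrency": 5},  # default queue
]

redshift.modify_cluster_parameter_group(
    ParameterGroupName="analytics-wlm",      # placeholder parameter group
    Parameters=[
        {"ParameterName": "wlm_json_configuration",
         "ParameterValue": json.dumps(wlm_config)},
        {"ParameterName": "max_concurrency_scaling_clusters",
         "ParameterValue": "4"},
    ],
)
```
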
Question #345:

A company has 5 TB of datasets. The datasets consist of 1 million user profiles and 10 million connections. The connections between user profiles are many-to-many relationships. The company needs a performance-efficient way to find mutual connections up to five levels deep.

Which solution will meet these requirements?

Options:

A.

Use an Amazon S3 bucket to store the datasets. Use Amazon Athena to perform SQL JOIN queries to find connections.


B.

Use Amazon Neptune to store the datasets with edges and vertices. Query the data to find connections.


C.

Use an Amazon S3 bucket to store the datasets. Use Amazon QuickSight to visualize connections.


D.

Use Amazon RDS to store the datasets with multiple tables. Perform SQL JOIN queries to find connections.


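Option B stores profiles as vertices and connections as edges, so a multi-hop connection lookup becomes a graph traversal rather than five self-joins. A rough sketch using the gremlinpython client; the Neptune endpoint, the 'connected_to' edge label, and the vertex IDs are placeholders.

```python
from gremlin_python.driver import client

# Neptune cluster endpoint is a placeholder; gremlinpython is the
# Apache TinkerPop client that Neptune's Gremlin endpoint accepts.
gremlin = client.Client(
    "wss://my-neptune-cluster.cluster-xxxx.us-east-1.neptune.amazonaws.com:8182/gremlin",
    "g",
)

# All profiles reachable from one user within five hops; intersecting the
# same traversal started from a second user yields the mutual connections.
query = (
    "g.V('profile-12345')"
    ".repeat(both('connected_to').simplePath()).emit().times(5)"
    ".dedup().limit(100).valueMap('name')"
)
results = gremlin.submit(query).all().result()
print(results)
gremlin.close()
```
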
Question #346:

A company is planning to run an AI/ML workload on AWS. The company needs to train a model on a dataset that is in Amazon S3 Standard. A model training application requires multiple compute nodes and single-digit millisecond access to the data.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Move the data to S3 Intelligent-Tiering. Point the model training application to S3 Intelligent-Tiering as the data source.


B.

Add partitions to the S3 bucket by adding random prefixes. Reconfigure the model training application to point to the new prefixes as the data source.


C.

Move the data to S3 Express One Zone. Point the model training application to S3 Express One Zone as the data source.


D.

Move the data to a General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volume attached to an Amazon EC2 instance. Point the model training application to the gp3 volume as the data source.


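Option C moves the training data into an S3 Express One Zone directory bucket, which is built for single-digit millisecond access from compute in the same Availability Zone. A rough boto3 sketch, assuming a boto3 version recent enough to support directory buckets; the Availability Zone ID and both bucket names are placeholders.

```python
import boto3

s3 = boto3.client("s3", region_name="us-west-2")

# Directory bucket names must end in --<az-id>--x-s3; usw2-az1 is a placeholder.
dest_bucket = "training-data--usw2-az1--x-s3"

s3.create_bucket(
    Bucket=dest_bucket,
    CreateBucketConfiguration={
        "Location": {"Type": "AvailabilityZone", "Name": "usw2-az1"},
        "Bucket": {"Type": "Directory", "DataRedundancy": "SingleAvailabilityZone"},
    },
)

# Copy the existing S3 Standard objects into the directory bucket.
source_bucket = "ml-training-data"              # placeholder source bucket
for page in s3.get_paginator("list_objects_v2").paginate(Bucket=source_bucket):
    for obj in page.get("Contents", []):
        s3.copy_object(
            Bucket=dest_bucket,
            Key=obj["Key"],
            CopySource={"Bucket": source_bucket, "Key": obj["Key"]},
        )
```
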
Question #347:

Which solution will meet these requirements?

Options:

A.

Migrate the database to Amazon Aurora Serverless.


B.

Migrate the database to Amazon RDS for SQL Server.


C.

Migrate the database to Amazon EC2 instances that run SQL Server.


D.

Migrate the database to Amazon Redshift.


Question #348:

Which solution will meet these requirements?

Options:

A.

Migrate the databases to Amazon EC2 instances that use SQL Server Amazon Machine Images (AMIs) provided by AWS.


B.

Migrate to Amazon Aurora PostgreSQL by using Babelfish for Aurora PostgreSQL.


C.

Migrate the databases to a PostgreSQL database that runs on Amazon EC2 instances.


D.

Migrate the databases to Amazon RDS for Microsoft SQL Server.


Question #349:

A company has an application with a REST-based interface that allows data to be received in near real time from a third-party vendor. Once received, the application processes and stores the data for further analysis. The application runs on Amazon EC2 instances.

The third-party vendor has received many 503 Service Unavailable errors when sending data to the application. When the data volume spikes, the compute capacity reaches its maximum limit and the application is unable to process all requests.

Which design should a solutions architect recommend to provide a more scalable solution?

Options:

A.

Use Amazon Kinesis Data Streams to ingest the data. Process the data using AWS Lambda functions.


B.

Use Amazon API Gateway on top of the existing application. Create a usage plan with a quota limit for the third-party vendor.


C.

Use Amazon Simple Notification Service (Amazon SNS) to ingest the data. Put the EC2 instances in an Auto Scaling group behind an Application Load Balancer.


D.

Repackage the application as a container. Deploy the application using Amazon Elastic Container Service (Amazon ECS) using the EC2 launch type with an Auto Scaling group.


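Option A decouples ingestion from processing: records land in a Kinesis data stream, and Lambda consumes them at a rate that scales with the stream's shards, so traffic spikes are buffered instead of producing 503 errors. A minimal sketch of both sides; the stream name and record shape are placeholders.

```python
import base64
import json

import boto3

kinesis = boto3.client("kinesis")


def ingest(payload: dict) -> None:
    """Producer side: write one vendor record to the stream (name is a placeholder)."""
    kinesis.put_record(
        StreamName="vendor-data-stream",
        Data=json.dumps(payload).encode(),
        PartitionKey=str(payload.get("vendor_id", "default")),
    )


def lambda_handler(event, context):
    """Consumer side: Lambda is invoked with batches of Kinesis records."""
    for record in event["Records"]:
        data = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # ... process and store the record for further analysis ...
        print(data)
    # Only meaningful if ReportBatchItemFailures is enabled on the event source.
    return {"batchItemFailures": []}
```
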
Question #350:

A company is developing a new application that will run on Amazon EC2 instances. The application needs to access multiple AWS services.

The company needs to ensure that the application will not use long-term access keys to access AWS services.

Which solution will meet these requirements?

Options:

A.

Create an IAM user. Assign the IAM user to the application. Create programmatic access keys for the IAM user. Embed the access keys in the application code.


B.

Create an IAM user that has programmatic access keys. Store the access keys in AWS Secrets Manager. Configure the application to retrieve the keys from Secrets Manager when the application runs.


C.

Create an IAM role that can access AWS Systems Manager Parameter Store. Associate the role with each EC2 instance profile. Create IAM access keys for the AWS services, and store the keys in Parameter Store. Configure the application to retrieve the keys from Parameter Store when the application runs.


D.

Create an IAM role that has permissions to access the required AWS services. Associate the IAM role with each EC2 instance profile.


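Option D removes long-term keys entirely: an IAM role is attached to the EC2 instances through an instance profile, and the SDK obtains temporary credentials from the instance metadata service automatically. A hedged boto3 sketch of the setup and of application code that never embeds keys; the role name, profile name, and attached policy are placeholders.

```python
import json
import boto3

iam = boto3.client("iam")

# Trust policy that lets EC2 assume the role.
assume_role_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName="app-ec2-role",
    AssumeRolePolicyDocument=json.dumps(assume_role_policy),
)
iam.attach_role_policy(
    RoleName="app-ec2-role",
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",  # example permission
)
iam.create_instance_profile(InstanceProfileName="app-ec2-profile")
iam.add_role_to_instance_profile(
    InstanceProfileName="app-ec2-profile", RoleName="app-ec2-role"
)

# On the instance, the application simply creates clients; the SDK resolves
# temporary credentials from the instance profile, so no keys are embedded.
s3 = boto3.client("s3")
```
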