
Google Cloud Certified Professional Cloud Architect practice questions and answers from CertsForce

Question # 11:

For this question, refer to the Cymbal Retail case study. Cymbal's generative AI models require high-performance storage for temporary files generated during model training and inference. These files are ephemeral, and they are frequently accessed and modified. You need to select a storage solution that minimizes latency and cost and maximizes performance for generative AI workloads. What should you do?

Options:

A. Use a Cloud Storage bucket in the same region as your virtual machines. Configure lifecycle policies to delete files after processing.
B. Use Filestore to store temporary files.
C. Use performance persistent disks.
D. Use Local SSDs attached to the VMs running the generative AI models.


Question # 12:

For this question, refer to the TerramEarth case study. To be compliant with the European GDPR regulation, TerramEarth is required to delete data generated from its European customers after a period of 36 months when it contains personal data. In the new architecture, this data will be stored in both Cloud Storage and BigQuery. What should you do? (A configuration sketch follows the options.)

Options:

A. Create a BigQuery table for the European data, and set the table retention period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age condition of 36 months.
B. Create a BigQuery table for the European data, and set the table retention period to 36 months. For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with an Age condition of 36 months.
C. Create a BigQuery time-partitioned table for the European data, and set the partition expiration period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age condition of 36 months.
D. Create a BigQuery time-partitioned table for the European data, and set the partition period to 36 months. For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with an Age condition of 36 months.


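For reference, a minimal sketch of the approach in option C using the Python client libraries instead of gsutil. The project, bucket, and table names are hypothetical, and 36 months is approximated as 1,095 days; treat this as an illustration of the API shape rather than a definitive implementation.

    from google.cloud import bigquery, storage

    # Cloud Storage: lifecycle DELETE rule at ~36 months (1,095 days).
    storage_client = storage.Client()
    bucket = storage_client.get_bucket("terramearth-eu-data")  # hypothetical bucket
    bucket.add_lifecycle_delete_rule(age=1095)
    bucket.patch()

    # BigQuery: ingestion-time partitioned table whose partitions expire after ~36 months.
    bq_client = bigquery.Client()
    table = bigquery.Table("my-project.eu_dataset.telemetry")  # hypothetical table
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        expiration_ms=1095 * 24 * 60 * 60 * 1000,
    )
    bq_client.create_table(table)

Partition expiration removes data automatically as each partition ages past 36 months, which is what makes the time-partitioned approach compliant without manual cleanup.
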
Question # 13:

TerramEarth has about 1 petabyte (PB) of vehicle testing data in a private data center. You want to move the data to Cloud Storage for your machine learning team. Currently, a 1-Gbps interconnect link is available for you. The machine learning team wants to start using the data in a month. What should you do? (A transfer-time estimate follows the options.)

Options:

A. Request Transfer Appliances from Google Cloud, export the data to the appliances, and return the appliances to Google Cloud.
B. Configure the Storage Transfer Service from Google Cloud to send the data from your data center to Cloud Storage.
C. Make sure there are no other users consuming the 1-Gbps link, and use multi-threaded transfer to upload the data to Cloud Storage.
D. Export files to an encrypted USB device, send the device to Google Cloud, and request an import of the data to Cloud Storage.


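As a back-of-the-envelope check, here is the transfer-time arithmetic that makes the one-month deadline the deciding factor, sketched in Python and assuming the full 1 Gbps is available and sustained (optimistic in practice):

    # 1 PB over a 1-Gbps link under ideal, sustained throughput.
    data_bits = 1e15 * 8        # 1 PB (decimal) in bits
    link_bps = 1e9              # 1 Gbps
    days = data_bits / link_bps / 86400
    print(f"~{days:.0f} days")  # ~93 days, roughly three months

At roughly three months of sustained transfer, an online copy cannot meet the one-month deadline, which is the usual argument for an offline transfer device at this scale.
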
Question # 14:

For this question, refer to the TerramEarth case study. A new architecture that writes all incoming data to BigQuery has been introduced. You notice that the data is dirty, and you want to ensure data quality on an automated daily basis while managing cost. What should you do? (A daily-clean sketch follows the options.)

Options:

A. Set up a streaming Cloud Dataflow job, receiving data by the ingestion process. Clean the data in a Cloud Dataflow pipeline.
B. Create a Cloud Function that reads data from BigQuery and cleans it. Trigger the Cloud Function from a Compute Engine instance.
C. Create a SQL statement on the data in BigQuery, and save it as a view. Run the view daily, and save the result to a new table.
D. Use Cloud Dataprep and configure the BigQuery tables as the source. Schedule a daily job to clean the data.


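Dataprep jobs are configured and scheduled through its UI, but the underlying daily-clean idea can be sketched with the BigQuery Python client. The table names, the cleaning SQL, and the vehicle_id field are all hypothetical, and the daily trigger (for example, Cloud Scheduler) is assumed and omitted:

    from google.cloud import bigquery

    def clean_daily():
        """Rewrite the cleaned table from the raw table once per day."""
        client = bigquery.Client()
        job_config = bigquery.QueryJobConfig(
            destination="my-project.analytics.telemetry_clean",  # hypothetical
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        )
        sql = """
            SELECT DISTINCT *             -- drop exact duplicates
            FROM `my-project.analytics.telemetry_raw`
            WHERE vehicle_id IS NOT NULL  -- drop rows missing a key field
        """
        client.query(sql, job_config=job_config).result()
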
Question # 15:

You are migrating a Linux-based application from your private data center to Google Cloud. The TerramEarth security team sent you several recent Linux vulnerabilities published by Common Vulnerabilities and Exposures (CVE). You need assistance in understanding how these vulnerabilities could impact your migration. What should you do?

Options:

A. Open a support case regarding the CVE and chat with the support engineer.
B. Read the CVEs from the Google Cloud Status Dashboard to understand the impact.
C. Read the CVEs from the Google Cloud Platform Security Bulletins to understand the impact.
D. Post a question regarding the CVE in Stack Overflow to get an explanation.
E. Post a question regarding the CVE in a Google Cloud discussion group to get an explanation.


Question # 16:

For this question, refer to the TerramEarth case study. TerramEarth has decided to store data files in Cloud Storage. You need to configure Cloud Storage lifecycle rules to store one year of data and minimize file storage cost. Which two actions should you take? (A lifecycle sketch follows the options.)

Options:

A. Create a Cloud Storage lifecycle rule with Age: “30”, Storage Class: “Standard”, and Action: “Set to Coldline”, and create a second lifecycle rule with Age: “365”, Storage Class: “Coldline”, and Action: “Delete”.
B. Create a Cloud Storage lifecycle rule with Age: “30”, Storage Class: “Coldline”, and Action: “Set to Nearline”, and create a second lifecycle rule with Age: “91”, Storage Class: “Coldline”, and Action: “Set to Nearline”.
C. Create a Cloud Storage lifecycle rule with Age: “90”, Storage Class: “Standard”, and Action: “Set to Nearline”, and create a second lifecycle rule with Age: “91”, Storage Class: “Nearline”, and Action: “Set to Coldline”.
D. Create a Cloud Storage lifecycle rule with Age: “30”, Storage Class: “Standard”, and Action: “Set to Coldline”, and create a second lifecycle rule with Age: “365”, Storage Class: “Nearline”, and Action: “Delete”.


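For reference, a minimal sketch of the two transition rules described in option C, using the Cloud Storage Python client rather than the console; the bucket name is hypothetical:

    from google.cloud import storage

    client = storage.Client()
    bucket = client.get_bucket("terramearth-datafiles")  # hypothetical bucket

    # Age 90: move Standard objects to Nearline.
    bucket.add_lifecycle_set_storage_class_rule(
        "NEARLINE", age=90, matches_storage_class=["STANDARD"]
    )
    # Age 91: move Nearline objects to Coldline.
    bucket.add_lifecycle_set_storage_class_rule(
        "COLDLINE", age=91, matches_storage_class=["NEARLINE"]
    )
    bucket.patch()
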
Question # 17:

You have broken down a legacy monolithic application into a few containerized RESTful microservices. You want to run those microservices on Cloud Run. You also want to make sure the services are highly available, with low latency to your customers. What should you do? (A deployment sketch follows the options.)

Options:

A. Deploy Cloud Run services to multiple availability zones. Create Cloud Endpoints that point to the services. Create a global HTTP(S) Load Balancing instance and attach the Cloud Endpoints to its backend.
B. Deploy Cloud Run services to multiple regions. Create serverless network endpoint groups (NEGs) pointing to the services. Add the serverless NEGs to a backend service that is used by a global HTTP(S) Load Balancing instance.
C. Deploy Cloud Run services to multiple regions. In Cloud DNS, create a latency-based DNS name that points to the services.
D. Deploy Cloud Run services to multiple availability zones. Create a TCP/IP global load balancer. Add the Cloud Run Endpoints to its backend service.


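For reference, a minimal sketch of the serverless NEG step from option B using the google-cloud-compute Python client. The project, region, NEG, and Cloud Run service names are hypothetical, and the backend service plus global load balancer wiring is omitted:

    from google.cloud import compute_v1

    # A serverless NEG lets a global HTTP(S) load balancer route to a Cloud Run service.
    neg = compute_v1.NetworkEndpointGroup(
        name="orders-neg-us-central1",  # hypothetical NEG name
        network_endpoint_type="SERVERLESS",
        cloud_run=compute_v1.NetworkEndpointGroupCloudRun(
            service="orders-service"    # hypothetical Cloud Run service
        ),
    )
    client = compute_v1.RegionNetworkEndpointGroupsClient()
    client.insert(
        project="my-project",
        region="us-central1",
        network_endpoint_group_resource=neg,
    )

Repeating this per region and attaching each NEG to one backend service gives the global load balancer a low-latency path to the nearest healthy region.
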
Question # 18:

For this question, refer to the TerramEarth case study. You need to implement a reliable, scalable GCP solution for the data warehouse for your company, TerramEarth. Considering the TerramEarth business and technical requirements, what should you do?

Options:

A. Replace the existing data warehouse with BigQuery. Use table partitioning.
B. Replace the existing data warehouse with a Compute Engine instance with 96 CPUs.
C. Replace the existing data warehouse with BigQuery. Use federated data sources.
D. Replace the existing data warehouse with a Compute Engine instance with 96 CPUs. Add an additional Compute Engine preemptible instance with 32 CPUs.


Question # 19:

For this question, refer to the TerramEarth case study. You are asked to design a new architecture for the ingestion of the data of the 200,000 vehicles that are connected to a cellular network. You want to follow Google-recommended practices. Considering the technical requirements, which components should you use for the ingestion of the data?

Options:

A. Google Kubernetes Engine with an SSL Ingress
B. Cloud IoT Core with public/private key pairs
C. Compute Engine with project-wide SSH keys
D. Compute Engine with specific SSH keys


Question # 20:

For this question, refer to the EHR Healthcare case study. EHR has a single Dedicated Interconnect connection between its primary data center and Google's network. This connection satisfies EHR's network and security policies:

• On-premises servers without public IP addresses need to connect to cloud resources without public IP addresses.
• Traffic flows from production network management servers to Compute Engine virtual machines should never traverse the public internet.

You need to upgrade the EHR connection to comply with these requirements. The new connection design must support business-critical needs and meet the same network and security policy requirements. What should you do?

Options:

A. Add a new Dedicated Interconnect connection.
B. Upgrade the bandwidth on the Dedicated Interconnect connection to 100 Gbps.
C. Add three new Cloud VPN connections.
D. Add a new Carrier Peering connection.

