Pass the Google Cloud Certified Associate Cloud Engineer Questions and Answers with CertsForce

Viewing page 7 out of 10 pages
Viewing questions 61-70 out of questions
Question # 61:

You have a number of applications that have bursty workloads and are heavily dependent on topics to decouple publishing systems from consuming systems. Your company would like to go serverless to enable developers to focus on writing code without worrying about infrastructure. Your solution architect has already identified Cloud Pub/Sub as a suitable alternative for decoupling systems. You have been asked to identify a suitable GCP Serverless service that is easy to use with Cloud Pub/Sub. You want the ability to scale down to zero when there is no traffic in order to minimize costs. You want to follow Google recommended practices. What should you suggest?

Options:

A.

Cloud Run for Anthos


B.

Cloud Run


C.

App Engine Standard


D.

Cloud Functions


Expert Solution
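As context for the Cloud Run option, a minimal sketch of wiring a Pub/Sub topic to a Cloud Run service; all names here (my-service, my-topic, the image path, and the invoker service account) are placeholders, not values from the question:

```shell
# Deploy the consuming application to Cloud Run (scales to zero by default).
gcloud run deploy my-service \
  --image=us-docker.pkg.dev/my-project/my-repo/my-app:latest \
  --region=us-central1 \
  --no-allow-unauthenticated

# Create a push subscription that delivers Pub/Sub messages to the service.
gcloud pubsub subscriptions create my-subscription \
  --topic=my-topic \
  --push-endpoint=https://my-service-HASH-uc.a.run.app/ \
  --push-auth-service-account=pubsub-invoker@my-project.iam.gserviceaccount.com
```

When the topic carries no traffic, the service receives no pushes and Cloud Run scales the container count down to zero.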
Question # 62:

You received a JSON file that contained a private key of a Service Account in order to get access to several resources in a Google Cloud project. You downloaded and installed the Cloud SDK and want to use this private key for authentication and authorization when performing gcloud commands. What should you do?

Options:

A.

Use the command gcloud auth login and point it to the private key


B.

Use the command gcloud auth activate-service-account and point it to the private key


C.

Place the private key file in the installation directory of the Cloud SDK and rename it to "credentials.json"


D.

Place the private key file in your home directory and rename it to "GOOGLE_APPLICATION_CREDENTIALS".


Expert Solution
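For reference, activating a downloaded service-account key with gcloud looks roughly like this (key.json and my-project are placeholder names):

```shell
# Authorize gcloud with the service account's private key file.
gcloud auth activate-service-account --key-file=key.json

# Subsequent gcloud commands now run as the service account.
gcloud config set project my-project
gcloud compute instances list
```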
Question # 63:

You are managing several Google Cloud Platform (GCP) projects and need access to all logs for the past 60 days. You want to be able to explore and quickly analyze the log contents. You want to follow Google-recommended practices to obtain the combined logs for all projects. What should you do?

Options:

A.

Navigate to Stackdriver Logging and select resource.labels.project_id="*"


B.

Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.


C.

Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days.


D.

Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.


Expert Solution
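A sketch of exporting logs from multiple projects into a single BigQuery dataset via an aggregated sink; ORG_ID, central-project, and the all_logs dataset are hypothetical names:

```shell
# Create the destination dataset with a 60-day default table
# expiration (5,184,000 seconds = 60 days).
bq mk --dataset --default_table_expiration=5184000 central-project:all_logs

# Create an aggregated sink at the organization level so that logs
# from all child projects flow into the one BigQuery dataset.
gcloud logging sinks create all-projects-sink \
  bigquery.googleapis.com/projects/central-project/datasets/all_logs \
  --organization=ORG_ID --include-children
```

The sink's writer service account must also be granted write access on the dataset before logs start arriving.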
Question # 64:

You have a Google Cloud Platform account with access to both production and development projects. You need to create an automated process to list all compute instances in development and production projects on a daily basis. What should you do?

Options:

A.

Create two configurations using gcloud config. Write a script that sets configurations as active, individually. For each configuration, use gcloud compute instances list to get a list of compute resources.


B.

Create two configurations using gsutil config. Write a script that sets configurations as active, individually. For each configuration, use gsutil compute instances list to get a list of compute resources.


C.

Go to Cloud Shell and export this information to Cloud Storage on a daily basis.


D.

Go to GCP Console and export this information to Cloud SQL on a daily basis.


Expert Solution
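A rough daily script using two named gcloud configurations; the configuration names (dev, prod) and project IDs are hypothetical:

```shell
#!/bin/bash
# One-time setup (not part of the daily run):
#   gcloud config configurations create dev  && gcloud config set project my-dev-project
#   gcloud config configurations create prod && gcloud config set project my-prod-project

# Daily run: activate each configuration in turn and list its instances.
for config in dev prod; do
  gcloud config configurations activate "$config"
  gcloud compute instances list
done
```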
Question # 65:

You have a number of compute instances belonging to an unmanaged instances group. You need to SSH to one of the Compute Engine instances to run an ad hoc script. You’ve already authenticated gcloud, however, you don’t have an SSH key deployed yet. In the fewest steps possible, what’s the easiest way to SSH to the instance?

Options:

A.

Run gcloud compute instances list to get the IP address of the instance, then use the ssh command.


B.

Use the gcloud compute ssh command.


C.

Create a key with the ssh-keygen command. Then use the gcloud compute ssh command.


D.

Create a key with the ssh-keygen command. Upload the key to the instance. Run gcloud compute instances list to get the IP address of the instance, then use the ssh command.


Expert Solution
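For reference, gcloud compute ssh generates an SSH key pair and propagates the public key to the project automatically on first use, so a single command suffices (instance name and zone are placeholders):

```shell
# Creates and deploys a key if none exists, then opens the SSH session.
gcloud compute ssh my-instance --zone=us-central1-a
```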
Question # 66:

You just installed the Google Cloud CLI on your new corporate laptop. You need to list the existing instances of your company on Google Cloud. What must you do before you run the gcloud compute instances list command?

Choose 2 answers

Options:

A.

Run gcloud auth login, enter your login credentials in the dialog window, and paste the received login token to gcloud CLI.


B.

Create a Google Cloud service account, and download the service account key. Place the key file in a folder on your machine where gcloud CLI can find it.


C.

Download your Cloud Identity user account key. Place the key file in a folder on your machine where gcloud CLI can find it.


D.

Run gcloud config set compute/zone $my_zone to set the default zone for gcloud CLI.


E.

Run gcloud config set project $my_project to set the default project for gcloud CLI.


Expert Solution
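A sketch of the two setup steps on a fresh gcloud installation (my-project is a placeholder project ID):

```shell
# 1. Authenticate with your user credentials (opens a browser or prints a URL).
gcloud auth login

# 2. Point the CLI at the project whose instances you want to list.
gcloud config set project my-project

# Now this works:
gcloud compute instances list
```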
Question # 67:

You have a set of on-premises data analytics binaries that process data files in memory for about 45 minutes every midnight. The data files range in size from 1 gigabyte to 16 gigabytes. You want to migrate this application to Google Cloud with minimal effort and cost. What should you do?

Options:

A.

Upload the code to Cloud Functions. Use Cloud Scheduler to start the application.


B.

Create a container for the set of binaries. Use Cloud Scheduler to start a Cloud Run job for the container.


C.

Create a container for the set of binaries. Deploy the container to Google Kubernetes Engine (GKE) and use the Kubernetes scheduler to start the application.


D.

Lift and shift to a VM on Compute Engine. Use an instance schedule to start and stop the instance.


Expert Solution
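As a sketch of the Cloud Run jobs approach: all names, the region, and the resource limits below are illustrative assumptions, and the Scheduler service account is assumed to hold the run.invoker role on the job:

```shell
# Create a Cloud Run job from the containerized binaries.
gcloud run jobs create nightly-analytics \
  --image=us-docker.pkg.dev/my-project/my-repo/analytics:latest \
  --memory=16Gi --cpu=4 \
  --task-timeout=60m \
  --region=us-central1

# Trigger the job every midnight via Cloud Scheduler, calling the
# Cloud Run Admin API's jobs.run endpoint with OAuth credentials.
gcloud scheduler jobs create http nightly-analytics-trigger \
  --schedule="0 0 * * *" \
  --uri="https://run.googleapis.com/v2/projects/my-project/locations/us-central1/jobs/nightly-analytics:run" \
  --http-method=POST \
  --oauth-service-account-email=scheduler@my-project.iam.gserviceaccount.com \
  --location=us-central1
```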
Question # 68:

Your company has multiple projects linked to a single billing account in Google Cloud. You need to visualize the costs with specific metrics that should be dynamically calculated based on company-specific criteria. You want to automate the process. What should you do?

Options:

A.

In the Google Cloud console, visualize the costs related to the projects in the Reports section.


B.

In the Google Cloud console, visualize the costs related to the projects in the Cost breakdown section.


C.

In the Google Cloud console, use the export functionality of the Cost table. Create a Looker Studio dashboard on top of the CSV export.


D.

Configure Cloud Billing data export to BigQuery for the billing account. Create a Looker Studio dashboard on top of the BigQuery export.


Expert Solution
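Once a Cloud Billing export to BigQuery exists, company-specific metrics can be computed with SQL and surfaced in a Looker Studio dashboard. The table name below follows the standard export naming convention but is a placeholder, and the net-cost formula is only one example of a custom metric:

```shell
# Net cost per project for one invoice month (cost minus credits).
bq query --use_legacy_sql=false '
SELECT
  project.id AS project_id,
  SUM(cost)
    + SUM(IFNULL((SELECT SUM(c.amount) FROM UNNEST(credits) c), 0)) AS net_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
WHERE invoice.month = "202401"
GROUP BY project_id
ORDER BY net_cost DESC'
```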
Question # 69:

Your application development team has created Docker images for an application that will be deployed on Google Cloud. Your team does not want to manage the infrastructure associated with this application. You need to ensure that the application can scale automatically as it gains popularity. What should you do?

Options:

A.

Create an Instance template with the container image, and deploy a Managed Instance Group with autoscaling.


B.

Upload Docker images to Artifact Registry, and deploy the application on Google Kubernetes Engine using Standard mode.


C.

Upload Docker images to Cloud Storage, and deploy the application on Google Kubernetes Engine using Standard mode.


D.

Upload Docker images to Artifact Registry, and deploy the application on Cloud Run.


Expert Solution
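A minimal sketch of the Artifact Registry plus Cloud Run flow; the image path, service name, and region are placeholders:

```shell
# Push the team's image to an Artifact Registry Docker repository.
docker push us-docker.pkg.dev/my-project/my-repo/my-app:v1

# Deploy to Cloud Run, which autoscales instances with traffic
# and requires no infrastructure management.
gcloud run deploy my-app \
  --image=us-docker.pkg.dev/my-project/my-repo/my-app:v1 \
  --region=us-central1 \
  --allow-unauthenticated
```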
Question # 70:

You manage an App Engine Service that aggregates and visualizes data from BigQuery. The application is deployed with the default App Engine Service account. The data that needs to be visualized resides in a different project managed by another team. You do not have access to this project, but you want your application to be able to read data from the BigQuery dataset. What should you do?

Options:

A.

Ask the other team to grant your default App Engine Service account the role of BigQuery Job User.


B.

Ask the other team to grant your default App Engine Service account the role of BigQuery Data Viewer.


C.

In Cloud IAM of your project, ensure that the default App Engine service account has the role of BigQuery Data Viewer.


D.

In Cloud IAM of your project, grant a newly created service account from the other team the role of BigQuery Job User in your project.


Expert Solution
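For reference, a grant made by the other team in their project might look like the following; both project IDs are placeholders, and the default App Engine service account has the form PROJECT_ID@appspot.gserviceaccount.com:

```shell
# Run by the other team, in their project, to let the App Engine
# service account read BigQuery data there.
gcloud projects add-iam-policy-binding other-team-project \
  --member="serviceAccount:my-project@appspot.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"
```

A dataset-level grant on just the relevant dataset would be an even narrower alternative.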