
Pass the Google Cloud Certified Professional-Cloud-Security-Engineer questions and answers with CertsForce

Viewing page 6 out of 10 pages
Viewing questions 51-60 out of questions
Question # 51:

You need to connect your organization's on-premises network with an existing Google Cloud environment that includes one Shared VPC with two subnets named Production and Non-Production. You are required to:

Use a private transport link.

Configure access to Google Cloud APIs through private API endpoints originating from on-premises environments.

Ensure that Google Cloud APIs are only consumed via VPC Service Controls.

What should you do?

Options:

A.

1. Set up a Cloud VPN link between the on-premises environment and Google Cloud.
2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.


B.

1. Set up a Partner Interconnect link between the on-premises environment and Google Cloud.
2. Configure private access using the private.googleapis.com domains in on-premises DNS configurations.


C.

1. Set up a Direct Peering link between the on-premises environment and Google Cloud.
2. Configure private access for both VPC subnets.


D.

1. Set up a Dedicated Interconnect link between the on-premises environment and Google Cloud.
2. Configure private access using the restricted.googleapis.com domains in on-premises DNS configurations.


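As background for the DNS step in these options: Google documents a fixed VIP range, 199.36.153.4/30, for restricted.googleapis.com, and on-premises resolvers are pointed at it so API traffic stays on the private link and is subject to VPC Service Controls. A minimal BIND-style sketch of the on-premises zone data (the record layout and TTLs are illustrative; the addresses are Google's documented restricted VIPs):

```text
; restricted.googleapis.com answers from Google's documented VIP range 199.36.153.4/30
restricted.googleapis.com.  300  IN  A  199.36.153.4
restricted.googleapis.com.  300  IN  A  199.36.153.5
restricted.googleapis.com.  300  IN  A  199.36.153.6
restricted.googleapis.com.  300  IN  A  199.36.153.7

; send every other Google API hostname to the restricted VIP
*.googleapis.com.  300  IN  CNAME  restricted.googleapis.com.
```

Routes for 199.36.153.4/30 must also be advertised over the private link so the VIPs are reachable from on-premises.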
Question # 52:

Your organization is worried about recent news headlines regarding application vulnerabilities in production applications that have led to security breaches. You want to automatically scan your deployment pipeline for vulnerabilities and ensure only scanned and verified containers can run in the environment. What should you do?

Options:

A.

Enable Binary Authorization and create attestations of scans.


B.

Use gcloud artifacts docker images describe LOCATION-docker.pkg.dev/PROJECT_ID/REPOSITORY/IMAGE_ID@sha256:HASH --show-package-vulnerability in your CI/CD pipeline, and trigger a pipeline failure for critical vulnerabilities.


C.

Use Kubernetes role-based access control (RBAC) as the source of truth for cluster access by granting "container.clusters.get" to limited users. Restrict deployment access by allowing these users to generate a kubeconfig file containing the access configuration for the GKE cluster.


D.

Enforce the use of Cloud Code for development so users receive real-time security feedback on vulnerable libraries and dependencies before they check in their code.


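Option B's command can be wired into a pipeline gate; a hedged Cloud Build sketch (the substitution variables and the grep-based severity check are illustrative, not a Google reference pipeline):

```yaml
steps:
  # Scan gate: query the image's package-vulnerability results and fail the
  # build when any vulnerability is reported with CRITICAL effective severity.
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        if gcloud artifacts docker images describe \
             "${_LOCATION}-docker.pkg.dev/${PROJECT_ID}/${_REPOSITORY}/${_IMAGE}@${_DIGEST}" \
             --show-package-vulnerability --format=json \
           | grep -q '"effectiveSeverity": "CRITICAL"'; then
          echo "Critical vulnerabilities found; failing build."
          exit 1
        fi
```

Note this only fails the build; it does not, by itself, stop an unscanned image from being deployed by another path, which is what Binary Authorization attestations (option A) add.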
Question # 53:

An office manager at your small startup company is responsible for matching payments to invoices and creating billing alerts. For compliance reasons, the office manager is only permitted to have the Identity and Access Management (IAM) permissions necessary for these tasks. Which two IAM roles should the office manager have? (Choose two.)

Options:

A.

Organization Administrator


B.

Project Creator


C.

Billing Account Viewer


D.

Billing Account Costs Manager


E.

Billing Account User


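For reference, the viewer and costs-manager options correspond to the predefined roles roles/billing.viewer and roles/billing.costsManager, granted on the billing account rather than on a project. A hedged sketch of the grants (the billing account ID and member address are placeholders):

```shell
# Grant read access to billing information (match payments to invoices).
gcloud billing accounts add-iam-policy-binding 0X0X0X-0X0X0X-0X0X0X \
  --member="user:office-manager@example.com" \
  --role="roles/billing.viewer"

# Grant cost-management access (create budgets and billing alerts).
gcloud billing accounts add-iam-policy-binding 0X0X0X-0X0X0X-0X0X0X \
  --member="user:office-manager@example.com" \
  --role="roles/billing.costsManager"
```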
Question # 54:

A customer deploys an application to App Engine and needs to check for Open Web Application Security Project (OWASP) vulnerabilities.

Which service should be used to accomplish this?

Options:

A.

Cloud Armor


B.

Google Cloud Audit Logs


C.

Cloud Security Scanner


D.

Forseti Security


Question # 55:

You are setting up a CI/CD pipeline to deploy containerized applications to your production clusters on Google Kubernetes Engine (GKE). You need to prevent containers with known vulnerabilities from being deployed. You have the following requirements for your solution:

Must be cloud-native

Must be cost-efficient

Minimize operational overhead

How should you accomplish this? (Choose two.)

Options:

A.

Create a Cloud Build pipeline that will monitor changes to your container templates in a Cloud Source Repositories repository. Add a step to analyze Container Analysis results before allowing the build to continue.


B.

Use a Cloud Function triggered by log events in Google Cloud's operations suite to automatically scan your container images in Container Registry.


C.

Use a cron job on a Compute Engine instance to scan your existing repositories for known vulnerabilities and raise an alert if a non-compliant container image is found.


D.

Deploy Jenkins on GKE and configure a CI/CD pipeline to deploy your containers to Container Registry. Add a step to validate your container images before deploying your container to the cluster.


E.

In your CI/CD pipeline, add an attestation on your container image when no vulnerabilities have been found. Use a Binary Authorization policy to block deployments of containers with no attestation in your cluster.


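For option E, the attestation is typically created as a pipeline step once the scan passes, for example with gcloud container binauthz attestations sign-and-create. A sketch (the project, attestor name, KMS key version, and image digest are all placeholders):

```shell
# Sign and create an attestation asserting the image passed vulnerability scanning.
# A Binary Authorization policy requiring this attestor then blocks unattested images.
gcloud container binauthz attestations sign-and-create \
  --artifact-url="us-docker.pkg.dev/my-project/my-repo/app@sha256:abc123" \
  --attestor="projects/my-project/attestors/vuln-scan-passed" \
  --keyversion="projects/my-project/locations/global/keyRings/binauthz/cryptoKeys/attestor-key/cryptoKeyVersions/1"
```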
Question # 56:

A centralized security service has been implemented by your company. All applications running in Google Cloud are required to send data to this service. You need to ensure that developers have high autonomy to configure firewall rules within their projects, while preventing accidental blockage of access to the central security service. What should you do?

Options:

A.

Deploy a central Secure Web Proxy and connect it to all VPC networks. Create a Secure Web Proxy policy to allow traffic to the central security service.


B.

Implement a hierarchical firewall policy that prioritizes the central security service by allowing its connections and directing all other traffic to the subsequent firewall level.


C.

Create a central project to manage Shared VPC networks which will be accessible to all other projects. Administer all firewall rules centrally within this project.


D.

Use Terraform to automate the creation of the required firewall rule in all projects. Restrict rule change permissions solely to the Terraform service account.


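Option B's hierarchy can be sketched with hierarchical firewall policies: a high-priority org-level rule allows traffic to the central security service, and a goto_next rule delegates all remaining traffic to the project-level rules that developers control. The policy name, organization ID, and service CIDR below are placeholders:

```shell
# Create an organization-level firewall policy.
gcloud compute firewall-policies create \
  --short-name=central-security --organization=123456789012

# Always allow egress to the central security service (placeholder CIDR).
gcloud compute firewall-policies rules create 100 \
  --firewall-policy=central-security --organization=123456789012 \
  --direction=EGRESS --action=allow \
  --dest-ip-ranges=10.100.0.0/24 --layer4-configs=tcp:443

# Delegate everything else to lower levels (project VPC firewall rules).
gcloud compute firewall-policies rules create 200 \
  --firewall-policy=central-security --organization=123456789012 \
  --direction=EGRESS --action=goto_next --dest-ip-ranges=0.0.0.0/0
```

Because developers cannot override the org-level allow, their project rules cannot accidentally block the service, yet they keep autonomy over everything delegated by goto_next.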
Question # 57:

Your company has been creating users manually in Cloud Identity to provide access to Google Cloud resources. Due to continued growth of the environment, you want to authorize the Google Cloud Directory Sync (GCDS) instance and integrate it with your on-premises LDAP server to onboard hundreds of users. You are required to:

Replicate user and group lifecycle changes from the on-premises LDAP server in Cloud Identity.

Disable any manually created users in Cloud Identity.

You have already configured the LDAP search attributes to include the users and security groups in scope for Google Cloud. What should you do next to complete this solution?

Options:

A.

1. Configure the option to suspend domain users not found in LDAP.
2. Set up a recurring GCDS task.


B.

1. Configure the option to delete domain users not found in LDAP.
2. Run GCDS after user and group lifecycle changes.


C.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP.
2. Set up a recurring GCDS task.


D.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP.
2. Run GCDS after user and group lifecycle changes.


Question # 58:

A customer is collaborating with another company to build an application on Compute Engine. The customer is building the application tier in their GCP Organization, and the other company is building the storage tier in a different GCP Organization. This is a 3-tier web application. Communication between portions of the application must not traverse the public internet by any means.

Which connectivity option should be implemented?

Options:

A.

VPC peering


B.

Cloud VPN


C.

Cloud Interconnect


D.

Shared VPC


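For reference on option A: VPC Network Peering can connect networks that belong to different projects and different organizations, and peered traffic stays on Google's network. Each side creates a peering toward the other, and the peering becomes active once both exist (the network and project names below are placeholders):

```shell
# From the application-tier project: peer toward the storage tier's VPC.
gcloud compute networks peerings create app-to-storage \
  --network=app-vpc \
  --peer-project=storage-org-project \
  --peer-network=storage-vpc

# The storage-tier project must create the matching peering back toward
# app-vpc before the connection becomes ACTIVE.
```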
Question # 59:

You are working with a network engineer at your company who is extending a large BigQuery-based data analytics application. Currently, all of the data for that application is ingested from on-premises applications over a Dedicated Interconnect connection with a 20Gbps capacity. You need to onboard a data source on Microsoft Azure that requires a daily ingestion of approximately 250 TB of data. You need to ensure that the data gets transferred securely and efficiently. What should you do?

Options:

A.

Establish a Cross-Cloud Interconnect connection between Microsoft Azure and Google Cloud. Configure a network route over this connection to transfer the data.


B.

Establish a VPN connection with the Microsoft Azure subscription where the source application is running. Transfer the data through the VPN connection.


C.

Use the existing Dedicated Interconnect connection through the on-premises network and establish connectivity to Microsoft Azure.


D.

Set up an SFTP server with a public IP address that runs on a VM in your Google Cloud project. Connect from Microsoft Azure to this server.


Question # 60:

You are using Security Command Center (SCC) to protect your workloads and receive alerts for suspected security breaches at your company. You need to detect cryptocurrency mining software. Which SCC service should you use?

Options:

A.

Web Security Scanner


B.

Container Threat Detection


C.

Rapid Vulnerability Detection


D.

Virtual Machine Threat Detection

