
Google Cloud Certified Professional-Cloud-Security-Engineer Questions and Answers with CertsForce

Viewing page 6 out of 9 pages
Viewing questions 51-60
Question #51:

Your organization has 3 TB of information in BigQuery and Cloud SQL. You need to develop a cost-effective, scalable, and secure strategy to anonymize the personally identifiable information (PII) that exists today. What should you do?

Options:

A.

Scan your BigQuery and Cloud SQL data using the Cloud DLP data profiling feature. Use the data profiling results to create a de-identification strategy with either Cloud Sensitive Data Protection's de-identification templates or custom configurations.


B.

Create a new BigQuery dataset and Cloud SQL instance. Copy a small subset of the data to these new locations. Use Cloud Data Loss Prevention API to scan this subset for PII. Based on the results, create a custom anonymization script and apply the script to the entire 3 TB dataset in the original locations.


C.

Export all 3 TB of data from BigQuery and Cloud SQL to Cloud Storage. Use Cloud Sensitive Data Protection to anonymize the exported data. Re-import the anonymized data back into BigQuery and Cloud SQL.


D.

Inspect a representative sample of the data in BigQuery and Cloud SQL to identify PII. Based on this analysis, develop a custom script to anonymize the identified PII.


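The profiling-then-template approach in option A can be sketched as the request body a team might hand to the Sensitive Data Protection (DLP) API. This is a minimal illustration following the public DeidentifyTemplate REST schema; the template name and the specific infoTypes are illustrative assumptions, not part of the question.

```python
# Sketch of a DLP de-identification template body (option A's approach).
# Field names follow the public DeidentifyTemplate REST schema; the
# display name and infoTypes below are illustrative placeholders.
deidentify_template = {
    "displayName": "pii-masking-template",  # hypothetical name
    "deidentifyConfig": {
        "infoTypeTransformations": {
            "transformations": [
                {
                    # Replace each finding with its infoType name,
                    # e.g. "alice@example.com" -> "[EMAIL_ADDRESS]".
                    "infoTypes": [
                        {"name": "EMAIL_ADDRESS"},
                        {"name": "PHONE_NUMBER"},
                    ],
                    "primitiveTransformation": {
                        "replaceWithInfoTypeConfig": {}
                    },
                }
            ]
        }
    },
}

transform = deidentify_template["deidentifyConfig"][
    "infoTypeTransformations"]["transformations"][0]
```

Because the template lives server-side, the same de-identification strategy can be reused across BigQuery and Cloud SQL scans without custom scripting.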
Question #52:

Your team sets up a Shared VPC Network where project co-vpc-prod is the host project. Your team has configured the firewall rules, subnets, and VPN gateway on the host project. They need to enable Engineering Group A to attach a Compute Engine instance to only the 10.1.1.0/24 subnet.

What should your team grant to Engineering Group A to meet this requirement?

Options:

A.

Compute Network User Role at the host project level.


B.

Compute Network User Role at the subnet level.


C.

Compute Shared VPC Admin Role at the host project level.


D.

Compute Shared VPC Admin Role at the service project level.


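The distinction the question turns on is the scope of the grant: a subnet-level binding of the Compute Network User role limits the group to one subnet, while a host-project-level binding would expose every subnet. A sketch of that subnet-scoped binding, with placeholder region and group names, looks like this:

```python
# Sketch of the subnet-scoped IAM binding behind option B: granting
# roles/compute.networkUser on a single subnet of the host project
# rather than on the whole project. The region, subnet name, and
# group address are made-up placeholders.
subnet_resource = (
    "projects/co-vpc-prod/regions/us-central1/subnetworks/subnet-10-1-1-0"
)
binding = {
    "resource": subnet_resource,
    "role": "roles/compute.networkUser",
    "members": ["group:engineering-group-a@example.com"],
}
```

With this binding in place, members of the group can attach instances only to the 10.1.1.0/24 subnet, not to any other subnet in the Shared VPC.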
Question #53:

You are managing a set of Google Cloud projects that are contained in a folder named Data Warehouse. A new data analysis team has been approved to perform data analysis for all BigQuery data in the projects within the Data Warehouse folder. They should only be able to read the data and not have permissions to modify or delete it. You want to reduce the operational overhead of provisioning access while adhering to the principle of least privilege. What should you do?

Options:

A.

Grant the BigQuery Data Viewer role at the dataset level for each BigQuery dataset within each project in the Data Warehouse folder.


B.

Grant the BigQuery Data Viewer role at the Data Warehouse folder.


C.

Grant the BigQuery Data Viewer role at the project level for each project within the Data Warehouse folder.


D.

Grant the BigQuery Metadata Viewer role at the Data Warehouse folder.


Question #54:

You are in charge of migrating a legacy application from your company's datacenters to GCP before the current maintenance contract expires. You do not know what ports the application is using, and no documentation is available for you to check. You want to complete the migration without putting your environment at risk.

What should you do?

Options:

A.

Migrate the application into an isolated project using a “Lift & Shift” approach. Enable all internal TCP traffic using VPC Firewall rules. Use VPC Flow Logs to determine what traffic should be allowed for the application to work properly.


B.

Migrate the application into an isolated project using a “Lift & Shift” approach in a custom network. Disable all traffic within the VPC and look at the Firewall logs to determine what traffic should be allowed for the application to work properly.


C.

Refactor the application into a micro-services architecture in a GKE cluster. Disable all traffic from outside the cluster using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.


D.

Refactor the application into a micro-services architecture hosted in Cloud Functions in an isolated project. Disable all traffic from outside your project using Firewall Rules. Use VPC Flow Logs to determine what traffic should be allowed for the application to work properly.


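Whichever option is chosen, the "observe, then tighten" step boils down to reducing flow-log output to the set of destination ports the application actually uses. A toy sketch of that reduction, over fabricated records shaped like flow-log connection fields:

```python
# Sketch of turning VPC Flow Logs observations into candidate firewall
# allow rules. The records below are fabricated examples in the shape
# of flow-log connection fields; real logs arrive via Cloud Logging.
from collections import Counter

sample_flows = [
    {"src_ip": "10.0.0.5", "dest_ip": "10.0.0.9", "dest_port": 443, "protocol": 6},
    {"src_ip": "10.0.0.5", "dest_ip": "10.0.0.9", "dest_port": 443, "protocol": 6},
    {"src_ip": "10.0.0.7", "dest_ip": "10.0.0.9", "dest_port": 5432, "protocol": 6},
]

def observed_ports(flows):
    """Count destination (port, protocol) pairs seen in the flow logs."""
    return Counter((f["dest_port"], f["protocol"]) for f in flows)

ports = observed_ports(sample_flows)
```

The resulting (port, protocol) counts tell you which narrow allow rules to write once the permissive migration window closes.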
Question #55:

An organization is moving applications to Google Cloud while maintaining a few mission-critical applications on-premises. The organization must transfer the data at a bandwidth of at least 50 Gbps. What should they use to ensure secure continued connectivity between sites?

Options:

A.

Dedicated Interconnect


B.

Cloud Router


C.

Cloud VPN


D.

Partner Interconnect


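The bandwidth requirement is what separates the options: Dedicated Interconnect is offered as 10 Gbps or 100 Gbps circuits, so 50 Gbps is reachable with five 10 Gbps links or a single 100 Gbps link, while Cloud VPN tunnels top out far below that. A back-of-envelope check:

```python
# Back-of-envelope sizing for Dedicated Interconnect: circuits come in
# 10 Gbps and 100 Gbps capacities, so meeting a 50 Gbps requirement
# means either bundling 10 Gbps links or using one 100 Gbps link.
import math

def links_needed(required_gbps, link_capacity_gbps):
    """Number of circuits of a given capacity to meet the requirement."""
    return math.ceil(required_gbps / link_capacity_gbps)

ten_gbps_links = links_needed(50, 10)
hundred_gbps_links = links_needed(50, 100)
```

Cloud Router (option B) only exchanges routes and carries no data plane traffic, so it cannot satisfy the requirement on its own.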
Question #56:

A company’s application is deployed with a user-managed Service Account key. You want to use Google-recommended practices to rotate the key.

What should you do?

Options:

A.

Open Cloud Shell and run gcloud iam service-accounts enable-auto-rotate --iam-account=IAM_ACCOUNT.


B.

Open Cloud Shell and run gcloud iam service-accounts keys rotate --iam-account=IAM_ACCOUNT --key=NEW_KEY.


C.

Create a new key, and use the new key in the application. Delete the old key from the Service Account.


D.

Create a new key, and use the new key in the application. Store the old key on the system as a backup key.


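It helps to notice that the gcloud commands in options A and B do not exist; there is no auto-rotate or `keys rotate` subcommand for user-managed keys. Rotation is a manual three-step sequence, sketched here with a placeholder service account and key ID:

```python
# Sketch of the manual rotation sequence in option C. The service
# account address and OLD_KEY_ID are placeholders; `keys create` and
# `keys delete` are real gcloud subcommands, but there is no
# `keys rotate` or `enable-auto-rotate` command.
def rotation_steps(service_account):
    """Return the ordered commands/actions for a manual key rotation."""
    return [
        f"gcloud iam service-accounts keys create new-key.json "
        f"--iam-account={service_account}",
        "# redeploy the application with new-key.json",
        f"gcloud iam service-accounts keys delete OLD_KEY_ID "
        f"--iam-account={service_account}",
    ]

steps = rotation_steps("app-sa@my-project.iam.gserviceaccount.com")
```

Deleting the old key (rather than keeping it as a "backup", option D) is the point of the rotation: a retained old key is still a valid credential.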
Question #57:

Your security team wants to implement a defense-in-depth approach to protect sensitive data stored in a Cloud Storage bucket. Your team has the following requirements:

    The Cloud Storage bucket in Project A can only be readable from Project B.

    The Cloud Storage bucket in Project A cannot be accessed from outside the network.

    Data in the Cloud Storage bucket cannot be copied to an external Cloud Storage bucket.

What should the security team do?

Options:

A.

Enable domain restricted sharing in an organization policy, and enable uniform bucket-level access on the Cloud Storage bucket.


B.

Enable VPC Service Controls, create a perimeter around projects A and B, and include the Cloud Storage API in the service perimeter configuration.


C.

Enable Private Access in both Project A and B's networks with strict firewall rules that allow communication between the networks.


D.

Enable VPC Peering between Project A and B's networks with strict firewall rules that allow communication between the networks.


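A service perimeter that contains both projects and restricts the Cloud Storage API is what blocks access from outside the perimeter and prevents copying data to an external bucket. A sketch of such a perimeter, with placeholder access-policy and project numbers:

```python
# Sketch of the VPC Service Controls perimeter from option B. The
# access policy number and project numbers are placeholders; the
# restricted-service name is the real Cloud Storage API endpoint.
perimeter = {
    "name": "accessPolicies/999/servicePerimeters/storage_perimeter",
    "title": "storage-perimeter",
    "status": {
        "resources": [
            "projects/111111",  # Project A (bucket owner)
            "projects/222222",  # Project B (reader)
        ],
        "restrictedServices": ["storage.googleapis.com"],
    },
}
```

Calls to the Storage API from inside the perimeter (Project B reading Project A's bucket) succeed, while calls that cross the perimeter boundary, including copies to buckets in outside projects, are denied.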
Question #58:

You need to implement an encryption at-rest strategy that reduces key management complexity for non-sensitive data and protects sensitive data while providing the flexibility of controlling the key residency and rotation schedule. FIPS 140-2 L1 compliance is required for all data types. What should you do?

Options:

A.

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.


B.

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service


C.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.


D.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.


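The split the question is steering toward can be stated as a one-line policy: Google default encryption (already backed by a FIPS 140-2 validated module, so no key management at all) for non-sensitive data, and customer-managed Cloud KMS keys where residency and rotation schedule must be controlled. A trivial sketch of that decision:

```python
# Sketch of the at-rest strategy split: default Google encryption for
# non-sensitive data (zero key-management overhead, FIPS 140-2 L1
# validated), customer-managed Cloud KMS keys (CMEK) where the key's
# location and rotation schedule must be controlled.
def key_strategy(sensitive: bool) -> str:
    """Pick an at-rest encryption approach by data sensitivity."""
    return "cloud-kms-cmek" if sensitive else "google-default"

non_sensitive_choice = key_strategy(sensitive=False)
sensitive_choice = key_strategy(sensitive=True)
```

Cloud EKM (options A and C) moves the keys outside Google entirely, which adds operational complexity the question explicitly asks to avoid for this requirement.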
Question #59:

You discovered that sensitive personally identifiable information (PII) is being ingested to your Google Cloud environment in the daily ETL process from an on-premises environment to your BigQuery datasets. You need to redact this data to obfuscate the PII, but need to re-identify it for data analytics purposes. Which components should you use in your solution? (Choose two.)

Options:

A.

Secret Manager


B.

Cloud Key Management Service


C.

Cloud Data Loss Prevention with cryptographic hashing


D.

Cloud Data Loss Prevention with automatic text redaction


E.

Cloud Data Loss Prevention with deterministic encryption using AES-SIV


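The key constraint is reversibility: hashing (option C) and text redaction (option D) are one-way, while deterministic encryption with AES-SIV produces a token that can later be re-identified with the same key, which is why it pairs with Cloud KMS for key wrapping. A sketch of that DLP transformation, with placeholder resource names, following the public CryptoDeterministicConfig schema:

```python
# Sketch of reversible tokenization with DLP deterministic encryption
# (AES-SIV) and a Cloud KMS-wrapped key. The wrapped key value and
# KMS key path are placeholders; the field names follow the public
# DLP API schema. The same plaintext always yields the same token,
# and the token can be re-identified later with the same key.
deidentify_config = {
    "infoTypeTransformations": {
        "transformations": [
            {
                "infoTypes": [{"name": "EMAIL_ADDRESS"}],
                "primitiveTransformation": {
                    "cryptoDeterministicConfig": {
                        "cryptoKey": {
                            "kmsWrapped": {
                                "wrappedKey": "BASE64_WRAPPED_KEY",  # placeholder
                                "cryptoKeyName": (
                                    "projects/p/locations/global/"
                                    "keyRings/kr/cryptoKeys/dlp-key"
                                ),
                            }
                        },
                        "surrogateInfoType": {"name": "EMAIL_TOKEN"},
                    }
                },
            }
        ]
    }
}

crypto = deidentify_config["infoTypeTransformations"]["transformations"][0][
    "primitiveTransformation"]["cryptoDeterministicConfig"]
```

Because the data key is wrapped by Cloud KMS, re-identification for analytics requires unwrap permission on the KMS key, which is the second component the question asks for.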
Question #60:

Employees at your company use their personal computers to access your organization's Google Cloud console. You need to ensure that users can only access the Google Cloud console from their corporate-issued devices and verify that they have a valid enterprise certificate.

What should you do?

Options:

A.

Implement an Identity and Access Management (IAM) conditional policy to verify the device certificate.


B.

Implement a VPC firewall policy. Activate packet inspection and create an allow rule to validate and verify the device certificate.


C.

Implement an organization policy to verify the certificate from the access context.


D.

Implement an Access Policy in BeyondCorp Enterprise to verify the device certificate. Create an access binding with the access policy just created.


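The BeyondCorp Enterprise route works in two pieces: an Access Context Manager access level with a device policy, and an access binding that enforces that level on a user group. A sketch of both, with placeholder policy and group identifiers (the certificate check itself is configured through certificate-based access, which this simplified device policy only gestures at):

```python
# Sketch of option D: an Access Context Manager access level with a
# device policy, plus an access binding enforcing it for a group.
# The policy number, level name, and group key are placeholders;
# requireCorpOwned is a real devicePolicy field, while the enterprise
# certificate check is configured via certificate-based access.
access_level = {
    "name": "accessPolicies/999/accessLevels/corp_device",
    "basic": {
        "conditions": [
            {"devicePolicy": {"requireCorpOwned": True}},
        ]
    },
}

# GcpUserAccessBinding: ties the access level to a Cloud Identity group
# so the Google Cloud console is only reachable from conforming devices.
access_binding = {
    "groupKey": "employees-group-id",  # placeholder Cloud Identity group
    "accessLevels": [access_level["name"]],
}
```

Firewall rules and organization policies (options B and C) operate at the network and resource-configuration layers and cannot evaluate per-device certificates at sign-in, which is what the access binding does.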