Pass the Google Cloud Platform Associate Data Practitioner questions and answers with CertsForce

Viewing page 2 out of 4 pages
Viewing questions 11-20
Question # 11:

You are a database administrator managing sales transaction data by region stored in a BigQuery table. You need to ensure that each sales representative can only see the transactions in their region. What should you do?

Options:

A.

Add a policy tag in BigQuery.


B.

Create a row-level access policy.


C.

Create a data masking rule.


D.

Grant the appropriate IAM permissions on the dataset.
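
For reference, option B's row-level access policy is created with BigQuery DDL. A minimal sketch follows; the project, dataset, table, column, and group names are placeholders, not taken from the question.

```python
# Hypothetical BigQuery row access policy DDL; all identifiers are
# placeholders. The FILTER USING predicate decides which rows a
# grantee can see at query time.
policy_sql = """
CREATE ROW ACCESS POLICY us_region_filter
ON `my-project.sales.transactions`
GRANT TO ('group:us-sales-reps@example.com')
FILTER USING (region = 'US');
""".strip()

print(policy_sql.splitlines()[0])
```

One such policy would be created per region, each granting a different group access to its own rows.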


Question # 12:

Your company uses Looker to generate and share reports with various stakeholders. You have a complex dashboard with several visualizations that needs to be delivered to specific stakeholders on a recurring basis, with customized filters applied for each recipient. You need an efficient and scalable solution to automate the delivery of this customized dashboard. You want to follow the Google-recommended approach. What should you do?

Options:

A.

Create a separate LookML model for each stakeholder with predefined filters, and schedule the dashboards using the Looker Scheduler.


B.

Create a script using the Looker Python SDK, and configure user attribute filter values. Generate a new scheduled plan for each stakeholder.


C.

Embed the Looker dashboard in a custom web application, and use the application's scheduling features to send the report with personalized filters.


D.

Use the Looker Scheduler with a user attribute filter on the dashboard, and send the dashboard with personalized filters to each stakeholder based on their attributes.
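
To make option B concrete, a Looker Python SDK script would build one scheduled-plan payload per stakeholder. The sketch below shows only the per-recipient data shaping in plain Python dicts; the dashboard ID, emails, the "region" user attribute, and the field names are illustrative placeholders, and the actual SDK call is omitted.

```python
# Hypothetical per-recipient scheduled-plan payloads; every value here is
# a placeholder, and a real script would pass equivalent data to the
# Looker Python SDK rather than print it.
stakeholders = {
    "amy@example.com": {"region": "EMEA"},
    "bob@example.com": {"region": "APAC"},
}

plans = [
    {
        "name": f"sales-dashboard-{email}",
        "dashboard_id": "42",
        "filters_string": f"region={attrs['region']}",
        "recipients": [email],
        "crontab": "0 8 * * 1",  # every Monday at 08:00
    }
    for email, attrs in stakeholders.items()
]
print(len(plans))
```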


Question # 13:

You manage a Cloud Storage bucket that stores temporary files created during data processing. These temporary files are only needed for seven days, after which they are no longer needed. To reduce storage costs and keep your bucket organized, you want to automatically delete these files once they are older than seven days. What should you do?

Options:

A.

Set up a Cloud Scheduler job that invokes a weekly Cloud Run function to delete files older than seven days.


B.

Configure a Cloud Storage lifecycle rule that automatically deletes objects older than seven days.


C.

Develop a batch process using Dataflow that runs weekly and deletes files based on their age.


D.

Create a Cloud Run function that runs daily and deletes files older than seven days.
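
Option B's lifecycle rule can be expressed as a small JSON document, roughly the shape accepted by `gsutil lifecycle set` or `gcloud storage buckets update --lifecycle-file`. The bucket it applies to is not part of the configuration itself.

```python
import json

# Age-based delete rule: remove any object more than seven days old.
lifecycle = {
    "rule": [
        {
            "action": {"type": "Delete"},
            "condition": {"age": 7},  # days since object creation
        }
    ]
}
print(json.dumps(lifecycle))
```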


Question # 14:

You work for a financial organization that stores transaction data in BigQuery. Your organization has a regulatory requirement to retain data for a minimum of seven years for auditing purposes. You need to ensure that the data is retained for seven years using an efficient and cost-optimized approach. What should you do?

Options:

A.

Create a partition by transaction date, and set the partition expiration policy to seven years.


B.

Set the table-level retention policy in BigQuery to seven years.


C.

Set the dataset-level retention policy in BigQuery to seven years.


D.

Export the BigQuery tables to Cloud Storage daily, and enforce a lifecycle management policy that has a seven-year retention rule.
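
Option A's partition expiration can be set with BigQuery DDL; a sketch follows, with a placeholder table name and seven years approximated as 2555 days.

```python
# Hypothetical DDL setting a seven-year partition expiration on a
# date-partitioned table; the table name is a placeholder.
seven_years_days = 7 * 365  # 2555

partition_expiration_sql = (
    "ALTER TABLE `my-project.finance.transactions` "
    f"SET OPTIONS (partition_expiration_days = {seven_years_days});"
)
print(partition_expiration_sql)
```

Note that partition expiration deletes partitions once they reach that age, whereas a table-level retention policy prevents deletion before it.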


Question # 15:

Your company is migrating their batch transformation pipelines to Google Cloud. You need to choose a solution that supports programmatic transformations using only SQL. You also want the technology to support Git integration for version control of your pipelines. What should you do?

Options:

A.

Use Cloud Data Fusion pipelines.


B.

Use Dataform workflows.


C.

Use Dataflow pipelines.


D.

Use Cloud Composer operators.
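
For context on option B, Dataform pipelines are written as SQLX files kept in a Git-backed repository. A minimal SQLX definition is sketched below as a string; the schema, table, and source names are placeholders.

```python
# Hypothetical Dataform SQLX file contents; in a real repo this would
# live in a definitions/ directory and be version-controlled via Git.
sqlx = """
config {
  type: "table",
  schema: "analytics"
}

SELECT
  order_id,
  SUM(amount) AS total_amount
FROM ${ref("raw_orders")}
GROUP BY order_id
""".strip()

print(sqlx.splitlines()[0])
```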


Question # 16:

Your retail organization stores sensitive application usage data in Cloud Storage. You need to encrypt the data without the operational overhead of managing encryption keys. What should you do?

Options:

A.

Use Google-managed encryption keys (GMEK).


B.

Use customer-managed encryption keys (CMEK).


C.

Use customer-supplied encryption keys (CSEK).


D.

Use customer-supplied encryption keys (CSEK) for the sensitive data and customer-managed encryption keys (CMEK) for the less sensitive data.
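
By way of contrast between the options: Google-managed keys require no configuration at all, while customer-managed keys are attached to a bucket explicitly. The command below is a hedged sketch of the CMEK case; the bucket, project, key ring, and key names are placeholders.

```python
# Hypothetical command attaching a customer-managed key as a bucket's
# default encryption key; all resource names are placeholders. With
# Google-managed keys, no equivalent step exists.
kms_key = (
    "projects/my-project/locations/us-central1/"
    "keyRings/my-ring/cryptoKeys/my-key"
)
cmek_cmd = (
    "gcloud storage buckets update gs://usage-data "
    f"--default-encryption-key={kms_key}"
)
print(cmek_cmd)
```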


Question # 17:

You are a data analyst working with sensitive customer data in BigQuery. You need to ensure that only authorized personnel within your organization can query this data, while following the principle of least privilege. What should you do?

Options:

A.

Enable access control by using IAM roles.


B.

Update dataset privileges by using the SQL GRANT statement.


C.

Export the data to Cloud Storage, and use signed URLs to authorize access.


D.

Encrypt the data by using customer-managed encryption keys (CMEK).
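
Options A and B both express access control: IAM roles via the console or gcloud, and the same grants via BigQuery's SQL DCL. The GRANT form is sketched below with a placeholder dataset and user.

```python
# Hypothetical BigQuery GRANT statement; the dataset and principal are
# placeholders. roles/bigquery.dataViewer is a narrow read-only role,
# consistent with least privilege.
grant_sql = (
    "GRANT `roles/bigquery.dataViewer` "
    "ON SCHEMA `my-project.customer_data` "
    "TO 'user:analyst@example.com';"
)
print(grant_sql)
```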


Question # 18:

Your company is building a near real-time streaming pipeline to process JSON telemetry data from small appliances. You need to process messages arriving at a Pub/Sub topic, capitalize letters in the serial number field, and write results to BigQuery. You want to use a managed service and write a minimal amount of code for underlying transformations. What should you do?

Options:

A.

Use a Pub/Sub to BigQuery subscription, write results directly to BigQuery, and schedule a transformation query to run every five minutes.


B.

Use a Pub/Sub to Cloud Storage subscription, write a Cloud Run service that is triggered when objects arrive in the bucket, performs the transformations, and writes the results to BigQuery.


C.

Use the “Pub/Sub to BigQuery” Dataflow template with a UDF, and write the results to BigQuery.


D.

Use a Pub/Sub push subscription, write a Cloud Run service that accepts the messages, performs the transformations, and writes the results to BigQuery.
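
To illustrate the transformation all four options must perform: the Dataflow template in option C takes a JavaScript UDF, but the same logic is shown here in Python. The `serial_number` field name is an assumption about the telemetry schema.

```python
import json

# Uppercase the serial number field of one JSON telemetry message.
# The field name serial_number is an assumed part of the schema; a
# Dataflow template UDF would express this same logic in JavaScript.
def uppercase_serial(message: str) -> str:
    record = json.loads(message)
    record["serial_number"] = record["serial_number"].upper()
    return json.dumps(record)

print(uppercase_serial('{"serial_number": "ab-123"}'))
```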


Question # 19:

You manage a large amount of data in Cloud Storage, including raw data, processed data, and backups. Your organization is subject to strict compliance regulations that mandate data immutability for specific data types. You want to use an efficient process to reduce storage costs while ensuring that your storage strategy meets retention requirements. What should you do?

Options:

A.

Configure lifecycle management rules to transition objects to appropriate storage classes based on access patterns. Set up Object Versioning for all objects to meet immutability requirements.


B.

Move objects to different storage classes based on their age and access patterns. Use Cloud Key Management Service (Cloud KMS) to encrypt specific objects with customer-managed encryption keys (CMEK) to meet immutability requirements.


C.

Create a Cloud Run function to periodically check object metadata, and move objects to the appropriate storage class based on age and access patterns. Use object holds to enforce immutability for specific objects.


D.

Use object holds to enforce immutability for specific objects, and configure lifecycle management rules to transition objects to appropriate storage classes based on age and access patterns.
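
Option D combines two mechanisms, sketched below: a per-object hold (shown as a placeholder command) and an age-based lifecycle configuration that steps objects down to colder storage classes. Bucket, object names, and the age thresholds are illustrative.

```python
import json

# Hypothetical hold on a single object; bucket and object names are
# placeholders. A held object cannot be deleted until the hold is released.
hold_cmd = "gcloud storage objects update gs://archive/backup.db --temporary-hold"

# Age-based transitions to cheaper storage classes; thresholds are
# illustrative, not from the question.
lifecycle = {
    "rule": [
        {
            "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
            "condition": {"age": 90},
        },
        {
            "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
            "condition": {"age": 365},
        },
    ]
}
print(json.dumps(lifecycle["rule"][0]["action"]))
```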


Question # 20:

Your organization uses Dataflow pipelines to process real-time financial transactions. You discover that one of your Dataflow jobs has failed. You need to troubleshoot the issue as quickly as possible. What should you do?

Options:

A.

Set up a Cloud Monitoring dashboard to track key Dataflow metrics, such as data throughput, error rates, and resource utilization.


B.

Create a custom script to periodically poll the Dataflow API for job status updates, and send email alerts if any errors are identified.


C.

Navigate to the Dataflow Jobs page in the Google Cloud console. Use the job logs and worker logs to identify the error.


D.

Use the gcloud CLI tool to retrieve job metrics and logs, and analyze them for errors and performance bottlenecks.
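
For comparison with the console route, option D's CLI workflow might look roughly like the commands below; JOB_ID and the region are placeholders.

```python
# Hypothetical gcloud commands for inspecting a failed Dataflow job;
# JOB_ID and the region are placeholders.
commands = [
    "gcloud dataflow jobs list --region=us-central1 --status=failed",
    "gcloud dataflow jobs describe JOB_ID --region=us-central1",
]
for cmd in commands:
    print(cmd)
```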

