
Pass the Databricks Certification Databricks-Certified-Data-Engineer-Associate Questions and Answers with CertsForce

Question # 41:

A data engineer manages multiple external tables linked to various data sources. The data engineer wants to manage these external tables efficiently and ensure that only the necessary permissions are granted to users for accessing specific external tables.

How should the data engineer manage access to these external tables?

Options:

A.

Create a single user role with full access to all external tables and assign it to all users.


B.

Use Unity Catalog to manage access controls and permissions for each external table individually.


C.

Set up Azure Blob Storage permissions at the container level, allowing access to all external tables.


D.

Grant permissions on the Databricks workspace level, which will automatically apply to all external tables.


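For context, Unity Catalog lets privileges be granted per securable object, so each external table can carry its own grants. Below is a minimal notebook sketch, assuming a Unity Catalog-enabled workspace where spark is the ambient SparkSession; the catalog, schema, table, and group names are hypothetical.

# Grant read access on one external table to one group (names are hypothetical).
spark.sql("GRANT SELECT ON TABLE main.sales.external_orders TO `data_analysts`")

# Remove access that is no longer needed.
spark.sql("REVOKE SELECT ON TABLE main.sales.external_orders FROM `contractors`")

# Review the current grants on that table.
spark.sql("SHOW GRANTS ON TABLE main.sales.external_orders").show(truncate=False)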
Question # 42:

Which of the following commands will return the location of database customer360?

Options:

A.

DESCRIBE LOCATION customer360;


B.

DROP DATABASE customer360;


C.

DESCRIBE DATABASE customer360;


D.

ALTER DATABASE customer360 SET DBPROPERTIES ('location' = '/user');


E.

USE DATABASE customer360;


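For context, DESCRIBE DATABASE (also written DESCRIBE SCHEMA) reports a database's metadata, including its location. A minimal sketch run from a notebook, using the database name from the question; spark is assumed to be the ambient SparkSession.

# Output includes the database's comment, owner, and location.
spark.sql("DESCRIBE DATABASE customer360").show(truncate=False)

# EXTENDED additionally lists any database properties.
spark.sql("DESCRIBE DATABASE EXTENDED customer360").show(truncate=False)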
Question # 43:

A data engineer is getting a partner organization up to speed with a Databricks account. Both teams share some business use cases. The data engineer has to share some Unity Catalog-managed Delta tables, and the notebook jobs that create those tables, with the partner organization.

How can the data engineer seamlessly share the required information?

Options:

A.

Zip all the code, share it via email, and allow data ingestion from the data lake


B.

Data and notebooks can be shared simply using Unity Catalog.


C.

Share access to the codebase via GitHub and allow them to ingest datasets from the data lake.


D.

Share required datasets and notebooks via Delta Sharing. Manage permissions via Unity Catalog.


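For context, Delta Sharing exposes selected Unity Catalog tables (and, for Databricks-to-Databricks shares, notebook files) to an external recipient, with permissions managed through Unity Catalog. A minimal sketch of the table-sharing side, assuming the provider holds the CREATE SHARE and CREATE RECIPIENT privileges; the share, recipient, and table names are hypothetical.

# Create a share and add a Unity Catalog-managed table to it.
spark.sql("CREATE SHARE IF NOT EXISTS partner_share")
spark.sql("ALTER SHARE partner_share ADD TABLE main.sales.orders")

# Register the partner organization as a recipient and grant it access to the share.
spark.sql("CREATE RECIPIENT IF NOT EXISTS partner_org")
spark.sql("GRANT SELECT ON SHARE partner_share TO RECIPIENT partner_org")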
Question # 44:

What is the primary function of the Silver layer in the Databricks medallion architecture?

Options:

A.

Ingest raw data in its original state


B.

Validate, clean, and deduplicate data for further processing


C.

Aggregate and enrich data for business analytics


D.

Store historical data solely for auditing purposes


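For context, the Silver layer in the medallion architecture typically holds data that has been validated, cleaned, and deduplicated from Bronze, ahead of Gold-level aggregation. A minimal PySpark sketch of a Bronze-to-Silver step; the table and column names are hypothetical.

from pyspark.sql import functions as F

# Read the raw Bronze table (hypothetical name).
bronze_df = spark.read.table("main.demo.orders_bronze")

silver_df = (
    bronze_df
    .filter(F.col("order_id").isNotNull())                # validate: drop records missing the key
    .withColumn("order_ts", F.to_timestamp("order_ts"))   # clean: normalize types
    .dropDuplicates(["order_id"])                          # deduplicate on the business key
)

# Write the refined result to the Silver table.
silver_df.write.mode("overwrite").saveAsTable("main.demo.orders_silver")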
Question # 45:

A data engineer has configured a Structured Streaming job to read from a table, manipulate the data, and then perform a streaming write into a new table.

The code block used by the data engineer is below:

(Code block not shown: the streaming write's trigger clause is left blank.)

If the data engineer only wants the query to process all of the available data in as many batches as required, which of the following lines of code should the data engineer use to fill in the blank?

Options:

A.

processingTime(1)


B.

trigger(availableNow=True)


C.

trigger(parallelBatch=True)


D.

trigger(processingTime="once")


E.

trigger(continuous="once")


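For context, an available-now trigger tells Structured Streaming to process everything currently available, splitting the work across as many micro-batches as needed, and then stop. A minimal sketch of such a job (Spark 3.3+ / a recent Databricks Runtime); the table names, transformation, and checkpoint path are hypothetical.

from pyspark.sql import functions as F

(
    spark.readStream.table("main.demo.orders_silver")              # streaming read from a table
    .withColumn("amount_usd", F.col("amount") * F.lit(1.0))        # example transformation
    .writeStream
    .trigger(availableNow=True)                                    # process all available data, then stop
    .option("checkpointLocation", "/tmp/checkpoints/orders_gold")  # hypothetical checkpoint path
    .toTable("main.demo.orders_gold")                              # streaming write into a new table
)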