
Pass the Microsoft Certified: Fabric Analytics Engineer Associate (DP-600) questions and answers with CertsForce

Viewing page 2 out of 4 pages
Viewing questions 11-20
Question # 11:

You have a Fabric tenant that contains a semantic model. The model contains 15 tables.

You need to programmatically change each column that ends in the word Key to meet the following requirements:

• Hide the column.

• Set Nullable to False.

• Set Summarize By to None.

• Set Available in MDX to False.

• Mark the column as a key column.

What should you use?

Options:

A.

Microsoft Power BI Desktop


B.

Tabular Editor


C.

ALM Toolkit


D.

DAX Studio


Question # 12:

You have a Fabric tenant that contains a lakehouse named lakehouse1. Lakehouse1 contains a table named Table1.

You are creating a new data pipeline.

You plan to copy external data to Table1. The schema of the external data changes regularly.

You need the copy operation to meet the following requirements:

• Replace Table1 with the schema of the external data.

• Replace all the data in Table1 with the rows in the external data.

You add a Copy data activity to the pipeline. What should you do for the Copy data activity?

Options:

A.

From the Source tab, add additional columns.


B.

From the Destination tab, set Table action to Overwrite.


C.

From the Settings tab, select Enable staging.


D.

From the Source tab, select Enable partition discovery.


E.

From the Source tab, select Recursively.


Question # 13:

You have a Fabric tenant that contains a lakehouse named lakehouse1. Lakehouse1 contains a Delta table named Customer.

When you query Customer, you discover that the query is slow to execute. You suspect that maintenance was NOT performed on the table.

You need to identify whether maintenance tasks were performed on Customer.

Solution: You run the following Spark SQL statement:

DESCRIBE HISTORY customer

Does this meet the goal?
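For context, DESCRIBE HISTORY returns the table's Delta transaction log, one row per operation, so checking for maintenance amounts to scanning the operation column for entries such as OPTIMIZE or VACUUM. A minimal sketch in plain Python, where the history rows are hypothetical stand-ins for the statement's output:

```python
# Hypothetical rows standing in for: spark.sql("DESCRIBE HISTORY customer")
# Each row's "operation" field names the action recorded in the Delta log.
history = [
    {"version": 3, "operation": "WRITE"},
    {"version": 2, "operation": "OPTIMIZE"},
    {"version": 1, "operation": "VACUUM END"},
    {"version": 0, "operation": "CREATE TABLE"},
]

# Operations that indicate table maintenance was run.
MAINTENANCE_OPS = {"OPTIMIZE", "VACUUM START", "VACUUM END"}

def maintenance_performed(rows):
    """Return True if any history entry records a maintenance operation."""
    return any(row["operation"] in MAINTENANCE_OPS for row in rows)

print(maintenance_performed(history))  # True for the sample history above
```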

Options:

A.

Yes


B.

No


Question # 14:

You need to migrate the Research division data for Productline2. The solution must meet the data preparation requirements. How should you complete the code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

(The answer area for Question # 14 is an image and is not reproduced here.)


Question # 15:

Which syntax should you use in a notebook to access the Research division data for Productline1?

(The four syntax options, A through D, are shown as code images and are not reproduced here.)

Options:

A.

Option A


B.

Option B


C.

Option C


D.

Option D


Question # 16:

What should you recommend using to ingest the customer data into the data store in the AnalyticsPOC workspace?

Options:

A.

a stored procedure


B.

a pipeline that contains a KQL activity


C.

a Spark notebook


D.

a dataflow


Question # 17:

You need to implement the date dimension in the data store. The solution must meet the technical requirements.

What are two ways to achieve the goal? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

Options:

A.

Populate the date dimension table by using a dataflow.


B.

Populate the date dimension table by using a Stored procedure activity in a pipeline.


C.

Populate the date dimension view by using T-SQL.


D.

Populate the date dimension table by using a Copy activity in a pipeline.
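Whichever option populates the table, the rows themselves are typically generated the same way: one record per calendar day with a surrogate key and date attributes. A minimal sketch in Python of the kind of rows a dataflow or Stored procedure activity would load; the column names (DateKey, Weekday, etc.) are illustrative assumptions, not a prescribed schema:

```python
from datetime import date, timedelta

def build_date_dimension(start, end):
    """Generate one row per calendar day between start and end, inclusive."""
    rows = []
    d = start
    while d <= end:
        rows.append({
            "DateKey": int(d.strftime("%Y%m%d")),  # surrogate key, e.g. 20240101
            "Date": d.isoformat(),
            "Year": d.year,
            "Month": d.month,
            "Day": d.day,
            "Weekday": d.strftime("%A"),
        })
        d += timedelta(days=1)
    return rows

dim = build_date_dimension(date(2024, 1, 1), date(2024, 1, 3))
print(len(dim))  # 3 rows, one per day
```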


Question # 18:

You need to ensure the data loading activities in the AnalyticsPOC workspace are executed in the appropriate sequence. The solution must meet the technical requirements.

What should you do?

Options:

A.

Create a pipeline that has dependencies between activities and schedule the pipeline.


B.

Create and schedule a Spark job definition.


C.

Create a dataflow that has multiple steps and schedule the dataflow.


D.

Create and schedule a Spark notebook.


Question # 19:

Which type of data store should you recommend in the AnalyticsPOC workspace?

Options:

A.

a data lake


B.

a warehouse


C.

a lakehouse


D.

an external Hive metastore


Question # 20:

You need to recommend a solution to prepare the tenant for the PoC.

Which two actions should you recommend performing from the Fabric Admin portal? Each correct answer presents part of the solution.

NOTE: Each correct answer is worth one point.

Options:

A.

Enable the Users can try Microsoft Fabric paid features option for specific security groups.


B.

Enable the Allow Azure Active Directory guest users to access Microsoft Fabric option for specific security groups.


C.

Enable the Users can create Fabric items option and exclude specific security groups.


D.

Enable the Users can try Microsoft Fabric paid features option for the entire organization.


E.

Enable the Users can create Fabric items option for specific security groups.

