Databricks Certified Associate Developer for Apache Spark 3.0 Exam: Topic 5, Question #44 Discussion

Question #: 44
Topic #: 5

The code block displayed below contains an error. The code block should merge the rows of DataFrames transactionsDfMonday and transactionsDfTuesday into a new DataFrame, matching column names and inserting null values where column names do not appear in both DataFrames. Find the error.

Sample of DataFrame transactionsDfMonday:

+-------------+---------+-----+-------+---------+----+
|transactionId|predError|value|storeId|productId|   f|
+-------------+---------+-----+-------+---------+----+
|            5|     null| null|   null|        2|null|
|            6|        3|    2|     25|        2|null|
+-------------+---------+-----+-------+---------+----+

Sample of DataFrame transactionsDfTuesday:

+-------+-------------+---------+-----+
|storeId|transactionId|productId|value|
+-------+-------------+---------+-----+
|     25|            1|        1|    4|
|      2|            2|        2|    7|
|      3|            4|        2| null|
|   null|            5|        2| null|
+-------+-------------+---------+-----+

Code block:

sc.union([transactionsDfMonday, transactionsDfTuesday])


A. The DataFrames' RDDs need to be passed into the sc.union method instead of the DataFrame variable names.

B. Instead of union, the concat method should be used, making sure to not use its default arguments.

C. Instead of the Spark context, transactionsDfMonday should be called with the join method instead of the union method, making sure to use its default arguments.

D. Instead of the Spark context, transactionsDfMonday should be called with the union method.

E. Instead of the Spark context, transactionsDfMonday should be called with the unionByName method instead of the union method, making sure to not use its default arguments.
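
For context on why E is the accepted fix: sc.union is an RDD-level operation and fails when handed DataFrames, while DataFrame.union matches columns purely by position and requires both DataFrames to have the same number of columns, so neither meets the requirement. DataFrame.unionByName resolves columns by name, and its allowMissingColumns argument (available since Spark 3.1, False by default, hence "making sure to not use its default arguments") inserts nulls for columns that exist on only one side. A minimal sketch, assuming a local SparkSession and hypothetical stand-in data shaped like the samples above:

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").getOrCreate()

# Hypothetical stand-ins mirroring the sample DataFrames above.
transactionsDfMonday = spark.createDataFrame(
    [(5, None, None, None, 2, None), (6, 3, 2, 25, 2, None)],
    "transactionId INT, predError INT, value INT, storeId INT, productId INT, f INT",
)
transactionsDfTuesday = spark.createDataFrame(
    [(25, 1, 1, 4), (2, 2, 2, 7), (3, 4, 2, None), (None, 5, 2, None)],
    "storeId INT, transactionId INT, productId INT, value INT",
)

# sc.union expects RDDs, so the original call fails on DataFrames.
# DataFrame.union would also be wrong: it matches columns by position.
combinedDf = transactionsDfMonday.unionByName(
    transactionsDfTuesday,
    allowMissingColumns=True,  # non-default: missing columns are filled with null
)
combinedDf.show()

With allowMissingColumns=True the result keeps all six column names; the rows coming from transactionsDfTuesday carry null in predError and f.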


