Databricks Certified Associate Developer for Apache Spark 3.5 – Python Databricks-Certified-Associate-Developer-for-Apache-Spark-3.5 Question # 40 Topic 5 Discussion

Question 40 of 55 (Topic 5):

A developer wants to refactor older Spark code to take advantage of the built-in functions available in Spark 3.5.

The original code:

from pyspark.sql import functions as F

min_price = 110.50

result_df = prices_df.filter(F.col("price") > min_price).agg(F.count("*"))

Which code block should the developer use to refactor the code?


A.

result_df = prices_df.filter(F.col("price") > F.lit(min_price)).agg(F.count("*"))


B.

result_df = prices_df.where(F.lit("price") > min_price).groupBy().count()


C.

result_df = prices_df.withColumn("valid_price", when(col("price") > F.lit(min_price), True))


D.

result_df = prices_df.filter(F.lit(min_price) > F.col("price")).count()

