
Complete Databricks-Certified-Associate-Developer-for-Apache-Spark-3.0 Databricks Materials

Question 24

Which of the following code blocks returns only rows from DataFrame transactionsDf in which values in column productId are unique?

Options:

A.

transactionsDf.distinct("productId")

B.

transactionsDf.dropDuplicates(subset=["productId"])

C.

transactionsDf.drop_duplicates(subset="productId")

D.

transactionsDf.unique("productId")

E.

transactionsDf.dropDuplicates(subset="productId")
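
For reference, a minimal PySpark sketch of the subset-based deduplication named in option B, assuming a locally created SparkSession and a small hypothetical stand-in for transactionsDf. Note that dropDuplicates expects its subset argument to be a list of column names and keeps one representative row per distinct productId value.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dedupe-sketch").getOrCreate()

# Hypothetical sample data standing in for transactionsDf
transactionsDf = spark.createDataFrame(
    [(1, "p1", 10.0), (2, "p1", 12.5), (3, "p2", 7.0)],
    ["transactionId", "productId", "value"],
)

# subset takes a list of column names; one row is kept per distinct productId
deduped = transactionsDf.dropDuplicates(subset=["productId"])
deduped.show()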

Question 25

The code block displayed below contains an error. The code block should return a copy of DataFrame transactionsDf where the name of column transactionId has been changed to transactionNumber. Find the error.

Code block:

transactionsDf.withColumn("transactionNumber", "transactionId")

Options:

A.

The arguments to the withColumn method need to be reordered.

B.

The arguments to the withColumn method need to be reordered and the copy() operator should be appended to the code block to ensure a copy is returned.

C.

The copy() operator should be appended to the code block to ensure a copy is returned.

D.

Each column name needs to be wrapped in the col() method and method withColumn should be replaced by method withColumnRenamed.

E.

The method withColumn should be replaced by method withColumnRenamed and the arguments to the method need to be reordered.
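
For reference, a minimal PySpark sketch of the corrected rename described in option E, assuming a locally created SparkSession and a hypothetical stand-in for transactionsDf. withColumnRenamed takes the existing column name first and the new name second, and returns a new DataFrame.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rename-sketch").getOrCreate()

# Hypothetical sample data standing in for transactionsDf
transactionsDf = spark.createDataFrame(
    [(1, "p1"), (2, "p2")],
    ["transactionId", "productId"],
)

# withColumnRenamed(existingName, newName) renames the column without adding a new one
renamed = transactionsDf.withColumnRenamed("transactionId", "transactionNumber")
renamed.printSchema()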

Question 26

The code block shown below should write DataFrame transactionsDf as a parquet file to path storeDir, using brotli compression and replacing any previously existing file. Choose the answer that correctly fills the blanks in the code block to accomplish this.

transactionsDf.__1__.format("parquet").__2__(__3__).option(__4__, "brotli").__5__(storeDir)

Options:

A.

1. save

2. mode

3. "ignore"

4. "compression"

5. path

B.

1. store

2. with

3. "replacement"

4. "compression"

5. path

C.

1. write

2. mode

3. "overwrite"

4. "compression"

5. save

(Correct)

D.

1. save

2. mode

3. "replace"

4. "compression"

5. path

E.

1. write

2. mode

3. "overwrite"

4. compression

5. parquet
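
For reference, a minimal PySpark sketch of the write shown in option C, assuming a locally created SparkSession, a hypothetical stand-in for transactionsDf, and a hypothetical storeDir path. mode("overwrite") replaces any previously written output, and the "compression" option selects the parquet codec; brotli works only if the Brotli codec is available at runtime.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("write-sketch").getOrCreate()

# Hypothetical sample data standing in for transactionsDf
transactionsDf = spark.createDataFrame(
    [(1, "p1", 10.0), (2, "p2", 7.0)],
    ["transactionId", "productId", "value"],
)

storeDir = "/tmp/transactions_parquet"  # hypothetical output path

# write returns a DataFrameWriter; overwrite replaces any existing output at storeDir
(transactionsDf.write
    .format("parquet")
    .mode("overwrite")
    .option("compression", "brotli")  # requires the Brotli codec on the cluster
    .save(storeDir))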

Question 27

Which of the following code blocks uses a schema fileSchema to read a parquet file at location filePath into a DataFrame?

Options:

A.

spark.read.schema(fileSchema).format("parquet").load(filePath)

B.

spark.read.schema("fileSchema").format("parquet").load(filePath)

C.

spark.read().schema(fileSchema).parquet(filePath)

D.

spark.read().schema(fileSchema).format(parquet).load(filePath)

E.

spark.read.schema(fileSchema).open(filePath)
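
For reference, a minimal PySpark sketch of the schema-based read in option A, assuming a locally created SparkSession, a hypothetical filePath, and a hypothetical fileSchema. spark.read is a property (no parentheses), and schema() accepts a StructType.

from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, LongType, StringType

spark = SparkSession.builder.appName("read-sketch").getOrCreate()

filePath = "/tmp/transactions_parquet"  # hypothetical input path

# Hypothetical schema standing in for fileSchema
fileSchema = StructType([
    StructField("transactionId", LongType(), True),
    StructField("productId", StringType(), True),
])

df = spark.read.schema(fileSchema).format("parquet").load(filePath)
df.printSchema()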

Exam Name: Databricks Certified Associate Developer for Apache Spark 3.0 Exam
Last Update: Nov 21, 2024
Questions: 180