The code block shown below should set the number of partitions that Spark uses when shuffling data for joins or aggregations to 100. Choose the answer that correctly fills in the blanks in the code block to accomplish this.
spark.sql.shuffle.partitions
__1__.__2__.__3__(__4__, 100)
Correct code block:
spark.conf.set('spark.sql.shuffle.partitions', 100)
The code block expresses the option incorrectly.
Correct! The option should be expressed as a string.
The code block sets the wrong option.
No, spark.sql.shuffle.partitions is the correct option for the use case in the question.
The code block sets the incorrect number of partitions.
No, the code block correctly sets the number of partitions to 100.
The code block uses the wrong command for setting an option.
No, in PySpark spark.conf.set() is the correct command for setting an option.
The code block is missing a parameter.
Incorrect, spark.conf.set() takes two parameters.
More info: Configuration - Spark 3.1.2 Documentation
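For reference, here is a minimal PySpark sketch of the filled-in code block. The explicit SparkSession bootstrap and the app name are assumptions for a standalone script; in a Databricks notebook the spark object is already provided.

from pyspark.sql import SparkSession

# Assumption: build or reuse a session when running outside a notebook;
# in Databricks, `spark` already exists and this line is unnecessary.
spark = SparkSession.builder.appName("shuffle-partitions-demo").getOrCreate()

# Blanks filled in: 1 = spark, 2 = conf, 3 = set, 4 = "spark.sql.shuffle.partitions"
spark.conf.set("spark.sql.shuffle.partitions", 100)

# conf.get returns the configured value as a string, so this prints "100".
print(spark.conf.get("spark.sql.shuffle.partitions"))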