
A data scientist has developed a two-class decision tree classifier using Spark ML and stored its predictions in a Spark DataFrame preds_df with the schema: prediction DOUBLE, actual DOUBLE. Which of the following code blocks correctly computes the model's accuracy from preds_df and assigns it to the accuracy variable?
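For reference, a minimal sketch of one common way to compute accuracy for this kind of DataFrame is shown below, using Spark ML's MulticlassClassificationEvaluator. It assumes preds_df already exists with the schema given in the question; the column names "prediction" and "actual" are taken from that schema, and the variable names are illustrative only.

```python
from pyspark.ml.evaluation import MulticlassClassificationEvaluator

# Sketch: evaluate accuracy on preds_df (prediction DOUBLE, actual DOUBLE).
# labelCol points at the "actual" column from the question's schema.
evaluator = MulticlassClassificationEvaluator(
    predictionCol="prediction",
    labelCol="actual",
    metricName="accuracy",
)

# evaluate() returns the metric as a float, which is assigned to accuracy.
accuracy = evaluator.evaluate(preds_df)
```

The same evaluator class handles binary (two-class) problems when accuracy is the metric of interest; only the metricName parameter needs to change to report other metrics such as f1 or weightedPrecision.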