
Answer-first summary for fast verification
Answer: mlflow.log_metric
The correct answer is `mlflow.log_metric`, which records a named metric value (optionally with a step number) during an active run, making it the right tool for tracking and comparing model performance across runs. `mlflow.log_artifact` (Option B) logs files such as plots, models, or data; it complements metric logging (and is how visualizations are typically attached to a run) but is not designed for scalar metrics. `mlflow.start_run` (Option A) opens a new run context but does not log metrics by itself. `mlflow.compare_runs` (Option C) is not part of the MLflow API; runs are compared retrospectively through the MLflow UI or with `mlflow.search_runs` after metrics have been logged. For logging metrics during each model run, `mlflow.log_metric` is the appropriate choice.
Author: LeetQuiz Editorial Team
In the context of developing a workflow that involves training multiple models and comparing their performance, which MLflow feature is most suitable for logging metrics and visualizations for each model run?
A
mlflow.start_run
B
mlflow.log_artifact
C
mlflow.compare_runs
D
mlflow.log_metric