
Answer-first summary for fast verification
Answer: mini_batch_size="6"
The question specifies that the ParallelRunStep must process six input files each time the inferencing function is called on a file dataset. Per the Microsoft documentation, when the input is a FileDataset, the mini_batch_size parameter of ParallelRunConfig controls how many files are passed to each run() call, so Option C (mini_batch_size="6") directly matches the requirement. Option A (process_count_per_node) sets the number of parallel worker processes on each node, not the batch size. Option B (node_count) sets the number of compute nodes and is unrelated to how many files each call receives. Option D (error_threshold) sets how many failed items are tolerated before the step aborts, not batch size. Thus, only C is correct.
Author: LeetQuiz Editorial Team
You register a model for use in a batch inference pipeline that processes a file dataset using a ParallelRunStep. The inference script must process six input files each time the inferencing function is called.
Which configuration setting must you specify in the ParallelRunConfig object for the ParallelRunStep?
A. process_count_per_node="6"
B. node_count="6"
C. mini_batch_size="6"
D. error_threshold="6"
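For context, here is a minimal sketch of a ParallelRunConfig that delivers six files per run() call, using the Azure ML SDK v1 (azureml-pipeline-steps). The entry script name, source directory, compute cluster name, and environment are illustrative placeholders, and building this object requires an existing Azure ML workspace:

```python
# Sketch only: assumes an Azure ML workspace, a registered compute cluster,
# and an Environment object (batch_env) defined elsewhere.
from azureml.pipeline.steps import ParallelRunConfig

parallel_run_config = ParallelRunConfig(
    source_directory="scripts",        # placeholder folder holding the entry script
    entry_script="batch_scoring.py",   # placeholder script defining init() and run(mini_batch)
    mini_batch_size="6",               # FileDataset input: six files per run() call
    error_threshold=10,                # failed files tolerated before the step aborts
    output_action="append_row",        # gather run() return values into one output file
    environment=batch_env,             # assumed Environment defined elsewhere
    compute_target="batch-cluster",    # placeholder compute cluster name
    node_count=2,                      # compute nodes to scale across
    process_count_per_node=4,          # worker processes per node
)
```

Note that for a TabularDataset input, mini_batch_size would instead be interpreted as an approximate data size (e.g. "1MB"); the file-count interpretation applies only to FileDataset inputs, which is what this question describes.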