
You register a model for use in a batch inference pipeline that processes a FileDataset using a ParallelRunStep. The inference script must process six input files each time its scoring function is called.
Which configuration setting must you specify in the ParallelRunConfig object for the ParallelRunStep?
A. process_count_per_node="6"
B. node_count="6"
C. mini_batch_size="6"
D. error_threshold="6"
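To see how these options differ in practice, here is a minimal sketch of a ParallelRunConfig for a FileDataset input. For a FileDataset, mini_batch_size sets how many files are passed to each run() call of the scoring script, while process_count_per_node and node_count only control the degree of parallelism and error_threshold controls failure tolerance. The directory name, script name, environment, and compute target below are assumptions for illustration, not values from the question.

```python
# Sketch assuming azureml-pipeline-steps is installed and that batch_env
# (an Environment) and compute_target (an AmlCompute cluster) already exist.
from azureml.pipeline.steps import ParallelRunConfig

parallel_run_config = ParallelRunConfig(
    source_directory="scripts",       # hypothetical folder containing the script
    entry_script="score.py",          # script whose run(mini_batch) is invoked
    mini_batch_size="6",              # FileDataset: 6 files per run() call
    error_threshold=10,               # tolerated file failures, not batch size
    output_action="append_row",
    environment=batch_env,            # assumed pre-built Environment
    compute_target=compute_target,    # assumed compute cluster
    process_count_per_node=2,         # worker processes per node, not files
    node_count=2,                     # cluster nodes, also unrelated to batch size
)
```

Note that mini_batch_size is passed as a string: for a TabularDataset it would express a data size such as "1MB", while for a FileDataset it is interpreted as a file count.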