You register a model for use in a batch inference pipeline that processes a file dataset using a ParallelRunStep. The inference script must process six input files each time the inferencing function (the script's run() method) is called.
Which configuration setting must you specify in the ParallelRunConfig object for the ParallelRunStep?
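For context, here is a minimal sketch of a ParallelRunConfig in which the mini_batch_size setting controls how many files are handed to each run() call when the input is a FileDataset (for a TabularDataset it instead specifies an approximate data size, such as "1MB"). The script name, source directory, environment, compute target, and other values below are placeholders, not details taken from the question.

```python
# Hypothetical setup: "scripts", "batch_score.py", "inference-env", and
# "batch-cluster" are placeholder names; `ws` is assumed to be an existing
# azureml.core.Workspace object.
from azureml.core import Environment
from azureml.pipeline.steps import ParallelRunConfig

parallel_run_config = ParallelRunConfig(
    source_directory="scripts",        # folder containing the entry script (placeholder)
    entry_script="batch_score.py",     # inference script defining init() and run() (placeholder)
    mini_batch_size="6",               # FileDataset input: number of files per run() call
    error_threshold=10,                # tolerated record/file failures before the step fails
    output_action="append_row",        # aggregate run() outputs into a single file
    environment=Environment.get(ws, name="inference-env"),  # placeholder environment
    compute_target="batch-cluster",    # placeholder AmlCompute cluster name
    node_count=2,
)
```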