Your company's CTO is worried about the high costs associated with running data pipelines, particularly large batch processing jobs. These jobs don't need to run on a strict schedule, and the CTO is open to longer completion times if it means reducing expenses. You're currently using Cloud Dataflow for most pipelines and want to minimize costs without extensive changes. What's your best recommendation?
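A scenario like this points toward Dataflow's Flexible Resource Scheduling (FlexRS), which trades guaranteed start time for lower cost by using a mix of preemptible and regular VMs; the job is queued and typically starts within a few hours of submission, which suits batch work with no strict deadline. A minimal sketch of launching an existing Apache Beam Python pipeline in FlexRS mode (the project, bucket, and script names below are placeholders, not values from the question):

```shell
# Submit a batch pipeline with FlexRS enabled.
# --flexrs_goal=COST_OPTIMIZED tells the Dataflow service to delay
# scheduling (up to ~6 hours) in exchange for discounted, preemptible-
# backed resources. All other flags are the usual Dataflow options.
python my_pipeline.py \
  --runner=DataflowRunner \
  --project=my-project \
  --region=us-central1 \
  --temp_location=gs://my-bucket/temp \
  --flexrs_goal=COST_OPTIMIZED
```

Because FlexRS is just an additional pipeline option, the existing Dataflow pipelines need no code changes, matching the requirement to minimize costs "without extensive changes."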