Databricks Certified Generative AI Engineer - Associate Quiz (LeetQuiz)

A Generative AI Engineer is developing an LLM-based application in which the retriever's document chunks have a maximum size of 512 tokens. Given that cost and latency are higher priorities than quality, which available context length level should they select?