
Databricks Certified Generative AI Engineer - Associate
A Generative AI Engineer is developing an LLM-based application where the document chunks for the retriever have a maximum size of 512 tokens. Given that cost and latency are higher priorities than quality, which available context length level should they select?
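Since the answer options are not reproduced here, the underlying selection logic is worth spelling out: when cost and latency outweigh quality, choose the smallest available context window that still fits the retrieved chunks plus the prompt and the expected completion. The following sketch illustrates that token-budget arithmetic; the chunk count, prompt/completion sizes, and the list of available context lengths are illustrative assumptions, not actual exam options.

```python
def required_context(chunk_tokens: int, num_chunks: int,
                     prompt_tokens: int, completion_tokens: int) -> int:
    """Total tokens the model's context window must accommodate."""
    return chunk_tokens * num_chunks + prompt_tokens + completion_tokens

# Example: 4 retrieved chunks of up to 512 tokens each, plus a ~200-token
# prompt and room for a ~300-token answer (all illustrative numbers).
needed = required_context(chunk_tokens=512, num_chunks=4,
                          prompt_tokens=200, completion_tokens=300)
print(needed)  # 2548

# Cost and latency grow with context length, so pick the smallest
# available window that covers the budget (sizes below are assumed).
available = [2048, 4096, 8192, 16384, 32768]
choice = min(c for c in available if c >= needed)
print(choice)  # 4096
```

The same reasoning applies to whatever context lengths the exam actually lists: compute the budget from the 512-token chunk ceiling, then take the minimum window that covers it.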