
A company wants to assess the costs that are associated with using a large language model (LLM) to generate inferences. The company wants to use Amazon Bedrock to build generative AI applications. Which factor will drive the inference costs?
A
Number of tokens consumed
B
Temperature value
C
Amount of data used to train the LLM
D
Total training time
Explanation:
The correct answer is A. For Amazon Bedrock, inference costs under on-demand pricing are primarily driven by the number of tokens consumed: you are billed for the input tokens you send to the model plus the output tokens it generates. The other options do not drive inference costs: temperature (B) only shapes the randomness of the output, and training data volume (C) and training time (D) relate to building the model, not to invoking it.
This token-based pricing model is common across LLM services because it reflects the actual computational work the model performs per request.
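The token-based billing described above can be sketched as a simple calculator. The per-1,000-token rates below are hypothetical placeholders for illustration, not actual Amazon Bedrock prices, which vary by model and region:

```python
def inference_cost(input_tokens: int, output_tokens: int,
                   input_rate_per_1k: float = 0.003,
                   output_rate_per_1k: float = 0.015) -> float:
    """Cost = tokens consumed x per-token rate.

    Input and output tokens are billed at separate rates, as is
    typical for on-demand LLM pricing. Rates here are illustrative only.
    """
    return (input_tokens / 1000) * input_rate_per_1k \
         + (output_tokens / 1000) * output_rate_per_1k

# A 500-token prompt producing a 200-token response:
cost = inference_cost(500, 200)
print(f"${cost:.4f}")  # → $0.0045
```

Note that longer prompts and longer responses both increase cost, which is why token count, not model settings like temperature, is the factor to monitor.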