The “temperature” parameter adjusts the randomness of an LLM’s output by dividing the logits by the temperature before the softmax: lower values (e.g., 0.7) sharpen the distribution and make output more deterministic, while higher values (e.g., 1.5) flatten it and increase creativity, so Option A is correct. Option B (stop string) refers to the stop sequence, Option C (penalty) relates to the presence/frequency penalties, and Option D (max tokens) is a separate parameter that limits output length. Temperature shapes the style of the output, not its length.
The OCI 2025 Generative AI documentation likely defines temperature among its generation parameters.
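A minimal sketch of the temperature scaling described above (the function name and logits are illustrative, not taken from the OCI documentation): dividing the logits by the temperature before the softmax sharpens the distribution when the temperature is low and flattens it when the temperature is high.

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Apply softmax after dividing each logit by the temperature."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # hypothetical next-token logits
low = softmax_with_temperature(logits, 0.2)   # near-deterministic: top token dominates
high = softmax_with_temperature(logits, 1.5)  # flatter: probability spread more evenly
```

At low temperature the highest-logit token takes almost all of the probability mass, which is why greedy-looking, repeatable output emerges; at high temperature the tail tokens become sampleable, which is what "creativity" refers to here.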