If left blank, the request omits the temperature parameter entirely.
Targets OpenAI-compatible /v1/chat/completions endpoints (works well behind a LiteLLM proxy); see the request sketch after this list.
Includes a developer-role message for providers that support it; providers that don't may ignore or reject it.
Settings and messages are persisted to localStorage (see the persistence sketch below).
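
The sketch below illustrates how a request along these lines could be assembled: the temperature field is only sent when non-blank, an optional developer-role message is prepended, and the body is POSTed to a /v1/chat/completions endpoint. The names (ChatSettings, ChatMessage, sendChat, developerPrompt) are illustrative assumptions, not the project's actual identifiers.

```ts
// Hypothetical request-building sketch; identifiers here are assumptions.
interface ChatSettings {
  baseUrl: string;          // e.g. a LiteLLM proxy exposing /v1/chat/completions
  apiKey: string;
  model: string;
  temperature?: string;     // blank or undefined means "omit temperature"
  developerPrompt?: string; // sent as a "developer" role message when set
}

interface ChatMessage {
  role: "developer" | "system" | "user" | "assistant";
  content: string;
}

async function sendChat(settings: ChatSettings, history: ChatMessage[]): Promise<string> {
  const messages: ChatMessage[] = [];

  // Developer role only for providers that support it; others may ignore or reject it.
  if (settings.developerPrompt) {
    messages.push({ role: "developer", content: settings.developerPrompt });
  }
  messages.push(...history);

  const body: Record<string, unknown> = { model: settings.model, messages };

  // Include temperature only when the field is non-blank.
  if (settings.temperature && settings.temperature.trim() !== "") {
    body.temperature = Number(settings.temperature);
  }

  const res = await fetch(`${settings.baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${settings.apiKey}`,
    },
    body: JSON.stringify(body),
  });

  if (!res.ok) {
    throw new Error(`Chat request failed: ${res.status} ${await res.text()}`);
  }
  const data = await res.json();
  return data.choices[0].message.content;
}
```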
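
Persistence can be as simple as serializing the settings and message history to localStorage and reading them back on load. The storage keys and shapes below are assumptions for illustration; the types are reused from the request sketch above.

```ts
// Hypothetical persistence sketch; storage keys are illustrative.
const SETTINGS_KEY = "chat.settings";
const MESSAGES_KEY = "chat.messages";

function saveState(settings: ChatSettings, messages: ChatMessage[]): void {
  localStorage.setItem(SETTINGS_KEY, JSON.stringify(settings));
  localStorage.setItem(MESSAGES_KEY, JSON.stringify(messages));
}

function loadState(): { settings: ChatSettings | null; messages: ChatMessage[] } {
  const rawSettings = localStorage.getItem(SETTINGS_KEY);
  const rawMessages = localStorage.getItem(MESSAGES_KEY);
  return {
    settings: rawSettings ? (JSON.parse(rawSettings) as ChatSettings) : null,
    messages: rawMessages ? (JSON.parse(rawMessages) as ChatMessage[]) : [],
  };
}
```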