Add LLM_HEADERS support for custom request headers
Reads LLM_HEADERS as a JSON object from .env and merges it into every LLM request alongside the existing Authorization header. This is useful for endpoints that require non-standard headers (e.g. x-openclaw-agent-id). LLM_API_KEY is still set without the "Bearer" prefix in .env; the prefix is added to the Authorization header automatically.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@@ -4,8 +4,12 @@ LLM_ENDPOINT=http://localhost:11434/v1/chat/completions
 # Model name passed to the endpoint
 LLM_MODEL=llama3
 
-# Optional API key (sent as Bearer token) — leave blank for local servers
+# Optional API key — value is sent as "Bearer <value>", do NOT include the word "Bearer" here
 LLM_API_KEY=
+
+# Optional extra headers sent with every LLM request, as a JSON object
+# Example: LLM_HEADERS={"x-openclaw-agent-id":"jibo"}
+LLM_HEADERS=
 
 # Default system prompt for the voice AI loop
 LLM_SYSTEM_PROMPT=You are Jibo, a friendly social robot. Keep responses brief and conversational.
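The header-merging behavior this commit describes might look like the following minimal sketch. The function name and structure are hypothetical (the commit only shows the .env diff, not the request code); it assumes headers are assembled per request from environment variables.

```python
import json
import os


def build_headers() -> dict:
    """Hypothetical sketch: merge LLM_HEADERS (a JSON object in .env)
    into the headers sent with every LLM request."""
    headers = {"Content-Type": "application/json"}

    api_key = os.environ.get("LLM_API_KEY", "").strip()
    if api_key:
        # The "Bearer " prefix is added here, which is why the .env
        # value must NOT already contain the word "Bearer".
        headers["Authorization"] = f"Bearer {api_key}"

    raw = os.environ.get("LLM_HEADERS", "").strip()
    if raw:
        extra = json.loads(raw)
        if not isinstance(extra, dict):
            raise ValueError("LLM_HEADERS must be a JSON object")
        # Extra headers are merged last, alongside Authorization.
        headers.update(extra)

    return headers
```

With `LLM_HEADERS={"x-openclaw-agent-id":"jibo"}` in .env, the resulting dict would carry both the Authorization header and the custom x-openclaw-agent-id header.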