Clients include a `routeId` in their request path (e.g., `/proxy/{routeId}/...` or `/openai/{routeId}/...`). The Proxy uses this `routeId` to identify the target LLM Configuration and route the request accordingly.

- `/openai/{routeId}/v1/...`: an OpenAI-compatible endpoint (using `langchaingo`) that converts standard OpenAI API requests into the format required by the target backend LLM (defined in the `{routeId}` configuration) and translates the backend LLM's response back into the standard OpenAI format.
- `/proxy/{routeId}/...`: routes the request to the backend LLM defined by the `{routeId}` configuration.
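As a minimal sketch of the path handling described above, the helper below extracts the `{routeId}` segment from an incoming request path. The function name and exact parsing are hypothetical, not the Proxy's actual implementation; they only illustrate how both path shapes carry the route identifier in the second segment.

```go
package main

import (
	"fmt"
	"strings"
)

// extractRouteID pulls the {routeId} segment out of a request path of the
// form /proxy/{routeId}/... or /openai/{routeId}/v1/...
// (hypothetical helper; the real Proxy's routing logic may differ).
func extractRouteID(path string) (routeID, rest string, ok bool) {
	parts := strings.SplitN(strings.TrimPrefix(path, "/"), "/", 3)
	if len(parts) < 2 || parts[1] == "" {
		return "", "", false
	}
	switch parts[0] {
	case "proxy", "openai":
		if len(parts) == 3 {
			rest = "/" + parts[2] // remainder forwarded to the backend
		}
		return parts[1], rest, true
	}
	return "", "", false
}

func main() {
	id, rest, ok := extractRouteID("/openai/my-route/v1/chat/completions")
	fmt.Println(id, rest, ok) // my-route /v1/chat/completions true
}
```

With the `routeId` in hand, the Proxy can look up the matching LLM Configuration and either translate the request (OpenAI-compatible endpoint) or forward it as-is.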