# OpenRouterModel
## What it is

A small wrapper that constructs a `langchain_openai.ChatOpenAI` client configured to call the OpenRouter API endpoint.
## Public API
`class OpenRouterModel`

`__init__(api_key: str)`
- Stores the OpenRouter API key and sets the fixed `base_url` to `https://openrouter.ai/api/v1`.

`get_model(model_id: str) -> langchain_openai.ChatOpenAI`
- Returns a `ChatOpenAI` instance configured with:
  - `model=model_id`
  - `api_key=SecretStr(self.api_key)`
  - `base_url=self.base_url`
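Based on the description above, the class plausibly looks like the sketch below. This is a reconstruction, not the actual source: a stand-in dataclass replaces `langchain_openai.ChatOpenAI` (and a plain string replaces `pydantic.SecretStr`) so the snippet runs without those dependencies installed.

```python
from dataclasses import dataclass


@dataclass
class _StandInChatOpenAI:
    """Stand-in for langchain_openai.ChatOpenAI (assumption for illustration)."""
    model: str
    api_key: str
    base_url: str


class OpenRouterModel:
    """Sketch of the wrapper as described: store key, fix the base URL."""

    def __init__(self, api_key: str):
        self.api_key = api_key
        # Hardcoded OpenRouter endpoint, per the Configuration section.
        self.base_url = "https://openrouter.ai/api/v1"

    def get_model(self, model_id: str) -> _StandInChatOpenAI:
        # model_id is passed through unvalidated; the real class would return
        # ChatOpenAI(model=model_id, api_key=SecretStr(self.api_key),
        #            base_url=self.base_url).
        return _StandInChatOpenAI(
            model=model_id,
            api_key=self.api_key,
            base_url=self.base_url,
        )
```

The real class delegates all validation to the returned client, which is why bad keys or model ids only surface as errors at request time.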
## Configuration / Dependencies
- Dependencies:
  - `langchain_openai.ChatOpenAI`
  - `pydantic.SecretStr`
- Configuration:
  - Requires an OpenRouter API key passed to `OpenRouterModel(api_key=...)`.
  - Uses a hardcoded base URL: `https://openrouter.ai/api/v1`.
## Usage
```python
from naas_abi_marketplace.applications.openrouter.models.OpenRouterModel import OpenRouterModel

router = OpenRouterModel(api_key="YOUR_OPENROUTER_API_KEY")
llm = router.get_model("openai/gpt-4o-mini")  # model_id is passed through as-is
# llm is a langchain_openai.ChatOpenAI instance configured for OpenRouter
```
## Caveats
- No validation is performed for `api_key` or `model_id`; invalid values only fail when the returned client is actually used.
- The base URL is fixed in code and cannot be overridden via the public API.
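Since `get_model` reads `self.base_url` at call time, mutating the attribute between construction and `get_model` would redirect the client; this is a workaround outside the public API, not a supported feature. A minimal sketch, re-declaring a simplified `OpenRouterModel` with a `dict` stand-in client so the example is self-contained (the real class returns a `ChatOpenAI` instance):

```python
class OpenRouterModel:
    """Simplified stand-in mirroring the documented behavior."""

    def __init__(self, api_key: str):
        self.api_key = api_key
        self.base_url = "https://openrouter.ai/api/v1"  # hardcoded default

    def get_model(self, model_id: str) -> dict:
        # The real implementation builds ChatOpenAI(...) from these same fields.
        return {"model": model_id, "api_key": self.api_key, "base_url": self.base_url}


router = OpenRouterModel(api_key="YOUR_OPENROUTER_API_KEY")
# Workaround (assumption, not public API): overwrite the instance attribute
# before calling get_model, since get_model reads self.base_url.
router.base_url = "https://example.com/v1"
client = router.get_model("openai/gpt-4o-mini")
# client["base_url"] is now "https://example.com/v1"
```

Relying on this is fragile: a future version that reads a module-level constant instead of the instance attribute would silently break it.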