gpt_5_mini
What it is
A module-level `ChatModel` definition that wraps LangChain's `ChatOpenAI`, configured for the OpenAI model `gpt-5-mini` with deterministic output (`temperature=0`).
Public API
- Constants
  - `MODEL_ID: str`, set to `"gpt-5-mini"`.
  - `PROVIDER: str`, set to `"openai"`.
- Objects
  - `model: ChatModel`, a preconfigured chat model instance with:
    - `model_id=MODEL_ID`
    - `provider=PROVIDER`
    - `model=langchain_openai.ChatOpenAI(...)`, configured with:
      - `model="gpt-5-mini"`
      - `temperature=0`
      - `api_key=SecretStr(ABIModule.get_instance().configuration.openai_api_key)`
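To make the wrapper's shape concrete, here is a minimal, stdlib-only sketch. The `ChatModel` dataclass below is a hypothetical stand-in for `naas_abi_core.models.Model.ChatModel` (whose real definition is not shown in this file), and `object()` stands in for the underlying `ChatOpenAI` instance.

```python
from dataclasses import dataclass
from typing import Any

# Hypothetical stand-in for naas_abi_core.models.Model.ChatModel,
# shown only to illustrate the wrapper's three fields.
@dataclass(frozen=True)
class ChatModel:
    model_id: str
    provider: str
    model: Any  # the underlying LangChain chat model instance

MODEL_ID: str = "gpt-5-mini"
PROVIDER: str = "openai"

# In the real module, `model` wraps langchain_openai.ChatOpenAI(
#     model=MODEL_ID, temperature=0,
#     api_key=SecretStr(...configuration.openai_api_key)).
model = ChatModel(model_id=MODEL_ID, provider=PROVIDER, model=object())
```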
Configuration/Dependencies
- Dependencies
  - `langchain_openai.ChatOpenAI`
  - `naas_abi_core.models.Model.ChatModel`
  - `naas_abi_marketplace.ai.chatgpt.ABIModule`
  - `pydantic.SecretStr`
- Required configuration
`ABIModule.get_instance().configuration.openai_api_key` must be set and accessible at import time (it is used to build the `SecretStr` passed as `api_key`).
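The failure mode of this requirement can be illustrated with a small, hedged sketch. The real module reads the key from the ABI configuration object; the `load_openai_api_key` helper and the `OPENAI_API_KEY` environment variable below are illustrative stand-ins, not part of the module's API.

```python
import os

# Hypothetical illustration of the import-time key requirement: if the
# configured key is missing or empty, construction fails immediately.
def load_openai_api_key() -> str:
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        # Mirrors the error you would hit at import time if the ABI
        # configuration does not provide openai_api_key.
        raise RuntimeError("openai_api_key is not configured")
    return key
```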
Usage
from naas_abi_marketplace.ai.chatgpt.models.gpt_5_mini import model
# `model` is a ChatModel wrapper; use it according to your ChatModel interface.
print(model.model_id) # "gpt-5-mini"
print(model.provider) # "openai"
# Access underlying LangChain ChatOpenAI instance if needed:
llm = model.model
Caveats
- The OpenAI API key is read during module import; missing/invalid configuration can cause import-time errors.
- This file does not define any methods; runtime interaction depends on the external `ChatModel` and `ChatOpenAI` interfaces.
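One common way to avoid the import-time failure mode described above is to defer construction until first use. This is a hedged sketch of that pattern, not the module's actual behavior; the names `make_model` and `get_api_key` are hypothetical, and the returned dict stands in for a real `ChatOpenAI` configuration.

```python
from functools import lru_cache

def get_api_key() -> str:
    # Stand-in for the ABI configuration lookup; the real code would
    # read ABIModule.get_instance().configuration.openai_api_key.
    return "sk-placeholder"

@lru_cache(maxsize=1)
def make_model() -> dict:
    # The key is read here, at first call, rather than at import time,
    # so a missing key fails where it can be caught and handled.
    api_key = get_api_key()
    return {"model": "gpt-5-mini", "temperature": 0, "api_key": api_key}
```

With `lru_cache(maxsize=1)`, repeated calls return the same cached instance, preserving the single-shared-model behavior of a module-level definition.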