o4_mini_deep_research
What it is
- A module-level definition of a `ChatModel` configured for OpenAI's `o4-mini-deep-research` via `langchain_openai.ChatOpenAI`.
- Exposes a ready-to-use `model` object preconfigured with:
  - `model_id="o4-mini-deep-research"`
  - `provider="openai"`
  - `temperature=0`
  - API key pulled from `ABIModule` configuration
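A minimal sketch of how such a module-level definition might be assembled. `FakeChatOpenAI` and `ChatModelWrapper` below are illustrative stand-ins for `langchain_openai.ChatOpenAI` and `naas_abi_core.models.Model.ChatModel`; the real module wires the API key from `ABIModule` configuration rather than a placeholder:

```python
MODEL_ID = "o4-mini-deep-research"
PROVIDER = "openai"


class FakeChatOpenAI:
    """Stand-in for langchain_openai.ChatOpenAI (illustrative only)."""

    def __init__(self, model: str, temperature: float, api_key: str):
        self.model = model
        self.temperature = temperature
        self.api_key = api_key


class ChatModelWrapper:
    """Stand-in for naas_abi_core.models.Model.ChatModel (illustrative only)."""

    def __init__(self, model_id: str, provider: str, model):
        self.model_id = model_id
        self.provider = provider
        self.model = model  # underlying client


# Mirrors the module-level pattern: the client is built when this code runs.
llm = FakeChatOpenAI(model=MODEL_ID, temperature=0, api_key="sk-placeholder")
model = ChatModelWrapper(model_id=MODEL_ID, provider=PROVIDER, model=llm)
```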
Public API
- Constants
  - `MODEL_ID: str` = `"o4-mini-deep-research"`
  - `PROVIDER: str` = `"openai"`
- Module variable
  - `model: ChatModel`: a `naas_abi_core.models.Model.ChatModel` instance wrapping a `ChatOpenAI` client.
Configuration/Dependencies
- Dependencies
  - `langchain_openai.ChatOpenAI`
  - `naas_abi_core.models.Model.ChatModel`
  - `naas_abi_marketplace.ai.chatgpt.ABIModule`
  - `pydantic.SecretStr`
- Configuration
  - Requires `ABIModule.get_instance().configuration.openai_api_key` to be set; it is used to build the `ChatOpenAI` client.
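The configuration requirement can be sketched as a fail-fast check. The `Configuration` dataclass here is a hypothetical stand-in for the object returned by `ABIModule.get_instance().configuration`:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Configuration:
    """Stand-in for the ABIModule configuration object (illustrative only)."""

    openai_api_key: Optional[str] = None


def require_api_key(cfg: Configuration) -> str:
    # Mirrors the module's implicit requirement: fail early when the key is unset.
    if not cfg.openai_api_key:
        raise RuntimeError("openai_api_key must be set in ABIModule configuration")
    return cfg.openai_api_key


key = require_api_key(Configuration(openai_api_key="sk-example"))
```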
Usage
```python
from naas_abi_marketplace.ai.chatgpt.models.o4_mini_deep_research import model

# `model` is a ChatModel wrapper; use it according to your ChatModel interface.
print(model.model_id)  # "o4-mini-deep-research"
print(model.provider)  # "openai"

# Access the underlying LangChain client if needed:
llm = model.model
```
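Because the client is constructed at import time (see Caveats), importing the module can raise in environments without the package or without a configured API key. A hedged sketch that guards the import rather than assuming it succeeds:

```python
# Guard the import so a missing package or missing `openai_api_key`
# degrades gracefully instead of crashing the whole program at import.
try:
    from naas_abi_marketplace.ai.chatgpt.models.o4_mini_deep_research import model
except Exception as exc:  # ImportError, or a config error raised at import time
    model = None
    print(f"o4-mini-deep-research model unavailable: {exc}")

if model is not None:
    print(model.model_id)
```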
Caveats
- Importing the module constructs the `ChatOpenAI` client immediately; it accesses `ABIModule` configuration at import time.
- If `openai_api_key` is missing or invalid, instantiation may fail depending on upstream library behavior.
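One way to avoid the import-time construction described above is a cached lazy factory. This is a hypothetical alternative pattern, not what the module currently does; a plain dict stands in for the real `ChatModel`/`ChatOpenAI` objects:

```python
from functools import lru_cache


@lru_cache(maxsize=1)
def get_model() -> dict:
    # Construct the client on first call instead of at module import,
    # so configuration is only read when the model is actually needed.
    return {
        "model_id": "o4-mini-deep-research",
        "provider": "openai",
        "temperature": 0,
    }


# First call builds the object; subsequent calls return the cached instance.
m = get_model()
assert m is get_model()
```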