o3_deep_research
What it is
- A module-level configuration that registers/exports a ChatModel instance for OpenAI's o3-deep-research model using LangChain's ChatOpenAI.
Public API
- MODEL_ID: str - Constant model identifier: "o3-deep-research".
- PROVIDER: str - Constant provider identifier: "openai".
- model: naas_abi_core.models.Model.ChatModel - Preconfigured chat model wrapper containing:
  - model_id and provider
  - a langchain_openai.ChatOpenAI instance with temperature=0 and an API key sourced from ABIModule configuration.
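The wrapper structure described above can be sketched in plain Python. Note the stand-ins: ChatModel here is an assumed dataclass shape, not the real naas_abi_core class, and FakeChatOpenAI is a stub replacing langchain_openai.ChatOpenAI so the sketch stays self-contained and runnable without an API key.

```python
from dataclasses import dataclass
from typing import Any

# Module-level constants, as exported by the real module.
MODEL_ID = "o3-deep-research"
PROVIDER = "openai"


@dataclass
class FakeChatOpenAI:
    # Stand-in for langchain_openai.ChatOpenAI (illustrative only).
    model: str
    temperature: float
    api_key: str


@dataclass
class ChatModel:
    # Stand-in for naas_abi_core.models.Model.ChatModel (assumed shape:
    # identifiers plus the underlying LangChain model instance).
    model_id: str
    provider: str
    model: Any


# Preconfigured wrapper, mirroring what the module exports as `model`.
model = ChatModel(
    model_id=MODEL_ID,
    provider=PROVIDER,
    model=FakeChatOpenAI(model=MODEL_ID, temperature=0, api_key="sk-example"),
)
```

The wrapper keeps the provider/model identifiers alongside the underlying LangChain instance, so callers can route or log by identifier without touching the LLM object itself.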
Configuration/Dependencies
- Dependencies:
  - langchain_openai.ChatOpenAI
  - naas_abi_core.models.Model.ChatModel
  - naas_abi_marketplace.ai.chatgpt.ABIModule
  - pydantic.SecretStr
- Configuration required:
  ABIModule.get_instance().configuration.openai_api_key must be set (used to build the ChatOpenAI api_key).
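The pydantic.SecretStr dependency suggests the API key is wrapped so it is masked in reprs and logs before being handed to ChatOpenAI. A minimal stand-in (not the real pydantic class) illustrates the behavior:

```python
class SecretStr:
    # Minimal stand-in for pydantic.SecretStr (illustrative only):
    # masks the value in repr/str, exposes it via get_secret_value().
    def __init__(self, value: str):
        self._value = value

    def get_secret_value(self) -> str:
        return self._value

    def __repr__(self) -> str:
        return "SecretStr('**********')"


key = SecretStr("sk-example")
# The raw key never appears in logs or tracebacks that print the object.
print(repr(key))
```

The real pydantic.SecretStr behaves the same way: code must call get_secret_value() explicitly to obtain the plaintext key.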
Usage
from naas_abi_marketplace.ai.chatgpt.models import o3_deep_research
chat_model = o3_deep_research.model # ChatModel wrapper
llm = chat_model.model # underlying ChatOpenAI instance
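Once configured, the underlying instance can be called through LangChain's standard .invoke method. A sketch, using a stub in place of the real ChatOpenAI so it runs without an API key; only the .invoke method name is taken from the real LangChain interface:

```python
class StubChatOpenAI:
    # Stand-in for langchain_openai.ChatOpenAI; the real class's
    # .invoke(prompt) sends the prompt to the OpenAI API and returns
    # a message object rather than a plain string.
    def invoke(self, prompt: str) -> str:
        return f"[o3-deep-research response to: {prompt}]"


llm = StubChatOpenAI()  # in real code: chat_model.model
answer = llm.invoke("Summarize recent fusion energy milestones.")
print(answer)
```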
Caveats
- This module only defines a preconfigured model instance; it does not expose helper functions for invoking or formatting prompts.
- Importing/initializing requires ABIModule to be configured with a valid OpenAI API key.