deepseek_r1
What it is
- A non-functional template for configuring a `deepseek-r1` chat model (Data Engineer domain) using `langchain_openai.ChatOpenAI`.
- Provides a `create_model()` factory that logs warnings and returns a configured model only if `DEEPSEEK_API_KEY` is available.
Public API
- `create_model() -> ChatOpenAI | None`: Creates and returns a `ChatOpenAI` instance configured for the `"deepseek-r1"` model.
  - Behavior:
    - Logs a warning that the model is not functional yet.
    - Reads `DEEPSEEK_API_KEY` via `naas_abi.secret.get`.
    - If missing, logs an error and returns `None`.
    - Otherwise returns `ChatOpenAI(model="deepseek-r1", temperature=0.1, max_tokens=4000, api_key=...)`.
- `model: None`: Placeholder module-level variable (currently always `None`). A comment in the module indicates it would be set to `create_model()` in a functional version.
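The documented control flow of `create_model()` (warn, read the secret, return `None` on a missing key, otherwise return a configured model) can be sketched as follows. This is a hedged illustration, not the module's actual code: `ModelConfig` is a stand-in for `langchain_openai.ChatOpenAI`, and `get_secret` stands in for `naas_abi.secret.get` by reading an environment variable.

```python
# Sketch of the documented create_model() behavior, using stand-ins
# so the control flow runs without langchain_openai or naas_abi.
import logging
import os
from dataclasses import dataclass
from typing import Optional

logger = logging.getLogger("deepseek_r1_sketch")

@dataclass
class ModelConfig:
    # Stand-in for the ChatOpenAI configuration described in the docs.
    model: str
    temperature: float
    max_tokens: int
    api_key: str

def get_secret(name: str) -> Optional[str]:
    # Stand-in for naas_abi.secret.get; the real module would call that.
    return os.environ.get(name)

def create_model() -> Optional[ModelConfig]:
    # Documented behavior: always warn that the model is not functional yet.
    logger.warning("deepseek-r1 model is not functional yet")
    api_key = get_secret("DEEPSEEK_API_KEY")
    if not api_key:
        # Documented behavior: log an error and return None when the key is absent.
        logger.error("DEEPSEEK_API_KEY is missing; returning None")
        return None
    return ModelConfig(
        model="deepseek-r1",
        temperature=0.1,
        max_tokens=4000,
        api_key=api_key,
    )
```

The factory-returning-`None` pattern lets callers decide how to handle a missing credential instead of the module raising at import time.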
Configuration/Dependencies
- Environment/Secrets: `DEEPSEEK_API_KEY` (retrieved via `naas_abi.secret.get("DEEPSEEK_API_KEY")`)
- Python dependencies:
  - `langchain_openai.ChatOpenAI`
  - `naas_abi.secret`
  - `naas_abi_core.logger`
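Before calling the factory, it can be useful to verify that the documented dependencies are in place. The helper below is hypothetical (not part of the module): it checks for the `langchain_openai` package and the `DEEPSEek_API_KEY`-style environment variable using only the standard library.

```python
# Hypothetical pre-flight check for the documented requirements.
import importlib.util
import os

def dependencies_ready() -> bool:
    # True only if langchain_openai is importable and the API key is set.
    has_langchain = importlib.util.find_spec("langchain_openai") is not None
    has_key = bool(os.environ.get("DEEPSEEK_API_KEY"))
    return has_langchain and has_key
```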
Usage
```python
from naas_abi_marketplace.domains.data_engineer.models.deepseek_r1 import create_model

model = create_model()
if model is None:
    raise RuntimeError("Missing DEEPSEEK_API_KEY or model not available")

# If a compatible ChatOpenAI backend is available, you could use it like:
# response = model.invoke("Explain data warehouse vs data lake.")
# print(response)
```
Caveats
- The module is explicitly marked NOT FUNCTIONAL YET and logs a warning on every `create_model()` call.
- The exported `model` variable is a placeholder and is not initialized automatically.