llama_3_3_70b
What it is
- A module that defines a preconfigured `ChatModel` wrapper for Meta's Llama-3.3-70B-Instruct using `langchain_ollama.ChatOllama`.
Public API
- Constants
  - `ID`: Model identifier (`"meta-llama/Llama-3.3-70B-Instruct"`).
  - `NAME`: Ollama model name (`"llama-3.3-70b-instruct"`).
  - `DESCRIPTION`: Human-readable model description.
  - `IMAGE`: Image URL for the model.
  - `CONTEXT_WINDOW`: Context window size (131072).
  - `PROVIDER`: Provider label (`"meta"`).
  - `TEMPERATURE`: Default temperature (0).
  - `MAX_TOKENS`: Defined but not used in configuration (4096).
  - `MAX_RETRIES`: Defined but not used in configuration (2).
- Objects
  - `model: ChatModel`: A `naas_abi_core.models.Model.ChatModel` instance configured with:
    - `model_id`, `name`, `description`, `image`, `provider`
    - `model=ChatOllama(model=NAME, temperature=TEMPERATURE)`
    - `context_window=CONTEXT_WINDOW`
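To make the wiring above concrete, the sketch below mirrors how the documented constants feed the wrapper. The real `ChatModel` lives in `naas_abi_core` and `ChatOllama` in `langchain_ollama`; the dataclass here is only an illustrative stand-in, not the actual class.

```python
from dataclasses import dataclass
from typing import Any

# Constants as documented above.
ID = "meta-llama/Llama-3.3-70B-Instruct"
NAME = "llama-3.3-70b-instruct"
CONTEXT_WINDOW = 131072
PROVIDER = "meta"
TEMPERATURE = 0


@dataclass
class ChatModelSketch:
    """Illustrative stand-in for naas_abi_core.models.Model.ChatModel."""
    model_id: str
    name: str
    provider: str
    model: Any  # would hold a ChatOllama instance in the real module
    context_window: int


model = ChatModelSketch(
    model_id=ID,
    name=NAME,
    provider=PROVIDER,
    # In the real module: ChatOllama(model=NAME, temperature=TEMPERATURE)
    model=None,
    context_window=CONTEXT_WINDOW,
)
```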
Configuration/Dependencies
- Dependencies
  - `langchain_ollama.ChatOllama`
  - `naas_abi_core.models.Model.ChatModel`
- Runtime expectation
- The configured
ChatOllamabackend must be available and have the model namedllama-3.3-70b-instructaccessible.
- The configured
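One way to verify this expectation is to query the Ollama server's standard `/api/tags` endpoint, which lists locally available models. The helper below separates the JSON parsing from the HTTP call so the matching logic can be tested without a running server; the default URL assumes a stock local Ollama install.

```python
import json
import urllib.request

# Default address of a local Ollama server (assumption; adjust if remote).
OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"


def model_available(tags_payload: dict, name: str) -> bool:
    """Return True if `name` appears in an Ollama /api/tags response.

    Ollama reports names as "model" or "model:tag", so both forms match.
    """
    models = tags_payload.get("models", [])
    return any(
        m.get("name") == name or m.get("name", "").split(":")[0] == name
        for m in models
    )


def fetch_tags(url: str = OLLAMA_TAGS_URL) -> dict:
    """Fetch the model list from a running Ollama server (needs a live server)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```

Usage against a live server would be `model_available(fetch_tags(), "llama-3.3-70b-instruct")`.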
Usage
```python
from naas_abi_marketplace.ai.llama.models.llama_3_3_70b import model

# `model` is a ChatModel wrapper around a ChatOllama instance
print(model.name)
print(model.model_id)
```
Caveats
- `MAX_TOKENS` and `MAX_RETRIES` are defined in the module but are not applied to the `ChatOllama` or `ChatModel` configuration here.
- The exact invocation interface (e.g., `.invoke(...)`) depends on the `ChatModel` and `ChatOllama` implementations, which are external to this module.
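If one did want to apply `MAX_TOKENS`, `ChatOllama` exposes a `num_predict` parameter that caps generated tokens; there is no direct retry parameter on `ChatOllama` itself, so retries would typically be layered on by the caller (e.g. via LangChain's `.with_retry()` on the runnable). The kwargs dict below is only a hedged sketch of that wiring, not what the module currently does.

```python
# Hypothetical wiring, NOT the module's current behavior.
MAX_TOKENS = 4096
MAX_RETRIES = 2

# These kwargs could be passed as ChatOllama(**ollama_kwargs);
# `num_predict` is ChatOllama's cap on output tokens.
ollama_kwargs = {
    "model": "llama-3.3-70b-instruct",
    "temperature": 0,
    "num_predict": MAX_TOKENS,
}
# Retries would then be added separately, e.g.:
#   llm = ChatOllama(**ollama_kwargs).with_retry(stop_after_attempt=MAX_RETRIES)
```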