```python
from naas_abi_core.services.agent.beta.IntentMapper import Intent, IntentMapper, IntentType

intents = [
    Intent(intent_value="write a report", intent_type=IntentType.AGENT, intent_target="report_agent"),
    Intent(intent_value="calculate an arithmetic result", intent_type=IntentType.TOOL, intent_target="calculator"),
]
mapper = IntentMapper(intents)

results = mapper.map_intent("I need to write a report about AI trends", k=1)
top = results[0]["intent"]
print(top.intent_value)   # e.g. "write a report"
print(top.intent_type)    # IntentType.AGENT
print(top.intent_target)  # "report_agent"

_, prompt_results = mapper.map_prompt("3 / 4 + 5", k=1)
print(prompt_results[0]["intent"].intent_value)
```
### Caveats
- If `embedding_model` is not provided, a warning is logged and `OpenAIEmbeddings(model="text-embedding-3-large")` is used.
- The internal `VectorStore` is initialized only after embeddings are generated, and its dimension is inferred from the first vector.
- `map_prompt` does not use the configured LLM; it only performs embedding similarity against the `intent_value` strings and returns `([], prompt_results)`.
- Items returned by `map_intent`/`map_prompt` are dicts whose base shape is determined by `VectorStore.similarity_search`; this module only adds an `"intent"` key.
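To make the embedding-similarity caveat concrete, here is a minimal, self-contained sketch of how a lookup like `map_prompt` conceptually ranks intents. This is not the library's code: the toy 3-dimensional vectors and the `cosine` helper are hypothetical stand-ins for the real OpenAI embeddings and vector-store search.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "embeddings" for two intent_value strings (hypothetical values,
# standing in for real high-dimensional embedding vectors).
intent_vectors = {
    "write a report": [0.9, 0.1, 0.0],
    "calculate an arithmetic result": [0.0, 0.2, 0.9],
}
prompt_vector = [0.1, 0.1, 0.95]  # embedding of an arithmetic-flavored prompt

# Rank intents by similarity to the prompt, highest first.
ranked = sorted(
    intent_vectors.items(),
    key=lambda kv: cosine(prompt_vector, kv[1]),
    reverse=True,
)
print(ranked[0][0])  # → "calculate an arithmetic result"
```

The real `IntentMapper` does the same kind of nearest-neighbor ranking, but over OpenAI embeddings inside its internal `VectorStore`, which is why no LLM call is involved in `map_prompt`.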