AI Platform Overview
The Lexigram AI subsystem provides a contract-first, provider-agnostic framework for building intelligent applications. It follows the same IoC discipline as the rest of the framework, ensuring that LLM clients, vector stores, and agent strategies are swappable and testable.
- 📦 Package: `lexigram-ai` (Orchestrator)
- 🚦 Status: Beta
- 🔑 Config key: `ai`
1. Principles
- Protocol-First: All AI interactions go through standard protocols from `lexigram-contracts`.
- Modular Implementation: Sub-packages like `lexigram-ai-llm` and `lexigram-ai-rag` are isolated and discoverable via entry points.
- Registry-Based: LLM providers and embedding models are managed via a centralized registry.
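The registry principle can be sketched as follows. This is an illustrative stand-in, not the actual lexigram registry API: the names `ProviderRegistry`, `register`, and `resolve` are assumptions for demonstration only.

```python
from typing import Callable, Dict

class ProviderRegistry:
    """Illustrative provider registry (NOT the real lexigram API)."""

    def __init__(self) -> None:
        self._factories: Dict[str, Callable[[], object]] = {}

    def register(self, name: str, factory: Callable[[], object]) -> None:
        # Store a factory rather than an instance so providers are
        # constructed lazily and remain swappable in tests.
        self._factories[name] = factory

    def resolve(self, name: str) -> object:
        if name not in self._factories:
            raise KeyError(f"No provider registered under {name!r}")
        return self._factories[name]()

# Usage: register two interchangeable stand-in "LLM providers" and
# resolve one by its configuration key.
registry = ProviderRegistry()
registry.register("echo", lambda: (lambda prompt: prompt))
registry.register("upper", lambda: (lambda prompt: prompt.upper()))

llm = registry.resolve("upper")
print(llm("hello"))  # HELLO
```

Registering factories by string key is what makes providers swappable via configuration: changing the `ai` config key's provider name changes which factory is resolved, with no code changes at the call site.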
2. Core Hierarchy
```mermaid
graph TD
    AI["lexigram-ai (Orchestrator)"]
    AI --> LLM["lexigram-ai-llm"]
    AI --> RAG["lexigram-ai-rag"]
    AI --> Mem["lexigram-ai-memory"]
    AI --> Agents["lexigram-ai-agents"]
    subgraph Contracts [lexigram-contracts]
        Msg["ChatMessage"]
        Role["Role"]
        Doc["Document"]
    end
    LLM -.-> Contracts
    RAG -.-> Contracts
    Agents -.-> Contracts
```
3. Shared Value Types
The AI ecosystem relies on immutable value types defined in `lexigram.contracts.ai`.
ChatMessage
Represents a single message in a conversation.
```python
from lexigram.contracts.ai.llm import ChatMessage, Role

msg = ChatMessage(
    role=Role.USER,
    content="Hello, how do I use the Result pattern?",
    metadata={"tokens": 12},
)
```
Role
Standard roles for multi-turn conversations.
- `SYSTEM`: Instructions for the model.
- `USER`: Input from the human.
- `ASSISTANT`: Response from the model.
- `TOOL`: Output from a function call.
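A minimal, self-contained stand-in for these contracts, assuming a frozen dataclass and a string-valued enum (the real definitions live in `lexigram.contracts.ai.llm` and may differ):

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Any, Dict

class Role(str, Enum):
    # The four standard roles described above.
    SYSTEM = "system"
    USER = "user"
    ASSISTANT = "assistant"
    TOOL = "tool"

@dataclass(frozen=True)  # frozen mirrors the "immutable value type" contract
class ChatMessage:
    role: Role
    content: str
    metadata: Dict[str, Any] = field(default_factory=dict)

# Build a short multi-turn conversation using the standard roles.
conversation = [
    ChatMessage(Role.SYSTEM, "You are a concise assistant."),
    ChatMessage(Role.USER, "Hello, how do I use the Result pattern?"),
    ChatMessage(Role.ASSISTANT, "Wrap success and error in one return type."),
]
```

Freezing the dataclass means a message cannot be mutated after construction, so the same `ChatMessage` instance can be safely shared between memory, RAG, and agent components.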
4. Example Transformation
A simple “Completion to Result” flow:
```python
from lexigram.ai import AIClient

async def ask_geezmo(prompt: str) -> Result[str, AIError]:
    client = await container.resolve(AIClient)
    # AIClient.complete returns a Result[Completion, AIError];
    # await it first, then map the success value to its text.
    result = await client.complete(prompt)
    return result.map(lambda c: c.text)
```
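The `.map` chaining above relies on a Result type that transforms successes and passes errors through. A minimal sketch of such a type (illustrative only; `Ok`, `Err`, and `Completion` here are stand-ins, not the framework's actual implementation):

```python
from dataclasses import dataclass
from typing import Callable, Generic, TypeVar, Union

T = TypeVar("T")
U = TypeVar("U")
E = TypeVar("E")

@dataclass(frozen=True)
class Ok(Generic[T]):
    value: T

    def map(self, fn: Callable[[T], U]) -> "Ok[U]":
        # Transform the success value.
        return Ok(fn(self.value))

@dataclass(frozen=True)
class Err(Generic[E]):
    error: E

    def map(self, fn: Callable) -> "Err[E]":
        # Errors pass through untouched.
        return self

Result = Union[Ok[T], Err[E]]

# A completion-like object with a .text attribute, mapped to plain text.
@dataclass
class Completion:
    text: str

res: "Result[Completion, str]" = Ok(Completion(text="Use Result.map"))
print(res.map(lambda c: c.text))  # Ok(value='Use Result.map')
```

Because `Err.map` is a no-op, a pipeline of `.map` calls short-circuits on the first failure without raising, which is what lets `ask_geezmo` return errors as values rather than exceptions.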