
AI (lexigram-ai)

AI Layer for Lexigram Framework - Native LLM, Vector, RAG integration


lexigram-ai is the AI orchestration layer for the Lexigram Framework. It is a thin coordinator package that wires together independent AI sub-packages (lexigram-ai-llm, lexigram-ai-rag, lexigram-ai-agents, etc.) via the framework’s DI container and entry-point discovery, and exposes a single AIModule entry point for application composition. Zero-config usage starts with sensible defaults.

Terminal window
uv add lexigram-ai
# Optional extras
uv add "lexigram-ai[llm,vector,rag,governance,observability]"

A minimal application composition then looks like this:

from lexigram import Application
from lexigram.di.module import Module, module
from lexigram.ai import AIModule

@module(imports=[AIModule.configure()])
class AppModule(Module):
    pass

app = Application(modules=[AppModule])

if __name__ == "__main__":
    app.run()

Zero-config usage: Call AIModule.configure() with no arguments to use defaults.

application.yaml
ai:
  llm:
    default_provider: "openai"
    openai:
      api_key: "${OPENAI_API_KEY}"
  rag:
    enabled: true
  governance:
    enabled: true
    monthly_budget: 1000.0
Option 2 — Profiles + Environment Variables (recommended)
Terminal window
export LEX_PROFILE=production
# Environment variables for each field

Configuration can also be supplied programmatically by constructing an AIConfig and passing it to AIModule.configure():

from lexigram.ai.config import AIConfig
from lexigram.ai import AIModule

config = AIConfig(...)
AIModule.configure(config)
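
For illustration, a filled-in config might look roughly like the sketch below. The keyword arguments mirror the field table that follows, but whether AIConfig accepts plain dicts for the nested sections is an assumption.

from lexigram.ai.config import AIConfig
from lexigram.ai import AIModule

# Field names follow the table below; dict values for the nested sections are assumed.
config = AIConfig(
    enabled=True,
    llm={"default_provider": "openai"},
    rag={"enabled": True},
    governance={"enabled": True, "monthly_budget": 1000.0},
)
AIModule.configure(config)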
| Field | Default | Env var | Description |
| --- | --- | --- | --- |
| enabled | True | LEX_AI__ENABLED | Global AI feature toggle |
| llm | None | LEX_AI__LLM__* | LLM provider config |
| vector | None | LEX_AI__VECTOR__* | Vector store config |
| rag | None | LEX_AI__RAG__* | RAG pipeline config |
| governance | default | LEX_AI__GOVERNANCE__* | Policy enforcement config |
| observability | default | LEX_AI__OBSERVABILITY__* | Tracing and metrics config |
| subsystems | {} | — | Config for third-party entry-point subsystems |
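
As a rough illustration of the naming scheme, the snippet below sets a few overrides in-process before booting; LEX_AI__ENABLED comes from the table, while the nested names are assumptions derived from the LEX_AI__<FIELD>__* wildcard pattern.

import os

os.environ["LEX_AI__ENABLED"] = "true"
os.environ["LEX_AI__GOVERNANCE__ENABLED"] = "true"           # assumed nested key
os.environ["LEX_AI__GOVERNANCE__MONTHLY_BUDGET"] = "1000.0"  # assumed nested key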
| Method | Description |
| --- | --- |
| AIModule.configure(config) | Configure with explicit config |
| AIModule.stub() | Minimal config for testing |
  • AI orchestration: Thin coordinator wiring all AI sub-packages via DI
  • Entry-point discovery: Discovers sub-packages via lexigram.ai.subsystems entry points (see the sketch after this list)
  • Multi-provider support: Integrates LLM, RAG, agents, memory, governance, and observability
  • Production security: Validates API keys in production environments
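
The discovery mechanism is standard Python entry points. As a rough sketch, a group such as lexigram.ai.subsystems can be enumerated with importlib.metadata; the group name comes from the list above, while the loading loop below is a generic illustration rather than the framework's actual implementation.

from importlib.metadata import entry_points

# Enumerate everything registered under the lexigram.ai.subsystems group.
# Third-party packages advertise themselves here in their packaging metadata.
for ep in entry_points(group="lexigram.ai.subsystems"):
    subsystem = ep.load()      # import the registered object
    print(ep.name, subsystem)  # e.g. a provider class or factory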

For tests, boot the application with the stub module:

async with Application.boot(modules=[AIModule.stub()]) as app:
    # your test code
    ...
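
As a fuller sketch, the same pattern can sit inside an async test. pytest-asyncio and the test name are assumptions here; Application.boot and AIModule.stub() come from the snippet above.

import pytest

from lexigram import Application
from lexigram.ai import AIModule

@pytest.mark.asyncio  # assumes pytest-asyncio is installed
async def test_ai_module_boots():
    async with Application.boot(modules=[AIModule.stub()]) as app:
        # the booted app exposes the stubbed AI wiring for assertions
        assert app is not None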
| File | What it contains |
| --- | --- |
| src/lexigram/ai/__init__.py | Lazy-loaded public API: AIModule, AIProvider, AIConfig |
| src/lexigram/ai/module.py | AIModule.configure() and AIModule.stub() |
| src/lexigram/ai/config.py | AIConfig and get_subsystem_config() |
| src/lexigram/ai/di/provider.py | AIProvider — registers and boots sub-providers |
| src/lexigram/ai/exceptions.py | AIError base exception |
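
The description of __init__.py as a lazy-loaded public API suggests the module-level __getattr__ pattern (PEP 562). The sketch below is a generic illustration of that pattern using the exports and file layout from the table above, not the package's actual source.

import importlib

# Map each public name to the submodule that defines it (paths from the table above).
_EXPORTS = {
    "AIModule": ".module",
    "AIConfig": ".config",
    "AIProvider": ".di.provider",
}

def __getattr__(name):
    # Import the submodule only when the symbol is first accessed.
    if name in _EXPORTS:
        module = importlib.import_module(_EXPORTS[name], __package__)
        return getattr(module, name)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")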