Build durable, observable AI systems – fully open source, works offline.
AbstractFramework is a modular ecosystem for building AI agents and workflows that survive restarts, scale to production, and give you full visibility into what's happening. Every component is open source, works with local models, and designed to be composed however you need.
This repository is the single access point to the ecosystem:
- install the full framework with one `pip` command
- understand how all packages fit together
- create and deploy new specialized solutions (flows/agents) across clients
```
┌────────────────────────────────────────────┬───────────────────────────────────┐
│ GATEWAY PATH (Recommended)                 │ LOCAL PATH (Alternative)          │
├────────────────────────────────────────────┼───────────────────────────────────┤
│ Browser UIs (Observer, Flow Editor,        │ AbstractCode (terminal)           │
│ Code Web, Your App)                        │ AbstractAssistant (macOS tray)    │
│   │                                        │   │                               │
│   ▼                                        │   │                               │
│ ┌──────────────────────────────────────┐   │   │                               │
│ │ AbstractGateway                      │   │   │                               │
│ │ ──────────────────────────────────── │   │   │                               │
│ │ Bundle discovery (specialized        │   │   │                               │
│ │ agents across all clients)           │   │   │                               │
│ │ Run control (start/pause/resume)     │   │   │                               │
│ │ Ledger streaming (real-time SSE)     │   │   │                               │
│ └──────────────────┬───────────────────┘   │   │                               │
│                    │                       │   │                               │
└────────────────────┼───────────────────────┴───┼───────────────────────────────┘
                     └─────────────┬─────────────┘
                                   ▼
┌──────────────────────────────────────────────────────────────────────────────────┐
│ Composition: AbstractAgent (ReAct/CodeAct/MemAct) + AbstractFlow (.flow)         │
└──────────────────────────────────────────────────────────────────────────────────┘
                                   │
                                   ▼
┌──────────────────────────────────────────────────────────────────────────────────┐
│ Foundation: AbstractRuntime + AbstractCore (+ Voice/Vision capability plugins)   │
└──────────────────────────────────────────────────────────────────────────────────┘
                                   │
                                   ▼
┌──────────────────────────────────────────────────────────────────────────────────┐
│ Memory & Knowledge: AbstractMemory · AbstractSemantics                           │
└──────────────────────────────────────────────────────────────────────────────────┘
```
- 100% Open Source – MIT licensed, no black boxes, you own everything
- Local First – Run entirely offline with Ollama, LM Studio, or any local model
- Durable – Workflows survive crashes; resume exactly where you left off
- Observable – Every operation is logged; replay any run from history
- Modular – Use one package or the full stack; compose what you need
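The "durable" and "observable" claims share one underlying idea: record every operation in an append-only ledger, then rebuild state by replaying it. A minimal, framework-independent sketch of that pattern (conceptual only; this is not AbstractRuntime's actual API):

```python
class Ledger:
    """Append-only event log: state is always reconstructible by replay."""

    def __init__(self):
        self.events = []  # in production this would live in a durable file/DB

    def append(self, kind, payload):
        self.events.append({"kind": kind, "payload": payload})

    def replay(self):
        """Rebuild workflow state from the full event history."""
        state = {"completed": []}
        for event in self.events:
            if event["kind"] == "step_done":
                state["completed"].append(event["payload"]["step"])
        return state

ledger = Ledger()
ledger.append("step_done", {"step": "fetch_data"})
ledger.append("step_done", {"step": "summarize"})

# After a crash, replaying the ledger recovers exactly where we left off:
print(ledger.replay())  # {'completed': ['fetch_data', 'summarize']}
```

Because the ledger is the source of truth, "replay any run from history" falls out of the same mechanism as crash recovery.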
```bash
pip install "abstractframework==0.1.1"
```

`abstractframework==0.1.1` installs the pinned global release:
| Package | Version |
|---|---|
| abstractcore | 2.11.8 |
| abstractruntime | 0.4.2 |
| abstractagent | 0.3.1 |
| abstractflow | 0.3.7 |
| abstractcode | 0.3.6 |
| abstractgateway | 0.2.1 |
| abstractmemory | 0.0.2 |
| abstractsemantics | 0.0.2 |
| abstractvoice | 0.6.3 |
| abstractvision | 0.2.1 |
| abstractassistant | 0.4.2 |
Default behavior in this release:

- `abstractcore` is installed with the `openai`, `anthropic`, `huggingface`, `embeddings`, `tokens`, `tools`, `media`, `compression`, and `server` extras
- `abstractflow` is installed with the `editor` extra
```bash
# Local (recommended)
ollama serve && ollama pull qwen3:4b

# Or use LM Studio
# Or cloud providers via env vars:
export OPENAI_API_KEY="..."
export ANTHROPIC_API_KEY="..."
export OPENROUTER_API_KEY="..."
```

```bash
abstractcode --provider ollama --model qwen3:4b
```

You now have a durable coding assistant in your terminal. Type `/help` to explore.
Durability: Your session persists across restarts – close and reopen, and your full context is preserved. Start fresh with `/clear`.
```bash
assistant tray
```

The assistant appears in your menu bar. Click to interact, or use keyboard shortcuts.
Durability: Sessions persist – your conversation history is preserved across app restarts.
Use AbstractCore as a drop-in unified LLM client that works with any provider and model:
```python
from abstractcore import create_llm

llm = create_llm("ollama", model="qwen3:4b-instruct")
# llm = create_llm("openai", model="gpt-4o")
# llm = create_llm("anthropic", model="claude-3-5-sonnet-latest")

response = llm.generate("Explain durable execution in 3 bullets.")
print(response.content)
```

Deploy a run gateway and observe workflows in your browser:
```bash
export ABSTRACTGATEWAY_AUTH_TOKEN="for-my-security-my-token-must-be-at-least-15-chars"
export ABSTRACTGATEWAY_DATA_DIR="my-folder/runtime/gateway"
abstractgateway serve --port 8080
```

```bash
npx @abstractframework/observer   # Gateway observability dashboard
npx @abstractframework/flow       # Visual workflow editor
npx @abstractframework/code       # Browser coding assistant
```

Open http://localhost:3001, connect to the gateway, and start observing.
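The gateway streams ledger events to clients as Server-Sent Events (SSE). The SSE wire format is plain text (`event:` and `data:` lines, with a blank line terminating each event), so a client only needs a small parser. A minimal sketch, assuming JSON payloads in the `data:` field (the gateway's exact event schema is not documented here):

```python
import json

def parse_sse(stream_text):
    """Parse a Server-Sent Events stream into (event_type, data) pairs."""
    events = []
    event_type, data_lines = "message", []
    for line in stream_text.splitlines():
        if line.startswith("event:"):
            event_type = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "" and data_lines:
            # A blank line terminates one event.
            events.append((event_type, json.loads("\n".join(data_lines))))
            event_type, data_lines = "message", []
    return events

sample = 'event: ledger\ndata: {"step": "fetch_data", "status": "done"}\n\n'
print(parse_sse(sample))  # [('ledger', {'step': 'fetch_data', 'status': 'done'})]
```

In a real client you would read the HTTP response incrementally and send the `ABSTRACTGATEWAY_AUTH_TOKEN` as a bearer credential; the parsing logic stays the same.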
```bash
# Full framework (pinned global release)
pip install "abstractframework==0.1.1"

# Or install packages individually
pip install abstractcore==2.11.8
pip install "abstractflow[editor]==0.3.7"
pip install abstractgateway==0.2.1
```

```bash
# Web UIs (run directly)
npx @abstractframework/observer   # Gateway observability dashboard
npx @abstractframework/flow       # Visual workflow editor
npx @abstractframework/code       # Browser coding assistant

# UI component libraries (for building your own apps)
npm install @abstractframework/ui-kit
npm install @abstractframework/panel-chat
npm install @abstractframework/monitor-flow
npm install @abstractframework/monitor-active-memory
npm install @abstractframework/monitor-gpu
```

The tables below describe the ecosystem components. The `abstractframework==0.1.1` install profile pins all Python packages to the versions listed in Quick Start.
| Package | What It Does | Install |
|---|---|---|
| AbstractCore | Unified LLM API – providers, tools, structured output, media | pip install abstractcore |
| AbstractRuntime | Durable execution – ledger, effects, pause/resume, replay | pip install abstractruntime |
| Package | What It Does | Install |
|---|---|---|
| AbstractAgent | Agent patterns – ReAct, CodeAct, MemAct loops | pip install abstractagent |
| AbstractFlow | Visual workflows – portable .flow bundles + editor | pip install abstractflow |
| Package | What It Does | Install |
|---|---|---|
| AbstractMemory | Temporal triple store – provenance-aware, vector search | pip install abstractmemory |
| AbstractSemantics | Schema registry – predicates, entity types for KG | pip install abstractsemantics |
| Package | What It Does | Install |
|---|---|---|
| AbstractCode | Terminal TUI – durable coding assistant | pip install abstractcode |
| AbstractAssistant | macOS tray app – local agent with optional voice | pip install abstractassistant |
| AbstractGateway | HTTP server – remote runs, durable commands, SSE | pip install abstractgateway |
| AbstractObserver | Browser UI – observe, launch, and control runs | npx @abstractframework/observer |
These are optional capability plugins for AbstractCore. Once installed, they expose additional capabilities on llm instances (e.g., llm.voice.tts(), llm.vision.t2i()), keeping AbstractCore lightweight by default.
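The plugin pattern described here, where an optional package attaches a new namespace to an existing `llm` instance on demand, can be sketched in plain Python. This is a conceptual illustration of the design, not AbstractCore's actual plugin mechanism; the class and method names below are hypothetical:

```python
class LLM:
    """Core client: lightweight by default, extensible via capability plugins."""
    _capabilities = {}

    @classmethod
    def register_capability(cls, name, factory):
        cls._capabilities[name] = factory

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails:
        # resolve optional capabilities (voice, vision, ...) lazily.
        if name in self._capabilities:
            capability = self._capabilities[name](self)
            setattr(self, name, capability)  # cache on the instance
            return capability
        raise AttributeError(f"LLM has no capability {name!r}")

# An optional plugin package would register itself on import:
class Voice:
    def __init__(self, llm):
        self.llm = llm
    def tts(self, text):
        return f"<audio for: {text}>"  # placeholder for real synthesis

LLM.register_capability("voice", Voice)

llm = LLM()
print(llm.voice.tts("hello"))  # the core pays nothing until a plugin is used
```

The design choice this illustrates: the base client carries no voice or vision dependencies, and `llm.voice` / `llm.vision` only resolve when the corresponding plugin package is installed.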
| Package | What It Does | Install |
|---|---|---|
| AbstractVoice | Voice I/O – adds llm.voice (TTS) and llm.audio (STT) | pip install abstractcore abstractvoice |
| AbstractVision | Image generation – adds llm.vision (text-to-image, image-to-image) | pip install abstractcore abstractvision |
| Package | What It Does | Install |
|---|---|---|
| @abstractframework/flow | Visual workflow editor (drag-and-drop) | npx @abstractframework/flow |
| @abstractframework/code | Browser-based coding assistant | npx @abstractframework/code |
| Package | What It Does |
|---|---|
| @abstractframework/ui-kit | Theme tokens + UI primitives |
| @abstractframework/panel-chat | Chat thread + message cards + composer |
| @abstractframework/monitor-flow | Agent-cycle trace viewer |
| @abstractframework/monitor-active-memory | Knowledge graph explorer (ReactFlow) |
| @abstractframework/monitor-gpu | GPU utilization widget |
| Guide | Description |
|---|---|
| Docs Index | Entrypoint docs for the ecosystem |
| Getting Started | Pick a path and run something |
| Architecture | How the pieces fit together |
| API | Meta-package API (create_llm, install profile helpers) |
| Configuration | Environment variables & providers |
| FAQ | Common questions |
| Scenarios | End-to-end paths by use case |
| Guides | Focused "how it works" notes |
| Glossary | Shared terminology |
AbstractFramework is designed so you can author one specialized workflow and deploy it across clients.
- Build your specialized logic in the Flow editor (`npx @abstractframework/flow`).
- Export it as a `.flow` bundle with an interface contract (`abstractcode.agent.v1`).
- Run it in the terminal (`abstractcode --workflow ...`), in browser UIs, or through `abstractgateway`.
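The interface contract is what makes a bundle portable: a client can check the declared contract before loading the bundle. A conceptual sketch of that check (the manifest fields and `can_load` helper are hypothetical; the real `.flow` bundle format is defined by AbstractFlow):

```python
# Contracts this (hypothetical) client knows how to drive.
SUPPORTED_CONTRACTS = {"abstractcode.agent.v1"}

def can_load(manifest):
    """Return True if the bundle declares an interface this client supports."""
    return manifest.get("interface") in SUPPORTED_CONTRACTS

bundle = {"name": "code-reviewer", "interface": "abstractcode.agent.v1"}
print(can_load(bundle))                          # True
print(can_load({"interface": "something.else"})) # False
```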
See docs/getting-started.md Path 12 for a complete end-to-end example.
We built AbstractFramework because we wanted:
- Full control – No vendor lock-in, no proprietary dependencies
- Local by default – Privacy and cost control with open-source models
- Durability – AI systems that don't lose work when things crash
- Observability – Complete visibility, not a black box
- Composability – Use what you need, replace what you don't
Cloud APIs are supported when you need them (e.g., for complex reasoning tasks), but the framework is designed to run entirely on your hardware.
Every package is its own repo. Find what interests you:
Foundation: AbstractCore Β· AbstractRuntime
Composition: AbstractAgent Β· AbstractFlow
Memory: AbstractMemory Β· AbstractSemantics
Apps: AbstractCode Β· AbstractAssistant Β· AbstractGateway Β· AbstractObserver
Modalities: AbstractVoice Β· AbstractVision
UI Components: AbstractUIC
MIT – see LICENSE.