sterl
a terminal AI agent for the Stellar network - payments, path payments, and trustlines, with private keys kept behind a local signer that asks for your approval transaction by transaction.
npm install -g sterlai
pnpm add -g sterlai
then run sterl setup
Bring your own LLM
sterl setup walks you through choosing a model provider. Re-run it anytime with sterl model.
OAuth subscription
Claude Pro/Max, ChatGPT Plus/Pro, Copilot, Gemini. Sign in once in your browser, no key files.
/login
API key
Anthropic, OpenAI, Groq (free tier), OpenRouter, Gemini, xAI, Mistral, Cerebras. Auto-detected from env.
export ANTHROPIC_API_KEY=…
Local Ollama
Probes localhost:11434, lists your installed models, and writes
~/.pi/agent/models.json. Zero cost, fully offline.
sterl model --ollama
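The written models.json might look roughly like this - a hypothetical sketch only; the actual schema is whatever sterl setup emits, and the model names here are examples:

```json
{
  "providers": {
    "ollama": {
      "baseUrl": "http://localhost:11434",
      "models": ["llama3.1:8b", "qwen2.5-coder:14b"]
    }
  }
}
```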
Stellar-native
Speaks classic Stellar operations through Horizon. XLM, USDC, and any custom asset via CODE:ISSUER.
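The CODE:ISSUER notation maps to either the native asset or an issued one. A minimal TypeScript sketch of how such a spec could be parsed - parseAssetSpec is illustrative, not sterl's actual code:

```typescript
// Parse a Stellar asset spec: "XLM" means the native asset,
// anything else is "CODE:ISSUER" where ISSUER is the issuing
// account's public key. Illustrative sketch only.
type AssetSpec =
  | { kind: "native" }
  | { kind: "issued"; code: string; issuer: string };

function parseAssetSpec(spec: string): AssetSpec {
  if (spec === "XLM") return { kind: "native" };
  const [code, issuer] = spec.split(":");
  if (!code || !issuer) throw new Error(`invalid asset spec: ${spec}`);
  // Classic asset codes are 1-12 alphanumeric characters.
  if (!/^[A-Za-z0-9]{1,12}$/.test(code)) throw new Error(`bad code: ${code}`);
  // Stellar public keys are 56 chars, base32, starting with "G".
  if (!/^G[A-Z2-7]{55}$/.test(issuer)) throw new Error(`bad issuer: ${issuer}`);
  return { kind: "issued", code, issuer };
}
```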
You sign, not the LLM
Every signature flows through a local daemon that prints the transaction and waits for your y/N.
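The y/N gate defaults to deny: only an explicit yes releases a signature. A sketch of that decision - the function name and accepted answers are assumptions, not sterl's code:

```typescript
// Default-deny approval check for the local signer prompt.
// Only an explicit "y"/"yes" (case-insensitive) approves; empty
// input or anything else rejects - the capital N in "y/N" marks
// the default. Illustrative sketch only.
function approves(answer: string): boolean {
  const a = answer.trim().toLowerCase();
  return a === "y" || a === "yes";
}
```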
MCP-ready
Expose Sterl's tools to Cursor, Claude Desktop, or any MCP client with one config block.
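For Claude Desktop, that config block goes in claude_desktop_config.json and might look like this - the command and args are assumptions; check sterl's docs for the exact invocation:

```json
{
  "mcpServers": {
    "sterl": {
      "command": "sterl",
      "args": ["mcp"]
    }
  }
}
```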