Agentless · SSH-native
Agentless on servers
All inference and tooling run on the client. No remote agent, no extra open ports, no sudo surgery—just your existing SSH path.
AstraTerm — AI-native terminal for modern operations.
Inference, tool calls, and file edits stay on your machine. Production stays free of agents, extra ports, and sudo changes—SSH remains the same channel your team already trusts.
From error triage through execution and file edits, every step carries context, consent, and a trail—without touching your fleet.
Context-aware
Understands shell context, output, and cwd to narrow failures and suggest verifiable next steps.
Human-in-the-loop
Risky commands, PTY writes, and file patches require explicit confirmation with a timeline you can audit.
SSH · SFTP · local
Local shells, remote SSH, SFTP, and AI sessions share the same surface.
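The human-in-the-loop card above describes a confirm-before-impact gate. A toy sketch of that pattern in shell—the function name, prompt wording, and flow are illustrative, not AstraTerm's actual UI:

```shell
#!/bin/sh
# Toy confirm-before-impact gate (illustrative, not AstraTerm internals):
# run a command only after an explicit "y" from the operator.
confirm_then_run() {
  printf 'Run "%s"? [y/N] ' "$*" >&2   # prompt on stderr, output stays clean
  read -r answer
  case "$answer" in
    y|Y) "$@" ;;                        # explicit consent: execute
    *)   echo "aborted" >&2; return 1 ;; # anything else: refuse
  esac
}
```

Anything short of an explicit "y" aborts, which is the conservative default the card implies.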
AstraTerm’s AI isn’t a floating chat bubble. It rides your SSH session, gathers context, calls tools, shows a timeline, and pauses before impact.
$ journalctl -u astra-worker -n 40
ERROR Missing required environment variable DATABASE_URL
worker exited with code 1, restart scheduled in 5s
$ astra ai fix deploy failure
AI is reading project files and preparing a patch▋
+ DATABASE_URL=${DATABASE_URL}
+ ASTRA_WORKER_MODE=production
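The failure in the log above reduces to a missing environment variable. A minimal sketch of the guard that produces that kind of error—the variable name comes from the log; the function name and everything else are illustrative:

```shell
#!/bin/sh
# Fail fast when DATABASE_URL is unset or empty, mirroring the worker's
# "Missing required environment variable" error from the log above.
require_database_url() {
  if [ -z "${DATABASE_URL:-}" ]; then
    echo "ERROR Missing required environment variable DATABASE_URL" >&2
    return 1
  fi
}
```

Checking this at startup turns a crash-loop into a single clear error line, which is exactly the signal the triage step keys on.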
From client architecture through execution, you keep the steering wheel.
Zero server intrusion
Model + tools live on the client. No agent sprawl, no surprise listeners, no sudo churn on prod.
Explicit consent
Commands, PTY writes, and file mutations never happen silently—confirm before impact.
Transparent records
Analysis, tool traces, and outputs appear on a chronological rail for retrospectives.
Licenses intact
AstraTerm ships on Tabby with MIT attribution and upstream compliance.
Install on the desktop only—nothing changes server-side; you connect the way you already do, over SSH.