About bcat
bcat is an AI chat agent that runs local language models on your own hardware. The mascot is a tuxedo cat in smart glasses — the real Brody — and the project is named after him.
Why local-first?
Most AI chat products send every word you type to a cloud server. That's a privacy problem, a vendor lock-in problem, and a "what happens when the SaaS shuts down" problem all at once. bcat sidesteps all of that by running open-weight models (Llama, Mistral, Qwen, Phi, and others via Ollama) on the same machine that runs the chat UI.
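In practice, "local" means a plain HTTP call to Ollama's default endpoint on the same machine. A minimal sketch (the model tag and helper names are illustrative, not bcat's actual code):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_payload(prompt: str, model: str = "llama3") -> dict:
    # "llama3" is an example model tag; any Ollama-served model works the same way.
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to a locally served model; the request never leaves the machine."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Nothing in that round trip touches a third-party server, which is the whole point.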
What makes it different
- Your data stays on your box. No telemetry, no third-party model calls.
- Self-learning loop. bcat reflects on its own answers and stores high-confidence reasoning patterns it can reuse later.
- Pluggable specialists. Route hard prompts to a stronger model and easy prompts to a faster one.
- Open source. Flask, SQLite, and a small UI — the entire stack fits in your head.
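The "pluggable specialists" idea can be sketched as a one-function router. The length-plus-keyword heuristic and the model tags below are illustrative stand-ins, not bcat's actual routing logic:

```python
def pick_model(prompt: str, fast: str = "phi3", strong: str = "llama3:70b") -> str:
    """Route a prompt to a fast or a strong local model.

    Hypothetical heuristic: long prompts, or prompts that mention
    heavyweight tasks, go to the stronger (slower) model.
    """
    looks_hard = len(prompt) > 200 or any(
        word in prompt.lower() for word in ("prove", "debug", "refactor")
    )
    return strong if looks_hard else fast
```

Because both models are served locally, routing is just a choice of model tag; there is no cross-service latency or pricing tier to weigh.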
Who builds it
bcat is built by Aegyrix LLC as part of the Ægyrix Security Suite — a set of privacy-first developer tools that share the same design philosophy: local-first, audit-friendly, and free for individuals.
Tech stack
- Backend: Python 3.13, Flask, SQLite, Google OAuth (JWT)
- Models: Ollama-served local LLMs
- Frontend: vanilla HTML / CSS / JS, no build step
- Hosting: chat.bcat.app on Aegyrix VPS02
- Source: github.com/aegyrix/bcat.app
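With SQLite as the store, keeping chat history on-disk and local is one table away. A minimal sketch, assuming a hypothetical schema (bcat's real tables may differ):

```python
import sqlite3


def init_db(path: str = ":memory:") -> sqlite3.Connection:
    # Hypothetical chat-history schema; illustrative only.
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               id      INTEGER PRIMARY KEY AUTOINCREMENT,
               role    TEXT NOT NULL,          -- 'user' or 'assistant'
               content TEXT NOT NULL,
               ts      TEXT DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    return conn


def save_message(conn: sqlite3.Connection, role: str, content: str) -> None:
    conn.execute("INSERT INTO messages (role, content) VALUES (?, ?)", (role, content))
    conn.commit()


def history(conn: sqlite3.Connection) -> list[tuple[str, str]]:
    return conn.execute("SELECT role, content FROM messages ORDER BY id").fetchall()
```

A single-file database with no server process fits the same "entire stack fits in your head" philosophy as the rest of the stack.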