OpenBug CLI
OpenBug CLI is an AI-assisted command-line tool for real-time debugging: it captures logs as you run your apps, searches your local codebase, and answers questions in an interactive terminal.
What is OpenBug CLI?
OpenBug CLI is an intelligent command-line tool for debugging running applications with AI assistance. It pairs an interactive terminal assistant with a local cluster that captures logs while you run your services, then uses those logs and your codebase to answer debugging questions.
The core purpose is to reduce context switching between runtime behavior (logs) and source code. Instead of manually grepping through multiple terminals and files, you can ask questions about what’s happening and have the CLI correlate the relevant logs with code it can access locally.
Key Features
- Interactive terminal AI assistant (`debug`): Start the assistant in one terminal and use it to ask questions about issues as your services run.
- Automatic log capture and streaming: When you run your services with `debug <command>`, OpenBug streams logs to the local cluster for use in AI responses.
- Natural-language code search across the local codebase: The assistant can search your codebase in response to questions such as where a given behavior is implemented.
- Multi-service debugging via a shared local cluster: Run different services in separate terminals; all connect to the same cluster so the AI can trace problems across your stack.
- Local-first access and selective data sharing: The codebase is accessed locally and not uploaded; only specific snippets the AI queries are sent to the server, and logs are streamed only when needed to answer.
- Authenticated requests using a personal API key: The CLI authenticates requests with your API key (as described in the setup flow).
How to Use OpenBug CLI
- Install the CLI with `npm install -g @openbug/cli`.
- Start the AI assistant in Terminal 1 by running `debug`. You'll be prompted to log in and paste an API key from the OpenBug app.
- Run your services with debugging enabled in other terminals. Examples from the repository: `debug npm run dev`, `debug python app.py`, `debug docker-compose up`.
- Ask debugging questions in Terminal 1 while your services are running. The assistant analyzes the captured logs and searches your codebase to provide targeted insights.
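As a sketch, the multi-terminal workflow above might look like the following (the service commands are the repository's examples; your own start commands would go in their place):

```shell
# One-time setup: install the CLI globally
npm install -g @openbug/cli

# Terminal 1: start the interactive assistant
# (you'll be prompted to log in and paste your API key)
debug

# Terminal 2: run your backend through the CLI so its logs
# are streamed to the shared local cluster
debug npm run dev

# Terminal 3: any additional service connects to the same cluster,
# letting the AI correlate logs across services
debug docker-compose up
```

Because every `debug <command>` process reports to the same local cluster, questions asked in Terminal 1 can draw on logs from all running services at once.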
If you’re testing without setup, the project also provides an interactive demo that walks through debugging three realistic bugs.
Use Cases
- Diagnose why an endpoint fails during local development: Ask, for example, “Why is auth failing?” while your backend is running; the assistant can reference relevant logs and locate related validation logic.
- Trace issues across multiple services: Run a backend and frontend (or multiple backend services) in separate terminals using `debug ...`; the AI can use logs from multiple services to explain how an error propagates.
- Find implementation points for an unknown behavior: Use natural-language questions like "Where do we handle payment webhooks?" to have the assistant search the local repository for where the behavior is implemented.
- Investigate data inconsistencies seen at runtime: When logs suggest mismatched schemas or configuration errors, ask the AI to correlate log lines with the relevant code paths.
- Debug unfamiliar codebases without relying on internet search: The assistant searches the actual local codebase rather than searching the internet for generic guidance.
FAQ
- Does OpenBug upload my entire codebase? No. Your codebase is accessed locally and never uploaded; only the specific code snippets the AI queries are sent to the server.
- When does OpenBug send logs to the server? Logs are streamed to the server only when the AI needs them to answer your questions.
- How does OpenBug support debugging across multiple services? Through a shared local cluster: you run multiple services in different terminals with `debug <command>`, and they all connect to the same cluster so the AI can correlate logs across the stack.
- Can I self-host the OpenBug server? Yes. The repository describes a self-hosting approach: clone the server repository, configure it with your OpenAI API key, then point the CLI to your server via environment variables such as `WEB_SOCKET_URL` and `API_BASE_URL`.
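As a minimal sketch of the self-hosting step, the CLI could be pointed at your own server by exporting the two environment variables named above (the hostnames here are placeholders for your deployment):

```shell
# Point the CLI at a self-hosted OpenBug server.
# Hostnames are placeholders; substitute your own deployment.
export API_BASE_URL="https://openbug.internal.example.com"
export WEB_SOCKET_URL="wss://openbug.internal.example.com/ws"

# Confirm the variables are set before starting the assistant
echo "$API_BASE_URL"
echo "$WEB_SOCKET_URL"
```

Setting these in your shell profile keeps every `debug` invocation pointed at the same server without repeating the exports per terminal.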
Alternatives
- Local debugging with logs + text search (e.g., grep/ripgrep) + IDE tooling: Similar inputs (logs and source code), but the workflow relies on manual correlation and navigation rather than an AI-assisted conversational interface.
- Application performance/observability platforms (logs and tracing dashboards): Useful for viewing and querying runtime data, but they typically don’t provide natural-language, code-aware debugging from your local repository.
- AI code assistants focused on repository Q&A (without runtime log capture): These can answer questions about code structure, but they don’t automatically capture logs from running services to ground answers in runtime behavior.