🧩 Tibo – a powerful command-line tool designed to index your codebase, generate call graphs, and chunk code into a vector database. With tibo, you can query your codebase using natural language and retrieve contextually relevant files, functions, and code snippets effortlessly.
- Codebase Indexing: Scans and organizes your project for easy querying.
- Call Graph Generation: Maps relationships between functions and files.
- Vector Database: Embeds code chunks for fast, intelligent retrieval.
- Natural Language Queries: Ask questions about your code in plain English.
- Context-Aware Results: Returns relevant files and snippets with added context from the call graph.
Get started by installing tibo:
pip install tibo
Find the latest version and additional details on the PyPI project page.
Follow these steps to integrate tibo into your workflow:
- Configure the Tool - Set up tibo with your OpenAI API key:
tibo config
OPTIONAL: Configure a local LLM for on-device AI processing:
tibo local
NOTE: You need to provide the model name and URL, and ensure the local LLM server is running on your device at that URL.
- Index Your Project - Navigate to your project directory and index your codebase:
cd /path/to/your/project
tibo index
Note: This creates a .tibo folder in your project root to store indexed data, call graphs, and vector embeddings.
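If you want to see what was generated, you can list the contents of the .tibo folder. The exact layout may vary between versions, so the snippet below simply walks the folder and prints whatever is there:

```python
from pathlib import Path

# List everything tibo generated under the project's .tibo folder.
# The exact layout may differ between tibo versions.
tibo_dir = Path(".tibo")
for path in sorted(tibo_dir.rglob("*")):
    print(path.relative_to(tibo_dir))
```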
- Query Your Codebase - Fetch relevant context by asking questions in natural language:
tibo fetch "my query to the codebase"
Results include the most relevant file names and code chunks. Full output is saved in .tibo/query_output/query_output.json.
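Because the full output is plain JSON, you can post-process it with a few lines of Python. The schema isn't documented here, so this sketch just loads the file and pretty-prints whatever it contains:

```python
import json
from pathlib import Path

# Load the saved query results. The schema is not documented here,
# so we simply pretty-print whatever tibo wrote to the file.
output_path = Path(".tibo/query_output/query_output.json")
results = json.loads(output_path.read_text())
print(json.dumps(results, indent=2))
```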
- NEW: Interact with the Tibo Agent - Chat with the AI agent to better understand the codebase and get help implementing new features:
tibo agent
NOTE: Requires running tibo config and adding your ANTHROPIC_API_KEY when prompted. The agent can use the tibo fetching tools if you have previously run 'tibo index'. In the agent shell:
- type 'exit' or 'quit' to quit the agent shell
- type '#' followed by a command to execute a command directly in your terminal
- type 'reset' to reset the conversation history
Extra tools:
- agent can perform web searches (requires setting up OPENAI_API_KEY)
- agent can get project structure details (requires running tibo index)
- agent can read file contents when needed (NOTE: editing/creating/deleting files coming soon...)
Configuration: Link Tibo to your OpenAI API key for LLM-powered enhancements.
Indexing: Processes the codebase, builds the call graph, chunks files, enhances chunks with GPT-4o-mini, and stores vector embeddings locally.
Querying: Enhances your query with an LLM, matches it to the top relevant chunks, and supplements results with call graph context.
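Conceptually, the retrieval step is a top-k similarity search over the stored chunk embeddings. The sketch below is not tibo's actual implementation, just a minimal illustration of the idea using cosine similarity over toy vectors and hypothetical chunk ids:

```python
import numpy as np

def top_k_chunks(query_embedding, chunk_embeddings, chunk_ids, k=3):
    """Return the ids of the k chunks most similar to the query (cosine similarity)."""
    q = query_embedding / np.linalg.norm(query_embedding)
    c = chunk_embeddings / np.linalg.norm(chunk_embeddings, axis=1, keepdims=True)
    scores = c @ q
    best = np.argsort(scores)[::-1][:k]
    return [(chunk_ids[i], float(scores[i])) for i in best]

# Toy example: three chunk embeddings and one query embedding (hypothetical ids).
chunks = np.array([[0.9, 0.1], [0.1, 0.9], [0.7, 0.7]])
ids = ["auth.py::login", "db.py::connect", "auth.py::hash_password"]
print(top_k_chunks(np.array([1.0, 0.2]), chunks, ids, k=2))
```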
- Python 3.7+
- An OpenAI API key (required for LLM functionality)
We welcome contributions! Feel free to open issues or submit pull requests on our GitHub repository.
MIT License