An AI-powered command-line tool that lets you interact with various AI models directly from your terminal.

## Features
- 🤖 Support for multiple AI providers (OpenAI, Anthropic)
- 💬 Interactive chat mode
- 📝 Code generation and explanation
- 🔧 Command-line interface with rich output
- ⚙️ Configurable settings and API keys
- 🎨 Beautiful terminal UI with syntax highlighting
## Prerequisites

- Python 3.8 or higher
- uv package manager
## Installation

- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/ai-cli.git
  cd ai-cli
  ```

- Create a virtual environment and install dependencies:

  ```bash
  uv venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  uv pip install -e .
  ```

- Install development dependencies (optional):

  ```bash
  uv pip install -e ".[dev]"
  ```

- Set up your API keys:

  ```bash
  cp .env.example .env
  # Edit .env with your API keys
  ```

## Configuration

Create a `.env` file in the project root with your API keys:
```
# OpenAI API Key
OPENAI_API_KEY=your_openai_api_key_here

# Anthropic API Key
ANTHROPIC_API_KEY=your_anthropic_api_key_here

# Default model (optional)
DEFAULT_MODEL=gpt-4
```
## Usage

```bash
# Start an interactive chat session
ai-cli chat
# Ask a single question
ai-cli ask "What is the capital of France?"
# Generate code
ai-cli code "Write a Python function to calculate fibonacci numbers"
# Explain code
ai-cli explain "def fibonacci(n): return n if n < 2 else fibonacci(n-1) + fibonacci(n-2)"ai-cli chat- Start an interactive chat sessionai-cli ask <question>- Ask a single questionai-cli code <prompt>- Generate code based on a promptai-cli explain <code>- Explain the given codeai-cli config- Manage configuration settingsai-cli --help- Show help information
### Options

- `--model <model>` - Specify the AI model to use
- `--provider <provider>` - Specify the AI provider (`openai`, `anthropic`)
- `--temperature <float>` - Set the temperature for responses (0.0-2.0)
- `--max-tokens <int>` - Set maximum tokens for responses
- `--verbose` - Enable verbose output
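Under the hood, `--provider` together with the model flags has to be turned into a request against one of the provider SDKs. The sketch below only illustrates that mapping; it is not the project's actual `ai_cli/core/client.py`, the `complete()` function is an assumption, and it calls the public `openai` and `anthropic` Python SDKs directly.

```python
# Illustrative sketch of how the CLI options could map onto provider SDK calls.
# This is an assumption for explanation purposes, not ai-cli's real client code.
import anthropic
from openai import OpenAI


def complete(prompt: str,
             provider: str = "openai",
             model: str = "gpt-4",
             temperature: float = 0.7,
             max_tokens: int = 1024) -> str:
    """Send a single prompt to the selected provider and return the text reply."""
    if provider == "openai":
        client = OpenAI()  # picks up OPENAI_API_KEY from the environment
        resp = client.chat.completions.create(
            model=model,
            temperature=temperature,
            max_tokens=max_tokens,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content
    if provider == "anthropic":
        client = anthropic.Anthropic()  # picks up ANTHROPIC_API_KEY from the environment
        resp = client.messages.create(
            model=model,
            temperature=temperature,
            max_tokens=max_tokens,  # required by the Anthropic Messages API
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.content[0].text
    raise ValueError(f"Unknown provider: {provider}")
```

With a mapping like this, switching `--provider` only changes which branch runs; `--model`, `--temperature`, and `--max-tokens` carry over unchanged.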
## Project Structure

```
ai-cli/
├── ai_cli/                    # Main package
│   ├── __init__.py
│   ├── main.py                # CLI entry point
│   ├── core/                  # Core functionality
│   │   ├── __init__.py
│   │   ├── client.py          # AI client implementations
│   │   ├── config.py          # Configuration management
│   │   └── utils.py           # Utility functions
│   ├── commands/              # CLI commands
│   │   ├── __init__.py
│   │   ├── chat.py            # Chat command
│   │   ├── ask.py             # Ask command
│   │   ├── code.py            # Code generation command
│   │   └── config.py          # Config command
│   └── models/                # Data models
│       ├── __init__.py
│       ├── chat.py            # Chat models
│       └── config.py          # Configuration models
├── tests/                     # Test suite
│   ├── __init__.py
│   ├── test_core/
│   ├── test_commands/
│   └── test_models/
├── docs/                      # Documentation
├── pyproject.toml             # Project configuration
├── README.md                  # This file
├── .env.example               # Environment variables template
├── .gitignore                 # Git ignore rules
└── .pre-commit-config.yaml    # Pre-commit hooks
```
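The layout keeps provider plumbing (`core/`) separate from data models (`models/`). Purely as an illustration of that split, a configuration model could bundle the documented options into one object; the field names below come from the options and environment variables above, and the actual `ai_cli/models/config.py` may look different.

```python
# Hypothetical illustration of a configuration model; not the real
# ai_cli/models/config.py. Fields mirror the documented CLI options.
from dataclasses import dataclass


@dataclass
class Settings:
    provider: str = "openai"     # --provider (openai, anthropic)
    model: str = "gpt-4"         # --model, or DEFAULT_MODEL from .env
    temperature: float = 0.7     # --temperature (0.0-2.0)
    max_tokens: int = 1024       # --max-tokens
    verbose: bool = False        # --verbose
```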
## Testing

```bash
# Run all tests
pytest
# Run tests with coverage
pytest --cov=ai_cli
# Run specific test file
pytest tests/test_core/test_client.py
```

## Code Quality

```bash
# Format code
black ai_cli tests
# Sort imports
isort ai_cli tests
# Lint code
flake8 ai_cli tests
# Type checking
mypy ai_cli
```

### Pre-commit Hooks

Install pre-commit hooks to automatically run code quality checks:
```bash
pre-commit install
```

## Contributing

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.