Version 1.1 - Comprehensive web research powered by Firecrawl and LangGraph
- Advanced Settings Page - Comprehensive configuration with environment detection
- Multiple LLM Provider Support - OpenAI, Anthropic, OpenRouter, Google Gemini, Groq, Grok AI, and Deepseek
- Enhanced UI/UX - Reorganized navigation, floating contribute button, improved visuals
- Comprehensive Roadmap - Chat Page 2.0, advanced features, and planned expansions
- Better Security - Improved API key handling with environment variable precedence
- Responsive Design - Optimized for all screen sizes
- Firecrawl: Multi-source web content extraction and search
- LangGraph: Intelligent agent workflow orchestration
- Multiple LLM Providers: OpenAI GPT-4o, Anthropic Claude, Google Gemini, and more
- Next.js 15: Modern React framework with App Router and Turbopack
- TypeScript: Type-safe development with comprehensive error handling
| Service | Purpose | Get Key |
|---|---|---|
| Firecrawl | Web scraping and content extraction | firecrawl.dev/app/api-keys |
| LLM Provider | AI reasoning and synthesis | See options below |
| Provider | Models Available | Get API Key |
|---|---|---|
| OpenAI | GPT-4o, GPT-4o-mini, GPT-3.5-turbo | platform.openai.com/api-keys |
| Anthropic | Claude 3.5 Sonnet, Claude 3 Haiku | console.anthropic.com |
| Google Gemini | Gemini Pro, Gemini Flash | ai.google.dev |
| OpenRouter | Access to 100+ models | openrouter.ai/keys |
| Groq | Ultra-fast Llama, Mixtral, Gemma | console.groq.com |
| Grok AI | xAI's Grok models | x.ai |
| Deepseek | Advanced reasoning models | platform.deepseek.com |
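Key handling follows the environment-variable precedence noted above: a key set in the environment wins over one saved in the Settings page. The sketch below illustrates that rule with hypothetical names (`resolveApiKey` and the provider map are not Firesearch's actual API):

```typescript
// Hypothetical sketch of API-key resolution with env-var precedence.
const ENV_VAR_BY_PROVIDER: Record<string, string> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  groq: "GROQ_API_KEY",
};

function resolveApiKey(
  provider: string,
  env: Record<string, string | undefined>,
  settingsKeys: Record<string, string | undefined>,
): string | undefined {
  const envVar = ENV_VAR_BY_PROVIDER[provider];
  // An environment variable takes precedence over a locally stored key.
  return (envVar && env[envVar]) || settingsKeys[provider];
}
```

With this rule, `resolveApiKey("openai", { OPENAI_API_KEY: "env-key" }, { openai: "settings-key" })` yields the environment key, falling back to the Settings-page key only when the variable is unset.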
```bash
# Clone the repository
git clone https://github.com/mendableai/firesearch.git
cd firesearch

# Install dependencies
pnpm install

# Create environment file
cp .env.example .env.local
```

```bash
# Add your API keys to .env.local
FIRECRAWL_API_KEY=your_firecrawl_key
OPENAI_API_KEY=your_openai_key
```

```bash
# Start development server
pnpm dev
```

```bash
# Using npm
npm install && npm run dev

# Using yarn
yarn install && yarn dev
```

```bash
# Using Docker
docker build -t firesearch .
docker run -p 3000:3000 --env-file .env.local firesearch
```

- Smart Query Decomposition - Breaks complex questions into focused sub-queries
- Multi-Source Search - Searches multiple sources simultaneously via Firecrawl
- Answer Validation - Verifies source relevance (0.7+ confidence threshold)
- Intelligent Retry Logic - Alternative search strategies for incomplete answers
- Real-time Progress - Live updates as searches complete
- Full Citations - Every fact linked to its original source
- Context Memory - Follow-up questions maintain conversation history
- Advanced Settings - Comprehensive configuration with environment detection
- Modern UI - Clean, responsive design with intuitive navigation
- Privacy-First - API keys stored securely in local storage
- Cross-Platform - Works on desktop, tablet, and mobile devices
- Fast Performance - Optimized with Next.js 15 and Turbopack
- Discuss the Research - Interactive discussions about search results
- View Active Model - Real-time display of models being used
- Search History - Browse and revisit previous searches
- Search Analytics - Insights into search patterns and cost tracking
- Real-Time Cost Calculator - Live cost tracking for each search request
- Manual Settings Confirmation - Explicit save actions with confirmation dialogs
- Dark Mode & Light Mode - Enhanced theme system with user preferences
- Responsive Design - Optimized mobile and tablet experience
- Accessibility - WCAG 2.1 AA compliance and keyboard navigation
- Multi-Language Support - Search and interface localization
- Export & Share - PDF export and sharing research findings
- Voice Search - Speech-to-text input capabilities
- Custom Search Agents - Specialized research agents for specific domains
- Browser Extensions - Quick access from any webpage
- Data Visualization - Charts and graphs for research insights
- Local LLM Support - Ollama, LM Studio, and other local inference servers
- Self-hosted Firecrawl - Connect to your own Firecrawl instance for enhanced security
- Emerging Providers - Support for specialized reasoning engines and new AI platforms
```mermaid
flowchart TB
    Query["User Query:<br/>'Compare latest iPhones'"]:::query
    Query --> Break
    Break["Query Decomposition"]:::primary

    subgraph SubQ["Sub-Questions"]
        S1["iPhone 15 Pro specs"]:::search
        S2["iPhone 15 pricing"]:::search
        S3["iPhone 15 vs 14 comparison"]:::search
    end
    Break --> SubQ

    subgraph FC["Firecrawl Searches"]
        FC1["Search Apple.com<br/>TechCrunch"]:::firecrawl
        FC2["Search GSMArena<br/>The Verge"]:::firecrawl
        FC3["Search CNET<br/>MacRumors"]:::firecrawl
    end
    S1 --> FC1
    S2 --> FC2
    S3 --> FC3

    subgraph Valid["Validation"]
        V1["Specs found (0.95)"]:::good
        V2["Pricing found (0.85)"]:::good
        V3["Comparison (0.4)"]:::bad
    end
    FC --> Valid

    Valid --> Synthesis["LLM Synthesis"]:::synthesis
    Synthesis --> Answer["Cited Response<br/>+ Follow-ups"]:::answer

    classDef query fill:#ff8c42,stroke:#ff6b1a,stroke-width:3px,color:#fff
    classDef primary fill:#3a4a5c,stroke:#2c3a47,stroke-width:2px,color:#fff
    classDef search fill:#4caf50,stroke:#388e3c,stroke-width:2px,color:#fff
    classDef firecrawl fill:#ff6b1a,stroke:#ff4500,stroke-width:2px,color:#fff
    classDef good fill:#4caf50,stroke:#388e3c,stroke-width:2px,color:#fff
    classDef bad fill:#f44336,stroke:#d32f2f,stroke-width:2px,color:#fff
    classDef synthesis fill:#9c27b0,stroke:#7b1fa2,stroke-width:2px,color:#fff
    classDef answer fill:#3a4a5c,stroke:#2c3a47,stroke-width:3px,color:#fff
```
- Query Analysis - Understanding user intent and complexity
- Decomposition - Breaking into focused sub-questions
- Multi-Search - Parallel searches via Firecrawl API
- Content Extraction - Markdown content from relevant sources
- Validation - Confidence scoring for answer completeness
- Retry Logic - Alternative strategies for gaps
- Synthesis - LLM combines findings into comprehensive response
- Citation - Full source attribution and follow-up generation
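The validation step above can be sketched as a small filter over scored sources, using the 0.7 confidence threshold. This is a minimal illustration with hypothetical names, not Firesearch's actual internals:

```typescript
// Sketch of the validation step: keep sources whose relevance confidence
// meets the threshold; low-confidence gaps feed the retry logic.
interface ScoredSource {
  url: string;
  confidence: number; // 0..1 relevance score from the validator
}

const MIN_ANSWER_CONFIDENCE = 0.7;

function validateSources(sources: ScoredSource[]) {
  const accepted = sources.filter(s => s.confidence >= MIN_ANSWER_CONFIDENCE);
  const needsRetry = sources.filter(s => s.confidence < MIN_ANSWER_CONFIDENCE);
  return { accepted, needsRetry };
}

const result = validateSources([
  { url: "apple.com", confidence: 0.95 },
  { url: "example.com", confidence: 0.4 },
]);
// result.accepted keeps apple.com; example.com goes to needsRetry
```

Sources that land in `needsRetry` correspond to the "Retry Logic" step, which reformulates the query rather than surfacing a low-confidence answer.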
Customize search behavior in the Settings page or via `lib/config.ts`:

```typescript
export const SEARCH_CONFIG = {
  // Search Parameters
  MAX_SEARCH_QUERIES: 12,      // Maximum parallel searches
  MAX_SOURCES_PER_SEARCH: 4,   // Sources per search query
  MAX_SOURCES_TO_SCRAPE: 3,    // Additional content scraping

  // Quality Control
  MIN_CONTENT_LENGTH: 100,     // Minimum viable content length
  MIN_ANSWER_CONFIDENCE: 0.7,  // Answer completeness threshold

  // Performance
  MAX_RETRIES: 2,              // Retry attempts for failures
  MAX_SEARCH_ATTEMPTS: 2,      // Alternative search strategies
  SCRAPE_TIMEOUT: 15000,       // Content extraction timeout (ms)

  // Response Generation
  SUMMARY_CHAR_LIMIT: 100,     // Source summary length
  ENABLE_FOLLOW_UPS: true,     // Generate follow-up questions
} as const;
```

- "Compare the latest iPhone, Samsung Galaxy, and Google Pixel flagship features"
- "What are the key differences between ChatGPT-4 and Claude 3.5 Sonnet?"
- "NVIDIA RTX 4090 vs RTX 4080 Super performance benchmarks"
- "Who are the founders of Firecrawl and what's their background?"
- "Latest funding rounds in AI infrastructure companies 2024"
- "Tesla's Q4 2024 earnings vs analyst expectations"
- "Recent breakthroughs in quantum computing error correction"
- "Climate change impact on arctic ice coverage 2020-2024"
- "Effectiveness of different learning management systems for remote education"
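Comparative queries like those above are a good fit for the decomposition step: each subject gets its own focused sub-query, plus a head-to-head query. The sketch below is illustrative only (not Firesearch's actual prompt logic):

```typescript
// Illustrative only: a naive decomposition of a comparative query into
// per-subject sub-queries plus a head-to-head comparison query.
function decomposeComparison(subjects: string[], topic: string): string[] {
  const perSubject = subjects.map(s => `${s} ${topic}`);
  const headToHead = `${subjects.join(" vs ")} comparison`;
  return [...perSubject, headToHead];
}

const subQueries = decomposeComparison(
  ["RTX 4090", "RTX 4080 Super"],
  "performance benchmarks",
);
// ["RTX 4090 performance benchmarks",
//  "RTX 4080 Super performance benchmarks",
//  "RTX 4090 vs RTX 4080 Super comparison"]
```

In the real pipeline an LLM performs this split, which lets it handle queries that don't follow a fixed "X vs Y" template.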
We welcome contributions! Here's how to get started:
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes and ensure tests pass
- Commit your changes: `git commit -m "Add amazing feature"`
- Push to your branch: `git push origin feature/amazing-feature`
- Open a Pull Request
- Follow TypeScript best practices
- Add tests for new features
- Update documentation as needed
- Use conventional commits for messages
- Ensure accessibility compliance
Firesearch leverages Firecrawl's `/search` endpoint for comprehensive web research:

```typescript
// Search with content extraction
const response = await fetch('/api/search', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: "iPhone 16 specifications",
    limit: 8,
    scrapeOptions: {
      formats: ["markdown"]
    }
  })
});
```

Support for multiple providers with a unified interface:
```typescript
// Dynamic provider switching
const llmResponse = await queryLLM({
  provider: "openai", // or "anthropic", "gemini", etc.
  model: "gpt-4o",
  messages: [...],
  apiKey: getApiKey(provider)
});
```

Firesearch is designed with privacy and control in mind. Future versions will support:
Connect to your own Firecrawl instance for:
- Enhanced Security - All data remains within your infrastructure
- Compliance - Meet strict regulatory requirements (GDPR, HIPAA, SOC2)
- Cost Control - Avoid per-request API fees for high-volume usage
- Customization - Tailor scraping behavior for specific needs
Run AI models locally for complete privacy:
- Ollama Integration - Support for Llama, Mistral, and other open models
- LM Studio Support - Easy local model management and inference
- Custom Endpoints - Connect to any OpenAI-compatible local server
- Zero Data Sharing - All processing happens on your hardware
- Complete Privacy - No data ever leaves your environment
- Cost Optimization - Reduce API costs for high-volume usage
- Regulatory Compliance - Meet enterprise security requirements
- Custom Configuration - Fine-tune behavior for specific use cases
- Click the "Deploy with Vercel" button above
- Add your environment variables in the Vercel dashboard
- Deploy automatically on every push to main
```bash
# Build for production
pnpm build

# Start production server
pnpm start

# Or use Docker
docker build -t firesearch .
docker run -p 3000:3000 firesearch
```

```bash
# Required
FIRECRAWL_API_KEY=your_firecrawl_key
OPENAI_API_KEY=your_openai_key

# Optional (can be set in Settings page)
ANTHROPIC_API_KEY=your_anthropic_key
GOOGLE_API_KEY=your_google_key
OPENROUTER_API_KEY=your_openrouter_key
GROQ_API_KEY=your_groq_key
GROK_API_KEY=your_grok_key
DEEPSEEK_API_KEY=your_deepseek_key

# Self-hosting Options (Coming Soon)
FIRECRAWL_API_URL=http://localhost:3002   # For self-hosted Firecrawl
OLLAMA_BASE_URL=http://localhost:11434    # For local Ollama models
LM_STUDIO_BASE_URL=http://localhost:1234  # For LM Studio local server
```

MIT License - see LICENSE file for details.
- Firecrawl for powerful web scraping capabilities
- LangChain for LangGraph agent framework
- Vercel for seamless deployment platform
- Open source community for continuous inspiration and contributions
Built with ❤️ by the Firesearch Team
Powered by Firecrawl and LangGraph
