Chrome extension for extracting YouTube transcripts, analyzing comments, and running transcript-grounded Q&A with selectable LLM providers.
- Extracts transcripts from YouTube videos and Shorts.
- Supports multi-language caption retrieval when available.
- Extracts captions locally from YouTube player/page data without requiring YouTube Data API v3.
- Analyzes comments for sentiment and recurring themes.
- Answers user questions from transcript context (RAG-style flow).
- Supports `openai`, `huggingface`, `gemini`, and local `ollama` providers.
- Provides a manual transcript mode when automatic extraction is not possible.
Supported URL formats:
- `https://www.youtube.com/watch?v=...`
- `https://www.youtube.com/shorts/...`
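The two supported URL shapes can be recognized with a small matcher. The helper below is a hypothetical illustration (not the extension's actual code) that pulls the 11-character video ID out of either form:

```javascript
// Hypothetical helper for illustration: returns the video ID for a supported
// watch/shorts URL, or null for anything else.
function getVideoId(url) {
  const watch = url.match(/^https:\/\/www\.youtube\.com\/watch\?v=([\w-]{11})/);
  if (watch) return watch[1];
  const shorts = url.match(/^https:\/\/www\.youtube\.com\/shorts\/([\w-]{11})/);
  if (shorts) return shorts[1];
  return null;
}
```

Non-YouTube pages fall through to `null`, which is where the popup would report an unsupported URL.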
- Clone the repository:
  `git clone https://github.com/Rishpraveen/Youtube_Context_Analysis_using_API.git`
- Open `chrome://extensions/`.
- Enable Developer mode.
- Click Load unpacked and select this project folder.
Open the extension options page and configure the following.
- YouTube API key: Optional. Transcript extraction primarily uses player/page methods; the API key is mainly used for more reliable comment fetching.
- LLM provider: Choose one of `openai`, `huggingface`, `gemini`, or `ollama`.
- Provider-specific settings:
  - `openai`: API key and model.
  - `huggingface`: API key (optional for some models) and model.
  - `gemini`: API key and model.
  - `ollama`: local endpoint (default `http://localhost:11434`) and model.
- Performance and behavior settings:
  - `batchSize`, `maxComments`, `chunkSize`
  - `fetchAllLanguages`, `preferredLanguages`, `autoTranslateCaptions`
  - `browserExtractionEnabled`
  - `manualMode` and `defaultTranscript`
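The `chunkSize` setting implies transcripts are split into fixed-size pieces before analysis. A minimal sketch of such a splitter is below; the function name and splitting strategy are assumptions for illustration, not the extension's actual implementation:

```javascript
// Hypothetical sketch of how a chunkSize-style setting could drive transcript
// splitting; the extension's real logic may differ (e.g. overlap, sentence
// boundaries).
function chunkText(text, chunkSize) {
  const chunks = [];
  for (let i = 0; i < text.length; i += chunkSize) {
    chunks.push(text.slice(i, i + chunkSize));
  }
  return chunks;
}
```

A smaller `chunkSize` means more, shorter pieces per transcript, which trades more LLM calls for tighter per-request context.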
- Open a YouTube video or Short.
- Open the extension popup.
- Use tabs:
  - Transcript: fetch the transcript.
  - Comments: analyze comments.
  - RAG Analysis: ask questions based on transcript text.
- Export transcript/comments/RAG outputs from popup export buttons.
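The RAG Analysis tab answers questions from transcript context. As a toy illustration of such a retrieval step (all names here are hypothetical, not the extension's code), one can score transcript chunks by keyword overlap with the question and keep the best matches as LLM context:

```javascript
// Toy retrieval sketch (hypothetical): score each transcript chunk by how
// many question words it contains, keep the top k, and use those chunks as
// the context sent to the selected LLM provider.
function topChunks(question, chunks, k) {
  const words = question.toLowerCase().split(/\W+/).filter(Boolean);
  return chunks
    .map((chunk) => ({
      chunk,
      score: words.filter((w) => chunk.toLowerCase().includes(w)).length,
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((c) => c.chunk);
}
```

Real RAG pipelines typically use embeddings rather than raw keyword overlap, but the shape of the flow is the same: retrieve relevant chunks, then ask the model with them as context.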
Keyboard shortcuts in popup:
- `Ctrl+1`, `Ctrl+2`, `Ctrl+3`: switch tabs.
- `t`: fetch transcript (Transcript tab).
- `c`: analyze comments (Comments tab).
- `r`: focus the RAG input (RAG tab).
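The shortcut table above can be dispatched from a single keydown handler. The sketch below is a hypothetical illustration of that wiring (the real `popup.js` may be structured differently); it maps a key event to an action name:

```javascript
// Hypothetical shortcut dispatch sketch (not the extension's actual code).
const SHORTCUTS = {
  "ctrl+1": "switch-tab-1",
  "ctrl+2": "switch-tab-2",
  "ctrl+3": "switch-tab-3",
  "t": "fetch-transcript",
  "c": "analyze-comments",
  "r": "focus-rag-input",
};

// Returns the bound action name for a KeyboardEvent-like object, or null
// when the key combination is unbound.
function actionForKey(event) {
  const key = (event.ctrlKey ? "ctrl+" : "") + event.key.toLowerCase();
  return SHORTCUTS[key] || null;
}
```

In the popup this would be attached via `document.addEventListener("keydown", ...)`, with text inputs excluded so typing a question does not trigger shortcuts.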
To run analysis locally:
- Install Ollama.
- Start Ollama service locally.
- Pull at least one model (example:
llama3.2:3b). - In options, set provider to
ollama. - Confirm endpoint and model values match your local setup.
If you installed models using Alpaca, use the same model name shown by `ollama list` in the Model field.
If the endpoint or model is invalid, analysis requests fail with a provider error in the popup status.
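For reference, a request to a local Ollama server can be sketched as follows. The payload shape follows Ollama's public `/api/generate` API (`stream: false` requests a single JSON response); the wrapper function itself is a hypothetical illustration, not the extension's actual code:

```javascript
// Hypothetical request builder for Ollama's /api/generate endpoint.
function buildOllamaRequest(endpoint, model, prompt) {
  return {
    url: `${endpoint}/api/generate`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false asks Ollama to return one JSON object instead of a
      // stream of partial responses.
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}
```

Usage would look like `const { url, options } = buildOllamaRequest("http://localhost:11434", "llama3.2:3b", question); await fetch(url, options);` — a non-OK response here is the kind of failure surfaced as a provider error in the popup status.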
- `manifest.json`: extension configuration and permissions.
- `background.js`: orchestration, API calls, caching, provider routing, context menu.
- `content.js`: page-level extraction (captions/comments metadata and fallback logic).
- `popup.js`: popup UI actions, progress, rendering, export.
- `options.js`: settings management and provider/API validation.
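The provider routing done in `background.js` can be illustrated with a toy sketch (hypothetical, not the extension's code): hosted providers require an API key before any request is attempted, while `ollama` only needs a local endpoint and model name.

```javascript
// Toy provider routing sketch (not the extension's actual code).
function routeProvider(settings) {
  switch (settings.provider) {
    case "openai":
    case "huggingface":
    case "gemini":
      // Hosted providers fail fast when no API key is configured.
      if (!settings.apiKey) throw new Error(`${settings.provider}: missing API key`);
      return { provider: settings.provider, remote: true };
    case "ollama":
      // Local provider: only an endpoint and model name are required.
      return {
        provider: "ollama",
        remote: false,
        endpoint: settings.endpoint || "http://localhost:11434",
      };
    default:
      throw new Error(`Unknown provider: ${settings.provider}`);
  }
}
```

Validating settings at routing time keeps provider errors attributable to configuration, which is what the options page's provider/API validation reports.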
- Transcript not loading: Verify you are on a supported YouTube URL, then try manual mode if captions are unavailable.
- Provider errors: Re-check selected provider credentials/endpoint/model in options.
- Ollama not responding: Confirm Ollama is running and reachable at configured endpoint.
- Comment fetch issues: If YouTube API path fails, fallback scraping may still work depending on page state and loaded comments.
- The project is plain JavaScript (no bundler).
- Load as unpacked extension for testing.
- A small URL support script exists: `test_shorts_support.js` (requires a local Node.js runtime).
See Contributing.md.
MIT. See LICENSE.