A dashboard built to track and visualize common doubts across different project groups. It began as a security research exercise and grew into a full-stack project with a clean interface and custom data handling.
- Periodic Monitoring: Visibility into new doubts synced every 5 days.
- Advanced Filtering: Quickly isolate groups "With Doubts" or filter by "Recent Updates".
- Intelligent Sorting: Sort by group number, volume of doubts, or recency.
- Multi-Format Export: Export aggregated data as professional CSV or formatted Markdown.
- UI/UX: Dark-themed, responsive interface optimized for Desktop and Mobile (iOS/Safari).
- CDC Logic: Custom hashing mechanism to trigger updates only on actual content changes.
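The change-detection idea above can be sketched as a content-hash comparison: hash each record's serialized content and write only when the hash differs from the stored one. The function names and record shape here are illustrative, not the project's actual API.

```python
import hashlib
import json

def content_hash(record: dict) -> str:
    """Stable SHA-256 hash of a record's content (keys sorted for determinism)."""
    payload = json.dumps(record, sort_keys=True, ensure_ascii=False)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def should_update(record: dict, stored_hashes: dict, key: str) -> bool:
    """Trigger an update only when the content hash actually changed."""
    new_hash = content_hash(record)
    if stored_hashes.get(key) == new_hash:
        return False  # no real content change; skip the write
    stored_hashes[key] = new_hash  # remember the latest hash for next sync
    return True
```

Sorting the keys before hashing keeps the hash stable even if the upstream source reorders fields, so updates fire only on genuine content changes.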
- Framework: React 19
- Build Tool: Vite 8
- Language: TypeScript
- Styling: Vanilla CSS (CSS3 Variable-based Design System)
- Language: Python 3.10+
- Database/Cache: Redis (for high-performance data retrieval)
- Deployment: Configured for Vercel Serverless
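The Redis layer can be sketched as a cache-aside read path: return the cached payload on a hit, otherwise fetch and store it with a TTL. The key scheme and TTL below are illustrative assumptions; `client` is any object exposing redis-py's `get`/`setex` interface.

```python
import json

CACHE_TTL = 5 * 24 * 60 * 60  # seconds; 5 days, matching the sync cycle

def get_doubts(client, group_id: int, fetch_fn):
    """Return cached doubts for a group; on a miss, fetch and cache with a TTL.

    `client` duck-types redis-py, e.g.
    client = redis.Redis.from_url(os.environ["REDIS_URL"])  # hypothetical setup
    """
    key = f"doubts:group:{group_id}"  # illustrative key scheme
    cached = client.get(key)
    if cached is not None:
        return json.loads(cached)       # cache hit: skip the expensive fetch
    data = fetch_fn(group_id)           # cache miss: fetch fresh data
    client.setex(key, CACHE_TTL, json.dumps(data))
    return data
```

Serving dashboard reads from Redis keeps request latency low and means the scraper, not the user, pays the cost of slow upstream fetches.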
- Node.js (v18+) & pnpm
- Python 3.10+
- Redis Server (Local or Managed)
- Clone & Install:

  ```bash
  git clone https://github.com/manas/doubt-dashboard.git
  cd doubt-dashboard
  pnpm install
  ```

- Backend Setup:

  ```bash
  # Create a virtual environment
  python -m venv .venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate

  # Install dependencies
  pip install -r dev-requirements.txt
  ```

- Variables: Copy `.env.example` to `.env` and configure your Redis credentials.

- Launch:

  ```bash
  pnpm dev  # Runs both frontend and backend concurrently
  ```
```
.
├── api/              # Vercel Serverless Functions (Python)
├── scripts/          # Backend utility scripts & scrapers
├── src/              # Frontend source code
│   ├── components/   # Modular UI components
│   └── helpers.ts    # Data processing & export logic
└── index.html        # Entry point
```

The project sparked when I discovered an IDOR (Insecure Direct Object Reference) vulnerability in my university's doubt submission portal. By manipulating URL parameters, it was possible to access submissions from any group. What started as a simple Python script to gather academic data turned into an engineering exercise in caching, modular architectures, and custom data processing.
One of the goals was to implement the backend without any framework.
- The constraint was a deliberate challenge: gain hands-on experience with raw Python server logic and manual request handling.
- While the backend is intentionally minimal, the frontend uses React to maintain an organized, modular developer experience.
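A framework-free endpoint in this style can be built on nothing but the standard library's `http.server`, handling routing and response headers by hand. This is a sketch, not the project's actual handler; the route and payload are assumptions.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class DoubtHandler(BaseHTTPRequestHandler):
    """Minimal JSON endpoint with no framework: routing and headers by hand."""

    def do_GET(self):
        if self.path == "/api/health":  # illustrative route
            body = json.dumps({"status": "ok"}).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

# To run locally:
# HTTPServer(("127.0.0.1", 8000), DoubtHandler).serve_forever()
```

Everything a framework would normally hide — status codes, content negotiation, content length — has to be written explicitly, which is exactly the learning exercise.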
- Privacy First: All personally identifiable information (PII), including names and images, is permanently stripped from the data pipeline to comply with GDPR.
- Infrastructure Respect: The scraper runs on a low-frequency cycle (every 5 days) with strict rate-limiting to avoid any strain on university systems.
- Security: The scraping logic is decoupled from the dashboard to ensure the dashboard only handles processed, sanitized data.
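The rate-limiting described above can be sketched as strict spacing between sequential requests, so the scraper never bursts against university systems. The interval value and function names are assumptions for illustration.

```python
import time

REQUEST_INTERVAL = 2.0  # seconds between requests; assumed value

def polite_fetch_all(group_ids, fetch_fn, interval=REQUEST_INTERVAL, sleep=time.sleep):
    """Fetch each group sequentially, pausing between requests to limit load.

    `sleep` is injectable so tests can observe pauses without waiting.
    """
    results = {}
    for i, gid in enumerate(group_ids):
        if i > 0:
            sleep(interval)  # enforce spacing between consecutive requests
        results[gid] = fetch_fn(gid)
    return results
```

Sequential fetching with a fixed pause is deliberately the simplest possible limiter: with a 5-day sync cycle there is no need for concurrency or token buckets.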
This project is for academic and educational purposes only.
- Ethics: Before using or adapting this code, ensure you comply with your institution's ethical guidelines and terms of service.
- Liability: The author is not responsible for any misuse of this tool. It was built as a proof-of-concept for research and learning.
