When `entities.locations` is not available, widgets fall back to text pattern matching for country/city names.
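A minimal sketch of that fallback, in Python for illustration (the location list and word-boundary matching rule here are assumptions, not the widgets' actual implementation):

```python
import re

# Illustrative subset; the real widgets presumably carry a much fuller list.
KNOWN_LOCATIONS = ["Germany", "Berlin", "France", "Paris", "Japan", "Tokyo"]

def extract_locations(text, entities=None):
    """Prefer structured entities['locations']; otherwise pattern-match the text."""
    if entities and entities.get("locations"):
        return entities["locations"]
    found = []
    for name in KNOWN_LOCATIONS:
        # word boundaries so e.g. "Paris" never matches inside another word
        if re.search(rf"\b{re.escape(name)}\b", text):
            found.append(name)
    return found
```

The word-boundary anchors matter: plain substring search would produce false positives inside unrelated words.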
Development
```bash
# Install dependencies
npm install

# Run dev server
npm run dev

# Build for production
npm run build
```
Server Deployment
```bash
# SSH to your server
ssh user@your-server-ip

# Navigate to project
cd /opt/reddit-analyzer

# Restart API server
pkill -f gunicorn
nohup gunicorn -w 1 -b 0.0.0.0:5000 api_server_simple:app --timeout 120 > gunicorn.log 2>&1 &
```
Relevant Server Files
| File | Description |
| --- | --- |
| `/opt/reddit-analyzer/api_server_simple.py` | Main API server (simplified raw-dump approach) |
| `/opt/reddit-analyzer/data/` | Permanent storage; never deleted, new data is merged in |
| `/opt/reddit-analyzer/cache/` | Quick-access cache (copy of `data/`) |
| `/opt/reddit-analyzer/gunicorn.log` | Server logs |
API Endpoints
| Endpoint | Method | Description |
| --- | --- | --- |
| `/analyze` | POST | Fetch and analyze a user; merges with existing data |
| `/health` | GET | Server health check |
| `/users` | GET | List all stored users with stats |
Server-side Queueing (New)
To prevent overload and show users their place in line, the frontend now supports server-managed queueing for analysis requests. When enabled on the server, the flow is:
1. Client POSTs `/analyze` with `{"username": "...", "queue": true}` and header `X-Queue: 1`.
2. Server responds immediately with a queued status:

   ```jsonc
   {
     "status": "queued",      // or "processing"
     "request_id": "abc123",
     "position": 2,           // place in queue (1 = first)
     "eta_seconds": 90        // optional
   }
   ```

3. Client polls `GET /queue/status?request_id=abc123` every ~1.5 s until `status` is `done`.
4. Server returns the result either embedded in the status (`{ "status": "done", "result": {...} }`) or via `GET /queue/result?request_id=abc123`.
Frontend changes are backward compatible: if the server does not support queueing, /analyze will return the analysis as before.
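The client-side decision logic can be sketched in Python (the `post`/`get` transport callables are hypothetical injected helpers, e.g. thin wrappers around an HTTP library, so the flow is testable offline; endpoint names follow the section above):

```python
import time

def run_analysis(username, post, get, poll_interval=0.0):
    """Submit /analyze and transparently handle both server modes.

    `post(path, body, headers)` and `get(path)` are injected transport
    callables that return parsed JSON as dicts.
    """
    resp = post("/analyze", {"username": username, "queue": True},
                {"X-Queue": "1"})
    if resp.get("status") not in ("queued", "processing"):
        return resp  # old server without queueing: body is already the analysis
    rid = resp["request_id"]
    while True:
        s = get(f"/queue/status?request_id={rid}")
        if s.get("status") == "done":
            # result may be embedded in the status, or served separately
            return s.get("result") or get(f"/queue/result?request_id={rid}")
        if s.get("status") == "error":
            raise RuntimeError(s.get("message", "analysis_failed"))
        time.sleep(poll_interval)  # the real frontend waits ~1.5 s
```

Because the queued branch is only taken when the response carries a `queued`/`processing` status, an old server's direct analysis response falls straight through, which is exactly the backward-compatibility behavior described above.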
Minimal Flask server implementation sketch (add to /opt/reddit-analyzer/api_server_simple.py):
```python
# --- queue.py (inline or separate module) ---
# Assumes the existing Flask `app`, `request`, and `jsonify` from
# api_server_simple.py are in scope.
import threading, queue, uuid
from typing import Dict

job_q = queue.Queue()           # pending jobs
statuses: Dict[str, dict] = {}  # request_id -> status/result

def worker_thread(process_fn):
    while True:
        job = job_q.get()
        if job is None:
            break
        rid = job['request_id']
        statuses[rid]['status'] = 'processing'
        try:
            result = process_fn(job['payload'])
            # store the result before flipping the status so pollers
            # never observe 'done' without a result
            statuses[rid]['result'] = result
            statuses[rid]['status'] = 'done'
        except Exception as e:
            statuses[rid]['status'] = 'error'
            statuses[rid]['message'] = str(e)
        finally:
            job_q.task_done()

def queue_position(rid: str) -> int:
    # compute 1-based position; simple O(n) scan
    items = list(job_q.queue)
    for i, j in enumerate(items, start=1):
        if j['request_id'] == rid:
            return i
    return 0  # 0 means up next or already processing

# In your Flask app startup:
# start a single worker to guarantee concurrency=1
# worker = threading.Thread(target=worker_thread, args=(process_analyze,), daemon=True)
# worker.start()

# --- in your Flask routes ---
@app.route('/analyze', methods=['POST'])
def analyze_route():
    data = request.get_json(force=True)
    username = data.get('username', '').strip()
    if not username:
        return jsonify({'error': 'username_required'}), 400
    payload = {
        'username': username,
        'force_refresh': bool(data.get('force_refresh')),
        'include_raw': bool(data.get('include_raw')),
        'top': int(data.get('top') or 0)
    }
    # If the client requests queueing, enqueue
    if request.headers.get('X-Queue') == '1' or data.get('queue'):
        rid = uuid.uuid4().hex
        statuses[rid] = {'status': 'queued'}
        job_q.put({'request_id': rid, 'payload': payload})
        pos = queue_position(rid)
        return jsonify({'status': 'queued', 'request_id': rid, 'position': pos})
    # Backward-compatible: process synchronously
    result = process_analyze(payload)
    return jsonify(result)

@app.get('/queue/status')
def queue_status():
    rid = request.args.get('request_id', '')
    s = statuses.get(rid)
    if not s:
        return jsonify({'status': 'error', 'message': 'unknown_request'}), 404
    pos = queue_position(rid)
    out = {'status': s.get('status', 'queued'), 'position': pos}
    if 'eta_seconds' in s:
        out['eta_seconds'] = s['eta_seconds']
    if out['status'] == 'done' and 'result' in s:
        # You can omit the result here and serve it from /queue/result instead
        pass
    return jsonify(out)

@app.get('/queue/result')
def queue_result():
    rid = request.args.get('request_id', '')
    s = statuses.get(rid)
    if not s:
        return jsonify({'status': 'error', 'message': 'unknown_request'}), 404
    if s.get('status') != 'done':
        return jsonify({'status': s.get('status', 'queued')}), 202
    return jsonify(s['result'])
```
Hook your existing analysis function into process_analyze(payload) so the worker executes it. Keep the worker count at 1 for strict queueing.
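The worker wiring can be smoke-tested end to end with a stub in place of the real analysis function (this is a self-contained condensation of the sketch above; `process_analyze` here is a canned stand-in, not the project's actual function):

```python
import queue, threading, uuid

job_q = queue.Queue()
statuses = {}

def process_analyze(payload):
    # stub standing in for the real analysis; returns a canned result
    return {"username": payload["username"], "posts_analyzed": 10}

def worker_thread(process_fn):
    # single consumer thread -> strict FIFO, concurrency of exactly 1
    while True:
        job = job_q.get()
        if job is None:
            break
        rid = job["request_id"]
        statuses[rid]["status"] = "processing"
        try:
            statuses[rid]["result"] = process_fn(job["payload"])
            statuses[rid]["status"] = "done"
        except Exception as e:
            statuses[rid]["status"] = "error"
            statuses[rid]["message"] = str(e)
        finally:
            job_q.task_done()

threading.Thread(target=worker_thread, args=(process_analyze,), daemon=True).start()

rid = uuid.uuid4().hex
statuses[rid] = {"status": "queued"}
job_q.put({"request_id": rid, "payload": {"username": "spez"}})
job_q.join()  # block until the worker has drained the queue
```

`queue.Queue.join()` returns once every enqueued job has been matched by a `task_done()` call, so after it returns the status dict is guaranteed to be in its final state.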
About
React dashboard for analyzing Reddit user activity, behavior patterns, and content with AI-powered insights. Features content analysis, karma breakdowns, and activity timelines.