docs: Update README and add run script for Redis and RQ worker integration

This commit enhances the README with detailed instructions for installing and starting Redis, including commands for various operating systems. It clarifies the automatic startup of the RQ worker with the FastAPI server and updates the project status for Phase 2 features. Additionally, a new script `run_api_with_worker.sh` is introduced to streamline the process of starting the FastAPI server alongside the RQ worker, ensuring a smoother setup for users. The worker now has a unique name to prevent conflicts during execution.
tanyar09 2025-10-31 13:01:58 -04:00
parent 4c2148f7fc
commit 2f039a1d48
5 changed files with 237 additions and 35 deletions

README.md

@ -65,22 +65,33 @@ This creates the SQLite database at `data/punimtag.db` (default). For PostgreSQL
### Running the Application
**Prerequisites:**
- Redis must be installed and running (for background jobs)
**Install Redis (if not installed):**
```bash
# On Ubuntu/Debian:
sudo apt update && sudo apt install -y redis-server
sudo systemctl start redis-server
sudo systemctl enable redis-server # Auto-start on boot
# On macOS with Homebrew:
brew install redis
brew services start redis
# Verify Redis is running:
redis-cli ping # Should respond with "PONG"
```
**Start Redis (if installed but not running):**
```bash
# On Linux:
sudo systemctl start redis-server
# Or run directly:
redis-server
```
**Terminal 1 - Redis (if not running as service):**
```bash
redis-server
```
**Terminal 2 - Backend API (automatically starts RQ worker):**
```bash
cd /home/ladmin/Code/punimtag
source venv/bin/activate
@ -88,15 +99,16 @@ export PYTHONPATH=/home/ladmin/Code/punimtag
uvicorn src.web.app:app --host 127.0.0.1 --port 8000
```
You should see:
```
✅ RQ worker started in background subprocess (PID: ...)
INFO: Started server process
INFO: Uvicorn running on http://127.0.0.1:8000
```
**Note:** The RQ worker automatically starts in a background subprocess when the API starts. You'll see a confirmation message with the worker PID. If Redis isn't running, you'll see a warning message.
**Terminal 3 - Frontend:**
```bash
cd /home/ladmin/Code/punimtag/frontend
npm run dev
@ -108,7 +120,11 @@ Then open your browser to **http://localhost:3000**
- Username: `admin`
- Password: `admin`
**Note:**
- The RQ worker starts automatically in a background subprocess when the API server starts
- Make sure Redis is running first, or the worker won't start
- Worker names are unique to avoid conflicts when restarting
- Photo uploads are stored in `data/uploads` (configurable via `PHOTO_STORAGE_DIR` env var)
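The unique worker name mentioned above is produced by appending a random suffix; a minimal sketch of the idea (the `punimtag-worker-` prefix matches the code in this commit):

```python
import uuid

# A short random suffix makes each worker registration unique, so a
# restarted worker never collides with a stale entry of the same name.
worker_name = f"punimtag-worker-{uuid.uuid4().hex[:8]}"
print(worker_name)
```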
---
@ -119,6 +135,13 @@ Then open your browser to **http://localhost:3000**
- **[Phase 1 Status](docs/PHASE1_FOUNDATION_STATUS.md)**: Phase 1 implementation status
- **[Phase 1 Checklist](docs/PHASE1_CHECKLIST.md)**: Complete Phase 1 checklist
**Phase 2 Features:**
- Photo import via folder scan or file upload
- Background processing with progress tracking
- Real-time job status updates (SSE)
- Duplicate detection by checksum
- EXIF metadata extraction
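Checksum-based duplicate detection can be sketched as chunked hashing; the helper name and the choice of SHA-256 below are illustrative assumptions, not necessarily what the import service uses:

```python
import hashlib
from pathlib import Path

def file_checksum(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in fixed-size chunks so large photos never load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()
```

An import job would compare this digest against checksums already stored in the database and count the photo as "existing" on a match.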
---
## 🏗️ Project Structure
@ -158,10 +181,10 @@ punimtag/
- ✅ Health, version, and metrics endpoints
- ✅ JWT authentication (login, refresh, user info)
- ✅ Job management endpoints (RQ/Redis integration)
- ✅ API routers for photos, faces, people, tags (placeholders)
- ✅ SQLAlchemy models for all entities
- ✅ Alembic migrations configured and applied
- ✅ Database initialized (SQLite default, PostgreSQL supported)
- ✅ RQ worker auto-start (starts automatically with API server)
**Frontend:**
- ✅ React + Vite + TypeScript setup
@ -176,13 +199,40 @@ punimtag/
- ✅ Indices configured for performance
- ✅ SQLite database at `data/punimtag.db`
### Phase 2: Image Ingestion & Scan Tab ✅ **COMPLETE**
**Backend:**
- ✅ Photo import service with checksum computation
- ✅ EXIF date extraction and image metadata
- ✅ Folder scanning with recursive option
- ✅ File upload support
- ✅ Background job processing with RQ
- ✅ Real-time job progress via SSE (Server-Sent Events)
- ✅ Duplicate detection (by path and checksum)
- ✅ Photo storage configuration (`PHOTO_STORAGE_DIR`)
**Frontend:**
- ✅ Scan tab UI with folder selection
- ✅ Drag-and-drop file upload
- ✅ Recursive scan toggle
- ✅ Real-time job progress with progress bar
- ✅ Job status monitoring (SSE integration)
- ✅ Results display (added/existing counts)
- ✅ Error handling and user feedback
**Worker:**
- ✅ RQ worker auto-starts with API server
- ✅ Unique worker names to avoid conflicts
- ✅ Graceful shutdown handling
### Next: Phase 3 - Face Processing & Identify
- DeepFace pipeline integration
- Face detection (RetinaFace, MTCNN, OpenCV, SSD)
- Face embeddings computation (ArcFace, Facenet, etc.)
- Identify workflow UI
- Auto-match engine
- Process tab implementation
---
@ -213,6 +263,9 @@ SECRET_KEY=your-secret-key-here
# Single-user credentials (change in production!)
ADMIN_USERNAME=admin
ADMIN_PASSWORD=admin
# Photo storage directory (default: data/uploads)
PHOTO_STORAGE_DIR=data/uploads
```
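Reading `PHOTO_STORAGE_DIR` with the documented default can be as simple as (the helper name is illustrative):

```python
import os
from pathlib import Path

def photo_storage_dir() -> Path:
    """Resolve the upload directory, falling back to the documented default."""
    return Path(os.environ.get("PHOTO_STORAGE_DIR", "data/uploads"))
```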
---
@ -242,20 +295,26 @@ npm test
- Database setup
- Basic API endpoints
### ✅ Phase 2: Image Ingestion & Scan Tab (Complete)
- ✅ Photo import (folder scan and file upload)
- ✅ Background job processing with RQ
- ✅ Real-time progress tracking via SSE
- ✅ Scan tab UI implementation
- ✅ Duplicate detection and metadata extraction
### 🔄 Phase 3: Processing & Identify (In Progress)
- Face detection and processing pipeline (DeepFace)
- Identify workflow UI
- Auto-match engine
- Process tab implementation
### 📋 Phase 4: Search & Tags
- Search endpoints with filters
- Tag management UI
- Virtualized photo grid
- Advanced filtering
### 🎨 Phase 5: Polish & Release
- Performance optimization
- Accessibility improvements
- Production deployment

Binary file not shown (new image file, 7.8 MiB).
run_api_with_worker.sh (new executable file, 57 lines)

@ -0,0 +1,57 @@
#!/bin/bash
# Start FastAPI server with RQ worker in background
cd "$(dirname "$0")"
# Activate virtual environment if it exists
if [ -d "venv" ]; then
    source venv/bin/activate
fi
# Set Python path
export PYTHONPATH="$(pwd)"
# Check if Redis is running
if ! redis-cli ping > /dev/null 2>&1; then
    echo "⚠️ Redis is not running. Starting Redis..."
    redis-server --daemonize yes 2>/dev/null || {
        echo "❌ Failed to start Redis. Please start it manually: redis-server"
        exit 1
    }
    sleep 1
fi
# Function to cleanup on exit
cleanup() {
    echo "Shutting down..."
    kill $WORKER_PID 2>/dev/null
    kill $API_PID 2>/dev/null
    exit
}
trap cleanup SIGINT SIGTERM
# Start RQ worker in background
echo "🚀 Starting RQ worker..."
python -m src.web.worker &
WORKER_PID=$!
# Give worker a moment to start
sleep 2
# Start FastAPI server
echo "🚀 Starting FastAPI server..."
uvicorn src.web.app:app --host 127.0.0.1 --port 8000 &
API_PID=$!
echo ""
echo "✅ Server running on http://127.0.0.1:8000"
echo "✅ Worker running (PID: $WORKER_PID)"
echo "✅ API running (PID: $API_PID)"
echo ""
echo "Press Ctrl+C to stop both services"
echo ""
# Wait for both processes
wait


@ -1,5 +1,11 @@
from __future__ import annotations
import os
import subprocess
import sys
from contextlib import asynccontextmanager
from pathlib import Path
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
@ -14,10 +20,86 @@ from src.web.api.tags import router as tags_router
from src.web.api.version import router as version_router
from src.web.settings import APP_TITLE, APP_VERSION
# Global worker process (will be set in lifespan)
_worker_process: subprocess.Popen | None = None
def start_worker() -> None:
    """Start RQ worker in background subprocess."""
    global _worker_process
    try:
        from redis import Redis

        # Check Redis connection first
        redis_conn = Redis(host="localhost", port=6379, db=0, decode_responses=False)
        redis_conn.ping()

        # Start worker as a subprocess (avoids signal handler issues)
        project_root = Path(__file__).parent.parent.parent
        python_executable = sys.executable
        _worker_process = subprocess.Popen(
            [
                python_executable,
                "-m",
                "src.web.worker",
            ],
            cwd=str(project_root),
            stdout=None,  # Don't capture - let output go to console
            stderr=None,  # Don't capture - let errors go to console
            env={
                **{k: v for k, v in os.environ.items()},
                "PYTHONPATH": str(project_root),
            },
        )

        # Give it a moment to start, then check if it's still running
        import time
        time.sleep(0.5)
        if _worker_process.poll() is not None:
            # Process already exited - there was an error
            print(f"❌ Worker process exited immediately with code {_worker_process.returncode}")
            print("   Check worker errors above")
        else:
            print(f"✅ RQ worker started in background subprocess (PID: {_worker_process.pid})")
    except Exception as e:
        print(f"⚠️ Failed to start RQ worker: {e}")
        print("   Background jobs will not be processed. Ensure Redis is running.")
def stop_worker() -> None:
    """Stop RQ worker gracefully."""
    global _worker_process
    if _worker_process:
        try:
            _worker_process.terminate()
            try:
                _worker_process.wait(timeout=5)
            except subprocess.TimeoutExpired:
                _worker_process.kill()
            print("✅ RQ worker stopped")
        except Exception:
            pass
@asynccontextmanager
async def lifespan(app: FastAPI):
    """Lifespan context manager for startup and shutdown events."""
    # Startup
    start_worker()
    yield
    # Shutdown
    stop_worker()
def create_app() -> FastAPI:
    """Create and configure the FastAPI application instance."""
    app = FastAPI(
        title=APP_TITLE,
        version=APP_VERSION,
        lifespan=lifespan,
    )
    app.add_middleware(
        CORSMiddleware,


@ -6,8 +6,9 @@ import signal
import sys
from typing import NoReturn
import uuid
from rq import Worker
from rq.connections import use_connection
from redis import Redis
from src.web.services.tasks import import_photos_task
@ -24,12 +25,15 @@ def main() -> NoReturn:
    signal.signal(signal.SIGTERM, _handle_sigterm)
    signal.signal(signal.SIGINT, _handle_sigterm)

    # Generate unique worker name to avoid conflicts
    worker_name = f"punimtag-worker-{uuid.uuid4().hex[:8]}"

    # Register tasks with worker
    # Tasks are imported from services.tasks
    worker = Worker(
        ["default"],
        connection=redis_conn,
        name=worker_name,
    )

    # Start worker