feat: Add new analysis documents and update installation scripts for backend integration
This commit introduces several new analysis documents, including Auto-Match Load Performance Analysis, Folder Picker Analysis, Monorepo Migration Summary, and various performance analysis documents. Additionally, the installation scripts are updated to reflect changes in backend service paths, ensuring proper integration with the new backend structure. These enhancements provide better documentation and streamline the setup process for users.
This commit is contained in:
parent 12c62f1deb
commit 68d280e8f5

174 AUTOMATCH_LOAD_ANALYSIS.md Normal file
@ -0,0 +1,174 @@
# Auto-Match Load Performance Analysis

## Summary

The Auto-Match page loads significantly slower than the Identify page because it lacks the performance optimizations that Identify uses. Auto-Match always fetches all data upfront with no caching, while Identify uses sessionStorage caching and lazy loading.

## Identify Page Optimizations (Current)

### 1. **SessionStorage Caching**

- **State Caching**: Caches faces, current index, similar faces, and form data in sessionStorage
- **Settings Caching**: Caches filter settings (pageSize, minQuality, sortBy, etc.)
- **Restoration**: On mount, restores cached state instead of making API calls
- **Implementation**:
  - `STATE_KEY = 'identify_state'` - stores faces, currentIdx, similar, faceFormData, selectedSimilar
  - `SETTINGS_KEY = 'identify_settings'` - stores filter settings
  - Only loads fresh data if no cached state exists
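A minimal sketch of this restore-or-fetch pattern (the key name `STATE_KEY` comes from the document; the `loadFaces` fetcher and the `IdentifyState` shape are illustrative stand-ins):

```typescript
// Storage is injected so the helper is testable outside the browser;
// in the page itself, pass window.sessionStorage.
interface KV {
  getItem(key: string): string | null
  setItem(key: string, value: string): void
}

const STATE_KEY = 'identify_state'

interface IdentifyState {
  faces: unknown[]
  currentIdx: number
}

// Restore cached state if present; otherwise fetch fresh data and cache it.
async function restoreOrLoad(
  storage: KV,
  loadFaces: () => Promise<IdentifyState>,
): Promise<IdentifyState> {
  const cached = storage.getItem(STATE_KEY)
  if (cached) {
    return JSON.parse(cached) as IdentifyState // instant, no API call
  }
  const fresh = await loadFaces()
  storage.setItem(STATE_KEY, JSON.stringify(fresh))
  return fresh
}
```

On a second mount the cached branch returns immediately, which is exactly why Identify feels instant on revisit.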
### 2. **Lazy Loading**

- **Similar Faces**: Only loads similar faces when:
  - `compareEnabled` is true
  - Current face changes
  - Not loaded during initial page load
- **Images**: Uses lazy loading for similar face images (`loading="lazy"`)

### 3. **Image Preloading**

- Preloads next/previous face images in background
- Uses `new Image()` to preload without blocking UI
- Delayed by 100ms to avoid blocking current image load
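The neighbor-selection part of that preloading can be sketched as a pure helper (the browser-only `new Image()` and `setTimeout` wiring is omitted; this is an assumption about the shape, not the page's actual code):

```typescript
// Return the indices whose images are worth preloading around the
// current face: the previous and next ones that actually exist.
function neighborsToPreload(currentIdx: number, total: number): number[] {
  const candidates = [currentIdx - 1, currentIdx + 1]
  return candidates.filter((i) => i >= 0 && i < total)
}
```

In the page these indices would be fed to `new Image().src = ...` inside a ~100 ms `setTimeout`, per the bullets above.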
### 4. **Batch Operations**

- Uses `batchSimilarity` endpoint for unique faces filtering
- Single API call instead of multiple individual calls

### 5. **Progressive State Management**

- Uses refs to track restoration state
- Prevents unnecessary reloads during state restoration
- Only triggers API calls when actually needed
## Auto-Match Page (Current - No Optimizations)

### 1. **No Caching**

- **No sessionStorage**: Always makes fresh API calls on mount
- **No state restoration**: Always starts from scratch
- **No settings persistence**: Tolerance and other settings reset on page reload

### 2. **Eager Loading**

- **All Data Upfront**: Loads ALL people and ALL matches in single API call
- **No Lazy Loading**: All match data loaded even if user never views it
- **No Progressive Loading**: Everything must be loaded before UI is usable

### 3. **No Image Preloading**

- Images load on-demand as user navigates
- No preloading of next/previous person images

### 4. **Large API Response**

- Backend returns complete dataset:
  - All identified people
  - All matches for each person
  - All face metadata (photo info, locations, quality scores, etc.)
- Response size can be very large (hundreds of KB to MB) depending on:
  - Number of identified people
  - Number of matches per person
  - Amount of metadata per match

### 5. **Backend Processing**

The `find_auto_match_matches` function:

- Queries all identified faces (one per person, quality >= 0.3)
- For EACH person, calls `find_similar_faces` to find matches
- This means N database queries (where N = number of people)
- All processing happens synchronously before the response is sent
## Performance Comparison

### Identify Page Load Flow

```
1. Check sessionStorage for cached state
2. If cached: Restore state (instant, no API call)
3. If not cached: Load faces (paginated, ~50 faces)
4. Load similar faces only when face changes (lazy)
5. Preload next/previous images (background)
```

### Auto-Match Page Load Flow

```
1. Always call API (no cache check)
2. Backend processes ALL people:
   - Query all identified faces
   - For each person: query similar faces
   - Build complete response with all matches
3. Wait for complete response (can be large)
4. Render all data at once
```

## Key Differences

| Feature | Identify | Auto-Match |
|---------|----------|------------|
| **Caching** | ✅ sessionStorage | ❌ None |
| **State Restoration** | ✅ Yes | ❌ No |
| **Lazy Loading** | ✅ Similar faces only | ❌ All data upfront |
| **Image Preloading** | ✅ Next/prev faces | ❌ None |
| **Pagination** | ✅ Yes (page_size) | ❌ No (all at once) |
| **Progressive Loading** | ✅ Yes | ❌ No |
| **API Call Size** | Small (paginated) | Large (all data) |
| **Backend Queries** | 1-2 queries | N+1 queries (N = people) |
## Why Auto-Match is Slower

1. **No Caching**: Every page load requires a full API call
2. **Large Response**: All people + all matches in a single response
3. **N+1 Query Problem**: Backend makes one query per person to find matches
4. **Synchronous Processing**: All processing happens before the response
5. **No Lazy Loading**: All match data loaded even if never viewed

## Potential Optimizations for Auto-Match

### 1. **Add SessionStorage Caching** (High Impact)

- Cache people list and matches in sessionStorage
- Restore on mount instead of API call
- Similar to Identify page approach
### 2. **Lazy Load Matches** (High Impact)

- Load people list first
- Load matches for current person only
- Load matches for next person in background
- Similar to how Identify loads similar faces
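Assuming a hypothetical per-person endpoint (`fetchMatchesFor` here stands in for whatever API method would be added), the lazy pattern could look like:

```typescript
type Matches = unknown[]

// Lazily fetch matches per person, caching results so navigating
// back to a person never repeats the API call.
class LazyMatchLoader {
  private cache = new Map<string, Matches>()

  constructor(
    private fetchMatchesFor: (personId: string) => Promise<Matches>,
  ) {}

  async matchesFor(personId: string): Promise<Matches> {
    const hit = this.cache.get(personId)
    if (hit) return hit
    const matches = await this.fetchMatchesFor(personId)
    this.cache.set(personId, matches)
    return matches
  }

  // Fire-and-forget warm-up for the next person in the list.
  prefetch(personId: string): void {
    void this.matchesFor(personId).catch(() => {})
  }
}
```

Calling `prefetch(nextPersonId)` after rendering the current person mirrors how Identify warms up its neighbors in the background.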
### 3. **Pagination** (Medium Impact)

- Paginate people list (e.g., 20 people per page)
- Load matches only for visible people
- Reduces initial response size
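The page math itself is simple; a sketch assuming a plain 0-based offset scheme:

```typescript
// Slice one page out of the full people list (pageIdx is 0-based).
function pageOf<T>(items: T[], pageIdx: number, pageSize = 20): T[] {
  const start = pageIdx * pageSize
  return items.slice(start, start + pageSize)
}
```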
### 4. **Backend Optimization** (High Impact)

- Batch similarity queries instead of N+1 pattern
- Use `calculate_batch_similarities` for all people at once
- Cache results if tolerance hasn't changed
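A rough sketch of the reshaped service function. `calculate_batch_similarities` is named in this document; its exact signature, and the `people` dict shape, are assumptions made for illustration:

```python
from typing import Any, Callable


def find_auto_match_matches_batched(
    people: list[dict[str, Any]],
    calculate_batch_similarities: Callable[[list[Any]], list[list[Any]]],
) -> dict[str, list[Any]]:
    """One batched similarity call instead of one query per person.

    Replaces the N+1 loop (one find_similar_faces call per person)
    with a single call that takes all reference embeddings at once.
    """
    embeddings = [p["embedding"] for p in people]
    # Single batched call: one result list per input embedding.
    all_results = calculate_batch_similarities(embeddings)
    return {p["id"]: matches for p, matches in zip(people, all_results)}
```

The response shape stays the same, so the frontend would not need to change for this optimization alone.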
### 5. **Image Preloading** (Low Impact)

- Preload reference face images for next/previous people
- Preload match images for current person

### 6. **Progressive Rendering** (Medium Impact)

- Show people list immediately
- Load matches progressively as user navigates
- Show loading indicators for matches

## Code Locations

### Identify Page

- **Frontend**: `frontend/src/pages/Identify.tsx`
  - Lines 42-45: SessionStorage keys
  - Lines 272-347: State restoration logic
  - Lines 349-399: State saving logic
  - Lines 496-527: Image preloading
  - Lines 258-270: Lazy loading of similar faces

### Auto-Match Page

- **Frontend**: `frontend/src/pages/AutoMatch.tsx`
  - Lines 35-71: `loadAutoMatch` function (always calls API)
  - Lines 74-77: Auto-load on mount (no cache check)

### Backend

- **API Endpoint**: `src/web/api/faces.py` (lines 539-702)
- **Service Function**: `src/web/services/face_service.py` (lines 1736-1846)
  - `find_auto_match_matches`: Processes all people synchronously

## Recommendations

1. **Immediate**: Add sessionStorage caching (similar to Identify)
2. **High Priority**: Implement lazy loading of matches
3. **Medium Priority**: Optimize backend to use batch queries
4. **Low Priority**: Add image preloading

The biggest win would be adding sessionStorage caching, which would make subsequent page loads instant (like Identify).
233 FOLDER_PICKER_ANALYSIS.md Normal file
@ -0,0 +1,233 @@
# Folder Picker Analysis - Getting Full Paths

## Problem

Browsers don't expose full file system paths for security reasons. The current implementation only gets folder names, not full absolute paths.

## Current Limitations

### Browser-Based Solutions (Current)

1. **File System Access API** (`showDirectoryPicker`)
   - ✅ No confirmation dialog
   - ❌ Only returns folder name, not full path
   - ❌ Only works in Chrome 86+, Edge 86+, Opera 72+

2. **webkitdirectory input**
   - ✅ Works in all browsers
   - ❌ Shows security confirmation dialog
   - ❌ Only returns relative paths, not absolute paths

## Alternative Solutions

### ✅ **Option 1: Backend API with Tkinter (RECOMMENDED)**

**How it works:**
- Frontend calls backend API endpoint
- Backend uses `tkinter.filedialog.askdirectory()` to show native folder picker
- Backend returns full absolute path to frontend
- Frontend populates the path input

**Pros:**
- ✅ Returns full absolute path
- ✅ Native OS dialog (looks native on Windows/Linux/macOS)
- ✅ No browser security restrictions
- ✅ tkinter already used in project
- ✅ Cross-platform support
- ✅ No confirmation dialogs

**Cons:**
- ⚠️ Requires backend to be running on the same machine as the user
- ⚠️ Backend needs GUI access (tkinter requires a display)
- ⚠️ May need X11 forwarding for remote servers

**Implementation:**
```python
# Backend API endpoint
@router.post("/browse-folder")
def browse_folder() -> dict:
    """Open native folder picker and return selected path."""
    import tkinter as tk
    from tkinter import filedialog

    # Create root window (hidden)
    root = tk.Tk()
    root.withdraw()  # Hide main window
    root.attributes('-topmost', True)  # Bring to front

    # Show folder picker
    folder_path = filedialog.askdirectory(
        title="Select folder to scan",
        mustexist=True
    )

    root.destroy()

    if folder_path:
        return {"path": folder_path, "success": True}
    else:
        return {"path": "", "success": False, "message": "No folder selected"}
```

```typescript
// Frontend API call
const browseFolder = async (): Promise<string | null> => {
  const { data } = await apiClient.post<{path: string, success: boolean}>(
    '/api/v1/photos/browse-folder'
  )
  return data.success ? data.path : null
}
```

---

### **Option 2: Backend API with PyQt/PySide**

**How it works:**
- Similar to Option 1, but uses PyQt/PySide instead of tkinter
- More modern UI, but requires an additional dependency

**Pros:**
- ✅ Returns full absolute path
- ✅ More modern-looking dialogs
- ✅ Better customization options

**Cons:**
- ❌ Requires additional dependency (PyQt5/PyQt6/PySide2/PySide6)
- ❌ Larger package size
- ❌ Same GUI access requirements as tkinter

---

### **Option 3: Backend API with Platform-Specific Tools**

**How it works:**
- Use platform-specific command-line tools to open folder pickers
- Windows: PowerShell script
- Linux: `zenity`, `kdialog`, or `yad`
- macOS: AppleScript

**Pros:**
- ✅ Returns full absolute path
- ✅ No GUI framework required
- ✅ Works on headless servers with X11 forwarding

**Cons:**
- ❌ Platform-specific code required
- ❌ Requires external tools to be installed
- ❌ More complex implementation
- ❌ Less consistent UI across platforms

**Example (Linux with zenity):**
```python
import subprocess


def browse_folder_zenity():
    result = subprocess.run(
        ['zenity', '--file-selection', '--directory'],
        capture_output=True,
        text=True
    )
    return result.stdout.strip() if result.returncode == 0 else None
```
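A hedged sketch of how the per-platform commands could be selected. The zenity invocation matches the example above; the AppleScript and PowerShell invocations are illustrative assumptions, not taken from the project:

```python
import sys


def folder_picker_command(platform: str = sys.platform) -> list[str]:
    """Return the external command that opens a native folder picker."""
    if platform.startswith("linux"):
        return ["zenity", "--file-selection", "--directory"]
    if platform == "darwin":
        # AppleScript prints the POSIX path of the chosen folder.
        return ["osascript", "-e",
                "POSIX path of (choose folder as alias)"]
    if platform.startswith("win"):
        # Illustrative PowerShell snippet using the Shell.Application dialog.
        return ["powershell", "-Command",
                "(New-Object -ComObject Shell.Application)"
                ".BrowseForFolder(0,'Select folder',0).Self.Path"]
    raise RuntimeError(f"No folder picker known for {platform}")
```

The chosen command would then be passed to `subprocess.run(..., capture_output=True, text=True)` exactly as in the zenity example.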

---

### **Option 4: Electron App (Not Applicable)**

**How it works:**
- Convert web app to Electron app
- Use Electron's `dialog.showOpenDialog()` API

**Pros:**
- ✅ Returns full absolute path
- ✅ Native OS dialogs
- ✅ No browser restrictions

**Cons:**
- ❌ Requires complete app restructuring
- ❌ Not applicable (this is a web app, not Electron)
- ❌ Much larger application size

---

### **Option 5: Custom File Browser UI**

**How it works:**
- Build custom file browser in React
- Backend API provides directory listings
- User navigates through folders in UI
- Select folder when found

**Pros:**
- ✅ Full control over UI/UX
- ✅ Can show full paths
- ✅ No native dialogs needed

**Cons:**
- ❌ Complex implementation
- ❌ Requires multiple API calls
- ❌ Slower user experience
- ❌ Need to handle permissions, hidden files, etc.

---

## Recommendation

**✅ Use Option 1: Backend API with Tkinter**

This is the best solution because:
1. **tkinter is already used** in the project (face_processing.py)
2. **Simple implementation** - just one API endpoint
3. **Returns full paths** - solves the core problem
4. **Native dialogs** - familiar to users
5. **No additional dependencies** - tkinter is built into Python
6. **Cross-platform** - works on Windows, Linux, macOS

### Implementation Steps

1. **Create backend API endpoint** (`/api/v1/photos/browse-folder`)
   - Use `tkinter.filedialog.askdirectory()`
   - Return selected path as JSON

2. **Add frontend API method**
   - Call the new endpoint
   - Handle response and populate path input

3. **Update Browse button handler**
   - Call backend API instead of browser picker
   - Show loading state while waiting
   - Handle errors gracefully

4. **Fallback option**
   - Keep browser-based picker as fallback
   - Use it if the backend API fails or is unavailable

### Considerations

- **Headless servers**: If the backend runs on a headless server, you need X11 forwarding or Option 3 (platform-specific tools)
- **Remote access**: If users access from remote machines, the backend must be on the same machine as the user
- **Error handling**: Handle cases where the tkinter dialog can't be shown (no display, permissions, etc.)

---

## Quick Comparison Table

| Solution | Full Path | Native Dialog | Dependencies | Complexity | Recommended |
|----------|-----------|---------------|--------------|------------|-------------|
| **Backend + Tkinter** | ✅ | ✅ | None (built-in) | Low | ✅ **YES** |
| Backend + PyQt | ✅ | ✅ | PyQt/PySide | Medium | ⚠️ Maybe |
| Platform Tools | ✅ | ✅ | zenity/kdialog/etc | High | ⚠️ Maybe |
| Custom UI | ✅ | ❌ | None | Very High | ❌ No |
| Electron | ✅ | ✅ | Electron | Very High | ❌ No |
| Browser API | ❌ | ✅ | None | Low | ❌ No |

---

## Next Steps

1. Implement backend API endpoint with tkinter
2. Add frontend API method
3. Update Browse button to use backend API
4. Add error handling and fallback
5. Test on all platforms (Windows, Linux, macOS)
125 MONOREPO_MIGRATION.md Normal file
@ -0,0 +1,125 @@
# Monorepo Migration Summary

This document summarizes the migration from separate `punimtag` and `punimtag-viewer` projects to a unified monorepo structure.

## Migration Date

December 2024

## Changes Made

### Directory Structure

**Before:**
```
punimtag/
├── src/web/          # Backend API
└── frontend/         # Admin React frontend

punimtag-viewer/      # Separate repository
└── (Next.js viewer)
```

**After:**
```
punimtag/
├── backend/          # FastAPI backend (renamed from src/web)
├── admin-frontend/   # React admin interface (renamed from frontend)
└── viewer-frontend/  # Next.js viewer (moved from punimtag-viewer)
```

### Import Path Changes

All Python imports have been updated:
- `from src.web.*` → `from backend.*`
- `import src.web.*` → `import backend.*`
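A bulk rewrite like this is typically a one-liner; a hedged sketch of how it could be done from the repo root (GNU sed shown; on macOS/BSD use `sed -i ''` instead of `sed -i`):

```shell
# Rewrite src.web imports to backend across all Python files.
grep -rl 'src\.web' --include='*.py' . \
  | xargs sed -i \
      -e 's/from src\.web/from backend/g' \
      -e 's/import src\.web/import backend/g'
```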
### Configuration Updates

1. **install.sh**: Updated to install dependencies for both frontends
2. **package.json**: Created root package.json with workspace scripts
3. **run_api_with_worker.sh**: Updated to use `backend.app` instead of `src.web.app`
4. **run_worker.sh**: Updated to use `backend.worker` instead of `src.web.worker`
5. **docker-compose.yml**: Updated service commands to use `backend.*` paths
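The root package.json scripts referenced above might look roughly like this; the `install:all` and `dev:*` names appear later in this document, but the exact commands behind them are assumptions:

```json
{
  "name": "punimtag",
  "private": true,
  "scripts": {
    "install:all": "npm --prefix admin-frontend install && npm --prefix viewer-frontend install",
    "dev:backend": "uvicorn backend.app:app --host 127.0.0.1 --port 8000",
    "dev:admin": "npm --prefix admin-frontend run dev",
    "dev:viewer": "npm --prefix viewer-frontend run dev"
  }
}
```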
### Environment Files

- **admin-frontend/.env**: Backend API URL configuration
- **viewer-frontend/.env.local**: Database and NextAuth configuration

### Port Configuration

- **Admin Frontend**: Port 3000 (unchanged)
- **Viewer Frontend**: Port 3001 (configured in viewer-frontend/package.json)
- **Backend API**: Port 8000 (unchanged)

## Running the Application

### Development

**Terminal 1 - Backend:**
```bash
source venv/bin/activate
export PYTHONPATH="$(pwd)"
uvicorn backend.app:app --host 127.0.0.1 --port 8000
```

**Terminal 2 - Admin Frontend:**
```bash
cd admin-frontend
npm run dev
```

**Terminal 3 - Viewer Frontend:**
```bash
cd viewer-frontend
npm run dev
```

### Using Root Scripts

```bash
# Install all dependencies
npm run install:all

# Run individual services
npm run dev:backend
npm run dev:admin
npm run dev:viewer
```

## Benefits

1. **Unified Setup**: Single installation script for all components
2. **Easier Maintenance**: All code in one repository
3. **Shared Configuration**: Common environment variables and settings
4. **Simplified Deployment**: Single repository to deploy
5. **Better Organization**: Clear separation of admin and viewer interfaces

## Migration Checklist

- [x] Rename `src/web` to `backend`
- [x] Rename `frontend` to `admin-frontend`
- [x] Copy `punimtag-viewer` to `viewer-frontend`
- [x] Update all Python imports
- [x] Update all scripts
- [x] Update install.sh
- [x] Create root package.json
- [x] Update docker-compose.yml
- [x] Update README.md
- [x] Update scripts in scripts/ directory

## Notes

- The viewer frontend manages the `punimtag_auth` database
- Both frontends share the main `punimtag` database
- The backend API serves both frontends
- All database schemas remain unchanged

## Next Steps

1. Test that all three services start correctly
2. Verify database connections work
3. Test authentication flows
4. Update CI/CD pipelines if applicable
5. Archive or remove the old `punimtag-viewer` repository
298 README.md
@ -4,6 +8
A fast, simple, and modern web application for organizing and tagging photos using state-of-the-art DeepFace AI with ArcFace recognition model.

**Monorepo Structure:** This project contains both the admin interface (React) and viewer interface (Next.js) in a unified repository for easier maintenance and setup.

---

## 🎯 Features

@ -57,10 +59,16 @@ The script will:
- ✅ Set up PostgreSQL databases (main + auth)
- ✅ Create Python virtual environment
- ✅ Install all Python dependencies
- ✅ Install all frontend dependencies (admin-frontend and viewer-frontend)
- ✅ Create `.env` configuration files
- ✅ Create necessary data directories

**Note:** After installation, you'll need to generate Prisma clients for the viewer-frontend:
```bash
cd viewer-frontend
npx prisma generate
```

**Note:** On macOS or other systems, the script will skip system dependency installation. You'll need to install PostgreSQL and Redis manually.

#### Option 2: Manual Installation
@ -78,17 +86,24 @@ source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt

# Install frontend dependencies
cd admin-frontend
npm install
cd ../viewer-frontend
npm install
# Generate Prisma clients for viewer-frontend (after setting up .env)
npx prisma generate
cd ..
```

### Database Setup

**Database Configuration:**
The application uses **two separate databases**:
1. **Main database** (`punimtag`) - Stores photos, faces, people, tags, and backend user accounts
   - **Default: SQLite** at `data/punimtag.db` (for development)
   - **Optional: PostgreSQL** (for production)
2. **Auth database** (`punimtag_auth`) - Stores frontend website user accounts and moderation data
   - **Required: PostgreSQL** (always uses PostgreSQL)

Both database connections are configured via the `.env` file.
@ -140,12 +155,12 @@ Alternatively, use the automated script (requires sudo password):
```

**Configuration:**
The `.env` file in the project root contains database connection strings:
```bash
# Main application database (SQLite - default for development)
DATABASE_URL=sqlite:///data/punimtag.db

# Auth database (PostgreSQL - always required for frontend website users)
DATABASE_URL_AUTH=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag_auth
```

@ -153,23 +168,26 @@ DATABASE_URL_AUTH=postgresql+psycopg2://punimtag:punimtag_password@localhost:543
The database and all tables are automatically created on first startup. No manual migration is needed!

The web application will:
- Connect to the database using the `.env` configuration
- Create all required tables with the correct schema on startup
- Match the desktop version schema exactly for compatibility

**Note:** The main database uses SQLite by default for easier development. For production, you can switch to PostgreSQL by updating `DATABASE_URL` in `.env`.

**SQLite (Default - Local Database):**
The main database uses SQLite by default for development. The `.env` file should contain:
```bash
# Main database (SQLite - default for development)
DATABASE_URL=sqlite:///data/punimtag.db

# Or use absolute path:
# DATABASE_URL=file:/home/ladmin/code/punimtag/data/punimtag.db
```

**PostgreSQL (Optional - for Production):**
To use PostgreSQL for the main database instead, set:
```bash
DATABASE_URL=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag
```

**Database Schema:**
@ -221,27 +239,46 @@ The separate auth database (`punimtag_auth`) stores frontend website user accoun
redis-server
```

#### Option 1: Using Helper Scripts (Recommended)

**Terminal 1 - Backend API + Worker:**
```bash
cd punimtag
./run_api_with_worker.sh
```

This script will:
- Check if Redis is running (start it if needed)
- Ensure database schema is up to date
- Start the RQ worker in the background
- Start the FastAPI server
- Handle cleanup on Ctrl+C

You should see:
```
✅ Database schema ready
🚀 Starting RQ worker...
🚀 Starting FastAPI server...
✅ Server running on http://127.0.0.1:8000
✅ Worker running (PID: ...)
✅ API running (PID: ...)
```

**Alternative: Start backend only (without worker):**
```bash
cd punimtag
./start_backend.sh
```

**Stop the backend:**
```bash
cd punimtag
./stop_backend.sh
```

**Terminal 2 - Admin Frontend:**
```bash
cd punimtag/admin-frontend
npm run dev
```
@ -251,44 +288,70 @@ VITE v5.4.21 ready in 811 ms
➜  Local:   http://localhost:3000/
```

**Terminal 3 - Viewer Frontend (Optional):**
```bash
cd punimtag/viewer-frontend
# Generate Prisma clients (only needed once or after schema changes)
npx prisma generate
npm run dev
```

You should see:
```
▲ Next.js 16.1.1 (Turbopack)
- Local:  http://localhost:3001/
```

#### Option 2: Manual Start

**Terminal 1 - Backend API:**
```bash
cd punimtag
source venv/bin/activate
export PYTHONPATH="$(pwd)"
python3 -m uvicorn backend.app:app --host 127.0.0.1 --port 8000 --reload
```

**Note:** If you encounter warnings about "Electron/Chromium" when running `uvicorn`, use `python3 -m uvicorn` instead, or use the helper scripts above.

**Terminal 2 - Admin Frontend:**
```bash
cd punimtag/admin-frontend
npm run dev
```

**Terminal 3 - Viewer Frontend (Optional):**
```bash
cd punimtag/viewer-frontend
npx prisma generate  # Only needed once or after schema changes
npm run dev
```

#### Access the Applications

1. **Admin Interface**: Open your browser to **http://localhost:3000**
   - Login with default credentials:
     - Username: `admin`
     - Password: `admin`
2. **Viewer Interface** (Optional): Open your browser to **http://localhost:3001**
   - Public photo viewing interface
   - Separate authentication system
3. **API Documentation**: Available at **http://127.0.0.1:8000/docs**

#### Troubleshooting

**Port 8000 already in use:**
```bash
# Use the stop script
cd punimtag
./stop_backend.sh

# Or manually find and kill the process
lsof -i :8000
kill <PID>

# Or use pkill
pkill -f "uvicorn.*backend.app"
```
|
||||
|
||||
**Port 3000 already in use:**
|
||||
@ -297,7 +360,7 @@ pkill -f "uvicorn.*app"
|
||||
lsof -i :3000
|
||||
kill <PID>
|
||||
|
||||
# Or change the port in frontend/vite.config.ts
|
||||
# Or change the port in admin-frontend/vite.config.ts
|
||||
```
|
||||
|
||||
**Redis not running:**
|
||||
@ -306,17 +369,37 @@ kill <PID>
|
||||
sudo systemctl start redis-server
|
||||
# Or
|
||||
redis-server
|
||||
|
||||
# Verify Redis is running
|
||||
redis-cli ping # Should respond with "PONG"
|
||||
```
|
||||
|
||||
**Worker module not found error:**
|
||||
If you see `ModuleNotFoundError: No module named 'backend'`:
|
||||
- Make sure you're using the helper scripts (`./run_api_with_worker.sh` or `./start_backend.sh`)
|
||||
- These scripts set PYTHONPATH correctly
|
||||
- If running manually, ensure `export PYTHONPATH="$(pwd)"` is set
|
||||
|
||||
**Python/Cursor interception warnings:**
|
||||
If you see warnings about "Electron/Chromium" when running `uvicorn`:
|
||||
- Use `python3 -m uvicorn` instead of just `uvicorn`
|
||||
- Or use the helper scripts which handle this automatically
|
||||
|
||||
**Database issues:**
```bash
# Recreate all tables (WARNING: This will delete all data!)
cd /home/ladmin/Code/punimtag
source venv/bin/activate
export PYTHONPATH=/home/ladmin/Code/punimtag
python scripts/recreate_tables_web.py

# The database is automatically created on first startup
# If you need to reset it, delete the database file:
rm data/punimtag.db

# The schema will be recreated on next startup
```

**Viewer frontend shows 0 photos:**
- Make sure the database has photos (import them via admin frontend)
- Verify `DATABASE_URL` in `viewer-frontend/.env` points to the correct database
- Ensure Prisma client is generated: `cd viewer-frontend && npx prisma generate`
- Check that photos are marked as `processed: true` in the database

#### Important Notes

- The database and tables are **automatically created on first startup** - no manual setup needed!
@@ -340,14 +423,16 @@ python scripts/recreate_tables_web.py

```
punimtag/
├── src/                    # Source code
│   ├── web/                # Web backend
│   │   ├── api/            # API routers
│   │   ├── db/             # Database models and session
│   │   ├── schemas/        # Pydantic models
│   │   └── services/       # Business logic services
│   └── core/               # Legacy desktop business logic
├── frontend/               # React frontend
├── backend/                # FastAPI backend
│   ├── api/                # API routers
│   ├── db/                 # Database models and session
│   ├── schemas/            # Pydantic models
│   ├── services/           # Business logic services
│   ├── constants/          # Constants and configuration
│   ├── utils/              # Utility functions
│   ├── app.py              # FastAPI application
│   └── worker.py           # RQ worker for background jobs
├── admin-frontend/         # React admin interface
│   ├── src/
│   │   ├── api/            # API client
│   │   ├── components/     # React components
@@ -355,11 +440,20 @@ punimtag/
│   │   ├── hooks/          # Custom hooks
│   │   └── pages/          # Page components
│   └── package.json
├── viewer-frontend/        # Next.js viewer interface
│   ├── app/                # Next.js app router
│   ├── components/         # React components
│   ├── lib/                # Utilities and database
│   ├── prisma/             # Prisma schemas
│   └── package.json
├── src/                    # Legacy desktop code
│   └── core/               # Legacy desktop business logic
├── tests/                  # Test suite
├── docs/                   # Documentation
├── data/                   # Application data (database, images)
├── alembic/                # Database migrations
└── deploy/                 # Docker deployment configs
├── scripts/                # Utility scripts
├── deploy/                 # Docker deployment configs
└── package.json            # Root package.json for monorepo
```

---
@@ -392,7 +486,9 @@ punimtag/
- ✅ All tables created automatically on startup: `photos`, `faces`, `people`, `person_encodings`, `tags`, `phototaglinkage`
- ✅ Schema matches desktop version exactly for full compatibility
- ✅ Indices configured for performance
- ✅ SQLite database at `data/punimtag.db` (auto-created if missing)
- ✅ SQLite database at `data/punimtag.db` (auto-created if missing, default for development)
- ✅ PostgreSQL support for production deployments
- ✅ Separate auth database (PostgreSQL) for frontend user accounts

### Image Ingestion & Processing

@@ -482,23 +578,23 @@ punimtag/
### Database

**PostgreSQL (Default - Network Database):**
The application uses PostgreSQL by default, configured via the `.env` file:
**SQLite (Default - Local Database):**
The main database uses SQLite by default for development, configured via the `.env` file:
```bash
# Main application database
DATABASE_URL=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag
# Main application database (SQLite - default)
DATABASE_URL=sqlite:///data/punimtag.db

# Auth database (for frontend website users)
# Auth database (PostgreSQL - always required for frontend website users)
DATABASE_URL_AUTH=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag_auth
```

**SQLite (Alternative - Local Database):**
To use SQLite instead, comment out or remove the `DATABASE_URL` line in `.env`, or set:
**PostgreSQL (Optional - for Production):**
To use PostgreSQL for the main database instead:
```bash
DATABASE_URL=sqlite:///data/punimtag.db
DATABASE_URL=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag
```

**Note:** When using SQLite, the auth database (`DATABASE_URL_AUTH`) should still be configured as PostgreSQL if you need frontend website user authentication features. The auth database is optional but required for full multi-user functionality.
**Note:** The auth database (`DATABASE_URL_AUTH`) always uses PostgreSQL and is required for frontend website user authentication features.

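A quick way to sanity-check which engine a given URL selects - e.g. to confirm that `DATABASE_URL_AUTH` really points at PostgreSQL - is to look at the URL scheme before any `+driver` suffix (a minimal sketch; the URLs below are the README examples, not live credentials):

```shell
# The scheme before any "+driver" suffix is the database dialect.
main_url="sqlite:///data/punimtag.db"
auth_url="postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag_auth"

echo "${main_url%%:*}"          # strips everything from the first ":" -> sqlite
auth_scheme="${auth_url%%:*}"   # -> postgresql+psycopg2
echo "${auth_scheme%%+*}"       # strips the driver suffix -> postgresql
```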
### Environment Variables

@@ -506,10 +602,10 @@ Configuration is managed via the `.env` file in the project root. A `.env.exampl
**Required Configuration:**
```bash
# Database (PostgreSQL by default)
DATABASE_URL=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag
# Database (SQLite by default for development)
DATABASE_URL=sqlite:///data/punimtag.db

# Auth Database (for frontend website user accounts - separate from main database)
# Auth Database (PostgreSQL - always required for frontend website user accounts)
DATABASE_URL_AUTH=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag_auth

# JWT Secrets (change in production!)
@@ -523,13 +619,45 @@ ADMIN_PASSWORD=admin
PHOTO_STORAGE_DIR=data/uploads
```

**Frontend Configuration:**
Create a `.env` file in the `frontend/` directory:
**Admin Frontend Configuration:**
Create a `.env` file in the `admin-frontend/` directory:
```bash
# Backend API URL (must be accessible from browsers)
VITE_API_URL=http://127.0.0.1:8000
```

**Viewer Frontend Configuration:**
Create a `.env` file in the `viewer-frontend/` directory:
```bash
# Main database connection (SQLite - matches backend default)
# Use absolute path for SQLite
DATABASE_URL=file:/home/ladmin/code/punimtag/data/punimtag.db

# Auth database connection (PostgreSQL - always required)
DATABASE_URL_AUTH=postgresql://punimtag:punimtag_password@localhost:5432/punimtag_auth

# Write-capable database connection (optional, falls back to DATABASE_URL if not set)
DATABASE_URL_WRITE=file:/home/ladmin/code/punimtag/data/punimtag.db

# NextAuth configuration
NEXTAUTH_URL=http://localhost:3001
NEXTAUTH_SECRET=dev-secret-key-change-in-production
```

**Generate Prisma Clients:**
After setting up the `.env` file, generate the Prisma clients:
```bash
cd viewer-frontend
npx prisma generate
```

**Important:** The viewer frontend uses **SQLite** for the main database (matching the backend default). The Prisma schema is configured for SQLite. If you change the backend to PostgreSQL, you'll need to:
1. Update `viewer-frontend/prisma/schema.prisma` to use `provider = "postgresql"`
2. Update `DATABASE_URL` in `viewer-frontend/.env` to the PostgreSQL connection string
3. Run `npx prisma generate` again
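Step 1 of that switch touches only the `datasource` block in `viewer-frontend/prisma/schema.prisma`; a sketch of the result (hypothetical, since the actual schema contents are not shown in this diff):

```prisma
datasource db {
  provider = "postgresql"        // was "sqlite"
  url      = env("DATABASE_URL") // now a postgresql:// connection string
}
```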

**Note:** The viewer frontend uses the same database as the backend by default. For production deployments, you may want to create separate read-only and write users for better security.
**Note:** The `.env` file is automatically loaded by the application using `python-dotenv`. Environment variables can also be set directly in your shell if preferred.

---
@@ -49,3 +49,7 @@ module.exports = {

File diff suppressed because it is too large
@@ -30,6 +30,6 @@
    "postcss": "^8.4.31",
    "tailwindcss": "^3.3.5",
    "typescript": "^5.2.2",
    "vite": "^5.0.0"
    "vite": "^5.4.0"
  }
}
@@ -1,5 +1,8 @@
import axios from 'axios'

// Get API base URL from environment variable or use default
// The .env file should contain: VITE_API_URL=http://127.0.0.1:8000
// Alternatively, Vite proxy can be used (configured in vite.config.ts) by setting VITE_API_URL to empty string
const API_BASE_URL = import.meta.env.VITE_API_URL || 'http://127.0.0.1:8000'

export const apiClient = axios.create({
@@ -18,10 +21,27 @@ apiClient.interceptors.request.use((config) => {
  return config
})

// Handle 401 errors
// Handle 401 errors and network errors
apiClient.interceptors.response.use(
  (response) => response,
  (error) => {
    // Handle network errors (no response from server)
    if (!error.response && (error.message === 'Network Error' || error.code === 'ERR_NETWORK')) {
      // Check if user is logged in
      const token = localStorage.getItem('access_token')
      if (!token) {
        // Not logged in - redirect to login
        const isLoginPage = window.location.pathname === '/login'
        if (!isLoginPage) {
          window.location.href = '/login'
          return Promise.reject(error)
        }
      }
      // If logged in but network error, it's a connection issue
      console.error('Network Error:', error)
    }

    // Handle 401 Unauthorized
    if (error.response?.status === 401) {
      // Don't redirect if we're already on the login page (prevents clearing error messages)
      const isLoginPage = window.location.pathname === '/login'
@@ -26,6 +26,7 @@ export const jobsApi = {
  },

  streamJobProgress: (jobId: string): EventSource => {
    // EventSource needs absolute URL - use VITE_API_URL or fallback to direct backend URL
    const baseURL = import.meta.env.VITE_API_URL || 'http://127.0.0.1:8000'
    return new EventSource(`${baseURL}/api/v1/jobs/stream/${jobId}`)
  },
@@ -74,6 +74,7 @@ export const photosApi = {
  },

  streamJobProgress: (jobId: string): EventSource => {
    // EventSource needs absolute URL - use VITE_API_URL or fallback to direct backend URL
    const baseURL = import.meta.env.VITE_API_URL || 'http://127.0.0.1:8000'
    return new EventSource(`${baseURL}/api/v1/jobs/stream/${jobId}`)
  },
@@ -36,3 +36,7 @@ export const rolePermissionsApi = {

@@ -122,3 +122,7 @@ export default videosApi

@@ -30,7 +30,17 @@ export default function ApproveIdentified() {
      const response = await pendingIdentificationsApi.list(includeDenied)
      setPendingIdentifications(response.items)
    } catch (err: any) {
      setError(err.response?.data?.detail || err.message || 'Failed to load pending identifications')
      let errorMessage = 'Failed to load pending identifications'
      if (err.response?.data?.detail) {
        errorMessage = err.response.data.detail
      } else if (err.message) {
        errorMessage = err.message
        // Provide more context for network errors
        if (err.message === 'Network Error' || err.code === 'ERR_NETWORK') {
          errorMessage = `Network Error: Cannot connect to backend API (${apiClient.defaults.baseURL}). Please check:\n1. Backend is running\n2. You are logged in\n3. CORS is configured correctly`
        }
      }
      setError(errorMessage)
      console.error('Error loading pending identifications:', err)
    } finally {
      setLoading(false)
@@ -306,9 +306,10 @@ export default function Search() {

      // Auto-run search for no_faces and no_tags
      // No alert shown when no photos found
    } catch (error) {
    } catch (error: any) {
      console.error('Error searching photos:', error)
      alert('Error searching photos. Please try again.')
      const errorMessage = error.response?.data?.detail || error.message || 'Unknown error'
      alert(`Error searching photos: ${errorMessage}\n\nPlease check:\n1. Backend API is running (http://127.0.0.1:8000)\n2. You are logged in\n3. Database connection is working`)
    } finally {
      setLoading(false)
    }
0
backend/__init__.py
Normal file
@@ -2,6 +2,7 @@

from __future__ import annotations

import os
from datetime import datetime, timedelta
from typing import Annotated

@@ -10,15 +11,15 @@ from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from jose import JWTError, jwt
from sqlalchemy.orm import Session

from src.web.constants.roles import (
from backend.constants.roles import (
    DEFAULT_ADMIN_ROLE,
    DEFAULT_USER_ROLE,
    ROLE_VALUES,
)
from src.web.db.session import get_db
from src.web.db.models import User
from src.web.utils.password import verify_password, hash_password
from src.web.schemas.auth import (
from backend.db.session import get_db
from backend.db.models import User
from backend.utils.password import verify_password, hash_password
from backend.schemas.auth import (
    LoginRequest,
    RefreshRequest,
    TokenResponse,
@@ -26,7 +27,7 @@ from src.web.schemas.auth import (
    PasswordChangeRequest,
    PasswordChangeResponse,
)
from src.web.services.role_permissions import fetch_role_permissions_map
from backend.services.role_permissions import fetch_role_permissions_map

router = APIRouter(prefix="/auth", tags=["auth"])
security = HTTPBearer()
@@ -37,9 +38,9 @@ ALGORITHM = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES = 360
REFRESH_TOKEN_EXPIRE_DAYS = 7

# Single user mode placeholder
SINGLE_USER_USERNAME = "admin"
SINGLE_USER_PASSWORD = "admin"  # Change in production
# Single user mode placeholder - read from environment or use defaults
SINGLE_USER_USERNAME = os.getenv("ADMIN_USERNAME", "admin")
SINGLE_USER_PASSWORD = os.getenv("ADMIN_PASSWORD", "admin")  # Change in production


def create_access_token(data: dict, expires_delta: timedelta) -> str:
@@ -96,7 +97,7 @@ def get_current_user_with_id(

    # If user doesn't exist, create them (for bootstrap scenarios)
    if not user:
        from src.web.utils.password import hash_password
        from backend.utils.password import hash_password

        # Generate unique email to avoid conflicts
        base_email = f"{username}@example.com"
@@ -255,7 +256,7 @@ def get_current_user_info(

    # If no admins exist, bootstrap current user as admin
    if admin_count == 0:
        from src.web.utils.password import hash_password
        from backend.utils.password import hash_password

        # Generate unique email to avoid conflicts
        base_email = f"{username}@example.com"
@@ -10,16 +10,16 @@ from fastapi.responses import JSONResponse
from sqlalchemy import text
from sqlalchemy.orm import Session

from src.web.api.auth import get_current_user
from src.web.api.users import get_current_admin_user
from src.web.db.session import get_auth_db, get_db
from src.web.schemas.auth_users import (
from backend.api.auth import get_current_user
from backend.api.users import get_current_admin_user
from backend.db.session import get_auth_db, get_db
from backend.schemas.auth_users import (
    AuthUserCreateRequest,
    AuthUserResponse,
    AuthUserUpdateRequest,
    AuthUsersListResponse,
)
from src.web.utils.password import hash_password
from backend.utils.password import hash_password

router = APIRouter(prefix="/auth-users", tags=["auth-users"])
logger = logging.getLogger(__name__)
@@ -10,9 +10,9 @@ from sqlalchemy import func
from sqlalchemy.orm import Session
from typing import Annotated

from src.web.db.session import get_db
from src.web.api.auth import get_current_user_with_id
from src.web.schemas.faces import (
from backend.db.session import get_db
from backend.api.auth import get_current_user_with_id
from backend.schemas.faces import (
    ProcessFacesRequest,
    ProcessFacesResponse,
    UnidentifiedFacesQuery,
@@ -41,9 +41,9 @@ from src.web.schemas.faces import (
    DeleteFacesRequest,
    DeleteFacesResponse,
)
from src.web.schemas.people import PersonCreateRequest, PersonResponse
from src.web.db.models import Face, Person, PersonEncoding, Photo
from src.web.services.face_service import (
from backend.schemas.people import PersonCreateRequest, PersonResponse
from backend.db.models import Face, Person, PersonEncoding, Photo
from backend.services.face_service import (
    list_unidentified_faces,
    find_similar_faces,
    calculate_batch_similarities,
@@ -81,7 +81,7 @@ def process_faces(request: ProcessFacesRequest) -> ProcessFacesResponse:
    # Enqueue face processing job
    # Pass function as string path to avoid serialization issues
    job = queue.enqueue(
        "src.web.services.tasks.process_faces_task",
        "backend.services.tasks.process_faces_task",
        batch_size=request.batch_size,
        detector_backend=request.detector_backend,
        model_name=request.model_name,
@@ -350,7 +350,7 @@ def get_face_crop(face_id: int, db: Session = Depends(get_db)) -> Response:
    import ast
    import tempfile
    from PIL import Image
    from src.web.db.models import Face, Photo
    from backend.db.models import Face, Photo
    from src.utils.exif_utils import EXIFOrientationHandler

    face = db.query(Face).filter(Face.id == face_id).first()
@@ -593,7 +593,7 @@ def auto_match_faces(
    - Only auto-accepts matches with similarity >= threshold
    - Only auto-accepts faces with quality > 50% (quality_score > 0.5)
    """
    from src.web.db.models import Person, Photo
    from backend.db.models import Person, Photo
    from sqlalchemy import func

    # Track statistics for auto-accept
@@ -755,7 +755,7 @@ def get_auto_match_people(
    Note: Only returns people if there are unidentified faces in the database
    (since people can't have matches if there are no unidentified faces).
    """
    from src.web.db.models import Person, Photo
    from backend.db.models import Person, Photo

    # Get people list (fast - no match calculations, but checks for unidentified faces)
    people_data = get_auto_match_people_list(
@@ -814,7 +814,7 @@ def get_auto_match_person_matches(

    This endpoint is called on-demand when user navigates to a person.
    """
    from src.web.db.models import Photo
    from backend.db.models import Photo

    # Get matches for this person
    similar_faces = get_person_matches_service(
@@ -12,7 +12,7 @@ from redis import Redis
import json
import time

from src.web.schemas.jobs import JobResponse, JobStatus
from backend.schemas.jobs import JobResponse, JobStatus

router = APIRouter(prefix="/jobs", tags=["jobs"])

@@ -10,11 +10,11 @@ from pydantic import BaseModel, ConfigDict
from sqlalchemy import text, func
from sqlalchemy.orm import Session

from src.web.constants.roles import DEFAULT_USER_ROLE
from src.web.db.session import get_auth_db, get_db
from src.web.db.models import Face, Person, PersonEncoding, User
from src.web.api.users import get_current_admin_user, require_feature_permission
from src.web.utils.password import hash_password
from backend.constants.roles import DEFAULT_USER_ROLE
from backend.db.session import get_auth_db, get_db
from backend.db.models import Face, Person, PersonEncoding, User
from backend.api.users import get_current_admin_user, require_feature_permission
from backend.utils.password import hash_password

router = APIRouter(prefix="/pending-identifications", tags=["pending-identifications"])

@@ -10,9 +10,9 @@ from pydantic import BaseModel, ConfigDict, Field
from sqlalchemy import text
from sqlalchemy.orm import Session

from src.web.api.users import require_feature_permission
from src.web.db.models import Photo, PhotoTagLinkage, Tag
from src.web.db.session import get_auth_db, get_db
from backend.api.users import require_feature_permission
from backend.db.models import Photo, PhotoTagLinkage, Tag
from backend.db.session import get_auth_db, get_db

router = APIRouter(prefix="/pending-linkages", tags=["pending-linkages"])

@@ -13,11 +13,11 @@ from pydantic import BaseModel, ConfigDict
from sqlalchemy import text
from sqlalchemy.orm import Session

from src.web.db.session import get_auth_db, get_db
from src.web.api.users import get_current_admin_user, require_feature_permission
from src.web.api.auth import get_current_user
from src.web.services.photo_service import import_photo_from_path, calculate_file_hash
from src.web.settings import PHOTO_STORAGE_DIR
from backend.db.session import get_auth_db, get_db
from backend.api.users import get_current_admin_user, require_feature_permission
from backend.api.auth import get_current_user
from backend.services.photo_service import import_photo_from_path, calculate_file_hash
from backend.settings import PHOTO_STORAGE_DIR

router = APIRouter(prefix="/pending-photos", tags=["pending-photos"])

@@ -8,10 +8,10 @@ from fastapi import APIRouter, Depends, HTTPException, Query, Response, status
from sqlalchemy import func
from sqlalchemy.orm import Session

from src.web.db.session import get_db
from src.web.db.models import Person, Face, PersonEncoding, PhotoPersonLinkage, Photo
from src.web.api.auth import get_current_user_with_id
from src.web.schemas.people import (
from backend.db.session import get_db
from backend.db.models import Person, Face, PersonEncoding, PhotoPersonLinkage, Photo
from backend.api.auth import get_current_user_with_id
from backend.schemas.people import (
    PeopleListResponse,
    PersonCreateRequest,
    PersonResponse,
@@ -19,8 +19,8 @@ from src.web.schemas.people import (
    PersonWithFacesResponse,
    PeopleWithFacesListResponse,
)
from src.web.schemas.faces import PersonFacesResponse, PersonFaceItem, AcceptMatchesRequest, IdentifyFaceResponse
from src.web.services.face_service import accept_auto_match_matches
from backend.schemas.faces import PersonFacesResponse, PersonFaceItem, AcceptMatchesRequest, IdentifyFaceResponse
from backend.services.face_service import accept_auto_match_matches

router = APIRouter(prefix="/people", tags=["people"])

@@ -179,7 +179,7 @@ def get_person_faces(person_id: int, db: Session = Depends(get_db)) -> PersonFac
    if not person:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"Person {person_id} not found")

    from src.web.db.models import Photo
    from backend.db.models import Photo

    faces = (
        db.query(Face)
@@ -260,7 +260,7 @@ def accept_matches(
    3. Updates person encodings (removes old, adds current)
    Tracks which user identified the faces.
    """
    from src.web.api.auth import get_current_user_with_id
    from backend.api.auth import get_current_user_with_id

    user_id = current_user["user_id"]
    identified_count, updated_count = accept_auto_match_matches(
@@ -12,14 +12,14 @@ from rq import Queue
from redis import Redis
from sqlalchemy.orm import Session

from src.web.db.session import get_db
from src.web.api.auth import get_current_user
from src.web.api.users import get_current_admin_user
from backend.db.session import get_db
from backend.api.auth import get_current_user
from backend.api.users import get_current_admin_user

# Redis connection for RQ
redis_conn = Redis(host="localhost", port=6379, db=0, decode_responses=False)
queue = Queue(connection=redis_conn)
from src.web.schemas.photos import (
from backend.schemas.photos import (
    PhotoImportRequest,
    PhotoImportResponse,
    PhotoResponse,
@@ -30,15 +30,15 @@ from src.web.schemas.photos import (
    BulkRemoveFavoritesRequest,
    BulkRemoveFavoritesResponse,
)
from src.web.schemas.search import (
from backend.schemas.search import (
    PhotoSearchResult,
    SearchPhotosResponse,
)
from src.web.services.photo_service import (
from backend.services.photo_service import (
    find_photos_in_folder,
    import_photo_from_path,
)
from src.web.services.search_service import (
from backend.services.search_service import (
    get_favorite_photos,
    get_photo_face_count,
    get_photo_person,
@@ -83,7 +83,7 @@ def search_photos(
    - Search unprocessed: returns photos that have not been processed for face detection
    - Search favorites: returns photos favorited by current user
    """
    from src.web.db.models import PhotoFavorite
    from backend.db.models import PhotoFavorite

    items: List[PhotoSearchResult] = []
    total = 0
@@ -355,7 +355,7 @@ def import_photos(
    # Enqueue job
    # Pass function as string path to avoid serialization issues
    job = queue.enqueue(
        "src.web.services.tasks.import_photos_task",
        "backend.services.tasks.import_photos_task",
        request.folder_path,
        request.recursive,
        job_timeout="1h",  # Allow up to 1 hour for large imports
@@ -384,7 +384,7 @@ async def upload_photos(
    import shutil
    from pathlib import Path

    from src.web.settings import PHOTO_STORAGE_DIR
    from backend.settings import PHOTO_STORAGE_DIR

    # Ensure storage directory exists
    storage_dir = Path(PHOTO_STORAGE_DIR)
@@ -500,7 +500,7 @@ def browse_folder() -> dict:
@router.get("/{photo_id}", response_model=PhotoResponse)
def get_photo(photo_id: int, db: Session = Depends(get_db)) -> PhotoResponse:
    """Get photo by ID."""
    from src.web.db.models import Photo
    from backend.db.models import Photo

    photo = db.query(Photo).filter(Photo.id == photo_id).first()
    if not photo:
@@ -517,7 +517,7 @@ def get_photo_image(photo_id: int, db: Session = Depends(get_db)) -> FileRespons
    """Serve photo image file for display (not download)."""
    import os
    import mimetypes
    from src.web.db.models import Photo
    from backend.db.models import Photo

    photo = db.query(Photo).filter(Photo.id == photo_id).first()
    if not photo:
@@ -555,7 +555,7 @@ def toggle_favorite(
    db: Session = Depends(get_db),
) -> dict:
    """Toggle favorite status of a photo for current user."""
    from src.web.db.models import Photo, PhotoFavorite
    from backend.db.models import Photo, PhotoFavorite

    username = current_user["username"]

@@ -602,7 +602,7 @@ def check_favorite(
    db: Session = Depends(get_db),
) -> dict:
    """Check if photo is favorited by current user."""
    from src.web.db.models import PhotoFavorite
    from backend.db.models import PhotoFavorite

    username = current_user["username"]

@@ -628,7 +628,7 @@ def bulk_add_favorites(
    Only adds favorites for photos that aren't already favorites.
    Uses a single database transaction for better performance.
    """
    from src.web.db.models import Photo, PhotoFavorite
    from backend.db.models import Photo, PhotoFavorite

    photo_ids = request.photo_ids
    if not photo_ids:
@@ -690,7 +690,7 @@ def bulk_remove_favorites(
    Only removes favorites for photos that are currently favorites.
    Uses a single database transaction for better performance.
    """
    from src.web.db.models import Photo, PhotoFavorite
    from backend.db.models import Photo, PhotoFavorite

    photo_ids = request.photo_ids
    if not photo_ids:
@@ -745,7 +745,7 @@ def bulk_delete_photos(
    db: Session = Depends(get_db),
) -> BulkDeletePhotosResponse:
    """Delete multiple photos and all related data (faces, encodings, tags, favorites)."""
    from src.web.db.models import Photo, PhotoTagLinkage
    from backend.db.models import Photo, PhotoTagLinkage

    photo_ids = list(dict.fromkeys(request.photo_ids))
    if not photo_ids:
@@ -804,7 +804,7 @@ def open_photo_folder(photo_id: int, db: Session = Depends(get_db)) -> dict:
    import os
    import sys
    import subprocess
    from src.web.db.models import Photo
    from backend.db.models import Photo

    photo = db.query(Photo).filter(Photo.id == photo_id).first()
    if not photo:
@@ -10,9 +10,9 @@ from pydantic import BaseModel, ConfigDict
from sqlalchemy import text
from sqlalchemy.orm import Session

from src.web.db.session import get_auth_db, get_db
from src.web.db.models import Photo, PhotoTagLinkage
from src.web.api.users import get_current_admin_user, require_feature_permission
from backend.db.session import get_auth_db, get_db
from backend.db.models import Photo, PhotoTagLinkage
from backend.api.users import get_current_admin_user, require_feature_permission

router = APIRouter(prefix="/reported-photos", tags=["reported-photos"])

@@ -7,16 +7,16 @@ from typing import Annotated
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session

from src.web.api.users import get_current_admin_user
from src.web.constants.role_features import ROLE_FEATURES, ROLE_FEATURE_KEYS
from src.web.constants.roles import ROLE_VALUES
from src.web.db.session import get_db
from src.web.schemas.role_permissions import (
from backend.api.users import get_current_admin_user
from backend.constants.role_features import ROLE_FEATURES, ROLE_FEATURE_KEYS
from backend.constants.roles import ROLE_VALUES
from backend.db.session import get_db
from backend.schemas.role_permissions import (
    RoleFeatureSchema,
    RolePermissionsResponse,
    RolePermissionsUpdateRequest,
)
from src.web.services.role_permissions import (
from backend.services.role_permissions import (
    ensure_role_permissions_initialized,
    fetch_role_permissions_map,
    set_role_permissions,
@@ -68,3 +68,7 @@ def update_role_permissions(

@@ -7,8 +7,8 @@ from typing import List
 from fastapi import APIRouter, Depends, HTTPException, status
 from sqlalchemy.orm import Session
 
-from src.web.db.session import get_db
-from src.web.schemas.tags import (
+from backend.db.session import get_db
+from backend.schemas.tags import (
     PhotoTagsRequest,
     PhotoTagsResponse,
     TagCreateRequest,
@@ -21,7 +21,7 @@ from src.web.schemas.tags import (
     PhotosWithTagsResponse,
     PhotoWithTagsItem,
 )
-from src.web.services.tag_service import (
+from backend.services.tag_service import (
     add_tags_to_photos,
     get_or_create_tag,
     list_tags,
@@ -31,7 +31,7 @@ from src.web.services.tag_service import (
     delete_tags,
     get_photos_with_tags,
 )
-from src.web.db.models import Photo
+from backend.db.models import Photo
 
 router = APIRouter(prefix="/tags", tags=["tags"])
 
@@ -11,24 +11,24 @@ from sqlalchemy import text
 from sqlalchemy.exc import IntegrityError
 from sqlalchemy.orm import Session
 
-from src.web.api.auth import get_current_user
-from src.web.constants.roles import (
+from backend.api.auth import get_current_user
+from backend.constants.roles import (
     DEFAULT_ADMIN_ROLE,
     DEFAULT_USER_ROLE,
     ROLE_VALUES,
     UserRole,
     is_admin_role,
 )
-from src.web.db.session import get_auth_db, get_db
-from src.web.db.models import Face, PhotoFavorite, PhotoPersonLinkage, User
-from src.web.schemas.users import (
+from backend.db.session import get_auth_db, get_db
+from backend.db.models import Face, PhotoFavorite, PhotoPersonLinkage, User
+from backend.schemas.users import (
     UserCreateRequest,
     UserResponse,
     UserUpdateRequest,
     UsersListResponse,
 )
-from src.web.utils.password import hash_password
-from src.web.services.role_permissions import fetch_role_permissions_map
+from backend.utils.password import hash_password
+from backend.services.role_permissions import fetch_role_permissions_map
 
 router = APIRouter(prefix="/users", tags=["users"])
 logger = logging.getLogger(__name__)
@@ -2,7 +2,7 @@ from __future__ import annotations
 
 from fastapi import APIRouter
 
-from src.web.settings import APP_VERSION
+from backend.settings import APP_VERSION
 
 
 router = APIRouter()
@@ -9,10 +9,10 @@ from fastapi import APIRouter, Depends, HTTPException, Query, status
 from fastapi.responses import FileResponse
 from sqlalchemy.orm import Session
 
-from src.web.db.session import get_db
-from src.web.db.models import Photo, User
-from src.web.api.auth import get_current_user_with_id
-from src.web.schemas.videos import (
+from backend.db.session import get_db
+from backend.db.models import Photo, User
+from backend.api.auth import get_current_user_with_id
+from backend.schemas.videos import (
     ListVideosResponse,
     VideoListItem,
     PersonInfo,
@@ -22,14 +22,14 @@ from src.web.schemas.videos import (
     IdentifyVideoResponse,
     RemoveVideoPersonResponse,
 )
-from src.web.services.video_service import (
+from backend.services.video_service import (
     list_videos_for_identification,
     get_video_people,
     identify_person_in_video,
     remove_person_from_video,
     get_video_people_count,
 )
-from src.web.services.thumbnail_service import get_video_thumbnail_path
+from backend.services.thumbnail_service import get_video_thumbnail_path
 
 router = APIRouter(prefix="/videos", tags=["videos"])
 
@@ -337,3 +337,7 @@ def get_video_file(
 
 
 
+
+
+
+
@@ -10,31 +10,31 @@ from fastapi import FastAPI
 from fastapi.middleware.cors import CORSMiddleware
 from sqlalchemy import inspect, text
 
-from src.web.api.auth import router as auth_router
-from src.web.api.faces import router as faces_router
-from src.web.api.health import router as health_router
-from src.web.api.jobs import router as jobs_router
-from src.web.api.metrics import router as metrics_router
-from src.web.api.people import router as people_router
-from src.web.api.pending_identifications import router as pending_identifications_router
-from src.web.api.pending_linkages import router as pending_linkages_router
-from src.web.api.photos import router as photos_router
-from src.web.api.reported_photos import router as reported_photos_router
-from src.web.api.pending_photos import router as pending_photos_router
-from src.web.api.tags import router as tags_router
-from src.web.api.users import router as users_router
-from src.web.api.auth_users import router as auth_users_router
-from src.web.api.role_permissions import router as role_permissions_router
-from src.web.api.videos import router as videos_router
-from src.web.api.version import router as version_router
-from src.web.settings import APP_TITLE, APP_VERSION
-from src.web.constants.roles import DEFAULT_ADMIN_ROLE, DEFAULT_USER_ROLE, ROLE_VALUES
-from src.web.db.base import Base, engine
-from src.web.db.session import auth_engine, database_url
+from backend.api.auth import router as auth_router
+from backend.api.faces import router as faces_router
+from backend.api.health import router as health_router
+from backend.api.jobs import router as jobs_router
+from backend.api.metrics import router as metrics_router
+from backend.api.people import router as people_router
+from backend.api.pending_identifications import router as pending_identifications_router
+from backend.api.pending_linkages import router as pending_linkages_router
+from backend.api.photos import router as photos_router
+from backend.api.reported_photos import router as reported_photos_router
+from backend.api.pending_photos import router as pending_photos_router
+from backend.api.tags import router as tags_router
+from backend.api.users import router as users_router
+from backend.api.auth_users import router as auth_users_router
+from backend.api.role_permissions import router as role_permissions_router
+from backend.api.videos import router as videos_router
+from backend.api.version import router as version_router
+from backend.settings import APP_TITLE, APP_VERSION
+from backend.constants.roles import DEFAULT_ADMIN_ROLE, DEFAULT_USER_ROLE, ROLE_VALUES
+from backend.db.base import Base, engine
+from backend.db.session import auth_engine, database_url, get_auth_database_url
 # Import models to ensure they're registered with Base.metadata
-from src.web.db import models # noqa: F401
-from src.web.db.models import RolePermission
-from src.web.utils.password import hash_password
+from backend.db import models # noqa: F401
+from backend.db.models import RolePermission
+from backend.utils.password import hash_password
 
 # Global worker process (will be set in lifespan)
 _worker_process: subprocess.Popen | None = None
@@ -52,22 +52,31 @@ def start_worker() -> None:
     redis_conn.ping()
 
     # Start worker as a subprocess (avoids signal handler issues)
-    project_root = Path(__file__).parent.parent.parent
+    # __file__ is backend/app.py, so parent.parent is the project root (punimtag/)
+    project_root = Path(__file__).parent.parent
 
     # Use explicit Python path to avoid Cursor interception
     # Check if sys.executable is Cursor, if so use /usr/bin/python3
     python_executable = sys.executable
     if "cursor" in python_executable.lower() or not python_executable.startswith("/usr"):
         python_executable = "/usr/bin/python3"
 
+    # Ensure PYTHONPATH is set correctly
+    worker_env = {
+        **{k: v for k, v in os.environ.items()},
+        "PYTHONPATH": str(project_root),
+    }
+
     _worker_process = subprocess.Popen(
         [
             python_executable,
             "-m",
-            "src.web.worker",
+            "backend.worker",
         ],
         cwd=str(project_root),
         stdout=None,  # Don't capture - let output go to console
         stderr=None,  # Don't capture - let errors go to console
-        env={
-            **{k: v for k, v in os.environ.items()},
-            "PYTHONPATH": str(project_root),
-        }
+        env=worker_env
    )
    # Give it a moment to start, then check if it's still running
    import time
@@ -378,7 +387,7 @@ def ensure_face_excluded_column(inspector) -> None:
 
    columns = {column["name"] for column in inspector.get_columns("faces")}
    if "excluded" in columns:
-        print("ℹ️ excluded column already exists in faces table")
+        # Column already exists, no need to print or do anything
        return
 
    print("🔄 Adding excluded column to faces table...")
@@ -476,17 +485,32 @@ def ensure_photo_person_linkage_table(inspector) -> None:
 
 
 def ensure_auth_user_is_active_column() -> None:
-    """Ensure auth database users table contains is_active column."""
+    """Ensure auth database users table contains is_active column.
+
+    NOTE: Auth database is managed by the frontend. This function only checks/updates
+    if the database and table already exist. It will not fail if they don't exist.
+    """
     if auth_engine is None:
         # Auth database not configured
         return
 
     try:
         from sqlalchemy import inspect as sqlalchemy_inspect
-        auth_inspector = sqlalchemy_inspect(auth_engine)
 
+        # Try to get inspector - gracefully handle if database doesn't exist
+        try:
+            auth_inspector = sqlalchemy_inspect(auth_engine)
+        except Exception as inspect_exc:
+            error_str = str(inspect_exc).lower()
+            if "does not exist" in error_str or "database" in error_str:
+                # Database doesn't exist - that's okay, frontend will create it
+                return
+            # Some other error - log but don't fail
+            print(f"ℹ️ Could not inspect auth database: {inspect_exc}")
+            return
+
         if "users" not in auth_inspector.get_table_names():
-            print("ℹ️ Auth database users table does not exist yet")
+            # Table doesn't exist - that's okay, frontend will create it
             return
 
         columns = {column["name"] for column in auth_inspector.get_columns("users")}
@@ -545,11 +569,170 @@ def ensure_role_permissions_table(inspector) -> None:
         print(f"⚠️ Failed to create role_permissions table: {exc}")
 
 
-@asynccontextmanager
+def ensure_postgresql_database(db_url: str) -> None:
+    """Ensure PostgreSQL database exists, create it if it doesn't."""
+    if not db_url.startswith("postgresql"):
+        return  # Not PostgreSQL, skip
+
+    try:
+        from urllib.parse import urlparse, parse_qs
+        import os
+        import psycopg2
+        from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT
+
+        # Parse the database URL
+        parsed = urlparse(db_url.replace("postgresql+psycopg2://", "postgresql://"))
+        db_name = parsed.path.lstrip("/")
+        user = parsed.username
+        password = parsed.password
+        host = parsed.hostname or "localhost"
+        port = parsed.port or 5432
+
+        if not db_name:
+            return  # No database name specified
+
+        # Try to connect to the database
+        try:
+            test_conn = psycopg2.connect(
+                host=host,
+                port=port,
+                user=user,
+                password=password,
+                database=db_name
+            )
+            test_conn.close()
+            return  # Database exists
+        except psycopg2.OperationalError as e:
+            if "does not exist" not in str(e):
+                # Some other error (permissions, connection, etc.)
+                print(f"⚠️ Cannot check if database '{db_name}' exists: {e}")
+                return
+
+        # Database doesn't exist - try to create it
+        print(f"🔄 Creating PostgreSQL database '{db_name}'...")
+
+        # Connect to postgres database to create the new database
+        # Try with the configured user first (they might have CREATEDB privilege)
+        admin_conn = None
+        try:
+            admin_conn = psycopg2.connect(
+                host=host,
+                port=port,
+                user=user,
+                password=password,
+                database="postgres"
+            )
+        except psycopg2.OperationalError:
+            # Try postgres superuser (might need password from environment or .pgpass)
+            try:
+                import os
+                postgres_password = os.getenv("POSTGRES_PASSWORD", "")
+                admin_conn = psycopg2.connect(
+                    host=host,
+                    port=port,
+                    user="postgres",
+                    password=postgres_password if postgres_password else None,
+                    database="postgres"
+                )
+            except psycopg2.OperationalError as e:
+                print(f"⚠️ Cannot create database '{db_name}': insufficient privileges")
+                print(f"   Error: {e}")
+                print(f"   Please create it manually:")
+                print(f"   sudo -u postgres psql -c \"CREATE DATABASE {db_name};\"")
+                print(f"   sudo -u postgres psql -c \"GRANT ALL PRIVILEGES ON DATABASE {db_name} TO {user};\"")
+                return
+
+        if admin_conn is None:
+            return
+
+        admin_conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
+        cursor = admin_conn.cursor()
+
+        # Check if database exists
+        cursor.execute(
+            "SELECT 1 FROM pg_database WHERE datname = %s",
+            (db_name,)
+        )
+        exists = cursor.fetchone()
+
+        if not exists:
+            # Create the database
+            try:
+                cursor.execute(f'CREATE DATABASE "{db_name}"')
+                if user != "postgres" and admin_conn.info.user == "postgres":
+                    # Grant privileges to the user if we're connected as postgres
+                    try:
+                        cursor.execute(f'GRANT ALL PRIVILEGES ON DATABASE "{db_name}" TO "{user}"')
+                    except Exception as grant_exc:
+                        print(f"⚠️ Created database '{db_name}' but could not grant privileges: {grant_exc}")
+
+                # Grant schema permissions (needed for creating tables)
+                if admin_conn.info.user == "postgres":
+                    try:
+                        # Connect to the new database to grant schema permissions
+                        cursor.close()
+                        admin_conn.close()
+                        schema_conn = psycopg2.connect(
+                            host=host,
+                            port=port,
+                            user="postgres",
+                            password=os.getenv("POSTGRES_PASSWORD", "") if os.getenv("POSTGRES_PASSWORD") else None,
+                            database=db_name
+                        )
+                        schema_conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
+                        schema_cursor = schema_conn.cursor()
+                        schema_cursor.execute(f'GRANT ALL ON SCHEMA public TO "{user}"')
+                        schema_cursor.execute(f'ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON TABLES TO "{user}"')
+                        schema_cursor.execute(f'ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON SEQUENCES TO "{user}"')
+                        schema_cursor.close()
+                        schema_conn.close()
+                        print(f"✅ Granted schema permissions to user '{user}'")
+                    except Exception as schema_exc:
+                        print(f"⚠️ Created database '{db_name}' but could not grant schema permissions: {schema_exc}")
+                        print(f"   Please run manually:")
+                        print(f"   sudo -u postgres psql -d {db_name} -c \"GRANT ALL ON SCHEMA public TO {user};\"")
+                        print(f"   sudo -u postgres psql -d {db_name} -c \"ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON TABLES TO {user};\"")
+
+                print(f"✅ Created database '{db_name}'")
+            except Exception as create_exc:
+                print(f"⚠️ Failed to create database '{db_name}': {create_exc}")
+                print(f"   Please create it manually:")
+                print(f"   sudo -u postgres psql -c \"CREATE DATABASE {db_name};\"")
+                if user != "postgres":
+                    print(f"   sudo -u postgres psql -c \"GRANT ALL PRIVILEGES ON DATABASE {db_name} TO {user};\"")
+                cursor.close()
+                admin_conn.close()
+                return
+        else:
+            print(f"ℹ️ Database '{db_name}' already exists")
+
+        cursor.close()
+        admin_conn.close()
+    except Exception as exc:
+        print(f"⚠️ Failed to ensure database exists: {exc}")
+        import traceback
+        print(f"   Traceback: {traceback.format_exc()}")
+        # Don't raise - let the connection attempt fail naturally with a clearer error
+
+
+def ensure_auth_database_tables() -> None:
+    """Ensure auth database tables exist, create them if they don't.
+
+    NOTE: This function is deprecated. Auth database is now managed by the frontend.
+    This function is kept for backward compatibility but will not create tables.
+    """
+    # Auth database is managed by the frontend - do not create tables here
+    return
+
+
 async def lifespan(app: FastAPI):
     """Lifespan context manager for startup and shutdown events."""
     # Ensure database exists and tables are created on first run
     try:
+        # Ensure main PostgreSQL database exists
+        # This must happen BEFORE we try to use the engine
+        ensure_postgresql_database(database_url)
+
+        # Note: Auth database is managed by the frontend, not created here
+
         if database_url.startswith("sqlite"):
             db_path = database_url.replace("sqlite:///", "")
             db_file = Path(db_path)
@@ -586,8 +769,15 @@ async def lifespan(app: FastAPI):
         ensure_face_excluded_column(inspector)
         ensure_role_permissions_table(inspector)
 
-        # Ensure auth database schema
-        ensure_auth_user_is_active_column()
+        # Note: Auth database schema and tables are managed by the frontend
+        # Only check/update if the database exists (don't create it)
+        if auth_engine is not None:
+            try:
+                ensure_auth_user_is_active_column()
+            except Exception as auth_exc:
+                # Auth database might not exist yet - that's okay, frontend will handle it
+                print(f"ℹ️ Auth database not available: {auth_exc}")
+                print("   Frontend will manage auth database setup")
     except Exception as exc:
         print(f"❌ Database initialization failed: {exc}")
         raise
@@ -4,7 +4,7 @@ from __future__ import annotations
 
 from typing import Dict, Final, List, Set
 
-from src.web.constants.roles import UserRole
+from backend.constants.roles import UserRole
 
 ROLE_FEATURES: Final[List[dict[str, str]]] = [
     {"key": "scan", "label": "Scan"},
@@ -2,8 +2,8 @@
 
 from __future__ import annotations
 
-from src.web.db.models import Base
-from src.web.db.session import engine
+from backend.db.models import Base
+from backend.db.session import engine
 
 __all__ = ["Base", "engine"]
 
@@ -21,7 +21,7 @@ from sqlalchemy import (
 )
 from sqlalchemy.orm import declarative_base, relationship
 
-from src.web.constants.roles import DEFAULT_USER_ROLE
+from backend.constants.roles import DEFAULT_USER_ROLE
 
 if TYPE_CHECKING:
     pass
@@ -6,7 +6,7 @@ from typing import Dict
 
 from pydantic import BaseModel, ConfigDict
 
-from src.web.constants.roles import DEFAULT_USER_ROLE, UserRole
+from backend.constants.roles import DEFAULT_USER_ROLE, UserRole
 
 
 class LoginRequest(BaseModel):
@@ -6,8 +6,8 @@ from typing import Dict
 
 from pydantic import BaseModel, ConfigDict, Field
 
-from src.web.constants.role_features import ROLE_FEATURES
-from src.web.constants.roles import UserRole
+from backend.constants.role_features import ROLE_FEATURES
+from backend.constants.roles import UserRole
 
 
 class RoleFeatureSchema(BaseModel):
@@ -42,3 +42,7 @@ class RolePermissionsUpdateRequest(BaseModel):
 
 
 
+
+
+
+
@@ -7,7 +7,7 @@ from typing import Optional
 
 from pydantic import BaseModel, ConfigDict, EmailStr, Field
 
-from src.web.constants.roles import DEFAULT_USER_ROLE, UserRole
+from backend.constants.roles import DEFAULT_USER_ROLE, UserRole
 
 
 class UserResponse(BaseModel):
@@ -90,3 +90,7 @@ class RemoveVideoPersonResponse(BaseModel):
 
 
 
+
+
+
+
Some files were not shown because too many files have changed in this diff.