docs: Update README.md for PostgreSQL requirement and remove SQLite references

This commit updates README.md to reflect that PostgreSQL is now required for both development and production environments. It clarifies the database setup instructions, removes references to SQLite, and keeps the database-configuration documentation consistent. It also clarifies the environment-variable settings and the notes on database schema compatibility between the web and desktop versions.
Author: Tanya, 2026-01-06 11:56:08 -05:00
Parent: b104dcba71
Commit: 1f3f35d535
5 changed files with 95 additions and 540 deletions


@@ -33,7 +33,7 @@ A fast, simple, and modern web application for organizing and tagging photos usi
- **Python 3.12 or higher** (with pip)
- **Node.js 18+ and npm**
- **PostgreSQL** (for production, optional for development with SQLite)
- **PostgreSQL** (required for both development and production)
- **Redis** (for background job processing)
**Note:** The automated installation script (`./install.sh`) will install PostgreSQL and Redis automatically on Ubuntu/Debian systems.
@@ -98,12 +98,11 @@ cd ..
### Database Setup
**Database Configuration:**
The application uses **two separate databases**:
The application uses **two separate PostgreSQL databases**:
1. **Main database** (`punimtag`) - Stores photos, faces, people, tags, and backend user accounts
- **Default: SQLite** at `data/punimtag.db` (for development)
- **Optional: PostgreSQL** (for production)
- **Required: PostgreSQL**
2. **Auth database** (`punimtag_auth`) - Stores frontend website user accounts and moderation data
- **Required: PostgreSQL** (always uses PostgreSQL)
- **Required: PostgreSQL**
Both database connections are configured via the `.env` file.
@@ -157,10 +156,10 @@ Alternatively, use the automated script (requires sudo password):
**Configuration:**
The `.env` file in the project root contains database connection strings:
```bash
# Main application database (SQLite - default for development)
DATABASE_URL=sqlite:///data/punimtag.db
# Main application database (PostgreSQL - required)
DATABASE_URL=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag
# Auth database (PostgreSQL - always required for frontend website users)
# Auth database (PostgreSQL - required for frontend website users)
DATABASE_URL_AUTH=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag_auth
```
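As a quick sanity check, a connection string like the ones above can be validated before the app starts. This is a stdlib-only sketch; the helper name is hypothetical and not part of the project:

```python
from urllib.parse import urlsplit

def check_pg_url(url: str) -> dict:
    """Validate that a DATABASE_URL points at PostgreSQL and pull out its parts."""
    parts = urlsplit(url)
    # SQLAlchemy-style schemes like "postgresql+psycopg2" also start with "postgresql"
    if not parts.scheme.startswith("postgresql"):
        raise ValueError(f"expected a postgresql:// URL, got scheme {parts.scheme!r}")
    return {
        "user": parts.username,
        "host": parts.hostname,
        "port": parts.port,
        "database": parts.path.lstrip("/"),
    }

info = check_pg_url(
    "postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag"
)
```

A leftover `sqlite:///` URL from an old `.env` would fail this check immediately instead of surfacing as a confusing error later.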
@@ -172,24 +171,6 @@ The web application will:
- Create all required tables with the correct schema on startup
- Match the desktop version schema exactly for compatibility
**Note:** The main database uses SQLite by default for easier development. For production, you can switch to PostgreSQL by updating `DATABASE_URL` in `.env`.
**SQLite (Default - Local Database):**
The main database uses SQLite by default for development. The `.env` file should contain:
```bash
# Main database (SQLite - default for development)
DATABASE_URL=sqlite:///data/punimtag.db
# Or use absolute path:
# DATABASE_URL=file:/home/ladmin/code/punimtag/data/punimtag.db
```
**PostgreSQL (Optional - for Production):**
To use PostgreSQL for the main database instead, set:
```bash
DATABASE_URL=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag
```
**Database Schema:**
The web version uses the **exact same schema** as the desktop version for full compatibility:
- `photos` - Photo metadata (path, filename, date_taken, processed, media_type)
@@ -469,7 +450,7 @@ punimtag/
- ✅ Job management endpoints (RQ/Redis integration)
- ✅ SQLAlchemy models for all entities
- ✅ Alembic migrations configured and applied
- ✅ Database initialized (SQLite default, PostgreSQL supported)
- ✅ Database initialized (PostgreSQL required)
- ✅ RQ worker auto-start (starts automatically with API server)
- ✅ Pending linkage moderation API for user tag suggestions
@@ -486,8 +467,7 @@ punimtag/
- ✅ All tables created automatically on startup: `photos`, `faces`, `people`, `person_encodings`, `tags`, `phototaglinkage`
- ✅ Schema matches desktop version exactly for full compatibility
- ✅ Indices configured for performance
- ✅ SQLite database at `data/punimtag.db` (auto-created if missing, default for development)
- ✅ PostgreSQL support for production deployments
- ✅ PostgreSQL database (required for both development and production)
- ✅ Separate auth database (PostgreSQL) for frontend user accounts
### Image Ingestion & Processing
@@ -578,34 +558,26 @@ punimtag/
### Database
**SQLite (Default - Local Database):**
The main database uses SQLite by default for development, configured via the `.env` file:
**PostgreSQL (Required):**
Both databases use PostgreSQL. Configure via the `.env` file:
```bash
# Main application database (SQLite - default)
DATABASE_URL=sqlite:///data/punimtag.db
# Main application database (PostgreSQL - required)
DATABASE_URL=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag
# Auth database (PostgreSQL - always required for frontend website users)
# Auth database (PostgreSQL - required for frontend website users)
DATABASE_URL_AUTH=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag_auth
```
**PostgreSQL (Optional - for Production):**
To use PostgreSQL for the main database instead:
```bash
DATABASE_URL=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag
```
**Note:** The auth database (`DATABASE_URL_AUTH`) always uses PostgreSQL and is required for frontend website user authentication features.
### Environment Variables
Configuration is managed via the `.env` file in the project root. A `.env.example` template is provided.
**Required Configuration:**
```bash
# Database (SQLite by default for development)
DATABASE_URL=sqlite:///data/punimtag.db
# Main Database (PostgreSQL - required)
DATABASE_URL=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag
# Auth Database (PostgreSQL - always required for frontend website user accounts)
# Auth Database (PostgreSQL - required for frontend website user accounts)
DATABASE_URL_AUTH=postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag_auth
# JWT Secrets (change in production!)
@@ -629,15 +601,14 @@ VITE_API_URL=http://127.0.0.1:8000
**Viewer Frontend Configuration:**
Create a `.env` file in the `viewer-frontend/` directory:
```bash
# Main database connection (SQLite - matches backend default)
# Use absolute path for SQLite
DATABASE_URL=file:/home/ladmin/code/punimtag/data/punimtag.db
# Main database connection (PostgreSQL - required)
DATABASE_URL=postgresql://punimtag:punimtag_password@localhost:5432/punimtag
# Auth database connection (PostgreSQL - always required)
# Auth database connection (PostgreSQL - required)
DATABASE_URL_AUTH=postgresql://punimtag:punimtag_password@localhost:5432/punimtag_auth
# Write-capable database connection (optional, falls back to DATABASE_URL if not set)
DATABASE_URL_WRITE=file:/home/ladmin/code/punimtag/data/punimtag.db
DATABASE_URL_WRITE=postgresql://punimtag:punimtag_password@localhost:5432/punimtag
# NextAuth configuration
NEXTAUTH_URL=http://localhost:3001
@@ -651,10 +622,7 @@ cd viewer-frontend
npx prisma generate
```
**Important:** The viewer frontend uses **SQLite** for the main database (matching the backend default). The Prisma schema is configured for SQLite. If you change the backend to PostgreSQL, you'll need to:
1. Update `viewer-frontend/prisma/schema.prisma` to use `provider = "postgresql"`
2. Update `DATABASE_URL` in `viewer-frontend/.env` to the PostgreSQL connection string
3. Run `npx prisma generate` again
**Important:** The viewer frontend uses **PostgreSQL** for the main database (matching the backend). The Prisma schema is configured for PostgreSQL.
**Note:** The viewer frontend uses the same database as the backend by default. For production deployments, you may want to create separate read-only and write users for better security.
@@ -677,7 +645,7 @@ npx prisma generate
**Backend:**
- **Framework**: FastAPI (Python 3.12+)
- **Database**: PostgreSQL (default, network), SQLite (optional, local)
- **Database**: PostgreSQL (required)
- **ORM**: SQLAlchemy 2.0
- **Configuration**: Environment variables via `.env` file (python-dotenv)
- **Jobs**: Redis + RQ
@@ -743,7 +711,7 @@ npx prisma generate
## 🐛 Known Limitations
- Multi-user support with role-based permissions (single-user mode deprecated)
- SQLite for development (PostgreSQL recommended for production)
- PostgreSQL for both development and production
- GPU acceleration not yet implemented (CPU-only for now)
- Large databases (>50K photos) may require optimization
- DeepFace model downloads on first use (can take 5-10 minutes, ~100MB)


@@ -132,38 +132,24 @@ def ensure_user_password_hash_column(inspector) -> None:
print("🔄 Adding password_hash column to users table...")
default_hash = hash_password("changeme")
dialect = engine.dialect.name
with engine.connect() as connection:
with connection.begin():
if dialect == "postgresql":
# PostgreSQL: Add column as nullable first, then update, then set NOT NULL
connection.execute(
text("ALTER TABLE users ADD COLUMN IF NOT EXISTS password_hash TEXT")
)
connection.execute(
text(
"UPDATE users SET password_hash = :default_hash "
"WHERE password_hash IS NULL OR password_hash = ''"
),
{"default_hash": default_hash},
)
# Set NOT NULL constraint
connection.execute(
text("ALTER TABLE users ALTER COLUMN password_hash SET NOT NULL")
)
else:
# SQLite
connection.execute(
text("ALTER TABLE users ADD COLUMN password_hash TEXT")
)
connection.execute(
text(
"UPDATE users SET password_hash = :default_hash "
"WHERE password_hash IS NULL OR password_hash = ''"
),
{"default_hash": default_hash},
)
# PostgreSQL: Add column as nullable first, then update, then set NOT NULL
connection.execute(
text("ALTER TABLE users ADD COLUMN IF NOT EXISTS password_hash TEXT")
)
connection.execute(
text(
"UPDATE users SET password_hash = :default_hash "
"WHERE password_hash IS NULL OR password_hash = ''"
),
{"default_hash": default_hash},
)
# Set NOT NULL constraint
connection.execute(
text("ALTER TABLE users ALTER COLUMN password_hash SET NOT NULL")
)
print("✅ Added password_hash column to users table (default password: changeme)")
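The migration above follows a three-step pattern: add the column without a constraint, backfill existing rows, then tighten to NOT NULL. A minimal sketch that just assembles those statements in order (the helper is illustrative, not project code):

```python
def backfill_not_null_statements(table: str, column: str, coltype: str = "TEXT") -> list[str]:
    """Build the add-nullable -> backfill -> SET NOT NULL sequence for PostgreSQL."""
    return [
        # 1. Add the column as nullable so existing rows stay valid
        f"ALTER TABLE {table} ADD COLUMN IF NOT EXISTS {column} {coltype}",
        # 2. Backfill rows that have no value yet
        f"UPDATE {table} SET {column} = :default_value "
        f"WHERE {column} IS NULL OR {column} = ''",
        # 3. Only once every row has a value is the constraint safe to add
        f"ALTER TABLE {table} ALTER COLUMN {column} SET NOT NULL",
    ]

stmts = backfill_not_null_statements("users", "password_hash")
```

Running the statements in any other order would fail on tables that already contain rows, which is why the migration keeps them in one transaction.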
@@ -178,22 +164,12 @@ def ensure_user_password_change_required_column(inspector) -> None:
return
print("🔄 Adding password_change_required column to users table...")
dialect = engine.dialect.name
with engine.connect() as connection:
with connection.begin():
if dialect == "postgresql":
connection.execute(
text("ALTER TABLE users ADD COLUMN IF NOT EXISTS password_change_required BOOLEAN NOT NULL DEFAULT true")
)
else:
# SQLite
connection.execute(
text("ALTER TABLE users ADD COLUMN password_change_required BOOLEAN DEFAULT 1")
)
connection.execute(
text("UPDATE users SET password_change_required = 1 WHERE password_change_required IS NULL")
)
connection.execute(
text("ALTER TABLE users ADD COLUMN IF NOT EXISTS password_change_required BOOLEAN NOT NULL DEFAULT true")
)
print("✅ Added password_change_required column to users table")
@@ -209,40 +185,32 @@ def ensure_user_email_unique_constraint(inspector) -> None:
return
# Check if unique constraint already exists on email
dialect = engine.dialect.name
with engine.connect() as connection:
if dialect == "postgresql":
# Check if unique constraint exists
result = connection.execute(text("""
SELECT constraint_name
FROM information_schema.table_constraints
WHERE table_name = 'users'
AND constraint_type = 'UNIQUE'
AND constraint_name LIKE '%email%'
"""))
if result.first():
print(" Unique constraint on email column already exists")
return
# Try to add unique constraint (will fail if duplicates exist)
try:
print("🔄 Adding unique constraint to email column...")
connection.execute(text("ALTER TABLE users ADD CONSTRAINT uq_users_email UNIQUE (email)"))
connection.commit()
print("✅ Added unique constraint to email column")
except Exception as e:
# If constraint already exists or duplicates exist, that's okay
# API validation will prevent new duplicates
if "already exists" in str(e).lower() or "duplicate" in str(e).lower():
print(f" Could not add unique constraint (may have duplicates): {e}")
else:
print(f"⚠️ Could not add unique constraint: {e}")
else:
# SQLite - unique constraint is handled at column level
# Check if column already has unique constraint
# SQLite doesn't easily support adding unique constraints to existing columns
# The model definition will handle it for new tables
print(" SQLite: Unique constraint on email will be enforced by model definition for new tables")
# Check if unique constraint exists
result = connection.execute(text("""
SELECT constraint_name
FROM information_schema.table_constraints
WHERE table_name = 'users'
AND constraint_type = 'UNIQUE'
AND constraint_name LIKE '%email%'
"""))
if result.first():
print(" Unique constraint on email column already exists")
return
# Try to add unique constraint (will fail if duplicates exist)
try:
print("🔄 Adding unique constraint to email column...")
connection.execute(text("ALTER TABLE users ADD CONSTRAINT uq_users_email UNIQUE (email)"))
connection.commit()
print("✅ Added unique constraint to email column")
except Exception as e:
# If constraint already exists or duplicates exist, that's okay
# API validation will prevent new duplicates
if "already exists" in str(e).lower() or "duplicate" in str(e).lower():
print(f" Could not add unique constraint (may have duplicates): {e}")
else:
print(f"⚠️ Could not add unique constraint: {e}")
def ensure_face_identified_by_user_id_column(inspector) -> None:
@@ -271,18 +239,6 @@ def ensure_face_identified_by_user_id_column(inspector) -> None:
)
except Exception:
pass # Index might already exist
else:
# SQLite
connection.execute(
text("ALTER TABLE faces ADD COLUMN identified_by_user_id INTEGER REFERENCES users(id)")
)
# SQLite doesn't support IF NOT EXISTS for indexes, so we'll try to create it
try:
connection.execute(
text("CREATE INDEX idx_faces_identified_by ON faces(identified_by_user_id)")
)
except Exception:
pass # Index might already exist
print("✅ Added identified_by_user_id column to faces table")
@@ -370,22 +326,6 @@ def ensure_photo_media_type_column(inspector) -> None:
)
except Exception:
pass # Index might already exist
else:
# SQLite
connection.execute(
text("ALTER TABLE photos ADD COLUMN media_type TEXT DEFAULT 'image'")
)
# Update existing rows to have 'image' as default
connection.execute(
text("UPDATE photos SET media_type = 'image' WHERE media_type IS NULL")
)
# SQLite doesn't support IF NOT EXISTS for indexes, so we'll try to create it
try:
connection.execute(
text("CREATE INDEX idx_photos_media_type ON photos(media_type)")
)
except Exception:
pass # Index might already exist
print("✅ Added media_type column to photos table")
@@ -418,18 +358,6 @@ def ensure_face_excluded_column(inspector) -> None:
)
except Exception:
pass # Index might already exist
else:
# SQLite
connection.execute(
text("ALTER TABLE faces ADD COLUMN excluded BOOLEAN DEFAULT 0 NOT NULL")
)
# Create index
try:
connection.execute(
text("CREATE INDEX idx_faces_excluded ON faces(excluded)")
)
except Exception:
pass # Index might already exist
print("✅ Added excluded column to faces table")
@@ -440,11 +368,9 @@ def ensure_photo_person_linkage_table(inspector) -> None:
return
print("🔄 Creating photo_person_linkage table...")
dialect = engine.dialect.name
with engine.connect() as connection:
with connection.begin():
if dialect == "postgresql":
connection.execute(text("""
CREATE TABLE IF NOT EXISTS photo_person_linkage (
id SERIAL PRIMARY KEY,
@@ -467,30 +393,6 @@ def ensure_photo_person_linkage_table(inspector) -> None:
)
except Exception:
pass # Index might already exist
else:
# SQLite
connection.execute(text("""
CREATE TABLE IF NOT EXISTS photo_person_linkage (
id INTEGER PRIMARY KEY AUTOINCREMENT,
photo_id INTEGER NOT NULL REFERENCES photos(id) ON DELETE CASCADE,
person_id INTEGER NOT NULL REFERENCES people(id) ON DELETE CASCADE,
identified_by_user_id INTEGER REFERENCES users(id),
created_date DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
UNIQUE(photo_id, person_id)
)
"""))
# Create indexes
for idx_name, idx_col in [
("idx_photo_person_photo", "photo_id"),
("idx_photo_person_person", "person_id"),
("idx_photo_person_user", "identified_by_user_id"),
]:
try:
connection.execute(
text(f"CREATE INDEX {idx_name} ON photo_person_linkage({idx_col})")
)
except Exception:
pass # Index might already exist
print("✅ Created photo_person_linkage table")
@@ -536,15 +438,9 @@ def ensure_auth_user_is_active_column() -> None:
try:
with auth_engine.connect() as connection:
with connection.begin():
if dialect == "postgresql":
connection.execute(
text("ALTER TABLE users ADD COLUMN IF NOT EXISTS is_active BOOLEAN DEFAULT TRUE")
)
else:
# SQLite
connection.execute(
text("ALTER TABLE users ADD COLUMN is_active BOOLEAN DEFAULT 1")
)
connection.execute(
text("ALTER TABLE users ADD COLUMN IF NOT EXISTS is_active BOOLEAN DEFAULT TRUE")
)
print("✅ Added is_active column to auth database users table")
except Exception as alter_exc:
# Check if it's a permission error
@@ -552,10 +448,7 @@ def ensure_auth_user_is_active_column() -> None:
if "permission" in error_str.lower() or "insufficient" in error_str.lower() or "owner" in error_str.lower():
print("⚠️ Cannot add is_active column: insufficient database privileges")
print(" The column will need to be added manually by a database administrator:")
if dialect == "postgresql":
print(" ALTER TABLE users ADD COLUMN is_active BOOLEAN DEFAULT TRUE;")
else:
print(" ALTER TABLE users ADD COLUMN is_active BOOLEAN DEFAULT 1;")
print(" ALTER TABLE users ADD COLUMN is_active BOOLEAN DEFAULT TRUE;")
print(" Until then, users with linked data cannot be deleted.")
else:
# Some other error
@@ -743,11 +636,6 @@ async def lifespan(app: FastAPI):
# Note: Auth database is managed by the frontend, not created here
if database_url.startswith("sqlite"):
db_path = database_url.replace("sqlite:///", "")
db_file = Path(db_path)
db_file.parent.mkdir(parents=True, exist_ok=True)
# Only create tables if they don't already exist (safety check)
inspector = inspect(engine)
existing_tables = set(inspector.get_table_names())


@@ -10,7 +10,6 @@ from sqlalchemy import (
Column,
Date,
DateTime,
String,
ForeignKey,
Index,
Integer,
@@ -19,7 +18,6 @@ from sqlalchemy import (
Text,
UniqueConstraint,
CheckConstraint,
TypeDecorator,
)
from sqlalchemy.orm import declarative_base, relationship
@@ -31,147 +29,6 @@ if TYPE_CHECKING:
Base = declarative_base()
class PrismaCompatibleDateTime(TypeDecorator):
"""
DateTime type that stores in a format compatible with Prisma's SQLite driver.
Prisma's SQLite driver has issues with microseconds in datetime strings.
This type ensures datetimes are stored in ISO format without microseconds:
'YYYY-MM-DD HH:MM:SS' instead of 'YYYY-MM-DD HH:MM:SS.ffffff'
Uses String as the underlying type for SQLite to have full control over the format.
"""
impl = String
cache_ok = True
def process_bind_param(self, value, dialect):
"""Convert Python datetime to SQL string format without microseconds."""
if value is None:
return None
if isinstance(value, datetime):
# Strip microseconds and format as ISO string without microseconds
# This ensures Prisma can read it correctly
return value.replace(microsecond=0).strftime('%Y-%m-%d %H:%M:%S')
# If it's already a string, ensure it doesn't have microseconds
if isinstance(value, str):
try:
# Parse and reformat to remove microseconds
if '.' in value:
# Has microseconds or timezone info - strip them
dt = datetime.strptime(value.split('.')[0], '%Y-%m-%d %H:%M:%S')
elif 'T' in value:
# ISO format with T
dt = datetime.fromisoformat(value.replace('Z', '+00:00').split('.')[0])
else:
# Already in correct format
return value
return dt.strftime('%Y-%m-%d %H:%M:%S')
except (ValueError, TypeError):
# If parsing fails, return as-is
return value
return value
def process_result_value(self, value, dialect):
"""Convert SQL string back to Python datetime."""
if value is None:
return None
if isinstance(value, str):
# Parse ISO format string
try:
# Try parsing with microseconds first (for existing data)
if '.' in value:
return datetime.strptime(value.split('.')[0], '%Y-%m-%d %H:%M:%S')
else:
return datetime.strptime(value, '%Y-%m-%d %H:%M:%S')
except ValueError:
# Fallback to ISO format parser
return datetime.fromisoformat(value.replace('Z', '+00:00'))
return value
class PrismaCompatibleDate(TypeDecorator):
"""
Date type that stores in DateTime format for Prisma compatibility.
Prisma's SQLite driver expects DateTime format (YYYY-MM-DD HH:MM:SS) even for dates.
This type stores dates with a time component (00:00:00) so Prisma can read them correctly,
while still using Python's date type in the application.
Uses String as the underlying type for SQLite to have full control over the format.
"""
impl = String
cache_ok = True
def process_bind_param(self, value, dialect):
"""Convert Python date to space-separated DateTime format for Prisma compatibility."""
if value is None:
return None
if isinstance(value, date):
# Store date in space-separated format: YYYY-MM-DD HH:MM:SS (matching date_added format)
return value.strftime('%Y-%m-%d 00:00:00')
if isinstance(value, datetime):
# If datetime is passed, extract date and format with time component
return value.date().strftime('%Y-%m-%d 00:00:00')
if isinstance(value, str):
# If it's already a string, ensure it's in space-separated format
try:
# Try to parse and convert to space-separated format
if 'T' in value:
# ISO format with T - convert to space-separated
date_part, time_part = value.split('T', 1)
time_part = time_part.split('+')[0].split('-')[0].split('Z')[0].split('.')[0]
if len(time_part.split(':')) == 3:
return f"{date_part} {time_part}"
else:
return f"{date_part} 00:00:00"
elif ' ' in value:
# Already space-separated - ensure it has time component
parts = value.split(' ', 1)
if len(parts) == 2:
date_part, time_part = parts
time_part = time_part.split('.')[0] # Remove microseconds if present
if len(time_part.split(':')) == 3:
return f"{date_part} {time_part}"
# Missing time component - add it
return f"{parts[0]} 00:00:00"
else:
# Just date (YYYY-MM-DD) - add time component
d = datetime.strptime(value, '%Y-%m-%d').date()
return d.strftime('%Y-%m-%d 00:00:00')
except (ValueError, TypeError):
# If parsing fails, return as-is
return value
return value
def process_result_value(self, value, dialect):
"""Convert SQL string back to Python date."""
if value is None:
return None
if isinstance(value, str):
# Extract date part from ISO 8601 or space-separated DateTime string
try:
if 'T' in value:
# ISO format with T
return datetime.fromisoformat(value.split('T')[0]).date()
elif ' ' in value:
# Space-separated format - extract date part
return datetime.strptime(value.split()[0], '%Y-%m-%d').date()
else:
# Just date (YYYY-MM-DD)
return datetime.strptime(value, '%Y-%m-%d').date()
except ValueError:
# Fallback to ISO format parser
try:
return datetime.fromisoformat(value.split('T')[0]).date()
except:
return datetime.strptime(value.split()[0], '%Y-%m-%d').date()
if isinstance(value, (date, datetime)):
if isinstance(value, datetime):
return value.date()
return value
return value
class Photo(Base):
"""Photo model - matches desktop schema exactly."""
@@ -180,8 +37,8 @@ class Photo(Base):
id = Column(Integer, primary_key=True, autoincrement=True, index=True)
path = Column(Text, unique=True, nullable=False, index=True)
filename = Column(Text, nullable=False)
date_added = Column(PrismaCompatibleDateTime, default=datetime.utcnow, nullable=False)
date_taken = Column(PrismaCompatibleDate, nullable=True, index=True)
date_added = Column(DateTime, default=datetime.utcnow, nullable=False)
date_taken = Column(Date, nullable=True, index=True)
processed = Column(Boolean, default=False, nullable=False, index=True)
file_hash = Column(Text, nullable=True, index=True) # Nullable to support existing photos without hashes
media_type = Column(Text, default="image", nullable=False, index=True) # "image" or "video"
@@ -214,7 +71,7 @@ class Person(Base):
middle_name = Column(Text, nullable=True)
maiden_name = Column(Text, nullable=True)
date_of_birth = Column(Date, nullable=True)
created_date = Column(PrismaCompatibleDateTime, default=datetime.utcnow, nullable=False)
created_date = Column(DateTime, default=datetime.utcnow, nullable=False)
faces = relationship("Face", back_populates="person")
person_encodings = relationship(
@@ -285,7 +142,7 @@ class PersonEncoding(Base):
quality_score = Column(Numeric, default=0.0, nullable=False, index=True)
detector_backend = Column(Text, default="retinaface", nullable=False)
model_name = Column(Text, default="ArcFace", nullable=False)
created_date = Column(PrismaCompatibleDateTime, default=datetime.utcnow, nullable=False)
created_date = Column(DateTime, default=datetime.utcnow, nullable=False)
person = relationship("Person", back_populates="person_encodings")
face = relationship("Face", back_populates="person_encodings")
@@ -303,7 +160,7 @@ class Tag(Base):
id = Column(Integer, primary_key=True, autoincrement=True, index=True)
tag_name = Column(Text, unique=True, nullable=False, index=True)
created_date = Column(PrismaCompatibleDateTime, default=datetime.utcnow, nullable=False)
created_date = Column(DateTime, default=datetime.utcnow, nullable=False)
photo_tags = relationship(
"PhotoTagLinkage", back_populates="tag", cascade="all, delete-orphan"
@@ -322,7 +179,7 @@ class PhotoTagLinkage(Base):
Integer, default=0, nullable=False,
server_default="0"
)
created_date = Column(PrismaCompatibleDateTime, default=datetime.utcnow, nullable=False)
created_date = Column(DateTime, default=datetime.utcnow, nullable=False)
photo = relationship("Photo", back_populates="photo_tags")
tag = relationship("Tag", back_populates="photo_tags")
@@ -343,7 +200,7 @@ class PhotoFavorite(Base):
id = Column(Integer, primary_key=True, autoincrement=True)
username = Column(Text, nullable=False, index=True)
photo_id = Column(Integer, ForeignKey("photos.id"), nullable=False, index=True)
created_date = Column(PrismaCompatibleDateTime, default=datetime.utcnow, nullable=False)
created_date = Column(DateTime, default=datetime.utcnow, nullable=False)
photo = relationship("Photo", back_populates="favorites")
@@ -374,8 +231,8 @@ class User(Base):
index=True,
)
password_change_required = Column(Boolean, default=True, nullable=False, index=True)
created_date = Column(PrismaCompatibleDateTime, default=datetime.utcnow, nullable=False)
last_login = Column(PrismaCompatibleDateTime, nullable=True)
created_date = Column(DateTime, default=datetime.utcnow, nullable=False)
last_login = Column(DateTime, nullable=True)
__table_args__ = (
Index("idx_users_username", "username"),
@@ -399,7 +256,7 @@ class PhotoPersonLinkage(Base):
photo_id = Column(Integer, ForeignKey("photos.id"), nullable=False, index=True)
person_id = Column(Integer, ForeignKey("people.id"), nullable=False, index=True)
identified_by_user_id = Column(Integer, ForeignKey("users.id"), nullable=True, index=True)
created_date = Column(PrismaCompatibleDateTime, default=datetime.utcnow, nullable=False)
created_date = Column(DateTime, default=datetime.utcnow, nullable=False)
photo = relationship("Photo", back_populates="video_people")
person = relationship("Person", back_populates="video_photos")


@@ -20,8 +20,8 @@ def get_database_url() -> str:
db_url = os.getenv("DATABASE_URL")
if db_url:
return db_url
# Default to SQLite for development
return "sqlite:///data/punimtag.db"
# Default to PostgreSQL for development
return "postgresql+psycopg2://punimtag:punimtag_password@localhost:5432/punimtag"
def get_auth_database_url() -> str:
@@ -34,24 +34,17 @@ def get_auth_database_url() -> str:
database_url = get_database_url()
# SQLite-specific configuration
connect_args = {}
if database_url.startswith("sqlite"):
connect_args = {"check_same_thread": False}
# PostgreSQL connection pool settings
pool_kwargs = {"pool_pre_ping": True}
if database_url.startswith("postgresql"):
pool_kwargs.update({
"pool_size": 10,
"max_overflow": 20,
"pool_recycle": 3600,
})
pool_kwargs = {
"pool_pre_ping": True,
"pool_size": 10,
"max_overflow": 20,
"pool_recycle": 3600,
}
engine = create_engine(
database_url,
future=True,
connect_args=connect_args,
**pool_kwargs
)
SessionLocal = sessionmaker(bind=engine, autoflush=False, autocommit=False, future=True)
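The pooled engine above feeds sessions to request handlers through a yield/finally dependency. A minimal sketch of that lifecycle, with a stand-in session class rather than the project's actual SQLAlchemy `Session`:

```python
class FakeSession:
    """Stand-in for a SQLAlchemy Session, just to show the lifecycle."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

def get_db():
    """Yield a session and guarantee it is closed, even if the request errors."""
    db = FakeSession()
    try:
        yield db
    finally:
        # Runs on normal completion, on exceptions, and on generator close
        db.close()
```

FastAPI drives this generator itself: it advances to the `yield` before the handler runs, then closes the generator afterward, so the `finally` block returns the connection to the pool.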
@@ -69,22 +62,16 @@ def get_db() -> Generator:
# Auth database setup
try:
auth_database_url = get_auth_database_url()
auth_connect_args = {}
if auth_database_url.startswith("sqlite"):
auth_connect_args = {"check_same_thread": False}
auth_pool_kwargs = {"pool_pre_ping": True}
if auth_database_url.startswith("postgresql"):
auth_pool_kwargs.update({
"pool_size": 10,
"max_overflow": 20,
"pool_recycle": 3600,
})
auth_pool_kwargs = {
"pool_pre_ping": True,
"pool_size": 10,
"max_overflow": 20,
"pool_recycle": 3600,
}
auth_engine = create_engine(
auth_database_url,
future=True,
connect_args=auth_connect_args,
**auth_pool_kwargs
)
AuthSessionLocal = sessionmaker(bind=auth_engine, autoflush=False, autocommit=False, future=True)


@@ -1,145 +0,0 @@
#!/usr/bin/env python3
"""
Fix DateTime format in SQLite database to be compatible with Prisma.
This script updates all DateTime columns to remove microseconds,
ensuring Prisma's SQLite driver can read them correctly.
"""
import sqlite3
from pathlib import Path
from datetime import datetime
import sys
def fix_datetime_in_db(db_path: str) -> None:
"""Fix datetime formats in SQLite database."""
conn = sqlite3.connect(db_path)
cursor = conn.cursor()
# Tables and their datetime columns
datetime_columns = {
'people': ['created_date', 'date_of_birth'],
'photos': ['date_added', 'date_taken'],
'faces': ['created_date'],
'tags': ['created_date'],
'phototaglinkage': ['created_date'],
'person_encodings': ['created_date'],
'photo_favorites': ['created_date'],
'users': ['created_date', 'last_login'],
'photo_person_linkage': ['created_date'],
}
fixed_count = 0
for table, columns in datetime_columns.items():
# Check if table exists
cursor.execute("""
SELECT name FROM sqlite_master
WHERE type='table' AND name=?
""", (table,))
if not cursor.fetchone():
print(f"⚠️ Table '{table}' does not exist, skipping...")
continue
for column in columns:
# Check if column exists
cursor.execute(f"PRAGMA table_info({table})")
columns_info = cursor.fetchall()
column_exists = any(col[1] == column for col in columns_info)
if not column_exists:
print(f"⚠️ Column '{table}.{column}' does not exist, skipping...")
continue
# Get primary key column name
cursor.execute(f"PRAGMA table_info({table})")
columns_info = cursor.fetchall()
pk_column = None
for col in columns_info:
if col[5] == 1: # pk flag
pk_column = col[1]
break
if not pk_column:
print(f"⚠️ Table '{table}' has no primary key, skipping column '{column}'...")
continue
# Get all rows with this column
cursor.execute(f"SELECT {pk_column}, {column} FROM {table} WHERE {column} IS NOT NULL")
rows = cursor.fetchall()
for row_id, dt_value in rows:
if dt_value is None:
continue
try:
# Parse the datetime value
if isinstance(dt_value, str):
# Check if it's a Date (YYYY-MM-DD) or DateTime (YYYY-MM-DD HH:MM:SS)
if ' ' in dt_value or 'T' in dt_value:
# It's a DateTime - try parsing with microseconds
if '.' in dt_value:
dt = datetime.strptime(dt_value.split('.')[0], '%Y-%m-%d %H:%M:%S')
elif 'T' in dt_value:
# ISO format with T
dt = datetime.fromisoformat(dt_value.replace('Z', '+00:00').split('.')[0])
else:
dt = datetime.strptime(dt_value, '%Y-%m-%d %H:%M:%S')
# Format without microseconds
new_value = dt.strftime('%Y-%m-%d %H:%M:%S')
# Update if different
if new_value != dt_value:
cursor.execute(
f"UPDATE {table} SET {column} = ? WHERE {pk_column} = ?",
(new_value, row_id)
)
fixed_count += 1
print(f"✅ Fixed {table}.{column} for {pk_column}={row_id}: {dt_value} -> {new_value}")
else:
# It's a Date (YYYY-MM-DD) - this is fine, no need to fix
pass
except (ValueError, TypeError) as e:
print(f"⚠️ Could not parse {table}.{column} for {pk_column}={row_id}: {dt_value} ({e})")
continue
conn.commit()
conn.close()
print(f"\n✅ Fixed {fixed_count} datetime values")
print("✅ Database datetime format is now Prisma-compatible")
if __name__ == "__main__":
# Get database path from environment or use default
import os
from pathlib import Path
db_url = os.getenv("DATABASE_URL", "sqlite:///data/punimtag.db")
if db_url.startswith("sqlite:///"):
db_path = db_url.replace("sqlite:///", "")
elif db_url.startswith("file:"):
db_path = db_url.replace("file:", "")
else:
print("❌ This script only works with SQLite databases")
print(f" DATABASE_URL: {db_url}")
sys.exit(1)
# Resolve relative path
if not Path(db_path).is_absolute():
# Assume relative to project root
project_root = Path(__file__).parent.parent.parent
db_path = project_root / db_path
db_path = str(db_path)
if not Path(db_path).exists():
print(f"❌ Database file not found: {db_path}")
sys.exit(1)
print(f"🔧 Fixing datetime format in: {db_path}\n")
fix_datetime_in_db(db_path)
print("\n✅ Done!")