# Local Testing Guide for POTE

## Testing Locally Before Deployment

### Quick Test - Run Full Suite

```bash
cd /home/user/Documents/code/pote
source venv/bin/activate
pytest -v
```

**Expected Result:** All 55 tests should pass.


## 📊 Current Data Status

**Live Data Status: NOT LIVE YET**

Why?

- 🔴 House Stock Watcher API is DOWN (domain issues, unreachable)
- 🟢 yfinance works (for price data)
- 🟡 Sample data available (5 trades from fixtures)

### What Data Do You Have?

On your deployed system (Proxmox):

```bash
ssh poteapp@10.0.10.95
cd ~/pote
source venv/bin/activate
bash ~/status.sh
```

This will show:

## 🧪 Testing Analytics Locally

### 1. Unit Tests (Fast, No External Dependencies)

```bash
# Test analytics calculations with mock data
pytest tests/test_analytics.py -v

# Test integration with realistic data
pytest tests/test_analytics_integration.py -v
```

These tests:

- Create synthetic price data
- Simulate trades with known returns
- Verify calculations are correct
- Test edge cases (missing data, sell trades, etc.)
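The synthetic-data approach can be sketched in isolation. The helper below is illustrative only — `window_return` and the flat `dict` of prices are hypothetical stand-ins for the project's actual calculator, not its API:

```python
from datetime import date, timedelta

def window_return(prices: dict, start: date, window_days: int) -> float:
    """Simple close-to-close return from `start` to `start + window_days`.

    `prices` maps date -> close. If the end date is missing (weekend or
    holiday), walk back to the latest earlier close, as a real calculator
    might have to.
    """
    end = start + timedelta(days=window_days)
    while end not in prices and end > start:
        end -= timedelta(days=1)
    return prices[end] / prices[start] - 1.0

# Synthetic series with a known outcome: price doubles over 30 days,
# so the 30-day return must be exactly 100%.
prices = {
    date(2024, 1, 1) + timedelta(days=i): 100.0 + i * (100.0 / 30)
    for i in range(31)
}
r = window_return(prices, date(2024, 1, 1), 30)
assert abs(r - 1.0) < 1e-9
```

The missing-data branch is what the edge-case tests exercise: delete the final close from the series and the helper falls back to the previous day's price.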

### 2. Manual Test with Local Database

```bash
# Create a fresh local database
export DATABASE_URL="sqlite:///./test_pote.db"

# Run migrations
alembic upgrade head

# Ingest sample data
python scripts/ingest_from_fixtures.py

# Fetch some real price data (requires internet)
python scripts/fetch_sample_prices.py

# Now test analytics
python scripts/analyze_official.py "Nancy Pelosi"
```

### 3. Test Individual Components

```python
# Test the return calculator interactively
from pote.analytics.returns import ReturnCalculator
from pote.db import get_session

session = next(get_session())
calc = ReturnCalculator(session)

# Test with your data...
```

## 📦 What Gets Tested?

### Core Functionality (All Working)

1. **Database Models** - Officials, Securities, Trades, Prices
2. **Data Ingestion** - Trade loading, security enrichment
3. **Analytics Engine** - Returns, benchmarks, metrics
4. **Edge Cases** - Missing data, sell trades, disclosure lags
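The disclosure-lag edge case is plain date arithmetic. A hedged sketch — `disclosure_lag_days` is a hypothetical helper, not project code, and the 45-day threshold is the commonly cited STOCK Act reporting deadline:

```python
from datetime import date

def disclosure_lag_days(transaction_date: date, disclosure_date: date) -> int:
    """Days between trade execution and public disclosure."""
    return (disclosure_date - transaction_date).days

# A trade executed Jan 15 and disclosed Feb 1 was reported after 17 days,
# well inside the (assumed) 45-day STOCK Act window.
lag = disclosure_lag_days(date(2024, 1, 15), date(2024, 2, 1))
assert lag == 17
assert lag <= 45
```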

### Integration Tests Cover:

- Return calculations over multiple time windows (30/60/90/180 days)
- Benchmark comparisons (stock vs SPY/QQQ)
- Abnormal return (alpha) calculations
- Official performance summaries
- Sector analysis
- Disclosure timing analysis
- Top performer rankings
- System-wide statistics
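Abnormal return (alpha) presumably follows the standard definition here: the stock's simple return minus the benchmark's return over the same window. A minimal worked example (the function name is illustrative, not the project's API):

```python
def abnormal_return(stock_start: float, stock_end: float,
                    bench_start: float, bench_end: float) -> float:
    """Alpha as the stock's simple return minus the benchmark's
    simple return over the same window."""
    stock_ret = stock_end / stock_start - 1.0
    bench_ret = bench_end / bench_start - 1.0
    return stock_ret - bench_ret

# Stock up 20% while SPY is up 5% over the same 90 days
# -> alpha of 15 percentage points.
alpha = abnormal_return(100.0, 120.0, 400.0, 420.0)
assert abs(alpha - 0.15) < 1e-9
```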

## 🔄 Getting Live Data

### Option 1: Wait for House Stock Watcher API

The API is currently down. Once it's back up:

```bash
python scripts/fetch_congressional_trades.py --days 30
```

### Option 2: Use Manual CSV Import (NOW)

**Step 1:** Find a source

**Step 2:** Format as CSV

```bash
python scripts/scrape_alternative_sources.py template
# Edit trades_template.csv with real data

python scripts/scrape_alternative_sources.py import-csv trades_template.csv
```
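The exact columns `trades_template.csv` expects aren't shown here; based on the fields `add_custom_trades.py` accepts (Option 3 below), a row would plausibly look like this. The column names are an assumption — check the generated template before importing:

```csv
official_name,party,chamber,state,ticker,company_name,side,value_min,value_max,transaction_date,disclosure_date
Nancy Pelosi,Democrat,House,CA,NVDA,NVIDIA Corporation,buy,15001,50000,2024-01-15,2024-02-01
```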

### Option 3: Add Individual Trades Manually

```bash
python scripts/add_custom_trades.py \
  --official-name "Nancy Pelosi" \
  --party "Democrat" \
  --chamber "House" \
  --state "CA" \
  --ticker "NVDA" \
  --company-name "NVIDIA Corporation" \
  --side "buy" \
  --value-min 15001 \
  --value-max 50000 \
  --transaction-date "2024-01-15" \
  --disclosure-date "2024-02-01"
```

### Option 4: Use the Free Alternative API (QuiverQuant - Requires API Key)

Sign up at https://www.quiverquant.com/ (free tier available):

```bash
export QUIVER_API_KEY="your_key_here"
# Then implement client (we can add this)
```
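Such a client could be sketched with nothing but the standard library. Everything below — the endpoint path, the `Bearer` auth scheme, and the response field names — is an assumption to verify against QuiverQuant's API documentation before relying on it:

```python
import json
import urllib.request

BASE_URL = "https://api.quiverquant.com/beta"  # assumed base URL

def build_request(path: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated GET request (header scheme is an assumption)."""
    return urllib.request.Request(
        f"{BASE_URL}/{path}",
        headers={"Authorization": f"Bearer {api_key}",
                 "Accept": "application/json"},
    )

def parse_trades(payload: str) -> list:
    """Keep only the fields our schema needs; the key names are assumptions."""
    return [
        {
            "ticker": t.get("Ticker"),
            "official": t.get("Representative"),
            "side": t.get("Transaction"),
            "date": t.get("TransactionDate"),
        }
        for t in json.loads(payload)
    ]

def fetch_congress_trades(api_key: str) -> list:
    """Hit the (assumed) live congress-trading endpoint."""
    req = build_request("live/congresstrading", api_key)
    with urllib.request.urlopen(req) as resp:
        return parse_trades(resp.read().decode())

# Usage (network and a valid key required):
#   trades = fetch_congress_trades("your_key_here")
```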

## 📈 After Adding Data, Fetch Prices

```bash
# This will fetch prices for all securities in your database
python scripts/fetch_sample_prices.py

# Then enrich security info (name, sector, industry)
python scripts/enrich_securities.py
```

## 🎯 Complete Local Test Workflow

```bash
# 1. Run all tests
pytest -v
# ✅ All 55 tests should pass

# 2. Check local database
python -c "
from pote.db import get_session
from pote.db.models import Official, Trade, Security, Price

with next(get_session()) as session:
    print(f'Officials: {session.query(Official).count()}')
    print(f'Trades: {session.query(Trade).count()}')
    print(f'Securities: {session.query(Security).count()}')
    print(f'Prices: {session.query(Price).count()}')
"

# 3. Add some test data
python scripts/ingest_from_fixtures.py

# 4. Fetch price data
python scripts/fetch_sample_prices.py

# 5. Run analytics
python scripts/analyze_official.py "Nancy Pelosi"

# 6. Calculate all returns
python scripts/calculate_all_returns.py --window 90
```

## 🚀 Deploy to Proxmox

Once local tests pass:

```bash
# Push code
git add -A
git commit -m "Your changes"
git push

# SSH to Proxmox container
ssh root@10.0.10.95

# Pull updates
su - poteapp
cd ~/pote
git pull
source venv/bin/activate

# Run tests on server
pytest -v

# Update database
alembic upgrade head

# Restart services if using systemd
```

## 🐛 Common Issues

### "No price data found"

**Fix:** Run `python scripts/fetch_sample_prices.py`

### "No trades in database"

**Fix:**

- Option 1: `python scripts/ingest_from_fixtures.py` (sample data)
- Option 2: Manually add trades (see Option 3 above)
- Option 3: Wait for House Stock Watcher API to come back online

### "Connection refused" (on Proxmox)

**Fix:** Check that PostgreSQL is running and configured correctly:

```bash
sudo systemctl status postgresql
sudo -u postgres psql -c "\l"
```
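Reachability from the app host can also be probed directly over TCP, independent of `psql`. A minimal sketch (5432 is PostgreSQL's default port — adjust if your deployment differs):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within `timeout` seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. from the app container: can_connect("localhost", 5432)
```

A `False` result here points at networking or the server not listening; a `True` result with `psql` still failing points at authentication or `pg_hba.conf`.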

## 📊 Test Coverage

Run tests with a coverage report:

```bash
pytest --cov=src/pote --cov-report=html
firefox htmlcov/index.html  # View coverage report
```

Current coverage:

- Models: ~90%
- Ingestion: ~85%
- Analytics: ~80%
- Overall: ~85%

## Summary

**Before deploying:**

1. Run `pytest -v` - all tests pass
2. Run `make lint` - no errors
3. Test locally with sample data
4. Verify analytics work with synthetic prices

**Getting live data:**

- 🔴 House Stock Watcher API is down (external issue)
- 🟢 Manual CSV import works NOW
- 🟢 yfinance for prices works NOW
- 🟡 QuiverQuant available (requires free API key)

You can deploy and use the system NOW with:

- Manual data entry
- CSV imports
- Fixture data for testing
- Full analytics on whatever data you add