Running Backend API Tests

Quick Start

Option 1: Using the test script

./run_tests.sh

Option 2: Using npm script

npm run test:backend

Option 3: Manual command

export PYTHONPATH=$(pwd)
export SKIP_DEEPFACE_IN_TESTS=1
./venv/bin/python3 -m pytest tests/ -v
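The SKIP_DEEPFACE_IN_TESTS flag suggests the backend guards its heavy DeepFace import behind an environment check. A minimal sketch of that pattern, assuming the guard works roughly like this (only the variable name comes from this doc; the helper names are hypothetical):

```python
import os

def deepface_enabled() -> bool:
    """Return False when tests set SKIP_DEEPFACE_IN_TESTS=1 (assumed convention)."""
    return os.environ.get("SKIP_DEEPFACE_IN_TESTS") != "1"

def load_face_backend():
    """Defer the heavy DeepFace import so tests can run against a stub."""
    if not deepface_enabled():
        return None  # tests get a stub instead of the real model
    from deepface import DeepFace  # heavy import, deferred on purpose
    return DeepFace
```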

Where to See Test Results

Test results are displayed in your terminal/console where you run the command.

Example Output

When tests run successfully, you'll see output like:

tests/test_api_auth.py::TestLogin::test_login_success_with_valid_credentials PASSED
tests/test_api_auth.py::TestLogin::test_login_failure_with_invalid_credentials PASSED
tests/test_api_auth.py::TestTokenRefresh::test_refresh_token_success PASSED
...
========================= 26 passed in 2.34s =========================

Understanding the Output

  • PASSED (green) - The test passed
  • FAILED (red) - An assertion in the test failed (error details are printed)
  • ERROR (red) - An exception occurred during setup/teardown, outside the test body
  • SKIPPED (yellow) - The test was skipped
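The statuses map directly onto how a test function behaves. A hypothetical module (not from this repo) that would produce one PASSED and one SKIPPED result, with comments noting what triggers the other two:

```python
import pytest

def test_addition():
    # A true assertion -> pytest reports PASSED (green)
    assert 1 + 1 == 2

@pytest.mark.skip(reason="illustration only")
def test_not_ready():
    # Never executed -> pytest reports SKIPPED (yellow)
    assert False

# A false assertion such as `assert 1 + 1 == 3` would be reported as
# FAILED (red); an exception raised inside a fixture would be ERROR (red).
```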

Verbose Output

The -v flag shows:

  • Each test function name
  • Pass/fail status for each test
  • Summary at the end

Detailed Failure Information

If a test fails, pytest shows:

  • The test that failed
  • The assertion that failed
  • The actual vs expected values
  • A traceback showing where the error occurred
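That readable report comes from pytest's assertion rewriting, which re-renders the failing assert with the actual values substituted in. A hypothetical example (the function and test names are illustrative, not from this repo):

```python
def cart_total(prices):
    """Sum a list of item prices."""
    return sum(prices)

def test_cart_total():
    result = cart_total([10, 20, 5])
    # If this expectation were wrong (say, 40), pytest would print
    # "assert 35 == 40" and a traceback pointing at this exact line.
    assert result == 35
```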

Test Coverage

To see a coverage report (requires the pytest-cov plugin):

export PYTHONPATH=$(pwd)
export SKIP_DEEPFACE_IN_TESTS=1
./venv/bin/python3 -m pytest tests/ --cov=backend --cov-report=term-missing

This shows:

  • Which lines of code are covered by tests
  • Which lines are missing coverage
  • Overall coverage percentage
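Since the repo already has a pytest.ini, the coverage flags could be made the default there instead of retyping them on every run. A sketch, assuming pytest-cov is installed (the addopts line is a suggestion, not the repo's actual configuration):

```ini
[pytest]
addopts = --cov=backend --cov-report=term-missing
```

With this in place, a plain `pytest tests/` would print the coverage table automatically.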

Running Specific Tests

Run a single test file

./venv/bin/python3 -m pytest tests/test_api_auth.py -v

Run a specific test class

./venv/bin/python3 -m pytest tests/test_api_auth.py::TestLogin -v

Run a specific test

./venv/bin/python3 -m pytest tests/test_api_auth.py::TestLogin::test_login_success_with_valid_credentials -v
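Beyond exact node IDs, pytest's standard -k flag selects tests by keyword expression, and -x stops at the first failure. For example (the substring "login" matches the test names shown in the example output above):

```shell
# Run every test whose name mentions "login", stopping at the first failure
./venv/bin/python3 -m pytest tests/ -k "login" -x -v
```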

CI/CD Test Results

In CI (GitHub Actions/Gitea Actions), test results appear in:

  1. CI Logs - Check the "Run backend tests" step in the workflow
  2. Test Artifacts - JUnit XML files are generated for test reporting tools
  3. Coverage Reports - Coverage XML files are generated

Troubleshooting

Tests not showing output?

  • Make sure you're running in a terminal (not an IDE output panel that might hide output)
  • Try adding the -s flag: pytest tests/ -v -s (disables output capturing, so print statements are shown)

Tests hanging?

  • Check if database is accessible
  • Verify SKIP_DEEPFACE_IN_TESTS=1 is set (prevents DeepFace from loading)

Import errors?

  • Make sure the virtual environment is activated, or invoke the interpreter directly with ./venv/bin/python3
  • Verify all dependencies are installed: ./venv/bin/pip install -r requirements.txt