@llbbl commented Jun 23, 2025

Add Complete Python Testing Infrastructure

Summary

This PR sets up a comprehensive testing infrastructure for the Python active learning project using Poetry as the package manager and pytest as the testing framework. The setup provides a ready-to-use testing environment where developers can immediately start writing tests.

Changes Made

Package Management

  • Poetry Setup: Created pyproject.toml configuring Poetry as the project's package manager
  • Dependency Migration: Migrated existing dependencies from requirements.txt to Poetry
  • Development Dependencies: Added pytest, pytest-cov, and pytest-mock as development dependencies

Testing Configuration

  • pytest Configuration:

    • Set up test discovery patterns for test_*.py and *_test.py files
    • Configured coverage reporting with 80% threshold
    • Added HTML and XML coverage report generation
    • Enabled strict markers and comprehensive output formatting
  • Coverage Configuration:

    • Configured source directories and exclusion patterns
    • Set up coverage thresholds and reporting formats
    • Excluded test files and virtual environments from coverage

Project Structure

tests/
├── __init__.py
├── conftest.py          # Shared fixtures and configuration
├── test_setup_validation.py  # Validation tests
├── unit/
│   └── __init__.py
└── integration/
    └── __init__.py

Shared Fixtures (conftest.py)

  • temp_dir: Creates temporary directories for test files
  • mock_model: Provides mock PyTorch models
  • sample_data: Generates sample data for testing
  • mock_dataset: Creates mock datasets
  • csv_data: Sets up temporary CSV files
  • mock_config: Provides configuration objects
  • reset_random_seeds: Ensures reproducible tests
  • capture_stdout: Captures print output
  • mock_file_operations: Mocks file I/O
  • device: Provides PyTorch device configuration
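
A test requests any of these fixtures simply by naming them as arguments. The following is a minimal illustrative sketch, assuming temp_dir yields a pathlib.Path to a throwaway directory (the actual fixture definitions live in tests/conftest.py):

# Illustrative only -- assumes temp_dir yields a pathlib.Path.
def test_writes_file_to_temp_dir(temp_dir):
    output = temp_dir / "example.csv"   # hypothetical file name
    output.write_text("id,label\n1,0\n")
    assert output.exists()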

Test Markers

  • @pytest.mark.unit: For fast, isolated unit tests
  • @pytest.mark.integration: For integration tests with external dependencies
  • @pytest.mark.slow: For slow-running tests
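
Because strict markers are enabled, only these registered markers are accepted; an unregistered marker fails the run. Tagging tests looks like this (illustrative example using pytest's built-in tmp_path fixture):

import pytest

@pytest.mark.unit
def test_addition_is_fast_and_isolated():
    assert 1 + 1 == 2

@pytest.mark.integration
@pytest.mark.slow
def test_end_to_end_with_filesystem(tmp_path):
    # Markers can be combined and selected with -m,
    # e.g. poetry run pytest -m "integration and not slow"
    assert tmp_path.exists()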

Additional Setup

  • Updated .gitignore with testing artifacts and Claude settings
  • Created validation tests to verify the infrastructure works correctly
  • Configured Poetry scripts for easy test execution
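
The poetry run test and poetry run tests commands shown under "How to Use" are Poetry script entry points, which resolve to a Python callable. The exact wiring lives in pyproject.toml; a typical implementation is a small helper that delegates to pytest.main(), sketched here under that assumption:

# Hypothetical entry-point module referenced from [tool.poetry.scripts];
# the actual module and function names in this PR may differ.
import sys
import pytest

def main() -> None:
    # Forward any extra command-line arguments straight to pytest,
    # so `poetry run test -m unit` behaves like `poetry run pytest -m unit`.
    sys.exit(pytest.main(sys.argv[1:]))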

How to Use

Install Dependencies

poetry install

Run Tests

# Run all tests with coverage
poetry run pytest

# Or use the configured scripts
poetry run test
poetry run tests

# Run specific test types
poetry run pytest -m unit
poetry run pytest -m integration
poetry run pytest -m "not slow"

# Run tests in a specific directory
poetry run pytest tests/unit/

# Run with different verbosity
poetry run pytest -v  # verbose
poetry run pytest -q  # quiet

View Coverage Reports

  • Terminal: Coverage is displayed after each test run
  • HTML Report: Open htmlcov/index.html in a browser
  • XML Report: Available at coverage.xml for CI integration

Notes

  • The existing Python scripts contain module-level code that executes on import, which is why they're excluded from coverage requirements (see the sketch after these notes)
  • The 80% coverage threshold applies to new test code added to the project
  • All pytest standard options are available when running tests
  • The infrastructure is designed to support both unit and integration testing patterns
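
To illustrate the first note: a module-level script does its work the moment it is imported, so the test suite cannot import it without triggering side effects. The example below is hypothetical and not taken from this repository:

# hypothetical_script.py -- illustrative only, not an actual file in this project.
# Everything below runs at import time, so importing it from a test would
# trigger the side effects; this is why such scripts are excluded from coverage.
print("loading data...")
results = [x * 2 for x in range(10)]
print("done:", sum(results))

# The conventional way to make such a script importable later is a guard:
# if __name__ == "__main__":
#     main()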

Next Steps

Developers can now:

  1. Write unit tests in tests/unit/
  2. Write integration tests in tests/integration/
  3. Use the provided fixtures for common testing patterns
  4. Run tests with coverage reporting to ensure code quality

- Set up Poetry as package manager with pyproject.toml
- Add pytest, pytest-cov, and pytest-mock as dev dependencies
- Configure comprehensive pytest and coverage settings
- Create test directory structure with unit/integration folders
- Add shared fixtures in conftest.py for common test patterns
- Configure test markers for unit, integration, and slow tests
- Set up Poetry scripts for running tests
- Update .gitignore with testing and Claude-related entries
- Add validation tests to verify infrastructure setup