
RAG System Test Suite

This directory contains the comprehensive test suite for the RAG System refactoring project.

Directory Structure

tests/
├── unit/                    # Unit tests for individual components
│   ├── domain/             # Domain layer tests
│   ├── application/        # Application layer tests
│   └── infrastructure/     # Infrastructure layer tests
├── integration/            # Integration tests for component interactions
│   ├── api/               # API integration tests
│   └── database/          # Database integration tests
├── e2e/                   # End-to-end tests for complete workflows
├── fixtures/              # Test data and fixtures
│   └── documents/         # Sample documents for testing
├── logs/                  # Test execution logs
├── conftest.py           # Global pytest configuration and fixtures
└── README.md             # This file

Running Tests

Run All Tests

pytest

Run Specific Test Categories

# Unit tests only
pytest -m unit

# Integration tests only
pytest -m integration

# End-to-end tests only
pytest -m e2e

# Run tests in a specific directory
pytest tests/unit/
pytest tests/integration/
pytest tests/e2e/

Run with Coverage

# Generate coverage report
pytest --cov=src --cov-report=html --cov-report=term

# View HTML coverage report
# Open htmlcov/index.html in your browser

# Generate XML coverage report (for CI/CD)
pytest --cov=src --cov-report=xml

Run Property-Based Tests

# Run all property tests
pytest -m property

# Run property tests with verbose output
pytest -m property -v

# Show Hypothesis statistics (examples generated, shrink behavior, etc.)
pytest -m property --hypothesis-show-statistics

Run Tests with Different Verbosity

# Minimal output
pytest -q

# Verbose output
pytest -v

# Very verbose output (full assertion diffs)
pytest -vv

Run Specific Tests

# Run a specific test file
pytest tests/unit/domain/test_entities.py

# Run a specific test class
pytest tests/unit/domain/test_entities.py::TestDocument

# Run a specific test method
pytest tests/unit/domain/test_entities.py::TestDocument::test_update_content

Run Tests with Markers

# Run only slow tests
pytest -m slow

# Run tests that don't require database
pytest -m "not requires_db"

# Run unit tests that don't require external services
pytest -m "unit and not requires_external_service"

Run Tests in Parallel

# Install pytest-xdist first: pip install pytest-xdist
pytest -n auto

Run Tests with Timeout

# Install pytest-timeout first: pip install pytest-timeout
pytest --timeout=300

Test Markers

The following markers are available for categorizing tests:

  • @pytest.mark.unit - Unit tests (automatically added for tests in unit/)
  • @pytest.mark.integration - Integration tests (automatically added for tests in integration/)
  • @pytest.mark.e2e - End-to-end tests (automatically added for tests in e2e/)
  • @pytest.mark.slow - Tests that take > 1 second
  • @pytest.mark.requires_db - Tests requiring database connection
  • @pytest.mark.requires_vector_db - Tests requiring vector database connection
  • @pytest.mark.requires_external_service - Tests requiring external services
  • @pytest.mark.property - Property-based tests using Hypothesis
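
The unit, integration, and e2e markers above are applied automatically by directory. A minimal sketch of how conftest.py could do this (the helper name marker_for_path is illustrative, not necessarily what the actual conftest.py uses):

```python
# conftest.py (sketch) -- directory-derived markers; helper name is illustrative
from pathlib import Path
from typing import Optional

MARKER_DIRS = {"unit", "integration", "e2e"}

def marker_for_path(path: Path) -> Optional[str]:
    """Return the marker implied by a test file's location, if any."""
    for part in path.parts:
        if part in MARKER_DIRS:
            return part
    return None

def pytest_collection_modifyitems(items):
    """Tag each collected test with the marker for its directory."""
    for item in items:
        marker = marker_for_path(Path(str(item.fspath)))
        if marker:
            item.add_marker(marker)  # add_marker accepts a marker name string
```

With a hook like this in place, `pytest -m unit` selects everything under tests/unit/ without per-test decorators.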

Writing Tests

Unit Test Example

# tests/unit/domain/test_entities.py
import pytest
from domain.vector_search.entities import Document

class TestDocument:
    """Document entity unit tests"""
    
    def test_update_content_changes_content(self):
        """Test that updating content changes the document content"""
        # Arrange
        doc = Document(id="1", content="original", ...)
        
        # Act
        doc.update_content("new content")
        
        # Assert
        assert doc.content == "new content"

Integration Test Example

# tests/integration/infrastructure/test_repositories.py
import pytest

@pytest.mark.integration
@pytest.mark.requires_db
class TestDocumentRepository:
    """Document repository integration tests"""
    
    async def test_save_and_find(self, test_db_session):
        """Test saving and finding a document"""
        # Test implementation
        pass

Property-Based Test Example

# tests/unit/domain/test_properties.py
from hypothesis import given, strategies as st
import pytest

from domain.vector_search.entities import Vector  # assumed location of Vector

class TestVectorProperties:
    """Vector property-based tests"""
    
    @pytest.mark.property
    @given(st.lists(st.floats(allow_nan=False), min_size=1))
    def test_vector_dimension_count(self, dimensions):
        """Property: Vector dimension count equals list length"""
        vector = Vector(dimensions)
        assert vector.dimension_count == len(dimensions)

End-to-End Test Example

# tests/e2e/test_document_workflow.py
import pytest
from httpx import AsyncClient

@pytest.mark.e2e
class TestDocumentWorkflow:
    """Document workflow end-to-end tests"""
    
    async def test_complete_lifecycle(self, client):
        """Test complete document lifecycle"""
        # Create -> Search -> Update -> Delete
        pass

Test Fixtures

Global fixtures are defined in conftest.py:

  • event_loop - Async event loop for async tests
  • reset_singletons - Resets singleton instances between tests
  • test_data_dir - Path to test data directory
  • sample_documents_dir - Path to sample documents
  • mock_settings - Mock configuration settings
  • test_db_session - Test database session (to be implemented)
  • mock_vector_db - Mock vector database (to be implemented)
  • mock_embedding_service - Mock embedding service (to be implemented)
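
As an illustration, mock_settings might be built from a plain factory so individual tests can override fields; the field names and defaults below are assumptions, not the real settings schema:

```python
# conftest.py (sketch) -- field names and defaults are assumptions
from types import SimpleNamespace

def make_mock_settings(**overrides) -> SimpleNamespace:
    """Build a settings object with safe test defaults, overridable per test."""
    defaults = {
        "database_url": "sqlite:///:memory:",
        "vector_db_url": "memory://",
        "embedding_model": "test-model",
        "embedding_dimensions": 8,
    }
    defaults.update(overrides)
    return SimpleNamespace(**defaults)
```

In conftest.py this factory would be wrapped with @pytest.fixture so tests can request mock_settings as an argument.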

Coverage Goals

  • Overall: ≥ 80%
  • Domain Layer: ≥ 90%
  • Application Layer: ≥ 85%
  • Infrastructure Layer: ≥ 70%
  • Presentation Layer: ≥ 75%
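
pytest-cov can enforce the overall goal in CI (per-layer thresholds require separate coverage configuration); a sketch of a pytest.ini fragment, assuming pytest-cov is installed:

```ini
[pytest]
addopts = --cov=src --cov-fail-under=80
```

With --cov-fail-under, the run exits non-zero when total coverage drops below 80%.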

Continuous Integration

Tests are automatically run in CI/CD pipeline on:

  • Every push to main branch
  • Every pull request
  • Scheduled daily runs

See .github/workflows/test.yml for CI configuration.

Troubleshooting

Tests Fail with Import Errors

Make sure you have installed all dependencies:

pip install -r requirements.txt
pip install pytest pytest-asyncio pytest-cov hypothesis httpx

Async Tests Don't Run

Make sure pytest-asyncio is installed and asyncio_mode = auto is set in pytest.ini.
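
A minimal pytest.ini fragment with this setting, assuming pytest-asyncio's auto mode:

```ini
[pytest]
asyncio_mode = auto
```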

Coverage Report Not Generated

Run pytest with coverage flags:

pytest --cov=src --cov-report=html

Tests Are Too Slow

Run tests in parallel:

pip install pytest-xdist
pytest -n auto

Best Practices

  1. Test Isolation: Each test should be independent and not rely on other tests
  2. Clear Names: Use descriptive test names that explain what is being tested
  3. AAA Pattern: Structure tests with Arrange, Act, Assert sections
  4. Mock External Dependencies: Use mocks for external services in unit tests
  5. Test Edge Cases: Don't just test the happy path
  6. Keep Tests Fast: Unit tests should run in < 100ms
  7. Use Fixtures: Reuse common setup code with fixtures
  8. Document Complex Tests: Add docstrings explaining what the test validates

Resources