Testing and Validation Strategy

Overview

This document defines the comprehensive testing and validation strategy for the F5 XC User and Group Sync tool, covering unit, integration, and end-to-end testing approaches. The strategy ensures code quality, reliability, and production readiness through systematic validation at multiple levels, with a target code coverage of ≥90%.

Unit Testing Strategy

Objective: Verify individual components function correctly in isolation

Scope: All modules in src/xc_user_group_sync/

Coverage Target: ≥90% code coverage

Key Test Areas:

  1. CSV Parsing (test_csv_parser.py)
     - Valid CSV with required columns
     - Missing required columns
     - Malformed CSV data
     - Empty CSV file
     - Large CSV files (performance)

  2. LDAP DN Extraction (test_ldap_utils.py)
     - Valid LDAP DNs with a single CN
     - DNs with multiple CNs (extract first)
     - DNs with special characters (escaped)
     - Malformed DNs (error handling)
     - DNs with Unicode characters

  3. Group Name Validation (test_validation.py)
     - Valid group names (alphanumeric, hyphen, underscore)
     - Invalid characters (spaces, special chars)
     - Length validation (1-128 characters)
     - Empty strings
     - Unicode handling

  4. Configuration Loading (test_config.py)
     - Hierarchical .env file loading
     - Environment variable precedence
     - Custom DOTENV_PATH
     - Missing configuration
     - Invalid configuration values

  5. Retry Logic (test_retry.py)
     - Exponential backoff calculation
     - Retriable vs. non-retriable errors
     - Max attempts enforcement
     - Backoff min/max bounds
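As an illustration of the retry-logic area, the backoff calculation and its cap can be checked with a parametrized test. The `compute_backoff` helper below and its `base`/`cap` parameters are assumptions for the sketch, not the tool's actual API:

```python
import pytest


def compute_backoff(attempt: int, base: float = 1.0, cap: float = 30.0) -> float:
    """Hypothetical exponential backoff: base * 2**attempt, clamped to cap."""
    return min(base * (2 ** attempt), cap)


@pytest.mark.parametrize(
    "attempt,expected",
    [
        (0, 1.0),
        (1, 2.0),
        (2, 4.0),
        (3, 8.0),
        (10, 30.0),  # large attempt numbers hit the cap
    ],
)
def test_backoff_bounds(attempt, expected):
    assert compute_backoff(attempt) == expected
```

Parametrization keeps the min/max-bounds cases in one place, so adding a boundary case is a one-line change.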

Testing Framework: pytest with coverage plugin

Example Test:

import pytest

# extract_cn is assumed to live in the project's LDAP utilities module
from xc_user_group_sync.ldap_utils import extract_cn

def test_extract_cn_valid_dn():
    """Test CN extraction from valid LDAP DN"""
    dn = "CN=Admins,OU=Groups,DC=example,DC=com"
    result = extract_cn(dn)
    assert result == "Admins"

def test_extract_cn_multiple_cns():
    """Test CN extraction when multiple CNs present"""
    dn = "CN=Users,CN=Admins,OU=Groups,DC=example,DC=com"
    result = extract_cn(dn)
    assert result == "Users"  # Should extract first CN

def test_extract_cn_malformed_dn():
    """Test error handling for malformed DN"""
    dn = "OU=Groups,DC=example,DC=com"  # Missing CN
    with pytest.raises(ValueError, match="No CN component"):
        extract_cn(dn)
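Group-name validation tests lend themselves to the same parametrized style. The sketch below assumes a hypothetical `is_valid_group_name` helper that mirrors the documented rules (alphanumeric plus hyphen and underscore, 1-128 characters); the real validator may live elsewhere in the package:

```python
import re

import pytest


def is_valid_group_name(name: str) -> bool:
    """Hypothetical validator mirroring the documented group-name rules."""
    return bool(re.fullmatch(r"[A-Za-z0-9_-]{1,128}", name))


@pytest.mark.parametrize("name,expected", [
    ("platform-admins", True),
    ("ops_team_1", True),
    ("has space", False),   # spaces rejected
    ("bad!chars", False),   # special characters rejected
    ("", False),            # empty string rejected
    ("x" * 129, False),     # exceeds the 128-character limit
])
def test_group_name_validation(name, expected):
    assert is_valid_group_name(name) is expected
```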

Running Unit Tests:

# Run all unit tests with coverage
pytest tests/unit/ --cov=src/xc_user_group_sync --cov-report=html

# Run specific test file
pytest tests/unit/test_ldap_utils.py -v

# Run tests matching pattern
pytest -k "test_extract_cn" -v

Integration Testing Strategy

Objective: Verify components work together correctly with mocked external services

Scope: End-to-end workflows with mocked F5 XC API

Key Test Scenarios:

  1. Complete Sync Workflow
     - Parse CSV → Aggregate groups → Sync to mock API → Verify results
     - Test create, update, and unchanged operations
     - Verify operation counter accuracy

  2. Authentication Methods
     - Test P12 authentication flow
     - Test PEM certificate authentication
     - Test API token authentication
     - Test authentication failure handling

  3. Retry Mechanism
     - Mock transient failures (connection timeout, HTTP 503)
     - Verify retry attempts occur
     - Verify exponential backoff timing
     - Verify eventual success after retries

  4. Error Handling
     - Mock permanent failures (HTTP 400, 401, 404)
     - Verify no retry occurs
     - Verify error reporting
     - Verify partial failure handling

  5. Dry-Run Mode
     - Verify no API calls are made
     - Verify planned operations are displayed
     - Verify operation counts are accurate
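The retry-mechanism scenario can be sketched without any external service. Here a stubbed session stands in for the requests-mock fixture, and `fetch_groups_with_retry` is a hypothetical helper for illustration, not the tool's real HTTP client:

```python
from unittest import mock


def fetch_groups_with_retry(url, session, max_attempts=3):
    """Hypothetical client helper: re-issue a GET while the API returns 503."""
    resp = None
    for _ in range(max_attempts):
        resp = session.get(url)
        if resp.status_code != 503:
            break
    return resp


def test_retry_recovers_from_transient_503():
    session = mock.Mock()
    # First call: transient 503; second call: success
    session.get.side_effect = [
        mock.Mock(status_code=503),
        mock.Mock(status_code=200),
    ]
    resp = fetch_groups_with_retry("https://example.invalid/user_groups", session)
    assert resp.status_code == 200
    assert session.get.call_count == 2  # exactly one retry occurred
```

With requests-mock, the same shape is expressed by registering a response list (503 first, 200 second) on the mocked URL and asserting on the mocker's call count.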

Testing Framework: pytest with requests-mock

Example Integration Test:

import pytest
from click.testing import CliRunner

# The CLI entry point is assumed to be a Click group exposed by the package
from xc_user_group_sync.cli import cli

runner = CliRunner()

@pytest.fixture
def mock_xc_api(requests_mock):
    """Mock F5 XC API responses"""
    requests_mock.get(
        "https://tenant.console.ves.volterra.io/api/web/namespaces/system/user_groups",
        json={"items": [{"name": "existing-group", "users": ["user1@example.com"]}]}
    )
    requests_mock.post(
        "https://tenant.console.ves.volterra.io/api/web/namespaces/system/user_groups",
        status_code=201
    )
    # Assumed update endpoint for an existing named group; without this
    # registration the PUT asserted below would fail to match
    requests_mock.put(
        "https://tenant.console.ves.volterra.io/api/web/namespaces/system/user_groups/existing-group",
        status_code=200
    )
    return requests_mock

def test_complete_sync_workflow(mock_xc_api, tmp_path):
    """Test complete sync workflow with mocked API"""
    # Create test CSV
    csv_file = tmp_path / "test.csv"
    csv_file.write_text(
        "Email,Entitlement Display Name\n"
        "user1@example.com,\"CN=existing-group,OU=Groups,DC=example,DC=com\"\n"
        "user2@example.com,\"CN=new-group,OU=Groups,DC=example,DC=com\"\n"
    )

    # Run sync
    result = runner.invoke(cli, ['sync', '--csv', str(csv_file)])

    # Verify results
    assert result.exit_code == 0
    assert "created=1" in result.output  # new-group created
    assert "updated=1" in result.output  # existing-group updated

    # Verify API calls
    assert mock_xc_api.call_count == 3  # GET + POST + PUT

Running Integration Tests:

# Run all integration tests
pytest tests/integration/ -v

# Run with API mock verification
pytest tests/integration/ --log-cli-level=DEBUG

End-to-End Testing Strategy

Objective: Verify complete system functionality with real or near-real F5 XC environment

Options:

Option 1: F5 XC Sandbox/Staging Environment (Recommended)

# Configure for staging environment
export XC_API_URL="https://tenant.staging.volterra.us"
export TENANT_ID="test-tenant"
export VOLT_API_P12_FILE="staging-cert.p12"
export VES_P12_PASSWORD="test-password"  # pragma: allowlist secret

# Run E2E test
pytest tests/e2e/ --e2e-env=staging

Option 2: Mock F5 XC Server with Docker

# Start mock F5 XC API server
docker-compose -f tests/e2e/docker-compose.yml up -d

# Run E2E tests against mock server
pytest tests/e2e/ --e2e-env=mock

# Cleanup
docker-compose -f tests/e2e/docker-compose.yml down

Key E2E Test Scenarios:

  1. First-Time Setup
     - Run setup script with P12 file
     - Verify .env created correctly
     - Verify certificates extracted
     - Run first sync successfully

  2. Large Dataset Performance
     - Generate synthetic CSV with 10,000+ rows
     - Measure execution time
     - Verify memory usage within limits
     - Validate all operations completed

  3. Prune Operation
     - Create groups in F5 XC not in CSV
     - Run sync with --prune
     - Verify orphaned groups deleted
     - Verify CSV groups remain

  4. Error Recovery
     - Simulate network interruptions
     - Verify retry mechanism works
     - Verify partial completion handling
     - Verify resumption succeeds

  5. Multi-Environment
     - Test production configuration
     - Test staging configuration
     - Verify environment detection
     - Verify SSL warnings for staging
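For the large-dataset scenario, the synthetic CSV can be produced by a short helper like the sketch below. The column names follow the sample CSV used elsewhere in this document; the default of 50 distinct groups is an arbitrary assumption:

```python
import csv
import io


def generate_synthetic_csv(num_rows: int, num_groups: int = 50) -> str:
    """Build a CSV string with num_rows user/group rows spread across num_groups groups."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Email", "Entitlement Display Name"])
    for i in range(num_rows):
        group = f"group-{i % num_groups}"
        # DNs contain commas, so csv.writer quotes the field automatically
        dn = f"CN={group},OU=Groups,DC=example,DC=com"
        writer.writerow([f"user{i}@example.com", dn])
    return buf.getvalue()
```

In a test, pytest's tmp_path fixture can receive the output, e.g. `(tmp_path / "large.csv").write_text(generate_synthetic_csv(10_000))`, before timing the sync run against it.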

Running E2E Tests:

# Run all E2E tests (requires staging environment)
pytest tests/e2e/ --e2e-env=staging -v

# Run specific scenario
pytest tests/e2e/test_large_dataset.py -v

# Run with performance profiling
pytest tests/e2e/ --e2e-env=staging --profile