Description
Part of: #663
[Conversation Reference: "Story 1: Log Viewing with Basic Display - As an administrator, I want to view operational logs in the dashboard so that I can monitor system activity"]
Story Overview
Objective: Implement log viewing infrastructure with SQLite storage, basic display in admin dashboard Logs tab, and API access via REST and MCP endpoints.
User Value: Administrators gain centralized visibility into operational logs through a dedicated dashboard tab, enabling effective monitoring and initial troubleshooting without accessing server files directly.
Acceptance Criteria Summary: Logs displayed with timestamp, level, message, correlation ID; refresh capability; paginated REST and MCP API access.
Acceptance Criteria
AC1: Web UI Log Display
Scenario: Administrator views logs in admin dashboard
Given I am logged into the admin dashboard
And operational logs exist in the system
When I click on the "Logs" tab
Then I see a table displaying log entries
And each log entry shows timestamp, level, message, and correlation ID columns
And logs are displayed in reverse chronological order (newest first)
And the table supports pagination
Technical Requirements:
- Create new "Logs" tab in admin dashboard navigation
- Implement HTMX-based log table with server-side rendering
- Display columns: timestamp, level, source, message, correlation_id
- Default sort: timestamp descending (newest first)
- Pagination with configurable page size (default 50)
- Handle empty state gracefully (no logs message)
AC2: Log Refresh Functionality
Scenario: Administrator refreshes log display
Given logs are displayed in the Logs tab
When I click the "Refresh" button
Then the latest logs are fetched from the server
And the display updates with new log entries
And a loading indicator shows during refresh
Technical Requirements:
- Add Refresh button to Logs tab UI
- Implement HTMX partial refresh (no full page reload)
- Show loading indicator during fetch
- Preserve current filter/search state on refresh
AC3: REST API Log Access
Scenario: Administrator queries logs via REST API
Given I have admin authentication credentials
When I send a GET request to /admin/api/logs
Then I receive a JSON response with paginated log entries
And each entry includes timestamp, level, source, message, correlation_id
And the response includes pagination metadata (total, page, page_size)
Technical Requirements:
- Create GET /admin/api/logs endpoint
- Require admin authentication
- Support query parameters: page, page_size, sort_order
- Return JSON with logs array and pagination metadata
- Default page_size: 50, max page_size: 1000
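The parameter handling the endpoint needs can be sketched independently of the web framework. The helper name below is illustrative, not from the CIDX codebase; the defaults (page 1, page_size 50) and the 1000-row cap come from the requirements above:

```python
# Illustrative query-parameter validation for GET /admin/api/logs.
# parse_log_query_params is a hypothetical helper; only the defaults
# and the max page_size of 1000 are taken from the requirements.

DEFAULT_PAGE_SIZE = 50
MAX_PAGE_SIZE = 1000

def parse_log_query_params(params: dict) -> dict:
    """Normalize raw query-string values into validated query arguments."""
    page = max(1, int(params.get("page", 1)))
    page_size = int(params.get("page_size", DEFAULT_PAGE_SIZE))
    page_size = max(1, min(page_size, MAX_PAGE_SIZE))  # clamp to 1..1000
    sort_order = str(params.get("sort_order", "desc")).lower()
    if sort_order not in ("asc", "desc"):
        raise ValueError("sort_order must be 'asc' or 'desc'")
    return {"page": page, "page_size": page_size, "sort_order": sort_order}
```

Because the MCP tool must stay in parity with this endpoint (AC4), the same validation would apply to both entry points.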
AC4: MCP API Log Access
Scenario: Administrator queries logs via MCP API
Given I have admin authentication credentials
When I call the admin_logs_query MCP tool
Then I receive paginated log entries
And each entry includes timestamp, level, source, message, correlation_id
And the response includes pagination metadata
Technical Requirements:
- Create admin_logs_query MCP tool
- Require admin credentials for MCP tool access
- Support parameters: page, page_size, sort_order
- Return structured response matching REST API format
- Ensure API parity with REST endpoint
AC5: SQLite Log Storage Infrastructure
Scenario: System stores logs in queryable database
Given the CIDX server is running
When log messages are generated
Then they are written to ~/.cidx-server/logs.db
And existing console/file logging continues unchanged
And the SQLite database has proper indexes for efficient queries
Technical Requirements:
- Implement SQLiteLogHandler class extending logging.Handler
- Create logs table with schema: id, timestamp, level, source, message, correlation_id, user_id, request_path, extra_data, created_at
- Create indexes: idx_logs_timestamp, idx_logs_level, idx_logs_correlation_id, idx_logs_source
- Integrate handler into logging configuration alongside existing handlers
- Ensure thread-safe writes
- Database location: ~/.cidx-server/logs.db
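A minimal sketch of the handler described above, using the table schema and index names from the requirements. The `db_path` parameter is shown for testability; in the server it would point at `~/.cidx-server/logs.db`. Production code may batch writes or use a queue instead of a per-record lock:

```python
# Sketch of SQLiteLogHandler: writes each log record to SQLite alongside
# the existing console/file handlers. Column and index names match AC5.
import logging
import sqlite3
import threading
from datetime import datetime, timezone

_SCHEMA = """
CREATE TABLE IF NOT EXISTS logs (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    timestamp TEXT NOT NULL,
    level TEXT NOT NULL,
    source TEXT,
    message TEXT NOT NULL,
    correlation_id TEXT,
    user_id TEXT,
    request_path TEXT,
    extra_data TEXT,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE INDEX IF NOT EXISTS idx_logs_timestamp ON logs (timestamp);
CREATE INDEX IF NOT EXISTS idx_logs_level ON logs (level);
CREATE INDEX IF NOT EXISTS idx_logs_correlation_id ON logs (correlation_id);
CREATE INDEX IF NOT EXISTS idx_logs_source ON logs (source);
"""

class SQLiteLogHandler(logging.Handler):
    """Writes log records to SQLite without touching existing handlers."""

    def __init__(self, db_path: str) -> None:
        super().__init__()
        self._write_lock = threading.Lock()  # serialise writes across threads
        self._conn = sqlite3.connect(db_path, check_same_thread=False)
        self._conn.executescript(_SCHEMA)

    def emit(self, record: logging.LogRecord) -> None:
        try:
            ts = datetime.fromtimestamp(record.created, tz=timezone.utc).isoformat()
            with self._write_lock:
                self._conn.execute(
                    "INSERT INTO logs (timestamp, level, source, message,"
                    " correlation_id, user_id, request_path, extra_data)"
                    " VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
                    (
                        ts,
                        record.levelname,
                        record.name,  # logger name serves as the source
                        record.getMessage(),
                        getattr(record, "correlation_id", None),
                        getattr(record, "user_id", None),
                        getattr(record, "request_path", None),
                        None,  # extra_data reserved for structured payloads
                    ),
                )
                self._conn.commit()
        except Exception:
            self.handleError(record)
```

Callers would attach correlation data via `logger.error(msg, extra={"correlation_id": cid})`, which `logging` exposes as attributes on the record.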
AC6: LogAggregatorService Backend
Scenario: Shared backend service for all interfaces
Given the LogAggregatorService is initialized
When any interface (Web UI, REST, MCP) requests logs
Then the service queries SQLite database
And returns consistent results across all interfaces
Technical Requirements:
- Create LogAggregatorService class
- Implement query method with pagination support
- Implement count method for total records
- Share service instance across Web UI, REST API, and MCP API
- Ensure consistent response format
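The service can be sketched as a thin query layer over the AC5 database; the sketch assumes the `logs` table already exists on the supplied connection, and hard-codes the AC3 limits (default 50, cap 1000). Method names `query` and `count` follow the requirements; everything else is illustrative:

```python
# Sketch of LogAggregatorService: the single query backend shared by the
# Web UI, REST API, and MCP tool, so all three return identical results.
import math
import sqlite3

class LogAggregatorService:
    """Paginated, sorted access to the logs table for every interface."""

    def __init__(self, conn: sqlite3.Connection) -> None:
        conn.row_factory = sqlite3.Row  # rows convert cleanly to dicts
        self._conn = conn

    def count(self) -> int:
        """Total number of stored log entries."""
        return self._conn.execute("SELECT COUNT(*) FROM logs").fetchone()[0]

    def query(self, page: int = 1, page_size: int = 50,
              sort_order: str = "desc") -> dict:
        """Return one page of logs plus pagination metadata."""
        page_size = max(1, min(page_size, 1000))
        order = "ASC" if sort_order.lower() == "asc" else "DESC"
        rows = self._conn.execute(
            "SELECT id, timestamp, level, source, message, correlation_id,"
            f" user_id, request_path FROM logs ORDER BY timestamp {order},"
            f" id {order} LIMIT ? OFFSET ?",
            (page_size, (max(1, page) - 1) * page_size),
        ).fetchall()
        total = self.count()
        return {
            "logs": [dict(r) for r in rows],
            "pagination": {
                "page": page,
                "page_size": page_size,
                "total": total,
                "total_pages": math.ceil(total / page_size),
            },
        }
```

The return shape matches the API response format shown below, which is what makes REST/MCP parity cheap: both endpoints serialize this dict as-is.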
Implementation Status
Progress Tracking:
- Core implementation complete
- Unit tests passing (X/Y tests)
- Integration tests passing (X/Y tests)
- E2E tests passing (X/Y tests)
- Code review approved
- Manual E2E testing completed by Claude Code
- Documentation updated
Completion: 0/Y tasks complete (0%)
Technical Implementation Details
Component Structure
src/cidx_server/
logging/
sqlite_handler.py # SQLiteLogHandler class
log_aggregator.py # LogAggregatorService class
web/
routes.py # Add /admin/logs route
templates/
admin/
logs.html # Logs tab template
_logs_table.html # HTMX partial for log table
api/
admin_routes.py # Add /admin/api/logs endpoint
mcp/
admin_tools.py # Add admin_logs_query tool
API Response Format (REST and MCP)
{
"logs": [
{
"id": 123,
"timestamp": "2025-01-02T10:30:00Z",
"level": "ERROR",
"source": "auth.oidc",
"message": "SSO authentication failed",
"correlation_id": "550e8400-e29b-41d4-a716-446655440000",
"user_id": "admin@example.com",
"request_path": "/auth/sso/callback"
}
],
"pagination": {
"page": 1,
"page_size": 50,
"total": 1234,
"total_pages": 25
}
}
Testing Requirements
Unit Test Coverage
- SQLiteLogHandler correctly writes log records to database
- SQLiteLogHandler handles concurrent writes safely
- LogAggregatorService returns correct paginated results
- LogAggregatorService handles empty database gracefully
- REST endpoint returns proper JSON format
- MCP tool returns proper structured response
Integration Test Coverage
- End-to-end flow: log generated -> stored in SQLite -> queryable via API
- REST API authentication requirement enforced
- MCP API authentication requirement enforced
- Web UI renders logs from database correctly
E2E Test Coverage
- Login to admin dashboard, navigate to Logs tab, verify logs display
- Click Refresh, verify new logs appear
- Query REST API /admin/api/logs, verify JSON response
- Call MCP admin_logs_query tool, verify response
Performance Requirements
Response Time Targets
- Log page initial load: <3 seconds
- Refresh operation: <2 seconds
- REST API query: <2 seconds
- MCP API query: <2 seconds
Resource Requirements
- Memory: <50 MB for log aggregator service
- Storage: SQLite database with indexes (~1KB per 10 log entries)
- Network: Minimal (paginated responses)
Error Handling Specifications
User-Friendly Error Messages
"Unable to load logs. Please try again or contact support if the issue persists."
"Log database not available. Server may be starting up."
"Authentication required. Please log in to view logs."
Recovery Guidance
- Database unavailable: Logs tab shows friendly message, suggests retry
- Authentication failure: Redirect to login page
- Query timeout: Show partial results with warning
Definition of Done
Functional Completion
- All acceptance criteria satisfied with evidence
- All technical requirements implemented
- SQLiteLogHandler integrated with existing logging
- Web UI Logs tab functional
- REST API endpoint functional
- MCP API tool functional
- LogAggregatorService shared across all interfaces
Quality Validation
- >90% test coverage achieved
- All tests passing (unit, integration, E2E)
- Code review approved
- Manual testing validated with evidence
- Performance benchmarks met
Integration Readiness
- Story delivers working, deployable software
- Full vertical slice implemented (storage -> backend -> UI/API)
- No broken functionality
- Documentation complete
Story Points: Large
Priority: Critical (P1) - Foundation for all other stories
Dependencies: None - First story in epic
Success Metric: Administrators can view paginated logs through Web UI, REST API, and MCP API