# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

Multi-tenant Django analytics dashboard for chat session data. Companies upload CSV files or connect external APIs to visualize chat metrics, sentiment analysis, and session details. Built with Django 5.2+ and Python 3.13+, managed via the UV package manager.

## Essential Commands

### Development Server

```bash
# Start Django dev server (port 8001)
make run
# or
cd dashboard_project && uv run python manage.py runserver 8001
```

### Database Operations

```bash
# Create migrations after model changes
make makemigrations

# Apply migrations
make migrate

# Reset database (flush + migrate)
make reset-db

# Create superuser
make superuser
```

### Background Tasks (Celery)

```bash
# Start Celery worker (separate terminal)
make celery
# or
cd dashboard_project && uv run celery -A dashboard_project worker --loglevel=info

# Start Celery Beat scheduler (separate terminal)
make celery-beat
# or
cd dashboard_project && uv run celery -A dashboard_project beat --scheduler django_celery_beat.schedulers:DatabaseScheduler

# Start all services (web + celery + beat) with foreman
make procfile
```

### Testing & Quality

```bash
# Run tests
make test
# or
uv run -m pytest

# Run a single test
cd dashboard_project && uv run -m pytest path/to/test_file.py::test_function

# Linting
make lint             # Python only
npm run lint:py       # Ruff check
npm run lint:py:fix   # Auto-fix Python issues

# Formatting
make format           # Ruff + Black
npm run format        # Prettier (templates) + Python
npm run format:check  # Verify formatting

# JavaScript linting
npm run lint:js
npm run lint:js:fix

# Markdown linting
npm run lint:md
npm run lint:md:fix

# Type checking
npm run typecheck:py  # Python with ty
npm run typecheck:js  # JavaScript with oxlint
```

### Dependency Management (UV)

```bash
# Install all dependencies
uv pip install -e ".[dev]"

# Add new package
uv pip install <package-name>
# Then manually update pyproject.toml dependencies

# Update lockfile
make lock  # or uv pip freeze > requirements.lock
```

### Docker

```bash
make docker-build
make docker-up
make docker-down
```

## Architecture

### Three-App Structure

1. **accounts** - Authentication & multi-tenancy
   - `CustomUser` extends AbstractUser with `company` FK and `is_company_admin` flag (sketched after this list)
   - `Company` model is the top-level organizational unit
   - All users belong to exactly one Company

2. **dashboard** - Core analytics
   - `DataSource` - CSV uploads or external API links, owned by Company
   - `ChatSession` - Parsed chat data from CSVs/APIs, linked to DataSource
   - `Dashboard` - Custom dashboard configs with M2M to DataSources
   - Views: dashboard display, CSV upload, data export (CSV/JSON/Excel), search

3. **data_integration** - External API data fetching
   - `ExternalDataSource` - API credentials and endpoints
   - `ChatSession` & `ChatMessage` - API-fetched data models (parallel to dashboard.ChatSession)
   - Celery tasks for async API data fetching via `tasks.py`
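
A minimal sketch of the `accounts` models described above, assuming conventional field options (`related_name`, nullability); see `accounts/models.py` for the real definitions:

```python
# Hedged sketch only; field options here are illustrative, not the actual code.
from django.contrib.auth.models import AbstractUser
from django.db import models


class Company(models.Model):
    """Top-level tenant: every other record belongs to exactly one Company."""

    name = models.CharField(max_length=200)


class CustomUser(AbstractUser):
    """AbstractUser plus tenant linkage and a per-company admin flag."""

    company = models.ForeignKey(
        Company, on_delete=models.CASCADE, null=True, related_name="users"
    )
    is_company_admin = models.BooleanField(default=False)
```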

### Multi-Tenancy Model

```text
Company (root isolation)
├── CustomUser (employees, one is_company_admin)
├── DataSource (CSV files or API links)
│   └── ChatSession (parsed data)
└── Dashboard (M2M to DataSources)
```

**Critical**: All views must filter by `request.user.company` to enforce data isolation.
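
For example, a list view looks roughly like this (a sketch; the template name and function-based style are assumptions, the real views may use class-based views or mixins):

```python
# Sketch of company-scoped filtering; never return objects without this filter.
from django.contrib.auth.decorators import login_required
from django.shortcuts import render

from dashboard.models import DataSource


@login_required
def data_source_list(request):
    # The company FK is the only tenant boundary, so filter on it explicitly.
    sources = DataSource.objects.filter(company=request.user.company)
    return render(request, "dashboard/data_source_list.html", {"sources": sources})
```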

### Data Flow

**CSV Upload**:

1. User uploads CSV via `dashboard/views.py:upload_data`
2. CSV parsed, creates DataSource + multiple ChatSession records (outlined below)
3. Dashboard aggregates ChatSessions for visualization
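
A hedged outline of that flow; the real logic lives in `dashboard/views.py:upload_data`, and the model field names below are assumptions:

```python
# Illustrative only: creates one DataSource, then one ChatSession per CSV row.
import csv
import io

from dashboard.models import ChatSession, DataSource


def ingest_csv(uploaded_file, company):
    source = DataSource.objects.create(company=company, name=uploaded_file.name)
    text = io.TextIOWrapper(uploaded_file.file, encoding="utf-8")
    sessions = [
        ChatSession(
            data_source=source,
            session_id=row["session_id"],
            sentiment=row.get("sentiment", ""),
        )
        for row in csv.DictReader(text)
    ]
    ChatSession.objects.bulk_create(sessions)
    return source
```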

**External API**:

1. Admin configures ExternalDataSource with API credentials
2. Celery task (`data_integration/tasks.py`) fetches data periodically (sketched below)
3. Creates ChatSession + ChatMessage records in `data_integration` app
4. Optionally synced to `dashboard` app for unified analytics
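
A hedged sketch of that periodic fetch; the task name, credential attributes, and response shape are assumptions, see `data_integration/tasks.py` for the actual implementation:

```python
# Illustrative Celery task: pull sessions from one ExternalDataSource.
import requests
from celery import shared_task

from data_integration.models import ChatSession, ExternalDataSource


@shared_task
def fetch_external_sessions(source_id):
    source = ExternalDataSource.objects.get(pk=source_id)
    response = requests.get(
        source.endpoint_url,  # attribute names are assumptions
        auth=(source.api_username, source.api_password),
        timeout=30,
    )
    response.raise_for_status()
    for item in response.json():
        ChatSession.objects.update_or_create(
            external_source=source,
            session_id=item["session_id"],
            defaults={"raw_payload": item},
        )
```

ChatMessage rows would be created per message inside the same loop; that step is omitted here for brevity.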

### Key Design Patterns

- **Multi-tenant isolation**: Every query filtered by Company FK
- **Role-based access**: `is_staff` (Django admin), `is_company_admin` (company management), regular user (view only); a guard sketch follows this list
- **Dual ChatSession models**: `dashboard.ChatSession` (CSV-based) and `data_integration.ChatSession` (API-based) exist separately
- **Async processing**: Celery handles long-running API fetches, uses Redis or SQLite backend
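
A minimal guard for the `is_company_admin` tier (the decorator name is ours, not an existing project helper):

```python
# Restrict a view to company admins; anonymous or non-admin users fail the test
# and are redirected to login by user_passes_test.
from django.contrib.auth.decorators import user_passes_test


def company_admin_required(view_func):
    # Usage: decorate company-management views with @company_admin_required.
    return user_passes_test(
        lambda u: u.is_authenticated and u.is_company_admin
    )(view_func)
```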

## Configuration Notes

### Settings (`dashboard_project/settings.py`)

- Uses `python-dotenv` for environment variables
- Multi-app: accounts, dashboard, data_integration
- Celery configured in `dashboard_project/celery.py`
- Custom user model: `AUTH_USER_MODEL = "accounts.CustomUser"` (excerpt sketched below)
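
The settings pattern, roughly (an excerpt-style sketch; the defaults shown are illustrative, not the project's real values):

```python
# settings.py excerpt (sketch): python-dotenv, custom user model, Celery broker.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env created from .env.sample

SECRET_KEY = os.getenv("DJANGO_SECRET_KEY", "dev-only-insecure-key")
DEBUG = os.getenv("DJANGO_DEBUG", "True") == "True"

AUTH_USER_MODEL = "accounts.CustomUser"

# Redis in production, with a SQLite-backed broker as fallback (URL illustrative).
CELERY_BROKER_URL = os.getenv("CELERY_BROKER_URL", "redis://localhost:6379/0")
```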

### Environment Variables

Create `.env` from `.env.sample`:

- `DJANGO_SECRET_KEY` - Generate for production
- `DJANGO_DEBUG` - Set to False in production
- `EXTERNAL_API_USERNAME` / `EXTERNAL_API_PASSWORD` - For the data_integration API
- `CELERY_BROKER_URL` - Redis URL or SQLite fallback

### Template Formatting

- Prettier configured for Django templates via `prettier-plugin-jinja-template`
- Pre-commit hook auto-formats HTML templates
- Run manually: `npm run format`

## Common Patterns

### Adding New Model

1. Edit `models.py` in the appropriate app (example below)
2. `make makemigrations`
3. `make migrate`
4. Register in `admin.py` if needed
5. Update views to filter by company
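
Hypothetical example for step 1, showing the Company FK every tenant-owned model needs (model and field names are illustrative):

```python
# dashboard/models.py (sketch): a new company-scoped model.
from django.db import models

from accounts.models import Company


class Report(models.Model):
    company = models.ForeignKey(
        Company, on_delete=models.CASCADE, related_name="reports"
    )
    title = models.CharField(max_length=200)
    created_at = models.DateTimeField(auto_now_add=True)
```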

### CSV Upload Field Mapping

Expected CSV columns (see README.md for the full schema; a reader sketch follows the list):

- session_id, start_time, end_time, ip_address, country, language
- messages_sent, sentiment, escalated, forwarded_hr
- full_transcript, avg_response_time, tokens, tokens_eur
- category, initial_msg, user_rating
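
A hedged reader sketch built around those columns (the real parser in `dashboard/views.py` may map or rename fields differently):

```python
# Yield one dict per CSV row, restricted to the expected columns.
import csv

CSV_FIELDS = [
    "session_id", "start_time", "end_time", "ip_address", "country", "language",
    "messages_sent", "sentiment", "escalated", "forwarded_hr",
    "full_transcript", "avg_response_time", "tokens", "tokens_eur",
    "category", "initial_msg", "user_rating",
]


def read_rows(path):
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            yield {field: row.get(field) for field in CSV_FIELDS}
```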

### Testing Celery Tasks

```bash
cd dashboard_project
uv run python manage.py test_celery
```

### Creating Sample Data

```bash
cd dashboard_project
uv run python manage.py create_sample_data
```

Creates an admin user (admin/admin123), 3 companies with users, and sample dashboards.

## Development Workflow

1. **Before starting**: `uv venv && source .venv/bin/activate && uv sync`
2. **Run migrations**: `make migrate`
3. **Start services**: Terminal 1: `make run`, Terminal 2: `make celery`, Terminal 3: `make celery-beat`
4. **Make changes**: Edit code, test locally
5. **Test**: `make test` and `make lint`
6. **Format**: `make format && bun run format`
7. **Commit**: Pre-commit hooks run automatically

> `yq -r '.scripts' package.json`
>
> ```json
> {
>   "format": "prettier --write .; bun format:py",
>   "format:check": "prettier --check .; bun format:py -- --check",
>   "format:py": "uvx ruff format",
>   "lint:js": "oxlint",
>   "lint:js:fix": "bun lint:js -- --fix",
>   "lint:js:strict": "oxlint --import-plugin -D correctness -W suspicious",
>   "lint:md": "markdownlint-cli2 \"**/*.md\" \"#node_modules\" \"#.{node_modules,trunk,grit,venv,opencode,github/chatmodes,claude/agents}\"",
>   "lint:md:fix": "bun lint:md -- --fix",
>   "lint:py": "uvx ruff check",
>   "lint:py:fix": "uvx ruff check --fix",
>   "typecheck:js": "oxlint --type-aware",
>   "typecheck:js:fix": "bun typecheck:js -- --fix",
>   "typecheck:py": "uvx ty check"
> }
> ```

## Important Context

- **Django 5.2+** specific features may be in use
- **UV package manager** preferred over pip for speed
- **Celery** required for background tasks, needs Redis or SQLite backend
- **Multi-tenancy** is enforced at query level, not database level
- **Bootstrap 5** + **Plotly.js** for frontend
- **Working directory**: All Django commands run from `dashboard_project/` subdirectory

## File Organization

- **Django apps**: `dashboard_project/{accounts,dashboard,data_integration}/`
- **Settings**: `dashboard_project/dashboard_project/settings.py`
- **Static files**: `dashboard_project/static/`
- **Templates**: `dashboard_project/templates/`
- **Uploaded CSVs**: `dashboard_project/media/data_sources/`
- **Scripts**: `dashboard_project/scripts/` (cleanup, data fixes)
- **Examples**: `examples/` (sample CSV files)

## Testing Notes

- pytest configured via `pyproject.toml` (example test sketched below)
- Test discovery: `test_*.py` files in `dashboard_project/`
- Django settings: `DJANGO_SETTINGS_MODULE = "dashboard_project.settings"`
- Run a specific test: `cd dashboard_project && uv run -m pytest path/to/test.py::TestClass::test_method`
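
A hedged example of the kind of test that belongs here, assuming pytest-django is installed and that `DataSource` only requires the fields shown:

```python
# dashboard_project/test_isolation.py (sketch): verify company scoping holds.
import pytest

from accounts.models import Company
from dashboard.models import DataSource


@pytest.mark.django_db
def test_data_sources_are_company_scoped():
    company_a = Company.objects.create(name="A")
    company_b = Company.objects.create(name="B")
    DataSource.objects.create(company=company_a, name="a.csv")

    assert DataSource.objects.filter(company=company_b).count() == 0
```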