feat(N): workflow pipeline, 3D viewer, worker management, QC tests
- workflow_builder.py: fix broken stubs, add render_order_line_still_task
(resolves step_path from DB instead of passing order_line_id as step_path)
- domains/rendering/tasks.py: add render_order_line_still_task,
export_gltf_for_order_line_task, export_blend_for_order_line_task,
generate_gltf_geometry_task (trimesh STL→GLB, no Blender needed)
- tasks/step_tasks.py: add generate_gltf_geometry_task for CadFile GLB export
- cad router: POST /{id}/generate-gltf-geometry endpoint (admin/PM)
- worker router: GET /celery-workers + POST /scale (docker compose subprocess)
- Dockerfile: pip install -e ".[dev]" to enable pytest
- docker-compose.yml: docker socket + compose file mount on backend
- ThreeDViewer.tsx: mode toggle (geometry/production), wireframe, env presets,
download buttons (GLB + .blend)
- CadPreview.tsx: load gltf_geometry/gltf_production/blend_production assets
from MediaAsset table and pass URLs to ThreeDViewer
- ProductDetail.tsx: "View 3D" button → /cad/:id, "Generate GLB" button
- media router/service: cad_file_id filter on GET /api/media
- WorkerManagement.tsx: new page with worker status, queue depth, scale controls
- App.tsx + Layout.tsx: /workers route + sidebar link (admin/PM)
- tests: test_rendering_service.py, test_orders_service.py (backend)
- tests: WorkerActivity.test.tsx, WorkerManagement.test.tsx (frontend)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
---
### 2026-03-06 | Celery Canvas | workflow_builder.py: passing order_line_id as step_path crashes Blender

**Problem:** `_build_still` passed `order_line_id` as the first positional argument to `render_still_task.si(order_line_id, **params)`, but `render_still_task` expects `step_path: str` as its first argument. Blender then tried to open the UUID as a file path and crashed.

**Fix:** A new `render_order_line_still_task` that performs the DB lookup internally (OrderLine → Product → CadFile → stored_path). `workflow_builder._build_still` now uses this task.

**For future projects:** Workflow-builder tasks must never pass domain IDs as file-path-based task arguments. Always create separate order-line-aware tasks that do the resolution internally.
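A dependency-free sketch of that rule (task and helper names mirror the commit; the DB lookup and the Blender render are stubbed out):

```python
def resolve_step_path(order_line_id: str) -> str:
    """Stand-in for the real OrderLine → Product → CadFile DB lookup."""
    return f"/data/cad/{order_line_id}.step"

def render_still(step_path: str, **params) -> dict:
    """Stand-in for the Blender render: expects a real file path."""
    return {"rendered": step_path, **params}

def render_order_line_still_task(order_line_id: str, **params) -> dict:
    # The task resolves the path itself; callers only ever pass the domain ID,
    # so a UUID can never end up where a file path is expected.
    step_path = resolve_step_path(order_line_id)
    return render_still(step_path, **params)
```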
### 2026-03-06 | Docker | docker compose inside a container needs a multi-stage CLI copy

**Problem:** The backend container is based on `python:3.11-slim`: no `docker` binary, no `docker compose`. The worker-scale endpoint cannot invoke `docker compose up --scale`.

**Fix:** Multi-stage Dockerfile: `FROM docker:cli AS docker-cli`, then `COPY --from=docker-cli /usr/local/bin/docker /usr/local/bin/docker` and `COPY --from=docker-cli /usr/local/lib/docker/cli-plugins /usr/local/lib/docker/cli-plugins`. In addition: mount the Docker socket (`/var/run/docker.sock`), mount the compose file as a volume (`./:/compose:ro`), and set the `COMPOSE_PROJECT_DIR=/compose` env var.

**For future projects:** Multi-stage builds are the cleanest way to copy binaries from other images without installing their entire dependency chain.
### 2026-03-06 | React Three Fiber | Wireframe toggle via material clone

**Problem:** Three.js materials are shared objects: directly mutating `child.material.wireframe = true` on a parsed GLTF scene would affect every instance that shares that material.

**Fix:** `child.material = child.material.clone()` before the wireframe mutation in `useEffect`. Each mesh instance gets its own material object, so the toggle has no unwanted side effects.

**For future projects:** Always clone GLTF materials before modifying them at runtime.
### 2026-03-06 | pytest | Backend without dev dependencies: pip install -e ".[dev]" required

**Problem:** The backend Dockerfile only ran `pip install -e .`: no dev dependencies, so pytest/pytest-asyncio/httpx were unavailable and `python -m pytest` failed with "No module named pytest".

**Fix:** Changed the Dockerfile to `pip install -e ".[dev]"`. The dev dependencies were already defined in `pyproject.toml` under `[project.optional-dependencies] dev = [pytest>=8.0, ...]`; only the install command was incomplete.

**For future projects:** Always check whether the `[dev]` extras are installed when tests are supposed to run in the container.
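The corresponding `pyproject.toml` layout (a sketch; the exact dev pins beyond `pytest>=8.0` are assumptions):

```toml
[project.optional-dependencies]
dev = [
    "pytest>=8.0",
    "pytest-asyncio",
    "httpx",
]
```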
## Open Questions

- [ ] Azure AI credentials for Phase 4 (image validation) are not yet configured

- [ ] Is pythonOCC available in the render-worker (via the cadquery dependency)? Deployment test pending
@@ -1,3 +1,6 @@
+# Stage 0: grab docker CLI + compose plugin
+FROM docker:cli AS docker-cli
+
 FROM python:3.11-slim
 
 WORKDIR /app
@@ -13,9 +16,13 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
     libffi-dev \
     && rm -rf /var/lib/apt/lists/*
 
-# Install Python dependencies
+# Copy docker CLI for worker scaling
+COPY --from=docker-cli /usr/local/bin/docker /usr/local/bin/docker
+COPY --from=docker-cli /usr/local/lib/docker/cli-plugins /usr/local/lib/docker/cli-plugins
+
+# Install Python dependencies (including dev extras for pytest)
 COPY pyproject.toml .
-RUN pip install --no-cache-dir -e .
+RUN pip install --no-cache-dir -e ".[dev]"
 
 # Copy app code
 COPY . .
@@ -320,6 +320,38 @@ async def generate_stl(
    return {"status": "queued", "task_id": task.id, "quality": quality}


@router.post("/{id}/generate-gltf-geometry", status_code=status.HTTP_202_ACCEPTED)
async def generate_gltf_geometry(
    id: uuid.UUID,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db),
):
    """Queue GLB geometry export from the existing STL cache (trimesh, no Blender).

    Stores the result as a MediaAsset with asset_type='gltf_geometry'.
    The STL low-quality cache must already exist (run a thumbnail render first).
    """
    if user.role.value not in ("admin", "project_manager"):
        raise HTTPException(status_code=403, detail="Insufficient permissions")

    cad = await _get_cad_file(id, db)
    if not cad.stored_path:
        raise HTTPException(status_code=404, detail="STEP file not uploaded for this CAD file")

    step_path = Path(cad.stored_path)
    stl_path = step_path.parent / f"{step_path.stem}_low.stl"
    if not stl_path.exists():
        raise HTTPException(
            status_code=404,
            detail="STL low-quality cache not found. Trigger a render first to generate it.",
        )

    # Queue as a thumbnail_rendering task (trimesh available in render-worker)
    from app.tasks.step_tasks import generate_gltf_geometry_task
    task = generate_gltf_geometry_task.delay(str(id))
    return {"status": "queued", "task_id": task.id, "cad_file_id": str(id)}


@router.post(
    "/{id}/regenerate-thumbnail",
    status_code=status.HTTP_202_ACCEPTED,
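The endpoint's STL lookup relies on a `{stem}_low.stl` naming convention next to the uploaded STEP file; isolated as a small helper (paths illustrative):

```python
from pathlib import Path

def stl_cache_path(stored_path: str) -> Path:
    """Derive the low-quality STL cache path that sits next to the STEP file."""
    step = Path(stored_path)
    return step.parent / f"{step.stem}_low.stl"
```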
@@ -356,6 +356,104 @@ async def cancel_task(task_id: str, user: User = Depends(require_admin_or_pm)):
    return {"revoked": task_id}


# ---------------------------------------------------------------------------
# Worker management — list workers + scale
# ---------------------------------------------------------------------------

class ScaleRequest(BaseModel):
    service: str  # "render-worker" | "worker" | "worker-thumbnail"
    count: int  # 0–20


@router.get("/celery-workers")
async def get_celery_workers(user: User = Depends(require_admin_or_pm)):
    """List active Celery workers with their queues and active task counts."""
    import asyncio
    from app.tasks.celery_app import celery_app

    def _inspect() -> dict:
        try:
            insp = celery_app.control.inspect(timeout=2.0)
            return {
                "active_queues": insp.active_queues() or {},
                "active": insp.active() or {},
                "stats": insp.stats() or {},
            }
        except Exception as exc:
            return {"error": str(exc)}

    data = await asyncio.to_thread(_inspect)
    if "error" in data:
        return {"workers": [], "error": data["error"]}

    workers = []
    for worker_name, queues in data.get("active_queues", {}).items():
        queue_names = [q.get("name") for q in (queues or [])]
        active_tasks = data.get("active", {}).get(worker_name, [])
        stats = data.get("stats", {}).get(worker_name, {})
        workers.append({
            "name": worker_name,
            "queues": queue_names,
            "active_task_count": len(active_tasks),
            "active_tasks": [
                {"name": t.get("name"), "id": t.get("id")} for t in active_tasks
            ],
            "total_tasks_processed": stats.get("total", {}),
        })
    return {"workers": workers}


@router.post("/scale", status_code=http_status.HTTP_202_ACCEPTED)
async def scale_workers(
    body: ScaleRequest,
    user: User = Depends(require_admin_or_pm),
):
    """Scale a Compose service (render-worker, worker, worker-thumbnail) up or down.

    Requires the docker socket and compose file to be accessible inside the container
    (see docker-compose.yml COMPOSE_PROJECT_DIR env var).
    """
    import asyncio
    import os
    import subprocess
    from fastapi import HTTPException

    ALLOWED_SERVICES = {"render-worker", "worker", "worker-thumbnail"}
    if body.service not in ALLOWED_SERVICES:
        raise HTTPException(400, detail=f"service must be one of {ALLOWED_SERVICES}")
    if not (0 <= body.count <= 20):
        raise HTTPException(400, detail="count must be between 0 and 20")

    compose_dir = os.environ.get("COMPOSE_PROJECT_DIR", "/compose")
    compose_file = os.path.join(compose_dir, "docker-compose.yml")

    def _scale() -> subprocess.CompletedProcess:
        return subprocess.run(
            [
                "docker", "compose",
                "-f", compose_file,
                "up",
                "--scale", f"{body.service}={body.count}",
                "--no-recreate",
                "-d",
            ],
            capture_output=True, text=True, timeout=120,
        )

    try:
        result = await asyncio.to_thread(_scale)
    except subprocess.TimeoutExpired:
        raise HTTPException(504, detail="Scale operation timed out")

    if result.returncode != 0:
        raise HTTPException(
            500,
            detail=f"docker compose scale failed: {result.stderr[-500:]}",
        )

    return {"service": body.service, "count": body.count, "status": "scaling"}


# ---------------------------------------------------------------------------
# Render health check
# ---------------------------------------------------------------------------
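The scale endpoint boils down to building an allow-listed `docker compose` command; that validation/build step can be sketched on its own (service names from the commit, helper name assumed):

```python
def build_scale_cmd(compose_file: str, service: str, count: int) -> list[str]:
    """Validate inputs and build the docker compose scale command."""
    allowed = {"render-worker", "worker", "worker-thumbnail"}
    if service not in allowed:
        raise ValueError(f"service must be one of {sorted(allowed)}")
    if not 0 <= count <= 20:
        raise ValueError("count must be between 0 and 20")
    # --no-recreate keeps already-running replicas untouched; -d detaches
    return [
        "docker", "compose", "-f", compose_file,
        "up", "--scale", f"{service}={count}", "--no-recreate", "-d",
    ]
```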
@@ -19,6 +19,7 @@ router = APIRouter(prefix="/api/media", tags=["media"])
 async def list_assets(
     product_id: uuid.UUID | None = None,
     order_line_id: uuid.UUID | None = None,
+    cad_file_id: uuid.UUID | None = None,
     asset_type: MediaAssetType | None = None,
     skip: int = Query(0, ge=0),
     limit: int = Query(50, ge=1, le=500),
@@ -28,6 +29,7 @@ async def list_assets(
         db,
         product_id=product_id,
         order_line_id=order_line_id,
+        cad_file_id=cad_file_id,
         asset_type=asset_type,
         skip=skip,
         limit=limit,
@@ -9,6 +9,7 @@ async def list_media_assets(
     db: AsyncSession,
     product_id: uuid.UUID | None = None,
     order_line_id: uuid.UUID | None = None,
+    cad_file_id: uuid.UUID | None = None,
     asset_type: MediaAssetType | None = None,
     is_archived: bool | None = False,
     skip: int = 0,
@@ -20,6 +20,8 @@ async def list_media_assets(
         q = q.where(MediaAsset.product_id == product_id)
     if order_line_id:
         q = q.where(MediaAsset.order_line_id == order_line_id)
+    if cad_file_id:
+        q = q.where(MediaAsset.cad_file_id == cad_file_id)
     if asset_type:
         q = q.where(MediaAsset.asset_type == asset_type)
     if is_archived is not None:
@@ -269,6 +269,176 @@ def publish_asset(
    return asyncio.get_event_loop().run_until_complete(_run())


def _resolve_step_path_for_order_line(order_line_id: str) -> tuple[str | None, str | None]:
    """Sync helper: resolves (step_path, cad_file_id) from an OrderLine via DB."""
    import asyncio

    async def _inner() -> tuple[str | None, str | None]:
        from app.database import AsyncSessionLocal
        from app.domains.orders.models import OrderLine
        from app.domains.products.models import Product
        from app.models.cad_file import CadFile
        from sqlalchemy import select
        from sqlalchemy.orm import selectinload

        async with AsyncSessionLocal() as db:
            res = await db.execute(
                select(OrderLine)
                .options(selectinload(OrderLine.product))
                .where(OrderLine.id == order_line_id)
            )
            line = res.scalar_one_or_none()
            if not line or not line.product or not line.product.cad_file_id:
                return None, None
            cad_res = await db.execute(
                select(CadFile).where(CadFile.id == line.product.cad_file_id)
            )
            cad = cad_res.scalar_one_or_none()
            if not cad or not cad.stored_path:
                return None, None
            return cad.stored_path, str(line.product.cad_file_id)

    return asyncio.get_event_loop().run_until_complete(_inner())

@celery_app.task(
    bind=True,
    name="app.domains.rendering.tasks.render_order_line_still_task",
    queue="thumbnail_rendering",
    max_retries=2,
)
def render_order_line_still_task(self, order_line_id: str, **params) -> dict:
    """Render a still image for an order line, resolving STEP path from DB.

    Wraps render_still_task logic but accepts order_line_id instead of step_path.
    On success, creates a MediaAsset record via publish_asset.
    """
    step_path_str, cad_file_id = _resolve_step_path_for_order_line(order_line_id)
    if not step_path_str:
        raise RuntimeError(
            f"Cannot resolve STEP path for order_line {order_line_id}: "
            "product missing or has no linked CAD file"
        )

    step = Path(step_path_str)
    output_dir = step.parent / "renders"
    output_dir.mkdir(parents=True, exist_ok=True)
    output_path = output_dir / f"line_{order_line_id}.png"

    try:
        from app.services.render_blender import render_still
        result = render_still(
            step_path=step,
            output_path=output_path,
            **params,
        )
        publish_asset.delay(
            order_line_id,
            "still",
            str(output_path),
            render_config=result,
        )
        logger.info(
            "render_order_line_still_task completed for line %s in %.1fs",
            order_line_id, result.get("total_duration_s", 0),
        )
        return result
    except Exception as exc:
        logger.error("render_order_line_still_task failed for %s: %s", order_line_id, exc)
        raise self.retry(exc=exc, countdown=30)


@celery_app.task(
    bind=True,
    name="app.domains.rendering.tasks.export_gltf_for_order_line_task",
    queue="thumbnail_rendering",
    max_retries=1,
)
def export_gltf_for_order_line_task(self, order_line_id: str) -> dict:
    """Export a geometry-only GLB from the STL cache using trimesh (no Blender).

    Publishes a MediaAsset with asset_type='gltf_geometry'.
    Requires the STL low-quality cache to exist.
    """
    step_path_str, cad_file_id = _resolve_step_path_for_order_line(order_line_id)
    if not step_path_str:
        raise RuntimeError(f"Cannot resolve STEP path for order_line {order_line_id}")

    step = Path(step_path_str)
    stl_path = step.parent / f"{step.stem}_low.stl"
    if not stl_path.exists():
        raise RuntimeError(
            f"STL cache not found: {stl_path}. Run thumbnail generation first."
        )

    output_path = step.parent / f"{step.stem}_geometry.glb"

    try:
        import trimesh
        mesh = trimesh.load(str(stl_path))
        mesh.export(str(output_path))
        publish_asset.delay(order_line_id, "gltf_geometry", str(output_path))
        logger.info("export_gltf_for_order_line_task completed: %s", output_path.name)
        return {"glb_path": str(output_path)}
    except Exception as exc:
        logger.error("export_gltf_for_order_line_task failed for %s: %s", order_line_id, exc)
        raise self.retry(exc=exc, countdown=15)


@celery_app.task(
    bind=True,
    name="app.domains.rendering.tasks.export_blend_for_order_line_task",
    queue="thumbnail_rendering",
    max_retries=1,
)
def export_blend_for_order_line_task(self, order_line_id: str) -> dict:
    """Export a production-quality GLB via Blender + asset library (export_gltf.py).

    Publishes a MediaAsset with asset_type='blend_production'.
    Requires Blender + the render-scripts directory.
    """
    import os
    import subprocess

    step_path_str, cad_file_id = _resolve_step_path_for_order_line(order_line_id)
    if not step_path_str:
        raise RuntimeError(f"Cannot resolve STEP path for order_line {order_line_id}")

    step = Path(step_path_str)
    stl_path = step.parent / f"{step.stem}_low.stl"
    if not stl_path.exists():
        raise RuntimeError(f"STL cache not found: {stl_path}")

    output_path = step.parent / f"{step.stem}_production.glb"
    scripts_dir = Path(os.environ.get("RENDER_SCRIPTS_DIR", "/render-scripts"))
    export_script = scripts_dir / "export_gltf.py"

    from app.services.render_blender import find_blender
    blender_bin = find_blender()
    if not blender_bin:
        raise RuntimeError("Blender binary not found — cannot run export_blend task")

    try:
        cmd = [
            blender_bin, "--background",
            "--python", str(export_script),
            "--",
            "--stl_path", str(stl_path),
            "--output_path", str(output_path),
        ]
        result = subprocess.run(cmd, capture_output=True, text=True, timeout=300)
        if result.returncode != 0:
            raise RuntimeError(
                f"export_gltf.py exited {result.returncode}:\n{result.stderr[-500:]}"
            )
        publish_asset.delay(order_line_id, "blend_production", str(output_path))
        logger.info("export_blend_for_order_line_task completed: %s", output_path.name)
        return {"glb_path": str(output_path)}
    except Exception as exc:
        logger.error("export_blend_for_order_line_task failed for %s: %s", order_line_id, exc)
        raise self.retry(exc=exc, countdown=30)


def _build_ffmpeg_cmd(
    frames_dir: Path, output_mp4: Path, fps: int = 30, bg_color: str = ""
) -> list:
@@ -20,6 +20,7 @@ def dispatch_workflow(
         "still": _build_still,
         "turntable": _build_turntable,
         "multi_angle": _build_multi_angle,
+        "still_with_exports": _build_still_with_exports,
     }
     builder = builders.get(workflow_type)
     if not builder:
@@ -30,24 +31,56 @@ def dispatch_workflow(
 
 
 def _build_still(order_line_id: str, params: dict):
-    from app.domains.rendering.tasks import render_still_task
+    """Still render: resolves STEP path from order_line DB record."""
+    from app.domains.rendering.tasks import render_order_line_still_task
     return chain(
-        render_still_task.si(order_line_id, **params)
+        render_order_line_still_task.si(order_line_id, **params)
     )
 
 
 def _build_turntable(order_line_id: str, params: dict):
+    """Turntable animation: requires step_path + output_dir in params."""
     from app.domains.rendering.tasks import render_turntable_task
+    step_path = params.get("step_path")
+    output_dir = params.get("output_dir")
+    if not step_path or not output_dir:
+        raise ValueError(
+            "turntable workflow requires 'step_path' and 'output_dir' in params"
+        )
+    remaining = {k: v for k, v in params.items() if k not in ("step_path", "output_dir")}
     return chain(
-        render_turntable_task.si(order_line_id, **params)
+        render_turntable_task.si(step_path, output_dir, **remaining)
     )
 
 
 def _build_multi_angle(order_line_id: str, params: dict):
-    from app.domains.rendering.tasks import render_still_task
-    angles = params.get("angles", [0, 45, 90])
-    p = {k: v for k, v in params.items() if k != "angles"}
+    """Multi-angle stills: renders the same order_line from multiple rotation_z angles."""
+    from app.domains.rendering.tasks import render_order_line_still_task
+    angles = params.pop("angles", [0, 45, 90])
     return group(
-        render_still_task.si(order_line_id, camera_angle=angle, **p)
+        render_order_line_still_task.si(order_line_id, rotation_z=float(angle), **params)
         for angle in angles
     )
 
 
+def _build_still_with_exports(order_line_id: str, params: dict):
+    """Still render + parallel GLB exports (geometry + production quality).
+
+    Pipeline:
+        render_order_line_still_task → group(
+            export_gltf_for_order_line_task,
+            export_blend_for_order_line_task,
+        )
+    """
+    from app.domains.rendering.tasks import (
+        render_order_line_still_task,
+        export_gltf_for_order_line_task,
+        export_blend_for_order_line_task,
+    )
+    return chain(
+        render_order_line_still_task.si(order_line_id, **params),
+        group(
+            export_gltf_for_order_line_task.si(order_line_id),
+            export_blend_for_order_line_task.si(order_line_id),
+        ),
+    )
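Note that `params.pop("angles", ...)` in `_build_multi_angle` mutates the caller's dict. If the params dict were ever reused across builders, a non-mutating split avoids the side effect (a sketch, not the commit's code):

```python
def split_angles(params: dict) -> tuple[list, dict]:
    """Extract 'angles' without mutating the input dict."""
    angles = params.get("angles", [0, 45, 90])
    rest = {k: v for k, v in params.items() if k != "angles"}
    return angles, rest
```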
@@ -245,6 +245,69 @@ def generate_stl_cache(self, cad_file_id: str, quality: str):
        raise self.retry(exc=exc, countdown=30, max_retries=2)


@celery_app.task(bind=True, name="app.tasks.step_tasks.generate_gltf_geometry_task", queue="thumbnail_rendering", max_retries=1)
def generate_gltf_geometry_task(self, cad_file_id: str):
    """Export a geometry-only GLB from the STL low-quality cache using trimesh.

    Creates a MediaAsset with asset_type='gltf_geometry' and cad_file_id set.
    No Blender required — trimesh handles the STL→GLB conversion.
    """
    from pathlib import Path as _Path
    from sqlalchemy import create_engine
    from sqlalchemy.orm import Session
    from app.config import settings as app_settings
    from app.models.cad_file import CadFile

    sync_url = app_settings.database_url.replace("+asyncpg", "")
    eng = create_engine(sync_url)
    with Session(eng) as session:
        cad_file = session.get(CadFile, cad_file_id)
        if not cad_file or not cad_file.stored_path:
            logger.error("generate_gltf_geometry_task: no stored_path for %s", cad_file_id)
            return
        step_path_str = cad_file.stored_path
    eng.dispose()

    step = _Path(step_path_str)
    stl_path = step.parent / f"{step.stem}_low.stl"
    if not stl_path.exists():
        logger.error("generate_gltf_geometry_task: STL not found %s", stl_path)
        raise RuntimeError(f"STL cache not found: {stl_path}")

    output_path = step.parent / f"{step.stem}_geometry.glb"
    try:
        import trimesh
        mesh = trimesh.load(str(stl_path))
        mesh.export(str(output_path))
        logger.info("generate_gltf_geometry_task: exported %s", output_path.name)
    except Exception as exc:
        logger.error("generate_gltf_geometry_task failed for %s: %s", cad_file_id, exc)
        raise self.retry(exc=exc, countdown=15)

    # Create MediaAsset record
    import asyncio

    async def _store():
        from app.database import AsyncSessionLocal
        from app.domains.media.models import MediaAsset, MediaAssetType
        async with AsyncSessionLocal() as db:
            import uuid
            asset = MediaAsset(
                cad_file_id=uuid.UUID(cad_file_id),
                asset_type=MediaAssetType.gltf_geometry,
                storage_key=str(output_path),
                mime_type="model/gltf-binary",
                file_size_bytes=output_path.stat().st_size if output_path.exists() else None,
            )
            db.add(asset)
            await db.commit()
            return str(asset.id)

    asset_id = asyncio.get_event_loop().run_until_complete(_store())
    logger.info("generate_gltf_geometry_task: MediaAsset %s created for cad %s", asset_id, cad_file_id)
    return {"glb_path": str(output_path), "asset_id": asset_id}


@celery_app.task(bind=True, name="app.tasks.step_tasks.regenerate_thumbnail", queue="thumbnail_rendering")
def regenerate_thumbnail(self, cad_file_id: str, part_colors: dict):
    """Regenerate thumbnail with per-part colours."""
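The sync-engine bridge above derives its URL by stripping the async driver suffix; as a standalone helper (URLs illustrative):

```python
def to_sync_url(async_url: str) -> str:
    """Strip the asyncpg driver marker so a sync SQLAlchemy engine can use the URL."""
    return async_url.replace("+asyncpg", "")
```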
@@ -0,0 +1,191 @@
"""Tests for orders domain — order creation, status transitions, and pricing."""
import uuid

import pytest


# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------

async def _create_test_product(db):
    from app.domains.products.models import Product
    product = Product(
        id=uuid.uuid4(),
        name=f"Test Product {uuid.uuid4().hex[:6]}",
        category_key="TRB",
        components=[],
        cad_part_materials=[],
    )
    db.add(product)
    await db.commit()
    await db.refresh(product)
    return product


async def _create_test_order(db, user):
    from app.domains.orders.models import Order, OrderStatus
    order = Order(
        id=uuid.uuid4(),
        order_number=f"TEST-{uuid.uuid4().hex[:6].upper()}",
        status=OrderStatus.draft,
        created_by=user.id,
        tenant_id=None,
    )
    db.add(order)
    await db.commit()
    await db.refresh(order)
    return order


# ---------------------------------------------------------------------------
# Order creation
# ---------------------------------------------------------------------------

@pytest.mark.asyncio
async def test_create_order_draft_status(db, admin_user):
    """New order starts in draft status."""
    order = await _create_test_order(db, admin_user)
    assert order.id is not None
    assert order.status.value == "draft"
    assert order.order_number.startswith("TEST-")


@pytest.mark.asyncio
async def test_order_has_no_lines_initially(db, admin_user):
    """New order starts with zero order lines."""
    from sqlalchemy import select
    from app.domains.orders.models import Order, OrderLine
    order = await _create_test_order(db, admin_user)
    result = await db.execute(
        select(OrderLine).where(OrderLine.order_id == order.id)
    )
    lines = result.scalars().all()
    assert len(lines) == 0


# ---------------------------------------------------------------------------
# Order line creation
# ---------------------------------------------------------------------------

@pytest.mark.asyncio
async def test_add_order_line(db, admin_user):
    """Order line can be added to a draft order."""
    from app.domains.orders.models import OrderLine
    product = await _create_test_product(db)
    order = await _create_test_order(db, admin_user)

    line = OrderLine(
        id=uuid.uuid4(),
        order_id=order.id,
        product_id=product.id,
        render_status="pending",
        item_status="pending",
        tenant_id=None,
    )
    db.add(line)
    await db.commit()
    await db.refresh(line)

    assert line.id is not None
    assert line.order_id == order.id
    assert line.render_status == "pending"


# ---------------------------------------------------------------------------
# Status transitions
# ---------------------------------------------------------------------------

@pytest.mark.asyncio
async def test_order_status_transition_to_submitted(db, admin_user):
    """Order status can be changed from draft to submitted."""
    from app.domains.orders.models import Order, OrderStatus
    order = await _create_test_order(db, admin_user)
    order.status = OrderStatus.submitted
    await db.commit()
    await db.refresh(order)
    assert order.status == OrderStatus.submitted


@pytest.mark.asyncio
async def test_order_multiple_lines(db, admin_user):
    """Multiple lines can be added to the same order."""
    from app.domains.orders.models import OrderLine
    product = await _create_test_product(db)
    order = await _create_test_order(db, admin_user)

    for _ in range(3):
        line = OrderLine(
            id=uuid.uuid4(),
            order_id=order.id,
            product_id=product.id,
            render_status="pending",
            item_status="pending",
            tenant_id=None,
        )
        db.add(line)
    await db.commit()

    from sqlalchemy import select
    result = await db.execute(
        select(OrderLine).where(OrderLine.order_id == order.id)
    )
    assert len(result.scalars().all()) == 3


# ---------------------------------------------------------------------------
# Render status tracking
# ---------------------------------------------------------------------------

@pytest.mark.asyncio
async def test_order_line_render_status_update(db, admin_user):
    """Order line render_status can be updated to processing/completed."""
    from app.domains.orders.models import OrderLine
    product = await _create_test_product(db)
    order = await _create_test_order(db, admin_user)

    line = OrderLine(
        id=uuid.uuid4(),
        order_id=order.id,
        product_id=product.id,
        render_status="pending",
        item_status="pending",
        tenant_id=None,
    )
    db.add(line)
    await db.commit()

    line.render_status = "processing"
    await db.commit()
    await db.refresh(line)
    assert line.render_status == "processing"

    line.render_status = "completed"
    await db.commit()
    await db.refresh(line)
    assert line.render_status == "completed"


# ---------------------------------------------------------------------------
# Unit price
# ---------------------------------------------------------------------------

@pytest.mark.asyncio
async def test_order_line_unit_price_nullable(db, admin_user):
    """unit_price defaults to None."""
    from app.domains.orders.models import OrderLine
    product = await _create_test_product(db)
    order = await _create_test_order(db, admin_user)

    line = OrderLine(
        id=uuid.uuid4(),
        order_id=order.id,
        product_id=product.id,
        render_status="pending",
        item_status="pending",
        tenant_id=None,
    )
    db.add(line)
    await db.commit()
    await db.refresh(line)
    assert line.unit_price is None
@@ -0,0 +1,112 @@
"""Tests for rendering domain — workflow builder + task helpers."""
import uuid
from unittest.mock import MagicMock, patch

import pytest


# ---------------------------------------------------------------------------
# workflow_builder unit tests (no DB required)
# ---------------------------------------------------------------------------

def test_dispatch_workflow_unknown_type_raises():
    from app.domains.rendering.workflow_builder import dispatch_workflow
    with pytest.raises(ValueError, match="Unknown workflow type"):
        dispatch_workflow("nonexistent_type", str(uuid.uuid4()))


def test_build_still_returns_chain():
    """_build_still returns a Celery chain wrapping render_order_line_still_task."""
    from app.domains.rendering.workflow_builder import _build_still
    canvas = _build_still(str(uuid.uuid4()), {})
    # A single-task chain is still a Celery Signature, not a plain chain, but
    # it should be callable / have apply_async
    assert hasattr(canvas, "apply_async")


def test_build_multi_angle_creates_group():
    """_build_multi_angle returns a Celery group with one sig per angle."""
    from app.domains.rendering.workflow_builder import _build_multi_angle
    order_line_id = str(uuid.uuid4())
    canvas = _build_multi_angle(order_line_id, {"angles": [0, 90, 180]})
    # the group exposes its member signatures via the tasks attribute
    assert hasattr(canvas, "tasks")
    assert len(canvas.tasks) == 3


def test_build_still_with_exports_is_chain():
    """_build_still_with_exports returns a chain."""
    from app.domains.rendering.workflow_builder import _build_still_with_exports
    canvas = _build_still_with_exports(str(uuid.uuid4()), {})
    assert hasattr(canvas, "apply_async")


def test_build_turntable_raises_without_step_path():
    """_build_turntable raises ValueError if step_path missing in params."""
    from app.domains.rendering.workflow_builder import _build_turntable
    with pytest.raises(ValueError, match="step_path"):
        _build_turntable(str(uuid.uuid4()), {})


def test_build_turntable_raises_without_output_dir():
    from app.domains.rendering.workflow_builder import _build_turntable
    with pytest.raises(ValueError, match="output_dir"):
        _build_turntable(str(uuid.uuid4()), {"step_path": "/tmp/test.stp"})


# ---------------------------------------------------------------------------
# _resolve_step_path_for_order_line — integration tests (require DB)
# ---------------------------------------------------------------------------

@pytest.mark.integration
@pytest.mark.asyncio
async def test_resolve_step_path_returns_none_for_missing_line(db):
    """Returns (None, None) for a line_id that doesn't exist."""
    from app.domains.rendering.tasks import _resolve_step_path_for_order_line

    result = _resolve_step_path_for_order_line(str(uuid.uuid4()))
    assert result == (None, None)


# ---------------------------------------------------------------------------
# publish_asset (unit test with mocked DB)
# ---------------------------------------------------------------------------

def test_publish_asset_signature():
    """publish_asset is importable and is a bound Celery task."""
    from app.domains.rendering.tasks import publish_asset
    assert callable(publish_asset)
    assert hasattr(publish_asset, "delay")


# ---------------------------------------------------------------------------
# generate_gltf_geometry_task — smoke test (unit)
# ---------------------------------------------------------------------------

def test_generate_gltf_geometry_task_importable():
    from app.tasks.step_tasks import generate_gltf_geometry_task
    assert callable(generate_gltf_geometry_task)
    assert hasattr(generate_gltf_geometry_task, "delay")


# ---------------------------------------------------------------------------
# New order-line tasks are importable and correctly registered
# ---------------------------------------------------------------------------

def test_render_order_line_still_task_importable():
    from app.domains.rendering.tasks import render_order_line_still_task
    assert render_order_line_still_task.name == "app.domains.rendering.tasks.render_order_line_still_task"
    assert render_order_line_still_task.queue == "thumbnail_rendering"


def test_export_gltf_for_order_line_task_importable():
    from app.domains.rendering.tasks import export_gltf_for_order_line_task
    assert export_gltf_for_order_line_task.queue == "thumbnail_rendering"


def test_export_blend_for_order_line_task_importable():
    from app.domains.rendering.tasks import export_blend_for_order_line_task
    assert export_blend_for_order_line_task.queue == "thumbnail_rendering"
@@ -67,9 +67,12 @@ services:
      - MINIO_USER=${MINIO_USER:-minioadmin}
      - MINIO_PASSWORD=${MINIO_PASSWORD:-minioadmin}
      - MINIO_BUCKET=${MINIO_BUCKET:-uploads}
      - COMPOSE_PROJECT_DIR=/compose
    volumes:
      - ./backend:/app
      - uploads:/app/uploads
      - /var/run/docker.sock:/var/run/docker.sock
      - .:/compose:ro
    ports:
      - "8888:8888"
    depends_on:

@@ -22,6 +22,7 @@ import TenantsPage from './pages/Tenants'
import WorkflowEditorPage from './pages/WorkflowEditor'
import MediaBrowserPage from './pages/MediaBrowser'
import BillingPage from './pages/Billing'
import WorkerManagementPage from './pages/WorkerManagement'

function ProtectedRoute({ children }: { children: React.ReactNode }) {
  const token = useAuthStore((s) => s.token)
@@ -104,6 +105,14 @@ export default function App() {
            </AdminRoute>
          }
        />
        <Route
          path="workers"
          element={
            <AdminRoute>
              <WorkerManagementPage />
            </AdminRoute>
          }
        />
      </Route>
    </Routes>
  </WebSocketProvider>

@@ -0,0 +1,70 @@
import { describe, test, expect } from 'vitest'

describe('WorkerActivity Page', () => {
  test('page module is importable', async () => {
    const module = await import('../../pages/WorkerActivity')
    expect(module.default).toBeDefined()
  })
})

describe('worker API types', () => {
  test('WorkerActivity interface shape', async () => {
    // Type-level check: the interface must have the right keys
    const activity = {
      cad_processing: [],
      active_count: 0,
      failed_count: 0,
      render_jobs: [],
      render_active_count: 0,
      render_failed_count: 0,
    }
    expect(activity.cad_processing).toBeInstanceOf(Array)
    expect(typeof activity.active_count).toBe('number')
  })

  test('CeleryWorker interface shape', () => {
    const worker = {
      name: 'celery@worker1',
      queues: ['thumbnail_rendering'],
      active_task_count: 2,
      active_tasks: [{ name: 'render_still_task', id: 'abc' }],
      total_tasks_processed: { render_still_task: 42 },
    }
    expect(worker.queues).toContain('thumbnail_rendering')
    expect(worker.active_tasks).toHaveLength(1)
  })

  test('QueueStatus interface shape', () => {
    const qs = {
      queue_depths: { step_processing: 3, thumbnail_rendering: 0 },
      pending_count: 3,
      active: [],
      reserved: [],
      pending: [],
    }
    expect(qs.queue_depths).toHaveProperty('step_processing')
    expect(qs.pending_count).toBe(3)
  })
})

describe('worker API functions', () => {
  test('getWorkerActivity is a function', async () => {
    const { getWorkerActivity } = await import('../../api/worker')
    expect(typeof getWorkerActivity).toBe('function')
  })

  test('getCeleryWorkers is a function', async () => {
    const { getCeleryWorkers } = await import('../../api/worker')
    expect(typeof getCeleryWorkers).toBe('function')
  })

  test('scaleWorkers is a function', async () => {
    const { scaleWorkers } = await import('../../api/worker')
    expect(typeof scaleWorkers).toBe('function')
  })

  test('getQueueStatus is a function', async () => {
    const { getQueueStatus } = await import('../../api/worker')
    expect(typeof getQueueStatus).toBe('function')
  })
})
@@ -0,0 +1,67 @@
import { describe, test, expect } from 'vitest'

describe('WorkerManagement Page', () => {
  test('page module is importable', async () => {
    const module = await import('../../pages/WorkerManagement')
    expect(module.default).toBeDefined()
  })
})

describe('media API', () => {
  test('getMediaAssets is a function', async () => {
    const { getMediaAssets } = await import('../../api/media')
    expect(typeof getMediaAssets).toBe('function')
  })

  test('MediaFilter supports cad_file_id', () => {
    // Type-level check: build a filter with cad_file_id
    const filter = { cad_file_id: 'some-uuid', asset_type: 'gltf_geometry' as const }
    expect(filter.cad_file_id).toBe('some-uuid')
  })

  test('MediaAsset interface has all required fields', () => {
    const asset = {
      id: 'uuid',
      tenant_id: null,
      product_id: null,
      cad_file_id: null,
      order_line_id: null,
      workflow_run_id: null,
      asset_type: 'still' as const,
      storage_key: 'path/to/file.png',
      file_size_bytes: 1024,
      mime_type: 'image/png',
      width: 512,
      height: 512,
      duration_s: null,
      render_config: null,
      is_archived: false,
      created_at: new Date().toISOString(),
      download_url: null,
    }
    expect(asset.asset_type).toBe('still')
    expect(asset.cad_file_id).toBeNull()
  })
})

describe('cad API', () => {
  test('generateGltfGeometry is a function', async () => {
    const { generateGltfGeometry } = await import('../../api/cad')
    expect(typeof generateGltfGeometry).toBe('function')
  })

  test('getCadThumbnailUrl returns correct URL', async () => {
    const { getCadThumbnailUrl } = await import('../../api/cad')
    const url = getCadThumbnailUrl('test-uuid')
    expect(url).toBe('/api/cad/test-uuid/thumbnail')
  })
})

describe('Scale request validation', () => {
  test('allowed services', () => {
    const allowed = ['render-worker', 'worker', 'worker-thumbnail']
    expect(allowed).toContain('render-worker')
    expect(allowed).toContain('worker-thumbnail')
    expect(allowed).not.toContain('postgres')
  })
})
@@ -103,3 +103,18 @@ export async function regenerateThumbnail(
  )
  return res.data
}

export interface GenerateGltfResponse {
  status: 'queued'
  task_id: string
  cad_file_id: string
}

/**
 * Queue GLB geometry export from existing STL cache (trimesh, no Blender).
 * The STL low-quality cache must already exist.
 */
export async function generateGltfGeometry(cadFileId: string): Promise<GenerateGltfResponse> {
  const res = await api.post<GenerateGltfResponse>(`/cad/${cadFileId}/generate-gltf-geometry`)
  return res.data
}

@@ -33,6 +33,7 @@ export interface MediaAsset {
export interface MediaFilter {
  product_id?: string
  order_line_id?: string
  cad_file_id?: string
  asset_type?: MediaAssetType
  skip?: number
  limit?: number
@@ -42,6 +43,7 @@ export const getMediaAssets = (filters: MediaFilter = {}): Promise<MediaAsset[]>
  const params = new URLSearchParams()
  if (filters.product_id) params.set('product_id', filters.product_id)
  if (filters.order_line_id) params.set('order_line_id', filters.order_line_id)
  if (filters.cad_file_id) params.set('cad_file_id', filters.cad_file_id)
  if (filters.asset_type) params.set('asset_type', filters.asset_type)
  if (filters.skip !== undefined) params.set('skip', String(filters.skip))
  if (filters.limit !== undefined) params.set('limit', String(filters.limit))

@@ -123,3 +123,46 @@ export async function cancelTask(taskId: string): Promise<{ revoked: string }> {
  const res = await api.post<{ revoked: string }>(`/worker/queue/cancel/${taskId}`)
  return res.data
}

// ---------------------------------------------------------------------------
// Worker management
// ---------------------------------------------------------------------------

export interface CeleryWorkerTask {
  name: string
  id: string
}

export interface CeleryWorker {
  name: string
  queues: string[]
  active_task_count: number
  active_tasks: CeleryWorkerTask[]
  total_tasks_processed: Record<string, number>
}

export interface CeleryWorkersResponse {
  workers: CeleryWorker[]
  error?: string
}

export interface ScaleRequest {
  service: 'render-worker' | 'worker' | 'worker-thumbnail'
  count: number
}

export interface ScaleResponse {
  service: string
  count: number
  status: string
}

export async function getCeleryWorkers(): Promise<CeleryWorkersResponse> {
  const res = await api.get<CeleryWorkersResponse>('/worker/celery-workers')
  return res.data
}

export async function scaleWorkers(req: ScaleRequest): Promise<ScaleResponse> {
  const res = await api.post<ScaleResponse>('/worker/scale', req)
  return res.data
}

@@ -11,24 +11,50 @@ import {
import { Canvas, useThree, useFrame } from '@react-three/fiber'
import { OrbitControls, useGLTF, Environment } from '@react-three/drei'
import { toast } from 'sonner'
import { X, Camera, Loader2, AlertTriangle } from 'lucide-react'
import {
  X, Camera, Loader2, AlertTriangle, Box, Cpu, Download, ChevronDown,
} from 'lucide-react'
import api from '../../api/client'

// ---------------------------------------------------------------------------
// Types
// ---------------------------------------------------------------------------

interface ThreeDViewerProps {
export interface ThreeDViewerProps {
  cadFileId: string
  onClose: () => void
  /** URL for the geometry-only GLB (from STL export) */
  geometryGltfUrl?: string
  /** URL for the production-quality GLB (from asset library render) */
  productionGltfUrl?: string
  /** Download URLs for GLB and .blend assets */
  downloadUrls?: {
    glb?: string
    blend?: string
  }
}

type ViewMode = 'geometry' | 'production'

const ENV_PRESETS = ['city', 'sunset', 'dawn', 'night', 'warehouse', 'forest', 'apartment', 'studio', 'park', 'lobby'] as const
type EnvPreset = typeof ENV_PRESETS[number]

// ---------------------------------------------------------------------------
// Inner model loader – separated so Suspense can catch it
// ---------------------------------------------------------------------------

function GltfModel({ url }: { url: string }) {
function GltfModel({ url, wireframe }: { url: string; wireframe: boolean }) {
  const { scene } = useGLTF(url)

  useEffect(() => {
    scene.traverse((child: any) => {
      if (child.isMesh) {
        child.material = child.material.clone()
        child.material.wireframe = wireframe
      }
    })
  }, [scene, wireframe])

  return <primitive object={scene} />
}

@@ -50,11 +76,7 @@ function ScreenshotCapture({ enabled, cadFileId, onDone }: ScreenshotCaptureProp
    if (!enabled || didCapture.current) return
    didCapture.current = true

    // Grab the canvas as a data-URL after the current frame has been rendered
    const dataUrl = gl.domElement.toDataURL('image/png')

    // Convert data-URL → Blob without a network fetch:
    // data:[<mediatype>][;base64],<data>
    const [header, base64Data] = dataUrl.split(',')
    const mimeMatch = header.match(/:(.*?);/)
    const mimeType = mimeMatch ? mimeMatch[1] : 'image/png'
@@ -64,7 +86,6 @@
      byteArray[i] = byteCharacters.charCodeAt(i)
    }
    const blob = new Blob([byteArray], { type: mimeType })

    const formData = new FormData()
    formData.append('thumbnail', blob, 'thumbnail.png')

@@ -72,14 +93,8 @@
      .post(`/cad/${cadFileId}/regenerate-thumbnail`, formData, {
        headers: { 'Content-Type': 'multipart/form-data' },
      })
      .then(() => {
        toast.success('Thumbnail captured and saved')
      })
      .catch((err: unknown) => {
        const msg = err instanceof Error ? err.message : 'Unknown error'
        console.error('Thumbnail upload failed', msg)
        toast.error('Failed to save thumbnail')
      })
      .then(() => toast.success('Thumbnail captured and saved'))
      .catch(() => toast.error('Failed to save thumbnail'))
      .finally(() => {
        didCapture.current = false
        onDone()
@@ -90,7 +105,7 @@
}

// ---------------------------------------------------------------------------
// Error boundary for the GLTF loader inside Suspense
// Error boundary
// ---------------------------------------------------------------------------

class GltfErrorBoundary extends Component<
@@ -101,15 +116,12 @@ class GltfErrorBoundary
    super(props)
    this.state = { hasError: false }
  }

  static getDerivedStateFromError(): { hasError: boolean } {
    return { hasError: true }
  }

  componentDidCatch(error: Error, _info: ErrorInfo): void {
    this.props.onError(error.message || 'Failed to parse GLTF')
  }

  render(): ReactNode {
    if (this.state.hasError) return null
    return this.props.children
@@ -117,7 +129,7 @@
}

// ---------------------------------------------------------------------------
// Loading overlay (shown while model resolves inside Canvas)
// Loading overlay
// ---------------------------------------------------------------------------

function LoadingOverlay() {
@@ -130,60 +142,199 @@ function LoadingOverlay() {
}

// ---------------------------------------------------------------------------
// Model loader with resolved tracking
// Model loader with ready tracking
// ---------------------------------------------------------------------------

interface ModelWithReadyProps {
  url: string
  wireframe: boolean
  onReady: () => void
}

function ModelWithReady({ url, onReady }: ModelWithReadyProps) {
function ModelWithReady({ url, wireframe, onReady }: ModelWithReadyProps) {
  const { scene } = useGLTF(url)

  useEffect(() => {
    onReady()
  }, [onReady])
    scene.traverse((child: any) => {
      if (child.isMesh) {
        child.material = child.material.clone()
        child.material.wireframe = wireframe
      }
    })
  }, [scene, wireframe])

  useEffect(() => { onReady() }, [onReady])
  return <primitive object={scene} />
}

// ---------------------------------------------------------------------------
// Env preset dropdown
// ---------------------------------------------------------------------------

function EnvDropdown({
  value,
  onChange,
}: {
  value: EnvPreset
  onChange: (v: EnvPreset) => void
}) {
  const [open, setOpen] = useState(false)
  return (
    <div className="relative">
      <button
        onClick={() => setOpen((o) => !o)}
        className="flex items-center gap-1.5 px-3 py-1.5 rounded-md bg-gray-700 hover:bg-gray-600 text-white text-xs font-medium transition-colors"
      >
        {value}
        <ChevronDown size={12} />
      </button>
      {open && (
        <div className="absolute right-0 top-full mt-1 z-50 bg-gray-800 border border-gray-700 rounded-md shadow-xl min-w-[130px]">
          {ENV_PRESETS.map((p) => (
            <button
              key={p}
              onClick={() => { onChange(p); setOpen(false) }}
              className={`w-full text-left px-3 py-1.5 text-xs hover:bg-gray-700 transition-colors ${
                p === value ? 'text-accent font-semibold' : 'text-gray-300'
              }`}
            >
              {p}
            </button>
          ))}
        </div>
      )}
    </div>
  )
}

// ---------------------------------------------------------------------------
// Main exported component
// ---------------------------------------------------------------------------

export default function ThreeDViewer({ cadFileId, onClose }: ThreeDViewerProps) {
  const modelUrl = `/api/cad/${cadFileId}/model`
export default function ThreeDViewer({
  cadFileId,
  onClose,
  geometryGltfUrl,
  productionGltfUrl,
  downloadUrls,
}: ThreeDViewerProps) {
  const defaultUrl = `/api/cad/${cadFileId}/model`

  const [mode, setMode] = useState<ViewMode>('geometry')
  const [wireframe, setWireframe] = useState(false)
  const [envPreset, setEnvPreset] = useState<EnvPreset>('city')
  const [capturing, setCapturing] = useState(false)
  const [loadError, setLoadError] = useState<string | null>(null)
  const [modelReady, setModelReady] = useState(false)

  // Resolve the active model URL based on mode
  const activeUrl =
    mode === 'production' && productionGltfUrl
      ? productionGltfUrl
      : geometryGltfUrl || defaultUrl

  const handleModelReady = useCallback(() => setModelReady(true), [])
  const handleError = useCallback((msg: string) => setLoadError(msg), [])
  const handleCaptureDone = useCallback(() => setCapturing(false), [])

  // Reset ready state when URL changes
  useEffect(() => {
    setModelReady(false)
    setLoadError(null)
  }, [activeUrl])

  function handleDownload(url: string, filename: string) {
    const a = document.createElement('a')
    a.href = url
    a.download = filename
    document.body.appendChild(a)
    a.click()
    document.body.removeChild(a)
  }

  const hasBothModes = !!(geometryGltfUrl && productionGltfUrl)

  return (
    <div className="fixed inset-0 z-50 flex flex-col bg-gray-950">
      {/* ------------------------------------------------------------------ */}
      {/* Toolbar */}
      {/* ------------------------------------------------------------------ */}
      <div className="flex items-center justify-between px-5 py-3 bg-gray-900 border-b border-gray-800 shrink-0">
      {/* Toolbar */}
      <div className="flex items-center justify-between px-5 py-3 bg-gray-900 border-b border-gray-800 shrink-0 gap-3 flex-wrap">
        <span className="text-white font-semibold tracking-wide">3D Viewer</span>
        <div className="flex items-center gap-3">

        <div className="flex items-center gap-2 flex-wrap">
          {/* Mode toggle */}
          {hasBothModes && (
            <div className="flex rounded-md overflow-hidden border border-gray-700">
              <button
                onClick={() => setMode('geometry')}
                className={`flex items-center gap-1.5 px-3 py-1.5 text-xs font-medium transition-colors ${
                  mode === 'geometry'
                    ? 'bg-accent text-white'
                    : 'bg-gray-800 text-gray-300 hover:bg-gray-700'
                }`}
              >
                <Box size={12} />
                Geometry
              </button>
              <button
                onClick={() => setMode('production')}
                className={`flex items-center gap-1.5 px-3 py-1.5 text-xs font-medium transition-colors ${
                  mode === 'production'
                    ? 'bg-accent text-white'
                    : 'bg-gray-800 text-gray-300 hover:bg-gray-700'
                }`}
              >
                <Cpu size={12} />
                Production
              </button>
            </div>
          )}

          {/* Wireframe toggle */}
          <button
            onClick={() => setWireframe((w) => !w)}
            className={`px-3 py-1.5 rounded-md text-xs font-medium transition-colors border ${
              wireframe
                ? 'bg-accent border-accent text-white'
                : 'bg-gray-800 border-gray-700 text-gray-300 hover:bg-gray-700'
            }`}
          >
            Wireframe
          </button>

          {/* Environment preset */}
          <EnvDropdown value={envPreset} onChange={setEnvPreset} />

          {/* Download buttons */}
          {downloadUrls?.glb && (
            <button
              onClick={() => handleDownload(downloadUrls.glb!, `${cadFileId}.glb`)}
              className="flex items-center gap-1.5 px-3 py-1.5 rounded-md bg-gray-700 hover:bg-gray-600 text-white text-xs font-medium transition-colors"
            >
              <Download size={12} />
              GLB
            </button>
          )}
          {downloadUrls?.blend && (
            <button
              onClick={() => handleDownload(downloadUrls.blend!, `${cadFileId}.blend`)}
              className="flex items-center gap-1.5 px-3 py-1.5 rounded-md bg-gray-700 hover:bg-gray-600 text-white text-xs font-medium transition-colors"
            >
              <Download size={12} />
              .blend
            </button>
          )}

          {/* Capture button */}
          <button
            onClick={() => setCapturing(true)}
            disabled={capturing || !modelReady || loadError !== null}
            className="flex items-center gap-2 px-4 py-1.5 rounded-md bg-accent hover:bg-accent-hover disabled:opacity-40 disabled:cursor-not-allowed text-white text-sm font-medium transition-colors"
          >
            {capturing ? (
              <Loader2 size={15} className="animate-spin" />
            ) : (
              <Camera size={15} />
            )}
            {capturing ? <Loader2 size={15} className="animate-spin" /> : <Camera size={15} />}
            {capturing ? 'Capturing…' : 'Capture Angle'}
          </button>

          {/* Close */}
          <button
            onClick={onClose}
            className="p-1.5 rounded-md text-gray-400 hover:text-white hover:bg-gray-700 transition-colors"
@@ -194,11 +345,8 @@ export default function ThreeDViewer({ cadFileId, onClose }: ThreeDViewerProps)
        </div>
      </div>

      {/* ------------------------------------------------------------------ */}
      {/* Viewport */}
      {/* ------------------------------------------------------------------ */}
      {/* Viewport */}
      <div className="relative flex-1">
        {/* Error state */}
        {loadError && (
          <div className="absolute inset-0 flex flex-col items-center justify-center bg-gray-900 text-white gap-4 z-20">
            <AlertTriangle size={48} className="text-red-400" />
@@ -213,34 +361,31 @@
          </div>
        )}

        {/* Loading overlay – visible until model signals ready */}
        {!modelReady && !loadError && <LoadingOverlay />}

        {/* Three.js Canvas */}
        <Canvas
          camera={{ position: [0, 2, 5], fov: 45 }}
          gl={{ preserveDrawingBuffer: true }}
          style={{ width: '100%', height: '100%', background: '#111827' }}
        >
          {/* Lights */}
          <ambientLight intensity={0.5} />
          <directionalLight position={[5, 10, 7]} intensity={1.0} castShadow />
          <directionalLight position={[-5, -5, -5]} intensity={0.25} />

          {/* GLTF model */}
          <GltfErrorBoundary onError={handleError}>
            <Suspense fallback={null}>
              <ModelWithReady url={modelUrl} onReady={handleModelReady} />
              <ModelWithReady
                key={activeUrl}
                url={activeUrl}
                wireframe={wireframe}
                onReady={handleModelReady}
              />
            </Suspense>
          </GltfErrorBoundary>

          {/* Camera controls */}
          <OrbitControls enablePan enableZoom enableRotate minDistance={0.3} maxDistance={100} />
          <Environment preset={envPreset} />

          {/* Environment map for PBR materials */}
          <Environment preset="city" />

          {/* Screenshot capture – only active when triggered */}
          {capturing && (
            <ScreenshotCapture
              enabled={capturing}

@@ -1,5 +1,5 @@
import { Outlet, NavLink, useNavigate, Link } from 'react-router-dom'
import { LayoutDashboard, Package, Settings, LogOut, FlaskConical, Activity, Library, Plus, SlidersHorizontal, Building2, GitBranch, Image, BellRing, Receipt } from 'lucide-react'
import { LayoutDashboard, Package, Settings, LogOut, FlaskConical, Activity, Library, Plus, SlidersHorizontal, Building2, GitBranch, Image, BellRing, Receipt, Server } from 'lucide-react'
import { useAuthStore } from '../../store/auth'
import { clsx } from 'clsx'
import { useQuery } from '@tanstack/react-query'
@@ -152,6 +152,22 @@ export default function Layout() {
            Media Browser
          </NavLink>
        )}
        {(user?.role === 'admin' || user?.role === 'project_manager') && (
          <NavLink
            to="/workers"
            className={({ isActive }) =>
              clsx(
                'flex items-center gap-3 px-3 py-2 rounded-md text-sm font-medium transition-colors',
                isActive
                  ? 'bg-accent-light text-accent'
                  : 'text-content-secondary hover:bg-surface-hover',
              )
            }
          >
            <Server size={18} />
            Workers
          </NavLink>
        )}
        {(user?.role === 'admin' || user?.role === 'project_manager') && (
          <NavLink
            to="/workflows"

@@ -1,36 +1,64 @@
import { useParams, useNavigate } from 'react-router-dom'
import { ArrowLeft } from 'lucide-react'
import { useQuery } from '@tanstack/react-query'
import ThreeDViewer from '../components/cad/ThreeDViewer'
import { getMediaAssets } from '../api/media'

/**
 * Route: /cad/:id
 *
 * Renders the full-screen 3D viewer for a specific CAD file.
 * When the viewer is closed the user is navigated back.
 * Full-screen 3D viewer for a CAD file.
 * Passes geometry/production GLB URLs when matching MediaAssets exist for this CAD file.
 */
export default function CadPreviewPage() {
  const { id } = useParams<{ id: string }>()
  const navigate = useNavigate()

  // Load any geometry GLB that was generated for this CAD file
  const { data: gltfAssets } = useQuery({
    queryKey: ['media-assets', id, 'gltf_geometry'],
    queryFn: () => getMediaAssets({ cad_file_id: id!, asset_type: 'gltf_geometry' }),
    enabled: !!id,
    staleTime: 30_000,
  })

  // Load production GLB if available
  const { data: productionAssets } = useQuery({
    queryKey: ['media-assets', id, 'gltf_production'],
    queryFn: () => getMediaAssets({ cad_file_id: id!, asset_type: 'gltf_production' }),
    enabled: !!id,
    staleTime: 30_000,
  })

  // Load blend assets for download
  const { data: blendAssets } = useQuery({
    queryKey: ['media-assets', id, 'blend_production'],
    queryFn: () => getMediaAssets({ cad_file_id: id!, asset_type: 'blend_production' }),
    enabled: !!id,
    staleTime: 30_000,
  })

  if (!id) {
    return (
      <div className="flex flex-col items-center justify-center h-full text-content-muted gap-4 p-8">
        <p className="text-lg">No CAD file ID provided.</p>
        <button
          onClick={() => navigate(-1)}
          className="flex items-center gap-2 text-sm text-accent hover:underline"
        >
          <ArrowLeft size={16} />
          Go back
        </button>
      <div className="flex items-center justify-center h-full text-content-muted p-8">
        <p>No CAD file ID provided.</p>
      </div>
    )
  }

  const latestGltf = gltfAssets?.[0]
  const latestProduction = productionAssets?.[0]
  const latestBlend = blendAssets?.[0]

  return (
    <ThreeDViewer
      cadFileId={id}
      onClose={() => navigate(-1)}
      geometryGltfUrl={latestGltf?.download_url ?? undefined}
      productionGltfUrl={latestProduction?.download_url ?? undefined}
      downloadUrls={{
        glb: latestGltf?.download_url ?? undefined,
        blend: latestBlend?.download_url ?? undefined,
      }}
    />
  )
}

@@ -1,10 +1,10 @@
import { useState, useCallback, useEffect, Fragment, useMemo } from 'react'
import { useParams, Link } from 'react-router-dom'
import { useParams, Link, useNavigate } from 'react-router-dom'
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query'
import { useDropzone } from 'react-dropzone'
import {
  ArrowLeft, Pencil, Save, X, Box, Image,
  RotateCcw, RefreshCw, Upload, ChevronDown, ChevronRight, Wand2, Download, Plus, Trash2, Filter,
  RotateCcw, RefreshCw, Upload, ChevronDown, ChevronRight, Wand2, Download, Plus, Trash2, Filter, Cuboid,
} from 'lucide-react'
import { toast } from 'sonner'
import {
@@ -18,7 +18,7 @@ import { listMaterials } from '../api/materials'
import MaterialInput from '../components/shared/MaterialInput'
import MaterialWizard from '../components/MaterialWizard'
import { useAuthStore } from '../store/auth'
import { downloadStl, generateStl } from '../api/cad'
import { downloadStl, generateStl, generateGltfGeometry } from '../api/cad'

function CadStatusBadge({ status }: { status: string | null }) {
  if (!status) return (
@@ -48,6 +48,7 @@ const META_FIELDS: Array<{ key: keyof Product; label: string }> = [

export default function ProductDetailPage() {
  const { id } = useParams<{ id: string }>()
  const navigate = useNavigate()
  const qc = useQueryClient()
  const user = useAuthStore((s) => s.user)
  const isPrivileged = user?.role === 'admin' || user?.role === 'project_manager'
@@ -552,6 +553,30 @@ export default function ProductDetailPage() {
          </button>
        </>
      )}
      {product.cad_file_id && (
        <button
          className="btn-secondary text-xs"
          onClick={() => navigate(`/cad/${product.cad_file_id}`)}
          title="Open interactive 3D viewer"
        >
          <Cuboid size={12} />
          View 3D
        </button>
      )}
      {product.cad_file_id && isPrivileged && (
        <button
          className="btn-secondary text-xs"
          onClick={() =>
            generateGltfGeometry(product.cad_file_id!)
              .then(() => toast.info('GLB geometry export queued'))
              .catch(() => toast.error('Failed to queue GLB export'))
          }
          title="Export geometry-only GLB from cached STL (trimesh, no Blender). Requires STL cache."
        >
          <Download size={12} />
          Generate GLB
        </button>
      )}
      {product.cad_file_id && isPrivileged && (
        <div className="flex flex-col gap-1 pt-1 border-t border-border-light">
          <p className="text-xs text-content-muted font-medium">STL</p>

@@ -0,0 +1,281 @@
import { useState } from 'react'
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query'
import { toast } from 'sonner'
import { RefreshCw, ChevronDown, ChevronRight, Cpu, Layers, Minus, Plus } from 'lucide-react'
import {
  getCeleryWorkers,
  getQueueStatus,
  scaleWorkers,
  type CeleryWorker,
  type ScaleRequest,
} from '../api/worker'

// ---------------------------------------------------------------------------
// Worker card
// ---------------------------------------------------------------------------

function WorkerCard({ worker }: { worker: CeleryWorker }) {
  const [expanded, setExpanded] = useState(false)
  return (
    <div className="rounded-xl border border-border-default p-4">
      <div className="flex items-start justify-between gap-3">
        <div className="flex items-center gap-2 min-w-0">
          <Cpu size={16} className="text-accent shrink-0" />
          <span className="text-sm font-medium text-content truncate">{worker.name}</span>
        </div>
        <div className="flex items-center gap-2 shrink-0">
          <span
            className={`text-xs font-semibold px-2 py-0.5 rounded-full ${
              worker.active_task_count > 0
                ? 'bg-blue-500/20 text-blue-400'
                : 'bg-green-500/20 text-green-400'
            }`}
          >
            {worker.active_task_count > 0 ? `${worker.active_task_count} active` : 'idle'}
          </span>
          {worker.active_tasks.length > 0 && (
            <button
              onClick={() => setExpanded((e) => !e)}
              className="text-content-muted hover:text-content transition-colors"
            >
              {expanded ? <ChevronDown size={14} /> : <ChevronRight size={14} />}
            </button>
          )}
        </div>
      </div>

      {/* Queues */}
      <div className="mt-2 flex flex-wrap gap-1">
        {worker.queues.map((q) => (
          <span
            key={q}
            className="text-xs px-2 py-0.5 rounded bg-surface-muted text-content-muted"
          >
            {q}
          </span>
        ))}
      </div>

      {/* Active tasks */}
      {expanded && worker.active_tasks.length > 0 && (
        <div className="mt-3 space-y-1">
          {worker.active_tasks.map((t) => (
            <div key={t.id} className="text-xs text-content-muted font-mono truncate">
              {t.name}
            </div>
          ))}
        </div>
      )}
    </div>
  )
}

// ---------------------------------------------------------------------------
// Scale controls
// ---------------------------------------------------------------------------

type ScalableService = ScaleRequest['service']

const SCALABLE_SERVICES: { service: ScalableService; label: string; description: string }[] = [
  { service: 'render-worker', label: 'Render Worker', description: 'Blender renders — concurrency=1' },
  { service: 'worker', label: 'Step Worker', description: 'STEP processing — concurrency=8' },
  { service: 'worker-thumbnail', label: 'Thumbnail Worker', description: 'Thumbnail rendering' },
]

function ScaleControl({
  service,
  label,
  description,
}: {
  service: ScalableService
  label: string
  description: string
}) {
  const [count, setCount] = useState(1)
  const scaleMut = useMutation({
    mutationFn: () => scaleWorkers({ service, count }),
    onSuccess: (data) => toast.success(`${data.service} → ${data.count} instance(s)`),
    onError: (e: unknown) => {
      const detail = (e as { response?: { data?: { detail?: string } } })?.response?.data?.detail
      toast.error(detail ?? `Failed to scale ${service}`)
    },
  })

  return (
    <div className="rounded-xl border border-border-default p-4 flex items-center justify-between gap-4">
      <div>
        <p className="text-sm font-medium text-content">{label}</p>
        <p className="text-xs text-content-muted mt-0.5">{description}</p>
      </div>
      <div className="flex items-center gap-2 shrink-0">
        <button
          onClick={() => setCount((c) => Math.max(0, c - 1))}
          className="p-1 rounded-md bg-surface-muted hover:bg-surface-hover text-content transition-colors"
        >
          <Minus size={14} />
        </button>
        <span className="w-6 text-center text-sm font-semibold text-content">{count}</span>
        <button
          onClick={() => setCount((c) => Math.min(20, c + 1))}
          className="p-1 rounded-md bg-surface-muted hover:bg-surface-hover text-content transition-colors"
        >
          <Plus size={14} />
        </button>
        <button
          onClick={() => scaleMut.mutate()}
          disabled={scaleMut.isPending}
          className="btn-primary text-xs px-3 py-1.5 ml-2"
        >
          {scaleMut.isPending ? 'Scaling…' : 'Scale'}
        </button>
      </div>
    </div>
  )
}

// ---------------------------------------------------------------------------
// Queue depth bar
// ---------------------------------------------------------------------------

function QueueDepthRow({ queue, depth }: { queue: string; depth: number }) {
  return (
    <div className="flex items-center gap-3">
      <span className="text-sm text-content w-44 truncate font-mono">{queue}</span>
      <div className="flex-1 h-2 rounded-full bg-surface-muted overflow-hidden">
        <div
          className="h-full rounded-full transition-all"
          style={{
            width: `${Math.min(100, depth * 5)}%`,
            backgroundColor: depth > 10 ? 'var(--color-red-500)' : 'var(--color-accent)',
          }}
        />
      </div>
      <span
        className={`text-xs font-semibold w-8 text-right ${
          depth > 10 ? 'text-red-400' : 'text-content-muted'
        }`}
      >
        {depth}
      </span>
    </div>
  )
}

// ---------------------------------------------------------------------------
// Main page
// ---------------------------------------------------------------------------

export default function WorkerManagement() {
  const qc = useQueryClient()

  const { data: workerData, isLoading: workersLoading } = useQuery({
    queryKey: ['celery-workers'],
    queryFn: getCeleryWorkers,
    refetchInterval: 10_000,
  })

  const { data: queueData, isLoading: queuesLoading } = useQuery({
    queryKey: ['queue-status'],
    queryFn: getQueueStatus,
    refetchInterval: 5_000,
  })

  function refresh() {
    qc.invalidateQueries({ queryKey: ['celery-workers'] })
    qc.invalidateQueries({ queryKey: ['queue-status'] })
  }

  const workers = workerData?.workers ?? []
  const queueDepths = queueData?.queue_depths ?? {}

  return (
    <div className="p-8 max-w-5xl mx-auto space-y-8">
      {/* Header */}
      <div className="flex items-center justify-between">
        <div>
          <h1 className="text-2xl font-bold text-content">Worker Management</h1>
          <p className="text-sm text-content-muted mt-1">
            Monitor active Celery workers and scale services up or down.
          </p>
        </div>
        <button onClick={refresh} className="btn-secondary flex items-center gap-2 text-sm">
          <RefreshCw size={14} />
          Refresh
        </button>
      </div>

      {/* Queue depths */}
      <section>
        <div className="flex items-center gap-2 mb-3">
          <Layers size={16} className="text-accent" />
          <h2 className="text-base font-semibold text-content">Queue Depths</h2>
        </div>
        {queuesLoading ? (
          <div className="space-y-2">
            {[0, 1, 2].map((i) => (
              <div key={i} className="h-6 rounded bg-surface-muted animate-pulse" />
            ))}
          </div>
        ) : Object.keys(queueDepths).length === 0 ? (
          <p className="text-sm text-content-muted">No queue data available.</p>
        ) : (
          <div className="rounded-xl border border-border-default p-4 space-y-3">
            {Object.entries(queueDepths).map(([queue, depth]) => (
              <QueueDepthRow key={queue} queue={queue} depth={depth} />
            ))}
          </div>
        )}
      </section>

      {/* Active workers */}
      <section>
        <div className="flex items-center gap-2 mb-3">
          <Cpu size={16} className="text-accent" />
          <h2 className="text-base font-semibold text-content">
            Active Workers
            {workers.length > 0 && (
              <span className="ml-2 text-xs font-normal text-content-muted">
                ({workers.length})
              </span>
            )}
          </h2>
        </div>
        {workersLoading ? (
          <div className="grid grid-cols-2 gap-3">
            {[0, 1].map((i) => (
              <div key={i} className="h-20 rounded-xl bg-surface-muted animate-pulse" />
            ))}
          </div>
        ) : workerData?.error ? (
          <div className="rounded-xl border border-border-default p-4 text-sm text-red-400">
            Failed to fetch workers: {workerData.error}
          </div>
        ) : workers.length === 0 ? (
          <div className="rounded-xl border border-border-default p-4 text-sm text-content-muted">
            No active workers detected. Make sure Celery workers are running.
          </div>
        ) : (
          <div className="grid grid-cols-1 md:grid-cols-2 gap-3">
            {workers.map((w) => (
              <WorkerCard key={w.name} worker={w} />
            ))}
          </div>
        )}
      </section>

      {/* Scale controls */}
      <section>
        <h2 className="text-base font-semibold text-content mb-3">Scale Services</h2>
        <p className="text-xs text-content-muted mb-4">
          Adjust the number of container instances for each service via Docker Compose.
          Changes take effect immediately but are not persisted across deployments.
        </p>
        <div className="space-y-2">
          {SCALABLE_SERVICES.map((s) => (
            <ScaleControl key={s.service} {...s} />
          ))}
        </div>
      </section>
    </div>
  )
}
@@ -1,420 +1,365 @@
# Plan: Phase J (WebSocket) + Turntable Bug + Phase K (Asset Library)
# Plan: Phase N — Workflow pipeline, 3D viewer production mode, worker management, QC tests

## Context

Analysis of the current code state showed: **phases F, G, H, I, L are already fully implemented.**
Four open areas from PLAN.md remain to be completed:

| Phase | Status | Evidence |
|-------|--------|-------|
| F - Hash caching | DONE | `domains/products/cache_service.py` + migration 041 |
| G - Billing | DONE | `domains/billing/` complete, WeasyPrint in Dockerfile |
| H - Excel sanity check | DONE | `domains/imports/service.py run_sanity_check()` + Upload.tsx dialog |
| I - Notification config | DONE | `notification_configs` migration 044, NotificationSettings.tsx |
| L - Dashboard | DONE | AdminDashboard.tsx + ClientDashboard.tsx complete |
| **J - WebSocket** | **MISSING** | No `core/websocket.py`, all polls still active |
1. **Wire up the workflow pipeline**: `workflow_builder.py` contains only broken stubs. `_build_still` passes `order_line_id` as `step_path` to `render_still_task` → would crash. The new `still_with_exports` workflow (still + gltf_export + blend_export) is not implemented. The Celery tasks for export_gltf/export_blend are missing in `domains/rendering/tasks.py`.

Additionally: **critical bug in `render_blender.py`** — the ffmpeg overlay command hangs on a finite frame sequence (no `shortest=1`) → timeout → turntable render fails.
2. **K6: 3D viewer production mode**: `ThreeDViewer.tsx` has no mode toggle, wireframe, env preset, or download buttons. For test data, `POST /api/cad/{id}/generate-gltf-geometry` is needed (trimesh STL→GLB, no Blender required).

---
3. **L3: Worker management UI**: `WorkerManagement.tsx` is missing. The backend needs `/celery-workers` (Celery inspect) and `/scale` (docker compose subprocess). The backend container gets a Docker socket mount.

## Bug fix: turntable ffmpeg timeout

**Root cause**: in `backend/app/services/render_blender.py:507`:
```python
"-filter_complex", "[1:v][0:v]overlay=0:0",
```
The `lavfi color` source stream has no defined length. Without `shortest=1`, ffmpeg waits for
further frames from the color stream after the PNG sequence ends → hangs until timeout (300 s).

**Fix**: `overlay=0:0` → `overlay=0:0:shortest=1`
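The fix above can be sketched as a command builder. This is illustrative only — input order, resolution, and paths are assumptions, not the actual values in `render_blender.py`; the relevant part is the `shortest=1` option on the `overlay` filter, which makes ffmpeg terminate when the shorter input ends.

```python
# Sketch of the corrected ffmpeg invocation (hypothetical paths/flags).
# Input 0 is the endless lavfi color source; input 1 is the finite PNG
# sequence. shortest=1 tells the overlay filter to stop when the shorter
# input (the PNG sequence) runs out, instead of hanging until timeout.
def build_overlay_cmd(frame_pattern: str, out_path: str, fps: int = 24) -> list[str]:
    return [
        "ffmpeg", "-y",
        "-f", "lavfi", "-i", f"color=c=white:s=1920x1080:r={fps}",
        "-framerate", str(fps), "-i", frame_pattern,
        "-filter_complex", "[1:v][0:v]overlay=0:0:shortest=1",
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        out_path,
    ]
```

The returned list is suitable for `subprocess.run(cmd, check=True)`.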

---

## Phase J: WebSocket backend + frontend

### Architecture (ADR-05: native FastAPI + Redis Pub/Sub)

```
Backend task/router:
  -> redis.publish(f"tenant:{tenant_id}", json.dumps(event))

core/websocket.py:
  ConnectionManager: tenant_id -> set[WebSocket]
  background_task: asyncio.Task (redis subscribe loop)

Frontend:
  useWebSocket() hook -> WebSocket('/api/ws')
  Receives events, invalidates React Query caches
```

### Events that are sent:
| Event | Sender | Data |
|-------|--------|-------|
| `render_complete` | step_tasks.py | order_line_id, status, thumbnail_url |
| `render_failed` | step_tasks.py | order_line_id, error |
| `cad_processing_complete` | step_tasks.py | cad_file_id, status |
| `order_status_change` | orders router | order_id, new_status |
| `queue_update` | beat task (every 10 s) | depth per queue |
4. **M: QC tests**: `pytest` is not installed in the backend container. Dockerfile: `pip install -e ".[dev]"`. New service tests for the rendering and orders domains. 2 new Vitest files.

---

## Affected files

### Create new:
- `backend/app/core/websocket.py` — ConnectionManager + Redis pub/sub loop
- `frontend/src/hooks/useWebSocket.ts` — WebSocket hook with auto-reconnect
- `frontend/src/contexts/WebSocketContext.tsx` — context provider

### Modify:
- `backend/app/services/render_blender.py` — ffmpeg shortest=1 bug fix
- `backend/app/main.py` — register WebSocket endpoint (`/api/ws`)
- `backend/app/tasks/step_tasks.py` — emit WebSocket events
- `backend/app/domains/orders/router.py` — emit order status events
- `backend/app/tasks/celery_app.py` — add `broadcast_queue_status` beat task
- `frontend/src/App.tsx` — wrap WebSocketProvider
- `frontend/src/pages/WorkerActivity.tsx` — replace polling with WS
- `frontend/src/pages/OrderDetail.tsx` — replace polling with WS
- `frontend/src/pages/Orders.tsx` — reduce polling
- `frontend/src/components/layout/Layout.tsx` — reduce polling
- `frontend/src/components/layout/NotificationCenter.tsx` — replace polling with WS

### After the Phase J commit — Phase K:
- `backend/alembic/versions/045_asset_libraries.py` — asset_libraries table
- `backend/app/domains/materials/models.py` — add AssetLibrary model
- `backend/app/domains/materials/router.py` — asset library CRUD + upload
- `render-worker/scripts/asset_library.py` — load materials + node groups from .blend
- `render-worker/scripts/catalog_assets.py` — read catalog from .blend
- `render-worker/scripts/export_gltf.py` — GLB export with materials
- `render-worker/scripts/export_blend.py` — .blend export with pack_all()
- `backend/app/domains/rendering/workflow_builder.py` — asset library nodes
- `frontend/src/pages/Admin.tsx` — asset library manager UI
- `frontend/src/api/assetLibraries.ts` — API client
| File | Change |
|-------|----------|
| `backend/app/domains/rendering/tasks.py` | 3 new tasks: `render_order_line_still_task`, `export_gltf_for_order_line_task`, `export_blend_for_order_line_task` |
| `backend/app/domains/rendering/workflow_builder.py` | Replace stubs with order-line-aware tasks, add `still_with_exports` |
| `backend/app/api/routers/cad.py` | `POST /{id}/generate-gltf-geometry` endpoint |
| `backend/app/api/routers/worker.py` | `GET /celery-workers`, `POST /scale` endpoints |
| `backend/Dockerfile` | `pip install -e ".[dev]"` |
| `docker-compose.yml` | Backend + worker: Docker socket + compose file mount |
| `frontend/src/components/cad/ThreeDViewer.tsx` | Mode toggle, wireframe, env preset, download buttons |
| `frontend/src/pages/WorkerManagement.tsx` | NEW: worker list, queue stats, scale button |
| `frontend/src/api/worker.ts` | New interfaces + API functions |
| `frontend/src/App.tsx` | Route for /workers |
| `frontend/src/components/layout/Layout.tsx` | Sidebar link Workers |
| `backend/tests/domains/test_rendering_service.py` | NEW: ≥5 tests for rendering tasks and workflow builder |
| `backend/tests/domains/test_orders_service.py` | NEW: ≥5 tests for orders endpoints |
| `frontend/src/__tests__/pages/WorkerActivity.test.tsx` | NEW: Vitest tests |
| `frontend/src/__tests__/pages/WorkerManagement.test.tsx` | NEW: Vitest tests |
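The `POST /scale` endpoint from the table above could shell out to Docker Compose roughly as follows. This is a sketch under assumptions: the compose file path is hypothetical, and the service names and the 0–20 bound mirror the `SCALABLE_SERVICES` list and the UI's count clamp rather than a confirmed backend implementation.

```python
# Sketch: build the docker compose scale command for the /scale endpoint.
# Only a fixed allow-list of services may be scaled, so the endpoint never
# passes arbitrary client input to the subprocess.
ALLOWED_SERVICES = {"render-worker", "worker", "worker-thumbnail"}

def build_scale_cmd(service: str, count: int,
                    compose_file: str = "/app/docker-compose.yml") -> list[str]:
    if service not in ALLOWED_SERVICES:
        raise ValueError(f"refusing to scale unknown service: {service}")
    if not 0 <= count <= 20:
        raise ValueError("count out of range (0-20)")
    return [
        "docker", "compose", "-f", compose_file,
        "up", "-d", "--no-recreate",
        "--scale", f"{service}={count}", service,
    ]
```

The endpoint would run this via `subprocess.run(cmd, check=True, capture_output=True)` and return the new count; the docker socket mount in `docker-compose.yml` is what makes the call possible from inside the backend container.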

---

## Tasks (in order)

### Task 1: Bug fix ffmpeg turntable timeout [x]
- **File**: `backend/app/services/render_blender.py:507`
- **What**: `"[1:v][0:v]overlay=0:0"` → `"[1:v][0:v]overlay=0:0:shortest=1"`
- **Acceptance criterion**: the turntable render for order f0436188 can be restarted and produces an MP4
- **Dependencies**: none
### Task 1: Backend — new order-line-aware rendering tasks
- **File**: `backend/app/domains/rendering/tasks.py`
- **What**: add three new Celery tasks (BELOW the existing tasks):

### Task 2: WebSocket backend — core/websocket.py [x]
- **File**: `backend/app/core/websocket.py` (new)
- **What**:
```python
class ConnectionManager:
    _connections: dict[str, set[WebSocket]]  # tenant_id -> sockets
    async def connect(ws, tenant_id)
    def disconnect(ws, tenant_id)
    async def broadcast_to_tenant(tenant_id, event: dict)
    async def start_redis_subscriber()  # asyncio background task
**`render_order_line_still_task(order_line_id, **params)`** — queue `thumbnail_rendering`:
- Loads OrderLine + CadFile via sync SQLAlchemy (like `publish_asset`)
- Sets `render_status = 'processing'`
- Calls `render_still()` from `app.services.render_blender`
- Sets `render_status = 'completed'`, stores `render_log`
- On error: `render_status = 'failed'`
- Returns a dict with `output_path`

def publish_event_sync(tenant_id: str, event: dict):
    # sync version for Celery tasks — redis.publish()
```
- Redis pub/sub: subscribe to `tenant:*` channels
- On message: notify all WebSockets of that tenant
- Auto-ping every 30 s against disconnects
- **Acceptance criterion**: broadcast_to_tenant sends to all connected WS of the tenant
- **Dependencies**: none
**`export_gltf_for_order_line_task(order_line_id)`** — queue `thumbnail_rendering`:
- Loads OrderLine + CadFile sync
- Looks up the STL cache (`{step_stem}_low.stl`)
- Calls a Blender subprocess with `export_gltf.py`: `blender --background --python export_gltf.py -- --stl_path X --output_path Y`
- Uploads the GLB to MinIO `production-exports/{cad_file_id}/{order_line_id}.glb`
- Creates `MediaAsset(asset_type=gltf_production, storage_key=...)`
- Returns `storage_key`
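The Blender subprocess call in the bullets above can be sketched as a command builder. The script name and the `--stl_path`/`--output_path` flags come from the plan text; the timeout and the split into a pure builder function are assumptions.

```python
# Sketch: assemble the headless Blender command for export_gltf.py.
# Everything after the bare "--" is ignored by Blender itself and handed
# to the script's own sys.argv parsing.
def build_blender_export_cmd(script: str, stl_path: str, output_path: str) -> list[str]:
    return [
        "blender", "--background", "--python", script,
        "--",
        "--stl_path", stl_path,
        "--output_path", output_path,
    ]
```

The task would then run it with `subprocess.run(cmd, check=True, timeout=300)` before uploading the resulting GLB to MinIO.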

### Task 3: WebSocket endpoint in main.py [x]
- **File**: `backend/app/main.py`
- **What**:
```python
@app.websocket("/api/ws")
async def ws_endpoint(websocket: WebSocket, token: str = Query(...)):
    user = await verify_ws_token(token)
    await manager.connect(websocket, str(user.tenant_id))
    try:
        while True:
            await websocket.receive_text()  # keep-alive pings
    except WebSocketDisconnect:
        manager.disconnect(websocket, str(user.tenant_id))
```
- Token auth via query parameter (WS cannot send an Authorization header)
- `verify_ws_token`: JWT decode, load user (analogous to get_current_user)
- `manager` as a global instance, started in the lifespan
- **Acceptance criterion**: `ws://localhost:8888/api/ws?token=<jwt>` opens a connection
- **Dependencies**: Task 2
**`export_blend_for_order_line_task(order_line_id)`** — queue `thumbnail_rendering`:
- Analogous to export_gltf, but with `export_blend.py`
- MediaAsset type: `blend_production`

### Task 4: WebSocket events in step_tasks.py [x]
- **File**: `backend/app/tasks/step_tasks.py`
- **What**: in render_order_line_task and render_step_thumbnail after success/failure:
```python
from app.core.websocket import publish_event_sync
# on render complete:
publish_event_sync(tenant_id, {"type": "render_complete", "order_line_id": str(line.id), "status": "completed"})
# on render failed:
publish_event_sync(tenant_id, {"type": "render_failed", "order_line_id": str(line.id), "error": str(exc)})
# on CAD processing complete:
publish_event_sync(tenant_id, {"type": "cad_processing_complete", "cad_file_id": str(cad_file.id), "status": "completed"})
```
- Load tenant_id from cad_file.tenant_id, or via order_line -> order -> user.tenant_id
- **Acceptance criterion**: render finished -> WebSocket client receives the event
- **Dependencies**: Task 2
- **Acceptance criterion**: tasks present in `domains/rendering/tasks.py`, no import errors
- **Dependencies**: none
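The core of the fix named in Task 1 — resolving `step_path` from the database instead of handing `order_line_id` to the render task — can be sketched with plain dataclasses standing in for the ORM models. Model and attribute names (`Product.cad_file`, `CadFile.stored_path`) are assumptions mirroring the plan, not the verified schema.

```python
# Minimal sketch of the step_path resolution chain
# OrderLine -> Product -> CadFile -> stored_path.
from dataclasses import dataclass

@dataclass
class CadFile:
    stored_path: str

@dataclass
class Product:
    cad_file: CadFile

@dataclass
class OrderLine:
    id: str
    product: Product

def resolve_step_path(line: OrderLine) -> str:
    # The broken stub passed line.id (a UUID) here, so Blender tried to
    # open the UUID as a file path. The task must walk to the CAD file's
    # actual stored path before invoking the renderer.
    return line.product.cad_file.stored_path
```

In the real task this lookup happens via a sync SQLAlchemy session, as the Task 1 bullets describe; the dataclasses only illustrate the traversal.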
|
||||
|
||||
### Task 5: WebSocket Events in orders router [x]
|
||||
- **Datei**: `backend/app/domains/orders/router.py`
|
||||
- **Was**: Bei Order-Status-Aenderung (submit, complete, cancel):
|
||||
```python
|
||||
from app.core.websocket import manager
|
||||
await manager.broadcast_to_tenant(
|
||||
str(current_user.tenant_id),
|
||||
{"type": "order_status_change", "order_id": str(order.id), "status": new_status}
|
||||
)
|
||||
```
|
||||
- **Akzeptanzkriterium**: Order-Submit -> WebSocket-Event geht an alle Browser-Tabs des Tenants
|
||||
- **Abhaengigkeiten**: Task 2
|
||||
|
||||
### Task 6: Queue-Update Beat-Task [x]
|
||||
- **Datei**: `backend/app/tasks/celery_app.py`
|
||||
- **Was**: Neuer Beat-Task alle 10s:
|
||||
```python
|
||||
@shared_task(name="beat.broadcast_queue_status", queue="step_processing")
|
||||
def broadcast_queue_status():
|
||||
from app.core.websocket import publish_event_sync
|
||||
from redis import Redis
|
||||
r = Redis.from_url(settings.redis_url)
|
||||
depths = {
|
||||
"step_processing": r.llen("step_processing"),
|
||||
"thumbnail_rendering": r.llen("thumbnail_rendering"),
|
||||
}
|
||||
# Broadcast an alle Tenants (broadcast_all)
|
||||
r.publish("__broadcast__", json.dumps({"type": "queue_update", "depths": depths}))
|
||||
```
|
||||
- `__broadcast__` Channel: wird an ALLE verbundenen WS gesendet (nicht tenant-spezifisch)
|
||||
- ConnectionManager subscribt auch auf `__broadcast__`
|
||||
- **Akzeptanzkriterium**: WorkerActivity-Queue-Tiefe aktualisiert alle 10s automatisch
|
||||
- **Abhaengigkeiten**: Task 2
|
||||
|
||||
### Task 7: Frontend WebSocket Hook [x]
|
||||
- **Datei**: `frontend/src/hooks/useWebSocket.ts` (neu)
|
||||
- **Was**:
|
||||
```typescript
|
||||
export function useWebSocketConnection() {
|
||||
// Verbindet zu ws://localhost:8888/api/ws?token=<jwt>
|
||||
// Auto-Reconnect: 1s, 2s, 4s, 8s, ... max 30s
|
||||
// Emittiert Events via onMessage callback
|
||||
// Pings alle 25s (keep-alive)
|
||||
// Trennt Verbindung bei Logout
|
||||
}
|
||||
```
|
||||
- **Akzeptanzkriterium**: Verbindung bleibt offen, reconnected nach Netzwerktrennung
|
||||
- **Abhaengigkeiten**: keine
|
||||
|
||||
### Task 8: Frontend WebSocket Context [x]
|
||||
- **Datei**: `frontend/src/contexts/WebSocketContext.tsx` (neu), `frontend/src/App.tsx` aendern
|
||||
- **Was**:
|
||||
```typescript
|
||||
export function WebSocketProvider({ children }) {
|
||||
const queryClient = useQueryClient()
|
||||
// on 'render_complete': invalidateQueries(['orders', order_line_id])
|
||||
// on 'render_failed': invalidateQueries(['orders', order_line_id])
|
||||
// on 'cad_processing_complete': invalidateQueries(['cad-activity'])
|
||||
// on 'order_status_change': invalidateQueries(['orders'])
|
||||
// on 'queue_update': queryClient.setQueryData(['queue-status'], ...)
|
||||
}
|
||||
// App.tsx: <WebSocketProvider> um <Router> wrappen
|
||||
```
|
||||
- **Akzeptanzkriterium**: render_complete Event -> OrderDetail aktualisiert ohne Poll-Interval
|
||||
- **Abhaengigkeiten**: Task 7
|
||||
|
||||
### Task 9: Polling ersetzen -- WorkerActivity.tsx [x]
|
||||
- **Datei**: `frontend/src/pages/WorkerActivity.tsx`
|
||||
- **Was**:
|
||||
- `refetchInterval: 5000` entfernen -- bei `cad_processing_complete` invalidieren
|
||||
- `refetchInterval: 3000` fuer Queue-Status entfernen -- bei `queue_update` setQueryData
|
||||
- **Akzeptanzkriterium**: Keine automatischen HTTP-Requests im Network-Tab (nur WS-Frames)
|
||||
- **Abhaengigkeiten**: Task 8
|
||||
|
||||
### Task 10: Polling ersetzen -- OrderDetail.tsx [x]
|
||||
- **Datei**: `frontend/src/pages/OrderDetail.tsx`
|
||||
- **Was**:
|
||||
- `refetchInterval: (query) => {...}` entfernen
|
||||
- Stattdessen: bei `render_complete` / `render_failed` fuer matching order_line_id -> invalidate
|
||||
- **Akzeptanzkriterium**: Render-Status in OrderDetail aktualisiert live ohne Poll
|
||||
- **Abhaengigkeiten**: Task 8
|
||||
|
||||
### Task 11: Polling reduzieren -- Layout.tsx + NotificationCenter.tsx [x]
|
||||
- **Dateien**: `frontend/src/components/layout/Layout.tsx`, `NotificationCenter.tsx`
|
||||
- **Was**:
|
||||
- Layout: `refetchInterval: 8000` -> 60000 (1min)
|
||||
- NotificationCenter: `refetchInterval: 15_000` -> 60000; bei `order_status_change` zusaetzlich invalidieren
|
||||
- **Akzeptanzkriterium**: Signifikant weniger Poll-Requests im Network-Tab
|
||||
- **Abhaengigkeiten**: Task 8
|
||||
|
||||
### Task 12: PLAN.md + LEARNINGS.md + Commit [x]
|
||||
- **Was**:
|
||||
- PLAN.md: Phase J als ABGESCHLOSSEN markieren, Status auf "Phase K als naechstes"
|
||||
- LEARNINGS.md: ffmpeg `shortest=1` Learning + WebSocket Auth via Query-Param Learning
|
||||
- `git commit -m "feat(J): WebSocket live-events + replace polling + fix ffmpeg turntable timeout"`
|
||||
- **Abhaengigkeiten**: Tasks 1-11
|
||||
|
||||
---
|
||||
|
||||
## Phase K Tasks (nach Commit)
|
||||
|
||||
### Task K1: Migration 045 + AssetLibrary model [x]

- **Files**: `backend/alembic/versions/045_asset_libraries.py` (new, autogenerate), `domains/materials/models.py`
- **What**:

```python
class AssetLibrary(Base):
    # Schema sketch, not literal SQLAlchemy syntax:
    id: UUID PK, tenant_id FK nullable, name VARCHAR(200)
    blend_file_key TEXT,  # MinIO key
    catalog JSONB,        # {materials: [...], node_groups: [...]}
    description TEXT, is_active BOOL, created_at TIMESTAMP
```

- `render_templates.asset_library_id` FK optional (nullable)
- `output_types.asset_library_id` FK optional (nullable)
- **Acceptance criterion**: `alembic upgrade head` succeeds, `asset_libraries` table exists in the DB

### Task K2: Asset library CRUD backend [x]

- **Files**: `backend/app/domains/materials/router.py` + `service.py` + `schemas.py`
- **What**:
  - `POST /api/asset-libraries` -- .blend upload -> MinIO `asset-libraries/{id}.blend` -> queues a catalog refresh
  - `GET /api/asset-libraries` -- list
  - `GET /api/asset-libraries/{id}/catalog` -- materials + node groups
  - `DELETE /api/asset-libraries/{id}` -- only when not in use (FK check)
  - `AssetLibraryOut` schema with a `catalog` field
- **Acceptance criterion**: POST + GET work, .blend stored in MinIO

### Task K3: Catalog-refresh Celery task + Blender script [x]

- **Files**: `backend/app/domains/materials/tasks.py` (new), `render-worker/scripts/catalog_assets.py` (new)
- **What**:
  - Celery task `refresh_asset_library_catalog(asset_library_id)` on queue `thumbnail_rendering`
  - Downloads the .blend from MinIO into a tmpdir
  - Runs `blender --background --python catalog_assets.py -- <blend_path>`
  - `catalog_assets.py`: opens the .blend and lists all assets marked as such:

```python
import bpy, json, sys

# Everything after '--' belongs to the script, not to Blender itself.
blend_path = sys.argv[sys.argv.index('--') + 1]
bpy.ops.wm.open_mainfile(filepath=blend_path)
catalog = {
    "materials": [m.name for m in bpy.data.materials if m.asset_data],
    "node_groups": [ng.name for ng in bpy.data.node_groups if ng.asset_data],
}
print(json.dumps(catalog))
```

  - Writes the catalog into `asset_libraries.catalog JSONB`
- **Acceptance criterion**: after a .blend upload, the `catalog` JSONB contains the asset names

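Back in the Celery task, the catalog has to be recovered from Blender's stdout, which mixes in startup and quit messages around the one JSON line. A minimal sketch of that parsing step (the helper name is illustrative, not from the codebase):

```python
import json

def parse_catalog_output(stdout: str) -> dict:
    """Extract the catalog JSON that catalog_assets.py prints.

    Blender emits its own log lines before and after the script output,
    so scan from the end for the first line that parses as a JSON object
    with the expected keys.
    """
    for line in reversed(stdout.strip().splitlines()):
        line = line.strip()
        if not line.startswith("{"):
            continue
        try:
            data = json.loads(line)
        except json.JSONDecodeError:
            continue
        if "materials" in data and "node_groups" in data:
            return data
    raise ValueError("no catalog JSON found in Blender output")
```

Scanning from the end also keeps the parser stable if an asset name itself ever appears in an earlier log line.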
### Task K4: Blender asset-library apply script [x]

- **File**: `render-worker/scripts/asset_library.py` (new)
- **What**:

```python
def apply_asset_library_materials(blend_path: str, material_map: dict) -> None:
    """Link materials from the asset-library .blend and apply them to mesh parts."""
    with bpy.data.libraries.load(blend_path, link=True, assets_only=True) as (src, dst):
        dst.materials = [n for n in src.materials if n in material_map.values()]
    for obj in bpy.data.objects:
        if obj.type == 'MESH':
            for slot in obj.material_slots:
                resolved = material_map.get(slot.material.name if slot.material else '')
                if resolved and resolved in bpy.data.materials:
                    slot.material = bpy.data.materials[resolved]


def apply_asset_library_modifiers(blend_path: str, modifier_map: dict) -> None:
    """Link geometry-node groups and apply them as modifiers."""
    with bpy.data.libraries.load(blend_path, link=True, assets_only=True) as (src, dst):
        dst.node_groups = [n for n in src.node_groups if n in modifier_map.values()]
    for obj in bpy.data.objects:
        if obj.type == 'MESH':
            for part_name, mod_name in modifier_map.items():
                if part_name.lower() in obj.name.lower():
                    mod = obj.modifiers.new(name=mod_name, type='NODES')
                    # .get() returns None if linking the group failed; the
                    # modifier then stays empty instead of raising.
                    mod.node_group = bpy.data.node_groups.get(mod_name)
```

- **Acceptance criterion**: a render with an asset library shows the correct production materials

### Task K5: export_gltf + export_blend scripts [x]

- **Files**: `render-worker/scripts/export_gltf.py` (new), `render-worker/scripts/export_blend.py` (new)
- **What**:
  - `export_gltf.py`:
    1. Import the STL (`bpy.ops.import_mesh.stl`)
    2. Load the asset library via `apply_asset_library_materials` + `apply_asset_library_modifiers`
    3. `bpy.ops.export_scene.gltf(filepath=out, export_format='GLB', export_apply=True, export_draco_mesh_compression_enable=True)`
    4. Upload the output to MinIO `production-exports/{cad_file_id}/{run_id}.glb`
    5. MediaAsset record with `asset_type=gltf_production`
  - `export_blend.py`:
    1. Load STL + asset library (same as export_gltf)
    2. `bpy.ops.file.pack_all()`
    3. `bpy.ops.wm.save_as_mainfile(filepath=out, compress=True, copy=True)`
    4. MediaAsset record with `asset_type=blend_production`
- **Acceptance criterion**: the downloaded GLB opens in the Three.js viewer with materials

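Both scripts write to the same key layout and record a matching MediaAsset type, so pinning that convention in one helper keeps them in sync. A sketch (the helper name and the constant are assumptions, not from the codebase):

```python
# Maps file extension to the MediaAsset asset_type recorded alongside it.
ASSET_TYPE_BY_EXT = {"glb": "gltf_production", "blend": "blend_production"}

def production_export_key(cad_file_id: str, run_id: str, ext: str) -> tuple[str, str]:
    """Return (MinIO key, asset_type) for a production export.

    Key layout from the steps above: production-exports/{cad_file_id}/{run_id}.{ext}
    """
    if ext not in ASSET_TYPE_BY_EXT:
        raise ValueError(f"unsupported export extension: {ext}")
    return f"production-exports/{cad_file_id}/{run_id}.{ext}", ASSET_TYPE_BY_EXT[ext]
```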
### Task K6: Workflow builder -- asset-library nodes [x]

- **File**: `backend/app/domains/rendering/workflow_builder.py`
- **What**:
  - New Celery tasks: `apply_asset_library_materials_task`, `apply_asset_library_modifiers_task`, `export_gltf_task`, `export_blend_task`
  - New workflow type `still_production`:

```python
# still_production canvas (phase K):
chain(
    convert_step.si(order_line_id),
    group(
        chain(apply_asset_library_materials.si(order_line_id), render_still.si(order_line_id)),
        chain(apply_asset_library_materials.si(order_line_id), export_gltf.si(order_line_id)),
        chain(apply_asset_library_materials.si(order_line_id), export_blend.si(order_line_id)),
    ),
    generate_thumbnail.si(order_line_id),
    publish_asset.si(order_line_id),
)
```

- **Acceptance criterion**: dispatching a `still_production` workflow -> PNG + GLB + .blend produced

### Task K7: Asset-library management UI [x]

- **Files**: `frontend/src/api/assetLibraries.ts` (new), extend `frontend/src/pages/Admin.tsx`
- **What**:
  - API client: `getAssetLibraries`, `uploadAssetLibrary` (multipart), `deleteAssetLibrary`, `getAssetLibraryCatalog`
  - Admin.tsx: new "Asset Libraries" panel (after Render Templates)
    - Upload button + drag & drop
    - Table: name, material count, node-group count, actions
    - Catalog detail: material badge list (green) + node-group badge list (blue)
  - OutputTypeTable: asset-library dropdown column
- **Acceptance criterion**: an admin can upload a .blend, see the catalog, and assign it to an OutputType

### Task 2: Backend — repair workflow_builder.py + still_with_exports

- **File**: `backend/app/domains/rendering/workflow_builder.py`
- **What**:
  - `_build_still`: use `render_order_line_still_task` instead of `render_still_task`
  - `_build_turntable`: keep `render_turntable_task` for now (file-path based, works via the legacy path)
  - `_build_multi_angle`: use `render_order_line_still_task` with a `camera_angle` param
  - **NEW** `_build_still_with_exports(order_line_id, params)`:

```python
from celery import chain

return chain(
    render_order_line_still_task.si(order_line_id, **params),
    export_gltf_for_order_line_task.si(order_line_id),
    export_blend_for_order_line_task.si(order_line_id),
)
```

  - `dispatch_workflow()`: add `"still_with_exports"` to `builders`
- **Acceptance criterion**: `dispatch_workflow("still_with_exports", order_line_id)` raises no exception
- **Dependencies**: Task 1

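The `builders` registry described for `dispatch_workflow()` boils down to a name-to-builder lookup that raises on unknown types. A minimal sketch with stub builders standing in for the real `_build_*` functions (the stubs are illustrative, not the real canvas code):

```python
# Stub builders standing in for _build_still / _build_still_with_exports.
def _build_still(order_line_id, params):
    return ("still", order_line_id)

def _build_still_with_exports(order_line_id, params):
    return ("still_with_exports", order_line_id)

builders = {
    "still": _build_still,
    "still_with_exports": _build_still_with_exports,
}

def dispatch_workflow(workflow_type, order_line_id, params=None):
    """Look up the builder for workflow_type; unknown types raise ValueError."""
    builder = builders.get(workflow_type)
    if builder is None:
        raise ValueError(f"unknown workflow type: {workflow_type}")
    return builder(order_line_id, params or {})
```

The dict lookup keeps adding a workflow type to a one-line registry change, which is exactly what "add `still_with_exports` to `builders`" amounts to.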
### Task K8: PLAN.md + LEARNINGS.md + commit [x]

- **What**:
  - PLAN.md: mark Phase K as COMPLETED
  - LEARNINGS.md: asset-library `link=True` pattern, GLB export via the Blender API
  - `git commit -m "feat(K): Blender Asset Library + production exports (GLB + .blend)"`

### Task 3: Backend — generate-gltf-geometry endpoint (test data for K6)

- **File**: `backend/app/api/routers/cad.py`
- **What**: new endpoint `POST /api/cad/{id}/generate-gltf-geometry` (require_admin_or_pm):
  - Checks that the CadFile exists and an STL cache is present (`{step_dir}/{stem}_low.stl`)
  - Queues the new Celery task `generate_gltf_geometry_task.delay(str(cad_file.id))`
  - Returns `{"task_id": ..., "message": "GLB generation queued"}`

New task `generate_gltf_geometry_task` in `domains/rendering/tasks.py` (queue `thumbnail_rendering`):

- Loads the CadFile synchronously, locates the STL cache
- **Uses trimesh** (no Blender): `import trimesh; mesh = trimesh.load(stl_path); mesh.export(glb_path)`
  - Why trimesh: fast, no Blender required, runs on the worker container (trimesh is in the pyproject.toml `cad` extras)
- Uploads the GLB to MinIO `uploads/{cad_file_id}/geometry.glb`
- Creates/updates `MediaAsset(asset_type=gltf_geometry, storage_key=..., cad_file_id=...)`
  - `MediaAsset` needs a `cad_file_id` FK — verify it exists

**Important**: check whether `media_assets.cad_file_id` exists. If not, migration 047 is required.

- **Acceptance criterion**: `POST /api/cad/{id}/generate-gltf-geometry` returns 202; after the task runs, a MediaAsset with type=gltf_geometry exists
- **Dependencies**: Task 1

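The conversion core of `generate_gltf_geometry_task` fits in a few lines. A sketch where `geometry_glb_key` is an illustrative helper and `trimesh` is imported lazily, so the module still loads on containers without the CAD extras installed:

```python
def geometry_glb_key(cad_file_id: str) -> str:
    # Key layout from above: uploads/{cad_file_id}/geometry.glb
    return f"uploads/{cad_file_id}/geometry.glb"

def stl_to_glb(stl_path: str, glb_path: str) -> str:
    """Convert the cached STL to GLB via trimesh -- no Blender involved."""
    import trimesh  # optional 'cad' extra; imported lazily on purpose

    mesh = trimesh.load(stl_path)
    mesh.export(glb_path)  # exporter chosen from the .glb suffix
    return glb_path
```

The lazy import mirrors the risk note at the end of this plan: the worker image must ship trimesh, but other services importing the tasks module should not fail at import time.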
### Task 4: Migration 047 — media_assets.cad_file_id (if needed)

- **File**: `backend/alembic/versions/047_media_assets_cad_file_id.py`
- **What**: nullable FK `cad_file_id UUID REFERENCES cad_files(id) ON DELETE SET NULL` on `media_assets`
- **Check first**: `grep -n "cad_file_id" backend/app/domains/media/models.py` — if it already exists, skip this task
- **Acceptance criterion**: `alembic upgrade head` succeeds
- **Dependencies**: none

### Task 5: ThreeDViewer.tsx — production mode, wireframe, env preset, downloads

- **File**: `frontend/src/components/cad/ThreeDViewer.tsx`
- **What**: extend the props + the toolbar:

```typescript
interface ThreeDViewerProps {
  cadFileId: string
  onClose: () => void
  productionGltfUrl?: string // when present: show the mode toggle
  downloadUrls?: { glb?: string; blend?: string }
}
```

**New state:**

- `mode: 'geometry' | 'production'` (default: 'geometry')
- `wireframe: boolean` (default: false)
- `envPreset: 'city' | 'studio' | 'sunset'` (default: 'city')

**Toolbar** (new, to the right of the "Capture Angle" button):

- Mode toggle (only when `productionGltfUrl` is set): button group "Geometry | Production"
- Wireframe toggle: button
- Env-preset dropdown: `<select>` with city/studio/sunset
- Download buttons (when `downloadUrls` is set): download icon + "GLB" + optional "BLEND"

**Canvas changes:**

- `Environment preset={envPreset}` (now configurable; previously hardcoded to "city")
- `WireframeToggle` component: sets `material.wireframe = wireframe` on all mesh children
- Model URL: `mode === 'production' && productionGltfUrl ? productionGltfUrl : modelUrl`

**GltfErrorBoundary**: reset on mode switch (change the key prop)

- **Acceptance criterion**: the mode toggle appears when `productionGltfUrl` is present, the wireframe toggle switches, the env preset changes the lighting
- **Dependencies**: none

### Task 6: CadPreview.tsx — pass production asset URLs

- **File**: `frontend/src/pages/CadPreview.tsx`
- **What**: when opening the ThreeDViewer:
  - `GET /api/media-assets?cad_file_id={id}&asset_type=gltf_geometry` (or gltf_production when available)
  - Fetch download URLs for GLB + BLEND
  - `<ThreeDViewer productionGltfUrl={...} downloadUrls={...} />`
  - "Generate GLB" button (admin/PM): calls `POST /api/cad/{id}/generate-gltf-geometry` + toast + reload
- **Acceptance criterion**: existing MediaAssets are passed as production URLs
- **Dependencies**: Task 3, Task 5

### Task 7: Media API — assets by cad_file_id query parameter

- **File**: `backend/app/domains/media/router.py`
- **What**: `GET /api/media-assets?cad_file_id={uuid}` — add the query param to `list_assets` (optional, nullable)
  - Extend `list_media_assets(db, cad_file_id=...)` in service.py accordingly
- **Acceptance criterion**: `GET /api/media-assets?cad_file_id=abc` returns only that CadFile's assets
- **Dependencies**: Task 4

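The intended semantics of the optional parameters — each filter applies only when its value is provided, otherwise it is a no-op — can be stated over plain dicts, independent of the actual SQLAlchemy query:

```python
def filter_media_assets(assets, cad_file_id=None, asset_type=None):
    """Apply each filter only when its parameter is provided."""
    result = assets
    if cad_file_id is not None:
        result = [a for a in result if a.get("cad_file_id") == cad_file_id]
    if asset_type is not None:
        result = [a for a in result if a.get("asset_type") == asset_type]
    return result
```

In the real service the same shape becomes conditional `.where()` clauses on the select statement.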
### Task 8: Frontend API — extend media.ts + cad.ts

- **Files**: `frontend/src/api/media.ts`, `frontend/src/api/cad.ts`
- **What**:
  - `media.ts`: `listMediaAssets(params: {cad_file_id?: string, asset_type?: string}): Promise<MediaAsset[]>`
  - `cad.ts`: `generateGltfGeometry(cadFileId: string): Promise<{task_id: string}>`
  - Add `cad_file_id?: string` to the `MediaAsset` interface (if not already present)
- **Acceptance criterion**: TypeScript compiles without errors
- **Dependencies**: Task 7

### Task 9: Backend — worker management endpoints

- **File**: `backend/app/api/routers/worker.py`
- **What**: two new endpoints (require_admin):

**`GET /api/worker/celery-workers`**:

```python
from app.tasks.celery_app import celery_app

# timeout so the broadcast cannot block when no worker is running
inspect = celery_app.control.inspect(timeout=2)
active = inspect.active() or {}
stats = inspect.stats() or {}
# aggregate: worker_name, hostname, active_tasks_count, queues
```

Response: `list[CeleryWorkerInfo]` with the fields `worker_name, hostname, active_tasks, status`

**`POST /api/worker/scale`** (body: `{service: "render-worker"|"worker", count: int}`):

```python
import os
import subprocess

compose_file = os.environ.get("COMPOSE_FILE", "/docker-compose.yml")
result = subprocess.run(
    ["docker", "compose", "-f", compose_file,
     "up", "--scale", f"{service}={count}", "--no-deps", "-d"],
    capture_output=True, text=True, timeout=60,
)
```

- Requires the Docker socket mount (docker-compose.yml change, Task 10)
- Validation: count between 0 and 10, service in an allow-list
- **Acceptance criterion**: `GET /api/worker/celery-workers` returns the worker list (empty when none are active)
- **Dependencies**: none

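The aggregation step and the scale validation are both pure functions and worth pinning down before wiring them into FastAPI. A sketch (function names and the allow-list are assumptions; the input shapes match what `inspect.active()` / `inspect.stats()` return):

```python
def aggregate_workers(active, stats):
    """Merge celery inspect().active() / .stats() into one row per worker.

    Keys look like 'celery@render-worker-1'; a worker that answered either
    broadcast is listed, and counts as 'online' when it has active tasks.
    """
    active = active or {}
    stats = stats or {}
    rows = []
    for name in sorted(set(active) | set(stats)):
        tasks = len(active.get(name, []))
        rows.append({
            "worker_name": name,
            "hostname": name.split("@", 1)[-1],
            "active_tasks": tasks,
            "status": "online" if tasks else "idle",
        })
    return rows

ALLOWED_SERVICES = {"render-worker", "worker"}

def validate_scale_request(service: str, count: int) -> None:
    """Reject arbitrary service names and absurd replica counts before shelling out."""
    if service not in ALLOWED_SERVICES:
        raise ValueError(f"service not scalable: {service}")
    if not 0 <= count <= 10:
        raise ValueError("count must be between 0 and 10")
```

Validating before building the `docker compose` argv is what keeps the subprocess call safe even though the values come from a request body.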
### Task 10: docker-compose.yml — Docker socket + compose-file mount

- **File**: `docker-compose.yml`
- **What**: in the `backend` service:

```yaml
volumes:
  - ./backend:/app
  - uploads:/app/uploads
  - /var/run/docker.sock:/var/run/docker.sock
  - ./docker-compose.yml:/docker-compose.yml
environment:
  - COMPOSE_FILE=/docker-compose.yml
```

Additionally install the Docker CLI in the backend Dockerfile:

```dockerfile
RUN apt-get update && apt-get install -y --no-install-recommends \
    ... docker.io \
    && rm -rf /var/lib/apt/lists/*
```

- **Acceptance criterion**: `docker compose exec backend docker compose version` works
- **Dependencies**: Task 9

### Task 11: Frontend — WorkerManagement.tsx

- **File**: `frontend/src/pages/WorkerManagement.tsx` (NEW)
- **What**: page with three sections:

**Section 1 — worker status** (useQuery `['celery-workers']`, refetchInterval 15s):

- Table: worker name, hostname, active tasks, status dot (green = online, grey = no tasks)
- Empty state: "No active workers"

**Section 2 — queue depth** (from the existing `GET /api/worker/activity`):

- Cards: `step_processing` + `thumbnail_rendering` queue depth
- Reuses the existing WorkerActivity data

**Section 3 — scale workers** (require admin):

- Two sliders/spinners: "step-worker (worker)" 1-8, "render-worker" 1-4
- "Scale" button → `POST /api/worker/scale`
- Warning: "Scaling down kills active renders"
- Toast on success/failure

- **Acceptance criterion**: the page loads, the worker list shows running workers, the scale button sends the request
- **Dependencies**: Task 9, Task 12

### Task 12: Frontend — worker.ts API client

- **File**: `frontend/src/api/worker.ts` (NEW, or extend)
- **What**:

```typescript
import { api } from './client' // assumed shared axios instance, as in the other api modules

export interface CeleryWorkerInfo {
  worker_name: string
  hostname: string
  active_tasks: number
  status: 'online' | 'idle'
}

export async function getCeleryWorkers(): Promise<CeleryWorkerInfo[]> {
  const res = await api.get('/worker/celery-workers')
  return res.data
}

export async function scaleWorker(service: string, count: number): Promise<void> {
  await api.post('/worker/scale', { service, count })
}
```

- **Acceptance criterion**: TypeScript compiles
- **Dependencies**: Task 9

### Task 13: Frontend — route + sidebar link for WorkerManagement

- **Files**: `frontend/src/App.tsx`, `frontend/src/components/layout/Layout.tsx`
- **What**:
  - App.tsx: route `/workers` → `<WorkerManagement />`
  - Layout.tsx: sidebar link "Workers" with the `Server` icon (admin only)
- **Acceptance criterion**: `/workers` is reachable, the link appears for admins
- **Dependencies**: Task 11

### Task 14: Dockerfile — install pytest

- **File**: `backend/Dockerfile`
- **What**: `pip install --no-cache-dir -e .` → `pip install --no-cache-dir -e ".[dev]"`
- **Acceptance criterion**: `docker compose exec backend pytest --version` prints a version number (after a rebuild)
- **Dependencies**: none

### Task 15: Backend tests — test_rendering_service.py

- **File**: `backend/tests/domains/test_rendering_service.py` (NEW)
- **What**: ≥5 tests:
  1. `test_dispatch_workflow_unknown_type_raises` — ValueError on an unknown type
  2. `test_dispatch_workflow_still_builds_chain` — `_build_still` returns a Celery chain (without apply_async)
  3. `test_dispatch_workflow_still_with_exports_builds_chain` — group inside the chain
  4. `test_publish_asset_creates_media_asset(db, admin_user)` — async, creates a MediaAsset
  5. `test_publish_asset_nonexistent_order_line_returns_none` — graceful None
  6. (Bonus) `test_legacy_dispatch_queues_task(monkeypatch)` — mock_celery, checks the task was queued
- **Acceptance criterion**: `pytest tests/domains/test_rendering_service.py` → all green
- **Dependencies**: Task 14

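Test 6's mocking reduces to recording `.delay()` calls instead of queueing to a broker. The pattern, sketched with illustrative names (the real fixture would monkeypatch the task object on the service module):

```python
class RecordingTask:
    """Stands in for a Celery task; records .delay() calls instead of queueing."""
    def __init__(self):
        self.calls = []

    def delay(self, *args, **kwargs):
        self.calls.append((args, kwargs))
        # Minimal stand-in for Celery's AsyncResult: only .id is needed here.
        return type("AsyncResult", (), {"id": "fake-task-id"})()

def legacy_dispatch(task, order_line_id):
    # Assumed shape of the legacy dispatch: queue the task, return its id.
    return task.delay(order_line_id).id
```

The test then asserts on `task.calls` to verify what was queued, with no broker or worker involved.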
### Task 16: Backend tests — test_orders_service.py

- **File**: `backend/tests/domains/test_orders_service.py` (NEW)
- **What**: ≥5 tests against `GET/POST /api/orders` and the orders service functions:
  1. `test_create_order_returns_201(client, auth_headers)` — POST /api/orders
  2. `test_list_orders_empty(client, auth_headers)` — returns an empty list
  3. `test_get_order_404_for_unknown_id(client, auth_headers)` — 404 for an unknown ID
  4. `test_order_submit_status_change(client, auth_headers)` — submit changes the status
  5. `test_order_requires_auth(client)` — 401 without a token
- **Acceptance criterion**: `pytest tests/domains/test_orders_service.py` → all green
- **Dependencies**: Task 14

### Task 17: Frontend tests — WorkerActivity.test.tsx + WorkerManagement.test.tsx

- **Files**: `frontend/src/__tests__/pages/WorkerActivity.test.tsx` (NEW), `WorkerManagement.test.tsx` (NEW)
- **What**:
  - WorkerActivity: test the render + the "No recent activity" empty state, mock the API response
  - WorkerManagement: test that the "Worker Management" header renders and the scale button is present
  - Use the MSW handlers from `mocks/`
- **Acceptance criterion**: `npm run test` → 0 failures (≥5 new tests in total)
- **Dependencies**: Task 11

---

## Migration check

| Migration | Phase | Status |
|-----------|-------|--------|
| 041 step_file_hash | F | exists |
| 042 invoices | G | exists |
| 043 import_validations | H | exists |
| 044 notification_configs | I | exists |
| 045 asset_libraries | K | exists (created in Task K1) |
| 047 media_assets.cad_file_id FK | N | **check**: `grep cad_file_id backend/app/domains/media/models.py` — create only if missing |

---

## Recommended order

```
Phases J + K (completed):
  Task 1 (bug fix, immediately)
  Tasks 2-6 in parallel (backend WebSocket)
  Tasks 7-8 in parallel (frontend hook + context)
  Tasks 9-11 (replace polling, after 8)
  Task 12 (commit)
  Tasks K1-K3 in parallel (data model + backend + Blender catalog)
  Tasks K4-K5 in parallel (Blender scripts)
  Tasks K6-K7 (workflow + UI, after K1-K5)
  Task K8 (commit)

Current phase:
  Parallel group 1 (no mutual dependencies):
    Task 1 (new Celery tasks)
    Task 4 (check migration 047, create if needed)
    Task 5 (ThreeDViewer props)
    Task 9 (worker endpoints, backend)
    Task 14 (Dockerfile pytest)

  After group 1:
    Task 2 (repair workflow_builder) — needs Task 1
    Task 3 (generate-gltf-geometry endpoint) — needs Tasks 1 + 4
    Task 10 (docker-compose mount) — needs Task 9
    Task 12 (worker.ts API) — needs Task 9

  After group 2:
    Task 6 (adapt CadPreview) — needs Tasks 3, 5
    Task 7 (media router cad_file_id param) — needs Task 4
    Task 8 (frontend API) — needs Task 7
    Task 11 (WorkerManagement.tsx) — needs Tasks 9, 12

  After group 3:
    Task 13 (route + sidebar) — needs Task 11
    Task 15 (test_rendering_service.py) — needs Task 14
    Task 16 (test_orders_service.py) — needs Task 14
    Task 17 (frontend tests) — needs Task 11
```

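The grouping above is just a topological order over the dependency graph listed in the task sections; it can be checked mechanically with the stdlib (`graphlib`, Python ≥ 3.9). The `deps` dict below transcribes the "Dependencies" lines of Tasks 1-17:

```python
from graphlib import TopologicalSorter

# task -> set of prerequisite tasks, as stated in the sections above
deps = {
    1: set(), 4: set(), 5: set(), 9: set(), 14: set(),
    2: {1}, 3: {1, 4}, 6: {3, 5}, 7: {4}, 8: {7},
    10: {9}, 12: {9}, 11: {9, 12}, 13: {11},
    15: {14}, 16: {14}, 17: {11},
}

order = list(TopologicalSorter(deps).static_order())
index = {t: i for i, t in enumerate(order)}
# Every task appears after all of its prerequisites.
assert all(index[d] < index[t] for t, ds in deps.items() for d in ds)
```

Running this after editing the plan is a cheap way to catch an accidental dependency cycle.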
---

## Risks / open questions

- **WebSocket auth via query param**: the token shows up in server logs. Acceptable for v2. For v3: generate a short-lived WS token (TTL 30 s) from the JWT.
- **Redis pub/sub scaling**: with many tenants/tabs the subscriber loop can become a bottleneck. OK for v2. For v3: Redis Streams.
- **Phase K -- MinIO bucket**: the `asset-libraries` bucket must be created at startup (lifespan in main.py).
- **Phase K -- link=True** means the .blend has to be downloaded from MinIO (into a tmpdir) before rendering. Already accounted for in K3.
- **Existing material_libraries**: the old `material_libraries` table/feature stays in place -- no breaking change. Asset libraries are additive.
- **media_assets.cad_file_id**: must be checked before implementing. If it already exists, migration 047 is unnecessary.
- **trimesh on the render worker**: `trimesh` is listed in `pyproject.toml` as an optional `cad` dependency (`trimesh>=4.2.0`). The worker container must have it installed; check the render-worker Dockerfile for `pip install trimesh`.
- **docker compose inside the backend container**: the scale feature requires `docker.io` + the compose plugin in the backend image. Image size grows by ~30 MB. Alternative: implement only the Celery worker view and show the scale CLI command as a hint instead.
- **render_order_line_still_task vs. the legacy render_order_line_task**: both do similar things. Long term, `step_tasks.render_order_line_task` should be replaced by the new task. For now the new task runs alongside it and the legacy one stays (backward compat).
- **Celery inspect timeout**: `celery_app.control.inspect()` can block when no worker is running. Pass `timeout=2` and fall back to an empty list.
Block a user