feat: GPU rendering + material matching + perf improvements

- GPU: fix Cycles device activation order — set compute_device_type BEFORE engine init, re-set AFTER open_mainfile wipes preferences
- GPU: remove _mark_sharp_and_seams edit-mode loop (redundant with Blender 5.0 shade_smooth_by_angle), saves ~200s/render on 175 parts
- Material: fix _AFN suffix mismatch — build AF-stripped mat_map keys and add prefix fallback in _apply_material_library (blender_render.py)
- Material: production GLB now uses get_material_library_path(), which checks the active AssetLibrary instead of the empty legacy system setting
- Admin: RenderTemplateTable multi-select output types (M2M frontend)
- Admin: MaterialLibraryPanel replaced with a link to Asset Libraries
- UX: move Toaster to top-left to avoid overlapping the dispatch button
- SQLAlchemy: add .unique() to all RenderTemplate M2M collection queries
- Logging: flush=True on all Blender progress prints, stdout reconfigure

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
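The logging bullet can be illustrated with a minimal sketch. This is not the project's actual helper (`progress` is a hypothetical name); it only shows the pattern: line-buffer stdout so a pipe does not hold progress lines back, and force each line out with flush=True.

```python
import sys

# Pipes are block-buffered by default, so a subprocess's progress lines can
# sit in the buffer for minutes. Line-buffer stdout explicitly (guarded,
# since some stdout replacements lack reconfigure()):
if hasattr(sys.stdout, "reconfigure"):
    sys.stdout.reconfigure(line_buffering=True)

def progress(msg: str) -> None:
    # flush=True forces the line out even when stdout is not line-buffered
    print(f"[blender_render] {msg}", flush=True)
```

The parent process can then stream these lines as they arrive instead of receiving them in one burst at exit.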
@@ -0,0 +1 @@
/home/hartmut/Documents/Copilot/schaefflerautomat/.claude
@@ -1,8 +1,81 @@
Run all quality gates and report the result:

1. `npm test` – all tests green?
2. `npm run lint` – no warnings?
3. `git diff --stat` – which files changed?

## Frontend Quality Gates

If all gates are green: commit with `git commit -m "chore: quality gate passed"`
If a gate is red: fix the problem first, then check again.

1. **TypeScript check** (the most important gate!):

```bash
docker compose exec frontend npx tsc --noEmit 2>&1
```

Checks for missing imports, type errors, undefined variables.
→ Errors here = blank page in the browser. Always check this first.

2. **Vite build** (optional, slower):

```bash
docker compose exec frontend npm run build 2>&1 | tail -20
```

3. **Tests**:

```bash
docker compose exec frontend npm test 2>&1 | tail -20
```

Note: `npm run lint` does not exist – the TypeScript check replaces it.

## Backend Quality Gates

4. **Python import check**:

```bash
docker compose exec backend python -c "from app.main import app; print('OK')" 2>&1
```

Checks whether all Python imports are resolvable.

5. **Backend startup logs**:

```bash
docker compose logs backend 2>&1 | tail -20
```

Check for `Application startup complete`, no exceptions.

## Data Integrity Gates

7. **No absolute storage_key paths in media_assets**:

```bash
docker compose exec backend python -c "
import asyncio
from sqlalchemy import text
from app.database import AsyncSessionLocal

async def main():
    async with AsyncSessionLocal() as db:
        r = await db.execute(text(\"SELECT COUNT(*) FROM media_assets WHERE storage_key LIKE '/%' AND is_archived=false\"))
        n = r.scalar()
        print(f'Absolute storage_keys: {n}')
        if n > 0:
            print('WARNING: Absolute paths break on volume moves / infrastructure changes!')
            print('Fix: UPDATE media_assets SET storage_key = replace(storage_key, ...) WHERE ...')

asyncio.run(main())
"
```

→ Expected: `Absolute storage_keys: 0`

8. **Check config attributes** (after changes to config.py):

```bash
docker compose exec backend python -c "from app.config import settings; print('upload_dir:', settings.upload_dir)"
```

## Overview

9. **Changed files**:

```bash
git diff --stat
```

## Result

If all gates are green: commit with a suitable Conventional Commit message.
If a gate is red: fix the problem first, then check again.

## Why these gates?

- `tsc --noEmit` catches missing React imports (`useEffect`, `useCallback`, etc.) that lead to a blank page at runtime – the most important gate.
- `npm run lint` does not exist in this project (no ESLint configured).
- `npm test` only checks test files, not production components, for import errors.
- The backend import check catches Python `ImportError`s before they surface in production.
- Absolute storage_key paths break on every volume move or infrastructure change (the Flamenco removal made 396 Blender renders inaccessible).
Binary file not shown.
@@ -0,0 +1,36 @@
"""M2M table for render templates ↔ output types.

Allows one render template to be linked to multiple output types.

Revision ID: 047
Revises: 046
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects.postgresql import UUID


revision = "047"
down_revision = "046"
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        "render_template_output_types",
        sa.Column("template_id", UUID(as_uuid=True), sa.ForeignKey("render_templates.id", ondelete="CASCADE"), nullable=False),
        sa.Column("output_type_id", UUID(as_uuid=True), sa.ForeignKey("output_types.id", ondelete="CASCADE"), nullable=False),
        sa.PrimaryKeyConstraint("template_id", "output_type_id"),
    )

    # Backfill from existing render_templates.output_type_id
    op.execute("""
        INSERT INTO render_template_output_types (template_id, output_type_id)
        SELECT id, output_type_id FROM render_templates
        WHERE output_type_id IS NOT NULL
        ON CONFLICT DO NOTHING
    """)


def downgrade() -> None:
    op.drop_table("render_template_output_types")
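The backfill above is idempotent: the composite primary key on (template_id, output_type_id) plus `ON CONFLICT DO NOTHING` means re-running the INSERT…SELECT cannot create duplicate links, and rows with a NULL output type are skipped. The same semantics in plain Python, as a dependency-free sketch (`backfill_links` is a hypothetical name):

```python
def backfill_links(existing, legacy_rows):
    # Mirrors INSERT ... SELECT ... ON CONFLICT DO NOTHING: NULL output types
    # are skipped, and pairs already present are not inserted again, so the
    # operation is safe to re-run.
    merged = set(existing)
    for template_id, output_type_id in legacy_rows:
        if output_type_id is not None:
            merged.add((template_id, output_type_id))
    return merged
```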
@@ -41,6 +41,11 @@ SETTINGS_DEFAULTS: dict[str, str] = {
    "smtp_user": "",
    "smtp_password": "",
    "smtp_from_address": "",
    # glTF tessellation quality (OCC BRepMesh)
    "gltf_preview_linear_deflection": "0.1",  # mm — geometry GLB for viewer
    "gltf_preview_angular_deflection": "0.5",  # rad
    "gltf_production_linear_deflection": "0.03",  # mm — production GLB
    "gltf_production_angular_deflection": "0.2",  # rad
    # 3D viewer / glTF export settings
    "gltf_scale_factor": "0.001",
    "gltf_smooth_normals": "true",
@@ -71,6 +76,10 @@ class SettingsOut(BaseModel):
    smtp_user: str = ""
    smtp_password: str = ""
    smtp_from_address: str = ""
    gltf_preview_linear_deflection: float = 0.1
    gltf_preview_angular_deflection: float = 0.5
    gltf_production_linear_deflection: float = 0.03
    gltf_production_angular_deflection: float = 0.2
    gltf_scale_factor: float = 0.001
    gltf_smooth_normals: bool = True
    viewer_max_distance: float = 50.0
@@ -99,6 +108,10 @@ class SettingsUpdate(BaseModel):
    smtp_user: str | None = None
    smtp_password: str | None = None
    smtp_from_address: str | None = None
    gltf_preview_linear_deflection: float | None = None
    gltf_preview_angular_deflection: float | None = None
    gltf_production_linear_deflection: float | None = None
    gltf_production_angular_deflection: float | None = None
    gltf_scale_factor: float | None = None
    gltf_smooth_normals: bool | None = None
    viewer_max_distance: float | None = None
@@ -213,6 +226,10 @@ def _settings_to_out(raw: dict[str, str]) -> SettingsOut:
        smtp_user=raw.get("smtp_user", ""),
        smtp_password=raw.get("smtp_password", ""),
        smtp_from_address=raw.get("smtp_from_address", ""),
        gltf_preview_linear_deflection=float(raw.get("gltf_preview_linear_deflection", "0.1")),
        gltf_preview_angular_deflection=float(raw.get("gltf_preview_angular_deflection", "0.5")),
        gltf_production_linear_deflection=float(raw.get("gltf_production_linear_deflection", "0.03")),
        gltf_production_angular_deflection=float(raw.get("gltf_production_angular_deflection", "0.2")),
        gltf_scale_factor=float(raw.get("gltf_scale_factor", "0.001")),
        gltf_smooth_normals=raw.get("gltf_smooth_normals", "true") == "true",
        viewer_max_distance=float(raw.get("viewer_max_distance", "50")),
@@ -328,6 +345,22 @@ async def update_settings(
        updates["gltf_pbr_roughness"] = str(body.gltf_pbr_roughness)
    if body.gltf_pbr_metallic is not None:
        updates["gltf_pbr_metallic"] = str(body.gltf_pbr_metallic)
    if body.gltf_preview_linear_deflection is not None:
        if not (0.001 <= body.gltf_preview_linear_deflection <= 10.0):
            raise HTTPException(400, detail="gltf_preview_linear_deflection must be 0.001–10.0 mm")
        updates["gltf_preview_linear_deflection"] = str(body.gltf_preview_linear_deflection)
    if body.gltf_preview_angular_deflection is not None:
        if not (0.05 <= body.gltf_preview_angular_deflection <= 1.5):
            raise HTTPException(400, detail="gltf_preview_angular_deflection must be 0.05–1.5 rad")
        updates["gltf_preview_angular_deflection"] = str(body.gltf_preview_angular_deflection)
    if body.gltf_production_linear_deflection is not None:
        if not (0.001 <= body.gltf_production_linear_deflection <= 10.0):
            raise HTTPException(400, detail="gltf_production_linear_deflection must be 0.001–10.0 mm")
        updates["gltf_production_linear_deflection"] = str(body.gltf_production_linear_deflection)
    if body.gltf_production_angular_deflection is not None:
        if not (0.05 <= body.gltf_production_angular_deflection <= 1.5):
            raise HTTPException(400, detail="gltf_production_angular_deflection must be 0.05–1.5 rad")
        updates["gltf_production_angular_deflection"] = str(body.gltf_production_angular_deflection)

    for k, v in updates.items():
        await _save_setting(db, k, v)
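The four bounds checks in this hunk repeat one pattern: reject a value outside an inclusive range. A sketch of how they could be factored into a helper, using ValueError in place of FastAPI's HTTPException so it stays dependency-free (`check_deflection_range` is a hypothetical name, not in the diff):

```python
def check_deflection_range(name: str, value: float, lo: float, hi: float, unit: str) -> None:
    # Mirrors the inline `if not (lo <= value <= hi): raise ...` checks above.
    if not (lo <= value <= hi):
        raise ValueError(f"{name} must be {lo}-{hi} {unit}")

# Example: the preview linear deflection bounds from the endpoint above
check_deflection_range("gltf_preview_linear_deflection", 0.1, 0.001, 10.0, "mm")
```

In the real endpoint each check would still raise `HTTPException(400, detail=...)`; the helper only removes the duplicated comparison logic.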
@@ -470,6 +503,40 @@ async def generate_missing_geometry_glbs(
    return {"queued": queued, "message": f"Queued {queued} missing geometry GLB task(s)"}


@router.post("/settings/recover-stuck-processing", status_code=status.HTTP_200_OK)
async def recover_stuck_processing(
    admin: User = Depends(require_admin),
    db: AsyncSession = Depends(get_db),
):
    """Reset CAD files stuck in 'processing' for more than 10 minutes to 'failed'.

    Call this when a CAD file shows 'processing' indefinitely. The auto-recovery
    beat task also runs every 5 minutes, so this is just for immediate relief.
    """
    from datetime import datetime, timedelta
    from sqlalchemy import update as sql_update, and_

    cutoff = datetime.utcnow() - timedelta(minutes=10)
    result = await db.execute(
        sql_update(CadFile)
        .where(
            and_(
                CadFile.processing_status == ProcessingStatus.processing,
                CadFile.updated_at < cutoff,
            )
        )
        .values(
            processing_status=ProcessingStatus.failed,
            error_message="Processing timed out — worker may have crashed. Use 'Regenerate Thumbnail' to retry.",
        )
        .returning(CadFile.id)
    )
    reset_ids = [str(r[0]) for r in result.fetchall()]
    await db.commit()
    return {"reset": len(reset_ids), "ids": reset_ids,
            "message": f"Reset {len(reset_ids)} stuck file(s) to 'failed'"}


@router.post("/settings/seed-workflows", status_code=status.HTTP_200_OK)
async def seed_workflows(
    admin: User = Depends(require_admin),
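The stuck-file criterion in the endpoint above boils down to two conditions: status is 'processing' and updated_at is older than a 10-minute cutoff. A hedged sketch of that predicate in isolation (`stuck_cutoff` and `is_stuck` are hypothetical helper names):

```python
from datetime import datetime, timedelta

def stuck_cutoff(now: datetime, minutes: int = 10) -> datetime:
    # Anything whose updated_at is older than this timestamp is considered
    # stuck: the worker that set 'processing' has had `minutes` to finish.
    return now - timedelta(minutes=minutes)

def is_stuck(status: str, updated_at: datetime, now: datetime) -> bool:
    # Matches the SQL WHERE clause: processing AND updated_at < cutoff.
    return status == "processing" and updated_at < stuck_cutoff(now)
```

The endpoint and the beat task share exactly this predicate; only the SQL wrapping differs.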
@@ -348,3 +348,32 @@ async def regenerate_thumbnail(
    }


@router.post("/{id}/reset-stuck", status_code=status.HTTP_200_OK)
async def reset_stuck_processing(
    id: uuid.UUID,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db),
):
    """Force-reset a CAD file that is stuck in 'processing' to 'failed'.

    Use when a file shows 'processing' indefinitely due to a worker crash.
    After resetting, click 'Regen thumbnail' to retry.
    """
    if user.role.value not in ("admin", "project_manager"):
        raise HTTPException(status_code=403, detail="Insufficient permissions")

    cad = await _get_cad_file(id, db)

    if cad.processing_status != ProcessingStatus.processing:
        raise HTTPException(
            status_code=400,
            detail=f"CAD file is not stuck — current status: {cad.processing_status.value}",
        )

    cad.processing_status = ProcessingStatus.failed
    cad.error_message = "Manually reset — worker may have crashed. Use 'Regen thumbnail' to retry."
    await db.commit()

    return {"cad_file_id": str(cad.id), "status": "failed", "message": "Reset to 'failed'. Use 'Regen thumbnail' to retry."}
@@ -35,8 +35,10 @@ class RenderTemplateOut(BaseModel):
    id: str
    name: str
    category_key: str | None
    output_type_id: str | None
    output_type_name: str | None
    output_type_id: str | None  # legacy single FK
    output_type_name: str | None  # legacy
    output_type_ids: list[str]  # M2M
    output_type_names: list[str]  # M2M display names
    blend_file_path: str
    original_filename: str
    target_collection: str
@@ -54,7 +56,7 @@ class RenderTemplateOut(BaseModel):
class RenderTemplateUpdate(BaseModel):
    name: str | None = None
    category_key: str | None = None
    output_type_id: str | None = None
    output_type_ids: list[str] | None = None  # replaces output_type_id
    target_collection: str | None = None
    material_replace_enabled: bool | None = None
    lighting_only: bool | None = None
@@ -74,12 +76,17 @@ def _to_out(t: RenderTemplate) -> dict:
    ot_name = None
    if t.output_type:
        ot_name = t.output_type.name
    # M2M output types
    ot_ids = [str(ot.id) for ot in t.output_types] if t.output_types else []
    ot_names = [ot.name for ot in t.output_types] if t.output_types else []
    return {
        "id": str(t.id),
        "name": t.name,
        "category_key": t.category_key,
        "output_type_id": str(t.output_type_id) if t.output_type_id else None,
        "output_type_name": ot_name,
        "output_type_ids": ot_ids,
        "output_type_names": ot_names,
        "blend_file_path": t.blend_file_path,
        "original_filename": t.original_filename,
        "target_collection": t.target_collection,
@@ -103,7 +110,7 @@ async def list_render_templates(
    result = await db.execute(
        select(RenderTemplate).order_by(RenderTemplate.created_at.desc())
    )
    return [_to_out(t) for t in result.scalars().all()]
    return [_to_out(t) for t in result.unique().scalars().all()]


@router.post("/render-templates", response_model=RenderTemplateOut, status_code=status.HTTP_201_CREATED)
@@ -151,6 +158,17 @@ async def create_render_template(
        camera_orbit=camera_orbit,
    )
    db.add(tmpl)
    await db.flush()

    # Sync M2M from initial output_type_id
    if ot_uuid:
        from app.domains.rendering.models import render_template_output_types
        await db.execute(
            render_template_output_types.insert().values(
                template_id=template_id, output_type_id=ot_uuid,
            )
        )

    await db.commit()
    await db.refresh(tmpl)
@@ -170,7 +188,7 @@ async def update_render_template(
    db: AsyncSession = Depends(get_db),
):
    result = await db.execute(select(RenderTemplate).where(RenderTemplate.id == template_id))
    tmpl = result.scalar_one_or_none()
    tmpl = result.unique().scalar_one_or_none()
    if not tmpl:
        raise HTTPException(404, detail="Render template not found")
@@ -179,12 +197,9 @@ async def update_render_template(
    # Normalise empty strings to None for nullable fields
    if "category_key" in updates and updates["category_key"] in ("", "null"):
        updates["category_key"] = None
    if "output_type_id" in updates:
        val = updates["output_type_id"]
        if val in ("", "null", None):
            updates["output_type_id"] = None
        else:
            updates["output_type_id"] = uuid.UUID(val)

    # Handle M2M output_type_ids
    new_ot_ids: list[str] | None = updates.pop("output_type_ids", None)

    if updates:
        updates["updated_at"] = datetime.utcnow()
@@ -193,9 +208,34 @@ async def update_render_template(
            .where(RenderTemplate.id == template_id)
            .values(**updates)
        )
    await db.commit()
    await db.refresh(tmpl)

    # Sync M2M relationship
    if new_ot_ids is not None:
        from app.domains.rendering.models import render_template_output_types
        # Delete existing links
        await db.execute(
            sql_delete(render_template_output_types).where(
                render_template_output_types.c.template_id == template_id
            )
        )
        # Insert new links
        for ot_id in new_ot_ids:
            await db.execute(
                render_template_output_types.insert().values(
                    template_id=template_id,
                    output_type_id=uuid.UUID(ot_id),
                )
            )
        # Also update legacy FK to first OT (for backward compat)
        legacy_ot = uuid.UUID(new_ot_ids[0]) if new_ot_ids else None
        await db.execute(
            sql_update(RenderTemplate)
            .where(RenderTemplate.id == template_id)
            .values(output_type_id=legacy_ot, updated_at=datetime.utcnow())
        )

    await db.commit()
    await db.refresh(tmpl)
    return _to_out(tmpl)


@@ -206,7 +246,7 @@ async def delete_render_template(
    db: AsyncSession = Depends(get_db),
):
    result = await db.execute(select(RenderTemplate).where(RenderTemplate.id == template_id))
    tmpl = result.scalar_one_or_none()
    tmpl = result.unique().scalar_one_or_none()
    if not tmpl:
        raise HTTPException(404, detail="Render template not found")
@@ -231,7 +271,7 @@ async def upload_blend_file(
        raise HTTPException(400, detail="File must be a .blend file")

    result = await db.execute(select(RenderTemplate).where(RenderTemplate.id == template_id))
    tmpl = result.scalar_one_or_none()
    tmpl = result.unique().scalar_one_or_none()
    if not tmpl:
        raise HTTPException(404, detail="Render template not found")
@@ -266,7 +306,7 @@ async def download_blend_file(
    db: AsyncSession = Depends(get_db),
):
    result = await db.execute(select(RenderTemplate).where(RenderTemplate.id == template_id))
    tmpl = result.scalar_one_or_none()
    tmpl = result.unique().scalar_one_or_none()
    if not tmpl:
        raise HTTPException(404, detail="Render template not found")
@@ -1,10 +1,19 @@
import uuid
from datetime import datetime
from sqlalchemy import String, DateTime, Boolean, Text, Integer, Float, ForeignKey
from sqlalchemy import String, DateTime, Boolean, Text, Integer, Float, ForeignKey, Table, Column
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.dialects.postgresql import UUID, JSONB
from app.database import Base


# M2M: render templates ↔ output types
render_template_output_types = Table(
    "render_template_output_types",
    Base.metadata,
    Column("template_id", UUID(as_uuid=True), ForeignKey("render_templates.id", ondelete="CASCADE"), primary_key=True),
    Column("output_type_id", UUID(as_uuid=True), ForeignKey("output_types.id", ondelete="CASCADE"), primary_key=True),
)

VALID_RENDER_BACKENDS = {"celery"}

@@ -66,7 +75,10 @@ class RenderTemplate(Base):
    created_at: Mapped[datetime] = mapped_column(DateTime, nullable=False, server_default="now()")
    updated_at: Mapped[datetime] = mapped_column(DateTime, nullable=False, server_default="now()", onupdate=datetime.utcnow)

    # Legacy single FK (kept for backward compat, prefer output_types M2M)
    output_type = relationship("OutputType", lazy="joined")
    # M2M: multiple output types per template
    output_types = relationship("OutputType", secondary=render_template_output_types, lazy="joined")


class ProductRenderPosition(Base):
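Both relationships use lazy="joined", so a template linked to N output types comes back from a SELECT as N joined rows; that row explosion is why every query in this commit gained .unique(). The deduplication it performs is by identity, first occurrence wins. A plain-Python sketch of the effect (this is an illustration, not SQLAlchemy's implementation):

```python
def unique_rows(rows: list[dict]) -> list[dict]:
    # Collapse the row explosion from a joined collection load: keep the
    # first row per primary key, preserving order. Without this step, a
    # template with two output types would appear twice in the result.
    seen: set = set()
    out: list[dict] = []
    for row in rows:
        if row["id"] not in seen:
            seen.add(row["id"])
            out.append(row)
    return out
```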
@@ -27,7 +27,7 @@ def _glb_from_step(step_path: Path, glb_path: Path, quality: str = "low") -> Non
    import sys as _sys

    linear_deflection = 0.3 if quality == "low" else 0.05
    angular_deflection = 0.3 if quality == "low" else 0.1
    angular_deflection = 0.5 if quality == "low" else 0.2

    scripts_dir = Path(os.environ.get("RENDER_SCRIPTS_DIR", "/render-scripts"))
    script_path = scripts_dir / "export_step_to_gltf.py"
@@ -95,6 +95,7 @@ def render_still(
    denoising_quality: str = "",
    denoising_use_gpu: str = "",
    mesh_attributes: dict | None = None,
    log_callback: "Callable[[str], None] | None" = None,
) -> dict:
    """Convert STEP → GLB (OCC) → PNG (Blender subprocess).
@@ -170,49 +171,75 @@ def render_still(
            cmd += ["--mesh-attributes", json.dumps(mesh_attributes)]
        return cmd

    def _run(eng: str) -> subprocess.CompletedProcess:
    def _run(eng: str) -> tuple[int, list[str], list[str]]:
        """Run Blender subprocess, streaming stdout line-by-line.

        Returns (returncode, stdout_lines, stderr_lines).
        """
        import selectors
        proc = subprocess.Popen(
            _build_cmd(eng),
            stdout=subprocess.PIPE, stderr=subprocess.PIPE,
            text=True, env=env, start_new_session=True,
        )
        stdout_lines: list[str] = []
        stderr_lines: list[str] = []
        deadline = time.monotonic() + 600

        sel = selectors.DefaultSelector()
        sel.register(proc.stdout, selectors.EVENT_READ, "stdout")
        sel.register(proc.stderr, selectors.EVENT_READ, "stderr")

        try:
            stdout, stderr = proc.communicate(timeout=600)
        except subprocess.TimeoutExpired:
            try:
                os.killpg(os.getpgid(proc.pid), signal.SIGTERM)
            except (ProcessLookupError, OSError):
                pass
            stdout, stderr = proc.communicate()
        return subprocess.CompletedProcess(_build_cmd(eng), proc.returncode, stdout, stderr)
            while sel.get_map():
                remaining = deadline - time.monotonic()
                if remaining <= 0:
                    try:
                        os.killpg(os.getpgid(proc.pid), signal.SIGTERM)
                    except (ProcessLookupError, OSError):
                        pass
                    break
                events = sel.select(timeout=min(remaining, 2.0))
                for key, _ in events:
                    line = key.fileobj.readline()
                    if not line:
                        sel.unregister(key.fileobj)
                        continue
                    line = line.rstrip("\n")
                    if key.data == "stdout":
                        stdout_lines.append(line)
                        logger.info("[blender] %s", line)
                        if log_callback and "[blender_render]" in line:
                            log_callback(line)
                    else:
                        stderr_lines.append(line)
                        logger.warning("[blender stderr] %s", line)
        finally:
            sel.close()

        proc.wait(timeout=10)
        return proc.returncode, stdout_lines, stderr_lines

    t_render = time.monotonic()
    result = _run(engine)
    returncode, stdout_lines, stderr_lines = _run(engine)
    engine_used = engine

    log_lines = []
    for line in (result.stdout or "").splitlines():
        logger.info("[blender] %s", line)
        if "[blender_render]" in line:
            log_lines.append(line)
    for line in (result.stderr or "").splitlines():
        logger.warning("[blender stderr] %s", line)
    log_lines = [l for l in stdout_lines if "[blender_render]" in l]

    # EEVEE fallback to Cycles on non-signal error
    if result.returncode > 0 and engine == "eevee":
        logger.warning("EEVEE failed (exit %d) — retrying with Cycles", result.returncode)
        result = _run("cycles")
    if returncode > 0 and engine == "eevee":
        logger.warning("EEVEE failed (exit %d) — retrying with Cycles", returncode)
        returncode, stdout_lines2, stderr_lines2 = _run("cycles")
        engine_used = "cycles (eevee fallback)"
        for line in (result.stdout or "").splitlines():
            logger.info("[blender-fallback] %s", line)
            if "[blender_render]" in line:
                log_lines.append(line)
        log_lines.extend(l for l in stdout_lines2 if "[blender_render]" in l)

    if result.returncode != 0:
    if returncode != 0:
        stdout_tail = "\n".join(stdout_lines[-50:]) if stdout_lines else ""
        stderr_tail = "\n".join(stderr_lines[-20:]) if stderr_lines else ""
        raise RuntimeError(
            f"Blender exited with code {result.returncode}.\n"
            f"stdout: {(result.stdout or '')[-2000:]}\n"
            f"stderr: {(result.stderr or '')[-500:]}"
            f"Blender exited with code {returncode}.\n"
            f"stdout: {stdout_tail[-2000:]}\n"
            f"stderr: {stderr_tail[-500:]}"
        )

    render_duration_s = round(time.monotonic() - t_render, 2)
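The streaming `_run()` above can be reduced to a standalone sketch: one selector over the child's stdout, an overall deadline, and a process-group kill on timeout. This is Unix-only (os.killpg requires start_new_session) and omits the stderr channel for brevity; `stream_lines` is a hypothetical name:

```python
import os
import selectors
import signal
import subprocess
import time

def stream_lines(cmd: list[str], timeout: float = 600.0) -> tuple[int, list[str]]:
    """Run cmd, collecting stdout line-by-line under an overall deadline."""
    proc = subprocess.Popen(
        cmd, stdout=subprocess.PIPE, text=True, start_new_session=True,
    )
    lines: list[str] = []
    deadline = time.monotonic() + timeout
    sel = selectors.DefaultSelector()
    sel.register(proc.stdout, selectors.EVENT_READ)
    try:
        while sel.get_map():
            remaining = deadline - time.monotonic()
            if remaining <= 0:
                # Deadline hit: terminate the whole process group.
                try:
                    os.killpg(os.getpgid(proc.pid), signal.SIGTERM)
                except (ProcessLookupError, OSError):
                    pass
                break
            for key, _ in sel.select(timeout=min(remaining, 2.0)):
                line = key.fileobj.readline()
                if not line:
                    # EOF: child closed its end of the pipe.
                    sel.unregister(key.fileobj)
                else:
                    lines.append(line.rstrip("\n"))
    finally:
        sel.close()
    proc.wait(timeout=10)
    return proc.returncode, lines
```

The payoff over `communicate()` is that each line is available the moment the child flushes it, so progress can be logged or forwarded live instead of after the render finishes.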
@@ -715,6 +715,7 @@ def render_to_file(
    denoising_prefilter: str = "",
    denoising_quality: str = "",
    denoising_use_gpu: str = "",
    order_line_id: str | None = None,
) -> tuple[bool, dict]:
    """Render a STEP file to a specific output path using current system settings.
@@ -734,6 +735,7 @@ def render_to_file(
        target_collection: Blender collection name to import geometry into.
        material_library_path: Optional path to material library .blend file.
        material_map: Optional {part_name: material_name} for material replacement.
        order_line_id: Optional order line ID for live log streaming.

    Returns:
        (success: bool, render_log: dict)
@@ -819,6 +821,11 @@ def render_to_file(
    if denoising_use_gpu:
        extra["denoising_use_gpu"] = denoising_use_gpu
    from app.services.render_blender import is_blender_available, render_still
    # Build live-log callback for streaming Blender output to Redis
    _log_cb = None
    if order_line_id:
        from app.services import render_log as _rl
        _log_cb = lambda line: _rl.emit(order_line_id, line)
    if is_blender_available():
        try:
            service_data = render_still(
@@ -845,6 +852,7 @@ def render_to_file(
                denoising_prefilter=denoising_prefilter,
                denoising_quality=denoising_quality,
                denoising_use_gpu=denoising_use_gpu,
                log_callback=_log_cb,
            )
            rendered_png = tmp_png if tmp_png.exists() else None
        except Exception as exc:
@@ -4,19 +4,20 @@ Used from Celery tasks (sync context) to find the best matching .blend template
for a given category + output type combination.

Cascade priority (first active match wins):
1. Exact: category_key + output_type_id
2. Category only: category_key + output_type_id IS NULL
3. OT only: category_key IS NULL + output_type_id
4. Global: both NULL
1. Exact: category_key + output_type linked via M2M
2. Category only: category_key + no output_types linked
3. OT only: category_key IS NULL + output_type linked via M2M
4. Global: category_key IS NULL + no output_types linked
5. No template → caller falls back to factory-settings behavior
"""
import logging

from sqlalchemy import create_engine, select, and_
from sqlalchemy import create_engine, select, and_, exists
from sqlalchemy.orm import Session

from app.models.render_template import RenderTemplate
from app.models.system_setting import SystemSetting
from app.domains.rendering.models import render_template_output_types

logger = logging.getLogger(__name__)

@@ -37,63 +38,92 @@ def resolve_template(
) -> RenderTemplate | None:
    """Find the best matching active render template.

    Uses the M2M render_template_output_types table for output type matching.
    Uses sync SQLAlchemy — safe for Celery tasks.
    """
    engine = _get_engine()
    with Session(engine) as session:
        active = RenderTemplate.is_active == True  # noqa: E712

        # 1. Exact match
        # Helper: subquery checking if a template is linked to a specific OT
        def _has_ot(ot_id):
            return exists(
                select(render_template_output_types.c.template_id).where(and_(
                    render_template_output_types.c.template_id == RenderTemplate.id,
                    render_template_output_types.c.output_type_id == ot_id,
                ))
            )

        # Helper: subquery checking if a template has NO linked OTs
        _no_ots = ~exists(
            select(render_template_output_types.c.template_id).where(
                render_template_output_types.c.template_id == RenderTemplate.id,
            )
        )

        # 1. Exact match: category_key + output_type in M2M
        if category_key and output_type_id:
            row = session.execute(
                select(RenderTemplate).where(and_(
                    active,
                    RenderTemplate.category_key == category_key,
                    RenderTemplate.output_type_id == output_type_id,
                    _has_ot(output_type_id),
                ))
            ).scalar_one_or_none()
            ).unique().scalar_one_or_none()
            if row:
                return row

        # 2. Category only
        # 2. Category only: category_key + no OTs linked
        if category_key:
            row = session.execute(
                select(RenderTemplate).where(and_(
                    active,
                    RenderTemplate.category_key == category_key,
                    RenderTemplate.output_type_id.is_(None),
                    _no_ots,
                ))
            ).scalar_one_or_none()
            ).unique().scalar_one_or_none()
            if row:
                return row

        # 3. OT only
        # 3. OT only: no category_key + output_type in M2M
        if output_type_id:
            row = session.execute(
                select(RenderTemplate).where(and_(
                    active,
                    RenderTemplate.category_key.is_(None),
                    RenderTemplate.output_type_id == output_type_id,
                    _has_ot(output_type_id),
                ))
            ).scalar_one_or_none()
            ).unique().scalar_one_or_none()
            if row:
                return row

        # 4. Global fallback (both NULL)
        # 4. Global fallback: no category_key + no OTs linked
        row = session.execute(
            select(RenderTemplate).where(and_(
                active,
                RenderTemplate.category_key.is_(None),
                RenderTemplate.output_type_id.is_(None),
                _no_ots,
            ))
        ).scalar_one_or_none()
        return row
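The four queries in resolve_template implement a fixed preference order. Stripped of SQL, the cascade looks like this. A hedged sketch over plain dicts, not the ORM code; the field names mirror the model but are assumptions of this illustration:

```python
def resolve(templates: list[dict], category_key, output_type_id):
    """Cascade: exact → category-only → OT-only → global → None."""
    def first(pred):
        # First active template satisfying the predicate, or None.
        return next((t for t in templates if t["active"] and pred(t)), None)

    if category_key and output_type_id:
        hit = first(lambda t: t["category_key"] == category_key
                    and output_type_id in t["output_type_ids"])
        if hit:
            return hit
    if category_key:
        hit = first(lambda t: t["category_key"] == category_key
                    and not t["output_type_ids"])
        if hit:
            return hit
    if output_type_id:
        hit = first(lambda t: t["category_key"] is None
                    and output_type_id in t["output_type_ids"])
        if hit:
            return hit
    # Global fallback; returning None sends the caller to factory settings.
    return first(lambda t: t["category_key"] is None and not t["output_type_ids"])
```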
def get_material_library_path() -> str | None:
    """Read material_library_path from system_settings. Returns None if empty."""
    """Return the blend_file_path of the first active AssetLibrary.

    Falls back to the legacy material_library_path system setting.
    """
    engine = _get_engine()
    with Session(engine) as session:
        # Prefer active AssetLibrary
        from app.domains.materials.models import AssetLibrary
        row = session.execute(
            select(AssetLibrary).where(AssetLibrary.is_active == True).limit(1)  # noqa: E712
        ).scalar_one_or_none()
        if row and row.blend_file_path:
            return row.blend_file_path

        # Fallback to legacy system setting
        row = session.execute(
            select(SystemSetting).where(SystemSetting.key == "material_library_path")
        ).scalar_one_or_none()
@@ -3,6 +3,7 @@ from __future__ import annotations

import json
import logging
from datetime import datetime, timedelta

from celery import shared_task

@@ -31,3 +32,51 @@ def broadcast_queue_status() -> None:
        logger.debug("Broadcast queue_update: %s", depths)
    except Exception as exc:
        logger.warning("broadcast_queue_status failed: %s", exc)


@shared_task(name="app.tasks.beat_tasks.recover_stuck_cad_files", queue="step_processing")
def recover_stuck_cad_files() -> None:
    """Reset CAD files stuck in 'processing' for more than 10 minutes to 'failed'.

    This recovers from worker crashes (container restarts, OOM kills) that leave
    the processing_status committed as 'processing' with no task running to complete it.
    Runs every 5 minutes via Celery Beat.
    """
    try:
        from sqlalchemy import create_engine, update, and_
        from sqlalchemy.orm import Session
        from app.config import settings
        from app.models.cad_file import CadFile, ProcessingStatus

        cutoff = datetime.utcnow() - timedelta(minutes=10)
        sync_url = settings.database_url.replace("+asyncpg", "")
        engine = create_engine(sync_url)
        with Session(engine) as session:
            result = session.execute(
                update(CadFile)
                .where(
                    and_(
                        CadFile.processing_status == ProcessingStatus.processing,
                        CadFile.updated_at < cutoff,
                    )
                )
                .values(
                    processing_status=ProcessingStatus.failed,
                    error_message="Processing timed out — worker may have crashed. Use 'Regenerate Thumbnail' to retry.",
                )
                .returning(CadFile.id, CadFile.original_name)
            )
            rows = result.fetchall()
            session.commit()
        engine.dispose()

        if rows:
            names = [r[1] for r in rows]
            logger.warning(
                "recover_stuck_cad_files: reset %d stuck file(s) to failed: %s",
                len(rows), names,
            )
        else:
            logger.debug("recover_stuck_cad_files: no stuck files found")
    except Exception as exc:
        logger.error("recover_stuck_cad_files failed: %s", exc)
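The `sync_url` line above strips the async driver marker so a synchronous Celery task can build its own engine from the app's asyncpg URL. A minimal standalone sketch of that conversion (the example URL is made up for illustration):

```python
def to_sync_url(async_url: str) -> str:
    # Celery tasks run outside the async event loop, so they need
    # a plain psycopg-style URL instead of the "+asyncpg" variant.
    return async_url.replace("+asyncpg", "")

print(to_sync_url("postgresql+asyncpg://app:secret@db:5432/renders"))
# → postgresql://app:secret@db:5432/renders
```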

@@ -34,5 +34,9 @@ celery_app.conf.update(
        "task": "app.tasks.beat_tasks.broadcast_queue_status",
        "schedule": 10.0,  # every 10 seconds
    },
    "recover-stuck-cad-files-every-5m": {
        "task": "app.tasks.beat_tasks.recover_stuck_cad_files",
        "schedule": 300.0,  # every 5 minutes
    },
},
)

+118
-44
@@ -363,6 +363,8 @@ def generate_gltf_geometry_task(self, cad_file_id: str):
    from app.config import settings as app_settings
    from app.models.cad_file import CadFile

    from app.models.system_setting import SystemSetting as _SysSetting

    sync_url = app_settings.database_url.replace("+asyncpg", "")
    eng = create_engine(sync_url)
    with Session(eng) as session:
@@ -386,8 +388,14 @@ def generate_gltf_geometry_task(self, cad_file_id: str):
                hex_color = entry.get("hex_color") or entry.get("color", "")
                if part_name and hex_color:
                    color_map[part_name] = hex_color

        settings_rows = session.execute(_select(_SysSetting)).scalars().all()
        sys_settings = {s.key: s.value for s in settings_rows}
    eng.dispose()

    linear_deflection = float(sys_settings.get("gltf_preview_linear_deflection", "0.1"))
    angular_deflection = float(sys_settings.get("gltf_preview_angular_deflection", "0.5"))

    step = _Path(step_path_str)
    if not step.exists():
        log_task_event(self.request.id, f"Failed: STEP file not found: {step}", "error")
@@ -411,7 +419,14 @@ def generate_gltf_geometry_task(self, cad_file_id: str):
        "--step_path", str(step),
        "--output_path", str(output_path),
        "--color_map", _json.dumps(color_map),
        "--linear_deflection", str(linear_deflection),
        "--angular_deflection", str(angular_deflection),
    ]
    log_task_event(
        self.request.id,
        f"OCC tessellation: linear={linear_deflection}mm, angular={angular_deflection}rad",
        "info",
    )

    try:
        result = _subprocess.run(cmd, capture_output=True, text=True, timeout=120)
@@ -485,6 +500,7 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
    import json as _json
    import os as _os
    import subprocess as _subprocess
    import sys as _sys
    import uuid as _uuid
    from pathlib import Path as _Path

@@ -500,53 +516,97 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
    _sync_url = app_settings.database_url.replace("+asyncpg", "")
    _eng = _ce(_sync_url)

    # --- 1. Resolve geometry GLB path from existing gltf_geometry MediaAsset ---
    with _Session(_eng) as _sess:
        _row = _sess.execute(
            _sel(MediaAsset).where(
                MediaAsset.cad_file_id == _uuid.UUID(cad_file_id),
                MediaAsset.asset_type == MediaAssetType.gltf_geometry,
            )
        ).scalar_one_or_none()
        geom_glb_key = _row.storage_key if _row else None

    if not geom_glb_key:
        # Trigger geometry generation first and retry this task
        log_task_event(self.request.id, "No gltf_geometry asset found — queuing geometry task first", "info")
        generate_gltf_geometry_task.delay(cad_file_id, product_id)
        raise self.retry(exc=RuntimeError("gltf_geometry not yet available"), countdown=30, max_retries=2)

    geom_glb_path = _Path(app_settings.upload_dir) / geom_glb_key
    if not geom_glb_path.exists():
        raise RuntimeError(f"Geometry GLB not found on disk: {geom_glb_path}")

    # --- 2. Resolve material map (SCHAEFFLER library names) ---
    from app.services.material_service import resolve_material_map

    with _Session(_eng) as _sess:
        from app.models.cad_file import CadFile as _CF
        _cad = _sess.execute(_sel(_CF).where(_CF.id == _uuid.UUID(cad_file_id))).scalar_one_or_none()
        raw_mat_map: dict = {}
        if _cad and _cad.cad_part_materials:
            raw_mat_map = _cad.cad_part_materials

    mat_map = resolve_material_map(raw_mat_map)

    # --- 3. Resolve asset library .blend path from system settings ---
    # --- 1. Resolve STEP file path and system settings ---
    from app.models.cad_file import CadFile as _CF
    from app.models.system_setting import SystemSetting
    with _Session(_eng) as _sess:
        _setting = _sess.execute(
            _sel(SystemSetting).where(SystemSetting.key == "asset_library_blend")
        ).scalar_one_or_none()
        asset_library_blend = _setting.value if _setting and _setting.value else ""
    _eng.dispose()

    # Output path next to geometry GLB
    output_path = geom_glb_path.parent / (geom_glb_path.stem.replace("_geometry", "") + "_production.glb")
    with _Session(_eng) as _sess:
        _cad = _sess.execute(
            _sel(_CF).where(_CF.id == _uuid.UUID(cad_file_id))
        ).scalar_one_or_none()
        step_path_str = _cad.stored_path if _cad else None

        settings_rows = _sess.execute(_sel(SystemSetting)).scalars().all()
        sys_settings = {s.key: s.value for s in settings_rows}

    if not step_path_str:
        raise RuntimeError(f"CadFile {cad_file_id} not found in DB")
    step_path = _Path(step_path_str)
    if not step_path.exists():
        raise RuntimeError(f"STEP file not found: {step_path}")

    smooth_angle = float(sys_settings.get("blender_smooth_angle", "30"))
    prod_linear = float(sys_settings.get("gltf_production_linear_deflection", "0.03"))
    prod_angular = float(sys_settings.get("gltf_production_angular_deflection", "0.2"))

    scripts_dir = _Path(_os.environ.get("RENDER_SCRIPTS_DIR", "/render-scripts"))
    export_script = scripts_dir / "export_gltf.py"
    occ_script = scripts_dir / "export_step_to_gltf.py"
    if not occ_script.exists():
        raise RuntimeError(f"export_step_to_gltf.py not found at {occ_script}")

    prod_geom_glb = step_path.parent / f"{step_path.stem}_production_geom.glb"
    python_bin = _sys.executable
    occ_cmd = [
        python_bin, str(occ_script),
        "--step_path", str(step_path),
        "--output_path", str(prod_geom_glb),
        "--linear_deflection", str(prod_linear),
        "--angular_deflection", str(prod_angular),
    ]
    log_task_event(
        self.request.id,
        f"Re-exporting STEP at production quality (linear={prod_linear}mm, angular={prod_angular}rad)",
        "info",
    )
    try:
        occ_result = _subprocess.run(occ_cmd, capture_output=True, text=True, timeout=180)
        for line in occ_result.stdout.splitlines():
            logger.info("[occ-prod] %s", line)
        if occ_result.returncode != 0 or not prod_geom_glb.exists() or prod_geom_glb.stat().st_size == 0:
            raise RuntimeError(
                f"OCC export failed (exit {occ_result.returncode}): {occ_result.stderr[-500:]}"
            )
    except Exception as exc:
        log_task_event(self.request.id, f"OCC re-export failed: {exc}", "error")
        raise self.retry(exc=exc, countdown=30)

    geom_glb_path = prod_geom_glb

    # --- 2. Resolve material map from Product.cad_part_materials (SCHAEFFLER library names) ---
    # cad_part_materials lives on Product (list[dict]), NOT on CadFile.
    # We look up the Product that owns this CadFile (prefer product_id arg if given).
    from app.services.material_service import resolve_material_map
    from app.domains.products.models import Product as _Product

    with _Session(_eng) as _sess:
        _prod_query = _sel(_Product).where(_Product.cad_file_id == _uuid.UUID(cad_file_id))
        if product_id:
            _prod_query = _prod_query.where(_Product.id == _uuid.UUID(product_id))
        _product = _sess.execute(_prod_query).scalars().first()
        raw_materials: list[dict] = _product.cad_part_materials if _product else []

    # Convert list[{"part_name": X, "material": Y}] → dict[str, str] for resolve_material_map
    raw_mat_map: dict[str, str] = {
        m["part_name"]: m["material"]
        for m in raw_materials
        if m.get("part_name") and m.get("material")
    }
    mat_map = resolve_material_map(raw_mat_map)
    logger.info(
        "generate_gltf_production_task: resolved %d material(s) for cad %s (product: %s)",
        len(mat_map), cad_file_id, _product.id if _product else "none",
    )
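The list-to-dict conversion for `resolve_material_map` can be exercised standalone; entries missing either key are silently dropped by the comprehension's guard (the sample data below is purely illustrative):

```python
raw_materials = [
    {"part_name": "Housing", "material": "SCHAEFFLER_Steel"},
    {"part_name": "", "material": "SCHAEFFLER_Brass"},  # dropped: empty part name
    {"part_name": "Shaft"},                             # dropped: no material key
]

# Same shape as the task's comprehension: keep only complete entries
raw_mat_map = {
    m["part_name"]: m["material"]
    for m in raw_materials
    if m.get("part_name") and m.get("material")
}
print(raw_mat_map)
# → {'Housing': 'SCHAEFFLER_Steel'}
```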

    # --- 3. Run Blender: apply materials + smooth shading + export production GLB ---
    # Use get_material_library_path() which checks active AssetLibrary first,
    # then falls back to the legacy material_library_path system setting.
    from app.services.template_service import get_material_library_path
    asset_library_blend = get_material_library_path() or ""
    _eng.dispose()

    output_path = step_path.parent / f"{step_path.stem}_production.glb"

    export_script = scripts_dir / "export_gltf.py"
    if not is_blender_available():
        raise RuntimeError("Blender is not available — cannot generate production GLB")
    if not export_script.exists():
@@ -560,13 +620,20 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
        "--glb_path", str(geom_glb_path),
        "--output_path", str(output_path),
        "--material_map", _json.dumps(mat_map),
        "--smooth_angle", str(smooth_angle),
    ]
    if asset_library_blend:
        cmd += ["--asset_library_blend", asset_library_blend]

    log_task_event(self.request.id, f"Running Blender export_gltf.py for {geom_glb_path.name}", "info")
    log_task_event(
        self.request.id,
        f"Running Blender export_gltf.py — {len(mat_map)} material(s), smooth={smooth_angle}°",
        "info",
    )
    try:
        result = _subprocess.run(cmd, capture_output=True, text=True, timeout=300)
        for line in result.stdout.splitlines():
            logger.info("[export-gltf] %s", line)
        if result.returncode != 0:
            raise RuntimeError(
                f"export_gltf.py exited {result.returncode}:\n{result.stderr[-500:]}"
@@ -575,6 +642,12 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
        log_task_event(self.request.id, f"Blender production GLB failed: {exc}", "error")
        logger.error("generate_gltf_production_task Blender failed for cad %s: %s", cad_file_id, exc)
        raise self.retry(exc=exc, countdown=30)
    finally:
        # Clean up the high-quality temp geometry GLB (not needed after Blender export)
        try:
            prod_geom_glb.unlink(missing_ok=True)
        except Exception:
            pass

    log_task_event(self.request.id, f"Production GLB exported: {output_path.name}", "done")

@@ -888,7 +961,7 @@ def render_order_line_task(self, order_line_id: str):
            logger.error("Turntable render failed for %s: %s", order_line_id, exc)
    else:
        # ── Still image path ────────────────────────────────────────
        emit(order_line_id, f"Calling renderer (STEP → STL → still) {render_width or 'default'}x{render_height or 'default'}{' [transparent]' if transparent_bg else ''}{f' engine={render_engine}' if render_engine else ''}{f' samples={render_samples}' if render_samples else ''}{tmpl_info}")
        emit(order_line_id, f"Calling renderer (STEP → GLB → Blender) {render_width or 'default'}x{render_height or 'default'}{' [transparent]' if transparent_bg else ''}{f' engine={render_engine}' if render_engine else ''}{f' samples={render_samples}' if render_samples else ''}{tmpl_info}")
        from app.services.step_processor import render_to_file

        success, render_log = render_to_file(
@@ -912,6 +985,7 @@ def render_order_line_task(self, order_line_id: str):
            rotation_y=rotation_y,
            rotation_z=rotation_z,
            job_id=order_line_id,
            order_line_id=order_line_id,
            noise_threshold=noise_threshold,
            denoiser=denoiser,
            denoising_input_passes=denoising_input_passes,

@@ -1,16 +1,15 @@
"""
Blender Python script for rendering an STL file to PNG.
Blender Python script for rendering a GLB file to PNG.
Targets Blender 5.0+ (EEVEE / Cycles).

Called by Blender:
    blender --background --python blender_render.py -- \
        <stl_path> <output_path> <width> <height> [engine] [samples]
        <glb_path> <output_path> <width> <height> [engine] [samples]

engine: "cycles" (default) | "eevee"

Features:
- Disconnected mesh islands split into separate objects and painted with
  palette colours (same 10-colour palette as the Three.js renderer).
- OCC-generated GLB: one mesh per STEP part, already in metres.
- Bounding-box-aware camera: object fills ~85 % of the frame.
- Isometric-style angle (elevation 28°, azimuth 40°).
- Dynamic clip planes.
@@ -57,12 +56,12 @@ else:

if len(argv) < 4:
    print("Usage: blender --background --python blender_render.py -- "
          "<stl_path> <output_path> <width> <height> [engine] [samples] [smooth_angle] [cycles_device] [transparent_bg]")
          "<glb_path> <output_path> <width> <height> [engine] [samples] [smooth_angle] [cycles_device] [transparent_bg]")
    sys.exit(1)

import json as _json

stl_path = argv[0]
glb_path = argv[0]
output_path = argv[1]
width = int(argv[2])
height = int(argv[3])
@@ -163,29 +162,10 @@ def _assign_palette_material(part_obj, index):
import re as _re


def _scale_mm_to_m(parts):
    """Scale imported STL objects from mm to Blender metres (×0.001).

    STEP/STL coordinates are in mm; Blender's default unit is metres.
    Without scaling a 50 mm part appears as 50 m inside Blender — way too large
    relative to any template environment designed in metric units.
    """
    if not parts:
        return
    bpy.ops.object.select_all(action='DESELECT')
    for p in parts:
        p.scale = (0.001, 0.001, 0.001)
        p.location *= 0.001
        p.select_set(True)
    bpy.context.view_layer.objects.active = parts[0]
    bpy.ops.object.transform_apply(scale=True, location=False, rotation=False)
    print(f"[blender_render] scaled {len(parts)} parts mm→m (×0.001)")


def _apply_rotation(parts, rx, ry, rz):
    """Apply Euler rotation (degrees, XYZ order) to all parts around world origin.

    After _import_stl + _scale_mm_to_m the combined bbox center is at world origin,
    After _import_glb the combined bbox center is at world origin,
    so rotating around origin is equivalent to rotating around the assembly center.
    """
    if not parts or (rx == 0.0 and ry == 0.0 and rz == 0.0):
@@ -203,85 +183,35 @@ def _apply_rotation(parts, rx, ry, rz):
    print(f"[blender_render] applied rotation ({rx}°, {ry}°, {rz}°) to {len(parts)} parts")


def _import_stl(stl_file):
    """Import STL into Blender, using per-part STLs if available.
def _import_glb(glb_file):
    """Import OCC-generated GLB into Blender.

    Checks for {stl_stem}_parts/manifest.json next to the STL file.
    - Per-part mode: imports each part STL, names Blender object after STEP part name.
    - Fallback: imports combined STL and splits by loose geometry.

    Returns list of Blender mesh objects, centred at origin.
    OCC exports one mesh object per STEP part, already in metres.
    Returns list of Blender mesh objects, centred at world origin.
    """
    stl_dir = os.path.dirname(stl_file)
    stl_stem = os.path.splitext(os.path.basename(stl_file))[0]
    parts_dir = os.path.join(stl_dir, stl_stem + "_parts")
    manifest_path = os.path.join(parts_dir, "manifest.json")
    bpy.ops.object.select_all(action='DESELECT')
    bpy.ops.import_scene.gltf(filepath=glb_file)
    parts = [o for o in bpy.context.selected_objects if o.type == 'MESH']

    parts = []

    if os.path.isfile(manifest_path):
        # ── Per-part mode ────────────────────────────────────────────────
        try:
            with open(manifest_path, "r") as f:
                manifest = _json.loads(f.read())
            part_entries = manifest.get("parts", [])
        except Exception as e:
            print(f"[blender_render] WARNING: failed to read manifest: {e}")
            part_entries = []

        if part_entries:
            for entry in part_entries:
                part_file = os.path.join(parts_dir, entry["file"])
                part_name = entry["name"]
                if not os.path.isfile(part_file):
                    print(f"[blender_render] WARNING: part STL missing: {part_file}")
                    continue

                bpy.ops.object.select_all(action='DESELECT')
                bpy.ops.wm.stl_import(filepath=part_file)
                imported = bpy.context.selected_objects
                if imported:
                    obj = imported[0]
                    obj.name = part_name
                    if obj.data:
                        obj.data.name = part_name
                    parts.append(obj)

            if parts:
                print(f"[blender_render] imported {len(parts)} named parts from per-part STLs")

    # ── Fallback: combined STL + separate by loose ───────────────────────
    if not parts:
        bpy.ops.wm.stl_import(filepath=stl_file)
        obj = bpy.context.selected_objects[0] if bpy.context.selected_objects else None
        if obj is None:
            print(f"ERROR: No objects imported from {stl_file}")
            sys.exit(1)
        print(f"ERROR: No mesh objects imported from {glb_file}")
        sys.exit(1)

        bpy.context.view_layer.objects.active = obj
        bpy.ops.object.origin_set(type='ORIGIN_GEOMETRY', center='BOUNDS')
        obj.location = (0.0, 0.0, 0.0)
    print(f"[blender_render] imported {len(parts)} part(s) from GLB: "
          f"{[p.name for p in parts[:5]]}")

        bpy.ops.object.mode_set(mode='EDIT')
        bpy.ops.mesh.separate(type='LOOSE')
        bpy.ops.object.mode_set(mode='OBJECT')

        parts = list(bpy.context.selected_objects)
        print(f"[blender_render] fallback: separated into {len(parts)} part(s)")
    return parts

    # ── Centre per-part imports at origin (combined bbox) ────────────────
    # Centre combined bbox at world origin
    all_corners = []
    for p in parts:
        all_corners.extend(p.matrix_world @ Vector(c) for c in p.bound_box)

    if all_corners:
        mins = Vector((min(v.x for v in all_corners),
                       min(v.y for v in all_corners),
                       min(v.z for v in all_corners)))
        maxs = Vector((max(v.x for v in all_corners),
                       max(v.y for v in all_corners),
                       max(v.z for v in all_corners)))
        center = (mins + maxs) * 0.5
        for p in parts:
            p.location -= center
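The centring step above computes the combined axis-aligned bounding box of all imported parts and shifts each one by its midpoint. The same arithmetic with plain tuples instead of Blender's `mathutils.Vector` (a pure-Python stand-in, not the script's actual code):

```python
def bbox_center(corners):
    # corners: iterable of (x, y, z) world-space bound-box points
    mins = [min(c[i] for c in corners) for i in range(3)]
    maxs = [max(c[i] for c in corners) for i in range(3)]
    # Midpoint of the combined axis-aligned bounding box
    return tuple((mins[i] + maxs[i]) * 0.5 for i in range(3))

# Two unit cubes side by side along X: combined bbox spans x in [0, 3]
corners = [(0, 0, 0), (1, 1, 1), (2, 0, 0), (3, 1, 1)]
print(bbox_center(corners))
# → (1.5, 0.5, 0.5)
```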

@@ -292,9 +222,9 @@ def _import_stl(stl_file):
def _resolve_part_name(index, part_obj):
    """Get the STEP part name for a Blender part by index.

    With per-part import, part_obj.name IS the STEP name (possibly with
    With GLB import, part_obj.name IS the STEP name (possibly with
    Blender .NNN suffix for duplicates). Strip that suffix for lookup.
    Falls back to part_names_ordered index mapping for combined-STL mode.
    Falls back to part_names_ordered index mapping.
    """
    # Strip Blender auto-suffix (.001, .002, etc.)
    base_name = _re.sub(r'\.\d{3}$', '', part_obj.name)
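The suffix strip can be checked in isolation. The regex only removes a trailing dot plus exactly three digits (Blender's duplicate-name pattern), so genuine part names that happen to contain dots survive:

```python
import re

def strip_blender_suffix(name: str) -> str:
    # Blender renames duplicate objects "Bolt" → "Bolt.001", "Bolt.002", …
    return re.sub(r'\.\d{3}$', '', name)

print(strip_blender_suffix("Bolt.001"))  # → Bolt
print(strip_blender_suffix("Bolt"))      # → Bolt
print(strip_blender_suffix("Rev.2"))     # → Rev.2  (only a .NNN suffix is stripped)
```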
|
||||
@@ -308,9 +238,9 @@ def _resolve_part_name(index, part_obj):
|
||||
def _apply_material_library(parts, mat_lib_path, mat_map):
|
||||
"""Append materials from library .blend and assign to parts via material_map.
|
||||
|
||||
With per-part STL import, Blender objects are named after STEP parts,
|
||||
so matching is by name (stripping Blender .NNN suffix for duplicates).
|
||||
Falls back to part_names_ordered index-based matching for combined-STL mode.
|
||||
GLB-imported objects are named after STEP parts, so matching is by name
|
||||
(stripping Blender .NNN suffix for duplicates). Falls back to
|
||||
part_names_ordered index-based matching.
|
||||
|
||||
mat_map: {part_name_lower: material_name}
|
||||
Parts without a match keep their current material.
|
||||
@@ -346,8 +276,8 @@ def _apply_material_library(parts, mat_lib_path, mat_map):
|
||||
if not appended:
|
||||
return
|
||||
|
||||
# Assign materials to parts — primary: name-based (per-part STL mode),
|
||||
# secondary: index-based via part_names_ordered (combined STL fallback)
|
||||
# Assign materials to parts — primary: name-based (GLB object names),
|
||||
# secondary: index-based via part_names_ordered
|
||||
assigned_count = 0
|
||||
for i, part in enumerate(parts):
|
||||
# Try name-based matching first (strip Blender .NNN suffix)
|
||||
@@ -380,10 +310,8 @@ if use_template:
|
||||
# Find or create target collection
|
||||
target_col = _ensure_collection(target_collection)
|
||||
|
||||
# Import and split STL
|
||||
parts = _import_stl(stl_path)
|
||||
# Scale mm→m: STEP coords are mm, Blender default unit is metres
|
||||
_scale_mm_to_m(parts)
|
||||
# Import GLB (already in metres from OCC export)
|
||||
parts = _import_glb(glb_path)
|
||||
# Apply render position rotation (before camera/bbox calculations)
|
||||
_apply_rotation(parts, rotation_x, rotation_y, rotation_z)
|
||||
|
||||
@@ -461,9 +389,7 @@ else:
|
||||
# ── MODE A: Factory settings (original behavior) ─────────────────────────
|
||||
needs_auto_camera = True
|
||||
bpy.ops.wm.read_factory_settings(use_empty=True)
|
||||
parts = _import_stl(stl_path)
|
||||
# Scale mm→m: STEP coords are mm, Blender default unit is metres
|
||||
_scale_mm_to_m(parts)
|
||||
parts = _import_glb(glb_path)
|
||||
# Apply render position rotation (before camera/bbox calculations)
|
||||
_apply_rotation(parts, rotation_x, rotation_y, rotation_z)
|
||||
|
||||
@@ -712,7 +638,7 @@ else:
|
||||
draw.rectangle([0, 0, W - 1, bar_h - 1], fill=(0, 137, 61, 255))
|
||||
|
||||
# Model name strip at bottom
|
||||
model_name = os.path.splitext(os.path.basename(stl_path))[0]
|
||||
model_name = os.path.splitext(os.path.basename(glb_path))[0]
|
||||
label_h = max(20, H // 20)
|
||||
img.alpha_composite(
|
||||
Image.new("RGBA", (W, label_h), (30, 30, 30, 180)),
|
||||
|
||||
Symlink
+1
@@ -0,0 +1 @@
|
||||
/home/hartmut/Documents/Copilot/schaefflerautomat/.claude
|
||||
@@ -97,3 +97,9 @@ export async function generateGltfProduction(cadFileId: string): Promise<Generat
|
||||
const res = await api.post<GenerateGltfResponse>(`/cad/${cadFileId}/generate-gltf-production`)
|
||||
return res.data
|
||||
}
|
||||
|
||||
/** Force-reset a CAD file stuck in 'processing' to 'failed'. */
|
||||
export async function resetStuckProcessing(cadFileId: string): Promise<{ status: string; message: string }> {
|
||||
const res = await api.post(`/cad/${cadFileId}/reset-stuck`)
|
||||
return res.data
|
||||
}
|
||||
|
||||
@@ -6,6 +6,8 @@ export interface RenderTemplate {
|
||||
category_key: string | null;
|
||||
output_type_id: string | null;
|
||||
output_type_name: string | null;
|
||||
output_type_ids: string[];
|
||||
output_type_names: string[];
|
||||
blend_file_path: string;
|
||||
original_filename: string;
|
||||
target_collection: string;
|
||||
@@ -39,7 +41,7 @@ export async function createRenderTemplate(formData: FormData): Promise<RenderTe
|
||||
|
||||
export async function updateRenderTemplate(
|
||||
id: string,
|
||||
updates: Partial<Pick<RenderTemplate, 'name' | 'category_key' | 'output_type_id' | 'target_collection' | 'material_replace_enabled' | 'lighting_only' | 'shadow_catcher_enabled' | 'camera_orbit' | 'is_active'>>,
|
||||
updates: Partial<Pick<RenderTemplate, 'name' | 'category_key' | 'output_type_ids' | 'target_collection' | 'material_replace_enabled' | 'lighting_only' | 'shadow_catcher_enabled' | 'camera_orbit' | 'is_active'>>,
|
||||
): Promise<RenderTemplate> {
|
||||
const { data } = await api.patch(`/render-templates/${id}`, updates);
|
||||
return data;
|
||||
|
||||
@@ -10,7 +10,7 @@ export interface WorkflowDefinition {
|
||||
}
|
||||
|
||||
export interface WorkflowConfig {
|
||||
type: 'still' | 'turntable' | 'multi_angle' | 'custom'
|
||||
type: 'still' | 'turntable' | 'multi_angle' | 'still_with_exports' | 'custom'
|
||||
params: WorkflowParams
|
||||
nodes?: WorkflowNode[]
|
||||
}
|
||||
|
||||
@@ -186,7 +186,8 @@ export default function OutputTypeTable() {
|
||||
|
||||
return (
|
||||
<div>
|
||||
<table className="w-full text-sm">
|
||||
<div className="overflow-x-auto">
|
||||
<table className="w-full text-sm min-w-[900px]">
|
||||
<thead>
|
||||
<tr className="border-b border-border-light text-left">
|
||||
<th className="px-4 py-2 font-medium text-content-secondary">Name</th>
|
||||
@@ -1025,6 +1026,7 @@ export default function OutputTypeTable() {
|
||||
)}
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
{!showAdd && (
|
||||
<div className="px-4 py-3">
|
||||
|
||||
@@ -115,7 +115,7 @@ export default function RenderTemplateTable() {
|
||||
setEditDraft({
|
||||
name: t.name,
|
||||
category_key: t.category_key,
|
||||
output_type_id: t.output_type_id,
|
||||
output_type_ids: t.output_type_ids ?? [],
|
||||
target_collection: t.target_collection,
|
||||
material_replace_enabled: t.material_replace_enabled,
|
||||
lighting_only: t.lighting_only,
|
||||
@@ -320,18 +320,39 @@ export default function RenderTemplateTable() {
|
||||
</td>
|
||||
<td className="px-3 py-2">
|
||||
{isEditing ? (
|
||||
<select
|
||||
className={inputCls}
|
||||
value={editDraft.output_type_id ?? t.output_type_id ?? ''}
|
||||
onChange={(e) => setEditDraft({ ...editDraft, output_type_id: e.target.value || null })}
|
||||
>
|
||||
<option value="">Any</option>
|
||||
{outputTypes?.map((ot: OutputType) => (
|
||||
<option key={ot.id} value={ot.id}>{ot.name}</option>
|
||||
))}
|
||||
</select>
|
||||
<div className="flex flex-col gap-0.5 max-h-32 overflow-y-auto">
|
||||
{outputTypes?.map((ot: OutputType) => {
|
||||
const checked = (editDraft.output_type_ids ?? []).includes(ot.id)
|
||||
return (
|
||||
<label key={ot.id} className="flex items-center gap-1 text-xs cursor-pointer whitespace-nowrap">
|
||||
<input
|
||||
type="checkbox"
|
||||
checked={checked}
|
||||
onChange={() => {
|
||||
const current = editDraft.output_type_ids ?? []
|
||||
const next = checked
|
||||
? current.filter((id: string) => id !== ot.id)
|
||||
: [...current, ot.id]
|
||||
setEditDraft({ ...editDraft, output_type_ids: next })
|
||||
}}
|
||||
/>
|
||||
{ot.name}
|
||||
</label>
|
||||
)
|
||||
})}
|
||||
</div>
|
||||
) : (
|
||||
t.output_type_name || <span className="text-content-muted">Any</span>
|
||||
t.output_type_names && t.output_type_names.length > 0 ? (
|
||||
<div className="flex flex-wrap gap-1">
|
||||
{t.output_type_names.map((name, i) => (
|
||||
<span key={i} className="inline-block text-xs px-1.5 py-0.5 bg-blue-100 text-blue-800 rounded">
|
||||
{name}
|
||||
</span>
|
||||
))}
|
||||
</div>
|
||||
) : (
|
||||
<span className="text-content-muted">Any</span>
|
||||
)
|
||||
)}
|
||||
</td>
|
||||
<td className="px-3 py-2">
|
||||
|
||||
@@ -4,13 +4,14 @@ import { Canvas } from '@react-three/fiber'
|
||||
import { OrbitControls, useGLTF, Environment } from '@react-three/drei'
|
||||
import * as THREE from 'three'
|
||||
import { mergeVertices } from 'three/examples/jsm/utils/BufferGeometryUtils.js'
|
||||
import { Loader2, Box, RefreshCw, Grid3X3, Layers, Sun } from 'lucide-react'
|
||||
import { Loader2, Box, RefreshCw, Grid3X3, Layers, Sun, Cpu } from 'lucide-react'
|
||||
import { toast } from 'sonner'
|
||||
import { getMediaAssets } from '../../api/media'
|
||||
import { generateGltfGeometry } from '../../api/cad'
|
||||
import { useAuthStore } from '../../store/auth'
|
||||
|
||||
type ViewMode = 'solid' | 'wireframe'
|
||||
type GlbSource = 'geometry' | 'production'
|
||||
type LightPreset = 'studio' | 'warehouse' | 'sunset' | 'park' | 'city'
|
||||
|
||||
const LIGHT_PRESETS: { id: LightPreset; label: string }[] = [
|
||||
@@ -91,6 +92,7 @@ export default function InlineCadViewer({
|
||||
const [loadingGlb, setLoadingGlb] = useState(false)
|
||||
const [generating, setGenerating] = useState(false)
|
||||
const [viewMode, setViewMode] = useState<ViewMode>('solid')
|
||||
const [glbSource, setGlbSource] = useState<GlbSource>('geometry')
|
||||
const [lightPreset, setLightPreset] = useState<LightPreset>('studio')
|
||||
|
||||
const { data: gltfAssets } = useQuery({
|
||||
@@ -100,20 +102,35 @@ export default function InlineCadViewer({
|
||||
refetchInterval: generating ? 4_000 : false,
|
||||
})
|
||||
|
||||
const { data: productionAssets } = useQuery({
|
||||
queryKey: ['media-assets', cadFileId, 'gltf_production'],
|
||||
queryFn: () => getMediaAssets({ cad_file_id: cadFileId, asset_types: ['gltf_production'] }),
|
||||
staleTime: 0,
|
||||
})
|
||||
|
||||
useEffect(() => {
|
||||
if (generating && gltfAssets && gltfAssets.length > 0) setGenerating(false)
|
||||
}, [generating, gltfAssets])
|
||||
|
||||
const latestAsset = gltfAssets?.[0]
|
||||
const downloadUrl = latestAsset?.download_url
|
||||
const hasGeometry = (gltfAssets?.length ?? 0) > 0
|
||||
const hasProduction = (productionAssets?.length ?? 0) > 0
|
||||
|
||||
// Auto-switch to production if it's the only available source
|
||||
useEffect(() => {
|
||||
if (!hasGeometry && hasProduction) setGlbSource('production')
|
||||
}, [hasGeometry, hasProduction])
|
||||
|
||||
const activeDownloadUrl =
|
||||
glbSource === 'production'
|
||||
? productionAssets?.[0]?.download_url
|
||||
: gltfAssets?.[0]?.download_url

  useEffect(() => {
    if (!downloadUrl || !token) return
    // Clear stale mesh immediately so the loading spinner shows instead of old geometry
    if (!activeDownloadUrl || !token) return
    setGlbBlobUrl(null)
    setLoadingGlb(true)
    let blobUrl = ''
    fetch(downloadUrl, { headers: { Authorization: `Bearer ${token}` } })
    fetch(activeDownloadUrl, { headers: { Authorization: `Bearer ${token}` } })
      .then((r) => r.blob())
      .then((blob) => {
        blobUrl = URL.createObjectURL(blob)
@@ -124,7 +141,7 @@ export default function InlineCadViewer({
    return () => {
      if (blobUrl) URL.revokeObjectURL(blobUrl)
    }
  }, [downloadUrl, token])
  }, [activeDownloadUrl, token])

  const generateMut = useMutation({
    mutationFn: () => generateGltfGeometry(cadFileId),
@@ -149,6 +166,19 @@ export default function InlineCadViewer({

      {/* Toolbar */}
      <div className="absolute top-2 right-2 flex flex-col gap-1 items-end">
        {/* Geometry / Production toggle — only when both exist */}
        {hasGeometry && hasProduction && (
          <div className="flex rounded-md overflow-hidden border border-white/10 bg-black/50 backdrop-blur-sm">
            <ToolbarBtn active={glbSource === 'geometry'} onClick={() => setGlbSource('geometry')} title="Geometry GLB (OCC, no materials)">
              <Box size={12} /> Geo
            </ToolbarBtn>
            <div className="w-px bg-white/10" />
            <ToolbarBtn active={glbSource === 'production'} onClick={() => setGlbSource('production')} title="Production GLB (Blender + PBR materials)">
              <Cpu size={12} /> PBR
            </ToolbarBtn>
          </div>
        )}

        {/* View mode */}
        <div className="flex rounded-md overflow-hidden border border-white/10 bg-black/50 backdrop-blur-sm">
          <ToolbarBtn active={viewMode === 'solid'} onClick={() => setViewMode('solid')} title="Solid">

@@ -24,13 +24,22 @@ import api from '../../api/client'
export interface ThreeDViewerProps {
  cadFileId: string
  onClose: () => void
  /** URL for the geometry-only GLB (from STL export) */
  /** URL for the geometry-only GLB (from OCC export) */
  geometryGltfUrl?: string
  /** URL for the production-quality GLB (from asset library render) */
  /** URL for the production-quality GLB (Blender + PBR materials) */
  productionGltfUrl?: string
  /** Download URLs for GLB and .blend assets */
  /** Whether a geometry GLB exists (for hint display) */
  hasGeometryGlb?: boolean
  /** Whether a production GLB exists (for hint display) */
  hasProductionGlb?: boolean
  /** Called when the user clicks "Generate Geometry GLB" from the hint banner */
  onGenerateGeometry?: () => void
  /** Whether a geometry GLB generation is in progress */
  isGeneratingGeometry?: boolean
  /** Download URLs for assets */
  downloadUrls?: {
    glb?: string
    production?: string
    blend?: string
  }
}
@@ -217,9 +226,15 @@ export default function ThreeDViewer({
  onClose,
  geometryGltfUrl,
  productionGltfUrl,
  hasGeometryGlb,
  hasProductionGlb,
  onGenerateGeometry,
  isGeneratingGeometry,
  downloadUrls,
}: ThreeDViewerProps) {
  const [mode, setMode] = useState<ViewMode>('geometry')
  // Default to production mode if only production GLB is available
  const initialMode: ViewMode = productionGltfUrl && !geometryGltfUrl ? 'production' : 'geometry'
  const [mode, setMode] = useState<ViewMode>(initialMode)
  const [wireframe, setWireframe] = useState(false)
  const [envPreset, setEnvPreset] = useState<EnvPreset>('city')
  const [capturing, setCapturing] = useState(false)
@@ -232,11 +247,11 @@ export default function ThreeDViewer({
    staleTime: 60_000,
  })

  // Resolve the active model URL based on mode
  // Resolve the active model URL: prefer selected mode, fall back to whichever URL exists
  const activeUrl =
    mode === 'production' && productionGltfUrl
      ? productionGltfUrl
      : geometryGltfUrl
      : geometryGltfUrl ?? productionGltfUrl

  const handleModelReady = useCallback(() => setModelReady(true), [])
  const handleError = useCallback((msg: string) => setLoadError(msg), [])
@@ -312,11 +327,20 @@ export default function ThreeDViewer({
          {/* Download buttons */}
          {downloadUrls?.glb && (
            <button
              onClick={() => handleDownload(downloadUrls.glb!, `${cadFileId}.glb`)}
              onClick={() => handleDownload(downloadUrls.glb!, `${cadFileId}_geometry.glb`)}
              className="flex items-center gap-1.5 px-3 py-1.5 rounded-md bg-gray-700 hover:bg-gray-600 text-white text-xs font-medium transition-colors"
            >
              <Download size={12} />
              GLB
              Geometry GLB
            </button>
          )}
          {downloadUrls?.production && (
            <button
              onClick={() => handleDownload(downloadUrls.production!, `${cadFileId}_production.glb`)}
              className="flex items-center gap-1.5 px-3 py-1.5 rounded-md bg-gray-700 hover:bg-gray-600 text-white text-xs font-medium transition-colors"
            >
              <Download size={12} />
              Production GLB
            </button>
          )}
          {downloadUrls?.blend && (
@@ -350,6 +374,37 @@ export default function ThreeDViewer({
        </div>
      </div>

      {/* Hint banners */}
      {!hasProductionGlb && (
        <div className="bg-amber-900/60 border-b border-amber-700/50 px-4 py-2 flex items-center gap-2 text-amber-200 text-xs shrink-0">
          <Cpu size={13} className="shrink-0" />
          <span>
            <strong>No Production GLB yet.</strong> Go to the product page and click "Generate Production GLB" to create a high-quality version with PBR materials and proper mesh smoothing.
          </span>
        </div>
      )}
      {!hasGeometryGlb && hasProductionGlb && onGenerateGeometry && (
        <div className="bg-blue-900/50 border-b border-blue-700/50 px-4 py-2 flex items-center gap-3 text-blue-200 text-xs shrink-0">
          <Box size={13} className="shrink-0" />
          <span>
            <strong>Showing Production GLB.</strong> Generate a Geometry GLB to enable the mode toggle and compare geometry vs. production quality.
          </span>
          {isGeneratingGeometry ? (
            <span className="flex items-center gap-1 text-blue-300 ml-auto shrink-0">
              <Loader2 size={11} className="animate-spin" />
              Generating…
            </span>
          ) : (
            <button
              onClick={onGenerateGeometry}
              className="ml-auto shrink-0 px-3 py-1 rounded bg-blue-700 hover:bg-blue-600 text-white text-xs font-medium transition-colors"
            >
              Generate Geometry GLB
            </button>
          )}
        </div>
      )}

      {/* Viewport */}
      <div className="relative flex-1">
        {loadError && (

@@ -1,4 +1,4 @@
import { useState } from 'react'
import { useState, useEffect } from 'react'
import { useQuery } from '@tanstack/react-query'
import {
  Settings2, BarChart2, Activity, ImageIcon, DollarSign, Cpu,
@@ -123,8 +123,22 @@ function TimeframeSelector({ widgets }: { widgets: WidgetType[] }) {
  )
}

function useLargeScreen() {
  const [isLarge, setIsLarge] = useState(() =>
    typeof window !== 'undefined' ? window.innerWidth >= 1024 : true
  )
  useEffect(() => {
    const mq = window.matchMedia('(min-width: 1024px)')
    const handler = (e: MediaQueryListEvent) => setIsLarge(e.matches)
    mq.addEventListener('change', handler)
    return () => mq.removeEventListener('change', handler)
  }, [])
  return isLarge
}

function DashboardGridInner() {
  const [showCustomize, setShowCustomize] = useState(false)
  const isLarge = useLargeScreen()

  const { data: widgets, isLoading } = useQuery({
    queryKey: ['dashboard-config'],
@@ -150,7 +164,7 @@ function DashboardGridInner() {

      {/* Grid */}
      {isLoading ? (
        <div className="grid grid-cols-3 gap-4">
        <div className="grid grid-cols-1 lg:grid-cols-3 gap-4">
          {[0, 1, 2].map((i) => (
            <div key={i} className="h-40 rounded-xl animate-pulse bg-surface-muted" />
          ))}
@@ -162,7 +176,7 @@ function DashboardGridInner() {
      ) : (
        <div
          className="grid gap-4"
          style={{ gridTemplateColumns: 'repeat(3, minmax(0, 1fr))' }}
          style={isLarge ? { gridTemplateColumns: 'repeat(3, minmax(0, 1fr))' } : { gridTemplateColumns: '1fr' }}
        >
          {(widgets ?? []).map((w, i) => {
            const pos = w.position
@@ -173,12 +187,12 @@ function DashboardGridInner() {
            return (
              <div
                key={`${w.widget_type}-${i}`}
                style={{
                style={isLarge ? {
                  gridColumnStart: pos.col + 1,
                  gridColumnEnd: `span ${pos.w}`,
                  gridRowStart: pos.row + 1,
                  gridRowEnd: `span ${pos.h}`,
                }}
                } : {}}
              >
                <WidgetContainer title={meta.title} icon={meta.icon}>
                  <WidgetBody type={w.widget_type as WidgetType} />
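The explicit-placement logic in the hunk above can be isolated into a pure function, which makes the large-screen/small-screen branching easy to unit test. A sketch under the assumption that `w.position` has `col`/`row`/`w`/`h` fields as used above (the helper name is illustrative):

```typescript
interface WidgetPosition { col: number; row: number; w: number; h: number }

// On large screens, place the widget explicitly on the grid
// (CSS grid lines are 1-based, hence the +1); otherwise return an
// empty style so widgets flow in a single column.
function widgetStyle(pos: WidgetPosition, isLarge: boolean): Record<string, string | number> {
  if (!isLarge) return {}
  return {
    gridColumnStart: pos.col + 1,
    gridColumnEnd: `span ${pos.w}`,
    gridRowStart: pos.row + 1,
    gridRowEnd: `span ${pos.h}`,
  }
}
```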

@@ -54,7 +54,7 @@ function ThemeProvider({ children }: { children: React.ReactNode }) {
  return (
    <>
      {children}
      <Toaster position="top-right" richColors theme={resolvedTheme} />
      <Toaster position="top-left" richColors theme={resolvedTheme} />
    </>
  )
}

@@ -9,8 +9,6 @@ import PricingTierTable from '../components/admin/PricingTierTable'
import OutputTypeTable from '../components/admin/OutputTypeTable'
import RenderTemplateTable from '../components/admin/RenderTemplateTable'
import { useAuthStore } from '../store/auth'
import { getMaterialLibraryInfo, uploadMaterialLibrary, deleteMaterialLibrary } from '../api/renderTemplates'
import type { MaterialLibraryInfo } from '../api/renderTemplates'
import { listPricingTiers } from '../api/pricing'
import { listOutputTypes } from '../api/outputTypes'
import {
@@ -92,6 +90,10 @@ export default function AdminPage() {
  gltf_material_quality: string
  gltf_pbr_roughness: number
  gltf_pbr_metallic: number
  gltf_preview_linear_deflection: number
  gltf_preview_angular_deflection: number
  gltf_production_linear_deflection: number
  gltf_production_angular_deflection: number
}

  const { data: settings } = useQuery({
@@ -115,6 +117,9 @@ export default function AdminPage() {
  const [viewerDraft, setViewerDraft] = useState<Partial<Settings>>({})
  const viewer3d = { ...settings, ...viewerDraft } as Settings

  const [tessellationDraft, setTessellationDraft] = useState<Partial<Settings>>({})
  const tess = { ...settings, ...tessellationDraft } as Settings

  const { data: rendererStatus, refetch: refetchStatus } = useQuery({
    queryKey: ['renderer-status'],
    queryFn: async () => {
@@ -166,6 +171,14 @@ export default function AdminPage() {
    onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed'),
  })

  const recoverStuckMut = useMutation({
    mutationFn: () => api.post('/admin/settings/recover-stuck-processing'),
    onSuccess: (res) => {
      toast.success(res.data.message || 'Stuck files recovered')
    },
    onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed'),
  })
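The mutation above only calls the backend endpoint; per the admin UI copy, the server treats a CAD file as stuck when it has sat in 'processing' for more than 10 minutes. A hedged sketch of that predicate (assumed names and logic, not the actual server code):

```typescript
// Assumed sketch of the server-side rule described by the maintenance button:
// a CAD file counts as stuck when it has been 'processing' longer than the
// threshold (10 minutes, per the UI copy).
function isStuckProcessing(
  status: string,
  startedAtMs: number,
  nowMs: number,
  thresholdMs: number = 10 * 60 * 1000,
): boolean {
  return status === 'processing' && nowMs - startedAtMs > thresholdMs
}
```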

  const seedWorkflowsMut = useMutation({
    mutationFn: () => api.post('/admin/settings/seed-workflows'),
    onSuccess: (res) => {
@@ -636,6 +649,18 @@ export default function AdminPage() {
        <div className="space-y-3">
          <p className="text-xs font-semibold text-content-secondary uppercase tracking-wide">Maintenance</p>
          <div className="grid grid-cols-1 sm:grid-cols-2 gap-3">
            <div className="flex flex-col gap-1">
              <button
                onClick={() => recoverStuckMut.mutate()}
                disabled={recoverStuckMut.isPending}
                className="btn-secondary text-sm w-full justify-start border-amber-400/40 text-amber-600 hover:bg-amber-50"
                title="Reset CAD files stuck in 'processing' for more than 10 minutes to 'failed'. Runs automatically every 5 min."
              >
                <RefreshCw size={14} className={recoverStuckMut.isPending ? 'animate-spin' : ''} />
                {recoverStuckMut.isPending ? 'Recovering…' : 'Recover Stuck Processing'}
              </button>
              <p className="text-xs text-content-muted">Resets files stuck in 'processing' to 'failed'. Runs automatically every 5 min.</p>
            </div>
            <div className="flex flex-col gap-1">
              <button
                onClick={() => processUnprocessedMut.mutate()}
@@ -1091,6 +1116,94 @@ export default function AdminPage() {
        </div>
      </div>

      {/* ------------------------------------------------------------------ */}
      {/* Tessellation Quality */}
      {/* ------------------------------------------------------------------ */}
      <div className="card">
        <div className="p-4 border-b border-border-default">
          <h2 className="font-semibold text-content">Tessellation Quality</h2>
          <p className="text-sm text-content-muted mt-0.5">
            OCC mesh precision for GLB export. Lower values = finer mesh + larger files + slower export.
          </p>
        </div>
        <div className="p-4 space-y-6">
          <div className="grid grid-cols-2 gap-6">
            <div className="space-y-4">
              <p className="text-xs font-semibold text-content-secondary uppercase tracking-wide">Preview (Geometry GLB)</p>
              <div className="flex items-center gap-3">
                <label className="text-sm text-content-secondary w-36 shrink-0">Linear deflection</label>
                <input
                  type="number"
                  step="0.01"
                  min="0.001"
                  max="10"
                  value={tess.gltf_preview_linear_deflection ?? 0.1}
                  onChange={e => setTessellationDraft(d => ({ ...d, gltf_preview_linear_deflection: parseFloat(e.target.value) }))}
                  className="w-24 px-3 py-1.5 border border-border-default rounded-md text-sm focus:outline-none focus:border-blue-400"
                />
                <span className="text-sm text-content-muted">mm</span>
              </div>
              <div className="flex items-center gap-3">
                <label className="text-sm text-content-secondary w-36 shrink-0">Angular deflection</label>
                <input
                  type="number"
                  step="0.05"
                  min="0.05"
                  max="1.5"
                  value={tess.gltf_preview_angular_deflection ?? 0.5}
                  onChange={e => setTessellationDraft(d => ({ ...d, gltf_preview_angular_deflection: parseFloat(e.target.value) }))}
                  className="w-24 px-3 py-1.5 border border-border-default rounded-md text-sm focus:outline-none focus:border-blue-400"
                />
                <span className="text-sm text-content-muted">rad</span>
              </div>
              <p className="text-xs text-content-muted">Used when clicking "Generate Geometry GLB".</p>
            </div>
            <div className="space-y-4">
              <p className="text-xs font-semibold text-content-secondary uppercase tracking-wide">Production (Production GLB)</p>
              <div className="flex items-center gap-3">
                <label className="text-sm text-content-secondary w-36 shrink-0">Linear deflection</label>
                <input
                  type="number"
                  step="0.005"
                  min="0.001"
                  max="10"
                  value={tess.gltf_production_linear_deflection ?? 0.03}
                  onChange={e => setTessellationDraft(d => ({ ...d, gltf_production_linear_deflection: parseFloat(e.target.value) }))}
                  className="w-24 px-3 py-1.5 border border-border-default rounded-md text-sm focus:outline-none focus:border-blue-400"
                />
                <span className="text-sm text-content-muted">mm</span>
              </div>
              <div className="flex items-center gap-3">
                <label className="text-sm text-content-secondary w-36 shrink-0">Angular deflection</label>
                <input
                  type="number"
                  step="0.05"
                  min="0.05"
                  max="1.5"
                  value={tess.gltf_production_angular_deflection ?? 0.2}
                  onChange={e => setTessellationDraft(d => ({ ...d, gltf_production_angular_deflection: parseFloat(e.target.value) }))}
                  className="w-24 px-3 py-1.5 border border-border-default rounded-md text-sm focus:outline-none focus:border-blue-400"
                />
                <span className="text-sm text-content-muted">rad</span>
              </div>
              <p className="text-xs text-content-muted">Used when clicking "Generate Production GLB". Smaller = smoother surfaces.</p>
            </div>
          </div>
          <div className="flex gap-2">
            <button
              onClick={() => { updateSettingsMut.mutate(tessellationDraft); setTessellationDraft({}) }}
              disabled={Object.keys(tessellationDraft).length === 0 || updateSettingsMut.isPending}
              className="btn-primary disabled:opacity-40"
            >
              Save Tessellation Settings
            </button>
            {Object.keys(tessellationDraft).length > 0 && (
              <button onClick={() => setTessellationDraft({})} className="btn-secondary">Reset</button>
            )}
          </div>
        </div>
      </div>
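The deflection inputs above rely on the browser's `min`/`max` attributes, and `parseFloat` on an empty field still yields `NaN`, which would land in the draft as-is. A hypothetical client-side guard (not present in the codebase) could clamp the value before it reaches `updateSettingsMut`:

```typescript
// Hypothetical guard mirroring the input bounds above (the min/max attributes);
// parseFloat('') is NaN, so fall back to the field's default in that case.
function clampDeflection(value: number, min: number, max: number, fallback: number): number {
  if (Number.isNaN(value)) return fallback
  return Math.min(max, Math.max(min, value))
}
```

For example, `clampDeflection(parseFloat(e.target.value), 0.001, 10, 0.1)` would cover the preview linear-deflection field.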

{/* ------------------------------------------------------------------ */}
{/* Material Library link */}
{/* ------------------------------------------------------------------ */}
@@ -1111,73 +1224,19 @@ export default function AdminPage() {

function MaterialLibraryPanel() {
  const qc = useQueryClient()

  const { data: info } = useQuery({
    queryKey: ['material-library-info'],
    queryFn: getMaterialLibraryInfo,
  })

  const uploadMut = useMutation({
    mutationFn: (file: File) => uploadMaterialLibrary(file),
    onSuccess: () => {
      toast.success('Material library uploaded')
      qc.invalidateQueries({ queryKey: ['material-library-info'] })
    },
    onError: (e: any) => toast.error(e.response?.data?.detail || 'Upload failed'),
  })

  const deleteMut = useMutation({
    mutationFn: deleteMaterialLibrary,
    onSuccess: () => {
      toast.success('Material library removed')
      qc.invalidateQueries({ queryKey: ['material-library-info'] })
    },
    onError: (e: any) => toast.error(e.response?.data?.detail || 'Delete failed'),
  })

  function handleFileChange(e: React.ChangeEvent<HTMLInputElement>) {
    const file = e.target.files?.[0]
    if (file) uploadMut.mutate(file)
    e.target.value = ''
  }

  return (
    <div className="space-y-3">
      <h4 className="text-sm font-semibold text-content-secondary">Material Library (.blend)</h4>
    <div className="space-y-2">
      <h4 className="text-sm font-semibold text-content-secondary">Material Library</h4>
      <p className="text-xs text-content-muted">
        Materials in this file can be assigned to product parts when "Material Replace" is enabled on a template.
        Materials for "Material Replace" are now managed via Asset Libraries. The active asset library's materials are used at render time.
      </p>

      {info?.exists ? (
        <div className="flex items-center gap-3 p-3 rounded-lg border border-border-default bg-status-success-bg">
          <CheckCircle2 size={16} className="text-green-500 shrink-0" />
          <div className="flex-1 min-w-0">
            <p className="text-sm font-medium text-content">{info.filename}</p>
            <p className="text-xs text-content-muted">
              {info.size_bytes ? `${(info.size_bytes / 1024 / 1024).toFixed(1)} MB` : ''}
            </p>
          </div>
          <label className="flex items-center gap-1 px-3 py-1.5 text-sm border border-border-default rounded-md bg-surface text-content-secondary hover:bg-surface-hover cursor-pointer">
            <Upload size={14} /> Replace
            <input type="file" accept=".blend" className="hidden" onChange={handleFileChange} />
          </label>
          <button
            onClick={() => { if (confirm('Remove material library?')) deleteMut.mutate() }}
            disabled={deleteMut.isPending}
            className="p-1.5 text-red-500 hover:bg-red-50 rounded"
            title="Remove library"
          >
            <Trash2 size={16} />
          </button>
        </div>
      ) : (
        <label className="flex items-center gap-2 px-4 py-3 border-2 border-dashed border-border-default rounded-lg text-sm text-content-muted hover:border-blue-400 hover:text-blue-600 cursor-pointer transition-colors">
          <Upload size={16} />
          {uploadMut.isPending ? 'Uploading...' : 'Click to upload material library .blend file'}
          <input type="file" accept=".blend" className="hidden" onChange={handleFileChange} />
        </label>
      )}
      <Link
        to="/asset-libraries"
        className="inline-flex items-center gap-1 text-sm text-accent hover:text-accent-hover"
      >
        <Layers size={14} />
        Manage Asset Libraries
      </Link>
    </div>
  )
}

@@ -79,8 +79,8 @@ export default function CadPreviewPage() {
    )
  }

  // No GLB available yet — show generate prompt
  if (!latestGltf) {
  // No GLB at all — show generate prompt
  if (!latestGltf && !latestProduction) {
    return (
      <div className="fixed inset-0 z-50 flex flex-col bg-gray-950">
        <div className="flex items-center justify-between px-5 py-3 bg-gray-900 border-b border-gray-800">
@@ -130,8 +130,13 @@ export default function CadPreviewPage() {
        onClose={() => navigate(-1)}
        geometryGltfUrl={latestGltf?.download_url ?? undefined}
        productionGltfUrl={latestProduction?.download_url ?? undefined}
        hasGeometryGlb={!!latestGltf}
        hasProductionGlb={!!latestProduction}
        isGeneratingGeometry={generating}
        onGenerateGeometry={() => generateMutation.mutate()}
        downloadUrls={{
          glb: latestGltf?.download_url ?? undefined,
          production: latestProduction?.download_url ?? undefined,
          blend: latestBlend?.download_url ?? undefined,
        }}
      />

@@ -739,6 +739,10 @@ function OrderLineRow({
          alt={line.product.name || ''}
          className="w-10 h-10 object-contain rounded border bg-surface"
        />
      ) : (line.render_status === 'processing' || line.render_status === 'pending') ? (
        <div className="w-10 h-10 rounded border border-dashed border-border-default bg-surface-alt flex items-center justify-center animate-pulse">
          <Loader2 size={16} className="text-accent animate-spin" />
        </div>
      ) : (
        <div className="w-10 h-10 rounded border border-dashed border-border-default bg-surface-alt flex items-center justify-center">
          <Box size={16} className="text-content-muted" />

@@ -4,7 +4,7 @@ import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query'
import { useDropzone } from 'react-dropzone'
import {
  ArrowLeft, Pencil, Save, X, Box, Image,
  RotateCcw, RefreshCw, Upload, ChevronDown, ChevronRight, Wand2, Download, Plus, Trash2, Filter, Cuboid, Ruler,
  RotateCcw, RefreshCw, Upload, ChevronDown, ChevronRight, Wand2, Download, Plus, Trash2, Filter, Cuboid, Ruler, Loader2,
} from 'lucide-react'
import { toast } from 'sonner'
import {
@@ -18,9 +18,86 @@ import { listMaterials } from '../api/materials'
import MaterialInput from '../components/shared/MaterialInput'
import MaterialWizard from '../components/MaterialWizard'
import { useAuthStore } from '../store/auth'
import { generateGltfGeometry, generateGltfProduction } from '../api/cad'
import { generateGltfGeometry, generateGltfProduction, resetStuckProcessing } from '../api/cad'
import { getMediaAssets } from '../api/media'
import InlineCadViewer from '../components/cad/InlineCadViewer'

function GlbDownloadButton({
  label, url, filename, onGenerate, isGenerating, title,
}: {
  label: string
  url: string | null
  filename: string
  onGenerate: () => void
  isGenerating: boolean
  title: string
}) {
  const token = useAuthStore((s) => s.token)
  const [isDownloading, setIsDownloading] = useState(false)

  const handleDownload = async () => {
    if (!url || !token) return
    setIsDownloading(true)
    try {
      const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } })
      if (!res.ok) throw new Error(`HTTP ${res.status}`)
      const blob = await res.blob()
      const blobUrl = URL.createObjectURL(blob)
      const a = document.createElement('a')
      a.href = blobUrl
      a.download = filename
      document.body.appendChild(a)
      a.click()
      document.body.removeChild(a)
      URL.revokeObjectURL(blobUrl)
    } catch {
      toast.error(`Failed to download ${label}`)
    } finally {
      setIsDownloading(false)
    }
  }

  if (url) {
    return (
      <div className="flex gap-1 w-full">
        <button
          className="btn-secondary text-xs flex-1 justify-start"
          onClick={handleDownload}
          disabled={isDownloading}
          title={title}
        >
          {isDownloading
            ? <><Loader2 size={12} className="animate-spin" />Downloading…</>
            : <><Download size={12} />Download {label}</>}
        </button>
        <button
          className="btn-secondary text-xs px-2 shrink-0"
          onClick={onGenerate}
          disabled={isGenerating}
          title={`Re-generate ${label}`}
        >
          {isGenerating
            ? <Loader2 size={12} className="animate-spin" />
            : <RotateCcw size={12} />}
        </button>
      </div>
    )
  }

  return (
    <button
      className="btn-secondary text-xs w-full justify-start"
      onClick={onGenerate}
      disabled={isGenerating}
      title={title}
    >
      {isGenerating
        ? <><Loader2 size={12} className="animate-spin" />Queuing…</>
        : <><Download size={12} />Generate {label}</>}
    </button>
  )
}
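The `filename` prop this component receives is built as `${product.name ?? product.pim_id}_geometry.glb` (or `_production.glb`) at the call sites. As a pure helper, that naming rule could be sketched like this (illustrative function, not part of the page):

```typescript
// Prefer the human-readable product name, fall back to the PIM id,
// and suffix with the GLB kind, matching the call sites' template literals.
function glbFilename(
  name: string | null,
  pimId: string,
  kind: 'geometry' | 'production',
): string {
  return `${name ?? pimId}_${kind}.glb`
}
```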
|
||||
|
||||
function CadStatusBadge({ status }: { status: string | null }) {
|
||||
if (!status) return (
|
||||
<span className="text-xs px-2 py-0.5 rounded-full bg-surface-muted text-content-muted">No STEP</span>
|
||||
@@ -92,6 +169,25 @@ export default function ProductDetailPage() {
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [product?.id, product?.cad_parsed_objects?.length, product?.cad_part_materials.length])
|
||||
|
||||
const cadFileId = product?.cad_file_id ?? null
|
||||
|
||||
const { data: geometryGlbAssets = [] } = useQuery({
|
||||
queryKey: ['media-assets', cadFileId, 'gltf_geometry'],
|
||||
queryFn: () => getMediaAssets({ cad_file_id: cadFileId!, asset_types: ['gltf_geometry'] }),
|
||||
enabled: !!cadFileId,
|
||||
staleTime: 0,
|
||||
})
|
||||
|
||||
const { data: productionGlbAssets = [] } = useQuery({
|
||||
queryKey: ['media-assets', cadFileId, 'gltf_production'],
|
||||
queryFn: () => getMediaAssets({ cad_file_id: cadFileId!, asset_types: ['gltf_production'] }),
|
||||
enabled: !!cadFileId,
|
||||
staleTime: 0,
|
||||
})
|
||||
|
||||
const geometryGlbUrl = geometryGlbAssets[0]?.download_url ?? null
|
||||
const productionGlbUrl = productionGlbAssets[0]?.download_url ?? null
|
||||
|
||||
const { data: renders = [] } = useQuery<ProductRender[]>({
|
||||
queryKey: ['product-renders', id],
|
||||
queryFn: () => getProductRenders(id!),
|
||||
@@ -234,6 +330,33 @@ export default function ProductDetailPage() {
|
||||
onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed'),
|
||||
})
|
||||
|
||||
const generateGeometryGlbMut = useMutation({
|
||||
mutationFn: () => generateGltfGeometry(product!.cad_file_id!),
|
||||
onSuccess: () => {
|
||||
toast.info('Geometry GLB export queued')
|
||||
qc.invalidateQueries({ queryKey: ['media-assets', cadFileId, 'gltf_geometry'] })
|
||||
},
|
||||
onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed to queue GLB export'),
|
||||
})
|
||||
|
||||
const generateProductionGlbMut = useMutation({
|
||||
mutationFn: () => generateGltfProduction(product!.cad_file_id!),
|
||||
onSuccess: () => {
|
||||
toast.info('Production GLB export queued')
|
||||
qc.invalidateQueries({ queryKey: ['media-assets', cadFileId, 'gltf_production'] })
|
||||
},
|
||||
onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed to queue production GLB export'),
|
||||
})
|
||||
|
||||
const resetStuckMut = useMutation({
|
||||
mutationFn: () => resetStuckProcessing(product!.cad_file_id!),
|
||||
onSuccess: (res) => {
|
||||
toast.success(res.message)
|
||||
qc.invalidateQueries({ queryKey: ['product', id] })
|
||||
},
|
||||
onError: (e: any) => toast.error(e.response?.data?.detail || 'Reset failed'),
|
||||
})
|
||||
|
||||
const reprocessMut = useMutation({
|
||||
mutationFn: () => reprocessProduct(id!),
|
||||
onSuccess: () => {
|
||||
@@ -545,6 +668,17 @@ export default function ProductDetailPage() {
|
||||
{isPrivileged && (
|
||||
<>
|
||||
<div className="border-t border-border-light pt-2 mt-1 flex flex-col gap-2">
|
||||
{product.processing_status === 'processing' && (
|
||||
<button
|
||||
className="btn-secondary text-xs w-full justify-start border-amber-400/40 text-amber-700 hover:bg-amber-50"
|
||||
onClick={() => resetStuckMut.mutate()}
|
||||
disabled={resetStuckMut.isPending}
|
||||
title="Force-reset a CAD file stuck in 'processing'. Use if the spinner never goes away."
|
||||
>
|
||||
<Loader2 size={12} className={resetStuckMut.isPending ? 'animate-spin' : ''} />
|
||||
{resetStuckMut.isPending ? 'Resetting…' : 'Reset Stuck Processing'}
|
||||
</button>
|
||||
)}
|
||||
<div {...getRootProps()} className="cursor-pointer">
|
||||
<input {...getInputProps()} />
|
||||
<button className="btn-secondary text-xs w-full justify-start" disabled={cadUploadMut.isPending}>
|
||||
@@ -573,30 +707,22 @@ export default function ProductDetailPage() {
|
||||
</div>
|
||||
|
||||
<div className="border-t border-border-light pt-2 mt-1 flex flex-col gap-2">
|
||||
<button
|
||||
className="btn-secondary text-xs w-full justify-start"
|
||||
onClick={() =>
|
||||
generateGltfGeometry(product.cad_file_id!)
|
||||
.then(() => toast.info('Geometry GLB export queued'))
|
||||
.catch(() => toast.error('Failed to queue GLB export'))
|
||||
}
|
||||
<GlbDownloadButton
|
||||
label="Geometry GLB"
|
||||
url={geometryGlbUrl}
|
||||
filename={`${product.name ?? product.pim_id}_geometry.glb`}
|
||||
onGenerate={() => generateGeometryGlbMut.mutate()}
|
||||
isGenerating={generateGeometryGlbMut.isPending}
|
||||
title="Export geometry GLB directly from STEP via OCC (no Blender)"
|
||||
>
|
||||
<Download size={12} />
|
||||
Generate Geometry GLB
|
||||
</button>
|
||||
<button
|
||||
className="btn-secondary text-xs w-full justify-start"
|
||||
onClick={() =>
|
||||
generateGltfProduction(product.cad_file_id!)
|
||||
.then(() => toast.info('Production GLB export queued'))
|
||||
.catch(() => toast.error('Failed to queue production GLB export'))
|
||||
}
|
||||
/>
|
||||
<GlbDownloadButton
|
||||
label="Production GLB"
|
||||
url={productionGlbUrl}
|
||||
filename={`${product.name ?? product.pim_id}_production.glb`}
|
||||
onGenerate={() => generateProductionGlbMut.mutate()}
|
||||
isGenerating={generateProductionGlbMut.isPending}
|
||||
title="Export production GLB with PBR materials via Blender"
|
||||
>
|
||||
<Download size={12} />
|
||||
Generate Production GLB
|
||||
</button>
|
||||
/>
|
||||
</div>
|
||||
</>
|
||||
)}

@@ -389,10 +389,11 @@ function NewWorkflowModal({ onClose, onCreate, isLoading }: NewWorkflowModalProp
<label className="block text-sm text-content-secondary mb-1">Typ</label>
<div className="grid grid-cols-2 gap-2">
  {([
    { value: 'still', label: 'Still', desc: 'Einzelbild PNG' },
    { value: 'turntable', label: 'Turntable', desc: 'Animations-MP4' },
    { value: 'multi_angle', label: 'Multi-Angle', desc: 'Mehrere Winkel' },
    { value: 'custom', label: 'Custom', desc: 'Freier Editor' },
    { value: 'still', label: 'Still', desc: 'Single PNG image' },
    { value: 'turntable', label: 'Turntable', desc: 'Animation MP4' },
    { value: 'multi_angle', label: 'Multi-Angle', desc: 'Multiple angles' },
    { value: 'still_with_exports', label: 'Still + GLB', desc: 'PNG + GLB exports' },
    { value: 'custom', label: 'Custom', desc: 'Free canvas' },
  ] as { value: WorkflowConfig['type']; label: string; desc: string }[]).map(opt => (
    <button
      key={opt.value}
@@ -650,6 +651,7 @@ export default function WorkflowEditor() {
  still: 'Still',
  turntable: 'Turntable',
  multi_angle: 'Multi-Angle',
  still_with_exports: 'Still + GLB',
  custom: 'Custom',
}

@@ -657,6 +659,7 @@ export default function WorkflowEditor() {
  still: 'bg-orange-100 text-orange-700 dark:bg-orange-900/40 dark:text-orange-300',
  turntable: 'bg-purple-100 text-purple-700 dark:bg-purple-900/40 dark:text-purple-300',
  multi_angle: 'bg-blue-100 text-blue-700 dark:bg-blue-900/40 dark:text-blue-300',
  still_with_exports: 'bg-green-100 text-green-700 dark:bg-green-900/40 dark:text-green-300',
  custom: 'bg-surface-hover text-content-muted',
}

@@ -1,80 +1,49 @@
# Plan: Fix GLB Export Pipeline + Viewer Staleness
# Plan: Migrate blender_render.py from STL to GLB Import

## Root Cause Analysis
## Context

### Bug 1 — `export_colors` not valid in Blender 5.0 (CRITICAL)
**File**: `render-worker/scripts/export_gltf.py`
`bpy.ops.export_scene.gltf(export_colors=False)` → Blender exits code 1:
`keyword "export_colors" unrecognized`
→ Blender path always fails → always falls back to trimesh → no materials, no sharp edges, faceted mesh.
This is confirmed in every single log entry. Blender has never successfully exported a GLB.
`render_blender.py` (backend service) correctly converts STEP→GLB via OCC and passes the GLB path as `argv[0]` to `blender_render.py`. However, `blender_render.py` still calls `_import_stl()` which uses `bpy.ops.wm.stl_import()` and `_scale_mm_to_m()`. The GLB from OCC is already in metres (scaled 0.001 internally by `export_step_to_gltf.py`), so no scaling is needed.

### Bug 2 — `GlbModel` `cloned` ref never resets on URL change (CRITICAL)
**File**: `frontend/src/components/cad/InlineCadViewer.tsx`
`cloned = useRef<THREE.Group | null>(null)` with guard `if (!cloned.current)` only clones once.
When `glbBlobUrl` changes (new GLB generated), React does NOT remount `GlbModel` (same position in tree),
so `cloned.current` still holds the old geometry → old mesh shown forever.
Fix: add `key={glbBlobUrl}` to `<GlbModel>` → forces a remount on each new URL.
`still_render.py` already has a correct `_import_glb()` implementation using `bpy.ops.import_scene.gltf()` — this serves as the reference.

### Bug 3 — `glbBlobUrl` not cleared between fetches (UX)
**File**: `frontend/src/components/cad/InlineCadViewer.tsx`
When `downloadUrl` changes, cleanup revokes the old blob URL, but `glbBlobUrl` state still holds
the (now revoked) old URL → `GlbModel` tries to render a revoked URL for the duration of the new fetch.
Fix: `setGlbBlobUrl(null)` at the start of the effect before fetching.
This caused render failures for order SA-2026-00099: Blender tried to STL-import a `.glb` file → silent failure → cancelled renders.

### Bug 4 — `staleTime: 30_000` delays detecting a new GLB (UX)
**File**: `frontend/src/components/cad/InlineCadViewer.tsx`
After "Generate GLB" the task completes and a new MediaAsset is written to the DB,
but the assets query is cached for 30 seconds → `downloadUrl` stays stale → viewer fetches the old GLB.
Fix: reduce `staleTime` to `0` so the query always refetches on focus/mount after invalidation.
## Affected Files

---
| File | Change |
|-------|----------|
| `blender-renderer/blender_render.py` | Replace `_import_stl` with `_import_glb`, remove `_scale_mm_to_m`, rename `stl_path` → `glb_path` |

## Affected Files
## Tasks (in order)

| File | Change | Bug |
|------|--------|-----|
| `render-worker/scripts/export_gltf.py` | Remove invalid `export_colors=False` | 1 |
| `frontend/src/components/cad/InlineCadViewer.tsx` | key={glbBlobUrl} on GlbModel + clear state + staleTime=0 | 2, 3, 4 |
### [x] Task 1: Replace `_import_stl()` with `_import_glb()` in `blender_render.py`

---
- **File**: `blender-renderer/blender_render.py`
- **What**:
  1. Replace the `_import_stl()` function (lines ~206-289) with `_import_glb()` modeled on `still_render.py:196-229`:
     - Use `bpy.ops.import_scene.gltf(filepath=glb_path)`
     - Collect imported mesh objects
     - No scaling needed (GLB already in metres)
  2. Remove the `_scale_mm_to_m()` function (lines ~166-182) — no longer needed
  3. Remove all calls to `_scale_mm_to_m(parts)` (Mode A ~line 466, Mode B ~line 386)
  4. Replace all calls to `_import_stl(stl_path)` with `_import_glb(glb_path)` (Mode A ~line 464, Mode B ~line 384)
  5. Rename the variable `stl_path` → `glb_path` throughout (line 65, 715, and all references)
  6. Update docstrings/comments referencing STL
- **Acceptance criterion**: `blender_render.py` imports GLB via `bpy.ops.import_scene.gltf()`, no STL references remain, no mm→m scaling
- **Dependencies**: none

## Tasks
## Migration check

### Task 1: Fix Blender GLTF export parameters
**File**: `render-worker/scripts/export_gltf.py`
Remove `export_colors=False` from the `bpy.ops.export_scene.gltf()` call.
Keep `export_materials="EXPORT"` and `export_image_format="AUTO"` — these are valid in Blender 5.0.
**Acceptance**: Blender exits 0, a GLB file is created with materials.
**Requires rebuild**: yes — scripts are COPY'd into the container.
No migration needed — a pure script change.

### Task 2: Fix GlbModel stale mesh on regeneration
**File**: `frontend/src/components/cad/InlineCadViewer.tsx`
Add `key={glbBlobUrl}` on the `<GlbModel>` element inside the Canvas.
This forces React to unmount + remount GlbModel whenever the blob URL changes,
resetting the `cloned` ref and loading the fresh geometry.
**Acceptance**: After generating a new GLB, the viewer shows the new mesh, not the old one.
## Recommended order

### Task 3: Clear the stale blob URL before a new fetch
**File**: `frontend/src/components/cad/InlineCadViewer.tsx`
At the top of the `useEffect([downloadUrl, token])` body, add `setGlbBlobUrl(null)` before the fetch.
This shows the loading spinner instead of a broken/stale model during the re-fetch.
**Acceptance**: After regeneration, the viewer shows the spinner while the new GLB loads.
1. Task 1 (blender_render.py)
2. Rebuild the render-worker: `docker compose up -d --build render-worker`
3. Test: trigger a thumbnail render or order render and check the logs

### Task 4: Remove the staleTime delay on the asset query
**File**: `frontend/src/components/cad/InlineCadViewer.tsx`
Change `staleTime: 30_000` → `staleTime: 0` on the `gltf_geometry` assets query.
The `qc.invalidateQueries()` call after generating already forces a refetch,
but staleTime=0 also ensures a refetch on window focus/tab return.
**Acceptance**: A new MediaAsset is picked up within seconds of task completion.
## Risks / Open Questions

---

## Order
Task 1 (rebuild) + Tasks 2/3/4 (frontend hot-reload) in parallel.

## Risks
- `export_materials="EXPORT"` and `export_image_format="AUTO"` may also be invalid in Blender 5.0.
  If so, remove them too and test with the bare minimum params (format + apply only).
- If the Schaeffler .blend library materials use custom node groups instead of Principled BSDF,
  the GLTF exporter will still export flat grey — that requires material baking, out of scope here.
1. **`_apply_material_library()`** and **`_resolve_part_name()`** work on Blender objects after import — they should work identically regardless of import format (STL vs GLB).
2. **Auto-camera computation** uses the bounding box of the imported objects — works the same with GLB meshes.
3. **`turntable_render.py`** and **`turntable_setup.py`** — need to check whether they also still use STL import. If so, they need the same fix (but out of scope for this plan unless confirmed).

@@ -1,16 +1,15 @@
"""
Blender Python script for rendering an STL file to PNG.
Blender Python script for rendering a GLB file to PNG.
Targets Blender 5.0+ (EEVEE / Cycles).

Called by Blender:
    blender --background --python blender_render.py -- \
        <stl_path> <output_path> <width> <height> [engine] [samples]
        <glb_path> <output_path> <width> <height> [engine] [samples]

engine: "cycles" (default) | "eevee"

Features:
- Disconnected mesh islands split into separate objects and painted with
  palette colours (same 10-colour palette as the Three.js renderer).
- OCC-generated GLB: one mesh per STEP part, already in metres.
- Bounding-box-aware camera: object fills ~85 % of the frame.
- Isometric-style angle (elevation 28°, azimuth 40°).
- Dynamic clip planes.
@@ -20,6 +19,12 @@ Features:
import sys
import os
import math

# Force unbuffered stdout so render log lines appear immediately
os.environ["PYTHONUNBUFFERED"] = "1"
if hasattr(sys.stdout, "reconfigure"):
    sys.stdout.reconfigure(line_buffering=True)

import bpy
from mathutils import Vector, Matrix
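
The unbuffered-stdout stanza in this hunk is plain CPython and can be exercised outside Blender. A minimal sketch, assuming nothing beyond the standard library (the `print` target here is the real stdout; in production it is the pipe the render worker reads from):

```python
import os
import sys

# Line-buffer stdout so progress lines from a long-running subprocess
# reach the parent's log immediately, instead of waiting for the pipe
# buffer to fill. reconfigure() exists on TextIOWrapper since 3.7; the
# hasattr guard keeps this safe when stdout has been replaced.
os.environ["PYTHONUNBUFFERED"] = "1"
if hasattr(sys.stdout, "reconfigure"):
    sys.stdout.reconfigure(line_buffering=True)

print("progress: 1/10", flush=True)  # flush=True is a belt-and-braces extra
```

`flush=True` on individual prints and line buffering overlap; the diff uses both so a progress line is never stuck in a buffer.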

@@ -179,7 +184,7 @@ import re as _re
def _apply_rotation(parts, rx, ry, rz):
    """Apply Euler rotation (degrees, XYZ order) to all parts around world origin.

    After _import_stl + _scale_mm_to_m the combined bbox center is at world origin,
    After _import_glb the combined bbox center is at world origin,
    so rotating around origin is equivalent to rotating around the assembly center.
    """
    if not parts or (rx == 0.0 and ry == 0.0 and rz == 0.0):
@@ -301,9 +306,9 @@ def _import_glb(glb_file):
def _resolve_part_name(index, part_obj):
    """Get the STEP part name for a Blender part by index.

    With per-part import, part_obj.name IS the STEP name (possibly with
    With GLB import, part_obj.name IS the STEP name (possibly with
    Blender .NNN suffix for duplicates). Strip that suffix for lookup.
    Falls back to part_names_ordered index mapping for combined-STL mode.
    Falls back to part_names_ordered index mapping.
    """
    # Strip Blender auto-suffix (.001, .002, etc.)
    base_name = _re.sub(r'\.\d{3}$', '', part_obj.name)
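
The name normalisation this diff performs (Blender's `.NNN` duplicate suffix plus the OCC `_AFn` assembly-instance suffixes) can be sketched as one self-contained helper. `normalize_part_name` is a hypothetical name for illustration; the real code inlines these steps:

```python
import re

def normalize_part_name(name: str) -> str:
    """Strip Blender's duplicate suffix (.001, .002, ...) and any OCC
    assembly-instance suffixes (_AF0, _AF1, ...), then lowercase for
    case-insensitive material-map lookup."""
    # Blender appends exactly three digits after a dot for duplicates.
    base = re.sub(r"\.\d{3}$", "", name)
    # _AF suffixes can be nested (part_AF0_AF1), so strip repeatedly
    # until the name stops changing; the loop terminates because each
    # substitution either shortens the string or leaves it unchanged.
    prev = None
    while prev != base:
        prev = base
        base = re.sub(r"_AF\d+$", "", base, flags=re.IGNORECASE)
    return base.lower().strip()

print(normalize_part_name("Innenring_AF0.001"))  # → innenring
```

The `while prev != base` fixed-point loop is the same pattern the diff adds in `_apply_material_library` and the `mat_map_lower` build below.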

@@ -317,9 +322,9 @@ def _resolve_part_name(index, part_obj):
def _apply_material_library(parts, mat_lib_path, mat_map):
    """Append materials from library .blend and assign to parts via material_map.

    With per-part STL import, Blender objects are named after STEP parts,
    so matching is by name (stripping Blender .NNN suffix for duplicates).
    Falls back to part_names_ordered index-based matching for combined-STL mode.
    GLB-imported objects are named after STEP parts, so matching is by name
    (stripping Blender .NNN suffix for duplicates). Falls back to
    part_names_ordered index-based matching.

    mat_map: {part_name_lower: material_name}
    Parts without a match keep their current material.
@@ -355,30 +360,88 @@ def _apply_material_library(parts, mat_lib_path, mat_map):
    if not appended:
        return

    # Assign materials to parts — primary: name-based (per-part STL mode),
    # secondary: index-based via part_names_ordered (combined STL fallback)
    # Assign materials to parts — primary: name-based (GLB object names),
    # secondary: index-based via part_names_ordered
    assigned_count = 0
    unmatched_names = []
    for i, part in enumerate(parts):
        # Try name-based matching first (strip Blender .NNN suffix)
        base_name = _re.sub(r'\.\d{3}$', '', part.name)
        # Strip OCC assembly-instance suffix (_AF0, _AF1, …) — GLB object
        # names may or may not have them while mat_map keys might.
        _prev = None
        while _prev != base_name:
            _prev = base_name
            base_name = _re.sub(r'_AF\d+$', '', base_name, flags=_re.IGNORECASE)
        part_key = base_name.lower().strip()
        mat_name = mat_map.get(part_key)

        # Prefix fallback: if a mat_map key starts with our base name or
        # vice-versa, use the longest matching key (most-specific wins).
        if not mat_name:
            for key, val in sorted(mat_map.items(), key=lambda x: len(x[0]), reverse=True):
                if len(key) >= 5 and len(part_key) >= 5 and (
                    part_key.startswith(key) or key.startswith(part_key)
                ):
                    mat_name = val
                    break
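
The longest-key-first prefix fallback above, extracted into a standalone sketch. The function name and the `min_len=5` default are illustrative; the production code inlines this loop:

```python
def prefix_fallback(part_key: str, mat_map: dict, min_len: int = 5):
    """Return the material for the most specific mat_map key that is a
    prefix of part_key (or vice versa). Keys are tried longest-first so
    'ring_inner_seal' wins over 'ring'; the min_len guard keeps very
    short keys from matching almost everything."""
    for key, val in sorted(mat_map.items(), key=lambda kv: len(kv[0]), reverse=True):
        if len(key) >= min_len and len(part_key) >= min_len and (
            part_key.startswith(key) or key.startswith(part_key)
        ):
            return val
    return None

mats = {"ring": "Steel", "ring_inner_seal": "Rubber"}
print(prefix_fallback("ring_inner_seal_v2", mats))  # → Rubber
```

Sorting by key length descending makes the match deterministic regardless of dict insertion order, which is exactly the issue the review report at the bottom of this commit flags.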

        # Fall back to index-based matching via part_names_ordered
        if not mat_name and part_names_ordered and i < len(part_names_ordered):
            step_name = part_names_ordered[i]
            part_key = step_name.lower().strip()
            mat_name = mat_map.get(part_key)
            step_key = step_name.lower().strip()
            mat_name = mat_map.get(step_key)
            # Also try stripping AF from part_names_ordered entry
            if not mat_name:
                _p2 = None
                while _p2 != step_key:
                    _p2 = step_key
                    step_key = _re.sub(r'_af\d+$', '', step_key)
                mat_name = mat_map.get(step_key)

        if mat_name and mat_name in appended:
            part.data.materials.clear()
            part.data.materials.append(appended[mat_name])
            assigned_count += 1
            print(f"[blender_render] assigned '{mat_name}' to part '{part.name}'")
            print(f"[blender_render] assigned '{mat_name}' to part '{part.name}'", flush=True)
        else:
            unmatched_names.append(part.name)

    print(f"[blender_render] material assignment: {assigned_count}/{len(parts)} parts matched")
    print(f"[blender_render] material assignment: {assigned_count}/{len(parts)} parts matched", flush=True)
    if unmatched_names:
        print(f"[blender_render] unmatched parts (palette fallback): {unmatched_names[:10]}", flush=True)


# ── Early GPU activation (must happen BEFORE open_mainfile / Cycles init) ────
# Blender compiles Cycles kernels when the engine first initializes. If the
# compute_device_type is NONE at that point, Cycles locks to CPU for the rest
# of the session. We therefore probe + enable GPU devices NOW, before any
# .blend template (which may trigger Cycles init) is loaded.
def _activate_gpu():
    """Probe for GPU compute devices and activate them. Returns device type or None."""
    if cycles_device == "cpu":
        return None
    try:
        cprefs = bpy.context.preferences.addons['cycles'].preferences
        for dt in ('OPTIX', 'CUDA', 'HIP', 'ONEAPI'):
            try:
                cprefs.compute_device_type = dt
                cprefs.get_devices()
                gpu = [d for d in cprefs.devices if d.type != 'CPU']
                if gpu:
                    for d in cprefs.devices:
                        d.use = (d.type != 'CPU')
                    print(f"[blender_render] early GPU activation: {dt}, "
                          f"devices={[(d.name, d.type) for d in gpu]}", flush=True)
                    return dt
            except Exception as e:
                print(f"[blender_render] {dt} not available: {e}", flush=True)
    except Exception as e:
        print(f"[blender_render] early GPU probe failed: {e}", flush=True)
    return None

_early_gpu_type = _activate_gpu()
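
The backend probe order in `_activate_gpu()` can be demonstrated without `bpy` by injecting the device query. Here `probe` is a hypothetical stand-in for the `cprefs.compute_device_type` + `get_devices()` round-trip; an exception from it plays the role of "backend unavailable", mirroring the try/except above:

```python
def pick_compute_backend(probe, backends=("OPTIX", "CUDA", "HIP", "ONEAPI")):
    """Try GPU backends in preference order; return the first one that
    reports at least one non-CPU device, else None (CPU fallback).
    `probe(backend)` must return a list of (name, type) tuples."""
    for backend in backends:
        try:
            devices = probe(backend)
        except Exception:
            continue  # kernel/driver for this backend not present
        if any(dtype != "CPU" for _, dtype in devices):
            return backend
    return None  # caller falls back to CPU rendering

fake = {"CUDA": [("RTX 4090", "CUDA"), ("i9", "CPU")]}
print(pick_compute_backend(lambda b: fake[b]))  # → CUDA
```

The ordering OPTIX → CUDA → HIP → ONEAPI prefers the fastest NVIDIA path first and degrades gracefully across vendors, which is why the same tuple appears in both the early and the late activation sites of this diff.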


# ── SCENE SETUP ──────────────────────────────────────────────────────────────

if use_template:
@@ -401,18 +464,32 @@ if use_template:
            col.objects.unlink(part)
        target_col.objects.link(part)

    # Apply smooth shading and mark sharp edges / UV seams
    for part in parts:
    # Apply smooth shading (Blender 5.0+ shade_smooth_by_angle adds a geometry
    # node modifier that handles both smooth shading AND sharp edge marking
    # automatically — no need for the old _mark_sharp_and_seams edit-mode loop)
    import time as _time
    _t_smooth = _time.time()
    for _si, part in enumerate(parts):
        _apply_smooth(part, smooth_angle)
        _mark_sharp_and_seams(
            part, smooth_angle,
            sharp_edge_midpoints=_mesh_attrs.get('sharp_edge_midpoints'),
        )
    print(f"[blender_render] smooth shading: {len(parts)} parts ({_time.time()-_t_smooth:.1f}s)", flush=True)

    # Material assignment: library materials if available, otherwise palette
    if material_library_path and material_map:
        # Build lowercased material_map for matching
        mat_map_lower = {k.lower(): v for k, v in material_map.items()}
        # Build lowercased material_map for matching.
        # Include BOTH the original key AND the key with _AF\d+ stripped,
        # so GLB names (which may lack AF suffixes) can match.
        mat_map_lower = {}
        for k, v in material_map.items():
            kl = k.lower().strip()
            mat_map_lower[kl] = v
            # Also add AF-stripped version
            _stripped = kl
            _p = None
            while _p != _stripped:
                _p = _stripped
                _stripped = _re.sub(r'_af\d+$', '', _stripped)
            if _stripped != kl:
                mat_map_lower.setdefault(_stripped, v)
        _apply_material_library(parts, material_library_path, mat_map_lower)
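
The dual-key map built above (original key plus an `_AF`-stripped alias) as a standalone sketch. `build_material_map` is a hypothetical name; the real code builds `mat_map_lower` inline:

```python
import re

def build_material_map(material_map: dict) -> dict:
    """Lowercase every key and additionally register an _AF-stripped
    alias, so GLB object names without assembly-instance suffixes still
    resolve. setdefault() means a stripped alias never overwrites an
    entry that was present as an original key."""
    out = {}
    for k, v in material_map.items():
        kl = k.lower().strip()
        out[kl] = v
        stripped, prev = kl, None
        while prev != stripped:  # handle nested suffixes like _af0_af1
            prev = stripped
            stripped = re.sub(r"_af\d+$", "", stripped)
        if stripped != kl:
            out.setdefault(stripped, v)
    return out

print(build_material_map({"Welle_AF0": "Steel"}))  # → {'welle_af0': 'Steel', 'welle': 'Steel'}
```

Because direct assignment is used for the original key and `setdefault` only for the alias, an explicit entry for the stripped name always wins over an alias derived from a suffixed sibling.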

        # Parts not matched by library get palette fallback
        for i, part in enumerate(parts):
@@ -477,19 +554,28 @@ else:
    # Apply render position rotation (before camera/bbox calculations)
    _apply_rotation(parts, rotation_x, rotation_y, rotation_z)

    import time as _time
    _t_smooth_a = _time.time()
    for i, part in enumerate(parts):
        _apply_smooth(part, smooth_angle)
        _mark_sharp_and_seams(
            part, smooth_angle,
            sharp_edge_midpoints=_mesh_attrs.get('sharp_edge_midpoints'),
        )
        _assign_palette_material(part, i)
    print(f"[blender_render] smooth+palette: {len(parts)} parts ({_time.time()-_t_smooth_a:.1f}s)", flush=True)

    # Apply material library on top of palette colours (same logic as Mode B).
    # material_library_path / material_map are parsed from argv even in Mode A
    # but were previously never used here — that was the bug.
    if material_library_path and material_map:
        mat_map_lower = {k.lower(): v for k, v in material_map.items()}
        mat_map_lower = {}
        for k, v in material_map.items():
            kl = k.lower().strip()
            mat_map_lower[kl] = v
            _stripped = kl
            _p = None
            while _p != _stripped:
                _p = _stripped
                _stripped = _re.sub(r'_af\d+$', '', _stripped)
            if _stripped != kl:
                mat_map_lower.setdefault(_stripped, v)
        _apply_material_library(parts, material_library_path, mat_map_lower)
        # Parts not matched by the library keep their palette material (already set above)

@@ -633,7 +719,26 @@ if engine == "eevee":
        continue

if engine != "eevee":  # covers both explicit Cycles and EEVEE-fallback
    scene.render.engine = 'CYCLES'
    # ── GPU preferences (before engine activation) ───────────────────────
    # Set compute_device_type in preferences so Cycles can find GPU kernels.
    gpu_type_found = _activate_gpu() or _early_gpu_type

    # ── Activate Cycles engine ───────────────────────────────────────────
    scene.render.engine = 'CYCLES'

    # ── Device selection AFTER engine activation ─────────────────────────
    # IMPORTANT: scene.cycles.device must be set AFTER scene.render.engine
    # = 'CYCLES'. Setting it before can be overwritten when Cycles inits
    # and reads the scene's saved properties (template may have device=CPU).
    if gpu_type_found:
        scene.cycles.device = 'GPU'
        # Re-ensure preferences are set (engine activation may have reset them)
        _activate_gpu()
        print(f"[blender_render] Cycles GPU ({gpu_type_found}), samples={samples}", flush=True)
    else:
        scene.cycles.device = 'CPU'
        print(f"[blender_render] WARNING: GPU not found — falling back to CPU, samples={samples}", flush=True)

    scene.cycles.samples = samples
    scene.cycles.use_denoising = True
    scene.cycles.denoiser = denoiser_arg if denoiser_arg else 'OPENIMAGEDENOISE'
@@ -653,34 +758,6 @@ if engine != "eevee":  # covers both explicit Cycles and EEVEE-fallback
    scene.cycles.use_adaptive_sampling = True
    scene.cycles.adaptive_threshold = float(noise_threshold_arg)

    # ── Device selection: "cpu" forces CPU, "gpu" forces GPU (fail if unavailable),
    # "auto" tries GPU first and falls back to CPU.
    gpu_type_found = None
    if cycles_device != "cpu":
        try:
            cycles_prefs = bpy.context.preferences.addons['cycles'].preferences
            for device_type in ('OPTIX', 'CUDA', 'HIP', 'ONEAPI'):
                try:
                    cycles_prefs.compute_device_type = device_type
                    cycles_prefs.get_devices()
                    gpu_devs = [d for d in cycles_prefs.devices if d.type != 'CPU']
                    if gpu_devs:
                        for d in gpu_devs:
                            d.use = True
                        gpu_type_found = device_type
                        break
                except Exception as e:
                    print(f"[blender_render] {device_type} not available: {e}")
        except Exception as e:
            print(f"[blender_render] GPU probe failed: {e}")

    if gpu_type_found:
        scene.cycles.device = 'GPU'
        print(f"[blender_render] Cycles GPU ({gpu_type_found}), samples={samples}")
    else:
        scene.cycles.device = 'CPU'
        print(f"[blender_render] WARNING: GPU not found — falling back to CPU, samples={samples}")

# ── Colour management ─────────────────────────────────────────────────────────
# In template mode the .blend file owns its colour management (e.g. Filmic/
# AgX for HDR, custom exposure for Alpha-HDR output types). Overwriting it
@@ -705,9 +782,18 @@ scene.render.filepath = output_path
scene.render.film_transparent = transparent_bg

# ── Render ────────────────────────────────────────────────────────────────────
print(f"[blender_render] Rendering → {output_path} (Blender {bpy.app.version_string})")
# Final verification of render device settings
if scene.render.engine == 'CYCLES':
    cprefs = bpy.context.preferences.addons['cycles'].preferences
    print(f"[blender_render] VERIFY: engine={scene.render.engine}, "
          f"cycles.device={scene.cycles.device}, "
          f"compute_device_type={cprefs.compute_device_type}, "
          f"gpu_devices={[(d.name, d.type, d.use) for d in cprefs.devices if d.type != 'CPU']}",
          flush=True)
print(f"[blender_render] Rendering → {output_path} (Blender {bpy.app.version_string})", flush=True)
sys.stdout.flush()
bpy.ops.render.render(write_still=True)
print("[blender_render] render done.")
print("[blender_render] render done.", flush=True)

# ── Pillow post-processing: green bar + model name label ─────────────────────
# Skip overlay for transparent renders to keep clean alpha channel
@@ -726,7 +812,7 @@ else:
    draw.rectangle([0, 0, W - 1, bar_h - 1], fill=(0, 137, 61, 255))

    # Model name strip at bottom
    model_name = os.path.splitext(os.path.basename(stl_path))[0]
    model_name = os.path.splitext(os.path.basename(glb_path))[0]
    label_h = max(20, H // 20)
    img.alpha_composite(
        Image.new("RGBA", (W, label_h), (30, 30, 30, 180)),

@@ -2,7 +2,7 @@

Usage (from Blender):
    blender --background --python turntable_render.py -- \
        <stl_path> <frames_dir> <frame_count> <degrees> <width> <height> \
        <glb_path> <frames_dir> <frame_count> <degrees> <width> <height> \
        <engine> <samples> <part_colors_json> \
        [template_path] [target_collection] [material_library_path] [material_map_json]
"""
@@ -88,7 +88,7 @@ def _apply_rotation(parts, rx, ry, rz):
    """Apply Euler XYZ rotation (degrees) to all parts by modifying matrix_world.

    Rotates around world origin, which equals the assembly centre because
    _import_stl already centres parts there. Applied before material assignment
    _import_glb already centres parts there. Applied before material assignment
    and camera/bbox calculations so everything downstream sees the final pose.
    """
    if not parts or (rx == 0.0 and ry == 0.0 and rz == 0.0):
+36
-32
@@ -1,4 +1,4 @@
# Review Report: Phase V2-Cleanup + Phase V3
# Review Report: Optimized Material Substitution Algorithm
Date: 2026-03-07

## Result: ⚠️ Minor issues
@@ -7,49 +7,53 @@ Datum: 2026-03-07

## Issues Found

### [backend/app/domains/media/router.py] Auth missing on GET /{asset_id} and the DELETE endpoints
**Severity**: Medium
**Recommendation**: `get_asset`, `archive_asset`, `delete_asset_permanent` have no `get_current_user` dependency. This was not in the plan's scope, but should be added in a follow-up task. Currently anyone with an asset UUID could fetch or delete the asset — although UUIDs are not guessable (so V2-C2 is largely satisfied in practice, but formally incomplete).
### [products.py:510] Prefix matching without a minimum-length guard

### [backend/app/domains/rendering/tasks.py] asyncio.get_event_loop() in a Celery context
**Severity**: Low
**Recommendation**: `asyncio.get_event_loop().run_until_complete()` in `_update_workflow_run_status()` is an anti-pattern on newer Python versions (3.10+). A deprecation warning is possible when no running loop exists. Better: `asyncio.run()` or a sync SQLAlchemy session as in `dispatch_service.py`. Not a blocker.

### [render-worker/scripts/export_gltf.py] O(N×M) proximity loop
**Severity**: Low
**Recommendation**: `mark_sharp_edges_by_proximity()` iterates all Blender mesh edges against all OCC edge midpoints. For large STEP files (10k+ edges, 500+ OCC hint points) this can be noticeably slow. Not critical for current product sizes. Noted for later optimisation (e.g. a KD-tree via `scipy.spatial`).
The prefix fallback in `build_materials_from_excel` checks `cad_norm.startswith(excel_norm)` without a minimum length for `excel_norm`:

```python
if excel_norm and cad_norm and (
    cad_norm.startswith(excel_norm) or excel_norm.startswith(cad_norm)
):
```

If an Excel entry becomes very short after normalisation (e.g. `"f"` from `f-12345678.prt`), the prefix check matches almost every CAD name that starts with `f_`. Schaeffler part names are practically always long enough, but the risk of a false match on atypical entries exists.

**Recommendation**: Add a guard: `len(excel_norm) >= 5 and len(cad_norm) >= 5`.
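
The recommended guard as a minimal sketch — the function name is illustrative; in `build_materials_from_excel` the check is inline:

```python
def is_prefix_match(cad_norm: str, excel_norm: str, min_len: int = 5) -> bool:
    """Prefix match in either direction, but only when both normalised
    names have at least min_len characters — this stops a degenerate
    entry like 'f' from matching every CAD name that starts with 'f'."""
    if len(cad_norm) < min_len or len(excel_norm) < min_len:
        return False
    return cad_norm.startswith(excel_norm) or excel_norm.startswith(cad_norm)

print(is_prefix_match("f_12345678_ring", "f"))  # → False (guard rejects)
```

The symmetric `startswith` keeps the existing either-direction semantics; only the degenerate short-key case changes behaviour.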


---

### [export_gltf.py:122] Prefix fallback iterates over unsorted dict keys

### [backend/app/services/step_processor.py] bbox extraction without a shape guard (OCC path)
**Severity**: Low
**Recommendation**: `brepbndlib.Add(shape, bbox)` can yield an empty bbox for degenerate STEP geometry. A guard `if not bbox.IsVoid():` before `bbox.Get()` would be more robust. This OCC path is not active in the current container configuration (no OCC installed), but becomes relevant with the next container upgrade.

```python
for key, val in mat_map_lower.items():
    if lower_base.startswith(key) or key.startswith(lower_base):
        mat_name = val
        break
```

`mat_map_lower` has no guaranteed ordering by key length. If a short key (`"ring"`) and a long key (`"ring_inner_seal"`) both satisfy the prefix condition, the first one in dict order wins — not necessarily the most specific match.

**Recommendation**: Sort keys by length, descending: `sorted(mat_map_lower.items(), key=lambda x: len(x[0]), reverse=True)` — longest match wins = most specific match.

---

## Positive findings

- **V2-C1 (asset_type classification)**: The fix was applied consistently in both places (`admin.py` + `step_tasks.py`), now extension-based.
- **V2-C2 (tenant isolation)**: The `get_current_user` dependency was correctly added to `list_assets`, `download_asset`, `zip_download`. Pattern consistent with the other routers.
- **V2-C3 (storage_key normalisation)**: The `_normalize_key()` helper is cleanly defined in `admin.py`; normalised inline in `step_tasks.py`.
- **V2-C4 (Cache-Control)**: Header set correctly on both endpoints.
- **V3-A1 (OCC bounding box in step_processor.py)**: Code is correct, but only effective when OCC is installed — not active in the production configuration.
- **V3-A2 (frontend dimensions)**: `cad_mesh_attributes` cleanly added to the `ProductOut` schema. `selectinload(Product.cad_file)` was already present in all queries — no N+1 problem.
- **V3-B (mark sharp edges)**: Proximity-based marking with a configurable threshold (1 mm default) is a pragmatic approach.
- **V3-C1/C2/C3 (workflow integration)**: `still_with_exports` correctly added, turntable params are resolved at runtime, WorkflowRun status is updated after task completion.
- **bbox via STL (follow-up fix)**: `_bbox_from_stl()` with numpy min/max is the most efficient method — it uses already-cached STL files, no STEP re-parse needed. The cadquery fallback for files without an STL cache is implemented correctly.
- **`render_step_thumbnail` patch**: Only executed when `dimensions_mm` is not yet set — avoids redundant computation on re-renders.
- **TypeScript**: `tsc --noEmit` runs without errors. The new `cad_mesh_attributes` interface fields are typed correctly.
- **LEARNINGS.md**: 5 new learnings recorded in the correct format.
- **Task 1 correct and robust**: The `while prev != base_name` loop for nested suffixes terminates safely and covers `_AF0_AF1` cases.
- **`_re.IGNORECASE` set correctly** in `export_gltf.py` — covers `_AF0` as well as `_af0`.
- **`_normalize_part_token_name` ordering is right**: `_af\d+` is stripped BEFORE the hyphen→underscore conversion, so the regex works reliably.
- **Hash-suffix stripping `\d{4,}`**: The minimum length of 4 prevents false positives on legitimate short numbers in part names.
- **`print()` in the Blender script is correct**: Blender scripts run as a subprocess and stdout is logged by the caller — `logging` would be wrong here.
- **No DB schema changed**: No migration needed; correctly recognised and skipped.
- **Tuple extended to 4 elements**: `excel_entries` correctly extended to `(tokens, raw, material, excel_norm)`; no old call sites missed.

---

## Recommendation

Approve, with the following follow-ups in the next cleanup cycle:

1. Add auth to `get_asset` + `archive_asset` + `delete_asset_permanent` in `media/router.py`
2. Switch `_update_workflow_run_status()` to `asyncio.run()` or sync SQLAlchemy
3. Insert an `if not bbox.IsVoid():` guard before `bbox.Get()` in `step_processor.py`

None of these points blocks the current state — all core features are implemented correctly.

Review complete. Result: ⚠️
Two low-severity issues (minimum-length guard + sorting by key length). Both are one-line fixes and prevent false matches on atypical inputs. They can be patched inline directly; no re-review needed.