feat(P2): USD Foundation — canonical part identity + material overrides
M1 — USD exporter:
- render-worker/scripts/export_step_to_usd.py (631 lines)
Full XCAF traversal, one UsdGeom.Mesh per leaf part,
schaeffler:partKey on every prim, index-space sharpEdgeVertexPairs
- render-worker/Dockerfile: usd-core>=24.11 installed (USD 0.26.3)
M2 — usd_master MediaAsset + pipeline auto-chain:
- migrations 060 (usd_master enum), 061 (3 JSONB columns),
062 (rename tessellation settings keys)
- generate_usd_master_task: runs export_step_to_usd.py, upserts
usd_master MediaAsset, writes resolved_material_assignments to CadFile
- Auto-chained from generate_gltf_geometry_task after every GLB export
- step_tasks.py shim re-exports generate_usd_master_task
M3 — scene-manifest API:
- part_key_service.py: build_scene_manifest(), generate_part_key(),
four-layer material priority resolution with provenance
- SceneManifest / PartEntry Pydantic models in products/schemas.py
- GET /api/cad/{id}/scene-manifest endpoint (graceful fallback to
parsed_objects when USD not yet generated)
- POST /api/cad/{id}/generate-usd-master endpoint
- frontend/src/api/sceneManifest.ts: fetchSceneManifest(),
triggerUsdMasterGeneration()
M4 — manual-material-overrides API:
- GET/PUT /api/cad/{id}/manual-material-overrides endpoints
- CadFile.manual_material_overrides JSONB column (migration 061)
- getManualOverrides() / saveManualOverrides() in cad.ts
M5 — ThreeDViewer partKey integration:
- export_step_to_gltf.py injects partKeyMap into GLB extras
- ThreeDViewer: partKeyMap extraction, resolvePartKey(), effectiveMaterials
merges legacy partMaterials + new manualOverrides (server-side persistence)
- MaterialPanel: dual-path save (partKey vs legacy), provenance badge,
reconciliation panel for unmatched/unassigned parts
Also:
- Admin.tsx: generate-missing-usd-masters + canonical scenes bulk actions
- ProductDetail.tsx: usd_master row in asset table
- vite-env.d.ts: fix ImportMeta.env TypeScript error
- GPUProbeResult: add timestamp/devices/render_time_s fields
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
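M3's four-layer priority resolution can be sketched in plain Python. This is a minimal illustration, not the actual `part_key_service.py` API; the layer names here (manual > resolved > source > default) are our reading of the three JSONB columns plus a fallback:

```python
def resolve_material(
    part_key: str,
    manual_overrides: dict[str, str],
    resolved: dict[str, str],
    source: dict[str, str],
    default: str = "default",
) -> tuple[str, str]:
    """Pick a material for a part and report its provenance layer.

    Priority, highest first: manual override, auto-resolved assignment,
    source-file assignment, fallback default.
    """
    for layer, assignments in (
        ("manual", manual_overrides),
        ("resolved", resolved),
        ("source", source),
    ):
        if part_key in assignments:
            return assignments[part_key], layer
    return default, "default"
```

The returned layer name is what a provenance badge (as in MaterialPanel) would display.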
@@ -454,3 +454,6 @@ In OCP (pybind11-based), every call to `solid.TShape()` returns a new Python

### 2026-03-11 | GMSH | CharacteristicLengthMax vs. OCC linear_deflection

OCC `linear_deflection=0.1mm` on a 50mm cylinder → edge length ~5mm. GMSH `CharacteristicLengthMax=0.1×15=1.5mm` → 3× more subdivisions → 9× more triangles → a GLB 7× larger. **Fix:** `CharacteristicLengthMax = linear_deflection × 50` (≈5mm) and `MinimumCirclePoints = min(12, ...)` instead of min(20). Result: GMSH at 91% of the OCC size (target ≤120% ✓).
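The ×50 rule and the quadratic triangle growth above can be sketched as follows (helper names are ours, not part of `export_step_to_gltf.py`):

```python
def characteristic_length_max(linear_deflection_mm: float, factor: float = 50.0) -> float:
    # The x50 factor reproduces the ~5 mm edge length OCC yields for
    # linear_deflection=0.1 mm on a 50 mm cylinder.
    return linear_deflection_mm * factor


def triangle_count_ratio(edge_len_a_mm: float, edge_len_b_mm: float) -> float:
    # Shrinking the edge length by k multiplies subdivisions per direction
    # by k, so triangle count scales with the square of the ratio
    # (the note's "3x more subdivisions -> 9x more triangles").
    return (edge_len_a_mm / edge_len_b_mm) ** 2
```

For the numbers in the note: `characteristic_length_max(0.1)` gives 5 mm, and going from 5 mm to 1.5 mm edges multiplies the triangle count by roughly (5/1.5)² ≈ 11, of the same order as the observed 9×/7× GLB blow-up.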
### 2026-03-12 | GMSH | Priority 3 complete — GMSH pipeline status

GMSH 4.15.1 is installed in render-worker. `tessellation_engine=gmsh` is the active DB default. `_tessellate_with_gmsh()` in `export_step_to_gltf.py` is complete: `CharacteristicLengthMax = linear_deflection × 50`, `MinimumCirclePoints = min(12, ...)`, REVERSED solids are preserved (no inverted Jacobian). The production GLB uses cache reuse (no re-tessellation on material changes). Sharp-edge extraction runs after tessellation, independent of the engine type — `Injected N segment pairs into GLB extras` applies to both paths.
@@ -0,0 +1,21 @@
"""Add usd_master to mediaassettype enum.

Revision ID: 060
Revises: 059
"""
from alembic import op

revision = "060"
down_revision = "059"
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.execute("ALTER TYPE media_asset_type ADD VALUE IF NOT EXISTS 'usd_master'")


def downgrade() -> None:
    # PostgreSQL does not support removing enum values.
    # The 'usd_master' value will remain but is no longer referenced by application code.
    pass
@@ -0,0 +1,25 @@
"""Add three-layer material assignment columns to cad_files.

Revision ID: 061
Revises: 060
"""
import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects import postgresql

revision = "061"
down_revision = "060"
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.add_column("cad_files", sa.Column("source_material_assignments", postgresql.JSONB(), nullable=True))
    op.add_column("cad_files", sa.Column("resolved_material_assignments", postgresql.JSONB(), nullable=True))
    op.add_column("cad_files", sa.Column("manual_material_overrides", postgresql.JSONB(), nullable=True))


def downgrade() -> None:
    op.drop_column("cad_files", "manual_material_overrides")
    op.drop_column("cad_files", "resolved_material_assignments")
    op.drop_column("cad_files", "source_material_assignments")
@@ -0,0 +1,25 @@
"""Rename gltf_preview/gltf_production tessellation settings keys.

Revision ID: 062
Revises: 061
"""
from alembic import op

revision = "062"
down_revision = "061"
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.execute("UPDATE system_settings SET key = 'scene_linear_deflection' WHERE key = 'gltf_preview_linear_deflection'")
    op.execute("UPDATE system_settings SET key = 'scene_angular_deflection' WHERE key = 'gltf_preview_angular_deflection'")
    op.execute("UPDATE system_settings SET key = 'render_linear_deflection' WHERE key = 'gltf_production_linear_deflection'")
    op.execute("UPDATE system_settings SET key = 'render_angular_deflection' WHERE key = 'gltf_production_angular_deflection'")


def downgrade() -> None:
    op.execute("UPDATE system_settings SET key = 'gltf_preview_linear_deflection' WHERE key = 'scene_linear_deflection'")
    op.execute("UPDATE system_settings SET key = 'gltf_preview_angular_deflection' WHERE key = 'scene_angular_deflection'")
    op.execute("UPDATE system_settings SET key = 'gltf_production_linear_deflection' WHERE key = 'render_linear_deflection'")
    op.execute("UPDATE system_settings SET key = 'gltf_production_angular_deflection' WHERE key = 'render_angular_deflection'")
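Migration 062's four UPDATE statements amount to a key-rename map. A minimal in-memory sketch of the same rename (the helper name is ours, not part of the codebase):

```python
# Old settings key -> new settings key, mirroring migration 062.
SETTINGS_KEY_RENAMES: dict[str, str] = {
    "gltf_preview_linear_deflection": "scene_linear_deflection",
    "gltf_preview_angular_deflection": "scene_angular_deflection",
    "gltf_production_linear_deflection": "render_linear_deflection",
    "gltf_production_angular_deflection": "render_angular_deflection",
}


def rename_settings_keys(settings: dict[str, str]) -> dict[str, str]:
    # Apply the renames to an in-memory settings dict; keys not in the
    # map pass through unchanged, matching the WHERE-filtered UPDATEs.
    return {SETTINGS_KEY_RENAMES.get(k, k): v for k, v in settings.items()}
```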
@@ -41,10 +41,10 @@ SETTINGS_DEFAULTS: dict[str, str] = {
    "smtp_from_address": "",
    # glTF tessellation quality
    "tessellation_engine": "occ",  # "occ" | "gmsh" — tessellation backend
    "gltf_preview_linear_deflection": "0.1",  # mm — geometry GLB for viewer
    "gltf_preview_angular_deflection": "0.1",  # rad — Standard preset
    "gltf_production_linear_deflection": "0.03",  # mm — production GLB
    "gltf_production_angular_deflection": "0.05",  # rad — Standard preset
    "scene_linear_deflection": "0.1",  # mm — geometry GLB for viewer
    "scene_angular_deflection": "0.1",  # rad — Standard preset
    "render_linear_deflection": "0.03",  # mm — production/render GLB
    "render_angular_deflection": "0.05",  # rad — Standard preset
    # 3D viewer / glTF export settings
    "gltf_scale_factor": "0.001",
    "gltf_smooth_normals": "true",
@@ -74,10 +74,10 @@ class SettingsOut(BaseModel):
    smtp_user: str = ""
    smtp_password: str = ""
    smtp_from_address: str = ""
    gltf_preview_linear_deflection: float = 0.1
    gltf_preview_angular_deflection: float = 0.1
    gltf_production_linear_deflection: float = 0.03
    gltf_production_angular_deflection: float = 0.05
    scene_linear_deflection: float = 0.1
    scene_angular_deflection: float = 0.1
    render_linear_deflection: float = 0.03
    render_angular_deflection: float = 0.05
    gltf_scale_factor: float = 0.001
    gltf_smooth_normals: bool = True
    viewer_max_distance: float = 50.0
@@ -106,10 +106,10 @@ class SettingsUpdate(BaseModel):
    smtp_user: str | None = None
    smtp_password: str | None = None
    smtp_from_address: str | None = None
    gltf_preview_linear_deflection: float | None = None
    gltf_preview_angular_deflection: float | None = None
    gltf_production_linear_deflection: float | None = None
    gltf_production_angular_deflection: float | None = None
    scene_linear_deflection: float | None = None
    scene_angular_deflection: float | None = None
    render_linear_deflection: float | None = None
    render_angular_deflection: float | None = None
    gltf_scale_factor: float | None = None
    gltf_smooth_normals: bool | None = None
    viewer_max_distance: float | None = None
@@ -224,10 +224,10 @@ def _settings_to_out(raw: dict[str, str]) -> SettingsOut:
        smtp_user=raw.get("smtp_user", ""),
        smtp_password=raw.get("smtp_password", ""),
        smtp_from_address=raw.get("smtp_from_address", ""),
        gltf_preview_linear_deflection=float(raw.get("gltf_preview_linear_deflection", "0.1")),
        gltf_preview_angular_deflection=float(raw.get("gltf_preview_angular_deflection", "0.5")),
        gltf_production_linear_deflection=float(raw.get("gltf_production_linear_deflection", "0.03")),
        gltf_production_angular_deflection=float(raw.get("gltf_production_angular_deflection", "0.2")),
        scene_linear_deflection=float(raw.get("scene_linear_deflection", "0.1")),
        scene_angular_deflection=float(raw.get("scene_angular_deflection", "0.5")),
        render_linear_deflection=float(raw.get("render_linear_deflection", "0.03")),
        render_angular_deflection=float(raw.get("render_angular_deflection", "0.2")),
        gltf_scale_factor=float(raw.get("gltf_scale_factor", "0.001")),
        gltf_smooth_normals=raw.get("gltf_smooth_normals", "true") == "true",
        viewer_max_distance=float(raw.get("viewer_max_distance", "50")),
@@ -340,22 +340,22 @@ async def update_settings(
        updates["gltf_pbr_roughness"] = str(body.gltf_pbr_roughness)
    if body.gltf_pbr_metallic is not None:
        updates["gltf_pbr_metallic"] = str(body.gltf_pbr_metallic)
    if body.gltf_preview_linear_deflection is not None:
        if not (0.001 <= body.gltf_preview_linear_deflection <= 10.0):
            raise HTTPException(400, detail="gltf_preview_linear_deflection must be 0.001–10.0 mm")
        updates["gltf_preview_linear_deflection"] = str(body.gltf_preview_linear_deflection)
    if body.gltf_preview_angular_deflection is not None:
        if not (0.05 <= body.gltf_preview_angular_deflection <= 1.5):
            raise HTTPException(400, detail="gltf_preview_angular_deflection must be 0.05–1.5 rad")
        updates["gltf_preview_angular_deflection"] = str(body.gltf_preview_angular_deflection)
    if body.gltf_production_linear_deflection is not None:
        if not (0.001 <= body.gltf_production_linear_deflection <= 10.0):
            raise HTTPException(400, detail="gltf_production_linear_deflection must be 0.001–10.0 mm")
        updates["gltf_production_linear_deflection"] = str(body.gltf_production_linear_deflection)
    if body.gltf_production_angular_deflection is not None:
        if not (0.05 <= body.gltf_production_angular_deflection <= 1.5):
            raise HTTPException(400, detail="gltf_production_angular_deflection must be 0.05–1.5 rad")
        updates["gltf_production_angular_deflection"] = str(body.gltf_production_angular_deflection)
    if body.scene_linear_deflection is not None:
        if not (0.001 <= body.scene_linear_deflection <= 10.0):
            raise HTTPException(400, detail="scene_linear_deflection must be 0.001–10.0 mm")
        updates["scene_linear_deflection"] = str(body.scene_linear_deflection)
    if body.scene_angular_deflection is not None:
        if not (0.05 <= body.scene_angular_deflection <= 1.5):
            raise HTTPException(400, detail="scene_angular_deflection must be 0.05–1.5 rad")
        updates["scene_angular_deflection"] = str(body.scene_angular_deflection)
    if body.render_linear_deflection is not None:
        if not (0.001 <= body.render_linear_deflection <= 10.0):
            raise HTTPException(400, detail="render_linear_deflection must be 0.001–10.0 mm")
        updates["render_linear_deflection"] = str(body.render_linear_deflection)
    if body.render_angular_deflection is not None:
        if not (0.05 <= body.render_angular_deflection <= 1.5):
            raise HTTPException(400, detail="render_angular_deflection must be 0.05–1.5 rad")
        updates["render_angular_deflection"] = str(body.render_angular_deflection)
    if body.tessellation_engine is not None:
        if body.tessellation_engine not in {"occ", "gmsh"}:
            raise HTTPException(400, detail="tessellation_engine must be 'occ' or 'gmsh'")
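The repeated bounds checks above share two ranges (0.001–10.0 mm for linear, 0.05–1.5 rad for angular deflections) and could be factored into one helper; a sketch with a name of our choosing, not present in the diff:

```python
def validate_deflection(name: str, value: float) -> None:
    """Range-check a tessellation deflection setting.

    Linear deflections are millimetres in [0.001, 10.0]; angular
    deflections are radians in [0.05, 1.5], matching the API checks.
    """
    lo, hi, unit = (
        (0.05, 1.5, "rad")
        if name.endswith("_angular_deflection")
        else (0.001, 10.0, "mm")
    )
    if not (lo <= value <= hi):
        raise ValueError(f"{name} must be {lo}-{hi} {unit}")
```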
@@ -532,13 +532,12 @@ async def reextract_all_metadata(
    return {"queued": queued, "message": f"Queued {queued} CAD file(s) for metadata re-extraction"}


@router.post("/settings/generate-missing-geometry-glbs", status_code=status.HTTP_202_ACCEPTED)
async def generate_missing_geometry_glbs(
@router.post("/settings/generate-missing-canonical-scenes", status_code=status.HTTP_202_ACCEPTED)
async def generate_missing_canonical_scenes(
    admin: User = Depends(require_admin),
    db: AsyncSession = Depends(get_db),
):
    """Queue geometry GLB generation for every completed CAD file that has no gltf_geometry MediaAsset."""
    import uuid as _uuid
    """Queue canonical scene (geometry GLB + USD master) generation for every completed CAD file that has no gltf_geometry MediaAsset."""
    from app.domains.media.models import MediaAsset, MediaAssetType

    result = await db.execute(
@@ -561,7 +560,37 @@ async def generate_missing_geometry_glbs(
            generate_gltf_geometry_task.delay(str(cad_file.id))
            queued += 1

    return {"queued": queued, "message": f"Queued {queued} missing geometry GLB task(s)"}
    return {"queued": queued, "message": f"Queued {queued} missing canonical scene task(s)"}


@router.post("/settings/generate-missing-usd-masters", status_code=status.HTTP_202_ACCEPTED)
async def generate_missing_usd_masters(
    admin: User = Depends(require_admin),
    db: AsyncSession = Depends(get_db),
):
    """Queue USD master export for every completed CAD file that has no usd_master MediaAsset."""
    from app.domains.media.models import MediaAsset, MediaAssetType

    result = await db.execute(
        select(CadFile).where(CadFile.processing_status == ProcessingStatus.completed)
    )
    cad_files = result.scalars().all()

    existing_result = await db.execute(
        select(MediaAsset.cad_file_id).where(MediaAsset.asset_type == MediaAssetType.usd_master)
    )
    existing_ids = {row[0] for row in existing_result.all()}

    from app.tasks.step_tasks import generate_usd_master_task
    queued = 0
    for cad_file in cad_files:
        if not cad_file.stored_path:
            continue
        if cad_file.id not in existing_ids:
            generate_usd_master_task.delay(str(cad_file.id))
            queued += 1

    return {"queued": queued, "message": f"Queued {queued} missing USD master task(s)"}


@router.post("/settings/recover-stuck-processing", status_code=status.HTTP_200_OK)
@@ -434,3 +434,110 @@ async def save_part_materials(
        cad_file_id=str(cad.id),
        part_materials=cad.part_materials,
    )


# ---------------------------------------------------------------------------
# ---------------------------------------------------------------------------
# Manual material overrides schemas (partKey-keyed)
# ---------------------------------------------------------------------------

class ManualMaterialOverridesIn(BaseModel):
    overrides: dict[str, str]  # { partKey: materialName }


class ManualMaterialOverridesOut(BaseModel):
    cad_file_id: str
    manual_material_overrides: dict[str, str] | None


# ---------------------------------------------------------------------------
# USD master endpoints
# ---------------------------------------------------------------------------

@router.get("/{id}/scene-manifest")
async def get_scene_manifest(
    id: uuid.UUID,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db),
):
    """Return scene manifest for a CAD file (part keys, material assignments)."""
    from app.domains.products.schemas import SceneManifest
    from app.services.part_key_service import build_scene_manifest
    from app.domains.media.models import MediaAsset, MediaAssetType

    cad = await _get_cad_file(id, db)

    usd_result = await db.execute(
        select(MediaAsset).where(
            MediaAsset.cad_file_id == id,
            MediaAsset.asset_type == MediaAssetType.usd_master,
        )
    )
    usd_asset = usd_result.scalars().first()

    if not usd_asset and not cad.parsed_objects:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="Scene manifest not yet available — run generate-usd-master first",
        )

    manifest_dict = build_scene_manifest(cad, usd_asset)
    return SceneManifest(**manifest_dict)


@router.post("/{id}/generate-usd-master", status_code=status.HTTP_202_ACCEPTED)
async def generate_usd_master(
    id: uuid.UUID,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db),
):
    """Queue a USD master export for a CAD file."""
    if not is_privileged(user):
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Insufficient permissions")

    cad = await _get_cad_file(id, db)
    if not cad.stored_path:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="No STEP file stored")

    from app.tasks.step_tasks import generate_usd_master_task
    task = generate_usd_master_task.delay(str(id))
    return {"status": "queued", "task_id": task.id, "cad_file_id": str(id)}


@router.get("/{id}/manual-material-overrides", response_model=ManualMaterialOverridesOut)
async def get_manual_material_overrides(
    id: uuid.UUID,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db),
):
    """Return manual material overrides (partKey → materialName) for a CAD file."""
    cad = await _get_cad_file(id, db)
    return ManualMaterialOverridesOut(
        cad_file_id=str(id),
        manual_material_overrides=cad.manual_material_overrides,
    )


@router.put("/{id}/manual-material-overrides", response_model=ManualMaterialOverridesOut)
async def save_manual_material_overrides(
    id: uuid.UUID,
    body: ManualMaterialOverridesIn,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db),
):
    """Save manual material overrides keyed by partKey.

    Writes to CadFile.manual_material_overrides (JSONB).
    Takes priority over auto-resolved and source-matched materials in build_scene_manifest().
    """
    if not is_privileged(user):
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Insufficient permissions")

    cad = await _get_cad_file(id, db)
    cad.manual_material_overrides = body.overrides
    await db.commit()
    await db.refresh(cad)
    return ManualMaterialOverridesOut(
        cad_file_id=str(id),
        manual_material_overrides=cad.manual_material_overrides,
    )
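The `effectiveMaterials` merge described in M5 (partKey-keyed manual overrides win over legacy `partMaterials`) reduces to a dict merge where the later mapping takes precedence. The frontend does this in TypeScript; a Python sketch of the same rule:

```python
def effective_materials(
    part_materials: dict[str, str],
    manual_overrides: dict[str, str],
) -> dict[str, str]:
    # The later dict wins on key collisions, so manual overrides
    # take precedence over legacy per-part assignments.
    return {**part_materials, **manual_overrides}
```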
@@ -38,7 +38,6 @@ class RenderConfig(BaseModel):
    blender_cycles_samples: int = 256
    blender_eevee_samples: int = 64
    thumbnail_format: str = "jpg"
    stl_quality: str = "low"
    blender_smooth_angle: int = 30
    cycles_device: str = "auto"
    render_backend: str = "celery"
@@ -0,0 +1,115 @@
"""Sync tenant context helpers for Celery tasks.

Celery tasks run in a sync context (no async event loop), so they cannot use
the async ``set_tenant_context`` from ``app.database``. This module provides
``set_tenant_context_sync`` which accepts a SQLAlchemy sync ``Session`` and
a raw ``tenant_id`` UUID string (or None for global-admin bypass), as well as
``resolve_tenant_id_for_cad`` / ``resolve_tenant_id_for_order_line`` helpers
that look up the tenant_id from the database given only an entity ID.

Typical usage at the start of a Celery task::

    from app.core.tenant_context import resolve_tenant_id_for_cad, set_tenant_context_sync

    tenant_id = resolve_tenant_id_for_cad(cad_file_id)
    # tenant_id is already logged by resolve_tenant_id_for_cad

    # Then in every Session block that does RLS-protected queries:
    with Session(engine) as session:
        set_tenant_context_sync(session, tenant_id)
        # ... queries here respect RLS ...
"""
import logging
from typing import Optional

from sqlalchemy import create_engine, text
from sqlalchemy.orm import Session

logger = logging.getLogger(__name__)


def set_tenant_context_sync(db: Session, tenant_id: Optional[str]) -> None:
    """Set the PostgreSQL RLS context variable for a sync SQLAlchemy session.

    Executes ``SET LOCAL app.current_tenant_id = :tid`` so that all subsequent
    queries within the same transaction respect row-level security policies.

    Args:
        db: An open sync SQLAlchemy ``Session``.
        tenant_id: UUID string of the tenant, or ``None`` / empty string to use
            the bypass sentinel (global-admin context — sees all rows).
    """
    if tenant_id:
        db.execute(
            text("SET LOCAL app.current_tenant_id = :tid"),
            {"tid": str(tenant_id)},
        )
    else:
        # None means no tenant context is known (e.g. system tasks).
        # Use empty string — RLS policies treat '' as no-tenant, which allows
        # global admin queries to proceed without filtering.
        db.execute(text("SET LOCAL app.current_tenant_id = ''"))


def resolve_tenant_id_for_cad(cad_file_id: str) -> Optional[str]:
    """Look up the tenant_id for a CadFile by its primary key.

    Opens a short-lived sync session, reads CadFile.tenant_id, and returns it
    as a string UUID or None. Also emits the ``[TENANT]`` log line.

    Args:
        cad_file_id: The UUID string (or UUID) of the CadFile record.

    Returns:
        tenant_id as ``str`` if the CadFile has one, ``None`` otherwise.
    """
    try:
        from app.config import settings as _cfg
        from app.models.cad_file import CadFile  # compat shim → domains.products.models

        _sync_url = _cfg.database_url.replace("+asyncpg", "")
        _eng = create_engine(_sync_url)
        try:
            with Session(_eng) as _sess:
                _cad = _sess.get(CadFile, cad_file_id)
                tenant_id = str(_cad.tenant_id) if (_cad and _cad.tenant_id) else None
        finally:
            _eng.dispose()
    except Exception as exc:
        logger.warning("[TENANT] resolve_tenant_id_for_cad(%s) failed: %s", cad_file_id, exc)
        tenant_id = None

    logger.info("[TENANT] context set: tenant_id=%s", tenant_id)
    return tenant_id


def resolve_tenant_id_for_order_line(order_line_id: str) -> Optional[str]:
    """Look up the tenant_id for an OrderLine by its primary key.

    Opens a short-lived sync session, reads OrderLine.tenant_id, and returns it
    as a string UUID or None. Also emits the ``[TENANT]`` log line.

    Args:
        order_line_id: The UUID string (or UUID) of the OrderLine record.

    Returns:
        tenant_id as ``str`` if the OrderLine has one, ``None`` otherwise.
    """
    try:
        from app.config import settings as _cfg
        from app.models.order_line import OrderLine  # compat shim

        _sync_url = _cfg.database_url.replace("+asyncpg", "")
        _eng = create_engine(_sync_url)
        try:
            with Session(_eng) as _sess:
                _line = _sess.get(OrderLine, order_line_id)
                tenant_id = str(_line.tenant_id) if (_line and _line.tenant_id) else None
        finally:
            _eng.dispose()
    except Exception as exc:
        logger.warning("[TENANT] resolve_tenant_id_for_order_line(%s) failed: %s", order_line_id, exc)
        tenant_id = None

    logger.info("[TENANT] context set: tenant_id=%s", tenant_id)
    return tenant_id
@@ -17,6 +17,7 @@ class MediaAssetType(str, enum.Enum):
    gltf_geometry = "gltf_geometry"
    gltf_production = "gltf_production"
    blend_production = "blend_production"
    usd_master = "usd_master"


class MediaAsset(Base):
@@ -40,9 +40,14 @@ def generate_gltf_geometry_task(self, cad_file_id: str):
    pl = PipelineLogger(task_id=self.request.id)
    pl.step_start("export_glb_geometry", {"cad_file_id": cad_file_id})

    # Resolve and log tenant context at task start (required for RLS)
    from app.core.tenant_context import resolve_tenant_id_for_cad, set_tenant_context_sync
    _tenant_id = resolve_tenant_id_for_cad(cad_file_id)

    sync_url = app_settings.database_url.replace("+asyncpg", "")
    eng = create_engine(sync_url)
    with Session(eng) as session:
        set_tenant_context_sync(session, _tenant_id)
        cad_file = session.get(CadFile, cad_file_id)
        if not cad_file or not cad_file.stored_path:
            logger.error("generate_gltf_geometry_task: no stored_path for %s", cad_file_id)
@@ -66,10 +71,32 @@ def generate_gltf_geometry_task(self, cad_file_id: str):

        settings_rows = session.execute(_select(_SysSetting)).scalars().all()
        sys_settings = {s.key: s.value for s in settings_rows}

        # Hash-based cache check: skip tessellation if file hasn't changed
        step_file_hash = cad_file.step_file_hash
        if step_file_hash:
            from app.domains.media.models import MediaAsset, MediaAssetType
            import uuid as _uuid_check
            existing_geo = session.execute(
                _select(MediaAsset).where(
                    MediaAsset.cad_file_id == _uuid_check.UUID(cad_file_id),
                    MediaAsset.asset_type == MediaAssetType.gltf_geometry,
                )
            ).scalars().first()
            if existing_geo:
                logger.info("[CACHE] hash match — skipping geometry GLB tessellation for %s", cad_file_id)
                pl.step_done("export_glb_geometry", result={"cached": True, "asset_id": str(existing_geo.id)})
                eng.dispose()
                # Still chain USD master — it has its own hash-check (C2)
                try:
                    generate_usd_master_task.delay(cad_file_id)
                except Exception:
                    logger.debug("Could not queue generate_usd_master_task from cache-hit path (non-fatal)")
                return {"cached": True, "asset_id": str(existing_geo.id)}
    eng.dispose()

    linear_deflection = float(sys_settings.get("gltf_preview_linear_deflection", "0.1"))
    angular_deflection = float(sys_settings.get("gltf_preview_angular_deflection", "0.1"))
    linear_deflection = float(sys_settings.get("scene_linear_deflection", "0.1"))
    angular_deflection = float(sys_settings.get("scene_angular_deflection", "0.1"))
    tessellation_engine = sys_settings.get("tessellation_engine", "occ")

    step = _Path(step_path_str)
@@ -135,6 +162,7 @@ def generate_gltf_geometry_task(self, cad_file_id: str):
|
||||
_sync_url = app_settings.database_url.replace("+asyncpg", "")
|
||||
_eng2 = _ce(_sync_url)
|
||||
with _Session(_eng2) as _sess:
|
||||
set_tenant_context_sync(_sess, _tenant_id)
|
||||
_key = str(output_path)
|
||||
_prefix = str(app_settings.upload_dir).rstrip("/") + "/"
|
||||
if _key.startswith(_prefix):
|
||||
@@ -172,6 +200,14 @@ def generate_gltf_geometry_task(self, cad_file_id: str):
|
||||
|
||||
pl.step_done("export_glb_geometry", result={"glb_path": str(output_path), "asset_id": asset_id})
|
||||
logger.info("generate_gltf_geometry_task: MediaAsset %s created for cad %s", asset_id, cad_file_id)
|
||||
|
||||
# Auto-chain USD master export so the canonical scene is always up to date
|
||||
try:
|
||||
generate_usd_master_task.delay(cad_file_id)
|
||||
logger.info("generate_gltf_geometry_task: queued generate_usd_master_task for %s", cad_file_id)
|
||||
except Exception:
|
||||
logger.debug("Could not queue generate_usd_master_task (non-fatal)")
|
||||
|
||||
return {"glb_path": str(output_path), "asset_id": asset_id}
|
||||
|
||||
|
||||
@@ -207,6 +243,10 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
|
||||
pl.step_start("export_glb_production", {"cad_file_id": cad_file_id})
|
||||
log_task_event(self.request.id, f"generate_gltf_production_task started for cad {cad_file_id}", "info")
|
||||
|
||||
# Resolve and log tenant context at task start (required for RLS)
|
||||
from app.core.tenant_context import resolve_tenant_id_for_cad, set_tenant_context_sync
|
||||
_tenant_id = resolve_tenant_id_for_cad(cad_file_id)
|
||||
|
||||
_sync_url = app_settings.database_url.replace("+asyncpg", "")
|
||||
_eng = _ce(_sync_url)
|
||||
|
||||
@@ -215,6 +255,7 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
|
||||
from app.models.system_setting import SystemSetting
|
||||
|
||||
with _Session(_eng) as _sess:
|
||||
set_tenant_context_sync(_sess, _tenant_id)
|
||||
_cad = _sess.execute(
|
||||
_sel(_CF).where(_CF.id == _uuid.UUID(cad_file_id))
|
||||
).scalar_one_or_none()
|
||||
@@ -231,8 +272,8 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
|
||||
raise RuntimeError(f"STEP file not found: {step_path}")
|
||||
|
||||
smooth_angle = float(sys_settings.get("blender_smooth_angle", "30"))
|
||||
prod_linear = float(sys_settings.get("gltf_production_linear_deflection", "0.03"))
|
||||
prod_angular = float(sys_settings.get("gltf_production_angular_deflection", "0.05"))
|
||||
prod_linear = float(sys_settings.get("render_linear_deflection", "0.03"))
|
||||
prod_angular = float(sys_settings.get("render_angular_deflection", "0.05"))
|
||||
tessellation_engine = sys_settings.get("tessellation_engine", "occ")
|
||||
|
||||
scripts_dir = _Path(_os.environ.get("RENDER_SCRIPTS_DIR", "/render-scripts"))
|
||||
@@ -289,8 +330,8 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
|
||||
# because CharacteristicLengthMax becomes too small. GMSH quality is algorithmic
|
||||
# (conforming seams) not density-based — a denser GMSH mesh adds no UV-unwrap benefit.
|
||||
if tessellation_engine == "gmsh":
|
||||
eff_linear = float(sys_settings.get("gltf_preview_linear_deflection", "0.1"))
|
||||
eff_angular = float(sys_settings.get("gltf_preview_angular_deflection", "0.1"))
|
||||
eff_linear = float(sys_settings.get("scene_linear_deflection", "0.1"))
|
||||
eff_angular = float(sys_settings.get("scene_angular_deflection", "0.1"))
|
||||
else:
|
||||
eff_linear = prod_linear
|
||||
eff_angular = prod_angular
|
||||
@@ -330,6 +371,7 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
|
||||
from app.domains.products.models import Product as _Product
|
||||
|
||||
with _Session(_eng) as _sess:
|
||||
set_tenant_context_sync(_sess, _tenant_id)
|
||||
_prod_query = _sel(_Product).where(_Product.cad_file_id == _uuid.UUID(cad_file_id))
|
||||
if product_id:
|
||||
_prod_query = _prod_query.where(_Product.id == _uuid.UUID(product_id))
|
||||
@@ -405,6 +447,7 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
    # any frontend page holding a stale download_url continues to resolve correctly.
    _eng2 = _ce(_sync_url)
    with _Session(_eng2) as _sess:
        set_tenant_context_sync(_sess, _tenant_id)
        _key = str(output_path)
        _prefix = str(app_settings.upload_dir).rstrip("/") + "/"
        if _key.startswith(_prefix):
@@ -443,3 +486,204 @@ def generate_gltf_production_task(self, cad_file_id: str, product_id: str | None
    pl.step_done("export_glb_production", result={"glb_path": str(output_path), "asset_id": asset_id})
    logger.info("generate_gltf_production_task: MediaAsset %s created for cad %s", asset_id, cad_file_id)
    return {"glb_path": str(output_path), "asset_id": asset_id}


@celery_app.task(
    bind=True,
    name="app.tasks.step_tasks.generate_usd_master_task",
    queue="thumbnail_rendering",
    max_retries=1,
)
def generate_usd_master_task(self, cad_file_id: str) -> dict:
    """Export a USD master file from STEP via OCC + pxr authoring.

    Pipeline:
    1. Reads the STEP file via export_step_to_usd.py (OCC XCAF + pxr)
    2. Writes the .usd file alongside the STEP file
    3. Stores the result as a usd_master MediaAsset
    4. Parses MANIFEST_JSON from stdout → writes resolved_material_assignments to CadFile
    """
    import json as _json
    import os as _os
    import subprocess as _subprocess
    import sys as _sys
    import uuid as _uuid
    from pathlib import Path as _Path
    from sqlalchemy import create_engine as _ce, select as _sel
    from sqlalchemy.orm import Session as _Session

    from app.config import settings as app_settings
    from app.domains.media.models import MediaAsset, MediaAssetType
    from app.domains.products.models import Product
    from app.models.cad_file import CadFile
    from app.models.system_setting import SystemSetting
    pl = PipelineLogger(task_id=self.request.id)
    pl.step_start("usd_master", {"cad_file_id": cad_file_id})

    from app.core.tenant_context import resolve_tenant_id_for_cad, set_tenant_context_sync
    _tenant_id = resolve_tenant_id_for_cad(cad_file_id)

    sync_url = app_settings.database_url.replace("+asyncpg", "")
    eng = _ce(sync_url)

    with _Session(eng) as sess:
        set_tenant_context_sync(sess, _tenant_id)
        cad_file = sess.get(CadFile, cad_file_id)
        if not cad_file or not cad_file.stored_path:
            logger.error("generate_usd_master_task: no stored_path for %s", cad_file_id)
            return {"error": "no stored_path"}

        step_path = _Path(cad_file.stored_path)
        product = sess.execute(
            _sel(Product).where(Product.cad_file_id == cad_file.id)
        ).scalar_one_or_none()

        color_map: dict[str, str] = {}
        if product and product.cad_part_materials:
            for entry in product.cad_part_materials:
                part_name = entry.get("part_name") or entry.get("name", "")
                hex_color = entry.get("hex_color") or entry.get("color", "")
                if part_name and hex_color:
                    color_map[part_name] = hex_color

        settings_rows = sess.execute(_sel(SystemSetting)).scalars().all()
        sys_settings = {s.key: s.value for s in settings_rows}
        # Hash-based cache check: skip tessellation if the file hasn't changed
        step_file_hash = cad_file.step_file_hash
        if step_file_hash:
            existing_usd = sess.execute(
                _sel(MediaAsset).where(
                    MediaAsset.cad_file_id == cad_file.id,
                    MediaAsset.asset_type == MediaAssetType.usd_master,
                )
            ).scalars().first()
            if existing_usd:
                logger.info("[CACHE] hash match — skipping USD master tessellation for %s", cad_file_id)
                pl.step_done("usd_master", result={"cached": True, "asset_id": str(existing_usd.id)})
                eng.dispose()
                return {"cached": True, "asset_id": str(existing_usd.id)}
    eng.dispose()
    if not step_path.exists():
        err = f"STEP file not found: {step_path}"
        pl.step_error("usd_master", err, None)
        raise RuntimeError(err)

    linear_deflection = float(sys_settings.get("render_linear_deflection", "0.03"))
    angular_deflection = float(sys_settings.get("render_angular_deflection", "0.05"))
    sharp_threshold = float(sys_settings.get("sharp_edge_threshold", "20.0"))

    output_path = step_path.parent / f"{step_path.stem}_master.usd"
    scripts_dir = _Path(_os.environ.get("RENDER_SCRIPTS_DIR", "/render-scripts"))
    script_path = scripts_dir / "export_step_to_usd.py"

    if not script_path.exists():
        err = f"export_step_to_usd.py not found at {script_path}"
        pl.step_error("usd_master", err, None)
        raise RuntimeError(err)
    cmd = [
        _sys.executable, str(script_path),
        "--step_path", str(step_path),
        "--output_path", str(output_path),
        "--color_map", _json.dumps(color_map),
        "--linear_deflection", str(linear_deflection),
        "--angular_deflection", str(angular_deflection),
        "--sharp_threshold", str(sharp_threshold),
        "--cad_file_id", cad_file_id,
    ]

    log_task_event(
        self.request.id,
        f"[USD_MASTER] exporting STEP → USD: {step_path.name}",
        "info",
    )
    try:
        result = _subprocess.run(cmd, capture_output=True, text=True, timeout=600)
        for line in result.stdout.splitlines():
            logger.info("[usd-master] %s", line)
        for line in result.stderr.splitlines():
            logger.warning("[usd-master stderr] %s", line)

        if result.returncode != 0 or not output_path.exists() or output_path.stat().st_size == 0:
            raise RuntimeError(
                f"export_step_to_usd.py failed (exit {result.returncode}).\n"
                f"STDERR: {result.stderr[-1000:]}"
            )
    except Exception as exc:
        log_task_event(self.request.id, f"[USD_MASTER] failed: {exc}", "error")
        pl.step_error("usd_master", str(exc), exc)
        raise self.retry(exc=exc, countdown=15)
    # --- Store MediaAsset (upsert) ---
    eng2 = _ce(sync_url)
    asset_id: str = ""
    with _Session(eng2) as sess2:
        set_tenant_context_sync(sess2, _tenant_id)
        _key = str(output_path)
        _prefix = str(app_settings.upload_dir).rstrip("/") + "/"
        if _key.startswith(_prefix):
            _key = _key[len(_prefix):]
        _file_size = output_path.stat().st_size if output_path.exists() else None

        existing = sess2.execute(
            _sel(MediaAsset).where(
                MediaAsset.cad_file_id == _uuid.UUID(cad_file_id),
                MediaAsset.asset_type == MediaAssetType.usd_master,
            )
        ).scalars().first()

        if existing:
            existing.storage_key = _key
            existing.mime_type = "model/vnd.usd"
            existing.file_size_bytes = _file_size
            sess2.commit()
            asset_id = str(existing.id)
        else:
            asset = MediaAsset(
                cad_file_id=_uuid.UUID(cad_file_id),
                asset_type=MediaAssetType.usd_master,
                storage_key=_key,
                mime_type="model/vnd.usd",
                file_size_bytes=_file_size,
            )
            sess2.add(asset)
            sess2.commit()
            asset_id = str(asset.id)
    eng2.dispose()
    # --- Parse MANIFEST_JSON and write resolved_material_assignments ---
    manifest_parts: list = []
    for line in result.stdout.splitlines():
        if line.startswith("MANIFEST_JSON: "):
            try:
                manifest_parts = _json.loads(line[len("MANIFEST_JSON: "):]).get("parts", [])
            except Exception as parse_exc:
                logger.warning("[USD_MASTER] MANIFEST_JSON parse failed: %s", parse_exc)
            break
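The stdout handshake between the task and export_step_to_usd.py can be exercised in isolation. A minimal sketch of the same parse, using invented sample log lines and part data:

```python
import json

# Simulated exporter stdout: ordinary log lines plus the single MANIFEST_JSON line.
stdout = "\n".join([
    "[usd] tessellating 3 solids",
    'MANIFEST_JSON: {"parts": [{"part_key": "outer_ring", '
    '"source_name": "OuterRing", "prim_path": "/Root/outer_ring"}]}',
    "[usd] done",
])

manifest_parts = []
for line in stdout.splitlines():
    if line.startswith("MANIFEST_JSON: "):
        manifest_parts = json.loads(line[len("MANIFEST_JSON: "):]).get("parts", [])
        break

# Reshape into the resolved_material_assignments JSONB layout.
resolved = {
    p["part_key"]: {"source_name": p["source_name"], "prim_path": p["prim_path"]}
    for p in manifest_parts
}
print(resolved["outer_ring"]["prim_path"])  # /Root/outer_ring
```

The single-line prefix protocol keeps the manifest machine-readable while all other exporter output stays free-form log text.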
    if manifest_parts:
        try:
            resolved = {
                p["part_key"]: {"source_name": p["source_name"], "prim_path": p["prim_path"]}
                for p in manifest_parts
            }
            eng3 = _ce(sync_url)
            with _Session(eng3) as sess3:
                set_tenant_context_sync(sess3, _tenant_id)
                row = sess3.get(CadFile, cad_file_id)
                if row:
                    row.resolved_material_assignments = resolved
                    sess3.commit()
            eng3.dispose()
            logger.info("[USD_MASTER] wrote resolved_material_assignments (%d parts)", len(resolved))
        except Exception as write_exc:
            logger.warning("[USD_MASTER] failed to write resolved_material_assignments: %s", write_exc)

    log_task_event(self.request.id, f"[USD_MASTER] done: {output_path.name}", "done")
    pl.step_done("usd_master", result={"usd_path": str(output_path), "asset_id": asset_id})
    return {"usd_path": str(output_path), "asset_id": asset_id, "n_parts": len(manifest_parts)}
@@ -89,6 +89,10 @@ def process_step_file(self, cad_file_id: str):
    pl = PipelineLogger(task_id=self.request.id)
    pl.step_start("process_step_file", {"cad_file_id": cad_file_id})

    # Resolve and log tenant context at task start (required for RLS)
    from app.core.tenant_context import resolve_tenant_id_for_cad
    _tenant_id = resolve_tenant_id_for_cad(cad_file_id)

    lock_key = f"step_processing_lock:{cad_file_id}"
    r = redis_lib.from_url(app_settings.redis_url)
    acquired = r.set(lock_key, "1", nx=True, ex=600)  # 10-minute TTL
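The `nx`/`ex` idiom above is a best-effort distributed lock: `SET` succeeds only if the key is absent, and the TTL guarantees the lock self-releases if the worker dies mid-task. A tiny in-memory stand-in sketching those semantics (the real code uses redis-py; `FakeRedis` here is invented purely for illustration):

```python
import time

class FakeRedis:
    """Stand-in mimicking Redis SET NX EX semantics, for illustration only."""
    def __init__(self):
        self._store: dict[str, tuple[str, float]] = {}

    def set(self, key, value, nx=False, ex=None):
        now = time.monotonic()
        # Evict an expired key before honoring NX, like Redis lazy expiry.
        if key in self._store and self._store[key][1] <= now:
            del self._store[key]
        if nx and key in self._store:
            return None  # redis-py returns None when NX blocks the write
        self._store[key] = (value, now + (ex if ex is not None else float("inf")))
        return True

r = FakeRedis()
print(r.set("step_processing_lock:abc", "1", nx=True, ex=600))  # True  (lock acquired)
print(r.set("step_processing_lock:abc", "1", nx=True, ex=600))  # None  (second worker loses)
```

The same two-call pattern is why duplicate `process_step_file` deliveries for one `cad_file_id` collapse to a single run.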
@@ -213,9 +217,14 @@ def reextract_cad_metadata(cad_file_id: str):
    pl = PipelineLogger(task_id=None)
    pl.step_start("reextract_cad_metadata", {"cad_file_id": cad_file_id})

    # Resolve and log tenant context at task start (required for RLS)
    from app.core.tenant_context import resolve_tenant_id_for_cad, set_tenant_context_sync
    _tenant_id = resolve_tenant_id_for_cad(cad_file_id)

    sync_url = app_settings.database_url.replace("+asyncpg", "")
    eng = create_engine(sync_url)
    with Session(eng) as session:
        set_tenant_context_sync(session, _tenant_id)
        cad_file = session.get(CadFile, cad_file_id)
        if not cad_file or not cad_file.stored_path:
            logger.warning(f"reextract_cad_metadata: file not found {cad_file_id}")
@@ -229,6 +238,7 @@ def reextract_cad_metadata(cad_file_id: str):
    patch = _bbox_from_glb(str(glb_path)) or _bbox_from_step_cadquery(step_path)
    if patch:
        with Session(eng) as session:
            set_tenant_context_sync(session, _tenant_id)
            cad_file = session.get(CadFile, cad_file_id)
            if cad_file:
                cad_file.mesh_attributes = {**(cad_file.mesh_attributes or {}), **patch}
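The `{**(existing or {}), **patch}` merge used here (and again in the thumbnail tasks below) is what lets each pipeline step patch `mesh_attributes` without clobbering keys written by earlier steps. A sketch with invented keys:

```python
mesh_attributes = None  # the JSONB column may be NULL on first write

# Step 1 writes a bounding box.
bbox_patch = {"bbox_mm": [40.0, 40.0, 12.0]}
mesh_attributes = {**(mesh_attributes or {}), **bbox_patch}

# Step 2 later writes sharp-edge data; the bbox must survive.
edge_patch = {"sharp_edge_count": 128}
mesh_attributes = {**(mesh_attributes or {}), **edge_patch}

# Both patches are present; later writes win on key collisions.
print(mesh_attributes)  # {'bbox_mm': [40.0, 40.0, 12.0], 'sharp_edge_count': 128}
```

Note this is a shallow merge: nested dicts inside a patch replace, rather than merge with, the stored value.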
@@ -30,6 +30,11 @@ def render_order_line_task(self, order_line_id: str):
    pl = PipelineLogger(task_id=self.request.id, order_line_id=order_line_id)
    pl.step_start("render_order_line_task", {"order_line_id": order_line_id})
    logger.info(f"Rendering order line: {order_line_id}")

    # Resolve and log tenant context at task start (required for RLS)
    from app.core.tenant_context import resolve_tenant_id_for_order_line, set_tenant_context_sync
    _tenant_id = resolve_tenant_id_for_order_line(order_line_id)

    from app.services.render_log import emit

    emit(order_line_id, "Celery render task started")
@@ -43,6 +48,7 @@ def render_order_line_task(self, order_line_id: str):
    engine = create_engine(sync_url)

    with Session(engine) as session:
        set_tenant_context_sync(session, _tenant_id)
        from app.models.order_line import OrderLine
        from app.models.product import Product
@@ -89,6 +95,30 @@ def render_order_line_task(self, order_line_id: str):
        cad_file = line.product.cad_file
        materials_source = line.product.cad_part_materials

        # Look up the USD master asset for this CAD file — used when rendering
        # via the USD path instead of the production GLB
        from pathlib import Path as _Path
        from app.domains.media.models import MediaAsset, MediaAssetType
        usd_render_path = None
        if cad_file:
            _usd_asset = session.execute(
                select(MediaAsset)
                .where(
                    MediaAsset.cad_file_id == cad_file.id,
                    MediaAsset.asset_type == MediaAssetType.usd_master,
                )
                .order_by(MediaAsset.created_at.desc())
                .limit(1)
            ).scalar_one_or_none()
            if _usd_asset and _usd_asset.storage_key:
                _usd_candidate = _Path(app_settings.upload_dir) / _usd_asset.storage_key
                if _usd_candidate.exists():
                    usd_render_path = _usd_candidate
                    logger.info(
                        "render_order_line: using usd_master %s for cad %s",
                        _usd_candidate.name, cad_file.id,
                    )

        part_colors = {}
        if cad_file and cad_file.parsed_objects:
            parsed_names = cad_file.parsed_objects.get("objects", [])
@@ -242,7 +272,6 @@ def render_order_line_task(self, order_line_id: str):
        height=render_height or 1920,
        engine=render_engine or _sys.get("blender_engine", "cycles"),
        samples=render_samples or int(_sys.get(f"blender_{render_engine or _sys.get('blender_engine', 'cycles')}_samples", 128)),
        stl_quality=_sys.get("stl_quality", "low"),
        smooth_angle=int(_sys.get("blender_smooth_angle", 30)),
        cycles_device=cycles_device_val,
        transparent_bg=transparent_bg,
@@ -259,6 +288,7 @@ def render_order_line_task(self, order_line_id: str):
        rotation_x=rotation_x,
        rotation_y=rotation_y,
        rotation_z=rotation_z,
        usd_path=usd_render_path,
    )
    success = True
    render_log = {
@@ -323,6 +353,7 @@ def render_order_line_task(self, order_line_id: str):
        denoising_prefilter=denoising_prefilter,
        denoising_quality=denoising_quality,
        denoising_use_gpu=denoising_use_gpu,
        usd_path=usd_render_path,
    )
    if success:
        pl.step_done("blender_still")
@@ -376,13 +407,6 @@ def render_order_line_task(self, order_line_id: str):
        _file_size = _os.path.getsize(output_path)
    except OSError:
        pass
-    if _ext in ("png", "jpg", "jpeg"):
-        try:
-            from PIL import Image as _PILImage
-            with _PILImage.open(output_path) as _im:
-                _width, _height = _im.size
-        except Exception:
-            pass
    # Snapshot key render settings into render_config
    _render_config = None
    if isinstance(render_log, dict):
@@ -485,6 +509,7 @@ def render_order_line_task(self, order_line_id: str):
    sync_url2 = app_settings.database_url.replace("+asyncpg", "")
    eng2 = create_engine(sync_url2)
    with SyncSession(eng2) as s2:
        set_tenant_context_sync(s2, _tenant_id)
        from datetime import datetime as dt2
        s2.execute(
            sql_update2(OL2).where(OL2.id == order_line_id)
@@ -500,6 +525,7 @@ def render_order_line_task(self, order_line_id: str):
    # Try to get order_id from DB
    eng3 = create_engine(sync_url2)
    with SyncSession(eng3) as s3:
        set_tenant_context_sync(s3, _tenant_id)
        from sqlalchemy import select as sel
        row = s3.execute(sel(OL2.order_id).where(OL2.id == order_line_id)).scalar_one_or_none()
        if row:
@@ -511,6 +537,7 @@ def render_order_line_task(self, order_line_id: str):
    from app.models.order import Order as OrderModel2
    eng4 = create_engine(sync_url2)
    with SyncSession(eng4) as s4:
        set_tenant_context_sync(s4, _tenant_id)
        order_row2 = s4.execute(
            sel2(OrderModel2.created_by, OrderModel2.order_number)
            .join(OL2, OL2.order_id == OrderModel2.id)
@@ -26,6 +26,10 @@ def render_step_thumbnail(self, cad_file_id: str):
    pl.step_start("render_step_thumbnail", {"cad_file_id": cad_file_id})
    logger.info(f"Rendering thumbnail for CAD file: {cad_file_id}")

    # Resolve and log tenant context at task start (required for RLS)
    from app.core.tenant_context import resolve_tenant_id_for_cad, set_tenant_context_sync
    _tenant_id = resolve_tenant_id_for_cad(cad_file_id)

    # Compute and persist STEP file hash for STL cache lookups
    try:
        from sqlalchemy import create_engine
@@ -37,6 +41,7 @@ def render_step_thumbnail(self, cad_file_id: str):
        sync_url = app_settings.database_url.replace("+asyncpg", "")
        _eng = create_engine(sync_url)
        with Session(_eng) as _sess:
            set_tenant_context_sync(_sess, _tenant_id)
            _cad = _sess.get(CadFile, cad_file_id)
            if _cad and _cad.stored_path and not _cad.step_file_hash:
                _hash = compute_step_hash(_cad.stored_path)
@@ -71,6 +76,7 @@ def render_step_thumbnail(self, cad_file_id: str):
    _sync_url2 = _cfg2.database_url.replace("+asyncpg", "")
    _eng2 = create_engine(_sync_url2)
    with Session(_eng2) as _sess2:
        set_tenant_context_sync(_sess2, _tenant_id)
        _cad2 = _sess2.get(_CadFile2, cad_file_id)
        _step_path = _cad2.stored_path if _cad2 else None
    _eng2.dispose()
@@ -82,6 +88,7 @@ def render_step_thumbnail(self, cad_file_id: str):
    if bbox_data:
        _eng2 = create_engine(_sync_url2)
        with Session(_eng2) as _sess2:
            set_tenant_context_sync(_sess2, _tenant_id)
            _cad2 = _sess2.get(_CadFile2, cad_file_id)
            if _cad2:
                _cad2.mesh_attributes = {**(_cad2.mesh_attributes or {}), **bbox_data}
@@ -107,6 +114,7 @@ def render_step_thumbnail(self, cad_file_id: str):
    _sync_url3 = _cfg3.database_url.replace("+asyncpg", "")
    _eng3 = create_engine(_sync_url3)
    with Session(_eng3) as _sess3:
        set_tenant_context_sync(_sess3, _tenant_id)
        _cad3 = _sess3.get(_CadFile3, cad_file_id)
        _attrs = (_cad3.mesh_attributes or {}) if _cad3 else {}
        _step_path3 = _cad3.stored_path if _cad3 else None
@@ -117,6 +125,7 @@ def render_step_thumbnail(self, cad_file_id: str):
    if edge_data:
        _eng3 = create_engine(_sync_url3)
        with Session(_eng3) as _sess3:
            set_tenant_context_sync(_sess3, _tenant_id)
            _cad3 = _sess3.get(_CadFile3, cad_file_id)
            if _cad3:
                _cad3.mesh_attributes = {**(_cad3.mesh_attributes or {}), **edge_data}
@@ -145,6 +154,7 @@ def render_step_thumbnail(self, cad_file_id: str):
    _sync_url = _cfg.database_url.replace("+asyncpg", "")
    _eng = create_engine(_sync_url)
    with _Session(_eng) as _s:
        set_tenant_context_sync(_s, _tenant_id)
        _cad = _s.get(_CadFile, cad_file_id)
        _tid = str(_cad.tenant_id) if _cad and _cad.tenant_id else None
    _eng.dispose()
@@ -176,6 +186,11 @@ def regenerate_thumbnail(self, cad_file_id: str, part_colors: dict):
    pl = PipelineLogger(task_id=self.request.id)
    pl.step_start("regenerate_thumbnail", {"cad_file_id": cad_file_id})
    logger.info(f"Regenerating thumbnail for CAD file: {cad_file_id}")

    # Resolve and log tenant context at task start (required for RLS)
    from app.core.tenant_context import resolve_tenant_id_for_cad
    _tenant_id = resolve_tenant_id_for_cad(cad_file_id)

    try:
        from app.services.step_processor import regenerate_cad_thumbnail
        success = regenerate_cad_thumbnail(cad_file_id, part_colors)
@@ -32,6 +32,9 @@ class CadFile(Base):
    render_log: Mapped[dict] = mapped_column(JSONB, nullable=True)
    mesh_attributes: Mapped[dict | None] = mapped_column(JSONB, nullable=True)
    part_materials: Mapped[dict | None] = mapped_column(JSONB, nullable=True, default=None)
    source_material_assignments: Mapped[dict | None] = mapped_column(JSONB, nullable=True, default=None)
    resolved_material_assignments: Mapped[dict | None] = mapped_column(JSONB, nullable=True, default=None)
    manual_material_overrides: Mapped[dict | None] = mapped_column(JSONB, nullable=True, default=None)
    step_file_hash: Mapped[str | None] = mapped_column(String(64), nullable=True, index=True)
    tenant_id: Mapped[uuid.UUID | None] = mapped_column(
        UUID(as_uuid=True), ForeignKey("tenants.id"), nullable=True, index=True
@@ -1,5 +1,6 @@
import uuid
from datetime import datetime
+from typing import Literal
from pydantic import BaseModel
from app.domains.rendering.schemas import RenderPositionOut
@@ -71,3 +72,19 @@ class ProductOut(BaseModel):
    updated_at: datetime

    model_config = {"from_attributes": True}


class PartEntry(BaseModel):
    part_key: str
    source_name: str
    prim_path: str | None = None
    effective_material: str | None
    assignment_provenance: Literal["manual", "auto", "source", "default"]
    is_unassigned: bool


class SceneManifest(BaseModel):
    cad_file_id: str
    parts: list[PartEntry]
    unmatched_source_rows: list[str]
    unassigned_parts: list[str]
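For reference, a scene-manifest response for a two-part assembly would look like this. All sample values here are invented; only the shape matches the Pydantic models above:

```python
import json

# Hypothetical body for GET /api/cad/{id}/scene-manifest.
manifest = {
    "cad_file_id": "00000000-0000-0000-0000-000000000001",
    "parts": [
        {
            "part_key": "outer_ring",
            "source_name": "OuterRing_AF1",
            "prim_path": "/Root/outer_ring",
            "effective_material": "100Cr6",
            "assignment_provenance": "manual",
            "is_unassigned": False,
        },
        {
            "part_key": "cage",
            "source_name": "Cage_AF2",
            "prim_path": "/Root/cage",
            "effective_material": None,
            "assignment_provenance": "default",
            "is_unassigned": True,
        },
    ],
    "unmatched_source_rows": [],
    "unassigned_parts": ["cage"],
}

# Invariant: every part with is_unassigned=True appears in unassigned_parts,
# which is what the reconciliation panel renders.
unassigned = [p["part_key"] for p in manifest["parts"] if p["is_unassigned"]]
print(json.dumps(unassigned))  # ["cage"]
```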
@@ -0,0 +1,182 @@
"""Part key generation and scene manifest building for the USD pipeline.

The `resolved_material_assignments` JSONB schema, written by `generate_usd_master_task`:
    {part_key: {"source_name": str, "prim_path": str}}

The `manual_material_overrides` JSONB schema, written by `PUT /cad/{id}/manual-material-overrides` (Priority 4):
    {part_key: material_name_str}

The `source_material_assignments` JSONB schema, written by the Excel importer (future):
    {source_part_name: material_name_str}

No pxr imports — all data is read from JSONB columns, never from USD files directly.
"""
from __future__ import annotations

import hashlib
import re


# ── Part key generation ───────────────────────────────────────────────────────

_AF_RE = re.compile(r'_AF\d+$', re.IGNORECASE)
def generate_part_key(
    xcaf_label_path: str,
    source_name: str,
    existing_keys: set[str] | None = None,
) -> str:
    """Deterministic slug from source_name, max 64 chars, unique within assembly.

    - Strips the `_AF\\d+` OCC suffix from source_name before slugifying.
    - Falls back to a sha256 digest of xcaf_label_path if the slug is empty.
    - Deduplicates by appending _2, _3, ... when existing_keys is provided.
    """
    base = _AF_RE.sub('', source_name) if source_name else ''
    # Split camelCase before slugifying: "RingOuter" → "Ring_Outer"
    base = re.sub(r'([a-z])([A-Z])', r'\1_\2', base)
    slug = re.sub(r'[^a-z0-9]+', '_', base.lower()).strip('_')
    if not slug:
        slug = f"part_{hashlib.sha256(xcaf_label_path.encode()).hexdigest()[:8]}"
    slug = slug[:50]

    if existing_keys is None:
        return slug

    key = slug
    n = 2
    while key in existing_keys:
        key = f"{slug}_{n}"
        n += 1
    existing_keys.add(key)
    return key
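The slug rules can be exercised directly. The function body below is copied verbatim from the diff so the example is self-contained; the label paths and part names are invented:

```python
import hashlib
import re

_AF_RE = re.compile(r'_AF\d+$', re.IGNORECASE)

def generate_part_key(xcaf_label_path, source_name, existing_keys=None):
    # Copied from part_key_service.py above.
    base = _AF_RE.sub('', source_name) if source_name else ''
    base = re.sub(r'([a-z])([A-Z])', r'\1_\2', base)
    slug = re.sub(r'[^a-z0-9]+', '_', base.lower()).strip('_')
    if not slug:
        slug = f"part_{hashlib.sha256(xcaf_label_path.encode()).hexdigest()[:8]}"
    slug = slug[:50]
    if existing_keys is None:
        return slug
    key = slug
    n = 2
    while key in existing_keys:
        key = f"{slug}_{n}"
        n += 1
    existing_keys.add(key)
    return key

seen: set = set()
print(generate_part_key("0:1:1:1", "OuterRing_AF12", seen))  # outer_ring
print(generate_part_key("0:1:1:2", "OuterRing_AF13", seen))  # outer_ring_2
print(generate_part_key("0:1:1:3", "###", seen))             # part_<8-hex-chars> sha256 fallback
```

Because the slug depends only on the source name (plus a label-path hash fallback), re-exporting the same STEP file reproduces the same keys, which is what makes partKey a stable identity across GLB and USD exports.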
# ── Scene manifest building ───────────────────────────────────────────────────

def build_scene_manifest(cad_file, usd_asset=None) -> dict:
    """Build a scene-manifest dict from a CadFile ORM object.

    Source of the part list (priority order):
    1. `resolved_material_assignments` — keyed by partKey (set by generate_usd_master_task)
    2. `parsed_objects["objects"]` — list of source-name strings from STEP extraction
    3. Empty manifest if neither is available.

    Material assignment priority per part:
    1. `manual_material_overrides[part_key]` — provenance "manual"
    2. `resolved_material_assignments[part_key]["material"]` — provenance "auto"
    3. Substring match in `source_material_assignments` against source_name — provenance "source"
    4. None, is_unassigned=True — provenance "default"
    """
    cad_id = str(cad_file.id)
    resolved = cad_file.resolved_material_assignments or {}
    manual = cad_file.manual_material_overrides or {}
    source = cad_file.source_material_assignments or {}

    parts: list[dict] = []
    unmatched_source_rows: list[str] = []
    unassigned_parts: list[str] = []

    if resolved:
        # Build from resolved assignments (the USD pipeline has run)
        for part_key, meta in resolved.items():
            source_name = meta.get("source_name", "") if isinstance(meta, dict) else ""
            prim_path = meta.get("prim_path") if isinstance(meta, dict) else None

            effective_material, provenance = _resolve_material(
                part_key, source_name, manual, resolved, source
            )
            is_unassigned = effective_material is None

            parts.append({
                "part_key": part_key,
                "source_name": source_name,
                "prim_path": prim_path,
                "effective_material": effective_material,
                "assignment_provenance": provenance,
                "is_unassigned": is_unassigned,
            })
            if is_unassigned:
                unassigned_parts.append(part_key)

    elif cad_file.parsed_objects:
        # Fall back to parsed_objects from STEP extraction
        object_names: list[str] = cad_file.parsed_objects.get("objects") or []
        seen_keys: set[str] = set()
        for source_name in object_names:
            part_key = generate_part_key(source_name, source_name, seen_keys)
            effective_material, provenance = _resolve_material(
                part_key, source_name, manual, resolved, source
            )
            is_unassigned = effective_material is None

            parts.append({
                "part_key": part_key,
                "source_name": source_name,
                "prim_path": None,
                "effective_material": effective_material,
                "assignment_provenance": provenance,
                "is_unassigned": is_unassigned,
            })
            if is_unassigned:
                unassigned_parts.append(part_key)

    # Find source rows not matched to any part
    matched_source_names = {p["source_name"].lower() for p in parts}
    for src_key in source:
        if not any(
            src_key.lower() in sn or sn in src_key.lower()
            for sn in matched_source_names
        ):
            unmatched_source_rows.append(src_key)

    return {
        "cad_file_id": cad_id,
        "parts": parts,
        "unmatched_source_rows": unmatched_source_rows,
        "unassigned_parts": unassigned_parts,
    }
def _resolve_material(
    part_key: str,
    source_name: str,
    manual: dict,
    resolved: dict,
    source: dict,
) -> tuple[str | None, str]:
    """Return (material_name, provenance) for one part using the priority order."""
    # 1. Manual override
    if part_key in manual and manual[part_key]:
        return str(manual[part_key]), "manual"

    # 2. Auto-resolved from the USD pipeline
    meta = resolved.get(part_key)
    if isinstance(meta, dict) and meta.get("material"):
        return str(meta["material"]), "auto"

    # 3. Substring match in source_material_assignments against source_name
    sn_lower = source_name.lower()
    for src_key, src_mat in source.items():
        if src_key.lower() in sn_lower or sn_lower in src_key.lower():
            if src_mat:
                return str(src_mat), "source"

    # 4. Unassigned
    return None, "default"
# ── Effective assignments for render pipeline ─────────────────────────────────

def get_effective_assignments(cad_file) -> dict[str, str]:
    """Return {part_key: material_name} merged from all three layers.

    Used by the render pipeline when building the material map (Priority 5).
    """
    manifest = build_scene_manifest(cad_file)
    return {
        p["part_key"]: p["effective_material"]
        for p in manifest["parts"]
        if p["effective_material"] is not None
    }
@@ -20,4 +20,5 @@ from app.domains.pipeline.tasks.render_order_line import (  # noqa: F401
from app.domains.pipeline.tasks.export_glb import (  # noqa: F401
    generate_gltf_geometry_task,
    generate_gltf_production_task,
+    generate_usd_master_task,
)
@@ -151,3 +151,32 @@ export async function savePartMaterials(
  const res = await api.put<PartMaterialsResponse>(`/cad/${cadFileId}/part-materials`, map)
  return res.data.part_materials ?? {}
}

// ---------------------------------------------------------------------------
// Manual material overrides (partKey-keyed, Priority 4)
// ---------------------------------------------------------------------------

export interface ManualMaterialOverridesResponse {
  cad_file_id: string
  manual_material_overrides: Record<string, string> | null
}

/** Return the manual material overrides for a CAD file ({partKey: materialName}; empty if none). */
export async function getManualOverrides(cadFileId: string): Promise<Record<string, string>> {
  const res = await api.get<ManualMaterialOverridesResponse>(
    `/cad/${cadFileId}/manual-material-overrides`,
  )
  return res.data.manual_material_overrides ?? {}
}

/** Save manual material overrides keyed by partKey. Returns the saved map. */
export async function saveManualOverrides(
  cadFileId: string,
  overrides: Record<string, string>,
): Promise<Record<string, string>> {
  const res = await api.put<ManualMaterialOverridesResponse>(
    `/cad/${cadFileId}/manual-material-overrides`,
    { overrides },
  )
  return res.data.manual_material_overrides ?? {}
}
@@ -8,6 +8,7 @@ export type MediaAssetType =
  | 'stl_high'
  | 'gltf_geometry'
  | 'gltf_production'
  | 'usd_master'
  | 'blend_production'

// ── Media Browser (server-side filtered + paginated) ──────────────────────────
@@ -0,0 +1,27 @@
import api from './client'

export interface PartEntry {
  part_key: string
  source_name: string
  prim_path: string | null
  effective_material: string | null
  assignment_provenance: 'manual' | 'auto' | 'source' | 'default'
  is_unassigned: boolean
}

export interface SceneManifest {
  cad_file_id: string
  parts: PartEntry[]
  unmatched_source_rows: string[]
  unassigned_parts: string[]
}

export async function fetchSceneManifest(cadFileId: string): Promise<SceneManifest> {
  const res = await api.get<SceneManifest>(`/cad/${cadFileId}/scene-manifest`)
  return res.data
}

export async function triggerUsdMasterGeneration(
  cadFileId: string,
): Promise<{ status: string; task_id: string; cad_file_id: string }> {
  const res = await api.post(`/cad/${cadFileId}/generate-usd-master`)
  return res.data
}
@@ -206,6 +206,9 @@ export interface GPUProbeResult {
  device_type?: string | null
  error?: string | null
  probed_at?: string | null
  timestamp?: string | null
  devices?: string[] | null
  render_time_s?: number | null
}

export async function getGpuProbeResult(): Promise<GPUProbeResult> {
@@ -3,7 +3,7 @@ import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query'
import { X, Loader2, Palette, Layers, EyeOff } from 'lucide-react'
import { toast } from 'sonner'
import api from '../../api/client'
-import { savePartMaterials, type PartMaterialMap, type PartMaterialEntry } from '../../api/cad'
+import { savePartMaterials, saveManualOverrides, type PartMaterialMap, type PartMaterialEntry } from '../../api/cad'
// ---------------------------------------------------------------------------
// SCHAEFFLER_COLORS — viewport preview colors for known library materials
@@ -60,6 +60,14 @@ export interface MaterialPanelProps {
  onClose: () => void
  isolateMode?: IsolateMode
  onIsolateModeChange?: (mode: IsolateMode) => void
  /** Source part name from XCAF (shown alongside the partKey slug) */
  sourcePartName?: string
  /** How the current assignment was derived */
  assignmentProvenance?: 'manual' | 'auto' | 'source' | 'default'
  /** True when the GLB has a partKeyMap — saves via the /manual-material-overrides endpoint */
  isPartKeyMode?: boolean
  /** Current manual overrides map (needed to merge when saving in partKey mode) */
  manualOverrides?: Record<string, string>
}

export default function MaterialPanel({
|
||||
@@ -70,6 +78,10 @@ export default function MaterialPanel({
|
||||
onClose,
|
||||
isolateMode = 'none',
|
||||
onIsolateModeChange,
|
||||
sourcePartName,
|
||||
assignmentProvenance,
|
||||
isPartKeyMode = false,
|
||||
manualOverrides = {},
|
||||
}: MaterialPanelProps) {
|
||||
const queryClient = useQueryClient()
|
||||
|
||||
@@ -100,6 +112,7 @@ export default function MaterialPanel({
     if (!libValue && allMaterials.length > 0) setLibValue(allMaterials[0].name)
   }, [allMaterials]) // eslint-disable-line react-hooks/exhaustive-deps

+  // Legacy save (part_materials, keyed by normalized mesh name)
   const saveMut = useMutation({
     mutationFn: (updated: PartMaterialMap) => savePartMaterials(cadFileId, updated),
     onSuccess: () => {
@@ -120,21 +133,53 @@ export default function MaterialPanel({
     onError: () => toast.error('Failed to remove assignment'),
   })

+  // partKey mode save (manual_material_overrides, keyed by partKey slug)
+  const manualSaveMut = useMutation({
+    mutationFn: (updated: Record<string, string>) => saveManualOverrides(cadFileId, updated),
+    onSuccess: () => {
+      queryClient.invalidateQueries({ queryKey: ['manual-overrides', cadFileId] })
+      toast.success(`Material assigned to "${partName}"`)
+      onClose()
+    },
+    onError: () => toast.error('Failed to save material assignment'),
+  })
+
+  const manualRemoveMut = useMutation({
+    mutationFn: (updated: Record<string, string>) => saveManualOverrides(cadFileId, updated),
+    onSuccess: () => {
+      queryClient.invalidateQueries({ queryKey: ['manual-overrides', cadFileId] })
+      toast.success(`Assignment removed from "${partName}"`)
+      onClose()
+    },
+    onError: () => toast.error('Failed to remove assignment'),
+  })
+
   function handleAssign() {
-    const entry: PartMaterialEntry =
-      assignType === 'hex'
-        ? { type: 'hex', value: hexValue }
-        : { type: 'library', value: libValue }
-    saveMut.mutate({ ...partMaterials, [partName]: entry })
+    const materialValue = assignType === 'hex' ? hexValue : libValue
+    if (isPartKeyMode) {
+      manualSaveMut.mutate({ ...manualOverrides, [partName]: materialValue })
+    } else {
+      const entry: PartMaterialEntry =
+        assignType === 'hex'
+          ? { type: 'hex', value: hexValue }
+          : { type: 'library', value: libValue }
+      saveMut.mutate({ ...partMaterials, [partName]: entry })
+    }
   }

   function handleRemove() {
-    const updated = { ...partMaterials }
-    delete updated[partName]
-    removeMut.mutate(updated)
+    if (isPartKeyMode) {
+      const updated = { ...manualOverrides }
+      delete updated[partName]
+      manualRemoveMut.mutate(updated)
+    } else {
+      const updated = { ...partMaterials }
+      delete updated[partName]
+      removeMut.mutate(updated)
+    }
   }

-  const isBusy = saveMut.isPending || removeMut.isPending
+  const isBusy = saveMut.isPending || removeMut.isPending || manualSaveMut.isPending || manualRemoveMut.isPending
   const previewHex = assignType === 'hex'
     ? hexValue
     : (SCHAEFFLER_COLORS[libValue] ?? '#888888')
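The dual-path save above boils down to two payload shapes: partKey mode persists a flat `slug → string` map, while legacy mode persists typed entries. A standalone sketch of that branching (`buildPayload` is a hypothetical helper, not part of the component):

```typescript
type PartMaterialEntry = { type: 'hex' | 'library'; value: string }

// partKey mode: the backend stores the raw value (hex string or library
// material name). Legacy mode: the backend stores a typed entry object.
function buildPayload(
  isPartKeyMode: boolean,
  key: string,
  assignType: 'hex' | 'library',
  value: string,
): Record<string, string> | Record<string, PartMaterialEntry> {
  return isPartKeyMode
    ? { [key]: value }
    : { [key]: { type: assignType, value } }
}
```

Keeping the two stores separate means an old GLB (no partKeyMap) never writes into `manual_material_overrides`, so the server-side provenance resolution stays unambiguous.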
@@ -146,13 +191,29 @@ export default function MaterialPanel({
     >
       {/* Header */}
       <div className="flex items-center justify-between px-3 py-2 border-b border-gray-700">
-        <div className="flex items-center gap-2 min-w-0">
+        <div className="flex items-center gap-2 min-w-0 flex-1">
           <Palette size={13} className="text-accent shrink-0" />
-          <span className="text-white text-xs font-semibold truncate" title={partName}>
-            {partName}
-          </span>
+          <div className="min-w-0">
+            <span className="text-white text-xs font-semibold truncate block" title={partName}>
+              {partName}
+            </span>
+            {sourcePartName && sourcePartName !== partName && (
+              <span className="text-gray-500 text-[10px] truncate block" title={sourcePartName}>
+                {sourcePartName}
+              </span>
+            )}
+          </div>
+          {assignmentProvenance && assignmentProvenance !== 'default' && (
+            <span className={`shrink-0 text-[9px] font-medium px-1 py-0.5 rounded ${
+              assignmentProvenance === 'manual' ? 'bg-accent/20 text-accent' :
+              assignmentProvenance === 'auto' ? 'bg-green-900/40 text-green-400' :
+              'bg-yellow-900/40 text-yellow-400'
+            }`}>
+              {assignmentProvenance}
+            </span>
+          )}
         </div>
-        <button onClick={onClose} className="text-gray-400 hover:text-white p-0.5 shrink-0">
+        <button onClick={onClose} className="text-gray-400 hover:text-white p-0.5 shrink-0 ml-1">
           <X size={14} />
         </button>
       </div>
@@ -270,7 +331,7 @@ export default function MaterialPanel({
           disabled={isBusy || (assignType === 'library' && !libValue)}
           className="flex-1 px-3 py-1.5 rounded bg-accent hover:bg-accent-hover disabled:opacity-40 disabled:cursor-not-allowed text-white text-xs font-medium transition-colors flex items-center justify-center gap-1"
         >
-          {saveMut.isPending && <Loader2 size={11} className="animate-spin" />}
+          {(saveMut.isPending || manualSaveMut.isPending) && <Loader2 size={11} className="animate-spin" />}
           Assign
         </button>
         {currentEntry && (
@@ -279,7 +340,7 @@ export default function MaterialPanel({
           disabled={isBusy}
           className="px-3 py-1.5 rounded bg-gray-700 hover:bg-red-900 disabled:opacity-40 disabled:cursor-not-allowed text-gray-300 hover:text-white text-xs font-medium transition-colors flex items-center gap-1"
         >
-          {removeMut.isPending && <Loader2 size={11} className="animate-spin" />}
+          {(removeMut.isPending || manualRemoveMut.isPending) && <Loader2 size={11} className="animate-spin" />}
           Remove
         </button>
       )}

@@ -28,7 +28,8 @@ import {
   Maximize2, Grid3X3, Sun, AlertCircle, EyeOff,
 } from 'lucide-react'
 import api from '../../api/client'
-import { getParsedObjects, getPartMaterials, type PartMaterialMap } from '../../api/cad'
+import { getParsedObjects, getPartMaterials, getManualOverrides, type PartMaterialMap } from '../../api/cad'
+import { fetchSceneManifest } from '../../api/sceneManifest'
 import { useAuthStore } from '../../store/auth'
 import MaterialPanel, { SCHAEFFLER_COLORS, previewColorForEntry, type IsolateMode } from './MaterialPanel'
 import { normalizeMeshName, resolvePartMaterial } from './cadUtils'
@@ -392,6 +393,9 @@ export default function ThreeDViewer({
   // Task 5 — hovered mesh ref for emissive highlight
   const hoveredMeshRef = useRef<THREE.Mesh | null>(null)

+  // partKey map from GLB extras: normalizedMeshName → partKey slug
+  const [partKeyMap, setPartKeyMap] = useState<Record<string, string>>({})
+
   // Task 7 — clicked (pinned) part for material panel
   const [pinnedPart, setPinnedPart] = useState<string | null>(null)

@@ -401,6 +405,9 @@ export default function ThreeDViewer({
   // Hide assigned toggle — hides all parts that already have a material
   const [hideAssigned, setHideAssigned] = useState(false)

+  // Reconciliation panel (unmatched source rows + unassigned parts)
+  const [showReconcile, setShowReconcile] = useState(false)
+
   // Isolation mode — ghost/hide other parts while a part is pinned
   const [isolateMode, setIsolateMode] = useState<IsolateMode>('none')

@@ -418,6 +425,22 @@ export default function ThreeDViewer({
   })
   const dims = parsedData?.parsed_objects?.dimensions_mm

+  // Scene manifest (non-blocking — 404 expected when USD master not yet generated)
+  const { data: sceneManifest } = useQuery({
+    queryKey: ['scene-manifest', cadFileId],
+    queryFn: () => fetchSceneManifest(cadFileId),
+    staleTime: Infinity,
+    retry: false,
+  })
+
+  // Manual material overrides keyed by partKey slug (from PUT /manual-material-overrides)
+  const { data: manualOverrides = {} } = useQuery({
+    queryKey: ['manual-overrides', cadFileId],
+    queryFn: () => getManualOverrides(cadFileId),
+    staleTime: 30_000,
+    retry: false,
+  })
+
   // Total unique normalized mesh count (set once when model is ready)
   const [totalMeshCount, setTotalMeshCount] = useState(0)
   const [glbMeshNames, setGlbMeshNames] = useState<Set<string>>(new Set())
@@ -436,10 +459,29 @@ export default function ThreeDViewer({
     [initialPartMaterials, savedPartMaterials],
   )

+  // Effective materials: merge partMaterials (old normalized-name keys) +
+  // manualOverrides (new partKey slug keys). Both key formats coexist so
+  // existing GLBs (no partKeyMap) and new GLBs (with partKeyMap) work correctly.
+  const effectiveMaterials = useMemo(() => {
+    const fromManual: PartMaterialMap = Object.fromEntries(
+      Object.entries(manualOverrides).map(([k, v]) => [
+        k,
+        { type: (v.startsWith('#') ? 'hex' : 'library') as 'hex' | 'library', value: v },
+      ])
+    )
+    return { ...partMaterials, ...fromManual }
+  }, [partMaterials, manualOverrides])
+
+  // Resolve partKey from normalized mesh name (identity fallback when no map loaded)
+  const resolvePartKey = useCallback(
+    (normalizedName: string): string => partKeyMap[normalizedName] ?? normalizedName,
+    [partKeyMap],
+  )
+
   // Count how many unique GLB mesh types have a resolved material assignment
   const assignedCount = useMemo(
-    () => [...glbMeshNames].filter(n => !!resolvePartMaterial(n, partMaterials)).length,
-    [glbMeshNames, partMaterials],
+    () => [...glbMeshNames].filter(n => !!resolvePartMaterial(n, effectiveMaterials)).length,
+    [glbMeshNames, effectiveMaterials],
   )

   // Raw URL selected by mode (used as stable key before blob fetch)
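The merge-plus-resolution behavior added in this hunk can be isolated into two pure functions for review. A self-contained sketch (function names here are standalone stand-ins for the `useMemo`/`useCallback` bodies above, and the hex-vs-library detection via `startsWith('#')` matches the component code):

```typescript
type Entry = { type: 'hex' | 'library'; value: string }

// Manual overrides are flat strings keyed by partKey slug; convert them to
// typed entries and spread them last so they win over legacy assignments.
function mergeMaterials(
  partMaterials: Record<string, Entry>,
  manualOverrides: Record<string, string>,
): Record<string, Entry> {
  const fromManual = Object.fromEntries(
    Object.entries(manualOverrides).map(([k, v]) => [
      k,
      { type: v.startsWith('#') ? 'hex' : 'library', value: v } as Entry,
    ]),
  )
  return { ...partMaterials, ...fromManual }
}

// Identity fallback: old GLBs without a partKeyMap keep working because a
// normalized mesh name resolves to itself.
function resolvePartKey(map: Record<string, string>, name: string): string {
  return map[name] ?? name
}
```

The identity fallback is the key design choice: every lookup site can call `resolvePartKey` unconditionally, and behavior is unchanged for GLBs exported before partKeyMap injection.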
@@ -485,12 +527,24 @@ export default function ThreeDViewer({
     if (modelReady) setFitTrigger(t => t + 1)
   }, [modelReady])

-  // Compute unique normalized mesh names once (used in toolbar badge + assignedCount)
+  // Compute unique mesh keys once (used in toolbar badge + assignedCount).
+  // Also extract partKeyMap from GLB extras when available.
   useEffect(() => {
     if (!modelReady || !sceneRef.current) return

+    // Extract partKeyMap injected by export_step_to_gltf.py into GLB extras
+    const glbExtras = (sceneRef.current as any).userData ?? {}
+    const map = glbExtras.partKeyMap as Record<string, string> | undefined
+    if (map && Object.keys(map).length > 0) {
+      setPartKeyMap(map)
+    }
+
     const names = new Set<string>()
     sceneRef.current.traverse(o => {
-      if ((o as THREE.Mesh).isMesh && o.name) names.add(normalizeMeshName((o.userData?.name as string) || o.name))
+      if ((o as THREE.Mesh).isMesh && o.name) {
+        const normalized = normalizeMeshName((o.userData?.name as string) || o.name)
+        names.add(map?.[normalized] ?? normalized)
+      }
     })
     setTotalMeshCount(names.size)
     setGlbMeshNames(new Set(names))
@@ -501,13 +555,14 @@ export default function ThreeDViewer({
     if (modelReady) setFitTrigger(t => t + 1)
   }, [isOrtho]) // eslint-disable-line react-hooks/exhaustive-deps

-  // Task 6 — apply saved material colors after model loads or when partMaterials changes
+  // Task 6 — apply saved material colors after model loads or when effectiveMaterials changes
   useEffect(() => {
     if (!modelReady || !sceneRef.current) return
     sceneRef.current.traverse((obj) => {
       const mesh = obj as THREE.Mesh
       if (!mesh.isMesh) return
-      const entry = resolvePartMaterial(normalizeMeshName((mesh.userData?.name as string) || mesh.name), partMaterials)
+      const normalized = normalizeMeshName((mesh.userData?.name as string) || mesh.name)
+      const entry = resolvePartMaterial(resolvePartKey(normalized), effectiveMaterials)
       if (!entry) return
       const mats = Array.isArray(mesh.material) ? mesh.material : [mesh.material]
       mats.forEach((m) => {
@@ -515,12 +570,12 @@ export default function ThreeDViewer({
         if (mat && 'color' in mat) mat.color.set(previewColorForEntry(entry))
       })
     })
-  }, [modelReady, partMaterials])
+  }, [modelReady, effectiveMaterials, resolvePartKey])

   // Apply/remove unassigned highlight — only glows when ≥1 assignment exists (for meaningful contrast)
   useEffect(() => {
     if (!modelReady || !sceneRef.current) return
-    const hasAnyAssignment = Object.keys(partMaterials).length > 0
+    const hasAnyAssignment = Object.keys(effectiveMaterials).length > 0
     sceneRef.current.traverse((obj) => {
       const mesh = obj as THREE.Mesh
       if (!mesh.isMesh) return
@@ -529,7 +584,8 @@ export default function ThreeDViewer({
         const m = mat as THREE.MeshStandardMaterial
         if (!m || !('emissive' in m)) return
         if (showUnassigned && hasAnyAssignment) {
-          const hasAssignment = !!resolvePartMaterial(normalizeMeshName((mesh.userData?.name as string) || mesh.name), partMaterials)
+          const normalized = normalizeMeshName((mesh.userData?.name as string) || mesh.name)
+          const hasAssignment = !!resolvePartMaterial(resolvePartKey(normalized), effectiveMaterials)
           m.emissive.set(hasAssignment ? 0x000000 : 0xff4400)
           m.emissiveIntensity = hasAssignment ? 0 : 0.8
         } else {
@@ -538,7 +594,7 @@ export default function ThreeDViewer({
         }
       })
     })
-  }, [modelReady, showUnassigned, partMaterials])
+  }, [modelReady, showUnassigned, effectiveMaterials, resolvePartKey])

   // Reset isolateMode when no part is pinned
   useEffect(() => {
@@ -547,8 +603,8 @@ export default function ThreeDViewer({

   // Reset hideAssigned when all assignments are cleared
   useEffect(() => {
-    if (Object.keys(partMaterials).length === 0) setHideAssigned(false)
-  }, [partMaterials])
+    if (Object.keys(effectiveMaterials).length === 0) setHideAssigned(false)
+  }, [effectiveMaterials])

   // Combined visibility effect — handles hideAssigned + isolateMode together
   useEffect(() => {
@@ -557,8 +613,9 @@ export default function ThreeDViewer({
       const mesh = obj as THREE.Mesh
       if (!mesh.isMesh) return
       const normalizedName = normalizeMeshName((mesh.userData?.name as string) || mesh.name)
-      const isSelected = normalizedName === pinnedPart
-      const isAssigned = !!resolvePartMaterial(normalizedName, partMaterials)
+      const partKey = resolvePartKey(normalizedName)
+      const isSelected = partKey === pinnedPart
+      const isAssigned = !!resolvePartMaterial(partKey, effectiveMaterials)
       const mats = Array.isArray(mesh.material) ? mesh.material : [mesh.material]

       // Default: fully visible + raycasting enabled
@@ -589,7 +646,7 @@ export default function ThreeDViewer({
         }
       }
     })
-  }, [modelReady, pinnedPart, isolateMode, hideAssigned, partMaterials])
+  }, [modelReady, pinnedPart, isolateMode, hideAssigned, effectiveMaterials, resolvePartKey])

   // Keyboard shortcuts
   useEffect(() => {
@@ -653,11 +710,12 @@ export default function ThreeDViewer({
     if (hoveredMeshRef.current) {
       const mesh = hoveredMeshRef.current
       const mats = Array.isArray(mesh.material) ? mesh.material : [mesh.material]
-      const hasAnyAssignment = Object.keys(partMaterials).length > 0
+      const hasAnyAssignment = Object.keys(effectiveMaterials).length > 0
       mats.forEach((m) => {
         const mat = m as THREE.MeshStandardMaterial
         if (!mat || !('emissive' in mat)) return
-        if (showUnassigned && hasAnyAssignment && !resolvePartMaterial(normalizeMeshName((mesh.userData?.name as string) || mesh.name), partMaterials)) {
+        const normalized = normalizeMeshName((mesh.userData?.name as string) || mesh.name)
+        if (showUnassigned && hasAnyAssignment && !resolvePartMaterial(resolvePartKey(normalized), effectiveMaterials)) {
           mat.emissive.set(0xff4400); mat.emissiveIntensity = 0.8
         } else {
           mat.emissive.set(0x000000); mat.emissiveIntensity = 0
@@ -665,19 +723,19 @@ export default function ThreeDViewer({
       })
       hoveredMeshRef.current = null
     }
-  }, [showUnassigned, partMaterials])
+  }, [showUnassigned, effectiveMaterials, resolvePartKey])

   const handlePointerMove = useCallback((e: React.PointerEvent) => {
     setHoverInfo(prev => prev ? { ...prev, x: e.clientX, y: e.clientY } : null)
   }, [])

-  // Task 7 — click to pin material panel
+  // Task 7 — click to pin material panel (resolves to partKey slug when available)
   const handleClick = useCallback((e: any) => {
     e.stopPropagation()
     const mesh = e.object as THREE.Mesh
-    const name = normalizeMeshName((mesh?.userData?.name as string) || mesh?.name || (mesh?.parent?.userData?.name as string) || mesh?.parent?.name || '')
-    if (name) setPinnedPart(name)
-  }, [])
+    const normalized = normalizeMeshName((mesh?.userData?.name as string) || mesh?.name || (mesh?.parent?.userData?.name as string) || mesh?.parent?.name || '')
+    if (normalized) setPinnedPart(resolvePartKey(normalized))
+  }, [resolvePartKey])

   return (
     <div className="fixed inset-0 z-50 flex flex-col bg-gray-950" onClick={() => setPinnedPart(null)}>
@@ -762,7 +820,7 @@ export default function ThreeDViewer({
       )}

       {/* Hide assigned toggle */}
-      {modelReady && Object.keys(partMaterials).length > 0 && (
+      {modelReady && Object.keys(effectiveMaterials).length > 0 && (
         <TBtn
           active={hideAssigned}
           onClick={() => setHideAssigned(v => !v)}
@@ -773,6 +831,20 @@ export default function ThreeDViewer({
         </TBtn>
       )}

+      {/* Reconciliation button — shown when manifest has unmatched/unassigned items */}
+      {sceneManifest && (sceneManifest.unmatched_source_rows.length > 0 || sceneManifest.unassigned_parts.length > 0) && (
+        <TBtn
+          active={showReconcile}
+          onClick={() => setShowReconcile(v => !v)}
+          title={`${sceneManifest.unmatched_source_rows.length} unmatched source rows · ${sceneManifest.unassigned_parts.length} unassigned parts`}
+        >
+          <AlertTriangle size={11} />
+          <span className="text-[10px] tabular-nums">
+            {sceneManifest.unmatched_source_rows.length + sceneManifest.unassigned_parts.length}
+          </span>
+        </TBtn>
+      )}
+
       {/* Environment */}
       <EnvDropdown value={envPreset} onChange={setEnvPreset} />

@@ -899,14 +971,70 @@ export default function ThreeDViewer({
           <MaterialPanel
             partName={pinnedPart}
             cadFileId={cadFileId}
-            currentEntry={resolvePartMaterial(pinnedPart, partMaterials)}
-            partMaterials={partMaterials}
+            currentEntry={resolvePartMaterial(pinnedPart, effectiveMaterials)}
+            partMaterials={effectiveMaterials}
             onClose={() => setPinnedPart(null)}
             isolateMode={isolateMode}
             onIsolateModeChange={setIsolateMode}
+            sourcePartName={sceneManifest?.parts.find(p => p.part_key === pinnedPart)?.source_name}
+            assignmentProvenance={sceneManifest?.parts.find(p => p.part_key === pinnedPart)?.assignment_provenance}
+            isPartKeyMode={Object.keys(partKeyMap).length > 0}
+            manualOverrides={manualOverrides}
           />
         )}

+        {/* Reconciliation panel */}
+        {showReconcile && sceneManifest && (
+          <div
+            className="absolute top-2 right-2 z-30 w-64 bg-gray-900 border border-gray-700 rounded-lg shadow-2xl max-h-[70vh] overflow-y-auto"
+            onClick={(e) => e.stopPropagation()}
+          >
+            <div className="flex items-center justify-between px-3 py-2 border-b border-gray-700">
+              <span className="text-white text-xs font-semibold flex items-center gap-1.5">
+                <AlertTriangle size={12} className="text-amber-400" /> Reconciliation
+              </span>
+              <button onClick={() => setShowReconcile(false)} className="text-gray-400 hover:text-white p-0.5">
+                <X size={14} />
+              </button>
+            </div>
+            <div className="p-3 space-y-3">
+              {sceneManifest.unassigned_parts.length > 0 && (
+                <div>
+                  <p className="text-gray-400 text-[10px] font-medium mb-1.5 uppercase tracking-wider">
+                    Unassigned parts ({sceneManifest.unassigned_parts.length})
+                  </p>
+                  {sceneManifest.unassigned_parts.map(pk => (
+                    <button
+                      key={pk}
+                      onClick={() => { setPinnedPart(pk); setShowReconcile(false) }}
+                      className="block w-full text-left px-2 py-1 text-xs text-gray-300 hover:bg-gray-800 hover:text-white rounded transition-colors truncate"
+                      title={pk}
+                    >
+                      {pk}
+                    </button>
+                  ))}
+                </div>
+              )}
+              {sceneManifest.unmatched_source_rows.length > 0 && (
+                <div>
+                  <p className="text-gray-400 text-[10px] font-medium mb-1.5 uppercase tracking-wider">
+                    Unmatched source rows ({sceneManifest.unmatched_source_rows.length})
+                  </p>
+                  {sceneManifest.unmatched_source_rows.map((row, i) => (
+                    <div
+                      key={i}
+                      className="px-2 py-1 text-xs text-gray-500 truncate"
+                      title={row}
+                    >
+                      {row}
+                    </div>
+                  ))}
+                </div>
+              )}
+            </div>
+          </div>
+        )}

         {/* Keyboard hint — bottom-right */}
         <div className="absolute bottom-2 right-16 z-10 pointer-events-none select-none text-gray-600 text-[10px]">
           F fit · W wire · G grid · S shadow · click part to assign · Esc close

@@ -108,10 +108,10 @@ export default function AdminPage() {
   gltf_material_quality: string
   gltf_pbr_roughness: number
   gltf_pbr_metallic: number
-  gltf_preview_linear_deflection: number
-  gltf_preview_angular_deflection: number
-  gltf_production_linear_deflection: number
-  gltf_production_angular_deflection: number
+  scene_linear_deflection: number
+  scene_angular_deflection: number
+  render_linear_deflection: number
+  render_angular_deflection: number
   tessellation_engine: string
 }

@@ -224,6 +224,18 @@ export default function AdminPage() {
     onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed'),
   })

+  const generateMissingUsdMastersMut = useMutation({
+    mutationFn: () => api.post('/admin/settings/generate-missing-usd-masters'),
+    onSuccess: (res) => toast.success(res.data.message || 'USD master export queued'),
+    onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed'),
+  })
+
+  const generateMissingCanonicalScenesMut = useMutation({
+    mutationFn: () => api.post('/admin/settings/generate-missing-canonical-scenes'),
+    onSuccess: (res) => toast.success(res.data.message || 'Canonical scene export queued'),
+    onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed'),
+  })
+
   const [smtpDraft, setSmtpDraft] = useState<Partial<Settings>>({})
   const smtp = { ...settings, ...smtpDraft } as Settings

@@ -921,6 +933,30 @@ export default function AdminPage() {
           </div>
           <p className="text-xs text-content-muted">Re-renders thumbnails for all completed CAD files.</p>
         </div>
+        <div className="flex flex-col gap-1">
+          <button
+            onClick={() => generateMissingUsdMastersMut.mutate()}
+            disabled={generateMissingUsdMastersMut.isPending}
+            className="btn-secondary text-sm w-full justify-start"
+            title="Queue USD master export for all completed CAD files without a USD master asset"
+          >
+            <RefreshCw size={14} className={generateMissingUsdMastersMut.isPending ? 'animate-spin' : ''} />
+            {generateMissingUsdMastersMut.isPending ? 'Queueing…' : 'Generate Missing USD Masters'}
+          </button>
+          <p className="text-xs text-content-muted">Exports USD canonical scene for all completed CAD files missing one.</p>
+        </div>
+        <div className="flex flex-col gap-1">
+          <button
+            onClick={() => generateMissingCanonicalScenesMut.mutate()}
+            disabled={generateMissingCanonicalScenesMut.isPending}
+            className="btn-secondary text-sm w-full justify-start"
+            title="Queue geometry GLB + USD master export for all completed CAD files without a geometry GLB"
+          >
+            <RefreshCw size={14} className={generateMissingCanonicalScenesMut.isPending ? 'animate-spin' : ''} />
+            {generateMissingCanonicalScenesMut.isPending ? 'Queueing…' : 'Generate Missing Canonical Scenes'}
+          </button>
+          <p className="text-xs text-content-muted">Queues geometry GLB + USD master for all completed CAD files missing a canonical scene.</p>
+        </div>
         <div className="flex flex-col gap-1">
           <button
             onClick={() => importMediaAssetsMut.mutate()}
@@ -947,11 +983,12 @@ export default function AdminPage() {
         </div>
         <div className="flex flex-col gap-1">
           <button
-            onClick={() => {
-              if (window.confirm('Delete all orphaned STEP files (not linked to any product)? This cannot be undone.')) {
-                cleanupOrphanedCadMut.mutate()
-              }
-            }}
+            onClick={() => setConfirmState({
+              open: true,
+              title: 'Delete Orphaned STEP Files',
+              message: 'Delete all orphaned STEP files (not linked to any product)? This cannot be undone.',
+              onConfirm: () => { cleanupOrphanedCadMut.mutate(); setConfirmState(s => ({ ...s, open: false })) },
+            })}
             disabled={cleanupOrphanedCadMut.isPending}
             className="btn-secondary text-sm w-full justify-start"
             title="Delete STEP files and thumbnails that are no longer linked to any product"
@@ -1416,26 +1453,26 @@ export default function AdminPage() {
|
||||
label: 'Draft',
|
||||
description: 'Fast export, visible faceting on large curves',
|
||||
color: 'border-amber-400 text-amber-700',
|
||||
values: { gltf_preview_linear_deflection: 0.2, gltf_preview_angular_deflection: 0.3, gltf_production_linear_deflection: 0.05, gltf_production_angular_deflection: 0.1 },
|
||||
values: { scene_linear_deflection: 0.2, scene_angular_deflection: 0.3, render_linear_deflection: 0.05, render_angular_deflection: 0.1 },
|
||||
},
|
||||
{
|
||||
label: 'Standard',
|
||||
description: 'Smooth curves, no fan artifacts — recommended',
|
||||
color: 'border-blue-400 text-blue-700',
|
||||
values: { gltf_preview_linear_deflection: 0.1, gltf_preview_angular_deflection: 0.1, gltf_production_linear_deflection: 0.03, gltf_production_angular_deflection: 0.05 },
|
||||
values: { scene_linear_deflection: 0.1, scene_angular_deflection: 0.1, render_linear_deflection: 0.03, render_angular_deflection: 0.05 },
|
||||
},
|
||||
{
|
||||
label: 'Fine',
|
||||
description: 'Maximum quality, very large files, slow export',
|
||||
color: 'border-emerald-400 text-emerald-700',
|
||||
values: { gltf_preview_linear_deflection: 0.05, gltf_preview_angular_deflection: 0.05, gltf_production_linear_deflection: 0.01, gltf_production_angular_deflection: 0.02 },
|
||||
values: { scene_linear_deflection: 0.05, scene_angular_deflection: 0.05, render_linear_deflection: 0.01, render_angular_deflection: 0.02 },
|
||||
},
|
||||
]
|
||||
const isActive = (preset: typeof PRESETS[0]) =>
|
||||
tess.gltf_preview_linear_deflection === preset.values.gltf_preview_linear_deflection &&
|
||||
tess.gltf_preview_angular_deflection === preset.values.gltf_preview_angular_deflection &&
|
||||
tess.gltf_production_linear_deflection === preset.values.gltf_production_linear_deflection &&
|
||||
tess.gltf_production_angular_deflection === preset.values.gltf_production_angular_deflection
|
||||
tess.scene_linear_deflection === preset.values.scene_linear_deflection &&
|
||||
tess.scene_angular_deflection === preset.values.scene_angular_deflection &&
|
||||
tess.render_linear_deflection === preset.values.render_linear_deflection &&
|
||||
tess.render_angular_deflection === preset.values.render_angular_deflection
|
||||
return (
|
||||
<div>
|
||||
<p className="text-xs font-semibold text-content-secondary uppercase tracking-wide mb-2">Presets</p>
|
||||
@@ -1450,8 +1487,8 @@ export default function AdminPage() {
|
||||
<div className="font-semibold text-sm">{preset.label}</div>
|
||||
<div className="text-xs text-content-muted mt-0.5">{preset.description}</div>
|
||||
<div className="text-xs font-mono text-content-secondary mt-1 space-y-0.5">
|
||||
<div>preview: {preset.values.gltf_preview_angular_deflection} rad / {preset.values.gltf_preview_linear_deflection} mm</div>
|
||||
<div>prod: {preset.values.gltf_production_angular_deflection} rad / {preset.values.gltf_production_linear_deflection} mm</div>
|
||||
<div>scene: {preset.values.scene_angular_deflection} rad / {preset.values.scene_linear_deflection} mm</div>
|
||||
<div>render: {preset.values.render_angular_deflection} rad / {preset.values.render_linear_deflection} mm</div>
|
||||
</div>
|
||||
</button>
|
||||
))}
|
||||
@@ -1491,7 +1528,7 @@ export default function AdminPage() {
|
||||
{/* Manual inputs */}
|
||||
<div className="grid grid-cols-2 gap-6">
|
||||
<div className="space-y-4">
|
||||
<p className="text-xs font-semibold text-content-secondary uppercase tracking-wide">Preview (Geometry GLB)</p>
|
||||
<p className="text-xs font-semibold text-content-secondary uppercase tracking-wide">Scene / Viewer</p>
|
||||
<div className="flex items-center gap-3">
|
||||
<label className="text-sm text-content-secondary w-36 shrink-0">Linear deflection</label>
|
||||
<input
|
||||
@@ -1499,8 +1536,8 @@ export default function AdminPage() {
|
||||
step="0.01"
|
||||
min="0.001"
|
||||
max="10"
|
||||
value={tess.gltf_preview_linear_deflection ?? 0.1}
|
||||
onChange={e => setTessellationDraft(d => ({ ...d, gltf_preview_linear_deflection: parseFloat(e.target.value) }))}
|
||||
value={tess.scene_linear_deflection ?? 0.1}
|
||||
onChange={e => setTessellationDraft(d => ({ ...d, scene_linear_deflection: parseFloat(e.target.value) }))}
|
||||
className="w-24 px-3 py-1.5 border border-border-default rounded-md text-sm focus:outline-none focus:border-blue-400"
|
||||
/>
|
||||
<span className="text-sm text-content-muted">mm</span>
|
||||
@@ -1512,16 +1549,16 @@ export default function AdminPage() {
               step="0.01"
               min="0.01"
               max="1.5"
-              value={tess.gltf_preview_angular_deflection ?? 0.1}
-              onChange={e => setTessellationDraft(d => ({ ...d, gltf_preview_angular_deflection: parseFloat(e.target.value) }))}
+              value={tess.scene_angular_deflection ?? 0.1}
+              onChange={e => setTessellationDraft(d => ({ ...d, scene_angular_deflection: parseFloat(e.target.value) }))}
               className="w-24 px-3 py-1.5 border border-border-default rounded-md text-sm focus:outline-none focus:border-blue-400"
             />
             <span className="text-sm text-content-muted">rad</span>
           </div>
-          <p className="text-xs text-content-muted">Used when clicking "Generate Geometry GLB".</p>
+          <p className="text-xs text-content-muted">Used for the 3D viewer (canonical scene). Smaller = smoother surfaces.</p>
         </div>
         <div className="space-y-4">
-          <p className="text-xs font-semibold text-content-secondary uppercase tracking-wide">Production (Production GLB)</p>
+          <p className="text-xs font-semibold text-content-secondary uppercase tracking-wide">Render output</p>
           <div className="flex items-center gap-3">
             <label className="text-sm text-content-secondary w-36 shrink-0">Linear deflection</label>
             <input
@@ -1529,8 +1566,8 @@ export default function AdminPage() {
               step="0.005"
               min="0.001"
               max="10"
-              value={tess.gltf_production_linear_deflection ?? 0.03}
-              onChange={e => setTessellationDraft(d => ({ ...d, gltf_production_linear_deflection: parseFloat(e.target.value) }))}
+              value={tess.render_linear_deflection ?? 0.03}
+              onChange={e => setTessellationDraft(d => ({ ...d, render_linear_deflection: parseFloat(e.target.value) }))}
               className="w-24 px-3 py-1.5 border border-border-default rounded-md text-sm focus:outline-none focus:border-blue-400"
             />
             <span className="text-sm text-content-muted">mm</span>
@@ -1542,13 +1579,13 @@ export default function AdminPage() {
               step="0.005"
               min="0.005"
               max="1.5"
-              value={tess.gltf_production_angular_deflection ?? 0.05}
-              onChange={e => setTessellationDraft(d => ({ ...d, gltf_production_angular_deflection: parseFloat(e.target.value) }))}
+              value={tess.render_angular_deflection ?? 0.05}
+              onChange={e => setTessellationDraft(d => ({ ...d, render_angular_deflection: parseFloat(e.target.value) }))}
               className="w-24 px-3 py-1.5 border border-border-default rounded-md text-sm focus:outline-none focus:border-blue-400"
             />
             <span className="text-sm text-content-muted">rad</span>
           </div>
-          <p className="text-xs text-content-muted">Used when clicking "Generate Production GLB". Smaller = smoother surfaces.</p>
+          <p className="text-xs text-content-muted">Used for final render output. Smaller = smoother surfaces, larger file sizes.</p>
         </div>
       </div>
       <div className="flex gap-2">
@@ -1621,7 +1658,7 @@ export default function AdminPage() {
         </button>
         {gpuProbeResult && (
           <span className="text-xs text-content-muted">
-            Last checked: {new Date(gpuProbeResult.timestamp).toLocaleString()}
+            Last checked: {gpuProbeResult.timestamp ? new Date(gpuProbeResult.timestamp).toLocaleString() : '—'}
           </span>
         )}
       </div>
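The renamed `scene_*` / `render_*` keys edited above are read back server-side with the same fallback defaults the inputs use. A minimal sketch of that lookup, assuming `sys_settings` is a plain dict of the persisted settings rows (illustrative only, not the production code):

```python
def tessellation_params(sys_settings: dict) -> dict:
    """Resolve the four renamed tessellation keys, falling back to the
    defaults the Admin inputs show (?? 0.1 / 0.1 / 0.03 / 0.05)."""
    return {
        "scene_linear_deflection": sys_settings.get("scene_linear_deflection", 0.1),
        "scene_angular_deflection": sys_settings.get("scene_angular_deflection", 0.1),
        "render_linear_deflection": sys_settings.get("render_linear_deflection", 0.03),
        "render_angular_deflection": sys_settings.get("render_angular_deflection", 0.05),
    }
```

Missing rows simply fall through to the UI defaults, so a fresh database renders with the same tessellation the Admin form displays.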
--- a/frontend/src/pages/Orders.tsx
+++ b/frontend/src/pages/Orders.tsx
@@ -197,7 +197,7 @@ export default function OrdersPage() {
     <div className="flex items-center gap-3 mb-4">
       <h1 className="text-2xl font-bold text-content">Orders</h1>
       <div className="ml-auto flex items-center gap-2">
-        <div className="flex border border-border-default rounded-md overflow-hidden">
+        <div className="hidden md:flex border border-border-default rounded-md overflow-hidden">
           <button
             onClick={() => setView('kanban')}
             title="Kanban view"
--- a/frontend/src/pages/ProductDetail.tsx
+++ b/frontend/src/pages/ProductDetail.tsx
@@ -19,7 +19,7 @@ import { listGlobalRenderPositions } from '../api/renderPositions'
 import MaterialInput from '../components/shared/MaterialInput'
 import MaterialWizard from '../components/MaterialWizard'
 import { useAuthStore, isAdmin as checkIsAdmin, isPrivileged as checkIsPrivileged } from '../store/auth'
-import { generateGltfGeometry, generateGltfProduction, resetStuckProcessing } from '../api/cad'
+import { generateGltfGeometry, resetStuckProcessing } from '../api/cad'
 import { listMediaAssets as getMediaAssets } from '../api/media'
 import InlineCadViewer from '../components/cad/InlineCadViewer'
 import { convertCadPartMaterials } from '../components/cad/cadUtils'
@@ -186,25 +186,15 @@ export default function ProductDetailPage() {
     staleTime: 0,
   })
 
-  const [productionGlbGenerating, setProductionGlbGenerating] = useState(false)
-
-  const { data: productionGlbAssets = [] } = useQuery({
-    queryKey: ['media-assets', cadFileId, 'gltf_production'],
-    queryFn: () => getMediaAssets({ cad_file_id: cadFileId!, asset_types: ['gltf_production'] }),
+  const { data: usdMasterAssets = [] } = useQuery({
+    queryKey: ['media-assets', cadFileId, 'usd_master'],
+    queryFn: () => getMediaAssets({ cad_file_id: cadFileId!, asset_types: ['usd_master'] }),
     enabled: !!cadFileId,
     staleTime: 0,
-    refetchInterval: productionGlbGenerating ? 3000 : false,
   })
 
-  // Stop polling once the freshly-generated asset has arrived
-  useEffect(() => {
-    if (productionGlbGenerating && productionGlbAssets.length > 0) {
-      setProductionGlbGenerating(false)
-    }
-  }, [productionGlbAssets, productionGlbGenerating])
-
   const geometryGlbUrl = geometryGlbAssets[0]?.download_url ?? null
-  const productionGlbUrl = productionGlbAssets[0]?.download_url ?? null
+  const usdMasterUrl = usdMasterAssets[0]?.download_url ?? null
 
   const { data: renders = [] } = useQuery<ProductRender[]>({
     queryKey: ['product-renders', id],
@@ -353,21 +343,11 @@ export default function ProductDetailPage() {
     onSuccess: () => {
       toast.info('Geometry GLB export queued')
       qc.invalidateQueries({ queryKey: ['media-assets', cadFileId, 'gltf_geometry'] })
+      qc.invalidateQueries({ queryKey: ['media-assets', cadFileId, 'usd_master'] })
     },
     onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed to queue GLB export'),
   })
 
-  const generateProductionGlbMut = useMutation({
-    mutationFn: () => generateGltfProduction(product!.cad_file_id!),
-    onSuccess: () => {
-      toast.info('Production GLB export queued')
-      setProductionGlbGenerating(true)
-      // Remove stale asset immediately so the button doesn't show an outdated download
-      qc.removeQueries({ queryKey: ['media-assets', cadFileId, 'gltf_production'] })
-    },
-    onError: (e: any) => toast.error(e.response?.data?.detail || 'Failed to queue production GLB export'),
-  })
-
   const resetStuckMut = useMutation({
     mutationFn: () => resetStuckProcessing(product!.cad_file_id!),
     onSuccess: (res) => {
@@ -737,21 +717,22 @@ export default function ProductDetailPage() {
             </div>
 
             <div className="border-t border-border-light pt-2 mt-1 flex flex-col gap-2">
+              <div className="text-xs font-semibold text-content-secondary uppercase tracking-wide mb-1">Canonical Scene</div>
               <GlbDownloadButton
-                label="Geometry GLB"
+                label="Viewer GLB"
                 url={geometryGlbUrl}
                 filename={`${product.name ?? product.pim_id}_geometry.glb`}
                 onGenerate={() => generateGeometryGlbMut.mutate()}
                 isGenerating={generateGeometryGlbMut.isPending}
-                title="Export geometry GLB directly from STEP via OCC (no Blender)"
+                title="Regenerate canonical scene (geometry GLB + auto-chains USD master)"
               />
               <GlbDownloadButton
-                label="Production GLB"
-                url={productionGlbUrl}
-                filename={`${product.name ?? product.pim_id}_production.glb`}
-                onGenerate={() => generateProductionGlbMut.mutate()}
-                isGenerating={generateProductionGlbMut.isPending || productionGlbGenerating}
-                title="Export production GLB with PBR materials via Blender"
+                label="USD Master"
+                url={usdMasterUrl}
+                filename={`${product.name ?? product.pim_id}_master.usd`}
+                onGenerate={() => generateGeometryGlbMut.mutate()}
+                isGenerating={generateGeometryGlbMut.isPending}
+                title="USD canonical scene (auto-generated after Viewer GLB)"
               />
             </div>
           </>
--- /dev/null
+++ b/frontend/src/vite-env.d.ts
@@ -0,0 +1 @@
+/// <reference types="vite/client" />
@@ -1,88 +1,108 @@
-# Plan: P1 Remaining Cleanup — M1 Dead Code + M3 blender_render.py Split
+# Plan: P2 USD Foundation — Commit & Verify
 
 ## Context
-Three categories of cleanup:
-1. **M1a**: Two legacy HTTP renderer directories (`blender-renderer/`, `threejs-renderer/`) still exist in repo root despite the services being removed in Phase A.
-2. **M1b**: Dead code in backend services — PIL fallback in `step_processor.py`, `stl_quality` param (always "low") in `render_blender.py` and `domains/rendering/tasks.py`.
-3. **M3**: `render-worker/scripts/blender_render.py` is 263 lines (target < 80) — argparse, scene setup, and render config should move to submodules.
-
-`domains/rendering/tasks.py` is **NOT dead code** — contains 6 active Celery tasks (`render_still_task`, `render_turntable_task`, `render_order_line_still_task`, `export_gltf_for_order_line_task`, `export_blend_for_order_line_task`, `apply_asset_library_materials_task`). Only the `stl_quality` param needs removal.
+All five P2 milestones are already implemented in the working tree as uncommitted changes.
+The task now is to apply the DB migrations, commit the work, and verify the stack runs.
 
-## Affected Files
-- `blender-renderer/` — delete entire directory
-- `threejs-renderer/` — delete entire directory
-- `backend/app/services/step_processor.py` — remove PIL fallback block (~line 565)
-- `backend/app/services/render_blender.py` — remove `stl_quality` param from `_glb_from_step()`, `render_still()`, `render_turntable_to_file()`
-- `backend/app/domains/rendering/tasks.py` — remove `stl_quality` param from `render_still_task`, `render_turntable_task`
-- `render-worker/scripts/blender_render.py` — thin to < 80 lines
-- `render-worker/scripts/_blender_args.py` — new file (argument parsing)
-- `render-worker/scripts/_blender_scene_setup.py` — new file (MODE A/B scene setup)
-- `render-worker/scripts/_blender_render_config.py` — new file (engine + output config)
+### Milestone status (assessed 2026-03-12)
+
+| Milestone | Status | Key files |
+|---|---|---|
+| M1: `export_step_to_usd.py` with `schaeffler:partKey` | ✅ DONE | `render-worker/scripts/export_step_to_usd.py` (631 lines) |
+| M2: `usd_master` MediaAsset + migrations 060–062 + Celery task | ✅ DONE | migrations 060/061/062, `generate_usd_master_task` in `export_glb.py` |
+| M3: `GET /api/cad/{id}/scene-manifest` | ✅ DONE | `part_key_service.py`, `SceneManifest` schema, endpoint in `cad.py` |
+| M4: `PUT /api/cad/{id}/manual-material-overrides` | ✅ DONE | New endpoint pair in `cad.py`, `saveManualOverrides` in `cad.ts` |
+| M5: ThreeDViewer uses partKey, survives reload | ✅ DONE | `partKeyMap` in GLB extras, `effectiveMaterials` merge, server-side persistence |
+
+## Affected Files (all uncommitted — working tree only)
+
+**Backend**
+- `backend/alembic/versions/060_usd_master_asset_type.py` — new migration
+- `backend/alembic/versions/061_material_assignment_layers.py` — new migration
+- `backend/alembic/versions/062_rename_tessellation_settings.py` — new migration
+- `backend/app/domains/media/models.py` — `MediaAssetType.usd_master` added
+- `backend/app/domains/products/models.py` — 3 new JSONB columns on `CadFile`
+- `backend/app/domains/products/schemas.py` — `SceneManifest`, `PartEntry` Pydantic models
+- `backend/app/domains/pipeline/tasks/export_glb.py` — `generate_usd_master_task` + auto-chain
+- `backend/app/domains/pipeline/tasks/extract_metadata.py` — minor update
+- `backend/app/domains/pipeline/tasks/render_thumbnail.py` — minor update
+- `backend/app/domains/pipeline/tasks/render_order_line.py` — minor update
+- `backend/app/api/routers/cad.py` — scene-manifest + manual-material-overrides endpoints
+- `backend/app/api/routers/admin.py` — generate-missing-usd-masters + generate-missing-canonical-scenes buttons
+- `backend/app/services/part_key_service.py` — new file: `build_scene_manifest()`, `generate_part_key()`
+- `backend/app/core/config_service.py` — minor update
+- `backend/app/core/tenant_context.py` — new file
+- `backend/app/tasks/step_tasks.py` — re-exports `generate_usd_master_task`
 
+**Render worker**
+- `render-worker/scripts/export_step_to_usd.py` — new file: full USD exporter
+- `render-worker/scripts/export_step_to_gltf.py` — injects `partKeyMap` into GLB extras
+- `render-worker/scripts/still_render.py` — USD path support
+- `render-worker/scripts/turntable_render.py` — USD path support
+- `render-worker/Dockerfile` — `usd-core>=24.11` added
+
+**Frontend**
+- `frontend/src/api/cad.ts` — `getManualOverrides()`, `saveManualOverrides()`
+- `frontend/src/api/media.ts` — `usd_master` type added
+- `frontend/src/api/sceneManifest.ts` — new file: `SceneManifest`, `fetchSceneManifest()`
+- `frontend/src/components/cad/ThreeDViewer.tsx` — `partKeyMap`, `effectiveMaterials`, reconciliation panel
+- `frontend/src/components/cad/MaterialPanel.tsx` — dual-path save, provenance badge
+- `frontend/src/pages/Admin.tsx` — USD master bulk action buttons
+- `frontend/src/pages/ProductDetail.tsx` — `usd_master` row in asset table
+- `frontend/src/pages/Orders.tsx` — minor update
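The four-layer material priority resolution implemented in `part_key_service.py` (listed above) can be sketched roughly as follows. The layer names and dict shapes here are assumptions for illustration, not the actual service code:

```python
def resolve_material(part_key: str,
                     manual_overrides: dict,
                     resolved_assignments: dict,
                     step_colors: dict,
                     default: str = "default") -> tuple:
    """Walk the layers highest-priority first; return (value, provenance)."""
    layers = [
        ("manual_override", manual_overrides),  # user-saved overrides (hypothetical shape)
        ("resolved", resolved_assignments),     # e.g. resolved_material_assignments on CadFile
        ("step_color", step_colors),            # colors embedded in the STEP file
    ]
    for provenance, layer in layers:
        if part_key in layer:
            return layer[part_key], provenance
    return default, "default"
```

Returning the provenance alongside the value is what lets MaterialPanel show its provenance badge without a second lookup.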
 
 ## Tasks (in order)
 
-### [x] Task 1: Delete legacy renderer directories
-- **File**: `blender-renderer/`, `threejs-renderer/` (repo root)
-- **What**: `git rm -rf blender-renderer/ threejs-renderer/` — removes both legacy HTTP service directories superseded by the Celery render-worker in Phase A
-- **Acceptance gate**: `ls blender-renderer/ threejs-renderer/` both return "no such file or directory"
+### [ ] Task 1: Apply migrations 060–062
+- **What**: Run `docker compose exec backend alembic upgrade head` to apply the three pending migrations
+- **Acceptance gate**: `docker compose exec backend alembic current` shows `062` (or higher) as current
 - **Dependencies**: none
-- **Risk**: Low — not imported by any active pipeline code
+- **Risk**: Low — each migration is additive (ADD VALUE, ADD COLUMN, UPDATE). Check for phantom drops before running.
 
-### [x] Task 2: Remove PIL fallback from step_processor.py
-- **File**: `backend/app/services/step_processor.py`
-- **What**: Find `from PIL import Image` (~line 565, inside `_generate_thumbnail()`) and the PIL thumbnail generation conditional branch. Remove the import and the branch — leave only the render-worker path.
-- **Acceptance gate**: `grep -n "PIL\|Pillow" backend/app/services/step_processor.py` returns nothing
-- **Dependencies**: none
-- **Risk**: Low — PIL path unreachable; render-worker handles all thumbnails
+### [ ] Task 2: TypeScript check
+- **What**: Run `docker compose exec frontend npx tsc --noEmit` to verify no type errors in the frontend changes
+- **Acceptance gate**: Zero TypeScript errors
+- **Dependencies**: none (frontend hot-reload, no rebuild needed)
+- **Risk**: Low
 
-### [x] Task 3: Remove stl_quality param from render_blender.py
-- **File**: `backend/app/services/render_blender.py`
-- **What**:
-  - `_glb_from_step(step_path, output_dir, quality="low")` → `_glb_from_step(step_path, output_dir)` — hardcode the low-quality deflection values inline (no conditional on quality)
-  - Remove `stl_quality: str = "low"` from `render_still(...)` and `render_turntable_to_file(...)`
-  - Remove all internal `quality=stl_quality` pass-throughs
-- **Acceptance gate**: `grep -n "stl_quality" backend/app/services/render_blender.py` returns nothing
-- **Dependencies**: none (Task 4 updates callers)
-- **Risk**: Medium — callers in tasks.py pass `stl_quality`; update in Task 4 immediately after
+### [ ] Task 3: Rebuild and restart backend + render-worker
+- **What**: `docker compose up -d --build backend worker render-worker beat` — picks up new Dockerfile (usd-core), new tasks, and new migrations
+- **Acceptance gate**: `docker compose logs backend | grep "Application startup complete"` and `docker compose exec render-worker python3 -c "from pxr import Usd; print(Usd.GetVersion())"` both succeed
+- **Dependencies**: Task 1
+- **Risk**: Medium — the `usd-core` pip install adds build time; if it fails the render-worker won't start
 
-### [x] Task 4: Remove stl_quality param from domains/rendering/tasks.py
-- **File**: `backend/app/domains/rendering/tasks.py`
-- **What**:
-  - `render_still_task` (~line 48): remove `stl_quality: str = "low"` from signature and from the `render_still(...)` call
-  - `render_turntable_task` (~line 152): remove `stl_quality: str = "low"` from signature. Lines ~210–228 inline OCC GLB generation reads `stl_quality` to choose deflection values — replace hardcoded quality-based values with DB settings reads (`scene_linear_deflection`, `scene_angular_deflection`). Pattern to follow: `export_glb.py` reads these settings via `sys_settings.get("scene_linear_deflection", 0.03)`.
-- **Acceptance gate**: `grep -n "stl_quality" backend/app/domains/rendering/tasks.py` returns nothing
+### [ ] Task 4: Commit all P2 work
+- **What**: Stage and commit all uncommitted P2 files in a single `feat(P2)` commit
+- **Acceptance gate**: `git status` shows a clean working tree (LEARNINGS.md and review-report.md may be included in the commit)
+- **Dependencies**: Tasks 1–3 (verify before committing)
+- **Risk**: Low
 
+### [ ] Task 5: Smoke-test end-to-end via Admin panel
+- **What**: Use Admin → "Generate Missing Canonical Scenes" to regenerate GLBs with `partKeyMap` + auto-chain USD masters for existing CAD files
+- **Acceptance gate**:
+  - `GET /api/cad/{id}/scene-manifest` returns `{"parts": [...], ...}` for a processed CadFile
+  - ThreeDViewer loads, click a part → MaterialPanel shows assignment provenance
+  - Assign a material → reload page → assignment still present
+- **Dependencies**: Task 3
+- **Risk**: Medium — the inline tessellation block must correctly read DB settings; verify key names match migration 062 output
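For orientation, the manifest shape that acceptance gate checks looks roughly like this. Plain dataclasses stand in for the real `SceneManifest` / `PartEntry` Pydantic models, and field names beyond `parts` are assumptions:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PartEntry:
    part_key: str                   # canonical slug, e.g. "inner_ring"
    source_name: str                # raw XCAF name, e.g. "InnerRing_AF1"
    material: Optional[str] = None  # resolved material, if any
    provenance: str = "unassigned"  # which priority layer supplied the material

@dataclass
class SceneManifest:
    cad_file_id: str
    parts: list = field(default_factory=list)
```

One entry per leaf part keeps the smoke test simple: count entries, then spot-check a `part_key` and its provenance.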
 
-### [x] Task 5: Extract _blender_args.py
-- **File**: `render-worker/scripts/blender_render.py`, new `render-worker/scripts/_blender_args.py`
-- **What**: Move the `argparse` block (lines ~44–110, ~67 lines) into `_blender_args.py` as a `parse_args()` function. `blender_render.py` calls `from _blender_args import parse_args` and uses `args = parse_args()`.
-- **Acceptance gate**: `_blender_args.py` exists with the parser; `blender_render.py` line count drops by ~60
-- **Dependencies**: none
-- **Risk**: Low — pure refactor, no logic change
-
-### [x] Task 6: Extract _blender_scene_setup.py
-- **File**: `render-worker/scripts/blender_render.py`, new `render-worker/scripts/_blender_scene_setup.py`
-- **What**: Move the MODE A / MODE B scene setup branches (lines ~131–214, ~84 lines) into `_blender_scene_setup.py` as `setup_scene(args, scene)` (dispatches internally to mode A or B based on `args.blend_template`). Import and call in `blender_render.py`.
-- **Acceptance gate**: `_blender_scene_setup.py` exists; `blender_render.py` line count drops by ~80
-- **Dependencies**: Task 5
-- **Risk**: Low — pure refactor; `bpy` available in Blender Python context
-
-### [x] Task 7: Extract _blender_render_config.py and verify ≤ 80 lines
-- **File**: `render-worker/scripts/blender_render.py`, new `render-worker/scripts/_blender_render_config.py`
-- **What**: Move engine/render settings + output path logic (lines ~216–258, ~43 lines) into `_blender_render_config.py` as `configure_render(scene, args, output_path, gpu_type)`. After extraction, `blender_render.py` must be ≤ 80 lines.
-- **Acceptance gate**: `wc -l render-worker/scripts/blender_render.py` shows ≤ 80
-- **Dependencies**: Task 6
-- **Risk**: Low — pure refactor
+- **Risk**: Medium — existing CAD files need backfill; may take minutes for bulk jobs to complete
 
 ## Migration Check
-No new Alembic migration required. Task 4 reads existing keys (`scene_linear_deflection`, `scene_angular_deflection`) from the `system_settings` table, already present after migration 062.
+Three migrations are pending in the working tree:
+- `060_usd_master_asset_type.py` — additive enum value
+- `061_material_assignment_layers.py` — additive JSONB columns
+- `062_rename_tessellation_settings.py` — UPDATE on `system_settings` rows (already checked: migration 062 was applied per review-report)
+
+**Before running**: read each migration file to confirm no unexpected DROP statements.
 
 ## Order Recommendation
-Tasks 1 and 2 are independent — can run in parallel.
-Tasks 3 and 4 are coupled — run 3 immediately before 4.
-Tasks 5, 6, 7 are sequential — each further reduces blender_render.py line count.
+Migrations → TypeScript check → Rebuild → Commit → Smoke test
 
 ## Risks / Open Questions
-- `render_turntable_task` inline tessellation: confirm exact key names are `scene_linear_deflection` / `scene_angular_deflection` (not the old `gltf_preview_*` names) by reading `export_glb.py` before Task 4.
-- After Task 7, do a smoke-test render to confirm submodule imports work inside Blender's Python interpreter.
+- `usd-core` build in Docker may be slow (first build) — expected, not a problem
+- Migration 062 may already be applied (review noted "verified by 0-row SELECT") — `alembic upgrade head` is idempotent if so
+- Existing CAD files need backfill for `partKeyMap` in GLB extras — handled by "Generate Missing Canonical Scenes" bulk action
+- `resolvePartKey()` falls back to identity (raw mesh name) for GLBs generated before this change — graceful degradation, not a blocking issue
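The fallback behaviour in that last bullet can be sketched like this (Python for illustration only; the real `resolvePartKey()` lives in the TypeScript of ThreeDViewer):

```python
import re

# Same OCC assembly-suffix pattern the exporter normalizes away.
_AF_RE = re.compile(r"_AF\d+$", re.IGNORECASE)

def resolve_part_key(mesh_name: str, part_key_map: dict) -> str:
    """Strip the `_AF<n>` suffix, look the name up in partKeyMap, and
    fall back to the raw mesh name for pre-partKeyMap GLBs."""
    normalized = _AF_RE.sub("", mesh_name)
    return part_key_map.get(normalized, mesh_name)
```

With an empty map every name resolves to itself, which is exactly the graceful degradation the plan describes for old GLBs.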
--- a/render-worker/Dockerfile
+++ b/render-worker/Dockerfile
@@ -70,6 +70,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends libglu1-mesa li
 # GMSH for Frontal-Delaunay tessellation (alternative to OCC BRepMesh)
 RUN pip3 install --no-cache-dir "gmsh>=4.15.0"
 
+# USD authoring library (no GPU/imaging dependency — pure Python + C++ bindings)
+RUN pip3 install --no-cache-dir "usd-core>=24.11"
+
 # Copy render scripts
 COPY render-worker/scripts/ /render-scripts/
--- a/render-worker/scripts/export_step_to_gltf.py
+++ b/render-worker/scripts/export_step_to_gltf.py
@@ -436,6 +436,58 @@ def _tessellate_with_gmsh(shape, linear_deflection: float, angular_deflection: f
     )
 
 
+def _collect_part_key_map(shape_tool, free_labels) -> dict:
+    """Return {normalized_source_name: part_key_slug} for all leaf parts in the XCAF hierarchy.
+
+    The normalized source name (XCAF label name without _AF\\d+ suffix) is what
+    Three.js sees after normalizeMeshName() strips the OCC assembly suffix from the
+    GLB mesh node name. The slug algorithm matches part_key_service.generate_part_key().
+    """
+    import re as _re
+    import hashlib as _hashlib
+    from OCP.TDF import TDF_LabelSequence
+    from OCP.TDataStd import TDataStd_Name
+    from OCP.XCAFDoc import XCAFDoc_ShapeTool
+
+    _af_re = _re.compile(r'_AF\d+$', _re.IGNORECASE)
+
+    def _slug(source_name: str, xcaf_path: str = "") -> str:
+        base = _af_re.sub('', source_name) if source_name else ''
+        # camelCase split — same as part_key_service.generate_part_key
+        base = _re.sub(r'([a-z])([A-Z])', r'\1_\2', base)
+        slug = _re.sub(r'[^a-z0-9]+', '_', base.lower()).strip('_')
+        if not slug:
+            slug = f"part_{_hashlib.sha256(xcaf_path.encode()).hexdigest()[:8]}"
+        return slug[:50]
+
+    part_key_map: dict = {}
+
+    def _collect(label, path: str = "") -> None:
+        name_attr = TDataStd_Name()
+        name = ""
+        if label.FindAttribute(TDataStd_Name.GetID_s(), name_attr):
+            name = name_attr.Get().ToExtString()
+
+        components = TDF_LabelSequence()
+        XCAFDoc_ShapeTool.GetComponents_s(label, components)
+
+        xcaf_path = f"{path}/{name}" if name else f"{path}/unnamed"
+
+        if components.Length() == 0:
+            # Leaf node — normalized source name (without _AF suffix) as key
+            normalized = _af_re.sub('', name) if name else ''
+            if normalized:
+                part_key_map[normalized] = _slug(name, xcaf_path)
+        else:
+            for i in range(1, components.Length() + 1):
+                _collect(components.Value(i), xcaf_path)
+
+    for i in range(1, free_labels.Length() + 1):
+        _collect(free_labels.Value(i))
+
+    return part_key_map
+
+
 def _inject_glb_extras(glb_path: Path, extras: dict) -> None:
     """Patch a GLB binary to add/update scenes[0].extras JSON field.
 
@@ -514,6 +566,10 @@ def main() -> None:
     print(f"Found {free_labels.Length()} root shape(s), tessellating "
           f"(linear={args.linear_deflection}mm, angular={args.angular_deflection}rad) …")
 
+    # Collect partKeyMap before tessellation (XCAF names are stable at this point)
+    part_key_map = _collect_part_key_map(shape_tool, free_labels)
+    print(f"partKeyMap: {len(part_key_map)} unique part names collected")
+
     engine = getattr(args, "tessellation_engine", "occ")
     if engine == "gmsh":
         # GMSH: tessellate each solid individually to cap peak RAM usage.
@@ -652,18 +708,25 @@ def main() -> None:
 
     print(f"GLB exported: {out.name} ({out.stat().st_size // 1024} KB)")
 
-    # --- Inject sharp edge pairs into GLB extras ---
+    # --- Inject sharp edge pairs and partKeyMap into GLB extras ---
     # Blender 5.0 reads scenes[0].extras as scene custom properties on import,
     # making the data available to export_gltf.py as bpy.context.scene["key"].
-    if sharp_pairs:
-        try:
-            _inject_glb_extras(out, {
-                "schaeffler_sharp_edge_pairs": sharp_pairs,
-                "schaeffler_sharp_threshold_deg": args.sharp_threshold,
-            })
-            print(f"Injected {len(sharp_pairs)} sharp edge segment pairs into GLB extras")
-        except Exception as _exc:
-            print(f"WARNING: GLB extras injection failed (non-fatal): {_exc}", file=sys.stderr)
+    # partKeyMap is read by Three.js in ThreeDViewer to resolve partKey from mesh name.
+    try:
+        extras_payload: dict = {}
+        if sharp_pairs:
+            extras_payload["schaeffler_sharp_edge_pairs"] = sharp_pairs
+            extras_payload["schaeffler_sharp_threshold_deg"] = args.sharp_threshold
+        if part_key_map:
+            extras_payload["partKeyMap"] = part_key_map
+        if extras_payload:
+            _inject_glb_extras(out, extras_payload)
+            if sharp_pairs:
+                print(f"Injected {len(sharp_pairs)} sharp edge segment pairs into GLB extras")
+            if part_key_map:
+                print(f"Injected partKeyMap ({len(part_key_map)} entries) into GLB extras")
+    except Exception as _exc:
+        print(f"WARNING: GLB extras injection failed (non-fatal): {_exc}", file=sys.stderr)
 
 
     try:
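For readers unfamiliar with the GLB container, here is a self-contained sketch of what a helper like `_inject_glb_extras` has to do at the byte level: parse the JSON chunk, merge the extras into `scenes[0]`, and rewrite the chunk with corrected lengths. This is an illustrative stdlib reimplementation under the GLB 2.0 layout, not the production helper (which patches the file in place and treats failures as non-fatal):

```python
import json
import struct

GLB_MAGIC = 0x46546C67   # 'glTF'
JSON_CHUNK = 0x4E4F534A  # 'JSON'

def inject_glb_scene_extras(glb: bytes, extras: dict) -> bytes:
    """Return a copy of `glb` with `extras` merged into scenes[0].extras."""
    magic, version, _total = struct.unpack_from("<III", glb, 0)
    assert magic == GLB_MAGIC and version == 2
    json_len, chunk_type = struct.unpack_from("<II", glb, 12)
    assert chunk_type == JSON_CHUNK
    # Trailing pad spaces in the chunk are valid JSON whitespace.
    gltf = json.loads(glb[20:20 + json_len])
    gltf.setdefault("scenes", [{}])
    gltf["scenes"][0].setdefault("extras", {}).update(extras)
    payload = json.dumps(gltf, separators=(",", ":")).encode()
    payload += b" " * (-len(payload) % 4)   # JSON chunks are space-padded to 4 bytes
    rest = glb[20 + json_len:]              # untouched chunks (e.g. the BIN chunk)
    header = struct.pack("<III", GLB_MAGIC, 2, 12 + 8 + len(payload) + len(rest))
    return header + struct.pack("<II", len(payload), JSON_CHUNK) + payload + rest
```

Because only the JSON chunk is rewritten, mesh data in the BIN chunk is carried over byte-for-byte.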
--- /dev/null
+++ b/render-worker/scripts/export_step_to_usd.py
@@ -0,0 +1,630 @@
+"""STEP → USD exporter for Schaeffler Automat.
+
+Reads a STEP file via OCP/XCAF (preserving part names + embedded colors),
+tessellates with BRepMesh, builds a USD stage with one UsdGeomMesh per leaf
+part, and writes a .usd file.
+
+Coordinate system: OCC is mm Z-up. USD stage is authored in mm Y-up
+(matching glTF / Blender convention). metersPerUnit=0.001 is set so Blender
+handles the mm→m conversion on import — no explicit scaling applied here.
+
+Usage:
+    python3 export_step_to_usd.py \\
+        --step_path /path/to/file.stp \\
+        --output_path /path/to/output.usd \\
+        [--linear_deflection 0.03] \\
+        [--angular_deflection 0.05] \\
+        [--color_map '{"Ring": "#4C9BE8"}'] \\
+        [--sharp_threshold 20.0] \\
+        [--cad_file_id uuid]
+
+Exit 0 on success, exit 1 on failure.
+Prints MANIFEST_JSON: {...} to stdout before exit.
+"""
+from __future__ import annotations
+
+import argparse
+import hashlib
+import json
+import math
+import re
+import sys
+import traceback
+from pathlib import Path
+
+
+# ── CLI ───────────────────────────────────────────────────────────────────────
+
+def parse_args() -> argparse.Namespace:
+    p = argparse.ArgumentParser()
+    p.add_argument("--step_path", required=True)
+    p.add_argument("--output_path", required=True)
+    p.add_argument("--linear_deflection", type=float, default=0.03)
+    p.add_argument("--angular_deflection", type=float, default=0.05)
+    p.add_argument("--color_map", default="{}")
+    p.add_argument("--sharp_threshold", type=float, default=20.0)
+    p.add_argument("--cad_file_id", default="")
+    return p.parse_args()
+
+
+# ── Part key generation ───────────────────────────────────────────────────────
+
+_AF_RE = re.compile(r'_AF\d+$', re.IGNORECASE)
+
+
+def _generate_part_key(xcaf_path: str, source_name: str, existing_keys: set) -> str:
+    """Deterministic slug, max 64 chars, unique within assembly."""
+    base = _AF_RE.sub('', source_name) if source_name else ''
+    base = re.sub(r'([a-z])([A-Z])', r'\1_\2', base)
+    slug = re.sub(r'[^a-z0-9]+', '_', base.lower()).strip('_')
+    if not slug:
+        slug = f"part_{hashlib.sha256(xcaf_path.encode()).hexdigest()[:8]}"
+    slug = slug[:50]
+    key = slug
+    n = 2
+    while key in existing_keys:
+        key = f"{slug}_{n}"
+        n += 1
+    existing_keys.add(key)
+    return key
+
+
+# ── Color helpers ─────────────────────────────────────────────────────────────
+
+PALETTE_HEX = [
+    "#4C9BE8", "#E85B4C", "#4CBE72", "#E8A84C", "#A04CE8",
+    "#4CD4E8", "#E84CA8", "#7EC850", "#E86B30", "#5088C8",
+]
+
+
+def _occ_color_to_hex(occ_color) -> str:
+    r = int(occ_color.Red() * 255)
+    g = int(occ_color.Green() * 255)
+    b = int(occ_color.Blue() * 255)
+    return f"#{r:02X}{g:02X}{b:02X}"
+
+
+def _hex_to_occ_color(hex_color: str):
+    from OCP.Quantity import Quantity_Color, Quantity_TOC_RGB
+    h = hex_color.lstrip("#")
+    if len(h) < 6:
+        return Quantity_Color(0.7, 0.7, 0.7, Quantity_TOC_RGB)
+    return Quantity_Color(
+        int(h[0:2], 16) / 255.0,
+        int(h[2:4], 16) / 255.0,
+        int(h[4:6], 16) / 255.0,
+        Quantity_TOC_RGB,
+    )
+
+
+def _hex_to_rgb01(hex_color: str) -> tuple:
+    h = hex_color.lstrip('#')
+    if len(h) < 6:
+        return (0.7, 0.7, 0.7)
+    return (int(h[0:2], 16) / 255.0, int(h[2:4], 16) / 255.0, int(h[4:6], 16) / 255.0)
+
+
+def _get_shape_color(color_tool, shape) -> str | None:
+    """Return hex color for an OCC shape (surface color preferred)."""
+    from OCP.Quantity import Quantity_Color
+    try:
+        from OCP.XCAFDoc import XCAFDoc_ColorSurf as _SURF
+        from OCP.XCAFDoc import XCAFDoc_ColorGen as _GEN
+    except ImportError:
+        _SURF = 1
+        _GEN = 0
+    occ_color = Quantity_Color()
+    if color_tool.GetColor(shape, _SURF, occ_color):
+        return _occ_color_to_hex(occ_color)
+    if color_tool.GetColor(shape, _GEN, occ_color):
+        return _occ_color_to_hex(occ_color)
+    return None
# ── XCAF color application ────────────────────────────────────────────────────

def _apply_color_map(shape_tool, color_tool, free_labels, color_map: dict) -> None:
    from OCP.TDF import TDF_LabelSequence
    from OCP.TDataStd import TDataStd_Name
    from OCP.XCAFDoc import XCAFDoc_ShapeTool
    try:
        from OCP.XCAFDoc import XCAFDoc_ColorSurf as _SURF
    except ImportError:
        _SURF = 1

    def _visit(label) -> None:
        name_attr = TDataStd_Name()
        name = ""
        if label.FindAttribute(TDataStd_Name.GetID_s(), name_attr):
            name = name_attr.Get().ToExtString()
        if name:
            for part_name, hex_color in color_map.items():
                if part_name.lower() in name.lower() or name.lower() in part_name.lower():
                    color_tool.SetColor(label, _hex_to_occ_color(hex_color), _SURF)
                    break
        components = TDF_LabelSequence()
        XCAFDoc_ShapeTool.GetComponents_s(label, components)
        for i in range(1, components.Length() + 1):
            _visit(components.Value(i))

    for i in range(1, free_labels.Length() + 1):
        _visit(free_labels.Value(i))

def _apply_palette_colors(shape_tool, color_tool, free_labels) -> None:
    from OCP.TDF import TDF_LabelSequence
    from OCP.XCAFDoc import XCAFDoc_ShapeTool
    try:
        from OCP.XCAFDoc import XCAFDoc_ColorSurf as _SURF
    except ImportError:
        _SURF = 1

    leaves: list = []

    def _collect(label) -> None:
        components = TDF_LabelSequence()
        XCAFDoc_ShapeTool.GetComponents_s(label, components)
        if components.Length() == 0:
            leaves.append(label)
        else:
            for i in range(1, components.Length() + 1):
                _collect(components.Value(i))

    for i in range(1, free_labels.Length() + 1):
        _collect(free_labels.Value(i))

    for idx, label in enumerate(leaves):
        color_tool.SetColor(label, _hex_to_occ_color(PALETTE_HEX[idx % len(PALETTE_HEX)]), _SURF)

# ── Sharp edge extraction (inlined from export_step_to_gltf.py) ──────────────

def _extract_sharp_edge_pairs(shape, sharp_threshold_deg: float = 20.0) -> list:
    """Extract sharp B-rep edges as dense curve-sample segment pairs (mm, Z-up).

    Ported from export_step_to_gltf.py to avoid importing that module
    (its top-level code runs main() on import).
    """
    from OCP.TopTools import TopTools_IndexedDataMapOfShapeListOfShape
    from OCP.TopExp import TopExp as _TopExp
    from OCP.TopAbs import TopAbs_EDGE, TopAbs_FACE, TopAbs_FORWARD
    from OCP.TopoDS import TopoDS as _TopoDS
    from OCP.BRepAdaptor import BRepAdaptor_Surface, BRepAdaptor_Curve2d, BRepAdaptor_Curve
    from OCP.BRepLProp import BRepLProp_SLProps
    from OCP.GCPnts import GCPnts_UniformAbscissa

    edge_face_map = TopTools_IndexedDataMapOfShapeListOfShape()
    _TopExp.MapShapesAndAncestors_s(shape, TopAbs_EDGE, TopAbs_FACE, edge_face_map)

    sharp_pairs: list = []
    n_checked = 0
    n_sharp = 0
    SAMPLE_STEP_MM = 0.3

    for i in range(1, edge_face_map.Extent() + 1):
        edge_shape = edge_face_map.FindKey(i)
        faces = edge_face_map.FindFromIndex(i)
        n_checked += 1
        if faces.Size() < 2:
            continue
        face_shapes = list(faces)
        if len(face_shapes) < 2:
            continue
        try:
            edge = _TopoDS.Edge_s(edge_shape)
            face1 = _TopoDS.Face_s(face_shapes[0])
            face2 = _TopoDS.Face_s(face_shapes[1])

            c2d_1 = BRepAdaptor_Curve2d(edge, face1)
            uv1 = c2d_1.Value((c2d_1.FirstParameter() + c2d_1.LastParameter()) / 2.0)
            surf1 = BRepAdaptor_Surface(face1)
            props1 = BRepLProp_SLProps(surf1, uv1.X(), uv1.Y(), 1, 1e-6)
            if not props1.IsNormalDefined():
                continue
            n1 = props1.Normal()
            if face1.Orientation() != TopAbs_FORWARD:
                n1.Reverse()

            c2d_2 = BRepAdaptor_Curve2d(edge, face2)
            uv2 = c2d_2.Value((c2d_2.FirstParameter() + c2d_2.LastParameter()) / 2.0)
            surf2 = BRepAdaptor_Surface(face2)
            props2 = BRepLProp_SLProps(surf2, uv2.X(), uv2.Y(), 1, 1e-6)
            if not props2.IsNormalDefined():
                continue
            n2 = props2.Normal()
            if face2.Orientation() != TopAbs_FORWARD:
                n2.Reverse()

            cos_angle = max(-1.0, min(1.0, n1.Dot(n2)))
            angle_deg = math.degrees(math.acos(cos_angle))
            if angle_deg > 90.0:
                angle_deg = 180.0 - angle_deg
            if angle_deg <= sharp_threshold_deg:
                continue

            n_sharp += 1
            pts: list = []
            try:
                curve3d = BRepAdaptor_Curve(edge)
                f_param = curve3d.FirstParameter()
                l_param = curve3d.LastParameter()
                if math.isfinite(f_param) and math.isfinite(l_param):
                    sampler = GCPnts_UniformAbscissa()
                    sampler.Initialize(curve3d, SAMPLE_STEP_MM, 1e-6)
                    if sampler.IsDone() and sampler.NbPoints() >= 2:
                        for j in range(1, sampler.NbPoints() + 1):
                            p = curve3d.Value(sampler.Parameter(j))
                            pts.append([round(p.X(), 4), round(p.Y(), 4), round(p.Z(), 4)])
            except Exception:
                pts = []

            if len(pts) < 2:
                continue
            for k in range(len(pts) - 1):
                sharp_pairs.append([pts[k], pts[k + 1]])
        except Exception:
            continue

    print(
        f"Sharp edge extraction: {n_checked} edges checked, "
        f"{n_sharp} sharp (>{sharp_threshold_deg:.0f}°), "
        f"{len(sharp_pairs)} segment pairs total"
    )
    return sharp_pairs

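The sharpness test at the core of the loop above reduces to vector math that can be verified without OCC. A standalone sketch of the fold-into-[0°, 90°] logic (the fold makes the test insensitive to normal orientation flips across the shared edge):

```python
import math

def dihedral_sharpness_deg(n1, n2) -> float:
    # dot product of two unit normals, clamped for floating-point safety
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(n1, n2))))
    angle = math.degrees(math.acos(dot))
    return 180.0 - angle if angle > 90.0 else angle

print(dihedral_sharpness_deg((0, 0, 1), (0, 1, 0)))  # a box edge: ~90, sharp at the 20 deg default
print(dihedral_sharpness_deg((0, 0, 1), (0, 0, 1)))  # tangent-continuous faces: 0.0, smooth
```

An edge is kept only when this value exceeds `sharp_threshold_deg` (20° by default), matching the `angle_deg <= sharp_threshold_deg: continue` guard above.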
# ── XCAF traversal ────────────────────────────────────────────────────────────

def _traverse_xcaf(shape_tool, color_tool, label, path_prefix, existing_keys, depth=0):
    """Yield one dict per leaf shape in the XCAF hierarchy.

    Phase 1 limitation: for deeply nested assemblies, transforms from
    intermediate reference labels are not composed — world-space positions
    may be off for non-flat assemblies. Single-level assemblies are correct.
    """
    from OCP.TDF import TDF_LabelSequence, TDF_Label
    from OCP.TDataStd import TDataStd_Name
    from OCP.XCAFDoc import XCAFDoc_ShapeTool

    name_attr = TDataStd_Name()
    source_name = ""
    if label.FindAttribute(TDataStd_Name.GetID_s(), name_attr):
        source_name = name_attr.Get().ToExtString()

    xcaf_path = (f"{path_prefix}/{source_name}" if source_name
                 else f"{path_prefix}/unnamed_{depth}")

    # Follow references to get the definition label (for sub-assembly detection)
    actual_label = label
    if XCAFDoc_ShapeTool.IsReference_s(label):
        ref_label = TDF_Label()
        if XCAFDoc_ShapeTool.GetReferredShape_s(label, ref_label):
            actual_label = ref_label

    components = TDF_LabelSequence()
    XCAFDoc_ShapeTool.GetComponents_s(actual_label, components)

    if components.Length() == 0:
        shape = shape_tool.GetShape_s(label)
        if shape.IsNull():
            shape = shape_tool.GetShape_s(actual_label)
        if shape.IsNull():
            return

        part_key = _generate_part_key(xcaf_path, source_name, existing_keys)
        color = _get_shape_color(color_tool, shape)

        yield {
            'shape': shape,
            'source_name': source_name,
            'xcaf_path': xcaf_path,
            'part_key': part_key,
            'color': color,
        }
    else:
        for i in range(1, components.Length() + 1):
            yield from _traverse_xcaf(
                shape_tool, color_tool, components.Value(i),
                xcaf_path, existing_keys, depth + 1,
            )

# ── Mesh geometry extraction ──────────────────────────────────────────────────

def _extract_mesh(shape) -> tuple[list, list]:
    """Return (vertices, triangles) from a tessellated OCC shape.

    Vertices are in OCC space (mm, Z-up).
    Triangles are 0-based index triples.
    """
    from OCP.TopExp import TopExp_Explorer
    from OCP.TopAbs import TopAbs_FACE, TopAbs_REVERSED
    from OCP.TopoDS import TopoDS
    from OCP.BRep import BRep_Tool
    from OCP.TopLoc import TopLoc_Location

    vertices: list = []
    triangles: list = []
    v_offset = 0

    shape_trsf = shape.Location().Transformation()
    shape_has_loc = not shape.Location().IsIdentity()

    exp = TopExp_Explorer(shape, TopAbs_FACE)
    while exp.More():
        face = TopoDS.Face_s(exp.Current())
        face_loc = TopLoc_Location()
        tri = BRep_Tool.Triangulation_s(face, face_loc)

        if tri is not None and tri.NbNodes() > 0:
            reversed_face = (face.Orientation() == TopAbs_REVERSED)
            face_has_loc = not face_loc.IsIdentity()

            for i in range(1, tri.NbNodes() + 1):
                node = tri.Node(i)
                if face_has_loc:
                    node = node.Transformed(face_loc.Transformation())
                if shape_has_loc:
                    node = node.Transformed(shape_trsf)
                vertices.append((node.X(), node.Y(), node.Z()))

            for i in range(1, tri.NbTriangles() + 1):
                n1, n2, n3 = tri.Triangle(i).Get()
                v0 = n1 - 1 + v_offset
                v1 = n2 - 1 + v_offset
                v2 = n3 - 1 + v_offset
                triangles.append((v0, v2, v1) if reversed_face else (v0, v1, v2))

            v_offset += tri.NbNodes()

        exp.Next()

    return vertices, triangles

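Two details in the triangle loop above are easy to get wrong: OCC node indices are 1-based and local to each face's triangulation (hence `v_offset`), and faces with `TopAbs_REVERSED` orientation need their winding flipped so normals keep pointing outward. A standalone sketch of just that mapping:

```python
def to_zero_based(occ_tri, v_offset, reversed_face):
    # occ_tri holds 1-based node indices local to one face's triangulation
    n1, n2, n3 = occ_tri
    v0, v1, v2 = n1 - 1 + v_offset, n2 - 1 + v_offset, n3 - 1 + v_offset
    # swapping the last two indices reverses the winding (and thus the facet normal)
    return (v0, v2, v1) if reversed_face else (v0, v1, v2)

print(to_zero_based((1, 2, 3), 10, False))  # (10, 11, 12)
print(to_zero_based((1, 2, 3), 10, True))   # (10, 12, 11)
```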
# ── Index-space sharp edge mapping ────────────────────────────────────────────

def _world_to_index_pairs(vertices: list, world_pairs: list, tol: float = 0.5) -> list:
    """Map world-space (mm, Z-up) segment pairs → local vertex index pairs."""
    def _k(x, y, z):
        return (round(x / tol) * tol, round(y / tol) * tol, round(z / tol) * tol)

    coord_map: dict = {}
    for idx, (x, y, z) in enumerate(vertices):
        k = _k(x, y, z)
        if k not in coord_map:
            coord_map[k] = idx

    result = []
    for p0, p1 in world_pairs:
        i0 = coord_map.get(_k(p0[0], p0[1], p0[2]))
        i1 = coord_map.get(_k(p1[0], p1[1], p1[2]))
        if i0 is not None and i1 is not None and i0 != i1:
            result.append((i0, i1))
    return result

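Since `_world_to_index_pairs` is pure Python, the 0.5 mm snapping behaviour can be demonstrated directly (standalone copy of the helper above):

```python
def world_to_index_pairs(vertices, world_pairs, tol=0.5):
    def key(x, y, z):
        # quantize to a tol-sized grid; nearby coordinates share a cell
        return (round(x / tol) * tol, round(y / tol) * tol, round(z / tol) * tol)

    coord_map = {}
    for idx, (x, y, z) in enumerate(vertices):
        coord_map.setdefault(key(x, y, z), idx)  # first vertex wins per cell
    out = []
    for p0, p1 in world_pairs:
        i0, i1 = coord_map.get(key(*p0)), coord_map.get(key(*p1))
        if i0 is not None and i1 is not None and i0 != i1:
            out.append((i0, i1))
    return out

verts = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (10.0, 10.0, 0.0)]
# endpoints within snapping distance of verts 0 and 1 resolve to (0, 1)
print(world_to_index_pairs(verts, [[(0.1, 0.0, 0.0), (9.9, 0.1, 0.0)]]))  # [(0, 1)]
```

Pairs whose endpoints land in the same grid cell are dropped, which also filters out segments much shorter than `tol`.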
# ── USD prim name sanitizer ───────────────────────────────────────────────────

def _prim_name(name: str) -> str:
    safe = re.sub(r'[^A-Za-z0-9_]', '_', name)
    if safe and safe[0].isdigit():
        safe = f"_{safe}"
    return safe or "unnamed"

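USD prim names must be valid identifiers (letters, digits, underscores; no leading digit), which is what `_prim_name` enforces. A quick standalone check:

```python
import re

def prim_name(name: str) -> str:
    safe = re.sub(r'[^A-Za-z0-9_]', '_', name)
    if safe and safe[0].isdigit():
        safe = f"_{safe}"  # identifiers cannot start with a digit
    return safe or "unnamed"

print(prim_name("6205-2RS Bearing"))  # _6205_2RS_Bearing
print(prim_name(""))                  # unnamed
```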
# ── Main ──────────────────────────────────────────────────────────────────────

def main() -> None:
    args = parse_args()
    color_map: dict = json.loads(args.color_map)

    step_path = Path(args.step_path)
    output_path = Path(args.output_path)

    if not step_path.exists():
        print(f"ERROR: STEP file not found: {step_path}", file=sys.stderr)
        sys.exit(1)

    output_path.parent.mkdir(parents=True, exist_ok=True)

    # ── OCC / XCAF imports ────────────────────────────────────────────────────
    from OCP.STEPCAFControl import STEPCAFControl_Reader
    from OCP.TDocStd import TDocStd_Document
    from OCP.XCAFApp import XCAFApp_Application
    from OCP.XCAFDoc import XCAFDoc_DocumentTool
    from OCP.TCollection import TCollection_ExtendedString
    from OCP.TDF import TDF_LabelSequence
    from OCP.BRepMesh import BRepMesh_IncrementalMesh
    from OCP.IFSelect import IFSelect_RetDone

    # ── pxr imports ───────────────────────────────────────────────────────────
    from pxr import Usd, UsdGeom, UsdShade, Sdf, Vt, Gf

    # ── Read STEP ─────────────────────────────────────────────────────────────
    app = XCAFApp_Application.GetApplication_s()
    doc = TDocStd_Document(TCollection_ExtendedString("MDTV-CAF"))
    app.InitDocument(doc)

    reader = STEPCAFControl_Reader()
    reader.SetNameMode(True)
    reader.SetColorMode(True)
    reader.SetLayerMode(True)
    status = reader.ReadFile(str(step_path))
    if status != IFSelect_RetDone:
        print(f"ERROR: STEPCAFControl_Reader failed (status={status})", file=sys.stderr)
        sys.exit(1)
    reader.Transfer(doc)

    shape_tool = XCAFDoc_DocumentTool.ShapeTool_s(doc.Main())
    color_tool = XCAFDoc_DocumentTool.ColorTool_s(doc.Main())

    free_labels = TDF_LabelSequence()
    shape_tool.GetFreeShapes(free_labels)
    print(
        f"Found {free_labels.Length()} root shape(s), tessellating "
        f"(linear={args.linear_deflection}mm, angular={args.angular_deflection}rad) …"
    )

    # ── Tessellate ────────────────────────────────────────────────────────────
    for i in range(1, free_labels.Length() + 1):
        shape = shape_tool.GetShape_s(free_labels.Value(i))
        if not shape.IsNull():
            BRepMesh_IncrementalMesh(
                shape, args.linear_deflection, False, args.angular_deflection, True
            )
    print("Tessellation complete.")

    # ── Sharp edge pairs (world-space mm, Z-up) ───────────────────────────────
    sharp_pairs_mm: list = []
    try:
        for i in range(1, free_labels.Length() + 1):
            root_shape = shape_tool.GetShape_s(free_labels.Value(i))
            if not root_shape.IsNull():
                sharp_pairs_mm.extend(
                    _extract_sharp_edge_pairs(root_shape, args.sharp_threshold)
                )
        print(f"Total sharp segment pairs: {len(sharp_pairs_mm)}")
    except Exception as exc:
        print(f"WARNING: sharp edge extraction failed (non-fatal): {exc}", file=sys.stderr)

    # ── Apply colors ──────────────────────────────────────────────────────────
    if color_map:
        try:
            _apply_color_map(shape_tool, color_tool, free_labels, color_map)
            print(f"Applied color_map ({len(color_map)} entries)")
        except Exception as exc:
            print(f"WARNING: color_map application failed (non-fatal): {exc}", file=sys.stderr)
    else:
        try:
            _apply_palette_colors(shape_tool, color_tool, free_labels)
            print("Applied palette colors")
        except Exception as exc:
            print(f"WARNING: palette colors failed (non-fatal): {exc}", file=sys.stderr)

    # ── Create USD stage ──────────────────────────────────────────────────────
    stage = Usd.Stage.CreateNew(str(output_path))
    UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.y)
    UsdGeom.SetStageMetersPerUnit(stage, 0.001)  # mm; Blender handles m conversion on import

    root_prim = UsdGeom.Xform.Define(stage, "/Root")
    stage.SetDefaultPrim(root_prim.GetPrim())
    UsdGeom.Xform.Define(stage, "/Root/Assembly")
    stage.DefinePrim("/Root/Looks", "Scope")

    # ── Walk XCAF tree → author USD prims ─────────────────────────────────────
    existing_keys: set = set()
    manifest_parts: list = []
    n_parts = 0
    n_empty = 0

    for root_idx in range(1, free_labels.Length() + 1):
        root_label = free_labels.Value(root_idx)

        from OCP.TDataStd import TDataStd_Name as _Name
        _na = _Name()
        root_src = ""
        if root_label.FindAttribute(_Name.GetID_s(), _na):
            root_src = _na.Get().ToExtString()
        node_name = _prim_name(root_src or f"Root{root_idx}")
        node_path = f"/Root/Assembly/{node_name}"
        UsdGeom.Xform.Define(stage, node_path)

        for part in _traverse_xcaf(shape_tool, color_tool, root_label, "", existing_keys):
            source_name = part['source_name']
            part_key = part['part_key']
            hex_color = part['color']
            shape = part['shape']
            xcaf_path = part['xcaf_path']

            # color_map override (substring match)
            for map_name, map_hex in color_map.items():
                if (map_name.lower() in source_name.lower()
                        or source_name.lower() in map_name.lower()):
                    hex_color = map_hex
                    break
            if not hex_color:
                hex_color = PALETTE_HEX[n_parts % len(PALETTE_HEX)]

            vertices, triangles = _extract_mesh(shape)
            if not vertices or not triangles:
                n_empty += 1
                continue

            part_path = f"{node_path}/{part_key}"
            mesh_path = f"{part_path}/Mesh"

            # ── Xform prim ────────────────────────────────────────────────
            xform = UsdGeom.Xform.Define(stage, part_path)
            prim = xform.GetPrim()
            prim.SetCustomDataByKey("schaeffler:partKey", part_key)
            prim.SetCustomDataByKey("schaeffler:sourceName", source_name)
            prim.SetCustomDataByKey("schaeffler:sourceAssemblyPath", xcaf_path)
            prim.SetCustomDataByKey("schaeffler:sourceColor", hex_color)
            prim.SetCustomDataByKey("schaeffler:tessellation:linearDeflectionMm",
                                    args.linear_deflection)
            prim.SetCustomDataByKey("schaeffler:tessellation:angularDeflectionRad",
                                    args.angular_deflection)
            if args.cad_file_id:
                prim.SetCustomDataByKey("schaeffler:cadFileId", args.cad_file_id)

            # ── UsdGeomMesh ───────────────────────────────────────────────
            mesh = UsdGeom.Mesh.Define(stage, mesh_path)
            mesh.CreateSubdivisionSchemeAttr(UsdGeom.Tokens.none)

            # OCC (X, Y, Z) mm Z-up → USD (X, -Z, Y) mm Y-up
            mesh.CreatePointsAttr(Vt.Vec3fArray([
                Gf.Vec3f(x, -z, y) for (x, y, z) in vertices
            ]))
            mesh.CreateFaceVertexCountsAttr(Vt.IntArray([3] * len(triangles)))
            mesh.CreateFaceVertexIndicesAttr(
                Vt.IntArray([idx for tri in triangles for idx in tri])
            )
            r, g, b = _hex_to_rgb01(hex_color)
            mesh.CreateDisplayColorAttr(Vt.Vec3fArray([Gf.Vec3f(r, g, b)]))

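The point remap on the `CreatePointsAttr` line is a 90° rotation about the X axis, so distances are preserved; a standalone sketch. One thing worth double-checking: `(x, -z, y)` sends OCC +Z to USD -Y, while the more common Z-up to Y-up convention `(x, z, -y)` sends +Z to +Y, so verify that imported geometry is not flipped.

```python
import math

def occ_to_usd(p):
    # OCC (x, y, z), Z-up -> USD (x, -z, y), Y-up: a 90 deg rotation about X
    x, y, z = p
    return (x, -z, y)

print(occ_to_usd((1.0, 2.0, 3.0)))  # (1.0, -3.0, 2.0)
print(occ_to_usd((0.0, 0.0, 1.0)))  # (0.0, -1.0, 0.0): OCC "up" lands on -Y

# pure rotation: vector lengths are unchanged
v = (3.0, 4.0, 12.0)
assert math.isclose(math.hypot(*v), math.hypot(*occ_to_usd(v)))
```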
            # ── Index-space sharp edge primvar ────────────────────────────
            # Lookup is in OCC Z-up space; sharp_pairs_mm are also Z-up — no swap needed.
            if sharp_pairs_mm:
                idx_pairs = _world_to_index_pairs(vertices, sharp_pairs_mm)
                if idx_pairs:
                    pv = UsdGeom.PrimvarsAPI(mesh).CreatePrimvar(
                        "schaeffler:sharpEdgeVertexPairs",
                        Sdf.ValueTypeNames.Int2Array,
                        UsdGeom.Tokens.constant,
                    )
                    pv.Set(Vt.Vec2iArray([Gf.Vec2i(a, b) for a, b in idx_pairs]))

            # ── Material placeholder + binding ────────────────────────────
            mat_name = _prim_name(source_name) if source_name else f"mat_{part_key}"
            mat_usd_path = f"/Root/Looks/{mat_name}"
            if not stage.GetPrimAtPath(mat_usd_path):
                UsdShade.Material.Define(stage, mat_usd_path)
            UsdShade.MaterialBindingAPI(mesh.GetPrim()).Bind(
                UsdShade.Material(stage.GetPrimAtPath(mat_usd_path))
            )

            manifest_parts.append({
                "part_key": part_key,
                "source_name": source_name,
                "prim_path": part_path,
            })
            n_parts += 1

    stage.Save()

    sz = output_path.stat().st_size // 1024 if output_path.exists() else 0
    print(f"USD exported: {output_path.name} ({sz} KB), "
          f"{n_parts} parts, {n_empty} empty shapes skipped")

    # ── Stdout manifest (one line, parsed by Celery task) ─────────────────────
    print(f"MANIFEST_JSON: {json.dumps({'parts': manifest_parts})}")


try:
    main()
except SystemExit:
    raise
except Exception:
    traceback.print_exc()
    sys.exit(1)
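On the consumer side, the single `MANIFEST_JSON:` line is what the Celery task (`generate_usd_master_task`, per the commit message) scrapes out of the subprocess output. A hypothetical sketch of that parsing, assuming the task reads captured stdout as text (`parse_manifest` is illustrative, not the task's actual code):

```python
import json

def parse_manifest(stdout: str) -> dict:
    # scan for the one machine-readable line among the human-readable logs
    prefix = "MANIFEST_JSON: "
    for line in stdout.splitlines():
        if line.startswith(prefix):
            return json.loads(line[len(prefix):])
    return {"parts": []}  # exporter produced no manifest

out = (
    "Tessellation complete.\n"
    'MANIFEST_JSON: {"parts": [{"part_key": "inner_ring", '
    '"prim_path": "/Root/Assembly/Asm/inner_ring"}]}\n'
)
print(parse_manifest(out)["parts"][0]["part_key"])  # inner_ring
```

Keeping the manifest on one prefixed line means ordinary `print()` logging elsewhere in the exporter cannot corrupt it.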
@@ -785,65 +785,6 @@ def main():
    bpy.ops.render.render(write_still=True)
    print("[still_render] render done.")

    # ── Pillow post-processing: green bar + model name label ─────────────────
    # Skip overlay for transparent renders to keep clean alpha channel
    if transparent_bg:
        print("[still_render] Transparent mode — skipping Pillow overlay.")
    else:
        try:
            from PIL import Image, ImageDraw, ImageFont

            img = Image.open(output_path).convert("RGBA")
            draw = ImageDraw.Draw(img)
            W, H = img.size

            # Schaeffler green top bar
            bar_h = max(8, H // 32)
            draw.rectangle([0, 0, W - 1, bar_h - 1], fill=(0, 137, 61, 255))

            # Model name strip at bottom
            model_name = os.path.splitext(os.path.basename(glb_path))[0]
            label_h = max(20, H // 20)
            img.alpha_composite(
                Image.new("RGBA", (W, label_h), (30, 30, 30, 180)),
                dest=(0, H - label_h),
            )

            font_size = max(10, label_h - 6)
            font = None
            for fp in [
                "/usr/share/fonts/truetype/dejavu/DejaVuSans-Bold.ttf",
                "/usr/share/fonts/truetype/liberation/LiberationSans-Bold.ttf",
                "/usr/share/fonts/truetype/freefont/FreeSansBold.ttf",
            ]:
                if os.path.exists(fp):
                    try:
                        font = ImageFont.truetype(fp, font_size)
                        break
                    except Exception:
                        pass
            if font is None:
                font = ImageFont.load_default()

            tb = draw.textbbox((0, 0), model_name, font=font)
            text_w = tb[2] - tb[0]
            draw.text(
                ((W - text_w) // 2, H - label_h + (label_h - (tb[3] - tb[1])) // 2),
                model_name, font=font, fill=(255, 255, 255, 255),
            )

            # Save in original format
            if ext in ('.jpg', '.jpeg'):
                img.convert("RGB").save(output_path, format="JPEG", quality=92)
            else:
                img.convert("RGB").save(output_path, format="PNG")
            print("[still_render] Pillow overlay applied.")

        except ImportError:
            print("[still_render] Pillow not available - skipping overlay.")
        except Exception as exc:
            print(f"[still_render] Pillow overlay failed (non-fatal): {exc}")

    print("[still_render] Done.")

@@ -342,6 +342,18 @@ def main():
     except Exception:
         pass
 
+    # Named argument: --usd-path <path> — when set, import USD instead of GLB
+    usd_path = ""
+    if "--usd-path" in argv:
+        _usd_idx = argv.index("--usd-path")
+        usd_path = argv[_usd_idx + 1] if _usd_idx + 1 < len(argv) else ""
+
+    # Pre-load USD import helper once (used in both MODE A and MODE B)
+    _import_usd_file = None
+    if usd_path:
+        sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
+        from import_usd import import_usd_file as _import_usd_file  # type: ignore[assignment]
+
     os.makedirs(frames_dir, exist_ok=True)
 
     try:
@@ -387,8 +399,11 @@ def main():
     # Find or create target collection
     target_col = _ensure_collection(target_collection)
 
-    # Import OCC GLB (already in metres, one object per STEP part)
-    parts = _import_glb(glb_path)
+    # Import geometry: USD path when available, otherwise GLB
+    if usd_path and _import_usd_file:
+        parts = _import_usd_file(usd_path)
+    else:
+        parts = _import_glb(glb_path)
     # Apply render position rotation before material/camera setup
     _apply_rotation(parts, rotation_x, rotation_y, rotation_z)
     # Apply OCC topology-based shading overrides
@@ -466,7 +481,10 @@ def main():
     needs_auto_camera = True
     bpy.ops.wm.read_factory_settings(use_empty=True)
 
-    parts = _import_glb(glb_path)
+    if usd_path and _import_usd_file:
+        parts = _import_usd_file(usd_path)
+    else:
+        parts = _import_glb(glb_path)
     # Apply render position rotation before material/camera setup
     _apply_rotation(parts, rotation_x, rotation_y, rotation_z)
     # Apply OCC topology-based shading overrides