feat: sharp edge pipeline V02, tessellation presets, media cache-bust, GMSH plan

Sharp Edge Pipeline V02:
- export_step_to_gltf.py: replace BRep_Tool.Polygon3D_s (returns None in XCAF) with
  GCPnts_UniformAbscissa curve sampling at 0.3mm step — extracts 17,129 segment pairs
- Inject sharp_edge_pairs + sharp_threshold_deg into GLB extras (scenes[0].extras)
  via binary GLB JSON-chunk patching (no extra dependency)
- export_gltf.py: read schaeffler_sharp_edge_pairs from Blender scene custom props,
  apply via KD-tree to mark edges sharp=True + seam=True (OCC mm Z-up → Blender transform)
- tools/restore_sharp_marks.py: dual-pass (dihedral angle + OCC pairs), updated coordinate
  transform (X, -Z, Y) * 0.001
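
The KD-tree matching step described above can be sketched as follows. This is an illustrative reconstruction, not the shipped code: the coordinate transform (X, -Z, Y) * 0.001 and the 0.5mm tolerance come from this commit, but the function names are invented, and a brute-force nearest-vertex search stands in for the real KD-tree so the sketch has no dependencies.

```python
# Illustrative sketch (hypothetical names): map OCC segment endpoints, given
# in mm with Z-up, into Blender's coordinate space and find the nearest mesh
# vertex for each endpoint. The production pipeline uses a KD-tree for this
# lookup; brute force keeps the sketch dependency-free.

TOL = 0.5 * 0.001  # 0.5 mm matching tolerance, expressed in metres

def occ_to_blender(p):
    """Transform an OCC point (mm, Z-up) into Blender space: (X, -Z, Y) * 0.001."""
    x, y, z = p
    return (x * 0.001, -z * 0.001, y * 0.001)

def nearest_vertex(point, vertices):
    """Return (index, distance) of the vertex closest to point."""
    best_i, best_d = -1, float("inf")
    for i, v in enumerate(vertices):
        d = sum((a - b) ** 2 for a, b in zip(point, v)) ** 0.5
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d

def match_segment(pair_mm, vertices):
    """Map both endpoints of an OCC segment pair to mesh vertex indices.

    Returns None when the endpoints collapse onto one vertex (degenerate)
    or when either endpoint is farther than TOL from the mesh."""
    i0, d0 = nearest_vertex(occ_to_blender(pair_mm[0]), vertices)
    i1, d1 = nearest_vertex(occ_to_blender(pair_mm[1]), vertices)
    if i0 == i1 or d0 > TOL or d1 > TOL:
        return None
    return (i0, i1)
```

A matched pair then identifies one Blender mesh edge to mark sharp+seam, which is why the dense 0.3mm sampling matters: sparse endpoints would rarely land within tolerance of two distinct vertices.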

Tessellation:
- Admin UI: Draft / Standard / Fine preset buttons with active-state highlighting
- Default angular deflection: preview 0.5→0.1 rad, production 0.2→0.05 rad
- export_glb.py: read updated defaults from system_settings
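
A minimal sketch of how the preset buttons could map to deflection settings. The "standard" values match the new defaults in this commit (preview 0.1 rad, production 0.05 rad); the draft and fine values, the setting keys, and the function name are invented for illustration.

```python
# Hypothetical preset table mirroring the Draft / Standard / Fine buttons.
# Only the "standard" values come from this commit; the rest are assumptions.
TESSELLATION_PRESETS = {
    "draft":    {"angular_deflection_preview": 0.5,  "angular_deflection_production": 0.2},
    "standard": {"angular_deflection_preview": 0.1,  "angular_deflection_production": 0.05},
    "fine":     {"angular_deflection_preview": 0.05, "angular_deflection_production": 0.02},
}

def resolve_deflection(settings: dict, mode: str) -> float:
    """Return the angular deflection (radians) for 'preview' or 'production'.

    An explicit per-key override in settings wins; otherwise the value comes
    from the selected preset, falling back to 'standard'."""
    preset = TESSELLATION_PRESETS.get(
        settings.get("tessellation_preset", "standard"),
        TESSELLATION_PRESETS["standard"],
    )
    key = f"angular_deflection_{mode}"
    return settings.get(key, preset[key])
```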

Media / Cache:
- media/service.py: get_download_url appends ?v={file_size_bytes} cache-buster
- media/router.py: Cache-Control: no-cache for all download/thumbnail endpoints
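
The cache-busting scheme above can be sketched in a few lines: append the file size as a version query parameter so the URL changes whenever the file is regenerated. The helper name and signature here are illustrative, not the actual service code.

```python
# Sketch of the ?v={file_size_bytes} cache-buster (hypothetical helper name).
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def with_cache_buster(url: str, file_size_bytes: int) -> str:
    """Append/overwrite a v=<size> query parameter, preserving existing ones."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["v"] = str(file_size_bytes)
    return urlunparse(parts._replace(query=urlencode(query)))
```

Combined with `Cache-Control: no-cache` on the endpoints, a regenerated GLB gets a new URL (its size almost always changes), so viewers re-fetch without manual cache invalidation.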

Render pipeline:
- still_render.py / turntable_render.py: shared GPU activation + camera improvements
- render_order_line.py: global render position support
- render_thumbnail.py: updated defaults

Frontend:
- InlineCadViewer: file_size_bytes-aware URL update triggers re-fetch on regeneration
- ThreeDViewer: material panel, part selection, PBR mode improvements
- Admin.tsx: tessellation preset cards, GMSH setting dropdown
- MediaBrowser, ProductDetail, OrderDetail, Orders: various UI improvements
- New: MaterialPanel, GlobalRenderPositionsPanel, StepIndicator components
- New: renderPositions.ts API client

Plans / Docs:
- plan.md: GMSH Frontal-Delaunay tessellation plan (6 tasks)
- LEARNINGS.md: OCC Polygon3D_s None issue + GCPnts fix
- .gitignore: add backend/core (core dump from root process)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-11 14:40:36 +01:00
parent 202b06a026
commit ca62319688
70 changed files with 6551 additions and 1130 deletions
export_gltf.py (+21 -5):
@@ -88,8 +88,12 @@ def _apply_sharp_edges_from_occ(mesh_objects: list, sharp_edge_pairs: list) -> N
                 continue  # degenerate — both endpoints map to same vertex
             bv0, bv1 = bm.verts[idx0], bm.verts[idx1]
             edge = bm.edges.get((bv0, bv1)) or bm.edges.get((bv1, bv0))
-            if edge is not None and edge.smooth:
+            if edge is not None:
+                # Mark sharp (for normal splitting) AND seam (for UV unwrap).
+                # Both are needed: sharp controls glTF vertex splits / shading;
+                # seam defines UV island boundaries for correct UV unwrapping.
                 edge.smooth = False
+                edge.seam = True
                 marked += 1
 
         bm.to_mesh(obj.data)
@@ -116,6 +120,14 @@ def main() -> None:
     mesh_objects = [o for o in bpy.data.objects if o.type == "MESH"]
     print(f"Imported geometry GLB: {args.glb_path} ({len(mesh_objects)} mesh objects)")
 
+    # Read OCC sharp edge pairs embedded by export_step_to_gltf.py into GLB extras.
+    # Blender 5.0 maps glTF scenes[0].extras as scene custom properties on import.
+    # These take priority over the mesh_attributes CLI argument (which only has 2
+    # endpoints per edge — see V02 refactor for why this matters).
+    glb_sharp_pairs = bpy.context.scene.get("schaeffler_sharp_edge_pairs") or []
+    if glb_sharp_pairs:
+        print(f"Loaded {len(glb_sharp_pairs)} OCC sharp edge pairs from GLB extras")
+
     # Remove OCC-baked custom normals from the geometry GLB.
     # RWGltf_CafWriter embeds per-corner normals from OCC tessellation as a
     # 'custom_normal' attribute (CORNER, INT16_2D). If left in place, Blender's
@@ -168,10 +180,14 @@ def main() -> None:
     print(f"Marked {total_sharp} sharp/seam edges across {len(mesh_objects)} objects")
 
-    # Apply OCC sharp edges if available (additional explicit sharp edges from CAD data)
-    sharp_pairs = mesh_attributes.get("sharp_edge_pairs") or []
-    if sharp_pairs:
-        _apply_sharp_edges_from_occ(mesh_objects, sharp_pairs)
+    # Apply OCC sharp edges from GLB extras (V02: dense tessellation segment pairs).
+    # Prefer GLB-embedded pairs over mesh_attributes CLI argument — the GLB extras
+    # contain the full tessellated polyline for each sharp B-rep edge (all intermediate
+    # points), while mesh_attributes only has 2 endpoints per edge (too sparse for
+    # reliable KD-tree matching). Fall back to mesh_attributes if GLB extras absent.
+    occ_pairs = list(glb_sharp_pairs) or (mesh_attributes.get("sharp_edge_pairs") or [])
+    if occ_pairs:
+        _apply_sharp_edges_from_occ(mesh_objects, occ_pairs)
 
     # Apply asset library materials if provided.
     # link=False (append) is required: the GLTF exporter can only traverse

export_step_to_gltf.py:
@@ -46,6 +46,10 @@ def parse_args() -> argparse.Namespace:
         "--color_map", default="{}",
         help='JSON dict mapping part name → hex color, e.g. \'{"Ring": "#4C9BE8"}\'',
     )
+    parser.add_argument(
+        "--sharp_threshold", type=float, default=20.0,
+        help="Dihedral angle threshold (degrees) for sharp B-rep edge detection. Default 20.0",
+    )
     return parser.parse_args()
@@ -123,6 +127,177 @@ def _apply_palette_colors(shape_tool, color_tool, free_labels) -> None:
             color_tool.SetColor(label, occ_color, COLOR_SURF)
 
 
+def _extract_sharp_edge_pairs(shape, sharp_threshold_deg: float = 20.0) -> list:
+    """Extract geometrically sharp B-rep edges as dense curve sample segment pairs.
+
+    For each edge shared by exactly 2 faces, evaluates the dihedral angle using
+    PCurve-based surface normal evaluation. When the angle exceeds the threshold,
+    samples the analytical 3D curve uniformly at 0.3mm intervals via
+    GCPnts_UniformAbscissa. Consecutive sample pairs give the KD-tree in
+    export_gltf.py enough density to find and mark the correct Blender mesh edges.
+
+    Note: BRep_Tool.Polygon3D_s() and PolygonOnTriangulation_s() return None in
+    XCAF compound context — the tessellation data is stored on component instances,
+    not on the compound edges. Curve sampling bypasses this entirely.
+
+    Args:
+        shape: OCC TopoDS_Shape (tessellated with BRepMesh_IncrementalMesh)
+        sharp_threshold_deg: dihedral angle threshold in degrees (default 20°)
+
+    Returns:
+        List of [[x0,y0,z0],[x1,y1,z1]] consecutive segment pairs in mm (OCC
+        coordinate space, Z-up). Must be called BEFORE mm→m scaling.
+    """
+    import math as _math
+    from OCP.TopTools import TopTools_IndexedDataMapOfShapeListOfShape
+    from OCP.TopExp import TopExp as _TopExp
+    from OCP.TopAbs import TopAbs_EDGE, TopAbs_FACE, TopAbs_FORWARD
+    from OCP.TopoDS import TopoDS as _TopoDS
+    from OCP.BRepAdaptor import BRepAdaptor_Surface, BRepAdaptor_Curve2d, BRepAdaptor_Curve
+    from OCP.BRepLProp import BRepLProp_SLProps
+    from OCP.GCPnts import GCPnts_UniformAbscissa
+
+    edge_face_map = TopTools_IndexedDataMapOfShapeListOfShape()
+    _TopExp.MapShapesAndAncestors_s(shape, TopAbs_EDGE, TopAbs_FACE, edge_face_map)
+
+    sharp_pairs: list = []
+    n_checked = 0
+    n_sharp = 0
+
+    # Sample step 0.3mm — well below the KD-tree TOL=0.5mm in export_gltf.py.
+    # Tessellation vertex spacing for default deflection is ~0.78-1.55mm, so at
+    # least one consecutive sample pair will straddle each tessellation edge.
+    SAMPLE_STEP_MM = 0.3
+
+    for i in range(1, edge_face_map.Extent() + 1):
+        edge_shape = edge_face_map.FindKey(i)
+        faces = edge_face_map.FindFromIndex(i)
+        if faces.Size() < 2:
+            n_checked += 1
+            continue
+        face_shapes = list(faces)
+        if len(face_shapes) < 2:
+            n_checked += 1
+            continue
+        n_checked += 1
+        try:
+            edge = _TopoDS.Edge_s(edge_shape)
+            face1 = _TopoDS.Face_s(face_shapes[0])
+            face2 = _TopoDS.Face_s(face_shapes[1])
+
+            # PCurve-based normal evaluation at edge midpoint
+            c2d_1 = BRepAdaptor_Curve2d(edge, face1)
+            uv1 = c2d_1.Value((c2d_1.FirstParameter() + c2d_1.LastParameter()) / 2.0)
+            surf1 = BRepAdaptor_Surface(face1)
+            props1 = BRepLProp_SLProps(surf1, uv1.X(), uv1.Y(), 1, 1e-6)
+            if not props1.IsNormalDefined():
+                continue
+            n1 = props1.Normal()
+            if face1.Orientation() != TopAbs_FORWARD:
+                n1.Reverse()
+
+            c2d_2 = BRepAdaptor_Curve2d(edge, face2)
+            uv2 = c2d_2.Value((c2d_2.FirstParameter() + c2d_2.LastParameter()) / 2.0)
+            surf2 = BRepAdaptor_Surface(face2)
+            props2 = BRepLProp_SLProps(surf2, uv2.X(), uv2.Y(), 1, 1e-6)
+            if not props2.IsNormalDefined():
+                continue
+            n2 = props2.Normal()
+            if face2.Orientation() != TopAbs_FORWARD:
+                n2.Reverse()
+
+            cos_angle = max(-1.0, min(1.0, n1.Dot(n2)))
+            angle_deg = _math.degrees(_math.acos(cos_angle))
+            # Use exterior (supplement) angle so convex and concave edges both work
+            if angle_deg > 90.0:
+                angle_deg = 180.0 - angle_deg
+            if angle_deg <= sharp_threshold_deg:
+                continue  # smooth transition — skip
+            n_sharp += 1
+
+            # Sample the analytical 3D curve at fixed arc-length intervals.
+            # GCPnts_UniformAbscissa works on the exact B-rep curve regardless of
+            # whether tessellation polygon data is stored on the edge or not.
+            pts: list = []
+            try:
+                curve3d = BRepAdaptor_Curve(edge)
+                f_param = curve3d.FirstParameter()
+                l_param = curve3d.LastParameter()
+                if _math.isfinite(f_param) and _math.isfinite(l_param):
+                    sampler = GCPnts_UniformAbscissa()
+                    sampler.Initialize(curve3d, SAMPLE_STEP_MM, 1e-6)
+                    if sampler.IsDone() and sampler.NbPoints() >= 2:
+                        for j in range(1, sampler.NbPoints() + 1):
+                            t = sampler.Parameter(j)
+                            p = curve3d.Value(t)
+                            pts.append([round(p.X(), 4), round(p.Y(), 4), round(p.Z(), 4)])
+            except Exception:
+                pts = []
+            if len(pts) < 2:
+                continue
+
+            # Consecutive segment pairs — KD-tree in export_gltf.py maps each
+            # endpoint to its nearest Blender vertex; if they differ and share a
+            # mesh edge, that edge is marked sharp+seam.
+            for k in range(len(pts) - 1):
+                sharp_pairs.append([pts[k], pts[k + 1]])
+        except Exception:
+            continue
+
+    print(
+        f"Sharp edge extraction: {n_checked} edges checked, "
+        f"{n_sharp} sharp (>{sharp_threshold_deg:.0f}°), "
+        f"{len(sharp_pairs)} segment pairs total"
+    )
+    return sharp_pairs
+
+
+def _inject_glb_extras(glb_path: Path, extras: dict) -> None:
+    """Patch a GLB binary to add/update scenes[0].extras JSON field.
+
+    The GLB format stores a JSON chunk immediately after the 12-byte header.
+    We re-serialize it with the new extras and update chunk + total lengths.
+    No external dependencies — pure stdlib struct/json.
+    """
+    import struct as _struct
+
+    data = glb_path.read_bytes()
+    # GLB header: magic(4) + version(4) + total_length(4) = 12 bytes
+    # JSON chunk: chunk_length(4) + chunk_type(4) + chunk_data(chunk_length bytes)
+    json_len = _struct.unpack_from("<I", data, 12)[0]
+    json_type = _struct.unpack_from("<I", data, 16)[0]
+    if json_type != 0x4E4F534A:  # "JSON"
+        print("WARNING: _inject_glb_extras: unexpected chunk type, skipping extras injection",
+              file=sys.stderr)
+        return
+
+    j = json.loads(data[20: 20 + json_len])
+    if "scenes" in j and j["scenes"]:
+        existing = j["scenes"][0].get("extras") or {}
+        existing.update(extras)
+        j["scenes"][0]["extras"] = existing
+    else:
+        j.setdefault("extras", {}).update(extras)
+
+    new_json = json.dumps(j, separators=(",", ":"))
+    # Pad to 4-byte boundary with spaces (required by GLB spec)
+    pad = (4 - len(new_json) % 4) % 4
+    new_json_bytes = new_json.encode() + b" " * pad
+    rest = data[20 + json_len:]  # BIN chunk and anything after
+    new_chunk = _struct.pack("<II", len(new_json_bytes), 0x4E4F534A) + new_json_bytes
+    new_total = 12 + len(new_chunk) + len(rest)
+    new_header = _struct.pack("<III", 0x46546C67, 2, new_total)
+    glb_path.write_bytes(new_header + new_chunk + rest)
+
+
 def main() -> None:
     args = parse_args()
     color_map: dict = json.loads(args.color_map)
@@ -173,6 +348,21 @@ def main() -> None:
         True,  # isInParallel
     )
 
+    # --- Extract sharp B-rep edge pairs (before mm→m scaling so coords are in mm) ---
+    # Collect all free shapes into one list for the extraction function.
+    # The extraction uses the freshly tessellated XCAF shapes.
+    sharp_pairs: list = []
+    try:
+        for i in range(1, free_labels.Length() + 1):
+            root_shape = shape_tool.GetShape_s(free_labels.Value(i))
+            if not root_shape.IsNull():
+                pairs = _extract_sharp_edge_pairs(root_shape, args.sharp_threshold)
+                sharp_pairs.extend(pairs)
+        print(f"Total OCC sharp segment pairs: {len(sharp_pairs)}")
+    except Exception as _exc:
+        print(f"WARNING: sharp edge extraction failed (non-fatal): {_exc}", file=sys.stderr)
+        sharp_pairs = []
+
     # --- Apply colors ---
     if color_map:
         _apply_color_map(shape_tool, color_tool, free_labels, color_map)
@@ -222,6 +412,19 @@ def main() -> None:
     print(f"GLB exported: {out.name} ({out.stat().st_size // 1024} KB)")
 
+    # --- Inject sharp edge pairs into GLB extras ---
+    # Blender 5.0 reads scenes[0].extras as scene custom properties on import,
+    # making the data available to export_gltf.py as bpy.context.scene["key"].
+    if sharp_pairs:
+        try:
+            _inject_glb_extras(out, {
+                "schaeffler_sharp_edge_pairs": sharp_pairs,
+                "schaeffler_sharp_threshold_deg": args.sharp_threshold,
+            })
+            print(f"Injected {len(sharp_pairs)} sharp edge segment pairs into GLB extras")
+        except Exception as _exc:
+            print(f"WARNING: GLB extras injection failed (non-fatal): {_exc}", file=sys.stderr)
+
 try:
     main()
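
For reference, the extras written by `_inject_glb_extras` can be read back without any glTF library. This reader-side sketch is not part of the commit; it simply mirrors the chunk layout the injector relies on (12-byte header, then the JSON chunk).

```python
# Parse scenes[0].extras out of an in-memory GLB using only stdlib.
import json
import struct

def read_glb_scene_extras(data: bytes) -> dict:
    """Return scenes[0].extras from a GLB byte string, or {} if absent."""
    magic, version, total_len = struct.unpack_from("<III", data, 0)
    if magic != 0x46546C67:  # b"glTF"
        raise ValueError("not a GLB file")
    json_len, json_type = struct.unpack_from("<II", data, 12)
    if json_type != 0x4E4F534A:  # b"JSON"
        raise ValueError("first chunk is not JSON")
    doc = json.loads(data[20:20 + json_len])
    scenes = doc.get("scenes") or []
    return (scenes[0].get("extras") or {}) if scenes else {}
```

This is handy for verifying an injected GLB in tests without round-tripping through Blender.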
+35 -9
@@ -223,8 +223,14 @@ def _import_glb(glb_file):
                     max(v.y for v in all_corners),
                     max(v.z for v in all_corners)))
     center = (mins + maxs) * 0.5
-    for p in parts:
-        p.location -= center
+    # Move root objects (parentless) to centre. Adjusting a child's local
+    # .location by a world-space vector gives wrong results when the GLB has
+    # Empty parent nodes (OCC assembly hierarchy). Shifting the root moves
+    # the entire hierarchy correctly.
+    all_imported = list(bpy.context.selected_objects)
+    root_objects = [o for o in all_imported if o.parent is None]
+    for obj in root_objects:
+        obj.location -= center
 
     return parts
@@ -244,9 +250,10 @@ def _resolve_part_name(index, part_obj, part_names_ordered):
 def _apply_material_library(parts, mat_lib_path, mat_map, part_names_ordered=None):
     """Append materials from library .blend and assign to parts via material_map.
 
-    With per-part STL import, Blender objects are named after STEP parts,
-    so matching is by name (stripping Blender .NNN suffix for duplicates).
-    Falls back to part_names_ordered index-based matching for combined-STL mode.
+    Matching priority per part:
+      1. GLB object name (strip Blender .NNN suffix + OCC _AF0/_AF1 suffix)
+      2. Prefix fallback (longest mat_map key that is a prefix of / contains part name)
+      3. Index-based via part_names_ordered (also strips _AF suffix)
 
     mat_map: {part_name_lower: material_name}
     Parts without a match keep their current material.
@@ -286,16 +293,35 @@ def _apply_material_library(parts, mat_lib_path, mat_map, part_names_ordered=Non
     # secondary: index-based via part_names_ordered (combined STL fallback)
     assigned_count = 0
     for i, part in enumerate(parts):
-        # Try name-based matching first (strip Blender .NNN suffix)
+        # 1. Name-based: strip Blender .NNN suffix, then OCC _AF0/_AF1 suffix
         base_name = _re.sub(r'\.\d{3}$', '', part.name)
+        _prev = None
+        while _prev != base_name:
+            _prev = base_name
+            base_name = _re.sub(r'_AF\d+$', '', base_name, flags=_re.IGNORECASE)
         part_key = base_name.lower().strip()
         mat_name = mat_map.get(part_key)
 
-        # Fall back to index-based matching via part_names_ordered
+        # 2. Prefix fallback: longest mat_map key that is a prefix/suffix match
+        if not mat_name:
+            for key, val in sorted(mat_map.items(), key=lambda x: len(x[0]), reverse=True):
+                if len(key) >= 5 and len(part_key) >= 5 and (
+                    part_key.startswith(key) or key.startswith(part_key)
+                ):
+                    mat_name = val
+                    break
+
+        # 3. Index-based fallback via part_names_ordered (also strips _AF suffix)
         if not mat_name and part_names_ordered and i < len(part_names_ordered):
             step_name = part_names_ordered[i]
-            part_key = step_name.lower().strip()
-            mat_name = mat_map.get(part_key)
+            step_key = step_name.lower().strip()
+            mat_name = mat_map.get(step_key)
+            if not mat_name:
+                _p2 = None
+                while _p2 != step_key:
+                    _p2 = step_key
+                    step_key = _re.sub(r'_af\d+$', '', step_key)
+                mat_name = mat_map.get(step_key)
 
         if mat_name and mat_name in appended:
             part.data.materials.clear()
+35 -9
@@ -193,8 +193,14 @@ def _import_glb(glb_file):
                     max(v.y for v in all_corners),
                     max(v.z for v in all_corners)))
     center = (mins + maxs) * 0.5
-    for p in parts:
-        p.location -= center
+    # Move root objects (parentless) to centre. Adjusting a child's local
+    # .location by a world-space vector gives wrong results when the GLB has
+    # Empty parent nodes (OCC assembly hierarchy). Shifting the root moves
+    # the entire hierarchy correctly.
+    all_imported = list(bpy.context.selected_objects)
+    root_objects = [o for o in all_imported if o.parent is None]
+    for obj in root_objects:
+        obj.location -= center
 
     return parts
@@ -214,9 +220,10 @@ def _resolve_part_name(index, part_obj, part_names_ordered):
 def _apply_material_library(parts, mat_lib_path, mat_map, part_names_ordered=None):
     """Append materials from library .blend and assign to parts via material_map.
 
-    With per-part STL import, Blender objects are named after STEP parts,
-    so matching is by name (stripping Blender .NNN suffix for duplicates).
-    Falls back to part_names_ordered index-based matching for combined-STL mode.
+    Matching priority per part:
+      1. GLB object name (strip Blender .NNN suffix + OCC _AF0/_AF1 suffix)
+      2. Prefix fallback (longest mat_map key that is a prefix of / contains part name)
+      3. Index-based via part_names_ordered (also strips _AF suffix)
 
     mat_map: {part_name_lower: material_name}
     Parts without a match keep their current material.
@@ -256,16 +263,35 @@ def _apply_material_library(parts, mat_lib_path, mat_map, part_names_ordered=Non
     # secondary: index-based via part_names_ordered (combined STL fallback)
     assigned_count = 0
    for i, part in enumerate(parts):
-        # Try name-based matching first (strip Blender .NNN suffix)
+        # 1. Name-based: strip Blender .NNN suffix, then OCC _AF0/_AF1 suffix
         base_name = _re.sub(r'\.\d{3}$', '', part.name)
+        _prev = None
+        while _prev != base_name:
+            _prev = base_name
+            base_name = _re.sub(r'_AF\d+$', '', base_name, flags=_re.IGNORECASE)
         part_key = base_name.lower().strip()
         mat_name = mat_map.get(part_key)
 
-        # Fall back to index-based matching via part_names_ordered
+        # 2. Prefix fallback: longest mat_map key that is a prefix/suffix match
+        if not mat_name:
+            for key, val in sorted(mat_map.items(), key=lambda x: len(x[0]), reverse=True):
+                if len(key) >= 5 and len(part_key) >= 5 and (
+                    part_key.startswith(key) or key.startswith(part_key)
+                ):
+                    mat_name = val
+                    break
+
+        # 3. Index-based fallback via part_names_ordered (also strips _AF suffix)
         if not mat_name and part_names_ordered and i < len(part_names_ordered):
             step_name = part_names_ordered[i]
-            part_key = step_name.lower().strip()
-            mat_name = mat_map.get(part_key)
+            step_key = step_name.lower().strip()
+            mat_name = mat_map.get(step_key)
+            if not mat_name:
+                _p2 = None
+                while _p2 != step_key:
+                    _p2 = step_key
+                    step_key = _re.sub(r'_af\d+$', '', step_key)
+                mat_name = mat_map.get(step_key)
 
         if mat_name and mat_name in appended:
             part.data.materials.clear()