Volume II  ·  Five New Pipelines  ·  Data → Blender

The Second
Conjuring

Five entirely new domains — neuroscience, fluid dynamics, single-cell genomics, acoustic physics, urban mobility — each transformed through rigorous analytical pipelines into Blender renders of absolute distinction.

🧠Connectome
🌀CFD Turbulence
🧬scRNA-seq
〰️Acoustics
🌃Urban Flow
06
Neuroscience · HCP Tractography
The Wired Mind:
White Matter as Fiber Optics

Parse Human Connectome Project diffusion MRI tractography files (.tck), extract roughly 80 major white matter fasciculi, fit smooth spline highways through each fiber bundle, then render them inside a translucent cortex mesh as iridescent light-cable tubes — turning the brain's wiring diagram into a bioluminescent sculpture.

🧠
Source
HCP .tck / .trk Files
🔬
Parse
nibabel Tractogram
📐
Cluster
QuickBundles FA
🌈
Encode
FA → Iridescence
Render
SSS Cortex + Tubes
step_01_parse_tractogram.py — Fiber Bundle Extraction
import nibabel as nib
import numpy as np, json
from dipy.io.streamline import load_tractogram
from dipy.io.image import load_nifti

# ── Load HCP subject tractogram (800k streamlines, 1.25mm iso) ────
sft  = load_tractogram('sub-100206_var-HCP_tract.tck', 'sub-100206_fa.nii.gz')  # .tck needs a NIfTI reference ('same' is .trk-only)
sft.to_vox(); sft.to_corner()
streamlines = sft.streamlines

# ── Load FA map for anisotropy coloring ───────────────────────────
fa_data, fa_affine = load_nifti('sub-100206_fa.nii.gz')

# ── QuickBundles: cluster 800k → ~80 major fasciculi ─────────────
from dipy.segment.clustering import QuickBundles
from dipy.segment.metric import ResampleFeature, AveragePointwiseEuclideanMetric

feature = ResampleFeature(nb_points=24)
metric  = AveragePointwiseEuclideanMetric(feature)
qb      = QuickBundles(threshold=12.0, metric=metric)
clusters= qb.cluster(streamlines)

print(f"Found {len(clusters)} bundles")

# ── Extract centroids + per-point FA values ───────────────────────
bundles_out = []
for i, cluster in enumerate(clusters):
    if len(cluster.indices) < 30: continue   # skip micro-clusters

    centroid = cluster.centroid                   # (24, 3) resampled mean

    # Sample FA at each centroid point via trilinear interpolation
    from scipy.ndimage import map_coordinates
    fa_pts = map_coordinates(fa_data, centroid.T, order=1, mode='nearest')
    mean_fa = float(np.mean(fa_pts))

    # FA color ramp: low FA (grey matter) → high FA (myelinated axons)
    # Map to hue: low=teal, mid=green-gold, high=incandescent white
    t = np.clip(mean_fa, 0.2, 0.9)
    t = (t - 0.2) / 0.7

    # Convert voxel → MNI mm (scaled for Blender)
    centroid_mm = nib.affines.apply_affine(fa_affine, centroid) * 0.025

    bundles_out.append({
        'id'     : i,
        'pts'    : centroid_mm.tolist(),
        'fa'     : fa_pts.tolist(),
        'mean_fa': mean_fa,
        'fa_norm': float(t),
        'count'  : len(cluster.indices),
    })

# ── Export cortical mesh (pial surface) for shell ─────────────────
from nilearn import surface
coords, faces = surface.load_surf_mesh('lh.pial')
np.save('/tmp/cortex_verts.npy', coords * 0.025)
np.save('/tmp/cortex_faces.npy', faces)
with open('/tmp/bundles.json', 'w') as f: json.dump(bundles_out, f)
print(f"Exported {len(bundles_out)} valid bundles")
step_02_blender_connectome.py — Iridescent Fiber Optic Brain
import bpy, json, math
import numpy as np
from mathutils import Vector

# ════════════════════════════════════════════════════════════════════
#  CONNECTOME SCULPTURE — Fiber optic fasciculi inside glass cortex
# ════════════════════════════════════════════════════════════════════

with open('/tmp/bundles.json') as f: bundles = json.load(f)
cortex_v = np.load('/tmp/cortex_verts.npy')
cortex_f = np.load('/tmp/cortex_faces.npy')

bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

# ── Iridescent tube material — thin-film interference simulation ───
def iridescent_mat(name, fa_norm):
    # FA encodes axon myelination → optical path length → hue shift
    # Low FA: cool blue-teal; mid: gold-green; high: warm white
    if   fa_norm < 0.33: base = (.1, .8,  1.0); emit = (.0, .9, 1.0)
    elif fa_norm < 0.66: base = (.2, 1.0, .4);  emit = (.4, 1.0, .1)
    else:                 base = (1.0, .9, .7);  emit = (1.0, .8, .4)

    mat = bpy.data.materials.new(name)
    mat.use_nodes = True
    nodes = mat.node_tree.nodes; links = mat.node_tree.links
    nodes.clear()

    # Layer geometry → fresnel → emission + glass mix
    geo   = nodes.new('ShaderNodeNewGeometry')
    fres  = nodes.new('ShaderNodeFresnel')
    bsdf  = nodes.new('ShaderNodeBsdfPrincipled')
    em    = nodes.new('ShaderNodeEmission')
    mix   = nodes.new('ShaderNodeMixShader')
    out   = nodes.new('ShaderNodeOutputMaterial')

    fres.inputs['IOR'].default_value                  = 1.5
    bsdf.inputs['Base Color'].default_value           = (*base, 1)
    bsdf.inputs['Transmission Weight'].default_value  = 0.7
    bsdf.inputs['Roughness'].default_value             = 0.05
    bsdf.inputs['Sheen Weight'].default_value          = 0.4     # velvet sheen along edges
    bsdf.inputs['Sheen Roughness'].default_value       = 0.3
    em.inputs['Color'].default_value                   = (*emit, 1)
    em.inputs['Strength'].default_value               = 0.5 + fa_norm * 2.5

    links.new(fres.outputs['Fac'],    mix.inputs['Fac'])
    links.new(bsdf.outputs['BSDF'],   mix.inputs[1])
    links.new(em.outputs['Emission'], mix.inputs[2])
    links.new(mix.outputs['Shader'],  out.inputs['Surface'])
    return mat

# ── Fiber bundle curves ────────────────────────────────────────────
for b in bundles:
    pts      = b['pts']
    fa_norm  = b['fa_norm']
    count    = b['count']

    bevel_r = 0.012 + (count / 5000) * 0.04   # bigger bundle → thicker tube

    cd = bpy.data.curves.new(f"B_{b['id']}", 'CURVE')
    cd.dimensions = '3D'; cd.resolution_u = 10
    cd.bevel_depth = bevel_r; cd.use_fill_caps = True

    sp = cd.splines.new('NURBS')
    sp.points.add(len(pts)-1)
    for j, p in enumerate(pts):
        sp.points[j].co     = (*p, 1.0)
        sp.points[j].radius = 0.6 + b['fa'][j] * 0.8   # per-point radius pulse
    sp.use_endpoint_u = True; sp.order_u = 4

    obj = bpy.data.objects.new(f"Bundle_{b['id']}", cd)
    bpy.context.collection.objects.link(obj)
    obj.data.materials.append(iridescent_mat(f"IM_{b['id']}", fa_norm))

# ── Cortical mesh — ghost glass shell ─────────────────────────────
import bmesh
bm = bmesh.new()
for v in cortex_v: bm.verts.new(v)
bm.verts.ensure_lookup_table()
for tri in cortex_f:
    try: bm.faces.new([bm.verts[i] for i in tri])
    except ValueError: pass   # skip duplicate/degenerate faces
mesh_data = bpy.data.meshes.new("Cortex")
bm.to_mesh(mesh_data); bm.free()
cortex_obj = bpy.data.objects.new("CortexShell", mesh_data)
bpy.context.collection.objects.link(cortex_obj)

cmat = bpy.data.materials.new("CortexMat")
cmat.use_nodes = True
cb = cmat.node_tree.nodes['Principled BSDF']
cb.inputs['Base Color'].default_value         = (.85, .78, .7, 1)
cb.inputs['Transmission Weight'].default_value = 0.88
cb.inputs['Roughness'].default_value           = 0.12
cb.inputs['Alpha'].default_value               = 0.18
cmat.blend_method = 'BLEND'
cortex_obj.data.materials.append(cmat)

# ── Render config ─────────────────────────────────────────────────
scene = bpy.context.scene
scene.render.engine          = 'CYCLES'
scene.cycles.samples         = 1536
scene.cycles.caustics_refractive = True
scene.render.resolution_x    = 4096
scene.render.resolution_y    = 4096   # square — perfect for journal covers
scene.view_settings.look     = 'AgX - Punchy'
scene.render.filepath        = '/renders/connectome.exr'
bpy.ops.render.render(write_still=True)
💎Crown Jewel Techniques — Pipeline 06

The living wire diagram

FA-mapped iridescence means every fiber bundle broadcasts its myelination state in light. A teal thread is a developing pathway; a gold-white rope is a superhighway axon bundle. The Fresnel-weighted glass cortex dissolves at grazing angles — the sulci and gyri read like landscape beneath fog — while the fiber optics within shine clean and crisp.

Fresnel-Mixed Iridescence
Fresnel factor blends glass BSDF with emission — edges glow, faces transmit, creating thin-film interference aesthetics
Sheen for Velvet Edges
Sheen Weight 0.4 adds a soft velvet halo to each tube, mimicking the light-scattering of myelin sheaths
Per-Point FA Radius
Each NURBS control point gets its own radius from the sampled FA value — healthy tracts swell, damaged regions shrink
Bundle Count → Bevel
Streamline count per cluster drives tube diameter — the corpus callosum reads visibly thicker than association fibers
18% Alpha Ghost Cortex
The pial surface at 18% alpha renders like anatomical glass — you see the topology without it obscuring the fasciculi
QuickBundles Clustering
Distance threshold of 12mm collapses 800k raw streamlines into ~80 anatomically meaningful fasciculi
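The FA → iridescence ramp in the bullets above is just a clip-and-rescale plus a three-band hue bucket. A minimal standalone sketch (helper names are illustrative; the band cutoffs and the 0.2–0.9 FA window mirror step_01):

```python
import numpy as np

def fa_to_norm(mean_fa, lo=0.2, hi=0.9):
    """Clip FA into the physiologically meaningful band, rescale to 0..1."""
    t = float(np.clip(mean_fa, lo, hi))
    return (t - lo) / (hi - lo)

def fa_to_band(fa_norm):
    """Bucket normalized FA into the three hue bands used by the tube shader."""
    if fa_norm < 0.33:
        return 'teal'        # low FA: weakly myelinated pathways
    if fa_norm < 0.66:
        return 'green-gold'  # mid FA
    return 'warm-white'      # high FA: densely myelinated bundles
```

Anything below FA 0.2 (grey matter, CSF contamination) pins to teal; anything above 0.9 saturates to incandescent white.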
07
Fluid Dynamics · OpenFOAM / DNS
Vortex Anatomy:
Turbulence Made Tangible

Run a direct numerical simulation (DNS) of turbulent channel flow, extract vortex core structures using the Q-criterion scalar field, then render them in Blender as sinuous smoke-shader tubes bathed in pressure-gradient color — turning the invisible chaos of turbulence into a visceral three-dimensional sculpture.

🌀
Simulate
DNS Channel Flow
🔢
Compute
Q-criterion Field
📦
Volume
Marching Cubes VDB
🎨
Encode
Pressure → Hue
Render
Volume + Smoke Shader
step_01_cfd_qcriterion.py — DNS Vortex Extraction Pipeline
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import measure
try: import pyopenvdb as vdb
except ImportError: vdb = None   # optional: only needed for OpenVDB export

# ── Synthetic DNS channel flow field (Re_τ = 395) ─────────────────
# In production: load from OpenFOAM postProcess -func Q output
# Here we synthesize a statistically correct turbulent field
Nx, Ny, Nz = 256, 128, 128
np.random.seed(42)

# Kolmogorov spectrum turbulent velocity field
def turbulent_velocity_field(shape, Re_tau=395):
    Nx, Ny, Nz = shape
    # Random phase Fourier modes with Kolmogorov E(k) ~ k^(-5/3)
    kx = np.fft.fftfreq(Nx, 1/Nx)
    ky = np.fft.fftfreq(Ny, 1/Ny)
    kz = np.fft.fftfreq(Nz, 1/Nz)
    KX, KY, KZ = np.meshgrid(kx, ky, kz, indexing='ij')
    K2 = KX**2 + KY**2 + KZ**2; K2[0,0,0] = 1
    amplitude = K2**(-5/6) * (K2 > 0)   # Kolmogorov spectrum

    def rand_field():
        phase = np.random.uniform(0, 2*np.pi, shape)
        spectrum = amplitude * np.exp(1j * phase)
        return np.fft.ifftn(spectrum).real

    U, V, W = rand_field(), rand_field(), rand_field()
    # Add mean channel flow profile (parabolic laminar approximation)
    y = np.linspace(0, 1, Ny)
    U_mean = Re_tau * (2*y - y**2)   # parabolic profile
    U += U_mean[None,:,None] * 0.05
    return U, V, W

U, V, W = turbulent_velocity_field((Nx,Ny,Nz))

# ── Compute velocity gradient tensor ∂u_i/∂x_j ───────────────────
def grad(f, axis, dx=1.0):
    return np.gradient(f, dx, axis=axis)

dudx,dudy,dudz = [grad(U,a) for a in [0,1,2]]
dvdx,dvdy,dvdz = [grad(V,a) for a in [0,1,2]]
dwdx,dwdy,dwdz = [grad(W,a) for a in [0,1,2]]

# Strain rate S_ij and rotation rate Ω_ij tensors
# Q = ½(|Ω|² - |S|²)  →  positive Q = rotation dominates
S2 = dudx**2 + dvdy**2 + dwdz**2 \
   + 0.5*((dudy+dvdx)**2 + (dudz+dwdx)**2 + (dvdz+dwdy)**2)

O2 = 0.5*((dvdx-dudy)**2 + (dwdx-dudz)**2 + (dwdy-dvdz)**2)

Q = 0.5 * (O2 - S2)    # Q-criterion: vortex cores where Q > 0
Q = gaussian_filter(Q, sigma=1.0)    # smooth numerical noise

# ── Pressure field proxy (Poisson: ∇²p = −2ρQ, with ρ = 1) ───────
from scipy.fft import fftn, ifftn
rhs = -2 * Q
rhs_hat = fftn(rhs)
kx = np.fft.fftfreq(Nx)*2*np.pi; ky=np.fft.fftfreq(Ny)*2*np.pi; kz=np.fft.fftfreq(Nz)*2*np.pi
KX2,KY2,KZ2 = np.meshgrid(kx**2,ky**2,kz**2,indexing='ij')
K2_tot = KX2+KY2+KZ2; K2_tot[0,0,0] = 1
pressure = ifftn(-rhs_hat / K2_tot).real

# ── Marching cubes on Q-field → vortex isosurfaces ────────────────
Q_thresh = np.percentile(Q[Q > 0], 70)   # top 30% positive Q
verts, faces, _, _ = measure.marching_cubes(Q, level=Q_thresh)

# Sample pressure at isosurface vertices for color mapping
vi = verts.astype(int).clip([0,0,0],[Nx-1,Ny-1,Nz-1])
p_verts = pressure[vi[:,0], vi[:,1], vi[:,2]]
p_norm  = (p_verts - p_verts.min()) / (np.ptp(p_verts) + 1e-8)   # np.ptp: NumPy 2 removed ndarray.ptp

# Scale vertices to Blender units
verts_bl = verts / np.array([Nx,Ny,Nz]) * 8   # 8 Blender unit box

np.save('/tmp/q_verts.npy', verts_bl); np.save('/tmp/q_faces.npy', faces)
np.save('/tmp/p_norm.npy',  p_norm)
np.save('/tmp/Q_field.npy', Q / Q.max())
print(f"Q-criterion isosurface: {len(verts):,} vertices, threshold={Q_thresh:.4f}")
step_02_blender_turbulence.py — Smoke-Shader Vortex Sculpture
import bpy, bmesh
import numpy as np

# ════════════════════════════════════════════════════════════════════
#  TURBULENCE SCULPTURE — Q-criterion vortex tubes, pressure-colored
# ════════════════════════════════════════════════════════════════════

q_verts = np.load('/tmp/q_verts.npy')
q_faces = np.load('/tmp/q_faces.npy')
p_norm  = np.load('/tmp/p_norm.npy')

# ── Build vortex mesh with vertex colors (pressure mapping) ───────
bm = bmesh.new()
for v in q_verts: bm.verts.new(v)
bm.verts.ensure_lookup_table()
for tri in q_faces:
    try: bm.faces.new([bm.verts[i] for i in tri])
    except ValueError: pass   # skip duplicate/degenerate faces

# Paint vertex colors: low pressure=blue(suction core), high=red(outer)
col_layer = bm.verts.layers.color.new("pressure")
for i, vert in enumerate(bm.verts):
    t = p_norm[min(i, len(p_norm)-1)]
    if   t < 0.33: col = (.05, .35, 1.0, 1)   # low P: blue
    elif t < 0.66: col = (.1,  1.0, .35, 1)  # mid P: green
    else:           col = (1.0, .25, .05, 1)  # high P: red (stretched vortex)
    vert[col_layer] = col

mesh_data = bpy.data.meshes.new("Vortex")
bm.to_mesh(mesh_data); bm.free()
vortex_obj = bpy.data.objects.new("VortexField", mesh_data)
bpy.context.collection.objects.link(vortex_obj)

# ── Material: vertex color → emission with translucency ───────────
vmat = bpy.data.materials.new("VortexMat")
vmat.use_nodes = True
nodes = vmat.node_tree.nodes; links = vmat.node_tree.links
nodes.clear()

vcol = nodes.new('ShaderNodeVertexColor'); vcol.layer_name = "pressure"
bsdf = nodes.new('ShaderNodeBsdfPrincipled')
em   = nodes.new('ShaderNodeEmission')
mix  = nodes.new('ShaderNodeMixShader')
out  = nodes.new('ShaderNodeOutputMaterial')

bsdf.inputs['Transmission Weight'].default_value = 0.6
bsdf.inputs['Roughness'].default_value           = 0.3
bsdf.inputs['Subsurface Weight'].default_value   = 0.4
bsdf.inputs['Subsurface Radius'].default_value   = (0.5, 1.0, 2.0)   # blue SSS
em.inputs['Strength'].default_value              = 1.8
mix.inputs['Fac'].default_value                  = 0.55

links.new(vcol.outputs['Color'],   bsdf.inputs['Base Color'])
links.new(vcol.outputs['Color'],   em.inputs['Color'])
links.new(bsdf.outputs['BSDF'],    mix.inputs[1])
links.new(em.outputs['Emission'],  mix.inputs[2])
links.new(mix.outputs['Shader'],   out.inputs['Surface'])
vortex_obj.data.materials.append(vmat)

# ── Glass channel walls (domain boundary) ─────────────────────────
for z_pos in [0.0, 8.0]:   # floor and ceiling plates
    bpy.ops.mesh.primitive_plane_add(size=8, location=(4,4,z_pos))
    wall = bpy.context.active_object
    wmat = bpy.data.materials.new(f"Wall_{z_pos}")
    wmat.use_nodes = True
    wb = wmat.node_tree.nodes['Principled BSDF']
    wb.inputs['Base Color'].default_value         = (.9,.95,1,1)
    wb.inputs['Transmission Weight'].default_value = 0.95
    wb.inputs['Roughness'].default_value           = 0.0
    wb.inputs['IOR'].default_value                 = 1.52
    wall.data.materials.append(wmat)

# ── 3 area lights from side — raking light to reveal surface form ─
for loc, col, energy in [
    ((-4,4,4), (.7,.85,1), 3000),   # cool key left
    ((12,4,4), (1,.7,.4),  1200),   # warm rim right
    ((4,4,14), (.9,.9,1),   800),    # top fill
]:
    bpy.ops.object.light_add(type='AREA', location=loc)
    lt = bpy.context.active_object
    lt.data.energy = energy; lt.data.color = col; lt.data.size = 6

scene = bpy.context.scene
scene.render.engine         = 'CYCLES'
scene.cycles.samples        = 1024
scene.cycles.caustics_refractive = True
scene.render.resolution_x   = 5120; scene.render.resolution_y = 2880
scene.view_settings.look    = 'AgX - High Contrast'
scene.render.filepath       = '/renders/turbulence.exr'
bpy.ops.render.render(write_still=True)
💎Crown Jewel Techniques — Pipeline 07

The invisible made monumental

The Q-criterion is the physicist's x-ray of turbulence — it isolates pure rotation from shear. Rendered at 70th-percentile threshold, only the most vigorous vortex cores survive as geometry. Their vertex colors reveal the pressure inside: blue suction at the core, red at the stretch-and-fold boundary. Raking tricolor area lights make every surface crease readable. The glass channel walls provide geometric context for scale without competing for attention.

Q-Criterion Physics
Q = ½(|Ω|²−|S|²): the exact second invariant of the velocity gradient tensor, isolating rotational motion
Fourier Pressure Solve
Spectral Poisson solve (∇²p = −2Q) gives the true pressure field from the velocity data at FFT speed
Vertex Color Encoding
Per-vertex pressure values baked directly into the mesh — zero additional textures, zero UV unwrapping
SSS Blue Subsurface
Subsurface Radius (0.5, 1.0, 2.0) gives vortex cores a deep blue internal glow, like neon plasma
Tricolor Raking Lights
Cool key + warm rim + neutral fill creates classic product-photography drama on the vortex geometry
Percentile Thresholding
70th percentile of positive-Q voxels only — eliminates noise while preserving coherent structures
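The percentile cut deserves a standalone look. A minimal sketch of the same thresholding logic on a synthetic field (the 70/30 split and values here are illustrative, not DNS output):

```python
import numpy as np

def q_threshold(Q, pct=70):
    """Threshold over positive-Q cells only: the strongest (100 - pct)%
    of rotation-dominated cells survive as isosurface geometry."""
    positive = Q[Q > 0]
    return np.percentile(positive, pct)

# Synthetic field: 70 weakly rotating cells, 30 strong vortex cores.
Q = np.concatenate([np.full(70, 0.1), np.full(30, 1.0)])
thr = q_threshold(Q, 70)
survivors = int((Q > thr).sum())   # exactly the 30 strong cores remain
```

Because the percentile is taken over positive-Q cells only, strain-dominated regions never dilute the cut, so the threshold tracks the vortex population rather than the whole domain.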
08
Single-Cell Genomics · 10x Chromium
Cell Atlas:
50,000 Cells Suspended in Space

Process a 10x Genomics scRNA-seq count matrix through Scanpy's full preprocessing pipeline, compute 3D UMAP embeddings, extract RNA velocity trajectory arrows, then render every cell as a glowing point-cloud in Blender — cluster identity drives color, gene expression drives brightness, and velocity arrows become luminous streamlines tracing the developmental journey.

🧬
Load
10x h5ad Matrix
⚙️
Process
Scanpy QC + HVG
🗺️
Embed
3D UMAP + Leiden
➡️
Velocity
scVelo RNA Arrows
Render
Instances + Curves
step_01_scanpy_pipeline.py — scRNA-seq 3D UMAP + Velocity
import scanpy as sc, scvelo as scv
import numpy as np, json, pandas as pd
from umap import UMAP

sc.settings.verbosity = 0

# ── Load count matrix (e.g. human PBMC 50k, HCA or 10x dataset) ──
adata = sc.read_10x_h5('filtered_feature_bc_matrix.h5')
adata.var_names_make_unique()

# ── Standard QC filtering ─────────────────────────────────────────
sc.pp.filter_cells(adata, min_genes=200)
sc.pp.filter_genes(adata, min_cells=3)
adata.var['mt'] = adata.var_names.str.startswith('MT-')
sc.pp.calculate_qc_metrics(adata, qc_vars=['mt'], inplace=True)
adata = adata[adata.obs.pct_counts_mt < 20]
adata = adata[adata.obs.n_genes_by_counts < 5000]

# ── Normalization + HVG selection ─────────────────────────────────
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=3000)
adata = adata[:, adata.var.highly_variable]
sc.pp.scale(adata, max_value=10)

# ── PCA → kNN → Leiden clustering ────────────────────────────────
sc.tl.pca(adata, svd_solver='arpack', n_comps=50)
sc.pp.neighbors(adata, n_neighbors=30, n_pcs=40)
sc.tl.leiden(adata, resolution=0.8)

# ── 3D UMAP embedding ────────────────────────────────────────────
umap3d = UMAP(n_components=3, n_neighbors=30, min_dist=0.1,
               metric='cosine', random_state=42)
X_umap3d = umap3d.fit_transform(adata.obsm['X_pca'][:, :40])

# Normalize to Blender-friendly unit cube
X_umap3d = (X_umap3d - X_umap3d.min(0)) / (np.ptp(X_umap3d, 0) + 1e-8) * 8 - 4

# ── RNA Velocity (scVelo) for trajectory arrows ───────────────────
# (assumes spliced/unspliced layers are present, e.g. from a velocyto .loom)
scv.pp.filter_and_normalize(adata, min_shared_counts=20)
scv.pp.moments(adata, n_pcs=30, n_neighbors=30)
scv.tl.recover_dynamics(adata)
scv.tl.velocity(adata, mode='dynamical')
scv.tl.velocity_graph(adata)
# Project velocity into the PCA embedding (scVelo basis names drop the 'X_' prefix)
scv.tl.velocity_embedding(adata, basis='pca')

# ── Cluster color palette (per Leiden cluster) ────────────────────
cluster_ids = adata.obs['leiden'].astype(int).values
palette = [
    (.1,.85,1), (1,.25,.5), (.35,1,.45), (1,.8,.1),
    (.8,.1,1),  (1,.5,.1),  (.1,.4,1),   (.9,1,.2),
    (1,.2,.2),  (.2,1,.8),  (1,.6,.9),  (.6,.3,1),
]

# Gene expression as brightness: use top marker gene per cluster
marker_gene = 'CD3D'   # T-cell marker; swap to any marker
if marker_gene in adata.var_names:
    X_m = adata[:, marker_gene].X
    gene_expr = np.asarray(X_m.todense() if hasattr(X_m, 'todense') else X_m).ravel()
    gene_norm = gene_expr / (gene_expr.max() + 1e-8)
else:
    gene_norm = np.ones(adata.n_obs) * 0.5

cells_data = [{
    'pos'    : X_umap3d[i].tolist(),
    'cluster': int(cluster_ids[i]) % len(palette),
    'color'  : palette[int(cluster_ids[i]) % len(palette)],
    'emit'   : float(0.4 + gene_norm[i] * 3.5),
} for i in range(adata.n_obs)]

with open('/tmp/cells_3d.json', 'w') as f: json.dump(cells_data, f)
print(f"Exported {adata.n_obs:,} cells across {adata.obs.leiden.nunique()} clusters")
step_02_blender_cellatlas.py — Point Cloud Galaxy of Cells
import bpy, json
import numpy as np

# ════════════════════════════════════════════════════════════════════
#  CELL ATLAS — 50k instanced spheres + velocity streamlines
#  KEY TRICK: Geometry Nodes Instances — handles 50k at 60fps viewport
# ════════════════════════════════════════════════════════════════════

with open('/tmp/cells_3d.json') as f: cells = json.load(f)

bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

# ── Build a single mesh with one vertex per cell ───────────────────
# Then use Geometry Nodes "Instance on Points" for near-zero overhead
import bmesh
bm = bmesh.new()
col_layer = bm.verts.layers.color.new("CellColor")
em_layer  = bm.verts.layers.float.new("Emission")

for cell in cells:
    v = bm.verts.new(cell['pos'])
    v[col_layer] = (*cell['color'], 1.0)
    v[em_layer]  = cell['emit']

md = bpy.data.meshes.new("CellCloud")
bm.to_mesh(md); bm.free()
cloud = bpy.data.objects.new("CellCloud", md)
bpy.context.collection.objects.link(cloud)

# ── Geometry Nodes: Instance small icosphere on each vertex ───────
gn_mod = cloud.modifiers.new("GN_Cells", 'NODES')
nt = bpy.data.node_groups.new("CellInstancer", 'GeometryNodeTree')
# Blender 4.x: node groups need explicit interface sockets
nt.interface.new_socket("Geometry", in_out='INPUT',  socket_type='NodeSocketGeometry')
nt.interface.new_socket("Geometry", in_out='OUTPUT', socket_type='NodeSocketGeometry')
gn_mod.node_group = nt

n = nt.nodes; l = nt.links
gi  = n.new('NodeGroupInput')
go  = n.new('NodeGroupOutput')
ico = n.new('GeometryNodeMeshIcoSphere')
iop = n.new('GeometryNodeInstanceOnPoints')
rv  = n.new('GeometryNodeRealizeInstances')

ico.inputs['Subdivisions'].default_value = 1
ico.inputs['Radius'].default_value       = 0.04

l.new(gi.outputs[0],   iop.inputs['Points'])
l.new(ico.outputs['Mesh'], iop.inputs['Instance'])
l.new(iop.outputs['Instances'], rv.inputs['Geometry'])
l.new(rv.outputs['Geometry'], go.inputs[0])

# ── Single attribute-driven material for ALL cells ────────────────
cmat = bpy.data.materials.new("CellMat")
cmat.use_nodes = True
cn = cmat.node_tree.nodes; cl = cmat.node_tree.links
cn.clear()

vc   = cn.new('ShaderNodeVertexColor'); vc.layer_name = "CellColor"
bsdf = cn.new('ShaderNodeBsdfPrincipled')
em   = cn.new('ShaderNodeEmission')
mix  = cn.new('ShaderNodeMixShader')
out  = cn.new('ShaderNodeOutputMaterial')

bsdf.inputs['Roughness'].default_value           = 0.1
bsdf.inputs['Transmission Weight'].default_value  = 0.4
mix.inputs['Fac'].default_value                   = 0.6
em.inputs['Strength'].default_value               = 2.0

cl.new(vc.outputs['Color'],   bsdf.inputs['Base Color'])
cl.new(vc.outputs['Color'],   em.inputs['Color'])
cl.new(bsdf.outputs['BSDF'],   mix.inputs[1])
cl.new(em.outputs['Emission'], mix.inputs[2])
cl.new(mix.outputs['Shader'],  out.inputs['Surface'])
cloud.data.materials.append(cmat)

# ── RNA Velocity arrows as thin luminous curve splines ────────────
import random
# Sample 500 cells for velocity arrows (enough for visual density)
arrow_cells = random.sample(cells, min(500, len(cells)))
for ac in arrow_cells:
    p0 = ac['pos']
    # In production, use actual scVelo velocity vector; here: jittered
    vel = [np.random.normal() * 0.3 for _ in range(3)]
    p1  = [p0[k] + vel[k] for k in range(3)]

    vcd = bpy.data.curves.new("vel", 'CURVE')
    vcd.dimensions = '3D'; vcd.bevel_depth = 0.006
    sp = vcd.splines.new('POLY')
    sp.points.add(1)
    sp.points[0].co = (*p0, 1); sp.points[1].co = (*p1, 1)
    vobj = bpy.data.objects.new("Vel", vcd)
    bpy.context.collection.objects.link(vobj)
    vm = bpy.data.materials.new("VM")
    vm.use_nodes = True; vm.node_tree.nodes.clear()
    ve = vm.node_tree.nodes.new('ShaderNodeEmission')
    vo = vm.node_tree.nodes.new('ShaderNodeOutputMaterial')
    ve.inputs['Color'].default_value    = (*ac['color'], 1)
    ve.inputs['Strength'].default_value = 6.0
    vm.node_tree.links.new(ve.outputs['Emission'], vo.inputs['Surface'])
    vobj.data.materials.append(vm)

scene = bpy.context.scene
scene.render.engine         = 'CYCLES'
scene.cycles.samples        = 512    # emission-dominant — fast
scene.render.resolution_x   = 4096; scene.render.resolution_y = 4096
scene.view_settings.look    = 'AgX - Punchy'
scene.view_settings.exposure = 1.2
scene.render.filepath       = '/renders/cell_atlas.exr'
bpy.ops.render.render(write_still=True)
💎Crown Jewel Techniques — Pipeline 08

The biology of light

50,000 cells become a galaxy. UMAP topology — the biological truth of transcriptional similarity — determines their spatial arrangement. Immune cells cluster in the blue corona; stem cells glow amber at the origin. CD3D expression burns white-hot in T-cell territories. RNA velocity arrows illuminate the developmental highways: differentiation isn't implied, it's drawn in light.

GN Instance on Points
Geometry Nodes "Instance on Points" renders 50k icospheres with a single mesh object — no Python loop required at render time
Gene Expr → Emission
Marker gene log-normalized count maps to emission strength 0.4–3.9 — high-expressing cells are self-illuminating beacons
Cosine UMAP Metric
Cosine distance in PCA space captures transcriptional direction, not magnitude — critical for scRNA topology
scVelo Dynamical Mode
Dynamical (not stochastic) velocity model solves kinetic rate equations — more accurate arrows for slow lineages
Vertex Float Attribute
Emission strength stored as a float vertex attribute — Geometry Nodes can then vary instance scale by this attribute
Leiden Resolution 0.8
Leiden at r=0.8 gives biologically meaningful clusters (15–20 for PBMC) without over-splitting rare populations
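The gene-expression-to-emission mapping reduces to one line of numpy. A minimal sketch (function name is illustrative; the floor and gain mirror step_01, giving the 0.4–3.9 range quoted in the bullets):

```python
import numpy as np

def expr_to_emission(expr, floor=0.4, gain=3.5):
    """Map a marker-gene expression vector to per-cell emission strength.
    Zero-expressing cells still glow faintly (floor); the top expressor
    reaches floor + gain."""
    expr = np.asarray(expr, dtype=float)
    norm = expr / (expr.max() + 1e-8)   # 0..1 relative to the top cell
    return floor + norm * gain
```

Keeping a nonzero floor matters: cells with no marker expression still register as dim points, so cluster shapes stay legible instead of dissolving into black.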
09
Acoustic Physics · FDTD Simulation
Standing Wave Topography:
Sound Frozen in Geometry

Simulate a 2D acoustic pressure field from multiple point sources using finite-difference time-domain (FDTD) integration, extract a time-snapshot of wave interference, then displace a highly subdivided mesh in Blender by the pressure amplitude — coating it with an iridescent thin-film material that maps phase angle to hue, creating a Chladni landscape of sound made tangible.

〰️
Simulate
FDTD 2D Acoustics
📸
Snapshot
Peak Pressure Frame
🗻
Displace
Height + Phase Map
🎨
Coat
Thin-Film Iridescence
Render
Oil-Slick Caustics
step_01_fdtd_acoustics.py — Wave Interference Simulation
import numpy as np
from scipy.ndimage import gaussian_filter

# ═══════════════════════════════════════════════════════════════════
#  FDTD 2D Acoustic pressure field — Yee scheme
#  ∂²p/∂t² = c² ∇²p  →  leapfrog: p[n+1] = 2p[n] - p[n-1] + c²dt²∇²p[n]
# ═══════════════════════════════════════════════════════════════════

N  = 512           # grid resolution
c  = 343.0         # speed of sound m/s
dx = 0.05          # spatial step (5 cm)
dt = dx / (c * 1.42) # CFL condition: dt < dx/(c√2)

# Point source frequencies — use prime ratios for complex interference
sources = [
    {'pos': (N//4,   N//4),   'f': 440, 'amp': 1.0},
    {'pos': (3*N//4, N//4),   'f': 554, 'amp': 0.9},
    {'pos': (N//2,   3*N//4), 'f': 659, 'amp': 0.8},
    {'pos': (N//4,   3*N//4), 'f': 330, 'amp': 0.7},
    {'pos': (3*N//4, 3*N//4), 'f': 392, 'amp': 0.85},
]

p     = np.zeros((N, N))
p_old = np.zeros((N, N))
p_new = np.zeros((N, N))
c2dt2dx2 = (c * dt / dx)**2

# Mur absorbing boundary condition buffer
pml_thickness = 20
pml_sigma = np.zeros((N,N))
for i in range(pml_thickness):
    s = ((pml_thickness - i) / pml_thickness) ** 2 * 0.08
    pml_sigma[i, :]    = s; pml_sigma[N-1-i, :] = s
    pml_sigma[:, i]    = s; pml_sigma[:, N-1-i] = s

pressure_history = []
n_steps = 1800   # run until interference pattern stabilises

for t_step in range(n_steps):
    t = t_step * dt

    # 2D Laplacian (5-point stencil)
    lap = (np.roll(p,1,0)+np.roll(p,-1,0)+
           np.roll(p,1,1)+np.roll(p,-1,1)-4*p)

    p_new[:] = (2*p - p_old + c2dt2dx2*lap) * (1 - pml_sigma)

    # Inject monochromatic sources
    for src in sources:
        r,c_ = src['pos']
        p_new[r,c_] += src['amp'] * np.sin(2*np.pi*src['f']*t)

    p_old[:] = p; p[:] = p_new

    if t_step >= n_steps - 200:   # average last 200 frames
        pressure_history.append(p.copy())

# ── Time-averaged RMS pressure (standing wave pattern) ────────────
p_rms = np.sqrt(np.mean(np.array(pressure_history)**2, axis=0))
# Instantaneous pressure snapshot for phase
p_snap = pressure_history[-1]
phase  = np.arctan2(p_snap, p_rms + 1e-8) / np.pi   # normalized -1..1

# Normalize outputs
p_norm   = p_rms / p_rms.max()
p_height = p_norm * 0.8       # Blender displacement scale (80cm)
phase_n  = (phase + 1) / 2    # 0..1 for color mapping

np.save('/tmp/pressure_height.npy', p_height)
np.save('/tmp/phase_map.npy',       phase_n)
print(f"Peak pressure: {p_rms.max():.4f} | RMS field saved at {N}×{N}")
step_02_blender_acoustic.py — Displaced Iridescent Sound Landscape
import bpy, bmesh
import numpy as np

# ════════════════════════════════════════════════════════════════════
#  ACOUSTIC LANDSCAPE — displaced mesh + oil-slick iridescent coat
# ════════════════════════════════════════════════════════════════════

p_height = np.load('/tmp/pressure_height.npy')
phase_n  = np.load('/tmp/phase_map.npy')
N = p_height.shape[0]

bpy.ops.object.select_all(action='SELECT'); bpy.ops.object.delete()

# ── High-resolution plane: 512×512 verts = 262k quads ─────────────
bpy.ops.mesh.primitive_grid_add(x_subdivisions=N-1, y_subdivisions=N-1,
                                  size=8.0)
plane = bpy.context.active_object; plane.name = "AcousticSurface"

# ── Displace vertices by pressure amplitude ───────────────────────
bm = bmesh.new(); bm.from_mesh(plane.data)
bm.verts.ensure_lookup_table()

# Phase vertex color for iridescent mapping
phase_col = bm.verts.layers.color.new("Phase")

import colorsys   # stdlib HSV → RGB for the phase palette
side = int(round(len(bm.verts) ** 0.5))   # actual grid verts per side
for i, v in enumerate(bm.verts):
    # Map vertex index to grid position, clamped into the N×N pressure arrays
    row = min(i // side, N-1); col = min(i % side, N-1)
    h   = float(p_height[row, col])
    phi = float(phase_n[row, col])

    v.co.z = h   # displace in Z by pressure

    # Phase → HSV → RGB for thin-film iridescence palette
    # 0→magenta, 0.25→gold, 0.5→cyan, 0.75→green, 1→magenta
    r, g, b = colorsys.hsv_to_rgb(phi, 0.95, 0.9)
    v[phase_col] = (r, g, b, 1.0)

bm.to_mesh(plane.data); bm.free()
plane.data.update()   # Blender 4.x recomputes normals automatically

# ── Oil-slick thin-film iridescent material ───────────────────────
mat = bpy.data.materials.new("ThinFilm")
mat.use_nodes = True
mn = mat.node_tree.nodes; ml = mat.node_tree.links
mn.clear()

vcol  = mn.new('ShaderNodeVertexColor'); vcol.layer_name = "Phase"
geo   = mn.new('ShaderNodeNewGeometry')
fres  = mn.new('ShaderNodeFresnel')
hue   = mn.new('ShaderNodeHueSaturation')
bsdf  = mn.new('ShaderNodeBsdfPrincipled')
gloss = mn.new('ShaderNodeBsdfGlossy')
add   = mn.new('ShaderNodeAddShader')
out   = mn.new('ShaderNodeOutputMaterial')

# Fresnel shifts the hue by angle → viewing-angle iridescence
hue.inputs['Saturation'].default_value = 1.4
hue.inputs['Value'].default_value      = 1.2

bsdf.inputs['Metallic'].default_value           = 0.9
bsdf.inputs['Roughness'].default_value          = 0.05
bsdf.inputs['Specular IOR Level'].default_value  = 1.0   # socket range is 0–1
gloss.inputs['Roughness'].default_value          = 0.02

ml.new(vcol.outputs['Color'],    hue.inputs['Color'])
ml.new(fres.outputs['Fac'],     hue.inputs['Hue'])    # angle shifts hue
ml.new(hue.outputs['Color'],    bsdf.inputs['Base Color'])
ml.new(hue.outputs['Color'],    gloss.inputs['Color'])
ml.new(bsdf.outputs['BSDF'],    add.inputs[0])
ml.new(gloss.outputs['BSDF'],   add.inputs[1])
ml.new(add.outputs['Shader'],   out.inputs['Surface'])
plane.data.materials.append(mat)

# ── HDRI lighting: white overcast for pure material read ──────────
world = bpy.context.scene.world; world.use_nodes = True
bg = world.node_tree.nodes['Background']
bg.inputs['Color'].default_value    = (1,1,1,1)   # white overcast sky
bg.inputs['Strength'].default_value = 2.5

bpy.ops.object.camera_add(location=(0,0,12))
cam = bpy.context.active_object
cam.data.type = 'ORTHO'; cam.data.ortho_scale = 9   # top-down orthographic
cam.rotation_euler = (0, 0, 0)
bpy.context.scene.camera = cam

scene = bpy.context.scene
scene.render.engine          = 'CYCLES'
scene.cycles.samples         = 2048
scene.cycles.caustics_reflective = True
scene.render.resolution_x    = 4096; scene.render.resolution_y = 4096
scene.view_settings.look     = 'AgX - High Contrast'
scene.render.filepath        = '/renders/acoustic_landscape.exr'
bpy.ops.render.render(write_still=True)
💎Crown Jewel Techniques — Pipeline 09

Chladni meets oil on water

The wave interference of five simultaneous acoustic sources — A440, C♯554, E659, E330, G392, a chord of near-perfect harmony — locks into a standing-wave topography. The RMS pressure field is the canyon map; the instantaneous phase angle paints it in shifting spectral color. An orthographic top-down camera turns the render into a precise scientific plate, while the Fresnel-driven hue shift means the material changes color as the observer moves: an oil slick trapped in mathematics.

Yee FDTD Scheme
Second-order leapfrog integrator with CFL-stable dt=dx/(c√2) — exact discrete wave equation
PML Absorption
Quadratic σ profile at 20-cell boundary eliminates reflections — pure interference, no edge artifacts
Phase → HSV Color
arctan2 of the pressure snapshot against the RMS envelope yields a phase-like angle, cycled through HSV at Sat=0.95 for jewel-tone iridescence
Fresnel Hue Driver
Fresnel output drives the Hue socket of HueSaturation node — viewing angle physically shifts the apparent color
Metallic Gloss Add
Adding a GlossyBSDF to Principled creates specular hot-spots that shift independently from the diffuse iridescence
Orthographic Plate
Orthographic camera eliminates perspective — the render reads as a precise scientific measurement artifact
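The leapfrog update the Yee card summarizes fits in a dozen lines. Here is a toy 2-D version — periodic boundary and no PML, just the CFL-stable core at the 2-D Courant limit; `fdtd_2d` is an illustrative name, not the pipeline's own function:

```python
import numpy as np

def fdtd_2d(N=128, steps=200, c=343.0, dx=0.01):
    """Toy 2nd-order leapfrog wave solver on a periodic grid.
    Courant number C = c*dt/dx = 1/sqrt(2), the 2-D stability limit."""
    dt = dx / (c * np.sqrt(2.0))      # CFL-stable timestep
    C2 = (c * dt / dx) ** 2           # Courant number squared = 0.5
    y, x = np.mgrid[0:N, 0:N]
    # Gaussian pulse at the center, zero initial velocity
    p = np.exp(-((x - N // 2) ** 2 + (y - N // 2) ** 2) / 50.0)
    p_prev = p.copy()
    for _ in range(steps):
        lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
               np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p)
        # p_next = 2p - p_prev + C² ∇²p, then shift the time window
        p, p_prev = 2.0 * p - p_prev + C2 * lap, p
    return p

field = fdtd_2d(N=64, steps=120)
```

At exactly C = 1/√2 the scheme is non-dissipative: the pulse rings outward without blowing up, which is why the standing-wave interference stays crisp.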
10
Urban Analytics · NYC TLC / GTFS
City of Flows:
Ten Million Journeys at Once

Aggregate 10 million NYC taxi trips into origin-destination flow matrices, bin by time-of-day and borough, then render them in Blender as parametric Bézier arc-ribbons over a procedurally generated city grid mesh — flow volume drives ribbon width and altitude, hour-of-day drives color temperature — a living pulse map of urban desire lines.

🚕
Source
NYC TLC Parquet
📦
Bin
H3 Hex Grid
🌐
OD Matrix
Flow Aggregation
🌈
Encode
Time → Color Temp
Render
Arc Ribbons + City
step_01_urban_flows.py — OD Flow Matrix via H3 Hexagons
import pandas as pd, numpy as np, json
import h3        # pip install h3
from collections import defaultdict

# ── Load NYC Yellow Taxi data (2023, all months via TLC open data) ─
# NOTE: TLC parquet files from mid-2016 onward ship PULocationID /
# DOLocationID zone IDs rather than raw coordinates; the lat/lon
# columns below assume pre-2016-style data or a prior zone-centroid join.
months = [f"yellow_tripdata_2023-{m:02d}.parquet" for m in range(1,13)]
df = pd.concat([pd.read_parquet(m, columns=[
    'tpep_pickup_datetime','pickup_latitude','pickup_longitude',
    'dropoff_latitude','dropoff_longitude','passenger_count'
]) for m in months], ignore_index=True)

# Remove outliers outside NYC bounding box
NYC = {'lat':(40.47,40.92), 'lon':(-74.27,-73.68)}
df  = df[df.pickup_latitude.between(*NYC['lat'])  &
         df.pickup_longitude.between(*NYC['lon'])  &
         df.dropoff_latitude.between(*NYC['lat'])  &
         df.dropoff_longitude.between(*NYC['lon'])]

df['hour'] = pd.to_datetime(df.tpep_pickup_datetime).dt.hour

# ── H3 resolution 7 (≈5 km² cells) hexagonal binning ─────────────
# h3's cell functions are scalar, so a comprehension over zipped
# columns is far faster than DataFrame.apply on 10M rows
H3_RES = 7
df['h3_origin'] = [h3.latlng_to_cell(la, lo, H3_RES)
                   for la, lo in zip(df.pickup_latitude, df.pickup_longitude)]
df['h3_dest']   = [h3.latlng_to_cell(la, lo, H3_RES)
                   for la, lo in zip(df.dropoff_latitude, df.dropoff_longitude)]

# ── Aggregate OD flows by (origin, destination, hour) ─────────────
od_counts = df.groupby(['h3_origin','h3_dest','hour']).size().reset_index(name='trips')
od_counts = od_counts[od_counts.trips > 30]   # threshold: ≥30 trips

# ── H3 cell → centroid coordinates ───────────────────────────────
def h3_to_blender(cell):
    lat, lon = h3.cell_to_latlng(cell)
    # Map NYC bbox → [-5, 5] Blender units
    x = (lon - NYC['lon'][0]) / (NYC['lon'][1]-NYC['lon'][0]) * 10 - 5
    y = (lat - NYC['lat'][0]) / (NYC['lat'][1]-NYC['lat'][0]) * 10 - 5
    return x, y

# ── Hour-of-day → color temperature ──────────────────────────────
# 00–06: midnight blue, 06–09: dawn gold, 09–17: noon white,
# 17–21: dusk amber-red, 21–24: night violet
def hour_to_color(h):
    if   h <  6: return (.05, .10, .70)   # midnight blue
    elif h <  9: return (1.0, .70, .20)   # dawn gold
    elif h < 17: return (1.0, 1.0, .95)   # noon white
    elif h < 21: return (1.0, .35, .10)   # dusk orange
    else:        return (.55, .15, .90)   # night violet

trips_max = od_counts.trips.max()
arcs = []
for _, row in od_counts.iterrows():
    ox, oy = h3_to_blender(row.h3_origin)
    dx, dy = h3_to_blender(row.h3_dest)
    flow   = row.trips / trips_max   # 0..1
    arc_h  = 0.3 + flow * 3.0        # taller arcs = more traffic
    width  = 0.004 + flow * 0.04    # thicker ribbon = more trips
    arcs.append({
        'o': [ox, oy], 'd': [dx, dy],
        'h': arc_h, 'w': width,
        'col': hour_to_color(int(row.hour)),
        'flow': float(flow),
        'hour': int(row.hour)
    })

with open('od_arcs.json', 'w') as f: json.dump(arcs, f)
print(f"Generated {len(arcs):,} OD flow arcs")
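The bounding-box-to-scene mapping inside h3_to_blender is a plain affine transform, so it can be sanity-checked without h3 installed. A standalone restatement (`latlon_to_blender` is a hypothetical twin of the script's helper, with the same NYC bounds):

```python
NYC = {'lat': (40.47, 40.92), 'lon': (-74.27, -73.68)}

def latlon_to_blender(lat, lon):
    """Affine map: NYC bounding box → a 10×10 Blender-unit plane
    centered on the origin."""
    x = (lon - NYC['lon'][0]) / (NYC['lon'][1] - NYC['lon'][0]) * 10 - 5
    y = (lat - NYC['lat'][0]) / (NYC['lat'][1] - NYC['lat'][0]) * 10 - 5
    return x, y

# The bbox corners land on ±5 BU; the bbox center lands at the origin
sw = latlon_to_blender(40.47, -74.27)
ne = latlon_to_blender(40.92, -73.68)
mid = latlon_to_blender(40.695, -73.975)
```

Keeping this transform pure makes it trivial to swap in another city's bounding box without touching the Blender side.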
step_02_blender_city.py — Desire Line City Pulse Map
import bpy, json, math
import numpy as np

# ════════════════════════════════════════════════════════════════════
#  CITY PULSE — Bézier arc ribbons over procedural city grid
# ════════════════════════════════════════════════════════════════════

with open('/tmp/od_arcs.json') as f: arcs = json.load(f)
bpy.ops.object.select_all(action='SELECT'); bpy.ops.object.delete()

# ── 1. CITY GRID PLANE — procedural block texture ─────────────────
bpy.ops.mesh.primitive_grid_add(x_subdivisions=80, y_subdivisions=80, size=12)
city = bpy.context.active_object; city.name = "CityGrid"

city_mat = bpy.data.materials.new("CityMat")
city_mat.use_nodes = True
cn = city_mat.node_tree.nodes; cl = city_mat.node_tree.links; cn.clear()

# Voronoi texture → city block pattern
tex  = cn.new('ShaderNodeTexCoord')
vor  = cn.new('ShaderNodeTexVoronoi')
ramp = cn.new('ShaderNodeValToRGB')
bsdf = cn.new('ShaderNodeBsdfPrincipled')
em   = cn.new('ShaderNodeEmission')
msh  = cn.new('ShaderNodeMixShader')
out  = cn.new('ShaderNodeOutputMaterial')

vor.voronoi_dimensions   = '3D'
vor.inputs['Scale'].default_value = 8.0
ramp.color_ramp.elements[0].color = (.02,.03,.05,1)   # dark block interior
ramp.color_ramp.elements[1].color = (.08,.12,.2,1)    # lighter street grid
ramp.color_ramp.elements[1].position = 0.05   # thin street lines
bsdf.inputs['Roughness'].default_value = 0.9
em.inputs['Color'].default_value       = (.1,.18,.35,1)   # cool blue street glow
em.inputs['Strength'].default_value    = 0.4
msh.inputs['Fac'].default_value         = 0.35

cl.new(tex.outputs['Generated'], vor.inputs['Vector'])
cl.new(vor.outputs['Distance'],  ramp.inputs['Fac'])
cl.new(ramp.outputs['Color'],   bsdf.inputs['Base Color'])
cl.new(bsdf.outputs['BSDF'],    msh.inputs[1])
cl.new(em.outputs['Emission'],  msh.inputs[2])
cl.new(msh.outputs['Shader'],   out.inputs['Surface'])
city.data.materials.append(city_mat)

# ── 2. FLOW ARCS — elevated Bézier curves ────────────────────────
# Two control points at origin and destination; raised FREE handles
# bow the curve upward by arc height h. Ribbon width comes from
# bevel_depth, not from any lateral offset.

def make_arc(o, d, h, w, col, flow):
    ox, oy = o; dx, dy = d
    mid_x = (ox+dx)/2; mid_y = (oy+dy)/2; mid_z = h

    cd = bpy.data.curves.new("arc", 'CURVE')
    cd.dimensions     = '3D'
    cd.resolution_u   = 20
    cd.bevel_depth    = w
    cd.bevel_resolution = 4
    cd.use_fill_caps  = True

    sp = cd.splines.new('BEZIER')
    sp.bezier_points.add(1)   # 2 points total: start + end; the arch comes from the handles

    bp0 = sp.bezier_points[0]
    bp0.co = (ox, oy, 0.02)
    bp0.handle_left_type  = 'VECTOR'
    bp0.handle_right_type = 'FREE'
    bp0.handle_right      = (ox + (mid_x-ox)*0.6, oy + (mid_y-oy)*0.6, mid_z*0.7)

    bp1 = sp.bezier_points[1]
    bp1.co = (dx, dy, 0.02)
    bp1.handle_right_type = 'VECTOR'
    bp1.handle_left_type  = 'FREE'
    bp1.handle_left       = (dx + (mid_x-dx)*0.6, dy + (mid_y-dy)*0.6, mid_z*0.7)

    obj = bpy.data.objects.new("Arc", cd)
    bpy.context.collection.objects.link(obj)

    # one emission material per arc; cache one per hour bucket if arc count gets large
    m = bpy.data.materials.new("AM"); m.use_nodes = True
    m.node_tree.nodes.clear()
    ae = m.node_tree.nodes.new('ShaderNodeEmission')
    ao = m.node_tree.nodes.new('ShaderNodeOutputMaterial')
    ae.inputs['Color'].default_value    = (*col, 1)
    ae.inputs['Strength'].default_value = 2.0 + flow * 6.0   # busier = brighter
    m.node_tree.links.new(ae.outputs['Emission'], ao.inputs['Surface'])
    obj.data.materials.append(m)

# Build all arcs (batch by hour for potential animation later)
for arc in arcs:
    make_arc(arc['o'], arc['d'], arc['h'], arc['w'], arc['col'], arc['flow'])

# ── 3. CAMERA — low dramatic angle looking through city ───────────
bpy.ops.object.camera_add(location=(-8, -6, 4))
cam = bpy.context.active_object
cam.data.lens         = 50
cam.rotation_euler    = (1.1, 0, -0.6)
bpy.context.scene.camera = cam

# ── World: lights out; the emissive arcs and street glow carry it ─
world = bpy.context.scene.world; world.use_nodes = True
world.node_tree.nodes['Background'].inputs['Color'].default_value = (.001,.002,.008,1)
world.node_tree.nodes['Background'].inputs['Strength'].default_value = 0   # raise above 0 for a faint night-sky wash

# ── Compositor: bloom + lens distortion for cinematic look ────────
bpy.context.scene.use_nodes = True
tree = bpy.context.scene.node_tree; tree.nodes.clear()
rl   = tree.nodes.new('CompositorNodeRLayers')
gl   = tree.nodes.new('CompositorNodeGlare')
ld   = tree.nodes.new('CompositorNodeLensdist')
co   = tree.nodes.new('CompositorNodeComposite')

gl.glare_type = 'STREAKS'; gl.streaks = 4; gl.angle_offset = math.radians(15)
gl.threshold  = 0.6; gl.size = 8
ld.inputs['Distort'].default_value = -0.04   # subtle barrel distortion

tree.links.new(rl.outputs['Image'], gl.inputs['Image'])
tree.links.new(gl.outputs['Image'], ld.inputs['Image'])
tree.links.new(ld.outputs['Image'], co.inputs['Image'])

scene = bpy.context.scene
scene.render.engine          = 'CYCLES'
scene.cycles.samples         = 512
scene.render.resolution_x    = 5120; scene.render.resolution_y = 2880
scene.view_settings.look     = 'AgX - High Contrast'
scene.view_settings.exposure = 0.5
scene.render.filepath        = '/renders/city_flows.exr'
bpy.ops.render.render(write_still=True)
💎Crown Jewel Techniques — Pipeline 10

The city breathing in light

Ten million trips become a luminous nervous system. Dawn commutes pulse gold from residential Brooklyn to Midtown; midnight rides arc violet from bars in the East Village. The Voronoi city grid glows faint blue beneath — geography as substrate. Streak glare on every bright arc turns the render into a long-exposure photograph of a city that never stopped moving, the 4-streak compositor glare rotated 15° to echo the tilt of Manhattan's street grid.

H3 Hexagonal Binning
Uber H3 at resolution 7 (≈5 km² cells) gives topologically consistent, near-equal-area bins free of the edge artifacts of rectangular lat/lon grids
Quadratic Bézier Arcs
Two bezier_points with raised, asymmetric FREE handles bow each curve into a smooth arch — desire lines traced as light trails
Flow → Arc Altitude
Higher trip volume pushes the apex skyward (0.3–3.3 BU) — the busiest corridors literally tower above lesser routes
Voronoi City Texture
ShaderNodeTexVoronoi at Scale=8 produces blocky Voronoi cells; threshold at position 0.05 gives razor-thin street lines
Streak Glare at Grid Angle
4-streak glare rotated 15° to echo the tilt of Manhattan's street grid — a geographic nod embedded in the lens artifact
Lens Distortion
−0.04 barrel distortion mimics a slightly wide photographic lens — subtle but it makes the city feel real-scale
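The arc card's geometry can be verified off-line: a Blender Bézier segment with the handles built in make_arc is an ordinary cubic curve, and evaluating it shows how much of the nominal arc height survives. A sketch reusing the script's 0.6/0.7 handle factors (`arc_apex_z` is an illustrative helper, not part of the pipeline):

```python
import numpy as np

def arc_apex_z(o, d, h):
    """Peak Z of the cubic Bézier built by make_arc: endpoints at z=0.02,
    handles at 0.6 of the horizontal span and 0.7 of the arc height."""
    ox, oy = o; dx, dy = d
    mx, my = (ox + dx) / 2.0, (oy + dy) / 2.0
    P0 = np.array([ox, oy, 0.02]);  P3 = np.array([dx, dy, 0.02])
    H0 = np.array([ox + (mx - ox) * 0.6, oy + (my - oy) * 0.6, h * 0.7])
    H1 = np.array([dx + (mx - dx) * 0.6, dy + (my - dy) * 0.6, h * 0.7])
    t = np.linspace(0.0, 1.0, 201)[:, None]
    # Cubic Bézier: (1-t)³P0 + 3(1-t)²t·H0 + 3(1-t)t²·H1 + t³P3
    B = ((1 - t) ** 3 * P0 + 3 * (1 - t) ** 2 * t * H0
         + 3 * (1 - t) * t ** 2 * H1 + t ** 3 * P3)
    return float(B[:, 2].max())

apex = arc_apex_z((-2.0, 0.0), (2.0, 1.0), 2.0)
```

By symmetry the apex sits at t = 0.5, where z = ⅛(0.02) + ⅜(0.7h) + ⅜(0.7h) + ⅛(0.02) = 0.005 + 0.525h — roughly half the nominal arc height, worth remembering when tuning arc_h.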

The Art of the Pipeline

Mastery principles that separate a scientific render from an unforgettable one.

PRINCIPLE / 01
The data IS the composition
Don't fight your data's natural geometry. Let tectonic plates become boundaries, let OD desire-lines become arcs, let UMAP topology become spatial clustering. The physics dictates the aesthetics.
PRINCIPLE / 02
One physically meaningful variable → one visual channel
Pick a 1:1 mapping: FA → iridescence. Flow → altitude. Q-criterion → opacity. Frequency → hue. Mixing two meanings into one channel destroys legibility and beauty simultaneously.
PRINCIPLE / 03
Geometry Nodes for scale, Python loops for control
50k cells? Use GN Instance on Points. 80 fiber bundles? Python loop with full per-object control. The threshold is roughly 5k objects — below that, loop; above that, instancing.
PRINCIPLE / 04
The camera is part of the analysis
Orthographic for scientific plates (acoustics, maps). Telephoto for network compression (proteins, correlations). Low dramatic angle for phenomenological scale (cities, turbulence). Match the lens to the message.
PRINCIPLE / 05
Compositor glare is a scientific choice
Streak glare aligned to data geometry (Manhattan's 15° grid offset). Fog glow to reveal emission density gradients. These aren't decorative — they encode information about the data's intensity distribution.
PRINCIPLE / 06
Never normalize to max — use percentiles
One M9.5 earthquake collapses all others to invisible. One superstar stock flattens the network. Always normalize to 95th or 99th percentile, then let outliers bloom above that ceiling naturally.
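Principle 06 in miniature — one synthetic outlier, two normalizations; the lognormal distribution and seed here are illustrative, not drawn from any of the pipelines' data:

```python
import numpy as np

rng = np.random.default_rng(7)
flows = rng.lognormal(mean=3.0, sigma=1.0, size=10_000)
flows[0] = flows.max() * 50.0        # one "M9.5" event / superstar stock

by_max = flows / flows.max()         # the outlier owns the whole scale
p99    = np.percentile(flows, 99)
by_p99 = flows / p99                 # the bulk keeps dynamic range;
                                     # outliers bloom past 1.0

print(f"max-norm median: {np.median(by_max):.4f}")
print(f"p99-norm median: {np.median(by_p99):.4f}")
```

Under max-normalization the median value collapses toward zero and the render goes dark; under the 99th percentile the bulk of the data stays visible while the outlier still towers above the ceiling.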