Chapter 11 HDR and Exposure Fusion


One-Sentence Goal
On a traceable linear radiometric baseline, robustly merge multi-exposure/multi-gain observations—under alignment and de-ghosting constraints—into a high-dynamic-range, scene-referred representation, then map it to the target transfer system (PQ/HLG/legacy OETF) for publication.


I. Scope & Targets

  1. Inputs
    • Multi-frame raw observations: { I_k_raw }, k = 1..K (possibly Bayer/multispectral, with differing t_k, ISO_k, ND_k, a_k).
    • Calibration & response: camera response f or inverse LUT LUT_inv, black level D, flat-field and PRNU/DSNU, saturation threshold S_max.
    • Temporal & geometric: ts | tau_mono, rolling/global shutter mode, reference frame index k_ref, alignment model W_k.
    • Noise & gain: read noise sigma_r, electronic gain G_k, net exposure factor K_k = t_k * a_k * G_k * ND_k.
    • Rendering target: dst_cs and OETF ∈ { sRGB, PQ, HLG, GammaX }, display peak or target luminance L_peak.
  2. Outputs
    • Scene-referred HDR: E_hat or L_hat (linear irradiance/radiance), optionally OpenEXR/float16.
    • Display-referred image: RGB_out (after tone mapping and OETF), plus optional HDR10/HLG metadata.
    • Artifacts & manifest: hdr_profile.v1, ghost_mask, sat_mask_k, manifest.imaging.hdr, hash_sha256(profile), signature.
  3. Applicability
    • Supports exposure bracketing, dual-gain readout, alternating ISO, and multi-camera fusion; cross-modality fusion requires the color binding established in Chapter 10.
    • For fast motion and flickering light sources, enable de-ghosting and exposure equalization by default; if they fail, fall back to the best single frame.
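The net exposure factor from I.1 can be sketched as follows; this is a minimal illustration, not the chapter's normative schema, and the field and class names (FrameMeta, net_exposure) are assumptions:

```python
from dataclasses import dataclass

@dataclass
class FrameMeta:
    t: float    # exposure time t_k in seconds
    a: float    # relative aperture factor a_k
    G: float    # electronic gain G_k
    ND: float   # ND-filter transmittance ND_k (1.0 = no filter)

    def net_exposure(self) -> float:
        # Net exposure factor K_k = t_k * a_k * G_k * ND_k,
        # used later to normalize raw values to irradiance.
        return self.t * self.a * self.G * self.ND

# Example bracket: a long exposure and a short exposure at higher gain.
K = [FrameMeta(1/30, 1.0, 1.0, 1.0).net_exposure(),
     FrameMeta(1/240, 1.0, 2.0, 1.0).net_exposure()]
```

Carrying K_k explicitly per frame keeps the later irradiance estimates comparable across differing t_k, ISO_k, and ND_k.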

II. Terms & Variables

  1. Imaging & physics
    • E(x,y): irradiance; L: radiance; f: camera response; D: black level; S_max: saturation level.
    • K_k = t_k * a_k * G_k * ND_k: net exposure factor; y_k: pixel value; n_k: noise.
  2. Estimation & fusion
    • E_hat_k = ( f^{-1}( y_k ) - D ) / K_k: per-frame irradiance estimate.
    • w_k(x,y): weight; sat_mask_k = 1{ y_k ≥ S_thr }; ghost_mask: motion/occlusion mask.
    • W_k: warp from frame k to k_ref; Ω_valid: valid domain after warping.
  3. Metrics
    • DR_scene = log2( max(E_hat) / min_pos(E_hat) ); DR_gain = DR_out - DR_ref.
    • ghost_rate = |ghost_mask| / |Ω_valid|; halo_score: edge-halo metric; banding_rate.
    • SNR_k ≈ E_hat_k / sqrt( sigma_r^2 / K_k^2 + sigma_s * E_hat_k / K_k ) (with sigma_s the photon-noise factor).

III. Axioms P211-* (HDR Fusion Baseline)


IV. Minimal Equations S211-*


V. Pipeline & Operational Flow M110-*


VI. Contracts & Assertions


VII. Implementation Bindings I110-*


VIII. Cross-References


IX. Quality Metrics & Risk Control

  1. Indicators & thresholds
    • DR_scene, DR_gain, ghost_rate, halo_score, banding_rate, edge_acutance_change, artifact_rate.
    • Runtime monitoring: within window Delta_t, track drift_exposure_meta, drift in w_k distributions, spikes in ghost_rate.
  2. Key risks & playbooks
    • Flicker/LED PWM: exposure inconsistency across frames → exposure-equalization regression or short-exposure-only fallback.
    • Fast motion/occlusion: ghosting and mismatches → strengthen ghost_mask, locally replace with single frames.
    • Highlight spill & reverse interpolation: bright-edge artifacts → sub-pixel alignment + guided-filter tone mapping.
    • Response mismatch: f^{-1} bias → online CRF refinement or gray-card self-calibration.
    • Resources & latency: in streaming, constrain the thr/chan flow (see Threads); if needed, reduce the weight order and process in tiles.
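The single-frame fallback from IX.2 can be sketched as a simple gate on ghost_rate; the threshold value and function name below are assumptions, not normative:

```python
def select_output(merged, best_single, ghost_rate, ghost_thr=0.05):
    """Return the merged HDR result unless ghosting exceeds the threshold.

    ghost_thr = 0.05 is an assumed default; the spec leaves the
    contract value to the deployment profile.
    """
    if ghost_rate > ghost_thr:
        return best_single   # fall back to the best single frame
    return merged
```

In practice the same gate can be applied per tile, so only regions where de-ghosting failed are replaced.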

Summary
This chapter delivers an executable HDR fusion loop: inv_response → calibrate → align/ghost → per-frame irradiance → SNR-aware weights → robust merge → color binding → tone/OETF. With DR_scene / ghost_rate / halo_score as core contracts and explicit fallback paths, the pipeline—published with hdr_profile.v1 and manifest.imaging.hdr—ensures consistent, auditable HDR across devices and scenes.
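The core of that loop (per-frame irradiance → SNR-aware weights → robust merge) can be sketched end to end; alignment, de-ghosting, color binding, and tone/OETF mapping are deliberately omitted, and all names here are illustrative:

```python
import numpy as np

def fuse_hdr_core(frames, lut_inv, D, K, sigma_r, sigma_s, S_thr):
    """Per-frame irradiance -> SNR-aware weights (zeroed on saturated
    pixels via sat_mask_k) -> weighted merge into E_hat."""
    num = np.zeros_like(lut_inv[frames[0]])
    den = np.zeros_like(num)
    for y_k, K_k in zip(frames, K):
        E_k = np.maximum(lut_inv[y_k] - D, 0.0) / K_k          # E_hat_k
        var = sigma_r**2 / K_k**2 + sigma_s * E_k / K_k        # noise model
        w = np.where(y_k >= S_thr, 0.0,                        # sat_mask_k
                     E_k / np.sqrt(np.maximum(var, 1e-20)))    # SNR weight
        num += w * E_k
        den += w
    return np.where(den > 0, num / np.maximum(den, 1e-20), 0.0)
```

With a linear response and consistent K_k, mutually exposed pixels reduce to the same irradiance, while saturated samples drop out of the merge.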