With its second major release, Vinzi's Atlux λ 2 grows from a lighting and rendering sandbox inside Unreal Engine into something more unexpected: a fully simulated 3D scanning studio. The plugin, previously known as MetaShoot, already recreated physical photography stages within Unreal. Now, its new Capture tab extends this idea to the world of 3D data acquisition. Artists can distribute virtual DSLR cameras around a subject using the Camera Volume, tweak focal length, aperture, and shutter speed, and generate synthetic scan data without leaving the engine.
That data can be exported as standard PLY point clouds or COLMAP datasets, complete with camera positions, ready for reconstruction or use in Gaussian Splatting workflows. In effect, Atlux λ 2 lets Unreal scenes feed directly into neural or point-based renderers, turning engine content into training data or splat clouds.
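Vinzi hasn't published the export code, but the PLY side of such a pipeline is simple enough to sketch. Below is a minimal, hypothetical writer for an ASCII PLY point cloud with per-point colour, the kind of file a synthetic scan export would produce; the point data is invented for illustration.

```python
# Sketch: write an ASCII PLY point cloud with per-point colour.
# The helper name and sample points are illustrative, not Atlux's API.

def write_ply(path, points):
    """points: list of (x, y, z, r, g, b) tuples; colours are 0-255 ints."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        for x, y, z, r, g, b in points:
            f.write(f"{x} {y} {z} {r} {g} {b}\n")

points = [(0.0, 0.0, 0.0, 255, 0, 0), (1.0, 0.5, 0.25, 0, 255, 0)]
write_ply("scan.ply", points)
```

A COLMAP dataset adds the camera intrinsics and per-image poses alongside the points, which is what makes the export usable for reconstruction rather than just visualisation.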
Lighting: modernised for Lumen
The original Atlux lights were tuned for Unreal Engine 4’s raytracer. Version 2 retunes everything for Lumen, Unreal’s realtime global illumination system. Point and Directional light types have been added, Spotlight behaviour refined, and new presets included for standard studio setups.
A new procedural gobo system projects colour and shadow patterns across surfaces or acts as a softbox diffuser. The system remains parametric, letting artists adjust projection density, sharpness, and tint directly in the viewport.
Props, particles, and calibration
Atlux λ 2 also expands the library of assets available inside its Prop tab. New podiums, curtains, and cyclorama-style backgrounds can be dropped in with a click. Environmental effects such as light trails, particle bursts, fog, and god rays simulate studio atmosphere or help block out cinematic moods.
For technical accuracy, Vinzi adds reference materials and calibration props: chrome and grey balls, Macbeth-style colour charts, and surface decals such as fingerprints or scratches for realism checks.

Cameras and animation
Camera motion receives a complete overhaul via the new Action tab. Beyond static turntables and rail rigs, the update introduces floating, breathing, and handheld motions—useful for emulating product-demo or cinematic coverage. Object animation presets now include oscillation and platform rotations, making it easier to preview assets dynamically without separate sequencer setups.
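Procedural "breathing" motion of this kind is usually faked with layered low-frequency sine waves. The sketch below shows the general idea; the frequencies and amplitudes are invented for illustration and are not Atlux's actual parameters.

```python
import math

# Sketch: a "breathing" camera offset as layered low-frequency sine waves,
# a common way to fake idle/handheld motion procedurally.
# Amplitude and frequency values are illustrative only.

def breathing_offset(t, amp=0.02, freq=0.25):
    # Slow vertical bob plus a slighter, out-of-phase horizontal sway.
    y = amp * math.sin(2 * math.pi * freq * t)
    x = 0.5 * amp * math.sin(2 * math.pi * freq * 0.7 * t + 1.3)
    return (x, y)

# Sample ten seconds of motion at 10 samples per second.
samples = [breathing_offset(t * 0.1) for t in range(100)]
```

Layering a third, higher-frequency term with a small amplitude is the usual way to push this from "breathing" toward "handheld".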
Stylisation and colour management
The Post-Process tab offers a surprising detour into non-photorealistic rendering. New materials simulate hatching, blueprints, oil painting, or even LEGO-style mosaic looks. While arguably niche, these presets demonstrate Atlux's procedural layering system. Colour workflow has also matured. Artists can now load custom LUTs in .cube format and apply OCIO configurations directly from the plugin. This aligns Atlux with modern VFX colour management pipelines, especially for teams delivering look-dev stills or promotional imagery.
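The .cube LUT format Atlux now loads is a plain-text file: a `LUT_3D_SIZE` header followed by rows of RGB triples. A minimal parser, shown here purely to illustrate the format (it ignores optional keywords such as `TITLE` and `DOMAIN_MIN`/`DOMAIN_MAX`), looks like this:

```python
# Sketch: minimal parser for the 3D .cube LUT format.
# Handles only LUT_3D_SIZE plus data rows; optional keywords are skipped.

def parse_cube(text):
    size, table = None, []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # blank line or comment
        if line.startswith("LUT_3D_SIZE"):
            size = int(line.split()[1])   # grid resolution per axis
        elif line[0].isdigit() or line[0] == "-":
            table.append(tuple(float(v) for v in line.split()))
    return size, table

# A trivial 2x2x2 identity-style LUT, written inline for the example.
size, table = parse_cube(
    "LUT_3D_SIZE 2\n0 0 0\n1 0 0\n0 1 0\n1 1 0\n"
    "0 0 1\n1 0 1\n0 1 1\n1 1 1\n"
)
```

A size-N LUT carries N³ rows; applying it means trilinearly interpolating each pixel's colour within that grid.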
Randomisation and stability
Version 2 replaces the former AI Randomizer with Afflatus, a fully controllable randomisation engine. It can vary lights, props, or cameras individually, making batch renders or creative explorations more predictable. Vinzi has also addressed long-standing stability issues, notably a crash that occurred when cancelling renders mid-process. The UI has been reorganised and cleaned up, reflecting Atlux’s shift from experimental plugin to mature production tool.
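What makes this kind of randomisation "controllable" is typically per-category seeding: each category draws from its own seeded stream, so re-rolling one doesn't disturb the others. The class below is a hypothetical illustration of that principle, not Afflatus's actual API; the category names and value ranges are invented.

```python
import random

# Sketch: per-category seeded randomisation. Each category gets its own
# deterministic stream, so varying lights never perturbs cameras or props.
# Class and method names are illustrative, not Afflatus's real interface.

class CategoryRandomizer:
    def __init__(self, seed):
        # String seeds hash deterministically in random.Random.
        self.streams = {name: random.Random(f"{seed}:{name}")
                        for name in ("lights", "props", "cameras")}

    def vary(self, name, low, high):
        return self.streams[name].uniform(low, high)

a = CategoryRandomizer(seed=7)
b = CategoryRandomizer(seed=7)
```

The practical payoff is reproducible batch renders: the same seed always yields the same variations, and tweaking one category leaves the rest of a render series intact.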

Pricing and compatibility
Atlux λ 2 supports Unreal Engine 5.4 and later, running exclusively on Windows. It's a free update for existing licence holders. A new perpetual licence costs $495 for individuals, while subscription users now pay $59.50 per month. Studios with over $5 million in annual revenue pay $990.50 for a perpetual licence, or $109.50 per month.
Why a virtual scanner makes sense
Atlux's decision to simulate photogrammetry inside Unreal feels both logical and overdue. Real-world 3D scanning is constrained by lighting, calibration, and physical capture limits. Unreal's virtual environment removes all three.
Because Atlux generates PLY or COLMAP data from synthetic cameras, users can prototype real scanning setups virtually—testing camera count, exposure, or spacing before investing in physical rigs. This also means Unreal can now output the same kind of data that Gaussian Splatting software ingests.
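Prototyping camera count and spacing in this way comes down to distributing virtual cameras evenly around the subject. A Fibonacci lattice is a standard way to do that; the sketch below places N camera positions on a sphere of a given radius (the count and radius are illustrative, not Atlux defaults).

```python
import math

# Sketch: distribute N virtual camera positions evenly on a sphere around
# a subject using a Fibonacci lattice, to test count/spacing virtually.

def camera_sphere(n, radius=2.0):
    golden = math.pi * (3.0 - math.sqrt(5.0))   # golden angle in radians
    cams = []
    for i in range(n):
        y = 1.0 - 2.0 * (i + 0.5) / n           # height, from +1 down to -1
        r = math.sqrt(max(0.0, 1.0 - y * y))    # ring radius at that height
        theta = golden * i                       # spiral around the axis
        cams.append((radius * r * math.cos(theta),
                     radius * y,
                     radius * r * math.sin(theta)))
    return cams

cams = camera_sphere(36)
```

Each position would then be aimed back at the subject's centre; sweeping `n` up and down is a cheap way to find the minimum camera count a reconstruction tolerates before buying a single physical rig.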
For visual effects, this unites two previously separate domains: engine-driven rendering and point-based reconstruction. Artists working on virtual production stages or motion-capture shoots could simulate scans of props or performers within their live Unreal environments, immediately feeding that synthetic data into splatting workflows or neural rendering experiments.
The result is a hybrid form of asset creation – half DCC, half dataset generation – without needing to step outside the engine.
The Gaussian bridge
Gaussian Splatting relies on point-based representations with density and colour information. Until now, these were usually reconstructed from photogrammetry or radiance field capture. Atlux’s virtual scan export removes the dependency on real footage: it produces consistent, noise-free data that can serve as either training material or direct renderable geometry.
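Concretely, a splat renderer consumes per-point records carrying position, colour, opacity, and spatial extent. The sketch below shows how clean synthetic scan points might map onto such records; the field names are illustrative, and real splat formats also store per-splat rotation and spherical-harmonic colour coefficients.

```python
from dataclasses import dataclass

# Sketch: the per-point attributes a Gaussian-splat renderer typically
# consumes, built from synthetic scan points. Field names are illustrative.

@dataclass
class Splat:
    position: tuple   # (x, y, z) in scene units
    colour: tuple     # (r, g, b), normalised to 0-1
    opacity: float    # 0-1 alpha; synthetic data starts at full confidence
    scale: tuple      # per-axis Gaussian extent

def points_to_splats(points, base_scale=0.01):
    return [Splat(position=(x, y, z),
                  colour=(r / 255.0, g / 255.0, b / 255.0),
                  opacity=1.0,              # no capture noise to discount
                  scale=(base_scale,) * 3)  # isotropic starting extent
            for (x, y, z, r, g, b) in points]

splats = points_to_splats([(0.0, 0.0, 0.0, 255, 128, 0)])
```

Because the source data is noise-free, opacities and scales need no per-point confidence weighting, which is exactly the consistency advantage over photogrammetric capture.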
It’s a small technical leap with large implications. For game cinematics, motion capture pre-visualisation, or virtual set planning, Gaussian splats generated in-engine would integrate seamlessly with existing Unreal lighting and compositing systems. Scene layout, lighting, and reconstruction could all occur within a single runtime context.
It raises a simple question: why did no one integrate a virtual scanner sooner? The pieces (Unreal, COLMAP, splat renderers) have existed for years. Atlux is simply the first to package them coherently inside the engine.

Practical caveats
Synthetic scanning does have limits. The generated data mirrors only what exists in the scene: no micro-surface irregularities, no photographic noise. For production pipelines requiring physical accuracy, real scans will still dominate.
Domain bias also matters. Gaussian reconstruction tools trained on real images may respond differently to perfect, noise-free synthetic data. Colour, specularity, and occlusion gaps might require deliberate simulation of imperfections to ensure realism.
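Simulating such imperfections can be as simple as perturbing the perfect pixel values with sensor-style noise before export. The sketch below shows the idea for 8-bit RGB data; the noise level is an illustrative value, not a calibrated sensor model.

```python
import random

# Sketch: degrade perfect synthetic pixels with Gaussian "sensor" noise so
# reconstruction tools trained on real photos behave as expected.
# sigma is illustrative; a real pipeline would calibrate it per camera.

def add_sensor_noise(pixels, sigma=2.0, seed=42):
    rng = random.Random(seed)        # seeded for reproducible batches
    noisy = []
    for r, g, b in pixels:
        noisy.append(tuple(min(255, max(0, round(c + rng.gauss(0, sigma))))
                           for c in (r, g, b)))
    return noisy

noisy = add_sensor_noise([(128, 128, 128)] * 4)
```

More faithful pipelines would add signal-dependent shot noise, vignetting, and lens distortion, but even flat Gaussian noise moves synthetic frames closer to the domain these tools were trained on.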
Performance remains a concern. Simulating tens or hundreds of high-resolution cameras for each capture can tax GPU memory and I/O bandwidth. Unreal users will need to balance fidelity against throughput, especially when batch-generating scans.
A maturing plugin ecosystem
With Atlux λ 2, Vinzi seems to position Unreal less as a render target and more as a production laboratory—a controllable environment for both image generation and 3D data experimentation. It mirrors a broader shift in the industry: tools that blur the boundaries between DCC, realtime, and machine-learning workflows. As engines mature, the distinction between rendering, scanning, and simulation narrows.
Artists and technical directors should, as always, test the plugin in context before committing to production use. But Atlux λ 2’s ability to create synthetic scans directly inside Unreal marks a thoughtful and technically literate step toward unified virtual production pipelines.