Production 2026-02-19

75mm Studio Applies LiDAR & Matchmove to Plan.B's 'SNAP TIME' MV

Perfectly Realizing a Space Where Virtual Members and Real People Coexist

Written by 75mm Studio

75mm Studio's LiDAR scanning and matchmove technology drew attention for their extensive use in the music video for 'SNAP TIME', the group song recently released by the virtual hip-hop label Plan.B Music.

In this project, where virtual members and real people had to coexist naturally in a single space, a common coordinate system built from precise 3D data, together with camera tracking technology, played a key role.

[Image: LiDAR scan from Plan.B's 'SNAP TIME']

Precise 3D Data and Camera Tracking

75mm Studio captured high-resolution 3D data of the actual location through on-site LiDAR scanning and solved the camera tracking precisely against it. This greatly reduced scale mismatches and positional errors during compositing, enabling both fast turnaround and high quality.
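One way to see why scan-based tracking reduces positional error is reprojection: scanned 3D points are pushed through the solved camera and compared against their tracked 2D positions in the plate. The sketch below illustrates the check with a simple pinhole model; all camera values and point positions are invented for illustration, not taken from the production.

```python
import numpy as np

def project(points_world, R, t, fx, fy, cx, cy):
    """Project Nx3 world points through a pinhole camera (R, t, intrinsics)."""
    cam = points_world @ R.T + t          # world -> camera space
    uv = cam[:, :2] / cam[:, 2:3]         # perspective divide
    return np.stack([fx * uv[:, 0] + cx, fy * uv[:, 1] + cy], axis=1)

# Hypothetical camera at the origin looking down +Z, made-up intrinsics.
R = np.eye(3)
t = np.zeros(3)
pts = np.array([[0.0, 0.0, 5.0], [1.0, -0.5, 4.0]])  # scanned points, metres
pix = project(pts, R, t, fx=1200.0, fy=1200.0, cx=960.0, cy=540.0)

# Reprojection error against tracked 2D features; small values mean the
# camera solve agrees with the LiDAR geometry.
tracked = np.array([[960.0, 540.0], [1260.0, 390.0]])
err = np.linalg.norm(pix - tracked, axis=1)
print(err)
```

In a real matchmove package this residual is computed internally per frame; here it is spelled out only to show what "precisely matched" means in measurable terms.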

Perfect Integration Through Common Coordinate System

Working from the LiDAR scan's common coordinate system, the team processed motion-capture (FBX) and camera (FBX) data in a single scene, producing results in which the live-action plate and the 3D characters blend seamlessly.
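The core idea of a common coordinate system is that each data source (mocap, camera) gets one rigid transform that expresses it in the scan's frame. The sketch below shows that step for a single mocap point; the alignment transform is invented for illustration, whereas in production it would come from registering each FBX to the scan.

```python
import numpy as np

def rigid_transform(points, R, t):
    """Apply a rotation R and translation t to Nx3 points."""
    return points @ R.T + t

# Hypothetical alignment of the mocap rig into the scan's frame:
# no rotation, shifted 2 m along X.
R_mocap = np.eye(3)
t_mocap = np.array([2.0, 0.0, 0.0])

mocap_pts = np.array([[0.0, 1.7, 0.0]])   # e.g. a head joint, mocap space
scan_pts = rigid_transform(mocap_pts, R_mocap, t_mocap)
print(scan_pts)   # the same joint, now in the scan's coordinate system
```

Once every source is mapped this way, characters, cameras, and the plate can be assembled in one scene without per-shot scale or offset guesswork.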

In this music video, where Plan.B's virtual members move freely through a shared space and go about daily life alongside real employees, 75mm Studio's technical capabilities helped maximize immersion. The project also deepened our experience in applying LiDAR scanning and matchmove technology efficiently.

Technical Details & FAQ

What is LiDAR scanning and how is it used in VFX?
LiDAR (Light Detection and Ranging) is a high-precision 3D scanning technology that uses laser pulses to measure spatial distances. In modern VFX pipelines, the precise geometry acquired through LiDAR ensures that CG environments match the physical set with minimal error. This improves the accuracy of camera matchmoving and virtual set extensions and significantly reduces post-production turnaround.
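The ranging principle itself is simple time-of-flight arithmetic: a pulse travels to the surface and back, so the distance is half the round trip at the speed of light. A minimal sketch (illustrative only, not a sensor driver):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface from a laser pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A pulse returning after ~66.7 nanoseconds indicates a surface ~10 m away.
d = tof_distance(66.7e-9)
print(round(d, 2))  # → 10.0
```

Real scanners repeat this millions of times per second across a sweeping laser to build the point cloud.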
How does 75mm Studio process massive Point Cloud data?
Raw point clouds captured on site contain billions of points, far too heavy for DCC tools such as Maya or real-time engines such as Unreal Engine. We run the raw data through a proprietary processing pipeline of point registration, noise reduction, and decimation. The dense point cloud is then converted, via meshing and retopology, into an optimized, manageable polygonal asset.
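One representative lightweighting step is voxel-grid downsampling: all points falling in the same voxel are replaced by their centroid. Below is a dependency-free sketch of the idea; production pipelines (and libraries such as Open3D) do the same at far larger scale with spatial indexing.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel: float) -> np.ndarray:
    """Collapse Nx3 points to one centroid per occupied voxel."""
    keys = np.floor(points / voxel).astype(np.int64)
    # Group points by voxel key, then average each group.
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inverse).astype(float)
    out = np.zeros((inverse.max() + 1, 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out

# Two points share a voxel, one sits alone: three points become two.
pts = np.array([[0.01, 0.0, 0.0], [0.02, 0.0, 0.0], [1.5, 0.0, 0.0]])
print(voxel_downsample(pts, voxel=1.0))
```

The voxel size is the quality/weight dial: coarser voxels shed more points at the cost of fine detail.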
What is the primary difference between Photogrammetry and LiDAR?
LiDAR scanning excels at measuring exact physical dimensions with millimeter accuracy (the 'skeleton'), while photogrammetry excels at reconstructing photorealistic color and surface detail (the 'skin'). At 75mm Studio, we employ a hybrid workflow that combines the dimensional accuracy of laser scanning with the optical fidelity of photogrammetry.
Why is 3D Gaussian Splatting revolutionary for spatial archiving?
3D Gaussian Splatting is a radiance-field rendering technique that represents a scene as millions of oriented 3D Gaussians rather than polygons. It captures details that are difficult for traditional meshes, such as translucent objects, fine foliage, and view-dependent specularity, and it can render at real-time frame rates, including directly in a web browser, which makes it a strong fit for digital twin distribution.

Interested in this technology? Contact Us