A Flexible and Versatile Studio for Synchronized Multi-View Video Recording
Date
2003
Authors
Theobalt, C.; Li, M.; Magnor, M.A.; Seidel, H.-P.
Publisher
The Eurographics Association
Abstract
In recent years, the convergence of computer vision and computer graphics has put forth new research areas that work on scene reconstruction from, and analysis of, multi-view video footage. In free-viewpoint video, for example, new views of a scene are generated in real-time from an arbitrary viewpoint using a set of multi-view video streams as input. The analysis of real-world scenes from multi-view video to extract motion information or reflection models is another field of research that greatly benefits from high-quality input data. Building a recording setup for multi-view video involves great effort on the hardware as well as the software side. The amount of image data to be processed is huge, a suitable lighting and camera setup is essential for a naturalistic scene appearance and robust background subtraction, and the computing infrastructure has to enable real-time processing of the recorded material. This paper describes our recording setup for multi-view video acquisition that enables the synchronized recording of dynamic scenes from multiple camera positions under controlled conditions. The requirements for the recording room and their implementation in the separate components of the studio are described in detail. The efficiency and flexibility of the studio are demonstrated on the basis of the results that we obtain with a real-time 3D scene reconstruction system, a system for non-intrusive optical motion capture, and a model-based free-viewpoint video system for human actors.
@inproceedings{10.2312:vvg.20031002,
booktitle = {Vision, Video, and Graphics (VVG) 2003},
editor = {Peter Hall and Philip Willis},
title = {{A Flexible and Versatile Studio for Synchronized Multi-View Video Recording}},
author = {Theobalt, C. and Li, M. and Magnor, M.A. and Seidel, H.-P.},
year = {2003},
publisher = {The Eurographics Association},
ISBN = {3-905673-54-1},
DOI = {10.2312/vvg.20031002}
}