Object-based 2D-to-3D video conversion for effective stereoscopic content generation in 3D-TV applications

Feng, Yue and Ren, Jinchang and Jiang, Jianmin (2011) Object-based 2D-to-3D video conversion for effective stereoscopic content generation in 3D-TV applications. IEEE Transactions on Broadcasting, 57 (2). pp. 500-509. ISSN 0018-9316 (https://doi.org/10.1109/TBC.2011.2131030)

PDF (Accepted Author Manuscript): 3dtv_draft.pdf (701 kB)
License: Creative Commons Attribution-NonCommercial 4.0

Abstract

Three-dimensional television (3D-TV) has gained increasing popularity in the broadcasting domain, as it enables enhanced viewing experiences in comparison to conventional two-dimensional (2D) TV. However, its application has been constrained due to the lack of essential contents, i.e., stereoscopic videos. To alleviate such content shortage, an economical and practical solution is to reuse the huge media resources that are available in monoscopic 2D and convert them to stereoscopic 3D. Although stereoscopic video can be generated from monoscopic sequences using depth measurements extracted from cues like focus blur, motion and size, the quality of the resulting video may be poor as such measurements are usually arbitrarily defined and appear inconsistent with the real scenes. To help solve this problem, a novel method for object-based stereoscopic video generation is proposed which features i) optical-flow based occlusion reasoning in determining depth ordinal, ii) object segmentation using improved region-growing from masks of determined depth layers, and iii) a hybrid depth estimation scheme using content-based matching (inside a small library of true stereo image pairs) and depth-ordinal based regularization. Comprehensive experiments have validated the effectiveness of our proposed 2D-to-3D conversion method in generating stereoscopic videos of consistent depth measurements for 3D-TV applications.
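Component ii) of the proposed method grows object regions outward from masks of the determined depth layers. The paper's own segmentation procedure is not reproduced here; the following is only a minimal Python sketch of the basic region-growing idea, with an illustrative 4-connected, mean-intensity similarity rule (the function name, seed format, and threshold criterion are assumptions, not the authors' implementation):

```python
import numpy as np
from collections import deque

def region_grow(image, seeds, threshold=10.0):
    """Grow a binary object mask outward from seed pixels.

    A neighbouring pixel joins the region when its intensity differs
    from the running region mean by less than `threshold`
    (4-connectivity). In the paper's setting the seeds would come from
    a mask of a determined depth layer.
    """
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    queue = deque(seeds)
    for r, c in seeds:
        mask[r, c] = True
    region_sum = float(sum(image[r, c] for r, c in seeds))
    region_count = len(seeds)

    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                # Compare candidate pixel against the current region mean.
                if abs(image[nr, nc] - region_sum / region_count) < threshold:
                    mask[nr, nc] = True
                    region_sum += float(image[nr, nc])
                    region_count += 1
                    queue.append((nr, nc))
    return mask
```

For example, seeding inside a uniformly bright square on a dark background grows the mask to cover exactly that square, since background pixels fail the similarity test at the object boundary.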

ORCID iDs

Feng, Yue; Ren, Jinchang (ORCID: https://orcid.org/0000-0001-6116-3194); Jiang, Jianmin