Reference : Depth enhancement by fusion for passive and active sensing
Scientific congresses, symposiums and conference proceedings : Paper published in a journal
Engineering, computing & technology : Electrical & electronics engineering
http://hdl.handle.net/10993/25026
Depth enhancement by fusion for passive and active sensing
English
Garcia, F. [Interdisciplinary Centre for Security, Reliability and Trust, University of Luxembourg, Luxembourg]
Aouada, D. [Interdisciplinary Centre for Security, Reliability and Trust, University of Luxembourg, Luxembourg]
Abdella, H. K. [Interdisciplinary Centre for Security, Reliability and Trust, University of Luxembourg, Luxembourg; Université de Bourgogne, France]
Solignac, T. [Advanced Engineering - IEE S.A., Luxembourg]
Mirbach, B. [Advanced Engineering - IEE S.A., Luxembourg]
Ottersten, Björn [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)]
2012
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
7585 LNCS
Part 3
506-515
Yes
International
0302-9743
12th European Conference on Computer Vision, ECCV 2012
7 October 2012 through 13 October 2012
Florence
[en] Active Sensing ; Depth enhancement ; Depth Map ; Depth measurements ; Depth sensing ; Depth value ; Homogeneous regions ; Passive and active sensing ; Passive sensing ; Sensing systems ; Special treatments ; Stereo cameras ; Time-of-flight cameras ; Triangulation method ; Cameras ; Computer vision ; Data fusion ; Sensors ; Passive filters
[en] This paper presents a general refinement procedure that enhances any given depth map obtained by passive or active sensing. Given a depth map, either estimated by triangulation methods or directly provided by the sensing system, and its corresponding 2-D image, we correct the depth values by separately treating regions with undesired effects such as empty holes, texture copying or edge blurring due to homogeneous regions, occlusions, and shadowing. In this work, we use recent depth enhancement filters intended for Time-of-Flight cameras, and adapt them to alternative depth sensing modalities, both active using an RGB-D camera and passive using a dense stereo camera. To that end, we propose specific masks to tackle areas in the scene that require a special treatment. Our experimental results show that such areas are satisfactorily handled by replacing erroneous depth measurements with accurate ones. © 2012 Springer-Verlag.
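The abstract describes replacing erroneous depth values inside masked problem regions (holes, occlusions, shadowed areas) using the corresponding 2-D image as guidance. As a rough illustration only, the sketch below fills invalid depth pixels with a color-weighted average of valid neighbors; it is a simplified stand-in for the cross-modal filters adapted in the paper, and the function name `enhance_depth` and all parameters are hypothetical.

```python
import numpy as np

def enhance_depth(depth, guide, invalid_val=0, radius=2, sigma_c=10.0):
    """Fill masked (invalid) depth pixels with a guidance-weighted
    average of valid neighbors. A toy sketch of mask-driven depth
    refinement, not the paper's actual filter."""
    h, w = depth.shape
    mask = depth == invalid_val          # pixels needing special treatment
    out = depth.astype(float).copy()
    for y, x in zip(*np.nonzero(mask)):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        d = out[y0:y1, x0:x1]
        g = guide[y0:y1, x0:x1]
        valid = depth[y0:y1, x0:x1] != invalid_val
        if not valid.any():
            continue
        # Weight valid neighbors by similarity to the guide pixel,
        # so depth is borrowed from the same surface, not across edges.
        wgt = np.exp(-((g - guide[y, x]) ** 2) / (2 * sigma_c ** 2)) * valid
        if wgt.sum() > 0:
            out[y, x] = (wgt * d).sum() / wgt.sum()
    return out
```

In the paper the masks are modality-specific (e.g. stereo occlusions vs. RGB-D holes); here a single invalid-value mask stands in for all of them.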
Google; National Robotics Engineering Center (NREC); Adobe; Microsoft Research; Mitsubishi Electric
10.1007/978-3-642-33885-4_51
93332
9783642338847

File(s) associated to this reference

Fulltext file(s):

File: Depth Enhancement by Fusion for Passive.pdf (Publisher postprint, 1.36 MB, Open access)


All documents in ORBilu are protected by a user license.