References of "Mirbach, B"
Multi-frame super-resolution by enhanced shift & add
Al Ismaeil, K.; Aouada, D.; Ottersten, B. et al

in International Symposium on Image and Signal Processing and Analysis, ISPA (2013)

A critical step in multi-frame super-resolution is the registration of frames based on their motion. We improve the performance of current state-of-the-art super-resolution techniques by proposing a more robust and accurate registration as early as in the initialization stage of the high-resolution estimate. Indeed, we solve the limitations on scale and motion inherent to the classical Shift & Add approach by upsampling the low-resolution frames up to the super-resolution factor prior to estimating motion or to median filtering. This is followed by an appropriate selective optimization, leading to an enhanced Shift & Add. Quantitative and qualitative evaluations have been conducted at two levels: the initial estimation and the final optimized super-resolution. Results show that the proposed algorithm outperforms existing state-of-the-art methods. © 2013 University of Trieste and University of Zagreb.
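
The core idea of the abstract, upsampling the low-resolution frames to the target grid before estimating motion and taking a pixel-wise median, can be illustrated with a short sketch. This is a hedged approximation assuming purely translational motion and OpenCV phase correlation for registration; it is not the authors' implementation, and the final selective optimization step is omitted.

```python
# Minimal sketch of the "upsample before registering" idea, assuming
# purely translational motion and float32 grayscale input; function
# names and parameters are illustrative, not taken from the paper.
import numpy as np
import cv2

def enhanced_shift_and_add(lr_frames, factor):
    """lr_frames: list of float32 grayscale low-resolution frames."""
    # 1) Upsample every LR frame to the super-resolution grid *before*
    #    any motion estimation or median filtering.
    up = [cv2.resize(f, None, fx=factor, fy=factor,
                     interpolation=cv2.INTER_CUBIC) for f in lr_frames]
    ref = up[0]
    registered = [ref]
    for f in up[1:]:
        # 2) Estimate the sub-pixel shift on the upsampled grid
        #    (phase correlation; sign convention may need flipping).
        (dx, dy), _ = cv2.phaseCorrelate(ref, f)
        # 3) Warp the frame back onto the reference grid (the "Shift" step).
        M = np.float32([[1, 0, -dx], [0, 1, -dy]])
        registered.append(cv2.warpAffine(f, M, (ref.shape[1], ref.shape[0])))
    # 4) Pixel-wise median of the registered stack (a robust "Add" step);
    #    the paper's subsequent selective optimization is omitted here.
    return np.median(np.stack(registered), axis=0)
```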

Spatio-temporal ToF data enhancement by fusion
Garcia, F.; Aouada, D.; Mirbach, B. et al

in Proceedings - International Conference on Image Processing, ICIP (2012)

We propose an extension of our previous work on spatial-domain Time-of-Flight (ToF) data enhancement to the temporal domain. Our goal is to generate enhanced depth maps at the same frame rate as the 2-D camera that, coupled with a ToF camera, constitutes a hybrid ToF multi-camera rig. To that end, we first estimate the motion between consecutive 2-D frames, and then use it to predict their corresponding depth maps. The enhanced depth maps result from the fusion between the recorded 2-D frames and the predicted depth maps using our previous contribution on ToF data enhancement. The experimental results show that the proposed approach overcomes the ToF camera drawbacks, namely low spatial and temporal resolution and a high level of noise in the depth measurements, providing enhanced depth maps at video frame rate. © 2012 IEEE.
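
A rough sketch of the prediction-and-fusion step described above: dense optical flow between consecutive 2-D frames warps the last available ToF depth map to the time of the new 2-D frame, and a joint bilateral filter guided by the 2-D image (from opencv-contrib's ximgproc module) stands in for the authors' own fusion filter. All names and parameter values are assumptions for illustration.

```python
# Hedged sketch: predict a depth map for the new 2-D frame from the
# previous one via optical flow, then fuse it with the 2-D frame.
# Requires opencv-contrib-python for cv2.ximgproc.
import numpy as np
import cv2

def predict_and_fuse(prev_gray, cur_gray, prev_depth):
    """prev_gray, cur_gray: consecutive 8-bit grayscale 2-D frames;
    prev_depth: float32 depth map aligned with prev_gray."""
    # Dense motion between the two 2-D frames (Farneback optical flow).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, cur_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_depth.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Warp the previous depth map towards the current frame
    # (the sign of the flow may need flipping depending on convention).
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    predicted_depth = cv2.remap(prev_depth, map_x, map_y, cv2.INTER_LINEAR)
    # Fuse the predicted depth with the recorded 2-D frame: a joint
    # bilateral filter guided by the image, standing in for the
    # authors' ToF enhancement filter.
    guide = cur_gray.astype(np.float32)
    return cv2.ximgproc.jointBilateralFilter(guide, predicted_depth, 9, 25.0, 9.0)
```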

Depth enhancement by fusion for passive and active sensing
Garcia, F.; Aouada, D.; Abdella, H. K. et al

in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (2012), 7585 LNCS (Part 3), 506-515

This paper presents a general refinement procedure that enhances any given depth map obtained by passive or active sensing. Given a depth map, either estimated by triangulation methods or directly provided by the sensing system, and its corresponding 2-D image, we correct the depth values by separately treating regions with undesired effects such as empty holes, texture copying, or edge blurring due to homogeneous regions, occlusions, and shadowing. In this work, we use recent depth enhancement filters intended for Time-of-Flight cameras and adapt them to alternative depth sensing modalities, both active using an RGB-D camera and passive using a dense stereo camera. To that end, we propose specific masks to tackle areas in the scene that require a special treatment. Our experimental results show that such areas are satisfactorily handled by replacing erroneous depth measurements with accurate ones. © 2012 Springer-Verlag.
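
The mask-based refinement can be illustrated with a small sketch: unreliable pixels (here, only empty holes where the sensor reports no depth) are detected and replaced by a validity-weighted local average, while trusted measurements are left untouched. The paper instead adapts 2-D-guided ToF enhancement filters to each masked region type; the simple fill below is only a stand-in under that assumption.

```python
# Illustrative sketch of masking problematic regions and correcting
# only those pixels; the normalized-convolution fill is a stand-in
# for the 2-D-guided enhancement filters used in the paper.
import numpy as np
import cv2

def refine_depth(depth, ksize=9):
    """depth: float32 depth map where invalid pixels are <= 0."""
    hole_mask = depth <= 0
    valid = (~hole_mask).astype(np.float32)
    # Normalized convolution: average each neighbourhood over valid pixels only.
    num = cv2.GaussianBlur(depth * valid, (ksize, ksize), 0)
    den = cv2.GaussianBlur(valid, (ksize, ksize), 0)
    filled = np.where(den > 0, num / np.maximum(den, 1e-6), 0.0)
    # Replace only the masked (hole) pixels; keep reliable measurements.
    refined = depth.copy()
    refined[hole_mask] = filled[hole_mask]
    return refined
```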
