Pixel-Wise Warping for Deep Image Stitching
DOI:
https://doi.org/10.1609/aaai.v37i1.25202
Keywords:
CV: Applications
Abstract
Existing image stitching approaches based on global or local homography estimation are not free from the parallax problem and suffer from undesired artifacts. In this paper, instead of relying on a homography-based warp, we propose a novel deep image stitching framework that exploits a pixel-wise warp field to handle the large-parallax problem. The proposed framework consists of a Pixel-wise Warping Module (PWM) and a Stitched Image Generating Module (SIGMo). The PWM obtains the pixel-wise warp in a manner similar to optical flow (OF) estimation. In the stitching scenario, the input images usually include non-overlap (NOV) regions whose warp, unlike that of the overlap (OV) regions, cannot be directly estimated. To help the PWM predict a reasonable warp in the NOV regions, we impose two geometrical constraints: an epipolar loss and a line-preservation loss. With the obtained warp field, we relocate the pixels of the target image using forward warping. Finally, SIGMo is trained with the proposed multi-branch training framework to generate a stitched image from a reference image and a warped target image. To train and evaluate the proposed framework, we build and publish a novel dataset comprising image pairs with corresponding pixel-wise ground-truth warps and stitched result images. We show that the results of the proposed framework are quantitatively and qualitatively superior to those of conventional methods.
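The forward-warping step the abstract describes — relocating each target-image pixel by a dense warp field — can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's implementation: the function name `forward_warp` and the (H, W, 2) warp-field layout are assumptions, and it uses nearest-neighbor splatting with no occlusion handling or hole filling.

```python
import numpy as np

def forward_warp(target, warp):
    """Scatter each target pixel to its warped location.

    target: (H, W) or (H, W, C) image.
    warp:   (H, W, 2) per-pixel displacement, warp[..., 0] = dx, warp[..., 1] = dy.
    Pixels warped outside the canvas are discarded; unfilled pixels stay zero.
    """
    h, w = target.shape[:2]
    out = np.zeros_like(target)
    ys, xs = np.mgrid[0:h, 0:w]
    # Round displaced coordinates to the nearest integer pixel (splatting).
    xw = np.round(xs + warp[..., 0]).astype(int)
    yw = np.round(ys + warp[..., 1]).astype(int)
    valid = (xw >= 0) & (xw < w) & (yw >= 0) & (yw < h)
    out[yw[valid], xw[valid]] = target[ys[valid], xs[valid]]
    return out
```

With a zero warp field the image is unchanged; a constant horizontal warp shifts it, leaving zero-valued holes where no source pixel lands — which is why a module like SIGMo is needed to blend the warped target with the reference.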
Published
2023-06-26
How to Cite
Kweon, H., Kim, H., Kang, Y., Yoon, Y., Jeong, W., & Yoon, K.-J. (2023). Pixel-Wise Warping for Deep Image Stitching. Proceedings of the AAAI Conference on Artificial Intelligence, 37(1), 1196-1204. https://doi.org/10.1609/aaai.v37i1.25202
Issue
Section
AAAI Technical Track on Computer Vision I