Immersive maps such as Google Street View and Bing Streetside provide true-to-life views built from massive collections of panoramas. However, these panoramas are available only at sparse intervals along the paths where they are captured, resulting in visual discontinuities during navigation. Prior work on view synthesis is usually built upon a set of perspective images, a pair of stereoscopic images, or a monocular image, but rarely examines wide-baseline panoramas, which are widely adopted by commercial platforms to reduce bandwidth and storage usage. In this paper, we leverage the unique characteristics of wide-baseline panoramas and present OmniSyn, a novel pipeline for 360° view synthesis between wide-baseline panoramas. OmniSyn predicts omnidirectional depth maps using a spherical cost volume and a monocular skip connection, renders meshes into 360° images, and synthesizes intermediate views with a fusion network. We demonstrate the effectiveness of OmniSyn through comprehensive experiments, including comparisons with state-of-the-art methods on the CARLA and Matterport datasets, ablation studies, and generalization studies on street views. We envision that our work may inspire future research on this overlooked real-world task and eventually produce a smoother experience for navigating immersive maps.
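The abstract names a spherical cost volume as the first stage of the pipeline. The paper text here gives no code, so the following is only a minimal NumPy sketch of sphere-sweep cost-volume construction between two equirectangular panoramas, under simplifying assumptions introduced purely for illustration: a translation-only relative pose (the baseline vector), nearest-neighbor sampling with longitude wrap-around, and an L1 photometric cost. Every function and parameter name below is hypothetical and not OmniSyn's actual implementation.

    import numpy as np

    def equirect_rays(h, w):
        """Unit ray directions for each pixel of an (h, w) equirectangular panorama."""
        v, u = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
        lon = (u + 0.5) / w * 2 * np.pi - np.pi       # longitude in [-pi, pi)
        lat = np.pi / 2 - (v + 0.5) / h * np.pi       # latitude, +pi/2 (top) to -pi/2
        x = np.cos(lat) * np.sin(lon)
        y = np.sin(lat)
        z = np.cos(lat) * np.cos(lon)
        return np.stack([x, y, z], axis=-1)           # (h, w, 3) unit vectors

    def dirs_to_pixels(d, h, w):
        """Inverse mapping: unit directions -> fractional equirectangular pixel coords."""
        lon = np.arctan2(d[..., 0], d[..., 2])
        lat = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))
        u = (lon + np.pi) / (2 * np.pi) * w - 0.5
        v = (np.pi / 2 - lat) / np.pi * h - 0.5
        return u, v

    def sphere_sweep_cost_volume(ref, src, baseline, depths):
        """L1 cost volume: warp src onto ref at each candidate depth on the sphere.

        ref, src: (h, w, 3) panoramas; baseline: src position in the ref frame
        (translation only, a simplifying assumption); depths: candidate radii.
        """
        h, w = ref.shape[:2]
        rays = equirect_rays(h, w)
        costs = []
        for depth in depths:
            pts = rays * depth                        # 3D points hypothesized at this depth
            pts_src = pts - baseline                  # express points in the src camera frame
            d = pts_src / np.linalg.norm(pts_src, axis=-1, keepdims=True)
            u, v = dirs_to_pixels(d, h, w)
            ui = np.round(u).astype(int) % w          # wrap longitude
            vi = np.clip(np.round(v).astype(int), 0, h - 1)
            warped = src[vi, ui]                      # nearest-neighbor warp of src
            costs.append(np.abs(ref.astype(np.float32)
                                - warped.astype(np.float32)).mean(-1))
        return np.stack(costs, axis=0)                # (num_depths, h, w)

    # Usage sketch: a winner-take-all argmin over the depth axis yields a coarse depth map.
    ref = np.random.rand(64, 128, 3).astype(np.float32)
    src = np.random.rand(64, 128, 3).astype(np.float32)
    depths = np.linspace(1.0, 20.0, 32)
    cv = sphere_sweep_cost_volume(ref, src, np.array([0.0, 0.0, 2.0]), depths)
    depth_map = depths[cv.argmin(axis=0)]             # (h, w) depth estimate

Per the abstract, OmniSyn does not stop at such a winner-take-all estimate: its depth network consumes the spherical cost volume together with a monocular skip connection to predict the final omnidirectional depth maps.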
@misc{Li2022OmnisynArxiv,
  doi       = {10.48550/ARXIV.2202.08752},
  url       = {https://arxiv.org/abs/2202.08752},
  author    = {Li, David and Zhang, Yinda and H\"{a}ne, Christian and Tang, Danhang and Varshney, Amitabh and Du, Ruofei},
  title     = {{OmniSyn}: Synthesizing 360 Videos with Wide-baseline Panoramas},
  publisher = {arXiv},
  year      = {2022}
}

@inproceedings{Li2022OmnisynVRW,
  author    = {Li, David and Zhang, Yinda and H\"{a}ne, Christian and Tang, Danhang and Varshney, Amitabh and Du, Ruofei},
  booktitle = {2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
  title     = {{OmniSyn}: Synthesizing 360 Videos with Wide-baseline Panoramas},
  year      = {2022},
  pages     = {670-671},
  doi       = {10.1109/VRW55335.2022.00186}
}