Joint Interactive Visualization of 3D Models and Pictures in Walkable Scenes

Brivio, Paolo; Tarini, Marco; Cignoni, P.; Scopigno, R.
2012

Abstract

The 3D digitization of buildings, urban scenes, and the like is now a mature technology. Highly complex, densely sampled, reasonably accurate 3D models can be obtained by range scanners and even by image-based reconstruction methods applied to dense image collections. Acquiring the bare geometry is not enough in Cultural Heritage applications, because the surface colors (e.g., pictorial data) are of central importance. Moreover, the 3D geometry cannot be expected to be complete: it typically lacks context, parts made of materials such as glass and metal, surfaces that are difficult to reach, etc. Easily captured photographs are the natural source of the appearance data missing from the 3D geometry. Despite the recent availability of reliable technologies to align 2D images to 3D data, the two sides of the dataset are not easy to combine satisfactorily in a visualization. Texture mapping, perhaps the most obvious candidate for the task, assumes a strict content consistency (3D to 2D, and 2D to 2D) that these datasets do not, and should not, exhibit: the value of the pictures lies precisely in their ability to capture details, lighting conditions, non-persistent items, etc. that are absent from the 3D model or from the other pictures. In this work, we present a simple but effective technique to jointly and interactively visualize 2D and 3D data of this kind. The technique is used within PhotoCloud [IV12], a flexible open-source tool designed to browse, navigate, and visualize large, remotely stored 3D-2D datasets, with an emphasis on scalability, usability, and the ability to cope with heterogeneous data from various sources.
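For readers unfamiliar with image-to-geometry registration, the sketch below illustrates the basic projective relation that such an alignment establishes: once a photograph is registered to the 3D model via a pinhole camera, any 3D surface point can be projected into the image to look up the color it contributes. This is only a generic, minimal illustration under standard pinhole-camera assumptions, not the blending or visualization technique proposed in the paper; the names K, R, t, project_point, and sample_color are hypothetical.

    # Minimal sketch (not the paper's method): project a 3D surface point into
    # a photograph registered to the model, and fetch the observed color.
    # K: 3x3 intrinsics, R: 3x3 rotation, t: 3-vector (world -> camera), all numpy arrays.
    import numpy as np

    def project_point(K, R, t, X):
        """Project a 3D point X (world coordinates) to pixel coordinates (u, v)."""
        x_cam = R @ X + t              # world -> camera coordinates
        if x_cam[2] <= 0:              # point behind the camera: not visible
            return None
        x_img = K @ x_cam              # camera -> homogeneous image coordinates
        return x_img[:2] / x_img[2]    # perspective divide -> pixel (u, v)

    def sample_color(image, K, R, t, X):
        """Return the photo color seen at 3D point X, or None if outside the frame."""
        uv = project_point(K, R, t, X)
        if uv is None:
            return None
        u, v = int(round(uv[0])), int(round(uv[1]))
        h, w = image.shape[:2]
        if 0 <= u < w and 0 <= v < h:
            return image[v, u]
        return None

In an interactive viewer this per-point lookup is typically done on the GPU via projective texturing; the point of the sketch is only to show what "aligned" means for a 2D-3D dataset.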
Files in this record:

File: EG_short_photocloud_projection.pdf (open access)
Type: Pre-print
License: DRM not defined
Size: 3.29 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11383/1790791