For geospatial applications, huge amounts of heterogeneous data sets of differing topology are nowadays collected with a variety of data acquisition techniques. Airborne and mobile-platform LiDAR data in particular are becoming ubiquitous, but SAR and stereophotogrammetry also contribute to the rapid growth of geotopographic data sets to sizes of tens to hundreds of terabytes. Because handling such large data volumes is difficult and fusing point clouds of heterogeneous provenance with rasters, volumetric data and 2D vector data is challenging, many of these new data sets are used inappropriately or not at all.
IQmulus therefore aims to enable the optimized use of large, heterogeneous geospatial data sets for better decision making through a high-volume fusion and analysis information management platform. This platform will transpose approaches and IT standards from distributed computing to enable distributed, service-oriented geospatial processing. The project will determine optimal execution and distribution parameters for different geospatial processing tasks and ensure that the IQmulus system can transparently execute processing on different architectures such as GPGPU clusters or clouds. Methods will be developed to connect processing and visualization in a tight loop, ensuring high interactivity so that users can better understand correlations between heterogeneous data sets.
The competence center Spatial Information Management contributes to the project with its expertise in cloud computing and the processing of large geospatial data. Staff members are developing the JobManager, a component that controls processing in the cloud, as well as the Workflow Editor, which allows end users to create workflows for the processing of geospatial data using a Domain-Specific Language. In addition, the competence center is responsible for the project's technical and scientific coordination (Scientific Manager) and for the infrastructure work package.
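To illustrate the idea of such a Domain-Specific Language, the following sketch shows what an end-user workflow for point-cloud processing might look like. The syntax and all operation names are purely hypothetical and do not represent the actual IQmulus DSL; the sketch only conveys the concept of declaratively chaining processing steps that the JobManager could then execute in the cloud.

```
# Hypothetical workflow sketch -- illustrative only, not the IQmulus DSL
clouds   = read_pointclouds("/data/lidar/*.laz")
filtered = remove_outliers(clouds, neighbors: 8)
terrain  = triangulate(filtered)
store(terrain, "/results/terrain")
```

In this style, the Workflow Editor would let domain experts compose such steps without programming knowledge, while the execution engine decides how to distribute each step across the available infrastructure.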