Geospatial data has always been big data. Big data analytics is now available to let users analyze massive volumes of geospatial data. Petabyte archives for remotely sensed geodata were being planned in the 1980s, and growth has met expectations. Add to this the ever-increasing volume and reliability of real-time sensor observations, and the need for high-performance big data analytics for modeling and simulation of geospatially enabled content is greater than ever. In the past, limited access to the processing power that makes high-volume or high-velocity collection of geospatial data useful was a bottleneck for many applications. Workstations capable of fast geometric processing of vector geodata brought a revolution in GIS. Now big processing through cloud computing and analytics can make greater sense of data and deliver the promised value of imagery and all other types of geospatial information.
Cloud initiatives have accelerated lightweight client access to powerful processing services hosted at remote locations. The recent ESA/ESRIN “Big Data from Space” event addressed challenges posed by policies for dissemination, data search, sharing, transfer, mining, analysis, fusion and visualization, and a wide range of topics, scenarios and technical resources were discussed. In addition to the projects presented at that event, several other big data initiatives have been launched to increase capabilities for processing geospatial data: the European Commission’s Big Data Public Private Forum, the US National Science Foundation’s Big Data Science & Engineering program, and the US Office of Science and Technology Policy’s (OSTP) Big Earth Data Initiative (BEDI).
Source: http://www.opengeospatial.org/blog/1866