56 results for "Kainz, Wolfgang"
Search Results
2. Development Density-Based Optimization Modeling of Sustainable Land Use Patterns.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Ligmann-Zielinska, Arika, Church, Richard, and Jankowski, Piotr
- Abstract
Current land use patterns with low-density, single-use, and leapfrogging urban growth on city outskirts call for more efficient land use development strategies balancing economy, environmental protection, and social equity. In this paper, we present a new spatial multiobjective optimization model with a constraint based on the level of neighborhood development density. The constraint encourages infill development and land use compatibility by requiring compact and contiguous land use allocation. The multiobjective optimization model presented in this paper minimizes the conflicting objectives of open space development, infill and redevelopment, land use neighborhood compatibility, and cost distance to already urbanized areas. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
3. The 'stroke' Concept in Geographic Network Generalization and Analysis.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., and Thomson, Robert C.
- Abstract
Strokes are relatively simple linear elements readily perceived in a network. Apart from their role as graphical elements, strokes reflect lines of flow or movement within the network itself and so constitute natural functional units. Since the functional importance of a stroke is reflected in its perceived salience, strokes are a suitable basis for network generalization, through the preferential preservation of salient strokes during data reduction. In this paper the dual functional-graphical nature of strokes is explored by way of perceptual grouping in generalization. The identification and use of strokes are then described. The strengths and limitations of stroke-based generalization are discussed, and how the technique may be developed is also considered. Finally, the functional role of strokes in networks is highlighted by a look at recent developments in space syntax and related studies. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
4. Analysis of Cross Country Trafficability.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Sivertun, Åke, and Gumos, Aleksander
- Abstract
Many decisions — not only in emergency management or military operations — nowadays require, in addition to the judgment itself, large amounts of spatial and geographical information. If these data are handled in Geographical Information Systems (GIS), new possibilities arise for handling and analyzing this type of information in ways that differ substantially from the traditional handling of paper maps. A Geographical Information System is an information system capable not only of handling currently produced digital maps in raster and vector formats but also of analyzing them, for instance together with remote sensing techniques, GPS positioning, and real-time intelligence reports. The development of societies in parallel with globalization and global dependencies, symptoms of climate change, ageing populations, and more complex societies and systems also leads to a greater demand for more sophisticated information and information systems (Trnka 2003; Trnka et al. 2005a and 2005b; Quarantelli 1999; Rubin 1998; Rubin 2000; Kiranoudis et al. 2002; Mendonça et al. 2001; Beroggi 2001; Johnson 2002). The research teams at IDA/LiU have extensive experience in testing various forms of data capture, real-time analysis, and dissemination of geographically registered data through, for example, mobile GIS technology. However, we have found it necessary to develop both fully GIS-based models and the supporting data they require, in order to improve all crucial phases of emergency management scenario reasoning during the preventive and information-provision stages. Information can thus be regarded as a strategic infrastructure, which is now being investigated at the Swedish national level as well as by the European Union, for example through the European Network of Excellence GMOSS. One goal of GMOSS is to investigate good procedures for emergency management and crisis response and, as a consequence, to build standardized and harmonized geographical databases that can be used in decision support systems. What is still to be added to the agenda is the implementation of GIS-based models that use those databases to predict potential hazards and to support preventive work and action plans. The objectives of this article are to contribute to the development of such models, to investigate the data needed for rescue, relief, and preventive work, and to stress the need for up-to-date data when structuring better preparedness plans. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
5. Building an Integrated Cadastral Fabric for Higher Resolution Socioeconomic Spatial Data Analysis.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Schuurman, Nadine, Leszczynski, Agnieszka, Fiedler, Rob, Grund, Darrin, and Bell, Nathaniel
- Published
- 2006
- Full Text
- View/download PDF
6. Scale-Dependent Definitions of Gradient and Aspect and their Computation.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Reinbacher, Iris, Kreveld, Marc, and Benkert, Marc
- Abstract
In order to compute lines of constant gradient and areas of constant aspect on a terrain, we introduce the notion of scale dependent local gradient and aspect for a neighborhood around each point of a terrain. We present three definitions for local gradient and aspect, and give efficient algorithms to compute them. We have implemented our algorithms for grid data and we compare the results for all methods. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
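The abstract above (entry 6) defines local gradient and aspect over a neighborhood around each terrain point. As one plausible reading, the sketch below fits a least-squares plane over a square window whose radius plays the role of the scale parameter; the window shape, the plane-fitting choice and all function names are assumptions for illustration, not the chapter's three formal definitions or its efficient algorithms.

```python
import numpy as np

def local_gradient_aspect(dem, cell_size=1.0, radius=2):
    """Scale-dependent gradient magnitude and aspect.

    For each interior cell a plane z = a*x + b*y + c is fit by least squares
    over a (2*radius+1)^2 neighborhood; 'radius' acts as the scale parameter.
    Illustrative only.
    """
    rows, cols = dem.shape
    offs = np.arange(-radius, radius + 1) * cell_size
    xs, ys = np.meshgrid(offs, offs)              # local window coordinates
    sxx = (xs ** 2).sum()                         # identical for every cell
    syy = (ys ** 2).sum()

    grad = np.full(dem.shape, np.nan)
    aspect = np.full(dem.shape, np.nan)
    for i in range(radius, rows - radius):
        for j in range(radius, cols - radius):
            win = dem[i - radius:i + radius + 1, j - radius:j + radius + 1]
            a = (xs * win).sum() / sxx            # dz/dx of the fitted plane
            b = (ys * win).sum() / syy            # dz/dy of the fitted plane
            grad[i, j] = np.hypot(a, b)
            aspect[i, j] = (np.degrees(np.arctan2(a, b)) + 360.0) % 360.0
    return grad, aspect

if __name__ == "__main__":
    z = np.fromfunction(lambda r, c: 0.05 * r + 0.02 * c, (50, 50))
    g, asp = local_gradient_aspect(z, cell_size=10.0, radius=3)
    print(round(float(g[25, 25]), 4), round(float(asp[25, 25]), 1))
```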
7. Tradeoffs when Multiple Observer Siting on Large Terrain Cells.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Franklin, W. Randolph, and Vogt, Christian
- Abstract
This paper demonstrates a toolkit for siting multiple observers so as to maximize their joint viewshed on high-resolution gridded terrains of up to 2402 × 2402 cells, with viewshed radii of up to 1000. It shows that approximate (rather than exact) visibility indexes of observers are sufficient for siting multiple observers. It also shows that, when selecting potential observers, geographic dispersion is more important than maximum estimated visibility, and it quantifies this. Applications of optimal multiple observer siting include radio towers, terrain observation, and mitigation of environmental visual nuisances. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
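Entry 7 above reports that, when picking candidate observers, geographic dispersion matters more than maximum estimated visibility. The sketch below shows a greedy siting loop built on that observation; the candidate record layout (`xy`, `vis_index`, `viewshed`) and the minimum-separation rule are invented placeholders, not the toolkit evaluated in the chapter.

```python
import numpy as np

def site_observers(candidates, k, min_separation):
    """Greedy multiple-observer siting with a crude dispersion rule.

    candidates: list of dicts with 'xy' (shape-(2,) array), 'vis_index'
    (approximate visibility estimate) and 'viewshed' (boolean grid mask).
    Candidates are scanned in order of decreasing estimated visibility, but
    one is accepted only if it keeps a minimum distance to every observer
    already sited.
    """
    ranked = sorted(candidates, key=lambda c: c["vis_index"], reverse=True)
    chosen, joint = [], None
    for cand in ranked:
        if len(chosen) == k:
            break
        if any(np.linalg.norm(cand["xy"] - c["xy"]) < min_separation
               for c in chosen):
            continue                      # too close: dispersion rule wins
        chosen.append(cand)
        joint = cand["viewshed"].copy() if joint is None else joint | cand["viewshed"]
    coverage = float(joint.mean()) if joint is not None else 0.0
    return chosen, coverage

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cands = [{"xy": rng.uniform(0, 100, 2),
              "vis_index": rng.random(),
              "viewshed": rng.random((64, 64)) < 0.1} for _ in range(200)]
    sites, cov = site_observers(cands, k=5, min_separation=25.0)
    print(len(sites), round(cov, 3))
```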
8. I/O-Efficient Hierarchical Watershed Decomposition of Grid Terrain Models.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Arge, Lars, Danner, Andrew, Haverkort, Herman, and Zeh, Norbert
- Abstract
Recent progress in remote sensing has made massive amounts of high resolution terrain data readily available. Often the data is distributed as regular grid terrain models where each grid cell is associated with a height. When terrain analysis applications process such massive terrain models, data movement between main memory and slow disk (I/O), rather than CPU time, often becomes the performance bottleneck. Thus it is important to consider I/O-efficient algorithms for fundamental terrain problems. One such problem is the hierarchical decomposition of a grid terrain model into watersheds—regions where all water flows towards a single common outlet. Several different hierarchical watershed decomposition schemes have been described in the hydrology literature. One important such scheme is the Pfafstetter label method, where each watershed is assigned a unique label and each grid cell is assigned a sequence of labels corresponding to the (nested) watersheds to which it belongs. In this paper we present an I/O-efficient algorithm for computing the Pfafstetter label of each cell of a grid terrain model. The algorithm uses O(sort(T)) I/Os, the number of I/Os needed to sort T elements, where T is the total length of the cell labels. To our knowledge, our algorithm is the first efficient algorithm for the problem. We also present the results of an experimental study using massive real-life terrain data that shows our algorithm is practically as well as theoretically efficient. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
9. An Evaluation of Spatial Interpolation Accuracy of Elevation Data.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., and Qihao Weng
- Abstract
This paper makes a general evaluation of the spatial interpolation accuracy of elevation data. Six common interpolators were examined, including Kriging, inverse distance to a power, minimum curvature, modified Shepard's method, radial basis functions, and triangulation with linear interpolation. The main properties and mathematical procedures of the interpolation algorithms were reviewed. In order to obtain a full evaluation of the interpolations, both statistical measures (including root-mean-square error, standard deviation, and mean) and spatial accuracy measures (including the accuracy surface and spatial autocorrelation) were employed. It is found that the accuracy of spatial interpolation of elevations is primarily subject to the input data point density and distribution, the grid size (resolution), the terrain complexity, and the interpolation algorithm used. Variations in interpolation parameters may significantly improve or worsen the accuracy. Further research is needed to examine the impacts of terrain complexity in detail and to evaluate various data sampling strategies. The combined use of variogram models, accuracy surfaces, and spatial autocorrelation represents a promising direction in mapping spatial data accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
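Entry 9 above evaluates interpolators with statistical measures such as the root-mean-square error. Below is a minimal sketch of that style of evaluation for one listed method, inverse distance to a power, against a held-out subset of sample points; the power, the synthetic surface, and the train/test split are arbitrary choices, not the paper's experimental setup.

```python
import numpy as np

def idw(train_xy, train_z, query_xy, power=2.0, eps=1e-12):
    """Inverse-distance-weighted prediction at query points."""
    d = np.linalg.norm(query_xy[:, None, :] - train_xy[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * train_z[None, :]).sum(axis=1) / w.sum(axis=1)

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 1000, size=(400, 2))                 # sample locations
    z = 50 + 0.02 * xy[:, 0] + 10 * np.sin(xy[:, 1] / 100)   # synthetic terrain
    test = rng.choice(len(xy), size=80, replace=False)       # held-out points
    mask = np.ones(len(xy), bool)
    mask[test] = False
    pred = idw(xy[mask], z[mask], xy[test], power=2.0)
    print("IDW RMSE:", round(rmse(pred, z[test]), 3))
```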
10. Use of Plan Curvature Variations for the Identification of Ridges and Channels on DEM.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., and Rana, Sanjay
- Abstract
This paper proposes novel improvements to the traditional algorithms for the identification of ridge and channel (also called ravine) topographic features on raster digital elevation models (DEMs). The overall methodology consists of two main steps: (1) smoothing the DEM by applying a mean filter, and (2) detecting ridge and channel features as cells with positive and negative plan curvature, respectively, together with a decline or incline in plan curvature away from the cell in the direction orthogonal to the feature axis. The paper demonstrates a simple approach to visualize the multi-scale structure of terrains and utilize it for semiautomated topographic feature identification. Despite its simplicity, the revised algorithm produced markedly superior outputs to those of a comparatively sophisticated feature extraction algorithm based on conic-section analysis of terrain. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
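Entry 10 above describes two steps: mean-filter smoothing, then labeling cells by plan curvature. The sketch below uses a standard derivative-based plan-curvature formula and a simple sign test; the formula, sign convention, and thresholds are textbook assumptions rather than the author's exact algorithm, and the decline/incline refinement and the multi-scale visualization are omitted.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def plan_curvature(dem, cell_size=1.0):
    """Plan (contour) curvature from first and second derivatives."""
    fy, fx = np.gradient(dem, cell_size)          # d/drow, d/dcol
    fyy, fyx = np.gradient(fy, cell_size)
    fxy, fxx = np.gradient(fx, cell_size)
    num = fxx * fy**2 - 2.0 * fxy * fx * fy + fyy * fx**2
    den = (fx**2 + fy**2) ** 1.5 + 1e-12
    return -num / den

def ridges_and_channels(dem, cell_size=1.0, smooth=5, threshold=1e-4):
    """Label cells as ridge (+1), channel (-1) or neither (0)."""
    smoothed = uniform_filter(dem, size=smooth)   # step 1: mean filter
    kp = plan_curvature(smoothed, cell_size)      # step 2: plan curvature
    labels = np.zeros(dem.shape, dtype=np.int8)
    labels[kp > threshold] = 1                    # convex across slope: ridge
    labels[kp < -threshold] = -1                  # concave across slope: channel
    return labels

if __name__ == "__main__":
    y, x = np.mgrid[0:200, 0:200]
    dem = 30 * np.sin(x / 25.0) + 0.1 * y         # synthetic ridged surface
    lab = ridges_and_channels(dem, cell_size=10.0, smooth=7)
    print("ridge cells:", int((lab == 1).sum()),
          "channel cells:", int((lab == -1).sum()))
```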
11. From Point Cloud to Grid DEM: A Scalable Approach.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Agarwal, Pankaj K., Arge, Lars, and Danner, Andrew
- Abstract
Given a set S of points in ℝ³ sampled from an elevation function H : ℝ² → ℝ, we present a scalable algorithm for constructing a grid digital elevation model (DEM). Our algorithm consists of three stages: First, we construct a quad tree on S to partition the point set into a set of non-overlapping segments. Next, for each segment q, we compute the set of points in q and all segments neighboring q. Finally, we interpolate each segment independently using points within the segment and its neighboring segments. Data sets acquired by LIDAR and other modern mapping technologies consist of hundreds of millions of points and are too large to fit in main memory. When processing such massive data sets, the transfer of data between disk and main memory (also called I/O), rather than the CPU time, becomes the performance bottleneck. We therefore present an I/O-efficient algorithm for constructing a grid DEM. Our experiments show that the algorithm scales to data sets much larger than the size of main memory, while existing algorithms do not scale. For example, using a machine with 1GB RAM, we were able to construct a grid DEM containing 1.3 billion cells (occupying 1.2GB) from a LIDAR data set of over 390 million points (occupying 20GB) in about 53 hours. Neither ArcGIS nor GRASS, two popular GIS products, were able to process this data set. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
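Entry 11 above partitions the point set into segments, gathers each segment's neighbors, and interpolates each segment independently. The toy in-memory sketch below keeps only that tiling idea, using a regular bucket grid instead of the paper's quad tree and a nearest-point fill instead of its interpolator; the I/O-efficient machinery that is the actual contribution is not reproduced.

```python
import numpy as np
from collections import defaultdict

def points_to_grid(points, cell=10.0, tile=16):
    """points: (N, 3) array of x, y, z samples.  Returns a grid DEM filled
    tile by tile; each tile uses only the points falling in it or in its
    eight neighboring tiles (cells in tiles without any points stay NaN)."""
    xy, z = points[:, :2], points[:, 2]
    x0, y0 = xy.min(axis=0)
    nx = int(np.ceil((xy[:, 0].max() - x0) / cell)) + 1
    ny = int(np.ceil((xy[:, 1].max() - y0) / cell)) + 1

    buckets = defaultdict(list)                     # tile index -> point ids
    tix = ((xy[:, 0] - x0) / (cell * tile)).astype(int)
    tiy = ((xy[:, 1] - y0) / (cell * tile)).astype(int)
    for idx, key in enumerate(zip(tix, tiy)):
        buckets[key].append(idx)

    dem = np.full((ny, nx), np.nan)
    for (tx, ty) in list(buckets.keys()):
        # gather points of this tile and its neighbors
        ids = [i for dx in (-1, 0, 1) for dy in (-1, 0, 1)
               for i in buckets.get((tx + dx, ty + dy), [])]
        pxy, pz = xy[ids], z[ids]
        for gx in range(tx * tile, min((tx + 1) * tile, nx)):
            for gy in range(ty * tile, min((ty + 1) * tile, ny)):
                cx, cy = x0 + gx * cell, y0 + gy * cell
                d2 = (pxy[:, 0] - cx) ** 2 + (pxy[:, 1] - cy) ** 2
                dem[gy, gx] = pz[np.argmin(d2)]      # nearest-point fill
    return dem

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pts = rng.uniform(0, 500, size=(5000, 2))
    sample = np.c_[pts, 0.01 * pts[:, 0] + np.sin(pts[:, 1] / 40)]
    print(points_to_grid(sample, cell=10.0, tile=8).shape)
```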
12. Capturing and Representing Conceptualization Uncertainty Interactively using Object-Fields.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Voudouris, Vlasios, Fisher, Peter F., and Wood, Jo
- Abstract
We present a method for representing, recording and managing conceptualization uncertainty. We review components of uncertainty associated with semantics and metadata. We present a way of recording and visualizing uncertainty using sketching and suggest a framework for recording and managing uncertainty and associated semantics using Object-Fields. A case study is also used to demonstrate a software prototype that shows proof of concept. We conclude by identifying future research challenges in terms of supporting dynamic exploration of uncertainty, semantics and field objects. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
13. Modeling Uncertainty in Knowledge Discovery for Classifying Geographic Entities with Fuzzy Boundaries.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Feng Qi, and Zhu, A.-Xing
- Abstract
Boosting is a machine learning strategy originally designed to increase classification accuracies of classifiers through inductive learning. This paper argues that this strategy of learning and inference actually corresponds to a cognitive model that explains the uncertainty associated with class assignments for classifying geographic entities with fuzzy boundaries. This paper presents a study that adopts the boosting strategy in knowledge discovery, which allows for the modeling and mapping of such uncertainty when the discovered knowledge is used for classification. A case study of knowledge discovery for soil classification proves the effectiveness of this approach. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
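Entry 13 above uses boosting and treats the ensemble's output as a handle on class-assignment uncertainty. A hedged sketch of that general idea with scikit-learn's AdaBoost, scoring per-sample uncertainty as the entropy of the predicted class probabilities; the synthetic data and the entropy score are assumptions, not the chapter's knowledge-discovery pipeline or its fuzzy soil classes.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Synthetic stand-in for terrain attributes -> soil class training data.
X, y = make_classification(n_samples=600, n_features=6, n_informative=4,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X[:500], y[:500])

proba = clf.predict_proba(X[500:])                 # class membership estimates
# Shannon entropy of the predicted distribution as a per-sample uncertainty score
entropy = -(proba * np.log(proba + 1e-12)).sum(axis=1)
print("mean uncertainty:", round(float(entropy.mean()), 3))
print("most uncertain sample:", int(entropy.argmax()))
```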
14. The Influence of Uncertainty Visualization on Decision Making: An Empirical Evaluation.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Deitrick, Stephanie, and Edsall, Robert
- Abstract
Uncertainty visualization is a research area that integrates visualization with the study of uncertainty. Many techniques have been developed for representing uncertainty, and there have been many participant-based empirical studies evaluating the effectiveness of specific techniques. However, there is little empirical evidence to suggest that uncertainty visualization influences, or results in, different decisions. Through a human-subjects experiment, this research evaluates whether specific uncertainty visualization methods, including texture and value, influence decisions and a user's confidence in their decisions. The results of this study indicate that uncertainty visualization may affect decisions, but the degree of influence is affected by how the uncertainty is expressed. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
15. An Integrated Cloud Model for Measurement Errors and Fuzziness.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Cheng, Tao, Zhilin Li, Li, Deren, and Li, Deyi
- Abstract
Two kinds of uncertainty, measurement errors and concept (or classification) fuzziness, can be differentiated in GIS data. There are many tools to handle them separately. However, an integrated model is needed to assess their combined effect in GIS analysis (such as classification and overlay) and to assess the plausible effects on subsequent decision-making. The cloud model sheds light on the integrated modeling of fuzziness and randomness, but how to adapt the cloud model to GIS uncertainties needs to be investigated. This paper therefore proposes an integrated formal model for measurement errors and fuzziness based upon the cloud model. It addresses the physical meaning of the cloud model parameters and provides guidelines for setting their values. Using this new model, via multi-criteria reasoning, the combined effect of uncertainty in data and classification on subsequent decision-making can be assessed through statistical indicators, which can be used for quality assurance. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
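Entry 15 above builds on the cloud model, whose basic forward generator is well documented in the literature: from expectation Ex, entropy En, and hyper-entropy He it produces "cloud drops" that mix fuzziness and randomness. The sketch below is that standard generator only; the paper's integration with measurement error and its multi-criteria reasoning are not shown.

```python
import numpy as np

def normal_cloud(ex, en, he, n=1000, rng=None):
    """Forward normal cloud generator.

    ex : expectation of the concept              (Ex)
    en : entropy, the fuzziness of the concept   (En)
    he : hyper-entropy, the randomness of En     (He)
    Returns drop positions x and their membership degrees mu.
    """
    rng = np.random.default_rng(rng)
    en_prime = rng.normal(en, he, size=n)        # randomized entropy
    x = rng.normal(ex, np.abs(en_prime))         # drop position
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2 + 1e-12))
    return x, mu

if __name__ == "__main__":
    # e.g. the fuzzy concept "about 100 m" observed with some instrument noise
    x, mu = normal_cloud(ex=100.0, en=5.0, he=0.8, n=2000, rng=42)
    print(round(float(x.mean()), 2), round(float(mu.mean()), 3))
```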
16. Skeleton Based Contour Line Generalization.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Matuk, Krzysztof, Gold, Christopher, and Zhilin Li
- Abstract
Contour lines are a widely utilized representation of terrain models in both cartography and Geographical Information Systems (GIS). Since they are often presented at different scales, there is a need for generalization techniques. In this paper an algorithm for the generalization of contour lines based on skeleton pruning is presented. The algorithm is based on the boundary residual function and retraction of the skeleton of contour lines. The novelty of this method lies in pruning not only the internal skeleton branches, but also those skeleton branches placed outside the closed contour polygon. In contrast to the original method, which was designed for closed shapes, this approach is also capable of handling open polygonal chains. A simplified version of the skeleton is extracted in the first step of the algorithm, and in the next step a simpler boundary is computed. As shown in this paper, the simpler boundary can be found in three different ways: detection of stable vertices, computation of an average vertex, and approximation of the boundary by Bézier splines. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
17. Grid Typification.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., and Anders, Karl-Heinrich
- Abstract
In this paper the detection and typification of grid structures in building groups is described. Typification is a generalization operation that replaces a large number of similar objects by a smaller number of objects, while preserving the global structure of the object distribution. The typification approach is based on three processes. First, the grid structures are detected based on the so-called relative neighborhood graph. Second, the detected grid structures are regularized by a least-squares adjustment of an affine or Helmert transformation. The third process is the reduction or simplification of the grid structure, which can be done using the same affine or Helmert transformation approach. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
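Entry 17 above regularizes detected grid structures by a least-squares adjustment of an affine or Helmert transformation. The sketch below shows the classical 2D four-parameter Helmert fit such a step could rely on, estimated from corresponding points; the detection via the relative neighborhood graph and the typification step itself are omitted, and the example data are invented.

```python
import numpy as np

def fit_helmert(src, dst):
    """Least-squares 2D Helmert (similarity) transform dst ~ [a -b; b a] src + t.

    src, dst : (N, 2) arrays of corresponding points.
    Returns (a, b, tx, ty); scale = hypot(a, b), rotation = atan2(b, a).
    """
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2, 0], A[0::2, 1], A[0::2, 2] = src[:, 0], -src[:, 1], 1.0   # x equations
    A[1::2, 0], A[1::2, 1], A[1::2, 3] = src[:, 1],  src[:, 0], 1.0   # y equations
    obs = dst.reshape(-1)                        # observations (x1, y1, x2, y2, ...)
    params, *_ = np.linalg.lstsq(A, obs, rcond=None)
    return params                                # a, b, tx, ty

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    ideal = np.array([(i, j) for i in range(4) for j in range(3)], float) * 20.0
    noisy = ideal * 1.02 + rng.normal(0, 0.5, ideal.shape) + np.array([100.0, 50.0])
    a, b, tx, ty = fit_helmert(noisy, ideal)     # map the noisy grid onto the ideal one
    print("scale:", round(float(np.hypot(a, b)), 3),
          "rotation (rad):", round(float(np.arctan2(b, a)), 4))
```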
18. The Hierarchical Watershed Partitioning and Data Simplification of River Network.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Ai, Tinghua, Liu, Yaolin, and Chen, Jun
- Abstract
For the generalization of a river network, deciding the importance of river channels in a catchment has to consider three aspects at different levels: the spatial distribution pattern at the macro level, the distribution density at the meso level, and the individual geometric properties at the micro level. To extract such structured information, this study builds a model of hierarchical watershed partitioning based on Delaunay triangulation. The watershed area is determined by a spatial competition process that applies a Voronoi-like partitioning to obtain the basin polygon of each river channel. A hierarchical relation is constructed to represent the inclusion between watersheds at different levels. This model supports the computation of parameters such as distribution density, the distance between neighboring channels, and the hierarchical watershed area. The study presents a method to select river channels by a watershed-area threshold. An experiment on real river data shows that the method produces good generalization results. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
19. 3D Analysis with High-Level Primitives: A Crystallographic Approach.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Poupeau, Benoit, and Bonin, Olivier
- Abstract
This paper introduces a new approach to the 3D handling of geographical information in the context of risk analysis. We propose to combine several geometrical and topological models for 3D data to take advantage of their respective capabilities. In addition, we adapt from crystallography a high-level description of geographical features that makes it possible to compute several metric and cardinal relations, such as the "lay on" relation, which plays a key role for geographical information. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
20. A Tetrahedronized Irregular Network Based DBMS Approach for 3D Topographic Data Modeling.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Penninga, Friso, Oosterom, Peter, and Kazar, Baris M.
- Abstract
Topographic features such as physical objects become more complex due to increasing multiple land use. Increasing awareness of the importance of sustainable (urban) development leads to the need for 3D planning and analysis. As a result, topographic products need to be extended into the third dimension. In this paper, we developed a new topological 3D data model that relies on Poincaré algebra. The internal structure is based on a network of simplexes, which are well defined, and very suitable for keeping the 3D data set consistent. More complex 3D features are based on this simple structure and computed when needed. We describe an implementation of this 3D model on a commercial DBMS. We also show how a 2D visualizer can be extended to visualize these 3D objects. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
21. A Flexible, Extensible Object Oriented Real-time Near Photorealistic Visualization System: The System Framework Design.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Jones, Anthony, and Cornford, Dan
- Abstract
In this paper we describe a novel, extensible visualization system currently under development at Aston University. We introduce modern programming methods, such as the use of data driven programming, design patterns, and the careful definition of interfaces to allow easy extension using plug-ins, to 3D landscape visualization software. We combine this with modern developments in computer graphics, such as vertex and fragment shaders, to create an extremely flexible, extensible real-time near photorealistic visualization system. In this paper we show the design of the system and the main sub-components. We stress the role of modern programming practices and illustrate the benefits these bring to 3D visualization. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
22. Automated Construction of Urban Terrain Models.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Buchholz, Henrik, Döllner, Jürgen, Ross, Lutz, and Kleinschmit, Birgit
- Abstract
Elements of urban terrain models such as streets, pavements, lawns, walls, and fences are fundamental for effective recognition and convincing appearance of virtual 3D cities and virtual 3D landscapes. These elements complement important other components such as 3D building models and 3D vegetation models. This paper introduces an object-oriented, rule-based and heuristic-based approach for modeling detailed virtual 3D terrains in an automated way. Terrain models are derived from 2D vector-based plans based on generation rules, which can be controlled by attributes assigned to 2D vector elements. The individual parts of the resulting urban terrain models are represented as "first-class" objects. These objects remain linked to the underlying 2D vector-based plan elements and, therefore, preserve data semantics and associated thematic information. With urban terrain models, we can achieve high-quality photorealistic 3D geovirtual environments and support interactive creation and manipulation. The automated construction represents a systematic solution for the bi-directional linkage of 2D plans and 3D geovirtual environments and overcomes cost-intensive CAD-based construction processes. The approach both simplifies the geometric construction of detailed urban terrain models and provides a seamless integration into traditional GIS-based workflows. The resulting 3D geovirtual environments are well suited for a variety of applications including urban and open-space planning, information systems for tourism and marketing, and navigation systems. As a case study, we demonstrate our approach applied to an urban development area of downtown Potsdam, Germany. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
23. A Linear Programming Approach to Rectangular Cartograms.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Speckmann, Bettina, Kreveld, Marc, and Florisson, Sander
- Abstract
In [26], the first two authors of this paper presented the first algorithms to construct rectangular cartograms. The first step is to determine a representation of all regions by rectangles and the second — most important — step is to get the areas of all rectangles correct. This paper presents a new approach to the second step. It is based on alternatingly solving linear programs on the x-coordinates and the y-coordinates of the sides of the rectangles. Our algorithm gives cartograms with considerably lower error and better visual qualities than previous approaches. It also handles countries that cannot be present in any purely rectangular cartogram and it introduces a new way of controlling incorrect adjacencies of countries. Our implementation computes aesthetically pleasing rectangular and nearly rectangular cartograms, for instance depicting the 152 countries of the World that have population over one million. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
24. A Tangible Augmented Reality Interface to Tiled Street Maps and its Usability Testing.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., and Moore, Antoni
- Abstract
The Tangible Augmented Street Map (TASM) is a novel interface to geographic objects, such as tiled maps of labeled city streets. TASM uses tangible Augmented Reality, the superimposition of digital graphics on top of real-world objects, in order to enhance the user's experience. The tangible object (a cube) plays the role of an input device: the cube can be rotated to display map tiles that are adjacent to the current tile in geographic space. The cube is capable of theoretically infinite movement, embedded in a coordinate system with topology enabled. TASM has been tested for usability using heuristic evaluation, in which selected experts used the cube and identified where it failed to correspond with recognized usability principles. While general and vague, the heuristics helped prioritize immediate geographic and system-based tasks needed to improve the usability of TASM, also pointing the way towards a group of geographically oriented heuristics. This addresses a key geovisualization challenge — the creation of domain-specific and technology-related theory. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
25. Advanced Operations for Maps in Spatial Databases.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., McKenney, Mark, and Schneider, Markus
- Abstract
Maps are a fundamental spatial concept capable of representing and storing large amounts of information in a visual form. Map operations have been studied and rigorously defined in the literature; however, we identify a new class of map join operations which cannot be completed using existing operations. We then consider existing operations involving connectivity concepts, and extend this class of operations by defining new, more complex operations that take advantage of the connectivity properties of maps. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
26. Structuring Kinetic Maps.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Dakowicz, Maciej, and Gold, Chris
- Abstract
We attempt to show that a tessellated spatial model has definite advantages for cartographic applications, and facilitates a kinetic structure for map updating and simulation. We develop the moving-point Delaunay/Voronoi model that manages collision detection, snapping, and intersection at the data input stage by maintaining a topology based on a complete tessellation. We show that the Constrained Delaunay triangulation allows the simulation of edges, and not just points, with only minor changes to the moving-point model. We then develop an improved kinetic Line-segment Voronoi diagram, which is a better-specified model of the spatial relationships for compound map objects than is the Constrained Triangulation; until now, however, it has been more difficult to implement. We believe that this method is now viable for 2D cartography, and in many cases it should replace the Constrained approach. Whichever method is used, the concept of using the moving point as a pen, with the ability to delete and add line segments as desired in the construction and updating process, appears to be a valuable development. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
27. A Semantic-based Approach to the Representation of Network-Constrained Trajectory Data.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Xiang Li, Claramunt, Christophe, Ray, Cyril, and Hui Lin
- Abstract
Recent technological advances in urban traffic systems engender the availability of large trajectory data sets. However, the potential of these large urban databases is often neglected. This is due to a twofold problem. First, the volumes generated represent gigabytes of information per day, making data processing and analysis a computationally costly operation. Secondly, there is a lack of analysis of the semantics revealed by urban trajectories, at both the representation and data manipulation levels. The research presented in this paper addresses these two issues. We introduce an optimized representation approach that can efficiently reduce trajectory data volumes and facilitate data access and query languages. Our approach is a semantic-based representation model that characterizes significant trajectory points within a network. Key points are selected according to a combination of network, velocity, and direction criteria. This semantic approach facilitates trajectory data queries and the implicit modeling of trajectory processes. The proposed model is illustrated by a prototype implemented in a district of Hong Kong. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
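Entry 27 above selects significant trajectory points by combining network, velocity, and direction criteria. The simplified sketch below covers only the velocity and direction part; the thresholds, the record layout, and the helper names are invented for the example, and the network criterion and query model are not covered.

```python
import math

def key_points(track, speed_jump=2.0, turn_deg=30.0):
    """Keep trajectory points where speed or heading changes noticeably.

    track: list of (t, x, y) samples ordered by time.
    """
    def speed(p, q):
        dt = q[0] - p[0]
        return math.hypot(q[1] - p[1], q[2] - p[2]) / dt if dt > 0 else 0.0

    def heading(p, q):
        return math.degrees(math.atan2(q[2] - p[2], q[1] - p[1]))

    kept = [track[0]]
    for prev, cur, nxt in zip(track, track[1:], track[2:]):
        dv = abs(speed(cur, nxt) - speed(prev, cur))
        # smallest signed angular difference between the incoming and outgoing headings
        dh = abs((heading(cur, nxt) - heading(prev, cur) + 180.0) % 360.0 - 180.0)
        if dv > speed_jump or dh > turn_deg:
            kept.append(cur)
    kept.append(track[-1])
    return kept

if __name__ == "__main__":
    straight = [(t, 10.0 * t, 0.0) for t in range(10)]
    corner = [(10 + t, 100.0, 10.0 * t) for t in range(1, 10)]   # sharp turn north
    track = straight + corner
    print(len(key_points(track)), "of", len(track), "points kept")
```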
28. A Quantitative Similarity Measure for Maps.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Frank, Richard, and Ester, Martin
- Abstract
In on-demand map generation, a base map is modified to meet user requirements on scale, resolution, and other parameters. Since there are many ways of satisfying the requirements, we need a method of measuring the quality of the alternative maps. In this paper, we introduce a uniform framework for measuring the quality of generalized maps. The proposed Map Quality measure takes into account changes in all local objects (Shape Similarity), in their neighborhoods (Location Similarity) and, lastly, across the entire map (Semantic Content Similarity). These three quality aspects measure the major generalization operators of simplification, relocation and selection, exaggeration and aggregation, collapse and typification. The three different aspects are combined using user-specified weights. Thus, the proposed framework supports the automatic choice of the best alternative map according to the preferences of the user or application. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
29. Semantic Similarity Measures within the Semantic Framework of the Universal Ontology of Geographical Space.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Čeh, Marjan, Podobnikar, Tomaž, and Smole, Domen
- Abstract
The objective of this paper is to discuss our methodology for comparing, searching and integrating geographic concepts. Searching for spatially oriented datasets can be illustrated by the complexity of the communication between the producer and the user. The common vocabulary consists of a set of concepts describing geographic space, called the universal ontology of geographical space (UOGS). We have defined the semantic parameters for measuring semantic similarities within the UOGS semantic framework and described our applicative approach to the similarity analyses of spatial databases. In order to test our results we have implemented the entire vocabulary as a set of Prolog facts. Following this we also implemented functionality such as the querying mechanism and a simple semantic similarity model, again as a set of Prolog clauses. In addition, we applied Prolog rules to extract semantic information describing geographic concepts from natural language texts. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
30. Characterizing Land Cover Structure with Semantic Variograms.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Ahlqvist, Ola, and Shortridge, Ashton
- Abstract
This paper introduces the semantic variogram, which is a measure of spatial variation based upon semantic similarity metrics calculated for nominal land cover class definitions. Traditional approaches for measuring spatial autocorrelation for nominal geographical data compare classes between pairs of observations to determine a simple binary measure of similarity (identical/different). These binary values are summarized over many sample pairs separated by various distances to characterize some spatial metric of correlation, or variation. The use of binary similarity measures ignores potentially substantial ranges in similarity between different classes. Through the development of category representations capable of producing quantifiable measures of pairwise class similarity, descriptive spatial statistics that operate upon ratio data may be employed. These measures, including the semantic variogram proposed in this work, may characterize the spatial variability of categorical maps more sensitively than traditional measures. We apply the semantic variogram to National Land Cover Data (NLCD) for three different study sites, and compare the results to those from a multiple-class indicator semivariogram. We demonstrate that substantial differences exist in observed short-range variability for the two metrics in all sites. The semantic variograms detect much lower short-range variability due to the tendency of semantically similar classes to be closer together. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
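Entry 30 above replaces the binary same/different comparison of an indicator variogram with graded semantic dissimilarity between class pairs. The sketch below computes an empirical semantic semivariogram on a categorical grid from a user-supplied class-similarity matrix, using horizontal lags only for brevity; the similarity values and classes are invented, not the NLCD classes or the paper's metrics.

```python
import numpy as np

def semantic_semivariogram(classes, similarity, max_lag=10):
    """gamma(h) = 0.5 * mean of d(c_i, c_j)^2 over cell pairs at lag h,
    with d = 1 - similarity.

    classes    : 2D int array of class codes
    similarity : (k, k) matrix, similarity[a, b] in [0, 1]
    """
    dissim = 1.0 - similarity
    gamma = []
    for h in range(1, max_lag + 1):
        left, right = classes[:, :-h], classes[:, h:]
        d = dissim[left, right]                    # pairwise dissimilarities at lag h
        gamma.append(0.5 * float(np.mean(d ** 2)))
    return np.array(gamma)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    # three classes; classes 0 and 1 are semantically close, class 2 is distinct
    sim = np.array([[1.0, 0.8, 0.1],
                    [0.8, 1.0, 0.2],
                    [0.1, 0.2, 1.0]])
    landcover = rng.integers(0, 3, size=(100, 100))
    print(np.round(semantic_semivariogram(landcover, sim, max_lag=5), 4))
```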
31. Coastline Matching Process Based on the Discrete Fréchet Distance.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Mascret, Ariane, Devogele, Thomas, Berre, Iwan, and Hénaff, Alain
- Abstract
Spatial distances are the main tools used for data matching and quality control. This paper describes new measures adapted to sinuous lines for computing the maximal and average discrepancy: the discrete Fréchet distance and the discrete average Fréchet distance. A global process is then defined to automatically match two sets of lines. The usefulness of these distances is tested with a comparison of coastlines. The validation is done with the computation of three sets of coastlines, obtained from SPOT 5 orthophotographs and GPS points. Finally, an extension to Digital Elevation Models is presented. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
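The discrete Fréchet distance named in entry 31 has a standard dynamic-programming formulation (Eiter and Mannila). The sketch below is that computation between two polylines; the coastline matching process and the discrete average Fréchet variant described in the chapter are not included.

```python
import numpy as np

def discrete_frechet(p, q):
    """Discrete Fréchet distance between polylines p ((N, 2)) and q ((M, 2))."""
    n, m = len(p), len(q)
    d = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=2)   # pointwise distances
    ca = np.full((n, m), -1.0)
    ca[0, 0] = d[0, 0]
    for i in range(1, n):
        ca[i, 0] = max(ca[i - 1, 0], d[i, 0])
    for j in range(1, m):
        ca[0, j] = max(ca[0, j - 1], d[0, j])
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]), d[i, j])
    return float(ca[-1, -1])

if __name__ == "__main__":
    coast_a = np.array([[0, 0], [1, 0.2], [2, 0.1], [3, 0.4]], float)
    coast_b = np.array([[0, 0.1], [1.1, 0.3], [2.1, 0.0], [3, 0.5]], float)
    print(round(discrete_frechet(coast_a, coast_b), 3))
```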
32. A Hierarchical Approach to the Line-Line Topological Relations.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Zhilin Li, and Min Deng
- Abstract
Topological relations have been recognized to be very useful for spatial query, analysis and reasoning. This paper concentrates on the topological relations between two lines in ℝ². The line of thought employed in this study is that the topological relation between two lines can be described by a combination of a finite number of basic (or elementary) relations. Based on this idea, a hierarchical approach is proposed for the description and determination of basic relations between two lines. Seventeen (17) basic relations are identified, and eleven (11) of them form the basis for the combinational description of a complex relation, which can be determined by a compound relation model. A practical example of bus routes is provided to illustrate the proposed approach, as an application of line-line topological relations in traffic planning. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
33. Integrating 2D Topographic Vector Data with a Digital Terrain Model — a Consistent and Semantically Correct Approach.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Koch, Andreas, and Heipke, Christian
- Abstract
The most commonly used topographic vector data are currently two-dimensional. The topography is modeled by different objects; in contrast, a digital terrain model (DTM) is a continuous representation of the Earth's surface. The integration of the two data sets leads to an augmentation of the dimension of the topographic objects, which is useful in many applications. However, the integration process may lead to inconsistent and semantically incorrect results. In this paper we describe recent work on the consistent and semantically correct integration of 2D GIS vector data and a DTM. In contrast to our prior work in this area, the presented algorithm takes into account geometric inaccuracies of both planimetric and height data, and thus achieves more realistic results. Height information, implicitly contained in our understanding of certain topographic objects, is explicitly formulated and introduced into an optimization procedure together with the height data from the DTM. Results using real data demonstrate the applicability of the approach. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
34. Changes in Topological Relations when Splitting and Merging Regions.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Egenhofer, Max J., and Wilmsen, Dominik
- Abstract
This paper addresses changes in topological relations as they occur when splitting a region into two. It derives systematically what qualitative inferences can be made about binary topological relations when one region is cut into two pieces. The new insights about the possible topological relations obtained after splitting regions form a foundation for high-level spatio-temporal reasoning without explicit geometric information about each object's shape, as well as for transactions in spatio-temporal databases that want to enforce consistency constraints. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
35. Implementation of a Prototype Toolbox for Communicating Spatial Data Quality and Uncertainty Using a Wildfire Risk Example.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Reinke, K. J., Jones, S., and Hunter, G. J.
- Abstract
Current GIS are often described as rich in functionality but poor in knowledge content and transfer. This paper presents a prototype for communicating data quality in spatial databases using a hybrid design between data-driven and user-driven factors based upon traditional communication and cartographic concepts. The prototype aims to give data users a better understanding of the uncertainty that affects their information by utilizing a knowledge-based method where they can choose from multiple visualizations to represent the uncertainty in their data, as well as access information about why a particular visualization has been proposed. In doing so, decisions become more transparent to data users, which increases the capability of the prototype to act as a training aid. The example case study examines the data quality in a source dataset and illustrates how the concepts apply in an operational environment at different levels of communication. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
36. Efficient Evaluation Techniques for Topological Predicates on Complex Regions.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Praing, Reasey, and Schneider, Markus
- Abstract
Topological predicates between spatial objects have for a long time been a focus of intensive research in a number of diverse disciplines. In the context of spatial databases and geographical information systems, they support the construction of suitable query languages for spatial data retrieval and analysis. Whereas to a large extent conceptual aspects of topological predicates have been emphasized, the development of efficient evaluation techniques for them has been rather neglected. Recently, the design of topological predicates for different combinations of complex spatial data types has led to a large increase of their numbers and accentuated the need for their efficient implementation. The goal of this paper is to develop efficient implementation techniques for them within the framework of the spatial algebra SPAL2D. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
37. An Evaluation Method for Determining Map-Quality.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Jobst, Markus, and Twaroch, Florian A.
- Abstract
The quality of maps, geo-visualization, and the usage of multimedia presentation techniques for spatial communication is an important issue for map creation, distribution, and the acceptance of these information systems (IS) by the public. The purpose of this paper is to present an evaluation method based on stochastic reasoning for supporting map designers. We investigate the applicability of Bayesian belief networks and present a prototypical implementation. We also give an outlook on future research questions. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
38. Using Metadata to Link Uncertainty and Data Quality Assessments.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Comber, A. J., Fisher, P. F., Harvey, F., Gahegan, M., and Wadsworth, R.
- Published
- 2006
- Full Text
- View/download PDF
39. Filling the Gaps in Keyword-Based Query Expansion for Geodata Retrieval.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., and Hochmair, Hartwig H.
- Abstract
Query expansion describes the automated process of supplementing a user's search with additional terms or geographic locations to make it more appropriate for the user's needs. Such a process relies on the system's knowledge about the relations between geographic terms and places. Geodata repositories host spatial data, which can be queried over their metadata, such as keywords. One way to organize the system's knowledge structure for keyword-based query expansion is to use a similarity network. In a complete similarity network the total number of similarity values between keyterms increases with the square of the number of included keywords, so the task of determining all these values quickly becomes time-consuming. One efficient method is to start with a sparse similarity network and automatically estimate the missing similarity values from the known values with an algorithm. This paper therefore introduces and evaluates four such algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
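Entry 39 above estimates missing keyword-to-keyword similarities in a sparse similarity network. The four algorithms the paper evaluates are not described in the abstract, so the sketch below shows one plausible path-based estimate (the best product of known similarities over any connecting path), purely to illustrate the kind of inference involved; the keyword network is invented.

```python
import heapq

def estimate_similarity(known, a, b):
    """Estimate sim(a, b) as the maximum, over connecting paths, of the product
    of known edge similarities (a Dijkstra-like search, valid because all
    similarities lie in (0, 1]).

    known : dict mapping frozenset({k1, k2}) -> similarity in (0, 1]
    """
    graph = {}
    for pair, s in known.items():
        u, v = tuple(pair)
        graph.setdefault(u, []).append((v, s))
        graph.setdefault(v, []).append((u, s))

    best = {a: 1.0}
    heap = [(-1.0, a)]                       # max-heap on the path product
    while heap:
        neg_p, node = heapq.heappop(heap)
        if node == b:
            return -neg_p
        for nxt, s in graph.get(node, []):
            p = -neg_p * s
            if p > best.get(nxt, 0.0):
                best[nxt] = p
                heapq.heappush(heap, (-p, nxt))
    return 0.0                               # no connecting path

if __name__ == "__main__":
    net = {frozenset({"river", "stream"}): 0.9,
           frozenset({"stream", "creek"}): 0.8,
           frozenset({"lake", "reservoir"}): 0.85}
    print(estimate_similarity(net, "river", "creek"))   # about 0.72, via 'stream'
    print(estimate_similarity(net, "river", "lake"))    # 0.0, no path
```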
40. Reduced Data Model for Storing and Retrieving Geographic Data.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., and Frank, Andrew U.
- Abstract
The 'industry-strength' data models are complex to use and tend to obscure the fundamental issues. Going back to the original proposal of Chen for Entities and Relationships, I describe here a reduced data model with Objects and Relations. It is mathematically well founded in the category of relations and has been implemented to demonstrate that it is viable. An example of how this model is used to structure and load data is shown. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
41. Spatiotemporal Event Detection and Analysis over Multiple Granularities.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Croitoru, Arie, Eickhorst, Kristin, Stefandis, Anthony, and Agouris, Peggy
- Abstract
Granularity in time and space has a fundamental role in our perception and understanding of various phenomena. Currently applied analysis methods are based on a single level of granularity that is user-driven, leaving the user with the difficult task of determining the level of spatiotemporal abstraction at which processing will take place. Without a priori knowledge about the nature of the phenomenon at hand this is often a difficult task that may have a substantial impact on the processing results. In light of this, this paper introduces a spatiotemporal data analysis and knowledge discovery framework, which is based on two primary components: the spatiotemporal helix and scale-space analysis. While the spatiotemporal helix offers the ability to model and summarize spatiotemporal data, the scale space analysis offers the ability to simultaneously process the data at multiple scales, thus allowing processing without a priori knowledge. In particular, this paper discusses how scale space representation and the derived deep structure can be used for the detection of events (and processes) in spatiotemporal data, and demonstrates the robustness of our framework in the presence of noise. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
42. Preference Based Retrieval of Information Elements.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., and Achatschitz, Claudia
- Abstract
Spatial information systems like GIS assist a user in making a spatial decision by presenting information that supports the decision process. Planning a holiday via the Internet can be a daunting process: decision-making is based on finding the relevant data sets in a short time among an overwhelming number of data sources. In many cases the user has to figure out on his own how the presented information fits the decision he has to make. To address this problem, a tourist would need tools to retrieve spatial information according to his current preferences. The relevant data is determined as data elements describing facts corresponding to the tourist's preferences. The present work suggests how a user of a tourist information system can indicate his preferences through the user interface. Based on the suggested alternatives, the user can change his preferences and generate a new set of alternatives; this feedback loop provides a tool that allows the user to explore the available alternatives. The work was motivated by classical dialog-based booking processes with human operators, carried out over the telephone. We introduce the conceptual model of a user interface that considers the agent's preferences, and we describe the overall interaction process. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
43. Expert Knowledge and Embedded Knowledge: Or Why Long Rambling Class Descriptions are Useful.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Wadsworth, R. A., Comber, A. J., and Fisher, P. F.
- Abstract
In many natural resource inventories, class descriptions have atrophied to little more than simple labels or ciphers; the data producer expects the data user to share a common understanding of the way the world works and how it should be characterized (that is, the producer implicitly assumes that their epistemology, ontology and semantics are universal). Activities like the UK e-science programme and the EU INSPIRE initiative mean that it is increasingly difficult for the producer to anticipate who the users of the data are going to be. It is increasingly less likely that producer and user share a common understanding, and the interaction between them necessary to clarify any inconsistencies has been reduced. There are still some cases where the data producer provides more than a class label, making it possible for a user unfamiliar with the semantics and ontology of the producer to process the text and assess the relationships between classes and between classifications. In this paper we apply computer characterization to the textual descriptions of two land cover maps, LCMGB (the land cover map of Great Britain produced in 1990) and LCM2000 (the land cover map 2000). Statistical analysis of the text is used to parameterize a look-up table and to evaluate the consistency of the two classification schemes. The results show that automatic processing of the text generates similar relations between classes as those produced by human experts. They also show that the automatically generated relationships were as useful as the expert-derived relationships in identifying change. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
44. Measuring Linear Complexity with Wavelets.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., and Lawford, Geoff J.
- Abstract
This paper explores wavelets as a measure of linear complexity. An introduction to wavelets is given before applying them to spatial data and testing them as a complexity measure on a vector representation of the Australian coastline. Wavelets are shown to be successful at measuring linear complexity. The technique used breaks a single line into different classes of complexity, and has the advantages that it is objective, automated and fast. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
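Entry 44 above measures linear complexity with wavelets. A hedged sketch of the general idea using the PyWavelets package: treat the polyline's turning angles as a 1-D signal, decompose it with a discrete wavelet transform, and score each segment by the energy of its detail coefficients. The wavelet choice, the signal definition, and the segmenting are assumptions; the abstract does not specify how the Australian coastline was actually processed.

```python
import numpy as np
import pywt  # PyWavelets

def turning_angles(line):
    """Turning-angle signal along an (N, 2) polyline."""
    v = np.diff(line, axis=0)
    ang = np.unwrap(np.arctan2(v[:, 1], v[:, 0]))
    return np.diff(ang)

def complexity_by_segment(line, seg_len=64, wavelet="db4", level=3):
    """Detail-coefficient energy of the turning-angle signal, per segment."""
    signal = turning_angles(line)
    scores = []
    for start in range(0, len(signal) - seg_len + 1, seg_len):
        chunk = signal[start:start + seg_len]
        coeffs = pywt.wavedec(chunk, wavelet, level=level)
        detail_energy = sum(float((c ** 2).sum()) for c in coeffs[1:])
        scores.append(detail_energy)
    return np.array(scores)

if __name__ == "__main__":
    t = np.linspace(0, 20 * np.pi, 2000)
    smooth = np.c_[t, np.zeros_like(t)]            # straight line: low complexity
    wiggly = np.c_[t, 0.5 * np.sin(8 * t)]         # convoluted line: high complexity
    print(complexity_by_segment(smooth).mean(), complexity_by_segment(wiggly).mean())
```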
45. Continuous Wavelet Transformations for Hyperspectral Feature Detection.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Ferwerda, Jelle G., and Jones, Simon D.
- Abstract
A novel method for the analysis of spectra and the detection of absorption features in hyperspectral signatures is proposed, based on the ability of wavelet transformations to enhance absorption features. Field spectra of wheat grown on different levels of available nitrogen were collected and compared to the foliar nitrogen content. The spectra were assessed as absolute reflectances, as derivative spectra, and as their respective wavelet-transformed signals. Wavelet-transformed signals, transformed using the Daubechies 5 mother wavelet at scaling level 32, performed consistently better than reflectance or derivative spectra when tested in a bootstrapped phased regression against nitrogen. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
46. Exploring Geographical Data with Spatio-Visual Data Mining.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Demšar, Urška, Krisp, Jukka M., and Křemenová, Olga
- Abstract
Efficiently exploring a large spatial dataset with the aim of forming a hypothesis is one of the main challenges for information science. This study presents a method for exploring spatial data with a combination of spatial and visual data mining. Spatial relationships are modeled during a data pre-processing step, consisting of the density analysis and vertical view approach, after which an exploration with visual data mining follows. The method has been tried on emergency response data about fire and rescue incidents in Helsinki. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
47. Modeling Geometric Rules in Object Based Models: An XML / GML Approach.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Reeves, Trevor, Cornford, Dan, Konecny, Michal, and Ellis, Jeremy
- Abstract
Most object-based approaches to Geographical Information Systems (GIS) have concentrated on the representation of geometric properties of objects in terms of fixed geometry. In our road traffic marking application domain we have a requirement to represent the static locations of the road markings but also to enforce the associated regulations, which are typically geometric in nature. For example, a give-way line of a pedestrian crossing in the UK must be within 1100-3000 mm of the edge of the crossing pattern. In previous studies of the application of spatial rules (often called 'business logic') in GIS, emphasis has been placed on the representation of topological constraints and data integrity checks. There is very little GIS literature that describes models for geometric rules, although there are some examples in the Computer Aided Design (CAD) literature. This paper introduces some of the ideas from so-called variational CAD models to the GIS application domain, and extends these using a Geography Markup Language (GML) based representation. In our application we have an additional requirement: the geometric rules are often changed and vary from country to country, so they should be represented in a flexible manner. In this paper we describe an elegant solution to the representation of geometric rules, such as requiring lines to be offset from other objects. The method uses the feature-property model embraced in GML 3.1 and extends the possible relationships in feature collections to permit the application of parameterized geometric constraints to sub-features. We show the parametric rule model we have developed and discuss the advantage of using simple parametric expressions in the rule base. We discuss the possibilities and limitations of our approach and relate our data model to GML 3.1. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
48. A Voronoi-Based Map Algebra.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Ledoux, Hugo, and Gold, Christopher
- Abstract
Although the map algebra framework is very popular within the GIS community for modelling fields, the fact that it is solely based on raster structures has been severely criticised. Instead of representing fields with a regular tessellation, we propose in this paper using the Voronoi diagram (VD), and argue that it has many advantages over other tessellations. We also present a variant of map algebra where all the operations are performed directly on VDs. Our solution is valid in two and three dimensions, and permits us to circumvent the gridding and resampling processes that must be performed with map algebra. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
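Entry 48 above performs map algebra directly on Voronoi diagrams of scattered samples instead of on rasters. One simple way to realize a local operation in that spirit is sketched below: evaluate field B at the sites of field A through nearest-site (Voronoi-cell) lookup and combine the values, with a k-d tree standing in for an explicit Voronoi structure; the authors' own VD data structure and its 3D support are not reproduced.

```python
import numpy as np
from scipy.spatial import cKDTree

def voronoi_local_op(sites_a, values_a, sites_b, values_b, op=np.add):
    """Local map-algebra operation on two Voronoi-represented fields.

    The result is attached to the sites of field A: for each site of A, the
    value of field B is taken from B's nearest site, i.e. from the Voronoi
    cell of B that contains that location.
    """
    tree_b = cKDTree(sites_b)
    _, nearest = tree_b.query(sites_a)            # Voronoi-cell membership in B
    return op(values_a, values_b[nearest])

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    sites_a = rng.uniform(0, 100, size=(300, 2))   # e.g. elevation samples
    elev = 200 + sites_a[:, 0]
    sites_b = rng.uniform(0, 100, size=(120, 2))   # e.g. rainfall samples
    rain = 10 + 0.1 * sites_b[:, 1]
    combined = voronoi_local_op(sites_a, elev, sites_b, rain)
    print(combined.shape, round(float(combined.mean()), 2))
```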
49. QACHE: Query Caching in Location-Based Services.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Ding, Hui, Yalamanchi, Aravind, Kothuri, Ravi, Ravada, Siva, and Scheuermann, Peter
- Abstract
Many emerging applications of location-based services continuously monitor a set of moving objects and answer queries pertaining to their locations. Query processing in such services is critical to ensure high performance of the system. Observing that one predominant cost in query processing is the frequent access to the database, in this paper we describe how to reduce the number of round-trips between moving objects and the database server by caching query information on the application server tier. We propose a novel caching framework, named QACHE, which stores and organizes spatially relevant queries for selected moving objects. QACHE leverages the spatial indices and other algorithms in the database server for organizing and refreshing relevant cache entries within a configurable area of interest, referred to as the cache-footprint, around a moving object. QACHE contains appropriate refresh policies and prefetching algorithms for efficient cache-based evaluation of queries on moving objects. In experiments comparing QACHE to other proposed mechanisms, QACHE achieves a significant reduction (from 63% to 99%) in database round-trips, thereby improving the throughput of an LBS system. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
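Entry 49 above caches spatially relevant queries per moving object inside a configurable cache-footprint. The toy sketch below keeps a cached result valid while the object stays inside its footprint rectangle and refreshes it on exit; the class, callback, and counts are invented illustrations, and QACHE's actual refresh policies, prefetching, and database-side indexing are not modeled.

```python
class FootprintCache:
    """Per-object cache of query results, valid inside a square footprint."""

    def __init__(self, footprint_size, fetch_from_db):
        self.half = footprint_size / 2.0
        self.fetch = fetch_from_db        # callback standing in for the DB tier
        self.entries = {}                 # object id -> (footprint center, result)
        self.db_round_trips = 0

    def query(self, obj_id, x, y):
        entry = self.entries.get(obj_id)
        if entry:
            (cx, cy), result = entry
            if abs(x - cx) <= self.half and abs(y - cy) <= self.half:
                return result             # cache hit: no server round-trip
        result = self.fetch(obj_id, x, y)  # miss: refresh footprint around (x, y)
        self.db_round_trips += 1
        self.entries[obj_id] = ((x, y), result)
        return result

if __name__ == "__main__":
    def fake_db(obj_id, x, y):            # placeholder for the real spatial query
        return f"restaurants near ({round(x)}, {round(y)})"

    cache = FootprintCache(footprint_size=200.0, fetch_from_db=fake_db)
    for step in range(50):                # the object drifts slowly eastwards
        cache.query("taxi-7", 10.0 * step, 0.0)
    print("round trips:", cache.db_round_trips, "of 50 queries")
```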
50. Database Model and Algebra for Complex and Heterogeneous Spatial Entities.
- Author
-
Riedl, Andreas, Kainz, Wolfgang, Elmes, Gregory A., Bordogna, Gloria, Pagani, Marco, and Psaila, Giuseppe
- Abstract
Current Geographic Information Systems (GISs) adopt spatial database models that do not allow easy interaction with users engaged in spatial analysis operations. In fact, users must be well aware of the representation of the spatial entities, and specifically of the way in which the spatial reference is structured, in order to query the database. The main reason for this inadequacy is that current spatial database models violate the independence principle of spatial data. The consequence is that potentially simple queries are difficult to specify and depend strongly on the actual data in the spatial database. In this contribution we tackle the problem of defining a database model to manage, in a unified way, spatial entities (classes of spatial elements with common properties) with different levels of complexity. Complex spatial entities are defined by aggregation of primitive spatial entities; instances of spatial entities are called spatial grains. The database model is provided with an algebra to perform spatial queries over complex spatial entities; the algebra is defined in such a way that it guarantees the independence principle and meets the closure property. By means of the operators provided by the algebra, it is possible to easily perform spatial queries working at the logical level only. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF