Abstract

Stereo photogrammetry has been used in this study to analyse and detect movements within the lecture theatre of the School of Environmental Technology, Federal University of Technology, Minna, using the Kalman filter algorithm. The essential steps for implementing this method are highlighted, and the results obtained indicate instantaneous movements (velocities) ranging from ±0.0000001 m/epoch to ±0.000007 m/epoch, with greater movements noticed in the horizontal direction than in the vertical direction of the building. Because the observed movements were insignificant, the building has been classified as stable. However, a longer period of observation with a bi-monthly observational interval is recommended to enable a decision on the rate of rise/sink and deformation of the building.
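
The abstract does not give implementation details of the filtering step; purely as an illustration, the following minimal sketch shows a constant-velocity Kalman filter tracking one monitored coordinate across observation epochs (the epoch interval, noise levels and coordinate values are assumptions, not values from the study):

    # Minimal sketch of a constant-velocity Kalman filter for one monitored
    # coordinate of a building point; epoch interval, noise levels and the
    # coordinates below are illustrative assumptions, not values from the study.
    import numpy as np

    dt = 1.0                                   # epoch interval (arbitrary unit)
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (position, velocity)
    H = np.array([[1.0, 0.0]])                 # only the position is observed
    Q = 1e-12 * np.eye(2)                      # process noise (assumed)
    R = np.array([[1e-8]])                     # measurement noise (assumed)

    x = np.array([[0.0], [0.0]])               # initial state: position, velocity
    P = np.eye(2)                              # initial state covariance

    def kalman_step(x, P, z):
        """One predict/update cycle for a new epoch coordinate z (metres)."""
        # Predict
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update
        y = np.array([[z]]) - H @ x_pred       # innovation
        S = H @ P_pred @ H.T + R               # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
        x_new = x_pred + K @ y
        P_new = (np.eye(2) - K @ H) @ P_pred
        return x_new, P_new

    # Hypothetical per-epoch coordinates of one target (metres)
    for z in [10.0000, 10.0000, 10.0000, 9.9999, 10.0001]:
        x, P = kalman_step(x, P, z)

    print("estimated velocity (m/epoch):", x[1, 0])

The estimated velocity component of the state vector is what would be compared against a significance threshold when classifying the structure as stable or moving.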

Abstract

The article presents an original optical system for measuring displacements across joints or cracks in building structures. It describes the concept of the system's operation, the algorithms to be followed, and the results of the practical tests that have been performed.

The proposed solution is based on digital photos taken with a non-metric digital camera which, after calibration (determination of its interior orientation elements and correction of lens distortion), serves as the measurement instrument registering images of the markers. QR (Quick Response) codes are proposed as the markers. After digital processing, the set of registered images allows the measured displacements to be visualised.

Owing to this solution, it is possible to obtain the mutual position of two or more QR codes in the form of three translation elements in 3D space and the three corresponding orientation angles. The determined elements are unambiguous in their spatial interpretation and are not limited by dimension.

As the tests performed by the authors show, the results are more than satisfactory. The proposed measurement technology is an objective system of data acquisition, suitable for automating the entire process of monitoring displacements of building structure elements across joints and cracks.
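
The authors' algorithms are not reproduced in the abstract; the sketch below only illustrates the general idea of recovering the relative pose of two QR-code markers from a single calibrated photo, here using OpenCV's QR detector and solvePnP (the camera matrix, distortion handling, marker size and file name are placeholder assumptions):

    # Illustrative sketch: relative pose of two planar QR markers from one
    # calibrated photo. Camera parameters, marker size and file name are
    # placeholder assumptions, not the authors' values.
    import cv2
    import numpy as np

    K = np.array([[3000.0, 0.0, 2000.0],        # assumed camera matrix (pixels)
                  [0.0, 3000.0, 1500.0],
                  [0.0, 0.0, 1.0]])
    dist = np.zeros(5)                          # assumed: distortion already removed
    side = 0.05                                 # assumed marker side length (metres)

    # Marker corners in the marker's own coordinate system (planar, Z = 0)
    obj = np.array([[0, 0, 0], [side, 0, 0],
                    [side, side, 0], [0, side, 0]], dtype=np.float64)

    def marker_pose(corners_px):
        """Pose of one marker in the camera frame from its 4 image corners."""
        ok, rvec, tvec = cv2.solvePnP(obj, corners_px, K, dist)
        R, _ = cv2.Rodrigues(rvec)
        return R, tvec

    img = cv2.imread("joint_photo.jpg")         # hypothetical input image
    detector = cv2.QRCodeDetector()
    ok, decoded, points, _ = detector.detectAndDecodeMulti(img)
    if not ok or len(points) < 2:
        raise SystemExit("need at least two QR markers in the image")

    # points has shape (n_codes, 4, 2); assume the first two codes are the
    # markers placed on either side of the joint/crack
    R1, t1 = marker_pose(points[0].astype(np.float64))
    R2, t2 = marker_pose(points[1].astype(np.float64))

    # Pose of marker 2 expressed in the frame of marker 1:
    # three translations and a rotation matrix (convertible to three angles)
    R_rel = R1.T @ R2
    t_rel = R1.T @ (t2 - t1)
    print("translation 1->2 (m):", t_rel.ravel())

Comparing the relative pose obtained from photos taken at different epochs would then yield the displacement and rotation across the joint.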

Abstract

Exchange of and access to spatial data is the principal goal of any Spatial Data Infrastructure (SDI); therefore, one of the key concepts of SDI is interoperability, especially semantic and syntactic interoperability. Application schemas and quality issues are among the aspects that have to be considered to ensure successful data interchange in an SDI.

Two types of application schema, UML and GML, are widely used in the European SDI as well as in the Polish SDI. They cover both semantic and syntactic interoperability and are an integral part of spatial data specifications and relevant regulations in the form of data models. However, working out accurate and correct application schemas may be a challenge.

Additionally, faulty or overly complex application schemas can hinder valid data interchange and, consequently, prevent achieving interoperability within an SDI. Therefore, the capability to examine and assess the quality of UML and GML application schemas is a worthwhile and important issue in the context of semantic and syntactic interoperability in SDI.

The main aim of this article is to set out the context of the performed studies, including, among other things, the role of application schemas in interoperable data exchange and issues related to the concept of quality and its measures.
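
One narrow, concrete facet of the syntactic side can be illustrated by validating a GML dataset against the XSD implementation of its application schema; the sketch below uses lxml for this purpose (file names are placeholders, and the quality assessment discussed in the article goes well beyond such instance validation):

    # Minimal sketch: syntactic check of a GML dataset against the XSD
    # implementation of its application schema using lxml. File names are
    # placeholders; this covers only one narrow aspect of schema quality.
    from lxml import etree

    schema = etree.XMLSchema(etree.parse("application_schema.xsd"))
    data = etree.parse("dataset.gml")

    if schema.validate(data):
        print("GML instance conforms to the application schema")
    else:
        for error in schema.error_log:
            print(f"line {error.line}: {error.message}")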

Abstract

A geoid or quasigeoid model allows the integration of satellite measurements with ground levelling measurements in valid height systems. A precise quasigeoid model has been developed for the city of Kraków. One of the goals of the model construction was to provide a more detailed quasigeoid course than the one offered by the national model PL-geoid2011. Only four measurement points in the area of Kraków were used to build the national quasigeoid model. It can be assumed that, due to the small number of points and their uneven distribution over the city area, the quasigeoid there is determined less accurately. This was the reason for developing a local quasigeoid model based on a larger number of evenly distributed points. The quasigeoid model was based on 66 evenly distributed points (from 2.5 km to 5.0 km apart) in the study area. The modelling of the quasigeoid used height anomalies determined at these points on the basis of normal heights derived through levelling and ellipsoidal heights derived through GNSS surveys. Height anomalies from the global geopotential model EGM2008 served as the long-wavelength trend in those derived from the surveys. Analyses showed that the developed height anomaly model fits the empirical data at the level of single millimetres, with a mean absolute difference of 0.005 m. The developed local model QuasigeoidKR2019, like the national model PL-geoid2011, is closely related to the reference and height systems in Poland. Such models are used to integrate GNSS and levelling observations. A comparison of the local QuasigeoidKR2019 and the national PL-geoid2011 model was made for the reference frame PL-ETRF2000 and the height datum PL-KRON86-NH. The comparison of the two models with respect to GNSS/levelling height anomalies shows a threefold reduction in the values of the individual quartiles and in the mean absolute difference for the developed local model. These summary statistics clearly indicate that the accuracy of the local model for the city of Kraków is significantly higher than that of the national one.
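
In outline, the modelling rests on height anomalies obtained from GNSS ellipsoidal heights and levelled normal heights, with EGM2008 removed as the long-wavelength part and restored after interpolation of the residuals; a compact statement of this standard remove-restore scheme (notation ours, not taken from the article) is:

    \zeta_i^{\mathrm{GNSS/lev}} = h_i - H_i^{N}
        % h_i: ellipsoidal height from GNSS, H_i^{N}: normal height from levelling
    \delta\zeta_i = \zeta_i^{\mathrm{GNSS/lev}} - \zeta_i^{\mathrm{EGM2008}}
        % "remove" step: residual anomaly at each of the 66 points
    \zeta_P^{\mathrm{model}} = \zeta_P^{\mathrm{EGM2008}} + \widehat{\delta\zeta}(P)
        % "restore" step: \widehat{\delta\zeta}(P) interpolated from the residuals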

Abstract

The subject of the considerations presented in the article is the question of the criteria according to which comparables should be selected in the market value estimation process. The factor of similarity to the subject property (measured by Euclidean distance) was considered the most important in the selection of comparative properties. The similarity assessment criterion and the influence of the adopted critical value of this criterion on the accuracy of the estimates were chosen as the key issues. The analysis was carried out taking into account the significance of the independent variables (measured by their correlation with the vector of the dependent variable). The results of simulation tests carried out in the variants defined by the adopted criteria are presented. On this basis, it has been shown that the collection of obtainable estimation results contains a potentially most relevant solution. This solution corresponds to the smallest sum of differences between the model prices (accepted as known) and the corresponding estimates. The minimum found occurs only for a specific combination of the above criterion values.
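
Purely to fix ideas, the similarity criterion described above can be sketched as a correlation-weighted Euclidean distance between attribute vectors, with candidates retained only below a critical value (all attribute values, weights and the critical value below are hypothetical):

    # Illustrative sketch of the similarity criterion: Euclidean distance between
    # attribute vectors, weighted by each attribute's correlation with price.
    # Attribute values, weights and the critical distance are hypothetical.
    import numpy as np

    subject = np.array([120.0, 3.0, 0.8])            # e.g. area, rooms, location score
    candidates = np.array([[115.0, 3.0, 0.7],
                           [150.0, 4.0, 0.9],
                           [122.0, 3.0, 0.8]])
    weights = np.array([0.7, 0.4, 0.6])              # |correlation with price| (assumed)
    critical = 10.0                                  # critical similarity value (assumed)

    # Correlation-weighted Euclidean distance of each candidate to the subject
    dist = np.sqrt(((weights * (candidates - subject)) ** 2).sum(axis=1))
    comparables = candidates[dist <= critical]
    print(dist, comparables, sep="\n")

Varying the critical value then changes which candidates enter the estimation, which is exactly the effect on accuracy examined in the simulation tests.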

Abstract

The author’s aim is to reflect on the cartographic modelling of historical borders based on the example of the series “Historical Atlas of Poland. Detailed maps of the 16th century” (HAP). HAP presents secular borders (state, palatinate, district) and religious borders (diocese, archdeaconry, deanery, parish). The assignment of historical settlements to administrative units is determined on the basis of written sources. During work on the current volumes of HAP, the borders were reconstructed through manual interpolation (the so-called linear model). Digital tools enable the automatic generation of administrative units based on settlements in point geometry (Thiessen polygons) or the use of modern divisions (precincts [obręby ewidencyjne]) as a reference for them (the semi-automatic method). The article compares and assesses the three methods of determining historical borders mentioned above and the possibilities of harmonising them in relation to contemporary administrative divisions. The source material consisted of 18,357 settlements from the volumes of HAP published so far, with 235 parishes used for detailed analyses. Precincts were adopted as reference areas owing to the possibilities of data harmonisation.
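
The automatic variant mentioned above (Thiessen polygons generated from settlements in point geometry) can be sketched as follows; the coordinates and parish labels are invented, and the actual HAP workflow additionally clips and harmonises the result against reference precincts:

    # Sketch of the automatic variant: Thiessen (Voronoi) cells around
    # settlement points, dissolved into parish areas. Coordinates and parish
    # labels are invented; the real HAP workflow also clips to the study area.
    from shapely.geometry import MultiPoint, Point
    from shapely.ops import voronoi_diagram, unary_union

    settlements = [                      # (x, y, parish) - hypothetical data
        (0.0, 0.0, "A"), (1.0, 0.2, "A"), (2.0, 1.5, "B"),
        (0.5, 2.0, "B"), (2.5, 0.3, "B"),
    ]

    points = MultiPoint([(x, y) for x, y, _ in settlements])
    cells = voronoi_diagram(points, envelope=points.buffer(3.0))

    # Assign each Voronoi cell to the parish of the settlement it contains,
    # then dissolve the cells parish by parish into border polygons.
    parish_cells = {}
    for cell in cells.geoms:
        for x, y, parish in settlements:
            if cell.contains(Point(x, y)):
                parish_cells.setdefault(parish, []).append(cell)
                break

    parish_areas = {p: unary_union(c) for p, c in parish_cells.items()}
    for parish, area in parish_areas.items():
        print(parish, round(area.area, 2))

The shared edges of the dissolved parish polygons are the automatically generated historical borders that the article compares with the manually interpolated and precinct-based variants.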

Abstract

The authors analyse the issues inherent in the implementation of a multi-sensory mobile application which uses a map as an interface for an edutainment-style city guide for tourists. Two models – the triad of tourist experiences (3E), i.e. education, entertainment, and excitement, and Abraham Maslow's hierarchy of needs – were used as the basis for identifying the conditions such an application should meet to encourage its use by local residents and tourists in equal measure. It was decided that only open source software would be used to achieve the goal of the application.

Abstract

The author of this paper sets out to identify and date the individual editions of Atlas selectus that were published in Leipzig by Johann George Schreiber and his heirs. The paper is based on the analysis of maps and printed registers from 27 unique editions of the atlas. After a brief summary of Schreiber’s non-cartographic activity, the atlas is presented and dating of its eight editions is proposed, from 1741 to the end of the 18th century. Some special editions of the atlas are also identified and briefly described. The estimated production rate of maps in Schreiber’s workshop is compared with earlier research. Finally, a few misconceptions concerning Schreiber and his cartography are clarified.

Abstract

Selection is a key element of the cartographic generalisation process, often being its first stage. On the other hand, it is also a component of other generalisation operators, such as simplification. One of the approaches used in generalisation is the condition-action approach. The author uses a condition-action approach based on three types of rough logic (Rough Set Theory (RST), Dominance-Based Rough Set Theory (DRST) and Fuzzy-Rough Set Theory (FRST)), checking the possibility of their use in the process of selecting topographic objects (buildings, roads, rivers) and comparing the obtained results. The complexity of the decision system (the number of rules and their conditions) and its effectiveness are assessed, both quantitatively and qualitatively – through visual assessment. The research conducted indicates the advantage of the DRST and RST approaches (with the CN2 algorithm) owing to the quality of the obtained selection, the greater simplicity of the decision system, and the more mature software tools enabling the use of these systems. At this stage, the FRST approach, which is characterised by the highest complexity of the created rules and the worst selection results, is not recommended. The particular approaches have limitations resulting from the need to select appropriate measurement scales for the attributes used in them. Special attention should be paid to the selection of network objects, for which the use of a condition-action approach alone, without maintaining the consistency of the network, may not produce the desired results. Unlike approaches based on classical logic, rough approaches allow the use of incomplete or contradictory information. The proposed tools can, in their current form, find auxiliary use in the selection of topographic objects, and potentially also in other generalisation operators.
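
The condition-action idea underlying all three rough-set variants can be illustrated with a deliberately simplified, hand-written rule set for building selection (the attribute names, thresholds and rules are invented; in the article the rules are induced with the RST, DRST and FRST approaches rather than written by hand):

    # Deliberately simplified condition-action rules for building selection.
    # Attribute names, thresholds and rules are invented for illustration;
    # in the article the rules are induced with RST / DRST / FRST approaches.
    from dataclasses import dataclass

    @dataclass
    class Building:
        area_m2: float         # footprint area
        dist_to_road_m: float  # distance to the nearest road
        landmark: bool         # e.g. church, castle

    def keep(b: Building) -> bool:
        """Condition-action decision: True = retain the building after selection."""
        if b.landmark:                          # rule 1: always keep landmarks
            return True
        if b.area_m2 >= 200:                    # rule 2: keep large buildings
            return True
        if b.area_m2 >= 80 and b.dist_to_road_m <= 30:
            return True                         # rule 3: medium buildings near roads
        return False                            # default action: omit

    buildings = [Building(350, 120, False), Building(60, 10, False),
                 Building(90, 20, False), Building(45, 300, True)]
    print([keep(b) for b in buildings])         # -> [True, False, True, True]

The rough-set methods differ in how such rules are derived from (possibly incomplete or contradictory) training data and in how many rules and conditions the resulting decision system contains, which is precisely what the article evaluates.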