Browse

1 - 10 of 135 items (filtered by: History of Science)
Abstract

Selection is a key element of the cartographic generalisation process, often being its first stage. It is also a component of other generalisation operators, such as simplification. One approach used in generalisation is the condition-action approach. The author applies a condition-action approach based on three types of rough logic (Rough Set Theory (RST), Dominance-Based Rough Set Theory (DRST) and Fuzzy-Rough Set Theory (FRST)), checking the possibility of their use in the selection of topographic objects (buildings, roads, rivers) and comparing the results obtained. The complexity of the decision system (the number of rules and their conditions) and its effectiveness are assessed both quantitatively and qualitatively – through visual assessment. The research indicates the advantage of the DRST and RST approaches (with the CN2 algorithm) due to the quality of the obtained selection, the greater simplicity of the decision system, and the more mature software tools supporting these systems. At this stage, the FRST approach, which produces the most complex rules and the worst selection results, is not recommended. Each approach has limitations resulting from the need to select appropriate measurement scales for the attributes it uses. Special attention should be paid to the selection of network objects, for which a condition-action approach alone, without maintaining the consistency of the network, may not produce the desired results. Unlike approaches based on classical logic, rough approaches allow the use of incomplete or contradictory information. The proposed tools can, in their current form, find auxiliary use in the selection of topographic objects, and potentially also in other generalisation operators.
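A condition-action rule of the kind such decision systems induce can be sketched as follows. The attribute names and thresholds here are hypothetical illustrations; in the article, rules of this form are learned from data (e.g. by the CN2 algorithm) rather than written by hand.

```python
# Minimal sketch of a condition-action selection rule for buildings.
# Attribute names (area_m2, dist_to_road_m) and thresholds are invented
# for illustration; real rules would be induced from training data.

def select_building(building):
    """Return True if the building should be retained after selection."""
    # Condition part: a conjunction of attribute tests.
    if building["area_m2"] >= 50 and building["dist_to_road_m"] <= 100:
        return True   # action: retain the object
    return False      # action: omit the object

buildings = [
    {"id": 1, "area_m2": 120, "dist_to_road_m": 15},
    {"id": 2, "area_m2": 30,  "dist_to_road_m": 5},
]
selected = [b["id"] for b in buildings if select_building(b)]
print(selected)  # → [1]: building 2 is omitted as too small
```

A rough-set rule base differs from this hand-written rule mainly in how the conditions are derived: they come from lower/upper approximations of the decision classes, which tolerate the incomplete or contradictory examples mentioned above.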

Abstract

Databases are a basic component of every GIS and of many geoinformation applications. They also hold a prominent place in any cartographer’s toolkit. Solutions based on the relational model have long been the standard, but there is a new, increasingly popular technological trend: solutions based on NoSQL databases, which have many advantages for processing large data sets. This paper compares the performance of selected relational and NoSQL spatial databases executing queries with selected spatial operators. It was hypothesised that a non-relational solution would prove more effective, which the results of the study confirmed. The same spatial data set was loaded into PostGIS and MongoDB databases, which standardised the data for comparison purposes. SQL queries and JavaScript commands were then used to perform specific spatial analyses, while the parameters needed to compare performance were measured. The study’s results reveal which approach is faster and uses fewer computer resources. However, it is difficult to identify conclusively which technology is better, because a number of other factors have to be considered when choosing the right tool.
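The comparison methodology can be sketched as the same spatial predicate expressed for both back-ends plus a timing wrapper. The table, collection and field names below are hypothetical (the abstract does not publish the exact queries); only the operators (`ST_Intersects`/`ST_MakeEnvelope` in PostGIS, `$geoIntersects` in MongoDB) are the standard ones.

```python
# Hedged sketch: equivalent spatial-intersection queries for PostGIS and
# MongoDB, with a simple wall-clock timing harness. Table/collection and
# field names are invented for illustration.
import time

POSTGIS_QUERY = """
SELECT name
FROM buildings
WHERE ST_Intersects(geom, ST_MakeEnvelope(21.0, 52.2, 21.1, 52.3, 4326));
"""

MONGODB_QUERY = {
    "geometry": {
        "$geoIntersects": {
            "$geometry": {
                "type": "Polygon",
                "coordinates": [[[21.0, 52.2], [21.1, 52.2],
                                 [21.1, 52.3], [21.0, 52.3],
                                 [21.0, 52.2]]],
            }
        }
    }
}

def time_query(run_query):
    """Run a query callable and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = run_query()
    return result, time.perf_counter() - start

# Stand-in callable; in a real benchmark this would execute the query
# against a live database connection.
result, elapsed = time_query(lambda: "query result")
print(elapsed >= 0.0)
```

Repeating `time_query` over many runs and averaging, for each back-end on the identical data set, gives the kind of performance figures the study compares.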

Abstract

Cartographic sources and methods are the basic tools of historical geography. One of the main research trends in this field is the analysis of the spatial layout and number of old settlement units. Confronting maps with historical data allows a town’s extent at a given time to be reconstructed. The retrogression (R) and progression (P) methods currently used are imperfect: the resulting model (map) is usually incomplete and its reliability limited. In the author’s opinion, combining retrogression and progression (a new method, denoted K) increases the quality of cartographic reconstruction of natural and cultural landscapes. Using basic set operations allows the reliability of the components of the cartographic model to be differentiated, because the common part of the retrogression and progression models represents mutual verification of source data. Quantitative assessments of the effectiveness of retrogression (R), progression (P) and the combined method (K) can be made for countable elements (e.g. buildings). In the conducted study, the effectiveness of each method was calculated as R = 76% for retrogression, P = 59% for progression and K = R ∪ P = 85% for the combined method. The mutual verification of the methods (R ∩ P) covered 45% of residential buildings. The author describes the proposed new method and the course of the verification research.
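The combined method's set arithmetic can be reproduced on a toy population of 100 countable elements. The element identifiers below are invented; the toy sets are sized to match the abstract's R = 76%, P = 59% and K = 85%, from which |R ∩ P| = 50 follows by inclusion-exclusion (the abstract's 45% figure refers to residential buildings specifically).

```python
# Toy reproduction of the combined method's set operations on a
# population of 100 countable elements (e.g. buildings). Identifiers
# are synthetic; only the set arithmetic mirrors the method.
retro = set(range(0, 76))    # recovered by retrogression (R): 76 of 100
progr = set(range(26, 85))   # recovered by progression (P): 59 of 100

combined = retro | progr     # K = R ∪ P, the combined model
verified = retro & progr     # R ∩ P, mutually verified elements

print(len(combined))   # 85  -> K = 85%, as in the abstract
print(len(verified))   # 50  -> |R| + |P| - |R ∪ P| = 76 + 59 - 85
```

The union measures coverage (how much of the landscape either method reconstructs); the intersection measures reliability (the part where the two independent sources confirm each other).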

Abstract

This article critically explores Foucauldian approaches to the human-animal-technology nexus central to modern industrialised agriculture, in particular those which draw upon Foucault’s conception of power as productive to posit the reconstitution of animal subjectivities in relation to changing agricultural technologies. This is situated in the context of key recent literature addressing animals and biopolitics, and worked through a historical case study of an emergent dairy technology. On this basis it is argued that such approaches contain important insights but also involve risks for the analyses of human-animal-technology relations, especially the risk of subsuming what is irreducible in animal subjectivity and agency under the shaping power of technologies conceived as disciplinary or biopolitical apparatuses. It is argued that this can be avoided by bringing biopolitical analysis into dialogue with currents from actor-network theory in order to trace the formation of biopolitical collectives as heterogeneous assemblages. Drawing upon documentary archive sources, the article explores this by working these different framings of biopolitics through a historical case study of the development of the first mechanical milking machines for use on dairy farms.

Abstract

The authors’ main goal is to highlight the additional research potential of the method of analysing changes in the routes and names of streets introduced by Paweł E. Weszpiński in 2012. The proposed method was based on the old city maps of Warsaw and, according to Weszpiński, described “wandering streets and their names”. Taking the changing routes and names of streets on Lublin city maps from the last century as the research subject, the authors demonstrate that the method can be used to analyse how urban spaces are perceived and how they function in the minds of local residents. The authors propose to modify the method by adding one more important factor – the function of the place or street affected by the “wandering”. They claim that the study of changes in streets’ topography, territorial scope and names should be supplemented each time with an analysis of the administrative, economic or social significance of the place.

Abstract

This paper examines the development of socio-technical strategies and practices for the study and display of marine mammals. It considers how techniques initially developed for terrestrial use were deployed under marine conditions, not least through the adaptation of strategies and habits originating in industry, commerce and the military, in order to facilitate researchers’ access to their subjects. In particular, it examines how these methodological developments intersected with the terrestrial display of marine mammals. Throughout, it shows how the agency of the animals under observation had a key role to play in the emergence of cetology as a profession and as a form of knowledge.

Abstract

The author presents a geospatial analysis of the Peru-Chile Trench in the South Pacific Ocean using the Generic Mapping Tools (GMT) scripting toolset to process and model data sets. The goal of the study is geomorphological modelling through a comparison of two segments of the trench, located in its northern (Peruvian) and southern (Chilean) parts. The study also aims to digitize profiles automatically using GMT and several scripting modules. Orthogonal cross-section profiles transecting the trench perpendicularly were automatically digitized, then visualized and compared. The profiles show variations in the geomorphology of the trench between the northern and southern segments. To visualize the geological and geophysical setting, a set of thematic maps was produced with GMT modules: free-air gravity anomaly, geoid, geology and bathymetry. The descriptive statistical analysis of the bathymetry in both segments shows that the most frequent depths range from -4,000 to -4,200 m for the Chilean segment (827 recorded samples) versus -4,500 to -4,700 m for the Peruvian segment (1,410 samples). The Peruvian segment of the trench is deeper and its geomorphology steeper, with abrupt slopes, compared to the Chilean segment. A comparison of the data distributions for both segments gives the following results. The Peruvian segment has its largest share of data (23%, 1,410 samples) in the -4,500 m to -4,700 m range. This peak shows a steep pattern in the data distribution, while counts in the neighbouring ranges are significantly lower: 559 samples (-4,700 m to -5,000 m) and 807 samples (-4,200 m to -4,400 m). The Chilean segment has a more uniform data distribution at depths of -6,000 m to -7,000 m. The paper presents a GMT workflow for automatic cartographic modelling and mapping of deep-sea trench geomorphology.
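The "most frequent depth" figures come from binning bathymetric samples and finding the modal bin. A minimal stand-in for that histogram step, using synthetic depth values rather than the paper's bathymetric data, might look like:

```python
# Sketch of the histogram step behind the "most frequent depth" counts:
# bin depth samples (negative metres below sea level) into fixed-width
# intervals and report the most populated bin. Sample values are synthetic.
from collections import Counter

def modal_depth_bin(depths, bin_size=200):
    """Return ((bin_start, bin_start + bin_size), count) for the most
    populated depth bin; each bin covers [bin_start, bin_start + bin_size)."""
    bins = Counter((d // bin_size) * bin_size for d in depths)
    start, count = bins.most_common(1)[0]
    return (start, start + bin_size), count

# Synthetic samples clustered near -4,500 m, echoing the Peruvian segment:
samples = [-4500, -4550, -4590, -4100, -6200]
print(modal_depth_bin(samples))  # → ((-4600, -4400), 3)
```

In the paper this counting is driven by GMT modules over the full cross-section data set; the sketch only shows the binning arithmetic that produces counts such as 1,410 samples in one depth interval.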