Search Results

1 - 2 of 2 items

  • Author: Marc Halbrügge

Keep it simple - A case study of model development in the context of the Dynamic Stocks and Flows (DSF) task

This paper describes the creation of a cognitive model submitted to the ‘Dynamic Stocks and Flows’ (DSF) modeling challenge. This challenge aims to compare computational cognitive models of human behavior during an open-ended control task. Participants in the modeling competition were provided with a simulation environment and training data for benchmarking their models, while the actual specification of the competition task was withheld. To meet this challenge, the cognitive model described here was designed and optimized for generalizability. Only two simple assumptions about human problem solving were used to explain the empirical findings in the training data. In-depth analysis of the data set prior to model development led to the dismissal of correlations and other parametric statistics as goodness-of-fit indicators. A new statistical measure based on rank orders and sequence matching techniques is proposed instead. Applied to the human sample, this measure also identifies clusters of subjects who use different strategies for the task. The acceptability of the fits achieved by the model is verified using permutation tests.
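The combination of sequence matching and permutation testing described in the abstract can be illustrated with a minimal sketch. Everything here is hypothetical: the action sequences, the use of `difflib.SequenceMatcher` as the sequence-matching step, and the shuffle-based null distribution are stand-ins, not the paper's actual measure.

```python
import difflib
import random

def sequence_fit(model_seq, human_seq):
    # Similarity of two ordered action sequences in [0, 1],
    # via difflib's matching-subsequence ratio (illustrative
    # stand-in for the paper's rank-order measure).
    return difflib.SequenceMatcher(None, model_seq, human_seq).ratio()

def permutation_p_value(model_seq, human_seq, n_perm=1000, seed=42):
    # Null hypothesis: the model's fit to the human sequence is no
    # better than that of a randomly reordered model sequence.
    rng = random.Random(seed)
    observed = sequence_fit(model_seq, human_seq)
    hits = 0
    for _ in range(n_perm):
        shuffled = list(model_seq)
        rng.shuffle(shuffled)
        if sequence_fit(shuffled, human_seq) >= observed:
            hits += 1
    # Add-one correction keeps the estimate strictly positive.
    return (hits + 1) / (n_perm + 1)

# Hypothetical control actions (e.g. raise/lower/hold an inflow).
human = ["inc", "inc", "dec", "hold", "dec", "inc", "hold", "dec"]
model = ["inc", "inc", "dec", "dec", "dec", "inc", "hold", "dec"]
p = permutation_p_value(model, human)
```

A small p-value indicates the model matches the human ordering better than chance reorderings of the same actions would, which is the sense in which a permutation test can certify the "acceptability" of a fit without parametric assumptions.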

Exploration for Understanding in Cognitive Modeling

The cognitive modeling and artificial general intelligence research communities may reap greater scientific return on research investments, that is, achieve an improved understanding of architectures and models, if there is more emphasis on systematic sensitivity and necessity analyses during model development, evaluation, and comparison. We demonstrate this methodological prescription with two of the models submitted to the Dynamic Stocks and Flows (DSF) Model Comparison Challenge, exploring the complex interactions among architectural mechanisms, knowledge-level strategy variants, and task conditions. To cope with the computational demands of these analyses, we use a predictive analytics approach similar to regression trees, combined with parallelization on high-performance computing clusters, to enable large-scale, simultaneous search and exploration.
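The core of a regression-tree-style sensitivity analysis is splitting a sweep of (parameter, fit-score) results so as to reduce score variance, which reveals which parameter regions matter. The sketch below is a hypothetical, self-contained illustration: `simulate_model`, its parameters (`decay`, `strategy`), and the fit surface are all invented stand-ins for real architecture runs.

```python
import random

def simulate_model(decay, strategy, noise_rng):
    # Stand-in for one expensive cognitive-model run: returns a
    # goodness-of-fit score for a parameter/strategy combination.
    # (Purely illustrative; real DSF model runs are far costlier,
    # hence the paper's use of HPC-cluster parallelization.)
    base = 0.9 - 0.5 * abs(decay - 0.5)
    if strategy == "reactive":
        base -= 0.2
    return base + noise_rng.gauss(0, 0.02)

def best_split(rows, feature):
    # One regression-tree step: find the threshold on `feature`
    # that most reduces the variance of the fit scores.
    def variance(scores):
        if not scores:
            return 0.0
        m = sum(scores) / len(scores)
        return sum((s - m) ** 2 for s in scores) / len(scores)

    total = variance([r["fit"] for r in rows])
    best = None
    for v in sorted({r[feature] for r in rows})[:-1]:
        left = [r["fit"] for r in rows if r[feature] <= v]
        right = [r["fit"] for r in rows if r[feature] > v]
        within = (len(left) * variance(left)
                  + len(right) * variance(right)) / len(rows)
        gain = total - within  # variance explained by the split
        if best is None or gain > best[1]:
            best = (v, gain)
    return best

# Sweep the (hypothetical) parameter grid, 20 noisy runs per cell.
rng = random.Random(0)
rows = [{"decay": d, "fit": simulate_model(d, s, rng)}
        for d in [0.1, 0.3, 0.5, 0.7, 0.9]
        for s in ["reactive", "anticipatory"]
        for _ in range(20)]
split_value, gain = best_split(rows, "decay")
```

Recursing on each side of the chosen split grows a full tree; because every grid cell can be simulated independently, the sweep itself is trivially parallelizable, which matches the cluster-based approach the abstract describes.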