Performance-based research funding systems (PRFSs) have been installed in several countries around the globe, many of them European (Debackere et al., 2018; Zacharewicz et al., 2018). Hicks (2012) characterized PRFSs as "national systems of research output evaluation used to distribute research funding to universities". The Flemish Government introduced the BOF-key (BOF stands for "Bijzonder Onderzoeksfonds" or "University Research Fund" in English) as a mechanism to distribute research funding between the Flemish universities.
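The general logic of such a distribution key can be sketched in code. The weights, indicators, and figures below are purely hypothetical illustrations of a proportional, indicator-weighted allocation, not the actual BOF-key parameters:

```python
# Illustrative sketch of a performance-based distribution key.
# All weights and scores are invented for illustration; they are
# NOT the actual BOF-key indicators or values.

def allocate(budget, scores, weights):
    """Distribute a budget proportionally to weighted indicator scores."""
    # Weighted composite score per university
    composite = {
        uni: sum(weights[k] * v for k, v in indicators.items())
        for uni, indicators in scores.items()
    }
    total = sum(composite.values())
    # Each university receives a share proportional to its composite score
    return {uni: budget * s / total for uni, s in composite.items()}

weights = {"publications": 0.5, "citations": 0.3, "phd_degrees": 0.2}
scores = {
    "University A": {"publications": 120, "citations": 900, "phd_degrees": 30},
    "University B": {"publications": 80, "citations": 600, "phd_degrees": 20},
}
shares = allocate(1_000_000, scores, weights)
```

The key design choice in real systems of this kind lies in which indicators enter the composite score and how they are weighted, since the allocation is zero-sum: one university's gain is another's loss.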
The performance-based research funding system (PRFS) in Poland was created in 1990 (Kulczycki, Korzeń, & Korytkowski, 2017). Since then, the Polish research evaluation model has evolved several times, from being peer-review based to a mostly metrics-based model. In July 2018, a new law for the science and higher education sectors was adopted. Its goal was to construct a new, coherent and clear regulation of the functioning of the system of science and higher education. These new regulations redesign the metric-based evaluation.
1 Background and motivation
Funding constitutes one of the main channels through which authority is exercised over research. Changes in the design of funding systems can accordingly be expected to have significant effects on the production of scientific knowledge (Whitley, Gläser & Engwall, 2010), and a detailed understanding of the design and effects of national research funding mechanisms is therefore vital (Aagaard, 2017). This is not least the case for performance-based research funding systems (PBRFS), which have spread widely during recent decades.
Considerable effort has been devoted to benchmarking research performance at the international, domestic, and intramural levels (Abramo & D'Angelo, 2011). Yet the assessment of research activities is complex and contentious, and researchers, funders and decision makers try to adapt with ever-improving and expanding methods and indicators (James et al., 2015). Measuring research performance nevertheless remains a challenge all over the world (Huang et al., 2017). Traditional assessments are usually based on composite indices that take many dimensions together, such as research investment (e.g., in labor force and other resources).
Although artistic research is funded under dedicated schemes (Borgdorff, 2012; Jewesbury, 2009) or included in university performance evaluations (e.g. Lewandowska & Stano, 2018), there is still little clarity on what exactly constitutes artistic research, how it is distinct from professional art practice in general, and which parameters and indicators could guide the quality assessment of its output. Authors such as Wissler (1997), Biggs and Karlsson (2011) and Hellström (2010) emphasize that the field would benefit from disciplinary meta-reflection to generate a "new paradigm" by which to conceive of artistic research.
Quantitative evaluation studies monitoring the scientific performance of actors (countries, institutes, individuals, and also journals) primarily regard scholarly output and impact. Output is usually quantified by counting publications. Impact measurement usually concerns scientific impact, which, framed as a proxy for quality, is measured by citations.
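The output and impact quantification described above can be sketched minimally as follows. The actors and citation figures are invented example data; the h-index is shown only as one common citation-based summary, not as the measure any particular evaluation study uses:

```python
# Minimal sketch of output/impact quantification: output as the number
# of publications, impact as citation counts, plus the h-index as one
# widely used citation-based summary. Example data is invented.

def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

# Citations per publication for two hypothetical actors
records = {
    "Institute X": [10, 8, 5, 4, 3, 0],
    "Institute Y": [25, 2, 1, 0],
}

summary = {
    actor: {"output": len(c), "citations": sum(c), "h_index": h_index(c)}
    for actor, c in records.items()
}
```

Here "Institute X" has output 6, 30 citations and an h-index of 4; "Institute Y" has output 4, 28 citations and an h-index of 2, illustrating how different citation distributions yield different impact summaries for similar totals.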
Assessing other impacts, for instance societal impact, is much more complicated and certainly less developed. Traceable effects of research in the societal context vary from one case to another.