Impact Evaluation of an Emerging European Health Project – the MIDAS Model

Dublin City University, Ireland


Background: This paper describes the impact evaluation of a large big data platform initiative, an evaluation undertaken to increase the probability of the platform's success. The initiative, MIDAS (Meaningful Integration of Data Analytics and Services), is a European health-focused Horizon 2020 project comprising a consortium of universities, research institutions, and government agencies.

Objectives: The purpose of this paper is to present a pioneering platform that will support healthcare policymakers in their decision-making by enabling greater and more efficient use of their data, and to present and evaluate the results of the MIDAS project across four countries.

Methods/Approach: The literature is replete with examples of worthwhile technology projects that have failed because of user resistance. To avoid such failure, and to ensure the success of the final MIDAS platform, a detailed impact evaluation is being undertaken at set points during development.

Results: This paper describes the impact evaluation process, outlining the use of Q-methodology and the development of a 36-item concourse administered through the HTMLQ system.
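The core quantitative step in a Q-methodology study is to correlate participants' Q-sorts with one another and factor-analyse that by-person correlation matrix. The sketch below is a generic illustration of that step, not the MIDAS analysis itself: the simulated data, number of participants, and choice of principal components over centroid extraction are all illustrative assumptions.

```python
import numpy as np

# Hypothetical example (not MIDAS study data): each participant rank-orders
# the statements of the concourse into a forced quasi-normal distribution;
# persons, not items, are then factor-analysed.
rng = np.random.default_rng(0)

n_statements = 36    # matches the 36-item concourse described above
n_participants = 10  # illustrative number of Q-sorts

# Simulate Q-sorts: each column is one participant's ranking of the 36
# statements, using ranks from -4 ("most disagree") to +4 ("most agree").
values = np.repeat(np.arange(-4, 5), 4)  # 9 ranks x 4 slots = 36 statements
sorts = np.column_stack(
    [rng.permutation(values) for _ in range(n_participants)]
)

# By-person correlation matrix: how similarly each pair of participants
# sorted the statements.
corr = np.corrcoef(sorts, rowvar=False)

# Factor extraction via eigendecomposition (PCA); Q studies also commonly
# use centroid extraction, but PCA keeps this sketch short.
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Loadings of each participant on the leading factors; participants who
# load together share a viewpoint.
n_factors = 2
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])

print("correlation matrix shape:", corr.shape)
print("participant loadings shape:", loadings.shape)
```

In practice tools such as HTMLQ handle the sorting interface, and the resulting factors are rotated and interpreted qualitatively; this sketch covers only the correlation-and-extraction step.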

Conclusions: This research contributes to the overall understanding of how impact evaluation can be undertaken at set points during the development of an innovative technology for organisational use.


