The Relationship Between Quality of Student Contribution in Learning Activities and their Overall Performances in an Online Course

Rajabalee Yousra Banoor 1, Frank Rennie 2 and Mohammad Issack Santally 3
  • 1 University of Mauritius, Mauritius
  • 2 University of the Highlands and Islands, UK
  • 3 University of Mauritius, Mauritius

Abstract

In this research we studied the correlation between students’ level of online participation and their overall performance. We examined the participation of two large cohorts of learners in the different learning activities assigned to them, and compared it with their final grades at the end of the year. We defined the quality of participation in terms of the level of the learning activities in which students took part. Learning activities were grouped into four levels: knowledge, understanding, critical thinking skills and practical competencies. The findings revealed that participation in higher-order online learning activities, that is, those requiring critical thinking skills and practical competencies, was associated with better grades in the module. However, the results also showed that, overall, students tended to score higher marks in the knowledge category, as these activities required lower-order cognitive skills. It was further observed that low performers tended to obtain lower marks across all four levels, and vice versa for high performers. Two key conclusions can be drawn from the findings. The first concerns the instructional design of such online courses, where learning activities targeting the development of different types of skills need to be included; the second concerns the distribution and weighting of marks across these categories. The recommendation is that for first-year students, a greater weighting of marks toward knowledge-level activities will generally encourage good performance, and this could be gradually reviewed as they move on to level 2 and beyond in their studies.
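The analysis described above amounts to correlating students’ marks in each activity level with their final grades. A minimal sketch of that comparison is shown below; the student records and the correlation values they produce are entirely hypothetical and are not data from the study, which only illustrates the shape of the computation.

```python
# Illustrative sketch (hypothetical data, not the study's dataset):
# correlate marks obtained in each of the four activity levels with
# students' final grades, as in the analysis described in the abstract.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Each record: marks in (knowledge, understanding, critical thinking,
# practical competencies) activities, plus the final grade. Invented values.
records = [
    (80, 70, 60, 65, 72),
    (90, 85, 75, 80, 88),
    (60, 55, 40, 45, 50),
    (75, 70, 68, 70, 74),
    (50, 45, 30, 35, 40),
]

finals = [r[4] for r in records]
levels = ["knowledge", "understanding", "critical thinking", "practical"]
for i, name in enumerate(levels):
    marks = [r[i] for r in records]
    print(f"{name}: r = {pearson(marks, finals):.2f}")
```

In the actual study the same comparison was run over two large cohorts; the sketch only shows the per-category correlation step.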



OPEN ACCESS
