While research has established the importance of questions as a key strategy for facilitating student interaction in online discussions, there is a need to explore how the structure of questions influences students’ interactions. Using learning analytics, we explored the relationship between student-student interaction and the structure of initial questions designed with and without the Practical Inquiry Model (PIM). Degree centrality was used to analyse the number of responses each student sent (out-degree centrality) and the number of responses each student received (in-degree centrality). Findings showed that the number of responses each student sent and received was higher in discussions initiated by PIM question prompts. In addition, the analysis revealed a positive relationship between students’ interaction and the discussions structured with both PIM and non-PIM questions. Finally, there was a significant difference in out-degree centrality, but no significant difference in in-degree centrality, between discussions structured with PIM and non-PIM questions. We conclude that initial questions can be structured using the PIM as a guiding framework to facilitate student-student interaction in online discussions.
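To make the analytic method concrete: out-degree and in-degree centrality here are simply counts of replies sent and received per student in a discussion's reply network. A minimal sketch, assuming a hypothetical reply log of (sender, receiver) pairs (the names and data are illustrative, not from the study):

```python
from collections import Counter

# Hypothetical reply log: each (sender, receiver) pair is one reply
# posted in a discussion thread. Names are illustrative only.
replies = [
    ("ana", "ben"), ("ben", "ana"), ("cal", "ana"),
    ("ana", "cal"), ("ben", "cal"), ("cal", "ana"),
]

# Out-degree centrality (raw count): replies each student sent.
out_degree = Counter(sender for sender, _ in replies)

# In-degree centrality (raw count): replies each student received.
in_degree = Counter(receiver for _, receiver in replies)

print(dict(out_degree))  # e.g. {'ana': 2, 'ben': 2, 'cal': 2}
print(dict(in_degree))   # e.g. {'ana': 3, 'ben': 1, 'cal': 2}
```

Comparing these two distributions across PIM and non-PIM discussions is the kind of contrast the study reports; dedicated tools such as NetworkX offer normalised variants of the same measures.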