ABSTRACT
The "doer effect" is an association between the number of online interactive practice activities students do and their learning outcomes: it is not only statistically reliable but also substantially larger in magnitude than the effects of other learning resources, such as watching videos or reading text. Such an association suggests a causal interpretation (more doing yields better learning), which randomized experimentation would most rigorously confirm. But such experiments are expensive, and any single experiment in a particular course context does not provide rigorous evidence that the causal link will generalize to other course content. We suggest that analytics applied to increasingly available online learning data sets can complement experimental efforts by enabling more widespread evaluation of the generalizability of claims about which learning methods produce better student learning outcomes. We illustrate with analytics that narrow in on a causal interpretation of the doer effect by showing that doing within a course unit predicts learning of that unit's content better than doing in units before or after it. We also provide generalizability evidence, across four different courses involving over 12,500 students, that the learning effect of doing is about six times greater than that of reading.
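The unit-level analysis described above can be illustrated with a minimal regression sketch. This is synthetic, illustrative code under stated assumptions, not the paper's actual model: the variable names, effect sizes, and data-generating process are all hypothetical. The point is only to show the logic of the analytic move: if doing causes learning of a unit's content, then within-unit doing should carry the predictive weight, while doing in the units before and after should not.

```python
# Illustrative sketch (synthetic data): regress a unit's learning outcome
# on practice ("doing") counts from the unit itself and its neighbors.
# All names and parameters here are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 5000  # simulated students

# Simulated practice-activity counts for the target unit and its neighbors.
doing_before = rng.poisson(10, n).astype(float)
doing_within = rng.poisson(10, n).astype(float)
doing_after = rng.poisson(10, n).astype(float)

# Simulated unit outcome: only within-unit doing has a real effect here.
outcome = 0.5 * doing_within + rng.normal(0.0, 2.0, n)

# Ordinary least squares with an intercept (column of ones).
X = np.column_stack([np.ones(n), doing_before, doing_within, doing_after])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)

# beta[2] (within-unit doing) should dominate beta[1] and beta[3],
# which is the signature the analytics look for in real course data.
print(beta)
```

In real data the before/after coefficients would not be exactly zero, so the paper's argument rests on within-unit doing predicting unit learning *more* than out-of-unit doing, not on the neighbors predicting nothing at all.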
Is the doer effect a causal relationship? How can we tell and why it's important