Applied Economics Teaching Resources

an AAEA Journal

Agricultural and Applied Economics Association

Research Article

Does Exam Formatting Affect Grades in Online Agricultural Marketing Courses?

Juan Pachon(a), Bachir Kassas(a), John Lai(a), Gulcan Onel(a)
(a)University of Florida

JEL Codes: A20, A22
Keywords: Bloom’s Taxonomy, exam formatting, online learning, student performance

Publish Date: January 14, 2024

Abstract

Understanding the factors affecting student performance in online exams can help improve the accuracy and equity of performance assessment tools. While a significant body of literature dating back to the 1980s examines the accurate assessment of performance on traditional in-person exams, the literature evaluating exams in online classroom settings remains scarce. During the COVID-19 pandemic, online course offerings, along with online exams in these courses, surged, renewing interest in the extent to which the formatting of exam questions could affect students’ grades. This study contributes to the literature on student exam performance in online classes by evaluating how scores are affected by two exam formatting treatments: ordering exam questions by chapter number and ordering them by question difficulty level. Two exams were administered in an online Agricultural Marketing class in two consecutive semesters. We investigate the treatment effects on average exam scores and on exam grade distributions. The results show that neither exam formatting treatment has a significant impact on grade outcomes.

About the Author: Juan Pachon is an Undergraduate Student with the Department of Food and Resource Economics at the University of Florida. Bachir Kassas is an Assistant Professor with the Department of Food and Resource Economics at the University of Florida (b.kassas@ufl.edu). John Lai is an Assistant Professor with the Department of Food and Resource Economics at the University of Florida. Gulcan Onel is an Associate Professor with the Department of Food and Resource Economics at the University of Florida. Acknowledgments: This study was approved by the Institutional Review Board of the University of Florida (Study Number: IRB202000232, Status: Exempt).

Copyright is governed under Creative Commons CC BY-NC-SA

References

Anaya, L., N. Iriberri, P. Rey-Biel, and G. Zamarro. 2021. “Understanding Performance in Test Taking: The Role of Question Difficulty Order.” CEPR Discussion Paper No. DP16099. SSRN.

Arora, S., P. Chaudhary, and R.K. Singh. 2021. “Impact of Coronavirus and Online Exam Anxiety on Self-Efficacy: The Moderating Role of Coping Strategy.” Interactive Technology and Smart Education 18(3):475–492. doi:10.1108/ITSE-08-2020-0158.

Bard, G., and Y. Weinstein. 2017. “The Effect of Question Order on Evaluations of Test Performance: Can the Bias Dissolve?” Quarterly Journal of Experimental Psychology 70(10):2130–2140. doi:10.1080/17470218.2016.1225108.

Carlson, J.L., and A.L. Ostrosky. 1992. “Item Sequence and Student Performance on Multiple-Choice Exams: Further Evidence.” The Journal of Economic Education 23(3):232–235.

Chen, C., K.T. Jones, M. Lawrence, and J.M. Simpson. 2022. “Can Educators Prevent a ‘Wild West’ Scenario in Giving Online Exams?” Quarterly Review of Distance Education 23(2):43–48.

Chidomere, R.C. 1989. “Test Item Arrangement and Student Performance in Principles of Marketing Examination: A Replication Study.” Journal of Marketing Education 11(3):36–40.

Clark, T.M., C.S. Callam, N.M. Paul, M.W. Stoltzfus, and D. Turner. 2020. “Testing in the Time of COVID-19: A Sudden Transition to Unproctored Online Exams.” Journal of Chemical Education 97(9):3413–3417. doi:10.1021/acs.jchemed.0c00546.

Dadashzadeh, M. 2021. “The Online Examination Dilemma: To Proctor or Not to Proctor?” Journal of Instructional Pedagogies 25:1–11.

Davis, D.B. 2017. “Exam Question Sequencing Effects and Context Cues.” Teaching of Psychology 44(3):263–267. doi:10.1177/0098628317712755.

Denny, P., S. Manoharan, U. Speidel, G. Russello, and A. Chang. 2019. “On the Fairness of Multiple-Variant Multiple-Choice Examinations.” Proceedings of the 50th ACM Technical Symposium on Computer Science Education: 462–468.

Geiger, M.A., and K.A. Simons. 1994. “Intertopical Sequencing of Multiple-Choice Questions: Effect on Exam Performance and Testing Time.” Journal of Education for Business 70(2):87–90.

Hambleton, R.K., and R.E. Traub. 1974. “The Effects of Item Order on Test Performance and Stress.” The Journal of Experimental Education 43(1):40–46.

Hauck, K.B., M.A. Mingo, and R.L. Williams. 2017. “A Review of Relationships between Item Sequence and Performance on Multiple-Choice Exams.” Scholarship of Teaching and Learning in Psychology 3(1):58–75. doi:10.1037/stl0000077.

Heck, J.L., and D.E. Stout. 1991. “Initial Empirical Evidence on the Relationship between Finance Test-Question Sequencing and Student Performance Scores.” Financial Practice and Education 1(1):41–47.

Hodges, C., S. Moore, B. Lockee, T. Trust, and A. Bond. 2020. “The Difference Between Emergency Remote Teaching and Online Learning.” EDUCAUSE Review. https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning.

Kolski, T., and J. Weible. 2018. “Examining the Relationship between Student Test Anxiety and Webcam Based Exam Proctoring.” Online Journal of Distance Learning Administration 21(3). Available at: https://ojdla.com/archive/fall213/kolski_weible213.pdf.

Krathwohl, D.R. 2002. “A Revision of Bloom’s Taxonomy: An Overview.” Theory Into Practice 41(4):212–218.

Lippi, S. 2016. “The Effects of an Online Program and Test Format on Student Performance.” Innovations in Teaching & Learning Conference Proceedings 8:2. doi:10.13021/g8sc76.

Manfuso, L.G. 2020. “How the Remote Learning Pivot Could Shape Higher Ed IT.” EdTech Magazine. https://edtechmagazine.com/higher/article/2020/04/how-remote-learning-pivot-could-shape-higher-ed-it.

Miller, R.M., and M.S. Andrade. 2020. “The Effects of Test Question Order on Task Persistence.” Research & Practice in Assessment 15(1):1–8.

Norman, R.D. 1954. “The Effects of a Forward Retention Set on an Objective Achievement Test Presented Forwards or Backwards.” Journal of Educational & Psychological Measurement 14(3):487–498.

Perlini, A.H., D.L. Lind, and B.D. Zumbo. 1998. “Context Effects on Examinations: The Effects of Time, Item Order and Item Difficulty.” Canadian Psychology 39(4):299–307. doi:10.1037/h0086821.

Rahim, A.F.A. 2020. “Guidelines for Online Assessment in Emergency Remote Teaching during the COVID-19 Pandemic.” Education in Medicine Journal 12(2):59–68. doi:10.21315/eimj2020.12.2.6.

Russell, M., M.J. Fischer, C.M. Fischer, and K. Premo. 2003. “Exam Question Sequencing Effects on Marketing and Management Sciences Student Performance.” Journal for Advancement of Marketing Education 3:1–11.

Stowell, J., and D. Bennett. 2010. “Effects of Online Testing on Student Exam Performance and Test Anxiety.” Journal of Educational Computing Research 42(2):161–171. doi:10.2190/EC.42.2.b.

Vander Schee, B.A. 2009. “Test Item Order, Academic Achievement and Student Performance on Principles of Marketing Examinations.” Journal for Advancement of Marketing Education 14(1):23–29.

Weinstein, Y., and H.L. Roediger. 2012. “The Effect of Question Order on Evaluations of Test Performance: How Does the Bias Evolve?” Memory & Cognition 40:727–735.