Does Test Method Matter? An Investigation of the Effect of a Two-Facet Reading Comprehension Test on Test-Takers’ Scores

Authors

  • Dr. Abdulhamid Mustafa El-Murabet Onaiba, The Libyan Academy for Postgraduate Studies, Misurata Branch, Misurata, Libya

DOI:

https://doi.org/10.65540/jar.v29i1.829

Keywords:

constructed-response, effect, multiple-choice, performance, reading

Abstract

The degree to which test-takers perform on a given assessment may vary depending on the method of presentation. This research investigated the impact of test format—constructed-response (CR) and multiple-choice (MC)—on reading comprehension performance among EFL university students. Twenty-four students from a Reading III course took both assessments, with the CR test administered first and the MC test a few days later. A paired-samples t-test revealed significant differences in performance, with students achieving higher scores on the MC format. To explore response discrepancies, follow-up interviews were conducted with students who gave different answers on the two test formats. The results highlighted several key factors contributing to these differences, including time pressure during the CR test, the presence of answer options in the MC format, and the ability to make educated guesses. While these findings suggest that MC items may be more accessible under timed conditions, the study acknowledges the potential influence of order effects and individual student attributes. Future research should investigate these factors further by employing counterbalanced designs and exploring the impact of test formats on higher-order cognitive skills.

References

Arshad, A., Shakir, A., & Ahmad, M. (2020). A Review on the Principles of a Reading Comprehension Test Construction to Assess the Test Takers at Different Levels. Psychology and Education, 57(8), 1290-1302.

Campbell, J. R. (1999). Cognitive processes elicited by multiple-choice and constructed response questions on an assessment of reading comprehension. Doctoral Dissertation, Temple University. (UMI No. 9938651)

Chehrazad, M. H., & Ajideh, P. (2015). Effects of Different Response Types on Iranian EFL Test Takers’ Performance. Iranian Journal of Applied Language Studies, 5(2), 29-50.

Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). SAGE.

Cronbach, L. J. (1988). Five perspectives on validity argument. In H. Wainer & H. I. Braun (Eds.), Test validity (pp. 3-17). Lawrence Erlbaum Associates.

Cunningham, G. K. (1998). Assessment in the classroom: Constructing and interpreting tests. Psychology Press.

Currie, M., & Chiramanee, T. (2010). The effect of the multiple-choice item format on the measurement of knowledge of language structure. Language Testing, 27(4), 471-491. DOI: https://doi.org/10.1177/0265532209356790

Duran, E., & Tufan, B. S. (2017). The Effect of open-ended questions and multiple-choice questions on comprehension. International Journal of Languages’ Education and Teaching, 5(1), 242-254. DOI: https://doi.org/10.18298/ijlet.1676

Fulcher, G., & Harding, L. (2024). The Routledge Handbook of Language Testing. Routledge.

Gravetter, F. J., & Forzano, L. B. (2012). Research methods for the behavioral sciences (4th ed.). Cengage Learning.

Hancock, G. R. (1994). Cognitive complexity and the comparability of multiple-choice and constructed-response test formats. Journal of Experimental Education, 62(2), 143-157. DOI: https://doi.org/10.1080/00220973.1994.9943836

Hassani, L., & Maasum, T. N. R. T. M. (2012). A study of students’ reading performance in two test formats of summary writing and open-ended questions. Procedia-Social and Behavioral Sciences, 69, 915-923. DOI: https://doi.org/10.1016/j.sbspro.2012.12.016

Hughes, A. & Hughes, J. (2020). Testing for language teachers. Cambridge University Press & Assessment. DOI: https://doi.org/10.1017/9781009024723

Kaçar, K. (2023). Commonly Used Techniques in Testing Foreign Language Skills and Communicative Language Teaching. Sinerji Uluslararası Alan Eğitimi Araştırmaları Dergisi, 4(2), 68-85. DOI: https://doi.org/10.54971/synergy.1381532

Karimi, F., Tabrizi, H. H., Aval, A. N., & Khorvash, F. (2014). Test Method Facet: Is Response a Facet? International Journal of Current Life Sciences, 4(11), 9010-9015.

Kennedy, P., & Walstad, W. B. (1997). Combining multiple-choice and constructed-response test scores: An economist's view. Applied Measurement in Education, 10(4), 359-375. DOI: https://doi.org/10.1207/s15324818ame1004_4

Lim, H. (2019). Test format effects: a componential approach to second language reading. Language Testing in Asia, 9, 1-22. DOI: https://doi.org/10.1186/s40468-019-0082-y

Martinez, M. E. (1999). Cognition and the question of test item format. Educational Psychologist, 34(4), 207-218. DOI: https://doi.org/10.1207/s15326985ep3404_2

Onaiba, A. E., & Jannat, F. B. (2019). Test Method Effect and Test-Takers' Scores: A Critical Review of the Pertinent Literature. Scientific Journal of Faculty of Education, Misurata University, 1(14), 3-29.

Polat, M. (2020). Analysis of Multiple-Choice versus Open-Ended Questions in Language Tests According to Different Cognitive Domain Levels. Novitas-ROYAL (Research on Youth and Language), 14(2), 76-96.

Rashidi, N., & Safari, F. (2014). Does the type of multiple-choice item make a difference? The case of testing grammar. International Journal of Language Testing, 4(2), 175-186.

Salehi, M., & Sanjareh, H. B. (2013). The impact of response format on learners' test performance of grammaticality judgment tests. Journal of Basic and Applied Scientific Research, 3(2), 1335-1345.

Sarsarabi, S. S., & Sazegar, Z. (2023). Investigating Different Kinds of Stems in Multiple-Choice Tests: Interruptive vs. Cumulative. International Journal of Language Testing, 13(2), 170-187.

Shahivand, Z., Paziresh, A., & Raeeszadeh, A. (2014). The effects of test formats on the performances of Iranian EFL students. Theory and Practice in Language Studies, 4(2), 366-373. DOI: https://doi.org/10.4304/tpls.4.2.366-373

Shohamy, E. (1984). Does the testing method make a difference? The case of reading comprehension. Language Testing, 1(2), 147–170. DOI: https://doi.org/10.1177/026553228400100203

Tibbitts, E. T. (1974). Exercises in Reading Comprehension. Singapore: Huntsmen Offset Printing Pte Ltd.

Tsagari, C. (1994). Method effects on testing reading comprehension: How far can we go? Unpublished MA thesis, University of Lancaster, UK.

Vasan, M. C. A., DeFouw, D. O., Holland, B. K., & Vasan, N. S. (2017). Analysis of testing with MC versus open-ended questions: Outcome-based observations in an anatomy course. Anatomical Sciences Education, 11(3), 254-261. DOI: https://doi.org/10.1002/ase.1739

Wainer, H., & Thissen, D. (1993). Combining multiple-choice and constructed-response test scores: Toward a Marxist theory of test construction. Applied Measurement in Education, 6(2), 103-118. DOI: https://doi.org/10.1207/s15324818ame0602_1

Walstad, W. B., & Becker, W. E. (1994). Achievement differences on multiple-choice and essay tests in economics. American Economic Review, Papers and Proceedings, 84(2), 193-196.

Wolf, D. F. (1993). A comparison of assessment tasks used to measure FL reading comprehension. The Modern Language Journal, 77(4), 473–489. DOI: https://doi.org/10.1111/j.1540-4781.1993.tb01995.x

Xie, S., Singh, C. K. S., & Wong, W. L. (2024). A Systematic Literature Review of the Effect of Test Methods on English Learners' Reading Performance. Journal of Humanities, Arts and Social Science, 8(7), 1591-1599. DOI: https://doi.org/10.26855/jhass.07.009

Published

2025-01-01

How to Cite

Onaiba, A. M. E.-M. (2025). Does Test Method Matter? An Investigation of the Effect of a Two-Facet Reading Comprehension Test on Test-Takers' Scores. Journal of Academic Research, 29(1), 38–56. https://doi.org/10.65540/jar.v29i1.829

Issue

Section

Arabic and English Languages