PERTANIKA JOURNAL OF SOCIAL SCIENCES AND HUMANITIES


e-ISSN 2231-8534
ISSN 0128-7702


Exploring Postgraduate Students’ Experience with Rubric-referenced Assessment: Limitations and Solutions

Liang Jing Teh, Su Luan Wong, Mas Nida Md Khambari, Rahmita Wirza O. K. Rahmat and Sai Hong Tang

Pertanika Journal of Social Sciences and Humanities, Volume 33, Issue 2, April 2025

DOI: https://doi.org/10.47836/pjssh.33.2.03

Keywords: Analytic rubrics, areas of consideration, focus group discussion, higher education, limitations, postgraduate students, rubric-referenced assessment

Published on: 2025-04-30

Despite the various benefits of rubric-referenced assessment (RRA), multiple studies have revealed its potential pitfalls. Given the scarcity of research on the limitations of RRA and their solutions in the context of Malaysian postgraduates, this study explores the limitations of RRA and proposes potential strategies for improvement from the perspective of postgraduate students in Malaysia. The study adopted a qualitative case study approach, with Activity Theory as its theoretical framework. Five Malaysian postgraduate students provided their responses via two focus group discussions. The participants highlighted that rubrics may stifle creative self-expression, cause inconsistency in scoring, confuse students, and be limited in catering to various learner needs. To address these issues, the participants recommended that the instructor allocate time for students to understand the rubric and engage in discussion about its content. They also proposed making rubrics flexible enough to accommodate revisions based on student feedback and implementing scoring calibration sessions or training to maintain scoring consistency. Other suggestions comprised prioritising inclusive assessments, tailoring rubrics for different learner profiles, including specific numerical indicators in rubric descriptions, using a holistic rubric, and providing feedback to students according to the rubric.

    Anandi, R. P., & Zailaini, M. A. (2019). Using Rasch Model to assess self-assessment speaking skill rubric for non-native Arabic language speakers. Pertanika Journal of Social Sciences & Humanities, 27(3), 1469-1480.

    Andrade, H. L., & Du, Y. (2005). Student perspectives on rubric-referenced assessment. Practical Assessment, Research & Evaluation, 10(1), Article 3. https://doi.org/10.7275/g367-ye94

    Bearman, M., & Ajjawi, R. (2021). Can a rubric do more than be transparent? Invitation as a new metaphor for assessment criteria. Studies in Higher Education, 46(2), 359-368. https://doi.org/10.1080/03075079.2019.1637842

    Bennett, C. (2016). Assessment rubrics: Thinking inside the boxes. Learning and Teaching, 9(1), 50-72. https://doi.org/10.3167/latiss.2016.090104

    Brinkmann, S. (2014). Unstructured and semi-structured interviewing. In P. Leavy (Ed.), The Oxford Handbook of Qualitative Research (pp. 277-299). https://doi.org/10.1093/oxfordhb/9780199811755.013.030

    Brookhart, S. M. (2013). How to create and use rubrics for formative assessment and grading. ASCD.

    Brookhart, S. M. (2018). Appropriate criteria: Key to effective rubrics. Frontiers in Education, 3, Article 22. https://doi.org/10.3389/feduc.2018.00022

    Bukhari, N., Jamal, J., Ismail, A., & Shamsuddin, J. (2021). Assessment rubric for research report writing: A tool for supervision. Malaysian Journal of Learning and Instruction, 18(2), 1-43. https://doi.org/10.32890/mjli2021.18.2.1

    Chowdhury, F. (2019). Application of rubrics in the classroom: A vital tool for improvement in assessment, feedback and learning. International Education Studies, 12(1), 61-68. https://doi.org/10.5539/ies.v12n1p61

    Cockett, A., & Jackson, C. (2018). The use of assessment rubrics to enhance feedback in higher education: An integrative literature review. Nurse Education Today, 69, 8-13. https://doi.org/10.1016/j.nedt.2018.06.022

    Creswell, J. W., & Creswell, J. D. (2018). Research design: Qualitative, quantitative, and mixed methods approaches (5th ed.). Sage Publications.

    Darus, S., Stapa, S. H., & Hussin, S. (2003). Experimenting a computer-based essay marking system at Universiti Kebangsaan Malaysia. Jurnal Teknologi, 39(1), 1-18. https://doi.org/10.11113/jt.v39.472

    Engeström, Y. (1993). Developmental studies of work as a testbench of activity theory: The case of primary care medical practice. In S. Chaiklin & J. Lave (Eds.), Understanding practice: Perspectives on activity and context (pp. 64-103). Cambridge University Press. https://doi.org/10.1017/CBO9780511625510.004

    Engeström, Y. (1999). Innovative learning in work teams: Analyzing cycles of knowledge creation in practice. In Y. Engeström, R. Miettinen, & R.-L. Punamäki (Eds.), Perspectives on activity theory (pp. 377-406). Cambridge University Press.

    Fraile, J., Panadero, E., & Pardo, R. (2017). Co-creating rubrics: The effects on self-regulated learning, self-efficacy and performance of establishing assessment criteria with students. Studies in Educational Evaluation, 53, 69-76. https://doi.org/10.1016/j.stueduc.2017.03.003

    Francis, J. J., Johnston, M., Robertson, C., Glidewell, L., Entwistle, V., Eccles, M. P., & Grimshaw, J. M. (2010). What is an adequate sample size? Operationalising data saturation for theory-based interview studies. Psychology and Health, 25(10), 1229-1245. https://doi.org/10.1080/08870440903194015

    Hamilton, H., Gurak, E., Findlater, L., & Olive, W. (2000). The confusion matrix. https://www2.cs.uregina.ca/~hamilton/courses/831/notes/confusion_matrix/confusion_matrix.html

    Holmstedt, P., Jönsson, A., & Aspelin, J. (2018). Learning to see new things: Using criteria to support pre-service teachers’ discernment in the context of teachers’ relational work. Frontiers in Education, 3, Article 54. https://doi.org/10.3389/feduc.2018.00054

    Jonsson, A. (2014). Rubrics as a way of providing transparency in assessment. Assessment & Evaluation in Higher Education, 39(7), 840-852. https://doi.org/10.1080/02602938.2013.875117

    Kite, J., & Phongsavan, P. (2017). Evaluating standards-based assessment rubrics in a postgraduate public health subject. Assessment & Evaluation in Higher Education, 42(6), 837-849. https://doi.org/10.1080/02602938.2016.1199773

    Krueger, R. A. (2014). Focus groups: A practical guide for applied research. Sage Publications.

    Marques, J. F., & McCall, C. (2005). The application of interrater reliability as a solidification instrument in a phenomenological study. The Qualitative Report, 10(3), 439-462.

    Matshedisho, K. R. (2020). Straddling rows and columns: Students’ (mis)conceptions of an assessment rubric. Assessment & Evaluation in Higher Education, 45(2), 169-179. https://doi.org/10.1080/02602938.2019.1616671

    Ministry of Higher Education Malaysia. (2021). Alternative Assessment in Higher Education: A Practical Guide to Assessing Learning. Department of Higher Education Malaysia. https://utmcdex.utm.my/wp-content/uploads/2024/09/EBOOK-Alternative-Assessment-in-Higher-Education-2022.pdf

    Mok, J. C. H., & Toh, A. A. L. (2015). Improving the ability of qualitative assessments to discriminate student achievement levels. Journal of International Education in Business, 8(1), 49-58. https://doi.org/10.1108/JIEB-12-2013-0048

    Noh, N. M., Zain, M. R. M., Hamid, Y. S., Bakar, I. A. A., & Mohamad, M. (2021). Analytic rubric in evaluating the continuous assessment in projects of civil engineering undergraduate students in dynamics subject. Asian Journal of University Education, 17(4), 352-366. https://doi.org/10.24191/ajue.v17i4.16221

    Oakleaf, M. (2009). Using rubrics to assess information literacy: An examination of methodology and interrater reliability. Journal of the American Society for Information Science and Technology, 60(5), 969-983. https://doi.org/10.1002/asi.21030

    Panadero, E., & Jonsson, A. (2020). A critical review of the arguments against the use of rubrics. Educational Research Review, 30, Article 100329. https://doi.org/10.1016/j.edurev.2020.100329

    Panadero, E., & Romero, M. (2014). To rubric or not to rubric? The effects of self-assessment on self-regulation, performance and self-efficacy. Assessment in Education: Principles, Policy & Practice, 21(2), 133-148. https://doi.org/10.1080/0969594X.2013.877872

    Patton, M. Q. (2014). Qualitative research & evaluation methods: Integrating theory and practice (4th ed.). Sage Publications.

    Pérez-Guillén, S., Carrasco-Uribarren, A., Celis, C. L., González-Rueda, V., Rodríguez-Rubio, P. R., & Cabanillas-Barea, S. (2022). Students’ perceptions, engagement and satisfaction with the use of an e-rubric for the assessment of manual skills in physiotherapy. BMC Medical Education, 22, Article 623. https://doi.org/10.1186/s12909-022-03651-w

    Popham, W. J. (1997). What's wrong--and what's right--with rubrics. Educational Leadership, 55(2), 72-75.

    Postmes, L., Bouwmeester, R., de Kleijn, R., & van der Schaaf, M. (2023). Supervisors’ untrained postgraduate rubric use for formative and summative purposes. Assessment & Evaluation in Higher Education, 48(1), 41-55. https://doi.org/10.1080/02602938.2021.2021390

    Sadler, D. R. (2014). The futility of attempting to codify academic achievement standards. Higher Education, 67(3), 273-288. https://doi.org/10.1007/s10734-013-9649-1

    Saeed, K. M., Ismail, S. A. M. M., & Eng, L. S. (2019). Malaysian speaking proficiency assessment effectiveness for undergraduates suffering from minimal descriptors. International Journal of Instruction, 12(1), 1059-1076. https://doi.org/10.29333/iji.2019.12168a

    Shadle, S. E., Brown, E. C., Towns, M. H., & Warner, D. L. (2012). A rubric for assessing students’ experimental problem-solving ability. Journal of Chemical Education, 89(3), 319-325. https://doi.org/10.1021/ed2000704

    Sitorus, M. L. (2020). Non-native English teachers' interpretation of rubrics used for assessing students' writing. Proceedings of the International Conference on Future of Education, 3(2), 16-25. https://doi.org/10.17501/26307413.2020.3202

    Terry, G., Hayfield, N., Clarke, V., & Braun, V. (2017). Thematic analysis. In C. Willig & W. Stainton-Rogers (Eds.), The Sage handbook of qualitative research in psychology (2nd ed., pp. 17-37). Sage.

    Venning, J., & Buisman-Pijlman, F. (2013). Integrating assessment matrices in feedback loops to promote research skill development in postgraduate research projects. Assessment & Evaluation in Higher Education, 38(5), 567-579. https://doi.org/10.1080/02602938.2012.661842

    Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.


Article ID

JSSH-9044-2024
