PERTANIKA JOURNAL OF SOCIAL SCIENCES AND HUMANITIES


e-ISSN 2231-8534
ISSN 0128-7702



Development of a Real-time Digital Learning Platform for Diagnosing Degrees of Mathematical Competency

Sineenart Phaengkham, Putcharee Junpeng and Keow Ngang Tang

Pertanika Journal of Social Sciences and Humanities, Volume 32, Issue 3, September 2024

DOI: https://doi.org/10.47836/pjssh.32.3.02

Keywords: Content constituent, construction of knowledge, degrees of mathematical competency, mathematical measures, real-time digital learning platform

Published on: 27 September 2024

This research aimed to develop and verify a real-time digital learning platform for diagnosing the degree of mathematical competency of seventh-grade students. A total of 1,559 students from four regions of Thailand and six experts participated. The researchers employed a design-based approach and the Multidimensional Random Coefficient Multinomial Logit Model to evaluate the platform’s effectiveness. The measurement tool comprises two dimensions, namely mathematical measures and construction of knowledge, with 58 items eliciting students’ responses across three content constituents: number and algebra, measurement and geometry, and statistics and probability. The findings showed that the multidimensional model offered a significantly better statistical fit for the measurement and geometry and the statistics and probability constituents, whereas number and algebra fit better under the unidimensional model. Correlations between the dimensions were positive, strong, and significant. All items showed acceptable fit, with OUTFIT and INFIT MNSQ values between 0.75 and 1.33. The internal structure for evaluating the degrees of students’ mathematical competency, examined using the Wright map and the multidimensional item response model, conformed with the quality of the digital learning platform in terms of its usefulness, accuracy, and feasibility. Because this real-time digital learning platform for diagnosing the degree of mathematical competency can be accessed from anywhere with an internet connection, it makes education more accessible to a wider audience, including individuals in remote areas or those with physical disabilities.
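The OUTFIT and INFIT MNSQ criteria cited in the abstract can be illustrated with a minimal sketch for the dichotomous Rasch case. This is not the study’s own analysis (the authors fit a multidimensional model with ACER ConQuest); it is a simplified, assumed illustration of how the two mean-square statistics are computed from standardized residuals, with all function and variable names invented for the example.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_fit(responses, thetas, b):
    """OUTFIT and INFIT mean-square statistics for a single item.

    responses: 0/1 scores, one per person; thetas: person abilities; b: item difficulty.
    OUTFIT is the unweighted mean of squared standardized residuals, so it is
    sensitive to outliers; INFIT weights each squared residual by the response
    variance P(1-P), emphasizing persons targeted near the item's difficulty.
    """
    z2_sum = 0.0      # sum of squared standardized residuals (for OUTFIT)
    resid_sum = 0.0   # sum of squared raw residuals (INFIT numerator)
    var_sum = 0.0     # sum of response variances (INFIT denominator)
    for x, theta in zip(responses, thetas):
        p = rasch_p(theta, b)
        var = p * (1.0 - p)
        z2_sum += (x - p) ** 2 / var
        resid_sum += (x - p) ** 2
        var_sum += var
    outfit = z2_sum / len(responses)
    infit = resid_sum / var_sum
    return outfit, infit

# Toy data: a strictly ordered (Guttman-like) response pattern.
thetas = [-1.5, -0.5, 0.0, 0.5, 1.5]
responses = [0, 0, 1, 1, 1]
outfit, infit = item_fit(responses, thetas, b=0.0)
print(round(outfit, 2), round(infit, 2))  # -> 0.53 0.59
```

Values near 1.0 indicate responses about as noisy as the model predicts; this deterministic toy pattern yields values below 1 (overfit). In the study, items were judged acceptable when both statistics fell within 0.75–1.33.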

  • Adams, R. J. (2005). Reliability as a measurement design effect. Studies in Educational Evaluation, 31(2-3), 162-172. https://doi.org/10.1016/j.stueduc.2005.05.008

  • Adams, R. J., & Khoo, S. T. (1996). ACER Quest: the interactive test analysis system (Version 2.10) [Computer software]. Australian Council for Educational Research. https://research.acer.edu.au/measurement/3/

  • Adams, R. J., Wilson, M., & Wang, W. (1997). The multidimensional random coefficient multinomial logit model. Applied Psychological Measurement, 21, 1-23. https://doi.org/10.1177/0146621697211001

  • Adom, D., Mensah, J. A., & Dakes, D. A. (2020). Test, measurement, and evaluation: Understanding and use of the concepts in education. International Journal of Evaluation and Research in Education (IJERE), 9(1), 109-119. https://doi.org/10.11591/ijere.v9i1.20457

  • Biggs, J. B., & Collis, K. F. (1982). Evaluating the quality of learning: The SOLO taxonomy. Academic Press. https://doi.org/10.1016/C2013-0-10375-3

  • Buakhamphu, S., Thawinkarn, D., Vatanasapt, P., & Tang, K. N. (2024). Development of digital culture model for small size schools in the northeast region of Thailand. Kurdish Studies, 12(2), 724-732. https://doi.org/10.58262/ks.v12i2.056

  • Clark-Wilson, A., Robutti, O., & Thomas, M. (2020). Teaching with digital technology. ZDM – International Journal of Mathematics Education, 52, 1223-1242. https://doi.org/10.1007/s11858-020-01196-0

  • Michael, D. G., & Ritzelda, D. A. (2021). Improving the least mastered competencies on number sense of Grade 7 learners. International Journal of Advanced Research, 10(10), 127-139. https://doi.org/10.21474/IJAR01/15471

  • Fowler, S., O’Keeffe, L., Cutting, C., & Leonard, S. (2019). The mathematics proficiencies: A doorway into spatial thinking. Australian Primary Mathematics Classroom (APMC), 24(1), 36-40.

  • Harman, M. (2021, May 8). The role of digital learning platforms in the academic growth of students. Kitaboo. https://kitaboo.com/the-role-of-digital-learning-platforms-in-the-academic-growth-of-students/

  • Hayes, A. (2024, February 23). Stratified random sampling. Investopedia. https://www.investopedia.com/terms/stratified_random_sampling.asp

  • International Commission on the Futures of Education. (2020). Education in a post-COVID world: Nine ideas for public action. United Nations Educational, Scientific and Cultural Organization. https://unesdoc.unesco.org/ark:/48223/pf0000373717

  • Inprasitha, M. (2022). Lesson study and open approach development in Thailand: A longitudinal study. International Journal for Lesson and Learning Studies, 11(5), 1-15. https://doi.org/10.1108/IJLLS-04-2021-0029

  • Inprasitha, M. (2023). Blended learning classroom model: A new extended teaching approach for new normal. International Journal for Lesson and Learning Studies, 12(4), 288-300. https://doi.org/10.1108/IJLLS-01-2023-0011

  • Junpeng, P., Inprasitha, M., & Wilson, M. (2018). Modeling of the open-ended items for assessing multiple proficiencies in mathematical problem solving. The Turkish Online Journal of Educational Technology, 2, 142-149.

  • Junpeng, P., Krotha, J., Chanayota, K., Tang, K. N., & Wilson, M. (2019). Constructing progress maps of digital technology for diagnosing mathematical proficiency. Journal of Education and Learning, 8(6), 90−102. https://doi.org/10.5539/jel.v8n6p90

  • Kanjanawasi, S. (2011). New theory of testing (3rd ed.). Chulalongkorn University Press. https://www.car.chula.ac.th/display7.php?bib=b1770973

  • Kantahan, S., Junpeng, P., Punturat, S., Tang, K. N., Gochyyev, P., & Wilson, M. (2020). Designing and verifying a tool for diagnosing scientific misconceptions in genetics topic. International Journal of Evaluation and Research in Education (IJERE), 9(3), 564-571. https://doi.org/10.11591/ijere.v9i3.20544

  • Kepner, H. S., & Huinker, D. A. (2012). Assessing students’ mathematical proficiencies on the common core. Journal of Mathematics Education at Teacher College, 3(1), 26-32. https://doi.org/10.7916/jmetc.v3i1.735

  • Kesorn, K., Junpeng, P., Marwiang, M., Pongboriboon, K., Tang, K. N., Bathia, S., & Wilson, M. (2020). Development of an assessment tool for mathematical reading, analytical thinking and mathematical writing. International Journal of Evaluation and Research in Education (IJERE), 9(4), 955-962. https://doi.org/10.11591/ijere.v9i4.20505

  • Kilpatrick, J., Swafford, J., & Findell, B. (2001). Adding it up: Helping children learn mathematics. National Academy Press. https://doi.org/10.17226/9822

  • Linacre, J. M. (2005). Rasch dichotomous model vs. one-parameter logistic model. Rasch Measurement Transactions, 19(3), Article 1032.

  • Lunz, M. E. (2010). Measurement research associates test insights. https://www.rasch.org/mra/mra-01-10.htm

  • Maisa, S. D., & Musthfa, B. (2021). Charging personal growth of preservice teachers with 21st century career and life skill during the COVID-19 pandemic and new era. The Asian ESP Journal, 17(7.2), 60-82.

  • Maoto, S., Masha, K., & Mokwana, L. (2018). Teachers’ learning and assessing of mathematical processes with emphasis on representations, reasoning and proof. Pythagoras – Journal of the Association for Mathematics Education of South Africa, 39(1), 1−10. https://doi.org/10.4102/pythagoras.v39i1.373

  • Manmai, T., Inprasitha, M., & Changsri, N. (2021). Cognitive aspects of students’ mathematical reasoning habits: A study on utilizing lesson study and open approach. Pertanika Journal of Social Sciences & Humanities, 29(4), 2591-2614. https://doi.org/10.47836/pjssh.29.4.27

  • Milgram, R. J. (2007). What is mathematical proficiency? In A. H. Schoenfeld (Ed.), Assessing mathematical proficiency (pp. 31-58). Cambridge University Press. https://doi.org/10.1017/CBO9780511755378.007

  • Phaniew, S., Junpeng, P., & Tang, K. N. (2021). Designing standards-setting for levels of mathematical proficiency in measurement and geometry: Multidimensional item response model. Journal of Education and Learning, 10(6), 103-111. https://doi.org/10.5539/jel.v10n6p103

  • Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests. University of Chicago Press. https://archive.org/details/probabilisticmod0000rasc

  • Seo, K., Tang, J., Roll, I., Fels, S., & Yoon, D. (2021). The impact of artificial intelligence on learner-instructor interaction in online learning. International Journal of Educational Technology in Higher Education, 18, Article 54. https://doi.org/10.1186/s41239-021-00292-9

  • Thailand Ministry of Education. (2008). The basic education core curriculum. http://academic.obec.go.th/images/document/1525235513_d_1.pdf

  • Thailand Ministry of Education. (2017). Learning standards and indicators of mathematics learning (revised edition) according to the Basic Education Core Curriculum 2008. Agricultural Cooperative of Thailand. https://drive.google.com/file/d/1MDQEDkqGs01PnyzqEnyTVVNTS776ObCz/view

  • Thisopha, P., Thawinkarn, D., Wachrakul, C., & Tang, K. N. (2023). An investigation on coding educational management for small-sized elementary schools in northeastern of Thailand. Remittances Review, 8(4), 3057-3071. https://doi.org/10.33182/rr.v8i4.211

  • Vongvanich, S. (2020). Design research in education. Chulalongkorn University Printing House. https://www.chulabook.com/education/102930

  • Wilson, M. (2005). Constructing measures: An item response modeling approach. Routledge. https://doi.org/10.4324/9781410611697

  • Wilson, M., Allen, D. D., & Li, J. C. (2006). Improving measurement in health education and health behavioral research using item response modeling: Comparison with the classical test theory approach. Health Education Research, 21, 19-32. https://doi.org/10.1093/her/cyl108

  • Wilson, M., & De Boeck, P. (2004). Descriptive and explanatory item response models. In P. De Boeck & M. Wilson (Eds.), Explanatory item response models: A generalized linear and nonlinear approach (pp. 43-74). Springer. https://link.springer.com/book/10.1007/978-1-4757-3990-9

  • Wilson, M., & Sloane, K. (2000). From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2), 181-208. https://doi.org/10.1207/S15324818AME1302_4

  • Wright, B. D., & Stone, M. H. (1979). Best test design: Rasch measurement. Mesa Press. https://research.acer.edu.au/measurement/1/

  • Wu, M. L., Adams, R. J., Wilson, M. R., & Haldane, S. A. (2007). ACER ConQuest Version 2: Generalized item response modeling software. Australian Council for Educational Research. https://www.researchgate.net/publication/262187496_ConQuest_Version_2_Generalised_Item_Response_Modelling_Software

  • Yao, L., & Schwarz, R. D. (2006). A multidimensional partial credit model with associated item and test statistics: An application to mixed-format tests. Applied Psychological Measurement, 30(6), 469-492. https://doi.org/10.1177/0146621605284537

  • Yulian, V. R., & Wahyudin. (2018). Analyzing categories of mathematical proficiency based on Kilpatrick opinion in junior high school. Journal of Physics: Conference Series, 1132, Article 012052. https://doi.org/10.1088/1742-6596/1132/1/012052