Analysis of the Quality of Formative Test Items for Physics Learning Using the Rasch Model in 21st-Century Learning

Muhammad Asriadi, Samsul Hadi


A good-quality test has an appropriate spread of item difficulty and can be completed by respondents at every ability level. This article aims to identify and analyze the quality of the items in a formative test on static and dynamic fluids in physics. The research used a cross-sectional survey design with convenience sampling; the sample consisted of 52 grade XI high school students. The instrument was a formative physics test, whose responses were analyzed with the Rasch model using Winsteps software. The results show that the formative test has a reliability of 0.88 (very good), a good distribution of item difficulty, good item fit, and no biased items, so it is feasible to use as a standard instrument for measuring students' abilities on static and dynamic fluid material in physics.
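The analysis rests on the dichotomous Rasch model, in which the probability of a correct response depends only on the difference between a person's ability and an item's difficulty (both on the same logit scale). A minimal illustrative sketch of that response function (function and variable names are ours, not taken from the study or from Winsteps):

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """Probability of a correct response under the dichotomous Rasch model,
    where theta is person ability and b is item difficulty (in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the success probability is exactly 0.5;
# the probability rises as ability exceeds difficulty and falls below it.
p_match = rasch_probability(theta=0.0, b=0.0)   # 0.5
p_easy = rasch_probability(theta=1.0, b=-1.0)   # > 0.5 (easy item)
p_hard = rasch_probability(theta=-1.0, b=1.0)   # < 0.5 (hard item)
```

Software such as Winsteps estimates the person and item parameters from the response matrix and reports reliability and item-fit statistics (e.g., infit/outfit) of the kind cited in the abstract; the sketch above only shows the model's core response function.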


21st Century Learning; Formative Test; Physics Learning; Rasch Model







Copyright (c) 2021 Muhammad Asriadi, Samsul Hadi

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.


JIPF (Jurnal Ilmu Pendidikan Fisika), ISSN 2477-8451 (Online) and ISSN 2477-5959 (Print)