Factors Influencing High Scores in the Food and Nutrition Practical Examinations in Eswatini

Molyn Mpofu, Carol Phindile Dlamini

Abstract


The Eswatini Food and Nutrition (FN) examination results have shown that the practical examinations yielded higher scores than the theory papers, creating negatively skewed distributions. This study sought to explore the factors that influence the allocation of high scores in FN practical examinations in Eswatini. A descriptive research design utilizing a qualitative research approach was employed. A sample of 17 participants was purposively selected, comprising 10 FN teachers, 3 subject Regional Inspectors, 3 Moderators and one Subject Officer. Data were collected through focus group discussions, interviews and document analysis, and the qualitative data from the interviews and focus group discussions were analysed thematically. The study established that teacher competency levels were low, as evidenced by unclear marking schemes, and that schools lacked resources, which compromised the monitoring and supervision of examinations. It also established that FN practical examination assessment was subjective and that the use of a well-defined marking scheme could minimize variations in scoring. Since FN is a practical subject, students practised cookery tasks during the course of the year and were therefore more likely to excel in the end-of-year practical examinations. The study recommends discussion of assessment tools and continuous training for examiners before the marking of the practical examinations.


Keywords


Dynamics; Elevation; Food and Nutrition; Practical Examinations






DOI: http://dx.doi.org/10.26737/jetl.v5i2.1671





This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Published by:

Institute of Managing and Publishing of Scientific Journals STKIP Singkawang

Sekolah Tinggi Keguruan dan Ilmu Pendidikan (STKIP) Singkawang

Address : STKIP Singkawang, Jalan STKIP - Kelurahan Naram Singkawang, Kalimantan Barat, INDONESIA, 79251
Phone : +62562 420 0344
Fax   : +62562 420 0584

JETL (Journal of  Education, Teaching, and Learning)

e-ISSN : 2477-8478

p-ISSN : 2477-5924

Editor in Chief Contact: [email protected] / Wa: +6282142072788

Publisher Contact: [email protected] / Wa: +6282142072788
