EQUIVALENCY EVIDENCE OF THE ENGLISH COMPETENCY TEST ACROSS DIFFERENT MODES: A RASCH ANALYSIS

Keywords: computer-based testing, mode effects, paper-and-pencil testing, psychometric properties, Rasch model

The outbreak of the COVID-19 pandemic has transformed the educational landscape in unprecedented ways. Educational institutions worldwide are navigating between offline and online learning, and computer-based testing is rapidly overtaking paper-and-pencil testing as the dominant mode of assessment. In some settings, computer-based and paper-and-pencil assessments are offered side by side, in which case test developers should provide evidence of equivalence between the two versions. This study aims to establish equivalency evidence for the different delivery modes of the English Competency Test, an English language assessment for civil service officers developed and used by the Human Resources Development Education and Training Center, a civil service training institution under the Ministry of Finance of the Republic of Indonesia. Psychometric analyses were carried out with the Rasch model to estimate unidimensionality, reliability, separation, and the standard error of measurement. The findings demonstrate that the paper-and-pencil and computer-based versions of the assessment exhibit comparably equivalent psychometric properties; the computer-based version of the English Competency Test thus offers a reliable and comparable alternative to the paper-and-pencil version.
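For context, the analyses described above rest on the dichotomous Rasch model, which is the standard formulation rather than anything specific to this study. It expresses the probability that person $n$ answers item $i$ correctly as a function of person ability $\theta_n$ and item difficulty $\delta_i$:

$$P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{e^{\theta_n - \delta_i}}{1 + e^{\theta_n - \delta_i}}$$

Under this model, mode equivalence amounts to showing that the item difficulty estimates $\delta_i$ (and the derived reliability, separation, and standard error of measurement statistics) remain invariant between the paper-and-pencil and computer-based administrations.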