Abstract:
This paper presents the findings of a study that sought content validity (CV) evidence for an instrument designed to measure the reading ability of university students in Sri Lanka. The reading passages and items were adapted from CEFR-aligned Learning Resource Network (LRN) materials. The items were designed around the cognitive processes involved in completing each reading task, as described by Khalifa and Weir (2009).
As part of collecting evidence for content validation of the instrument, Item Objective Congruence (IOC) analysis was used in this study. In the IOC analysis, the congruence between the cognitive processes of reading and the test items was examined, providing quantified data for CV. A panel of twelve experts examined a total of 41 test items against eight cognitive processes. Because the experts could choose more than one objective for an item, the IOC formula simplified by Crocker and Algina (1986) for multidimensional assessment of multiple combinations of skills was applied in the present study. The findings of the IOC indicate varying degrees of expert agreement on what some of the items were designed to assess. Of the 41 items, 38 had acceptable IOC indices; one item was removed from the study and two items were modified. Items with high congruence assess only one skill, whereas items with low congruence assess more than one cognitive process. The study demonstrates the utility of the IOC method in gathering evidence for CV.
Test development and validation are crucial to assessment, which in turn is a foundational process for evaluating educational management.
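
For readers unfamiliar with the index, below is a minimal Python sketch of the widely used simplified IOC computation (the sum of expert ratings divided by the number of experts, with 0.5 as a conventional acceptance cutoff). The ratings, function name, and cutoff shown here are illustrative assumptions; the multi-objective variant attributed to Crocker and Algina (1986) and applied in this study may differ from this single-objective form.

    def ioc_index(ratings):
        # Ratings per expert: +1 = item clearly measures the objective,
        # 0 = unsure, -1 = item clearly does not measure the objective.
        return sum(ratings) / len(ratings)

    # Hypothetical ratings from a panel of twelve experts for one item-objective pair.
    ratings = [1, 1, 1, 0, 1, 1, -1, 1, 1, 0, 1, 1]
    index = ioc_index(ratings)
    print(f"IOC = {index:.2f} -> {'acceptable' if index >= 0.5 else 'revise or remove'}")

With these hypothetical ratings the index is 0.67, above the conventional 0.5 threshold, so the item would be retained.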