The Design and Validation of a Tool to Measure the Content Validity of a Computational Thinking Game-Based Learning Module for Tertiary Education Students
Salman Firdaus Sidek1, Maizatul Hayati Mohamad Yatim2*, Che Soh Said3
Computing Department, Faculty of Art, Computing and Creative Industry
Sultan Idris Education University, Malaysia1-3
This study reports the design and content validation of a computational thinking game-based learning module. The process followed a two-step method: instrument design and judgmental evidence. The former step comprised content domain identification, item generation, and instrument construction; in the latter, seven experts reviewed and rated the essentiality, relevancy, and clarity of the generated items, 30 in the first round and 34 in the second. The experts' suggestions and ratings in the second step were used to examine the content validity of the instrument through the content validity ratio (CVR), the content validity index (CVI), and the modified kappa statistic. The findings showed that the second round yielded better results, with the proportion of essential items increasing by 59.41 percent and the proportion of relevant, clear, and excellent items increasing by 3.33 percent. In the second round, 79.41 percent of the items were significantly essential, and 100 percent were significantly relevant, clear, and excellent. Overall, the instrument achieved significant content validity after the second round, with S-CVI/UA = 0.97 and S-CVI/Ave = 0.99. Hence, the instrument has great potential to measure the content validity of a brand-new computational thinking game-based learning module. Nevertheless, it is recommended to involve more experts during content domain determination and item generation, and to further examine the 33 content-valid items for instrument reliability.
Keywords: content validity ratio; content validity index; instrument; learning module; modified kappa statistic
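The three indices named in the abstract follow standard formulas: Lawshe's CVR, the item- and scale-level CVI of Polit and Beck, and a modified kappa that corrects the I-CVI for chance agreement. A minimal sketch of these computations, with illustrative function names (not from the paper), might look like:

```python
from math import comb

def cvr(n_essential: int, n_experts: int) -> float:
    # Lawshe's content validity ratio: (ne - N/2) / (N/2)
    half = n_experts / 2
    return (n_essential - half) / half

def i_cvi(n_relevant: int, n_experts: int) -> float:
    # item-level CVI: proportion of experts rating the item relevant
    return n_relevant / n_experts

def modified_kappa(n_relevant: int, n_experts: int) -> float:
    # probability of chance agreement: exactly n_relevant of
    # n_experts endorsing the item when each endorses with p = 0.5
    pc = comb(n_experts, n_relevant) * 0.5 ** n_experts
    icvi = i_cvi(n_relevant, n_experts)
    return (icvi - pc) / (1 - pc)

def s_cvi(relevant_counts: list[int], n_experts: int) -> tuple[float, float]:
    # scale-level CVI via universal agreement (proportion of items
    # every expert rated relevant) and via averaging the I-CVIs
    icvis = [i_cvi(r, n_experts) for r in relevant_counts]
    ua = sum(1 for v in icvis if v == 1.0) / len(icvis)
    ave = sum(icvis) / len(icvis)
    return ua, ave
```

For seven experts, an item rated essential by all seven gives CVR = 1.0 and I-CVI = 1.0; Lawshe's critical CVR for seven panelists is 0.99, so unanimous agreement is required for an item to count as significantly essential.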