The Development of an Item Bank for a Management and Responsibility Competency Scale for Upper Secondary Students
Abstract
This study aimed to develop and examine the quality of an item bank for assessing management and responsibility competency among upper secondary school students, and to establish cut scores for accurate competency classification. The instrument was a situational test with four polytomously scored response options, developed from Krathwohl's affective domain framework to reflect authentic behavior in real-life situations. Data from a sample of 832 upper secondary school students were analyzed using Item Response Theory (IRT) with the Graded Response Model (GRM). The results showed that the measurement model consisted of four sub-competencies with 14 behavioral indicators in total: (1) self-efficacy (5 indicators), (2) stress and uncertainty management (3 indicators), (3) planning and organizational management (3 indicators), and (4) responsibility (3 indicators), yielding 84 items. Confirmatory factor analysis indicated that the model fit the empirical data well (χ² = 73.874, df = 57, p = 0.0661, RMSEA = 0.019, SRMR = 0.016, CFI = 0.997, TLI = 0.994). All items met the selection criteria for discrimination (α), with a mean of 0.904 (range: 0.516–1.891), and for the threshold (β) parameters, with means of β₁ = -0.281, β₂ = 0.906, and β₃ = 2.122. Overall reliability (Cronbach's alpha) was 0.925, and the test information function showed high measurement precision for ability levels in the range θ ≈ -1.0 to +3.0, making the bank most suitable for students of intermediate to high ability. Competency cut scores, established from the mean threshold parameters together with a Wright map, divided competency into four levels: θ ≤ -0.28 (insufficient), -0.28 < θ ≤ 0.91 (emerging), 0.91 < θ ≤ 2.12 (competent), and θ > 2.12 (highly competent). The validated item bank supports accurate competency classification and is ready for future integration into a computerized adaptive testing (CAT) system.
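To make the reported GRM quantities concrete, the sketch below is a minimal illustration (not the authors' analysis code) of Samejima's graded-response category probabilities, the item information function, and the four-level cut-score classification, using the mean parameters reported above (a = 0.904; β₁ = -0.281, β₂ = 0.906, β₃ = 2.122). All function names are hypothetical; only numpy is assumed.

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """Samejima's GRM: probability of each of the K response categories
    for one item, given ability theta, discrimination a, and the K-1
    ordered thresholds b."""
    theta = np.atleast_1d(theta).astype(float)
    # Cumulative boundary probabilities P*(X >= k), padded with 1 and 0.
    p_star = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - np.asarray(b)[None, :])))
    p_star = np.concatenate(
        [np.ones((len(theta), 1)), p_star, np.zeros((len(theta), 1))], axis=1
    )
    # Category probability = difference of adjacent cumulative curves.
    return p_star[:, :-1] - p_star[:, 1:]

def grm_item_information(theta, a, b):
    """Fisher information contributed by one GRM item at each theta."""
    theta = np.atleast_1d(theta).astype(float)
    p_star = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - np.asarray(b)[None, :])))
    p_star = np.concatenate(
        [np.ones((len(theta), 1)), p_star, np.zeros((len(theta), 1))], axis=1
    )
    q = p_star * (1.0 - p_star)              # P*(1 - P*) at each boundary
    probs = p_star[:, :-1] - p_star[:, 1:]   # K category probabilities
    return a**2 * np.sum((q[:, :-1] - q[:, 1:]) ** 2 / probs, axis=1)

# Illustrative item built from the mean parameters in the abstract.
a_mean, b_mean = 0.904, [-0.281, 0.906, 2.122]
theta_grid = np.linspace(-4, 4, 161)
info = grm_item_information(theta_grid, a_mean, b_mean)
print("theta of maximum information:", theta_grid[np.argmax(info)])
print("category probabilities at theta = 0:",
      np.round(grm_category_probs(0.0, a_mean, b_mean), 3))

def classify(theta):
    """Map an ability estimate to the four reported competency levels."""
    cuts = [-0.28, 0.91, 2.12]
    labels = ["insufficient", "emerging", "competent", "highly competent"]
    return labels[np.searchsorted(cuts, theta, side="left")]

print(classify(1.5))   # -> "competent"
```

In practice the full 84-item bank would be calibrated with dedicated IRT software, and the test information function would be the sum of the 84 item information curves; the single mean-parameter item here is only meant to show where measurement precision concentrates relative to the cut scores.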
Article Details

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
If the submitting author has an urgent need to publish, please submit the article to another journal instead. The editorial board will not accept an article if the submitting author does not strictly follow the specified conditions and procedures. The content of the article is the copyright of the Journal of Inclusive and Innovative Education, Faculty of Education, Chiang Mai University.
References
Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
Baker, F. B. (2001). The basics of item response theory. Washington, DC: ERIC Clearinghouse on Assessment and Evaluation.
Bond, T. G., & Fox, C. M. (2015). Applying the Rasch model: Fundamental measurement in the human sciences (3rd ed.). New York: Routledge.
Brown, L., & Miller, K. (2016). Responsibility and self-management in academic success: A structural equation modeling approach. Journal of Educational Research, 80(3), 252-263.
Chanchusakul, S. (2021). A synthesis of research on the development of student competency measurement tools for 21st-century secondary education. Journal of Educational Measurement, Mahasarakham University, 27(1), 340-359. [in Thai]
Cizek, G. J., & Bunch, M. B. (2007). Standard setting: A guide to establishing and evaluating performance standards on tests. Thousand Oaks, CA: SAGE Publications.
De Ayala, R. J. (2009). The theory and practice of item response theory. New York: Guilford Press.
DeVellis, R. F. (2017). Scale development: Theory and applications (4th ed.). Thousand Oaks, CA: Sage Publications.
Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists. Mahwah, NJ: Lawrence Erlbaum Associates.
European Commission. (2018). Key competences for lifelong learning. Luxembourg: Publications Office of the European Union.
Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2010). Multivariate data analysis (7th ed.). New York: Prentice Hall.
Hair, J. F., Jr., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2014). A primer on partial least squares structural equation modeling (PLS-SEM). Thousand Oaks, CA: SAGE Publications.
Hambleton, R. K., & Swaminathan, H. (1985). Item response theory: Principles and applications. Boston: Kluwer-Nijhoff.
Javali, S., Gudaganavar, N. V., & Raj, S. M. (2011). Effect of varying sample size in estimation of coefficients of internal consistency. WebmedCentral Biostatistics, 2(2), WMC001577.
Johnson, R. (2017). The importance of responsibility in academic achievement. International Journal of Educational Research, 85(1), 48-61.
Langka, P., et al. (2019). The development of a national standard toolkit for moral evaluation of students at all ages studying at academic institutions of the Ministry of Education. Journal of Education, Faculty of Education, Srinakharinwirot University, 20(2), 1-15. [in Thai]
Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28(4), 563-575.
Lievens, F., & Sackett, P. R. (2012). The validity of interpersonal skills assessment via situational judgment tests for predicting academic success and job performance. Journal of Applied Psychology, 97(2), 460-468.
Lord, F. M. (1980). Applications of item response theory to practical testing problems. Hillsdale, NJ: Lawrence Erlbaum Associates.
McDaniel, M. A., Hartman, N. S., Whetzel, D. L., & Grubb, W. L., III. (2007). Situational judgment tests, response instructions, and validity: A meta-analysis. Personnel Psychology, 60(1), 63-91.
Ministry of Education. (1999). National Education Act, B.E. 2542 (1999). Bangkok: Khuru Sapha Lad Phrao. [in Thai]
Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.
OECD. (2018). The future of education and skills: Education 2030 – The future we want. Retrieved from https://www.oecd.org/education/2030-project/
OECD. (2019). OECD learning compass 2030: A series of concept notes. Retrieved from https://www.oecd.org/education/2030-project/learning/learning-compass-2030/
OECD. (2024). Education at a glance 2024: OECD indicators. Paris: OECD Publishing.
OECD. (2025). The state of global teenage career preparation 2025. Paris: OECD Publishing.
Office of the Education Council. (2017). National Education Plan B.E. 2560–2579 (2017–2036). Bangkok: Office of the Education Council, Ministry of Education. [in Thai]
Office of the Education Council. (2021). Guidelines for the development of basic education students' competencies during the transition to a competency-based curriculum. Bangkok: Office of the Education Council, Ministry of Education. [in Thai]
Phrommaboon, T., et al. (2020). Development of a student competency testing system for upper primary level to promote student quality enhancement in the 21st century. Bangkok: National Institute of Educational Testing Service (Public Organization). [in Thai]
Ployhart, R. E., & Ehrhart, M. G. (2003). Be careful what you ask for: Effects of response instructions on the construct validity and reliability of situational judgment tests. International Journal of Selection and Assessment, 11(1), 1-16.
Samejima, F. (1969). Estimation of latent ability using a response pattern of graded scores. Psychometrika Monograph Supplement, 34(4), 1-97.
Srisakda, B. (2015). Development of competency assessment tools for upper secondary students according to the Basic Education Core Curriculum B.E. 2551 (2008) (Doctoral dissertation). Faculty of Education, Chulalongkorn University. [in Thai]
Streiner, D. L. (2003). Starting at the beginning: An introduction to coefficient alpha and internal consistency. Journal of Personality Assessment, 80(1), 99-103.
Tuksino, P., et al. (2016). Measurement and assessment model of desirable characteristics according to national education standards and the development of an attitude structure measurement tool for basic education students. Bangkok: National Institute of Educational Testing Service (Public Organization). [in Thai]
UNESCO. (2015). UNESCO competency framework [PDF]. Retrieved from https://en.unesco.org/sites/default/files/competency_framework_e.pdf
UNESCO. (2015). Education 2030: Incheon declaration and framework for action for the implementation of Sustainable Development Goal 4. Retrieved from https://unesdoc.unesco.org/ark:/48223/pf0000245656
van der Linden, W. J., & Glas, C. A. W. (2010). Elements of adaptive testing. New York: Springer.
Wainer, H., Dorans, N. J., Flaugher, R., Green, B. F., Mislevy, R. J., Steinberg, L., & Thissen, D. (2000). Computerized adaptive testing: A primer (2nd ed.). Mahwah, NJ: Lawrence Erlbaum Associates.
Wainer, H., & Mislevy, R. J. (2000). Item response theory, item calibration, and proficiency estimation. In H.
Wainer (Ed.), Computerized adaptive testing: A primer (2nd ed., pp. 61–100). Mahwah, NJ: Lawrence Erlbaum Associates.
Weekley, J. A., & Ployhart, R. E. (2013). Situational judgment: Antecedents and relationships with performance. Human Performance, 26(3), 213-234.
Whetzel, D. L., & McDaniel, M. A. (2009). Situational judgment tests: An overview of current research. Human Resource Management Review, 19(3), 188-202.
Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Lawrence Erlbaum Associates Publishers.
World Economic Forum. (2020). The future of jobs report 2020. World Economic Forum. Retrieved from https://www.weforum.org/reports/the-future-of-jobs-report-2020
World Economic Forum. (2023). The future of jobs report 2023. World Economic Forum. Retrieved from https://www.weforum.org/reports/the-future-of-jobs-report-2023
World Economic Forum. (2025). The future of jobs report 2025. World Economic Forum.
Wright, B. D., & Stone, M. H. (1979). Best test design. Chicago, IL: MESA Press.
Zhang, Y., & Liu, S. (2019). Planning and organization skills as predictors of academic success. Journal of Educational Psychology, 111(6), 1014-1025.