SOLO (Structure of Observed Learning Outcomes) Based Metacognitive Approach in Students’ Creative Thinking and Science Literacy in Physics 8
Mariel Bermudez-Sobrera | Anamarie Gamboa-Valdez
Discipline: Artificial Intelligence
Abstract:
Science literacy and creative thinking are key competencies emphasized in the Programme for International Student Assessment (PISA), which evaluates students' ability to apply scientific knowledge to real-world problems and to think critically and creatively in solving complex challenges. This study therefore examines the impact of the Structure of the Observed Learning Outcome-Based Metacognitive Approach (SOLO-BAMA) on improving students' Science Literacy and Creative Thinking. The SOLO framework, introduced by Biggs and Collis (1982), provides a hierarchical model for assessing student learning that emphasizes metacognitive development and deeper understanding, and it underpins the content of the Higher Order Thinking Skills Professional Learning Package (HOTS-PLP) introduced by the Department of Education. A true-experimental research design was employed, using pretest and posttest assessments to measure learning gains among students exposed to SOLO-based instruction and those taught through traditional methods. The results indicated that students in the SOLO-based learning environment exhibited significantly greater improvements in both Science Literacy and Creative Thinking. Statistical analysis using the Mann-Whitney U test confirmed significant differences in posttest scores, demonstrating the superior effectiveness of SOLO-based instruction. The intervention group also achieved higher learning gains than the control group, supporting previous findings that metacognitive strategies enhance student comprehension and problem-solving skills. These findings suggest that integrating the SOLO framework into science education fosters deeper learning, higher-order thinking, and conceptual understanding. The study recommends broader implementation of SOLO-based strategies and further research across diverse educational settings to assess their long-term impact.
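For readers unfamiliar with the statistics named in the abstract, a minimal Python sketch illustrates the two quantities involved: the Mann-Whitney U statistic, a rank-based comparison of two independent groups used when posttest scores are not assumed to be normally distributed, and Hake's normalized gain, one common way to express pretest-posttest learning gains. The score lists below are hypothetical illustrations, not the study's data.

```python
def mann_whitney_u(a, b):
    """U statistic for group a vs. group b: count pairwise wins (ties = 0.5).
    Equivalent to the rank-sum formulation; larger U favours group a."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized learning gain: (post - pre) / (max_score - pre)."""
    return (post - pre) / (max_score - pre)

# Hypothetical posttest scores (percent), for illustration only
solo_group = [78, 85, 81, 90, 74, 88]
control_group = [65, 70, 62, 72, 68, 66]

u = mann_whitney_u(solo_group, control_group)
print(u)  # → 36.0: every SOLO score exceeds every control score (36 pairs)

print(normalized_gain(60, 80))  # → 0.5: half of the possible improvement realized
```

In practice the U statistic is compared against a critical value (or a normal approximation for large samples) to decide significance; library routines such as `scipy.stats.mannwhitneyu` handle that step, and the loop above simply makes the underlying counting explicit.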
References:
- Acido, J. V., & Caballes, D. G. (2024). Assessing educational progress: A comparative analysis of PISA results (2018 vs. 2022) and HDI correlation in the Philippines. World Journal of Advanced Research and Reviews, 21(1), 462-474.
- Adams, W. K., & Wieman, C. E. (2011). Development and validation of instruments to measure learning of expert-like thinking. International Journal of Science Education, 33(9), 1289-1312.
- Aitken, L. M., Burmeister, E., Clayton, S., Dalais, C., & Gardner, G. (2011). The impact of nursing rounds on the practice environment and nurse satisfaction in intensive care: pre-test post-test comparative study. International Journal of Nursing Studies, 48(8), 918-925.
- Aloraini, S. (2012). The impact of using multimedia on students’ academic achievement in the College of Education at King Saud University. Journal of King Saud University-Languages and Translation, 24(2), 75-82.
- Baeten, M., Dochy, F., & Struyven, K. (2013). The effects of different learning environments on students' motivation for learning and their achievement. British Journal of Educational Psychology, 83(3), 484-501.
- Ballance, O. J. (2024). Sampling and randomisation in experimental and quasi-experimental CALL studies: Issues and recommendations for design, reporting, review, and interpretation. ReCALL, 36(1), 58-71.
- Bhagwat, M. S. (2016). Developing and Implementing Instructional Strategy on the Structure of Observed Learning Outcomes (SOLO) Taxonomy for Mathematics of Class–IX (Doctoral dissertation, Maharaja Sayajirao University of Baroda (India)).
- Biggs, J. B., & Collis, K. F. (2014). Evaluating the quality of learning: The SOLO taxonomy (Structure of the Observed Learning Outcome). Academic Press.
- Butler, A. G. (2013). Exploring the role of social reasoning and self-efficacy in the mathematics problem-solving performance of lower- and higher-income children. Journal of Educational Research and Practice, 3(1), 93-119.
- Castro Benavides, L. M., Tamayo Arias, J. A., Burgos, D., & Martens, A. (2022). Measuring digital transformation in higher education institutions: Content validity instrument. Applied Computing and Informatics.
- Delucchi, M. (2014). Measuring student learning in social statistics: A pretest-posttest study of knowledge gain. Teaching Sociology, 42(3), 231-239.
- Dunbar-Jacob, J. (2012). Minimizing threats to internal validity. Intervention research: Designing, conducting, analyzing, and funding, 91-106.
- Dy, A. S., & Sumayao, E. D. (2023). Influence of the pre-service teachers’ language proficiency to their teaching competence. AJELP: Asian Journal of English Language and Pedagogy, 11(1), 1-21.
- Dytham, C. (2011). Choosing and using statistics: a biologist’s guide. John Wiley & Sons.
- Gile, K. J., & Handcock, M. S. (2010). Respondent-driven sampling: An assessment of current methodology. Sociological Methodology, 40(1), 285-327.
- Guiaselon, B. U., Luyugen-Omar, S., Mohamad, H. A., Maidu, N. U., Maguid, N. P., Samson, C. D., & Sinsuat, D. R. R. S. (2022). Mismatch of teachers' qualifications and subjects taught: Effects on students' National Achievement Test. Psychology and Education: A Multidisciplinary Journal.
- Ho, S. K., & Gan, Z. (2023). Instructional practices and students’ reading performance: a comparative study of 10 top performing regions in PISA 2018. Language Testing in Asia, 13(1), 48.
- Johar, R., Syahfitri, M., Suhartati, S. A. Y. Z., Ikhsan, M., Idami, Z., & Rohaizati, U. (2024, March). Validity of Website-Based Statistical Learning Tools with an Islamic Context to Improve Student Mathematical Communication. In Proceedings of the 2nd Annual International Conference on Mathematics, Science and Technology Education (2nd AICMSTE) (Vol. 828, p. 53). Springer Nature.
- Kwangmuang, P., Jarutkamolpong, S., Sangboonraung, W., & Daungtod, S. (2021). The development of learning innovation to enhance higher order thinking skills for students in Thailand junior high schools. Heliyon, 7(6).
- Kyriakides, L., Charalambous, E., Creemers, B. P., Antoniou, P., Devine, D., Papastylianou, D., & Fahie, D. (2019). Using the dynamic approach to school improvement to promote quality and equity in education: A European study. Educational Assessment, Evaluation and Accountability, 31, 121-149.
- Lapinid, M. R. C., Mistades, V. M., Sagcal, R. R., Gustilo, L. E., Balagtas, M. U., Gonzales, R. D., ... & Palomar, B. C. (2024). Aligning Philippine K to 12 Assessment Policies against International Benchmarks: Implications for Quality Reform. Philippine Journal of Science, 153(6B), 2375-2392.
- Lopez-Guerra, C. (2011). The enfranchisement lottery. Politics, Philosophy & Economics, 10(2), 211-233.
- Mandel, J. (2012). The statistical analysis of experimental data. Courier Corporation.
- Orcan, F. (2020). Parametric or non-parametric: Skewness to test normality for mean comparison. International Journal of Assessment Tools in Education, 7(2), 255-265.
- Orhan, B. (2020). Investigation of the effect of student and school background variables, teaching and learning variables and non-cognitive outcomes on the components of scientific literacy in Programme for International Student Assessment (PISA 2015) [Unpublished doctoral thesis]. Middle East Technical University, Ankara.
- Rivas, S. F., Saiz, C., & Ossa, C. (2022). Metacognitive strategies and development of critical thinking in higher education. Frontiers in Psychology, 13, 913219.
- San Juan, D. M. M. (2016). Neoliberal restructuring of education in the Philippines: dependency, labor, privatization, critical pedagogy, and the K to 12 system. Asia-Pacific Social Science Review, 16(1), 7.
- Silver, N., Kaplan, M., LaVaque-Manty, D., & Meizlish, D. (Eds.). (2023). Using reflection and metacognition to improve student learning: Across the disciplines, across the academy. Taylor & Francis.
- Shmygol, N., Galtsova, O., Solovyov, O., Koval, V., & Arsawan, I. W. E. (2020). Analysis of country’s competitiveness factors based on inter-state rating comparisons. In E3S Web of Conferences (Vol. 153, p. 03001). EDP Sciences.
- Tan, C. (2018). Comparing high-performing education systems: understanding Singapore, Shanghai, and Hong Kong. Routledge.
- Tanner, K., Williamson, K., & Johanson, G. (2018). Experimental research. Research Methods: Information, Systems, and Contexts, 32(6), 925-926.
- Teig, N., Scherer, R., & Olsen, R. V. (2022). A systematic review of studies investigating science teaching and learning: over two decades of TIMSS and PISA. International Journal of Science Education, 44(12), 2035-2058.
- Vaz, S., Falkmer, T., Passmore, A. E., Parsons, R., & Andreou, P. (2013). The case for using the repeatability coefficient when calculating test-retest reliability. PLoS ONE, 8(9), e73990.