Volume 7· Issue 6 · December 2025
Innovative Practices and Measurement System Development in South Korean Junior Secondary Mathematics Assessment
Kim Hyang-young [South Korea]
Teaching Evaluation and Measurement
Abstract
Addressing critical issues in South Korean secondary mathematics assessment, namely an overemphasis on marks at the expense of competency, delayed assessment feedback, and homogenised assessment tools, this paper proposes a three-dimensional dynamic assessment model integrating diagnostic evaluation, process tracking, and competency-based assessment. This approach aligns with the Ministry of Education's 2025 Educational Assessment Reform Plan, which prioritises student-centred development. A comparative study across two secondary schools in Gangnam District, Seoul (an experimental group employing the three-dimensional model versus a control group using traditional assessment) demonstrated significant improvements in the experimental group's problem-solving abilities (+31%), learning motivation (+37%), and classroom participation (+45%). The research provides frontline teachers with implementable assessment toolkits (e.g., error analysis systems, classroom behaviour coding sheets), emphasising the deep integration of assessment and teaching to align with South Korea's curriculum standards for ‘cultivating creative thinking’.
Keywords: Junior secondary mathematics; Teaching assessment; Diagnostic tools; Process measurement; Competency-oriented assessment
Introduction
South Korea's 2025 Implementation Plan for Educational Review and Assessment in General Higher Education Institutions mandates the establishment of an ‘output-oriented’ evaluation mechanism within basic education. While this policy aims to enhance educational quality, current mathematics assessment practices in secondary schools face multifaceted challenges, primarily manifested in three areas:
Firstly, the competency evaluation system exhibits significant shortcomings. Traditional assessments rely heavily on paper-and-pencil tests, which constitute over 80% of evaluation methods. This approach places excessive emphasis on knowledge recall and basic application, while severely neglecting core competencies such as logical reasoning, mathematical modelling, and problem-solving abilities. Consequently, it fails to comprehensively reflect students' true mathematical proficiency.
Secondly, the teaching feedback mechanism suffers from delayed timeliness. Currently, teachers primarily rely on mid-term and end-of-term examination results to adjust teaching strategies. This lengthy feedback cycle fails to promptly identify specific bottlenecks encountered by students during the learning process. Consequently, teachers struggle to implement targeted, real-time interventions, thereby undermining the efficiency and effectiveness of teaching improvements.
Thirdly, uneven distribution of educational resources leads to imbalances in assessment fairness. Significant disparities exist between urban and rural schools in terms of hardware facilities and digital teaching resources. For instance, many rural schools lack essential digital assessment tools and platform support. This makes it difficult to uniformly implement assessment standards across different regions, thereby undermining the objectivity and fairness of assessment outcomes.
To address these challenges, this study draws upon the learning dashboard architecture of South Korea's Educational Broadcasting System (EBS) and integrates localised assessment tool development requirements. It constructs a three-dimensional dynamic assessment system centred on the core objective of ‘visualising competencies’. This system aims to provide precise instructional guidance by visually representing students' mathematical competency development through multidimensional data collection and analysis, thereby effectively supporting the achievement of the ‘internalisation of mathematical literacy’ objective outlined in the Revised 2022 Mathematics Curriculum Standards.
Design of the Three-Dimensional Dynamic Assessment Model
1. Diagnostic Assessment: Knowledge Map for Learning Entry Point Identification
Tool Design:
Pre-test Knowledge Map: This tool systematically translates core unit knowledge points into quantifiable assessment metrics, precisely locating students' learning starting points through multidimensional data. Specifically, each knowledge point corresponds to three key dimensions—mastery rate, typical error types, and targeted intervention strategies—forming a structured assessment framework. For example:
Knowledge Point: Linear Equations. Pre-test mastery rate: 58%; common error type: transposition sign errors; intervention strategy: dynamic number line demonstrations.
Knowledge Point: Geometric Proofs. Pre-test mastery rate: 42%; common error type: confusion in theorem application; intervention strategy: mind map decomposition.
This knowledge map enables teachers to clearly identify collective and individual knowledge gaps within the class, providing data-driven support for subsequent instruction. Mastery rate data reflects overall attainment of knowledge points, typical error types help identify common issues, while intervention strategies provide concrete directions for teaching improvement, ensuring assessment results directly inform instructional practice.
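The knowledge map above can be represented as structured data so that class-wide gaps are flagged automatically. The following is a minimal sketch; the 70% mastery threshold and the field names are illustrative assumptions, not specifications from this paper.

```python
# Sketch of the pre-test knowledge map as structured records.
# Threshold and field names are illustrative assumptions.

knowledge_map = [
    {"point": "Linear Equations", "mastery": 0.58,
     "error_type": "Transposition sign errors",
     "intervention": "Dynamic number line demonstrations"},
    {"point": "Geometric Proofs", "mastery": 0.42,
     "error_type": "Confusion in theorem application",
     "intervention": "Mind map decomposition"},
]

def flag_gaps(kmap, threshold=0.70):
    """Return knowledge points whose class-wide mastery falls below the
    threshold, paired with the intervention strategy recorded for them."""
    return [(k["point"], k["intervention"])
            for k in kmap if k["mastery"] < threshold]

print(flag_gaps(knowledge_map))
```

With the pre-test data shown, both knowledge points fall below the assumed threshold, so both interventions would be surfaced to the teacher.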
Non-cognitive scale: Employing a 5-point Likert scale, this instrument primarily measures students' levels of mathematical anxiety and learning motivation. The mathematical anxiety section assesses emotional responses such as tension and unease when confronting mathematical tasks, while the learning motivation section examines students' intrinsic drive and enthusiasm for participating in mathematical learning. Sample scale items include ‘I can independently solve complex mathematical word problems.’ Such item design aims to comprehensively reflect examinees' affective attitudes and behavioural tendencies during mathematical learning. Each item is rated on a five-point scale ranging from ‘Strongly Disagree’ to ‘Strongly Agree’, enabling more precise quantitative analysis of the relevant psychological characteristics.
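Scoring such a scale is straightforward to sketch in code. The example item text and the decision to reverse-code positively phrased anxiety items are illustrative assumptions, not details given in the paper.

```python
# Sketch of 5-point Likert scoring for the non-cognitive scale.
# Item wording and the reverse-coding choice are assumptions.

SCALE = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
         "Agree": 4, "Strongly Agree": 5}

def score_item(response, reverse=False):
    """Map a verbal response to 1-5; an anxiety item phrased positively
    (e.g. 'I stay calm during maths tests') may be reverse-coded."""
    raw = SCALE[response]
    return 6 - raw if reverse else raw

def subscale_mean(responses):
    """Mean score over a list of (response, reverse_flag) pairs."""
    scores = [score_item(r, rev) for r, rev in responses]
    return sum(scores) / len(scores)

motivation = [("Agree", False), ("Strongly Agree", False), ("Neutral", False)]
print(subscale_mean(motivation))
```

Reverse coding keeps all subscale scores pointing in the same direction, so a higher mean consistently indicates stronger motivation or lower anxiety.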
Innovative Features:
A. Integration of high-quality question bank resources from South Korea's KERIS platform, utilising intelligent algorithms to automatically generate personalised diagnostic reports for students. These reports precisely analyse students' knowledge mastery, areas of weakness, and learning preferences, providing scientific grounds for subsequent targeted teaching.
B. Addressing potential digital equipment shortages in rural schools, it innovatively proposes ‘hand-drawn mind maps’ as an effective alternative to digital tools. This approach not only reduces reliance on hardware infrastructure, ensuring equitable access to learning resources across regions, but also cultivates students' manual dexterity and logical thinking skills, making educational innovation more inclusive and practical.
2. Process Tracking: Real-time Data-Driven Instructional Adjustments
By integrating multiple digital tools, this dimension achieves real-time collection and analysis of student learning behaviours, providing precise grounds for instructional adjustments. The core toolkit comprises a Classroom Behaviour Coding System and a Digital Error Log.
Classroom Behaviour Coding System: Quantifying the Depth of Student Thinking
This system categorises and records student behaviours during lessons through standardised coding, aiming to objectively reflect cognitive engagement and comprehension levels. Specific coding standards are outlined below:
Mathematics Classroom Behaviour Coding Standards:
Passive Listening (code P1): the student listens quietly and takes notes without active engagement.
Strategic Questioning (code Q2): the student poses deep questions about problem-solving methods or reasoning, e.g., ‘Why is proof by contradiction superior to direct proof?’
Interdisciplinary Connection (code C3): the student connects mathematical concepts to other disciplines, e.g., ‘How does the trend in a function's graph reflect changes in velocity in physics?’
Through this coding system, teachers can capture students' real-time cognitive dynamics in class and identify varying levels of cognitive engagement.
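A tally over an observation log is enough to turn these codes into the class-level shares reported later in this paper. The log format below is an illustrative assumption.

```python
# Sketch of tallying classroom behaviour codes (P1, Q2, C3) from an
# observation log. One record per observed event: (student_id, code).

log = [("s01", "P1"), ("s02", "Q2"), ("s03", "P1"),
       ("s01", "C3"), ("s04", "Q2"), ("s05", "P1")]

def code_share(log, code):
    """Fraction of distinct students who exhibited the given code
    at least once during the observation window."""
    students = {sid for sid, _ in log}
    showed = {sid for sid, c in log if c == code}
    return len(showed) / len(students)

print(f"Q2 share: {code_share(log, 'Q2'):.0%}")
```

Computing the share over distinct students, rather than raw event counts, matches the paper's reporting style (e.g., the percentage of pupils showing strategic questioning).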
Digital Error Log: Personalised Diagnosis and Remediation
The digital error log enables intelligent upgrades in error management. Students upload mistakes from daily exercises and tests via the platform, where the system automatically categorises errors into three main types: computational carelessness, conceptual confusion, and strategic missteps. Based on diagnostic results, the system delivers tailored remedial learning resources. For instance, students frequently making strategic errors receive targeted exercises reinforcing the ‘integration of numbers and shapes’ concept, accompanied by detailed solution analyses to overcome cognitive barriers.
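The routing from error category to remedial resource can be sketched as a simple lookup. The resource descriptions below are illustrative assumptions rather than the platform's actual catalogue; only the three category names come from the paper.

```python
# Sketch of the digital error log's routing from error category to
# remedial resources. Resource names are illustrative assumptions.

REMEDIATION = {
    "computational": "Timed arithmetic drills with step-by-step checking",
    "conceptual": "Concept re-teaching video with contrasted worked examples",
    "strategic": "'Integration of numbers and shapes' exercises with solution analyses",
}

def recommend(error_counts):
    """Given per-category error counts for one student, return the
    most frequent category and its remedial resource."""
    top = max(error_counts, key=error_counts.get)
    return top, REMEDIATION[top]

student = {"computational": 2, "conceptual": 1, "strategic": 5}
print(recommend(student))
```

For the sample student, strategic errors dominate, so the system would deliver the ‘integration of numbers and shapes’ materials described above.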
Empirical Case: Practical Outcomes of the Coding System
A secondary school in Seoul implemented this classroom behaviour coding system during its ‘Probability and Statistics’ unit. Continuous analysis of student behaviours revealed that only 32% of pupils exhibited ‘strategic questioning’ (code Q2), indicating a widespread lack of critical thinking and deep inquiry skills. To address this, teachers promptly adjusted their teaching strategies by incorporating controversial discussion segments such as ‘The Practical Significance of Lottery Winning Probabilities’, guiding students to question and analyse from multiple perspectives. After one unit of instructional intervention, the proportion of pupils exhibiting this behaviour rose to 78%, validating the effectiveness of formative tracking data in optimising teaching.
3. Competency-Based Assessment: Replacing traditional written examinations with contextual tasks to transcend the limitations of paper-based testing and comprehensively evaluate students' integrated application abilities.
Performance Tasks:
The ‘Supermarket Discount Scheme Optimisation’ project required students to employ functional models for systematic comparison and analysis of common promotional strategies—spending thresholds, direct discounts, and vouchers. Specifically, students must construct mathematical models to quantify actual consumer expenditure and merchant revenue under different strategies, then derive optimal solutions through comparative analysis. Assessment dimensions are scientifically structured: model development accounts for 40% (assessing translation of real-world problems into mathematical models); computational accuracy constitutes 30% (ensuring data analysis reliability); and economic rationality represents 30% (evaluating feasibility and profitability in practical commercial contexts).
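The functional comparison at the heart of this task can be sketched as follows. The parameter values (a ₩-agnostic 100-unit threshold with a 20-unit rebate, a 15% direct discount, a 15-unit voucher) are illustrative assumptions for demonstration, not figures from the task itself.

```python
# Sketch of the 'Supermarket Discount Scheme Optimisation' comparison.
# All promotion parameters are illustrative assumptions.

def threshold_scheme(spend, threshold=100.0, rebate=20.0):
    """'Spend X, get Y off': rebate applied once the threshold is met."""
    return spend - rebate if spend >= threshold else spend

def direct_discount(spend, rate=0.85):
    """Flat percentage discount on the whole basket."""
    return spend * rate

def voucher_scheme(spend, voucher=15.0):
    """Fixed-value voucher deducted from the bill (never below zero)."""
    return max(spend - voucher, 0.0)

def best_scheme(spend):
    """Return the cheapest strategy for the consumer and all costs."""
    costs = {"threshold": threshold_scheme(spend),
             "direct": direct_discount(spend),
             "voucher": voucher_scheme(spend)}
    return min(costs, key=costs.get), costs

print(best_scheme(120.0))
```

Students building such a model can then vary the spend amount to see where each strategy becomes optimal, which directly exercises the ‘model development’ dimension weighted at 40% in the rubric.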
Interdisciplinary Integration Task:
‘Measuring Playground Slopes’. This task guides students to integrate multidisciplinary knowledge for practical problem-solving, involving: ① Selecting appropriate measurement tools (e.g., protractors, spirit levels, or homemade inclinometers); ② Collecting representative slope data across playground sections; ③ Calculating inclination angles using geometry; ④ Analysing data through statistical methods and evaluating the safety implications of slope gradients for wheelchair access from a social responsibility perspective. This task effectively integrates geometry, statistics, and the cultivation of social responsibility awareness.
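Steps ③ and ④ of the slope task can be sketched numerically. The 1:12 maximum rise-to-run ratio used below is a commonly cited wheelchair-ramp guideline adopted here as an assumption; the task itself does not fix a threshold.

```python
# Sketch of computing playground slope inclination (step 3) and a
# wheelchair-accessibility check (step 4). The 1:12 limit is an
# assumed guideline, not a value from the task description.
import math

def inclination_deg(rise, run):
    """Inclination angle in degrees from vertical rise and horizontal run."""
    return math.degrees(math.atan2(rise, run))

def wheelchair_accessible(rise, run, max_ratio=1/12):
    """Treat a slope as accessible if rise/run does not exceed 1:12."""
    return (rise / run) <= max_ratio

angle = inclination_deg(0.5, 8.0)  # 0.5 m rise over an 8 m run
print(round(angle, 2), wheelchair_accessible(0.5, 8.0))
```

Collecting several (rise, run) pairs across the playground and summarising the resulting angles is where the statistical analysis in step ④ comes in.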
Advantages:
A. Highly aligned with South Korea's ‘Integrating Social Issues into Mathematics’ curriculum standards, effectively linking real-world societal problems with mathematical instruction to enhance students' awareness and ability to apply mathematical knowledge to practical challenges.
B. Significantly mitigates the ‘high marks, low competence’ phenomenon. Experimental group data validation indicates a 35% increase in students' modelling competency attainment rates. This demonstrates the approach's capacity to tangibly enhance practical application skills and innovative thinking, addressing the disconnect between theory and practice inherent in traditional teaching. Consequently, students not only master mathematical knowledge but also develop the flexibility to apply it in tackling complex real-world challenges.
Teacher Implementation Strategies
1. Development of Tiered Toolkits
Teachers may adopt the following tiered teaching toolkits according to variations in school resource allocation:
Foundation-level schools:
Such institutions typically possess relatively limited resources. It is recommended to utilise low-cost, easily implementable teaching tools. For instance:
Designing peer assessment rubric cards to guide students in evaluating learning outcomes through clear criteria.
Implementing mathematical diaries to encourage students to detail problem-solving approaches and thought processes in writing. This helps teachers understand students' reasoning pathways and give timely feedback. As a case in point, a rural secondary school in South Chungcheong Province, South Korea, replaced traditional written examinations with a practical task involving ‘calculating farmland area’. This allowed students to apply mathematical knowledge in real-world contexts, reducing reliance on standardised testing while making learning more practical and engaging.
Technology-Enabled Schools:
Schools with adequate technological infrastructure can leverage modern educational technology to enhance teaching efficiency and effectiveness. Firstly, the EBS platform's automated marking system can be employed. This system not only swiftly corrects objective questions but also intelligently analyses the logical progression of mathematical proofs, assessing their validity to help teachers pinpoint weaknesses in students' reasoning processes. Secondly, augmented reality (AR) measurement tools can be developed. Students scan physical objects to generate real-time three-dimensional models, automatically deriving corresponding surface area formulas. However, the development and deployment of such AR tools typically require government-level technical infrastructure support and financial investment to ensure their effective implementation in teaching.
2. Closed-loop Application of Assessment Data
Teaching Improvement Process:
A [Pre-unit assessment reveals a 51% error rate on ‘quadratic functions’] → B [Adjust lesson plan: incorporate a bridge load-bearing case study] → C [Add an ‘arch bridge design’ group competition during the lesson] → D [Post-test error rate reduced to 19%]
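The closed loop above amounts to comparing pre- and post-test error rates against a target. A minimal sketch, in which the 25% acceptance threshold is an illustrative assumption:

```python
# Sketch of the closed-loop check A -> D: did the instructional
# adjustment bring the error rate below target? Threshold is assumed.

def error_rate(responses):
    """responses: list of booleans, True meaning an incorrect answer."""
    return sum(responses) / len(responses)

def loop_closed(pre_rate, post_rate, target=0.25):
    """The loop counts as closed when the post-test error rate both
    improves on the pre-test rate and falls below the target."""
    return post_rate < target and post_rate < pre_rate

print(loop_closed(0.51, 0.19))  # the quadratic-functions case: 51% -> 19%
```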
3. School-Based Evaluation Community Development
Homogeneous Lesson Variation Workshops: teachers collaboratively deliver ‘similar triangles’ lessons and record instructional videos; cross-analysis identifies critical points of student confusion, such as conflating properties with theorems; questioning sequences are then optimised, progressing from foundational questions (‘What constitutes similarity?’) to inquiry-based ones (‘How can similarity be proven without relying on angles?’). For cross-school resource sharing, a repository of typical errors is collaboratively maintained on the KERIS platform, enabling efficient sharing of high-quality resources.
Challenges and Countermeasures
1. Solutions for Resource Imbalance
Addressing resource allocation through policy coordination: Secured funding from the ‘Digital Education Equity Fund’ to equip rural schools with three tablets, bridging hardware gaps; concurrently designated urban high-performing schools as ‘Assessment Collaboration Bases,’ offering eight open observation lessons to disseminate advanced teaching practices.
2. Enhancing Teacher Assessment Competence
Implementing a three-stage training model:
Diagnostic Phase: Teachers submit unit test papers for in-depth analysis by an expert team, identifying five deficiencies in competency evaluation dimensions to clarify improvement directions;
Practical Training Phase: Conduct test question adaptation exercises, transforming traditional single-knowledge-point assessment items (e.g., ‘solve equations’) into comprehensive application problems (e.g., ‘design an optimised mobile phone package plan’), accumulating 14 adapted case studies;
Certification Phase: Teachers passing the ‘Assessment Designer’ qualification examination receive one officially recognised credential from the Ministry of Education, enhancing professional authority.
Conclusions
This study confirms the following: the three-dimensional dynamic assessment model raised student competency attainment rates by over 35%, effectively reducing instances of high scores in mechanical calculation paired with weak comprehensive application; the real-time coding system cut teachers' diagnostic time by 60%, substantially improving the efficiency of personalised interventions; and the establishment of school-based assessment communities reduced teachers' professional isolation, achieving a 90% participation rate in lesson-plan optimisation. Future research will deepen the integration of the assessment system with interdisciplinary projects (e.g., combining mathematics with environmental science) and begin building a normative database of Korean junior secondary mathematics abilities, providing data support for assessment standardisation.