As the exam season comes to an end, students, as major stakeholders, shift their focus from exam preparation to receiving their reports, while assessment bodies concentrate on generating and communicating performance summaries and insights to stakeholders. Providing informative individual reports for students and aggregated reports for policymakers shapes the culture of the assessment journey in a jurisdiction and helps stakeholders understand the student learning landscape. For example, in August 2024, A-level results were announced in England, accompanied by Ofqual's country-level data, sparking a debate on achievement gaps, particularly after geographical attainment was compared with national averages (The Conversation, 2024); this article continues a similar debate from a technology standpoint.
Regardless of the scale of assessments - whether international student assessments, national large-scale assessments, or individual student reports - every generated report contains data that can help us better understand what is happening in the learning domain and what systems need to be built to support student learning at multiple levels. Especially now, with the increased use of virtual learning and computerized assessments, continuous and quick feedback, together with access to real-time data generated from student performance, has become an increasingly valuable asset.
In this article, we will explore the strategies and technologies around the generation of customized reports and the methods used to communicate these reports to field-specific stakeholders.
In simple terms, report generation means transforming raw data into meaningful insights. On the roadmap of assessment modernization, the generation of reports is evolving with the use of data management tools such as data analytics, artificial intelligence, and machine learning, offering increased precision and a more detailed approach. A snapshot of a pipeline dedicated to modernizing assessment data management, from our October 2023 featured article, is illustrated below:
Clearly, technology gathers a detailed and vast collection of data at every step, including login times, time spent on tasks, mouse clicks, idle time, key presses, the number of attempts, adaptive learning routes, score and marking statistics, and even downloads of student reports, providing valuable personalized insights into each student's progression paths.
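To make this concrete, here is a minimal sketch of how such raw delivery-platform events might be rolled up into per-student engagement indicators. The event fields (student_id, event_type, item_id, duration_sec) and the sample values are illustrative assumptions, not any particular platform's schema.

```python
from collections import defaultdict

# Minimal sketch: aggregate raw delivery-platform events into per-student
# engagement indicators. Field names and values are illustrative only.
events = [
    {"student_id": "S001", "event_type": "item_view", "item_id": "Q1", "duration_sec": 42},
    {"student_id": "S001", "event_type": "idle", "item_id": "Q1", "duration_sec": 15},
    {"student_id": "S001", "event_type": "attempt", "item_id": "Q1", "duration_sec": 0},
    {"student_id": "S002", "event_type": "item_view", "item_id": "Q1", "duration_sec": 67},
]

def summarize_engagement(event_log):
    """Roll raw events up into simple per-student indicators."""
    summary = defaultdict(lambda: {"time_on_task": 0, "idle_time": 0, "attempts": 0})
    for e in event_log:
        s = summary[e["student_id"]]
        if e["event_type"] == "item_view":
            s["time_on_task"] += e["duration_sec"]
        elif e["event_type"] == "idle":
            s["idle_time"] += e["duration_sec"]
        elif e["event_type"] == "attempt":
            s["attempts"] += 1
    return dict(summary)

print(summarize_engagement(events))
```

In practice such aggregation would run continuously over far larger event streams, but the principle of condensing clickstream detail into a handful of interpretable indicators is the same.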
Another aspect of handling data with technology is the ability to customize reports based on the needs of each stakeholder while supporting the overall accessibility agenda. The reports outlined below provide an overview of various aspects of the assessment report generation process, aligned with stakeholder expectations for accessible data aimed at understanding and improving student learning outcomes at different levels.
Individual Student Reports. In a technology-advanced assessment landscape, student performance reports are generated in two phases: an instant report immediately after the assessment (providing performance on selected response items) and a delayed report after marking is completed (providing performance on constructed response items). This report generation process integrates machine learning algorithms to analyze responses in real-time for the immediate report, while deep learning models are employed to assess constructed responses for the detailed, delayed report.
This type of report typically includes the academic result as a percentage, displayed in color with a progress line, along with detailed explanatory feedback on the student's performance across curriculum strands and skill categories. Additionally, the report provides individual feedback on the student's strengths and areas for improvement, presented in clear, broken-down tables, and may include data from previous years, tips for parents, and a summary of performance levels at a glance.
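As a simplified illustration of the two-phase process described above, the sketch below auto-scores selected-response items against an answer key for the instant report, then merges marked constructed-response scores into the delayed report with an overall percentage and a strand-level breakdown. The item IDs, strands, and scores are invented for illustration.

```python
# Two-phase student report sketch: selected-response items are auto-scored
# instantly; constructed-response scores arrive later from marking (stubbed
# here as a plain dict). All identifiers and weights are illustrative.
answer_key = {"Q1": "B", "Q2": "D", "Q3": "A"}
item_strand = {"Q1": "Number Sense", "Q2": "Number Sense", "Q3": "Geometry", "W1": "Writing"}

def instant_report(responses):
    """Phase 1: score selected-response items as soon as the test is submitted."""
    return {item: int(responses.get(item) == key) for item, key in answer_key.items()}

def full_report(responses, constructed_scores, constructed_max):
    """Phase 2: merge marked constructed-response scores and break down by strand."""
    scores = {**instant_report(responses), **constructed_scores}
    max_scores = {**{item: 1 for item in answer_key}, **constructed_max}
    overall = 100 * sum(scores.values()) / sum(max_scores.values())
    by_strand = {}
    for item, score in scores.items():
        strand = item_strand[item]
        earned, possible = by_strand.get(strand, (0, 0))
        by_strand[strand] = (earned + score, possible + max_scores[item])
    return {"overall_pct": round(overall, 1),
            "strands": {s: f"{e}/{p}" for s, (e, p) in by_strand.items()}}

print(full_report({"Q1": "B", "Q2": "C", "Q3": "A"}, {"W1": 3}, {"W1": 4}))
```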
Performance Overview Report. In practice, reports on student performance generated for policymakers usually need to include key statistics and trends that help them understand the educational landscape when planning and approving intervention programs. To support this goal of visualizing trends and flagging urgent decision points, data visualization tools are very helpful for illustrating performance across different domains and across jurisdictions in need. For example, at the jurisdiction level, data includes the number of students, their participation in assessments, performance across various domains, and whether they are meeting provincial standards. In line with inclusive education and special needs policies, similar data is also presented for students with special needs and those with different languages of instruction.
Policymakers are typically interested in analyzing and promoting student performance by comparing current results with those of past years, thereby justifying their roles through demonstrated improvements in student outcomes over time. Thus, such performance overview reports provide a clear interpretation of assessment data, helping policymakers see in a timely manner whether performance is improving or declining and make informed decisions accordingly.
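A small sketch of the kind of trend summary such an overview report might contain is shown below: the share of students meeting the provincial standard, by jurisdiction and year, with the year-over-year change that signals whether performance is improving or declining. All figures are invented for illustration.

```python
import pandas as pd

# Sketch of a jurisdiction-level trend summary for policymakers.
# The figures below are invented for illustration only.
results = pd.DataFrame({
    "jurisdiction": ["North", "North", "South", "South"],
    "year": [2023, 2024, 2023, 2024],
    "students_assessed": [12000, 12400, 9800, 10100],
    "meeting_standard": [8160, 8680, 6370, 6464],
})
results["pct_meeting"] = 100 * results["meeting_standard"] / results["students_assessed"]

trend = (results.pivot(index="jurisdiction", columns="year", values="pct_meeting")
                .round(1))
trend["change"] = trend[2024] - trend[2023]
print(trend)  # a one-glance view of whether performance is improving or declining
```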
Learner Engagement Report. With the learner at the center, the generation of such a report requires advanced data integration techniques to combine survey responses with student achievement data, including sentiment analysis to better understand student confidence and engagement levels. Questionnaires for both students and teachers help capture learners' perspectives and the teaching environment, complementing student performance data to provide a more systemic view of students' learning experiences within school and classroom management contexts.
The questionnaire may assess students' interest and self-perception/confidence levels in competencies such as reading, writing, and math, as well as their growth mindset in each assessed area. The growth mindset area includes indicators such as the proportion of students who believe they can succeed in math if they try hard enough. Additionally, such reports may also cover technological access and proficiency, as well as students' self-directed learning and collaboration efforts in their perceived learning.
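The sketch below illustrates, with invented data, how questionnaire responses might be integrated with achievement data: a small keyword-based scorer stands in for a real sentiment analysis tool, and the growth mindset indicator is computed as the share of students who believe they can succeed in math if they try hard enough.

```python
import pandas as pd

# Sketch: integrate survey responses with achievement data.
# Survey fields, comments, and the crude keyword "sentiment" scorer are
# illustrative stand-ins; all data is invented.
survey = pd.DataFrame({
    "student_id": ["S001", "S002", "S003"],
    "math_confidence": [4, 2, 5],               # 1-5 self-rating
    "can_succeed_if_try": [True, False, True],  # growth-mindset indicator
    "comment": ["I enjoy math class", "I feel lost in math", "math is fun"],
})
achievement = pd.DataFrame({
    "student_id": ["S001", "S002", "S003"],
    "math_score_pct": [78, 52, 91],
})

def crude_sentiment(text):
    positive, negative = {"enjoy", "fun", "like"}, {"lost", "hate", "boring"}
    words = set(text.lower().split())
    return len(words & positive) - len(words & negative)

merged = survey.merge(achievement, on="student_id")
merged["sentiment"] = merged["comment"].apply(crude_sentiment)

growth_mindset_share = 100 * merged["can_succeed_if_try"].mean()
print(f"{growth_mindset_share:.0f}% believe they can succeed in math if they try hard enough")
print(merged[["math_confidence", "sentiment", "math_score_pct"]].corr().round(2))
```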
Teaching Confidence Report. The generation of this report requires the use of supervised learning models and neural networks to analyze how various aspects of a teacher's confidence affect student learning outcomes across different subjects and environments, identifying specific areas where improving teacher confidence could raise student performance and providing insights into future training needs and curriculum adjustments.
Finally, a teaching confidence report complements all of this student-oriented data with information on teacher confidence in teaching transferable skills and on school management's use of data to understand how well curriculum expectations are being met. Additionally, the report includes information on how data informs program planning, resource allocation, and teaching practices, becoming an objective engine for actionable decisions that support school principals and teachers.
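To make the supervised-learning idea tangible, the sketch below relates teachers' self-reported confidence in a few areas to their classes' average outcomes and inspects which areas carry the most weight. A simple linear model stands in here for the neural networks mentioned above, and all ratings and scores are invented.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Sketch: relate teacher confidence ratings to class outcomes with a simple
# supervised model, then inspect which confidence areas carry the most weight.
features = ["confidence_transferable_skills", "confidence_using_data", "confidence_subject"]
X = np.array([
    [3, 2, 4],
    [4, 4, 5],
    [2, 1, 3],
    [5, 4, 4],
    [3, 3, 3],
])  # teacher self-ratings, 1-5 (invented)
y = np.array([64, 78, 55, 82, 67])  # class average score, % (invented)

model = LinearRegression().fit(X, y)
for name, coef in zip(features, model.coef_):
    print(f"{name}: {coef:+.1f} points per confidence level")
```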
In conclusion, within a modernized assessment environment, the reports detailed above are integrated into a unified PDF document provided to stakeholders. An example of such a comprehensive report is illustrated here.
In line with its stakeholder engagement strategy, once the reports are generated within the assessment organization, the next step on the agenda is to build a user-friendly channel for communicating the findings to stakeholders. To make sure that the information is presented in a clear and accessible form to all relevant stakeholders, there are a number of ways technology can assist. For example, digital platforms, interactive dashboards, and personalized reports are tools that present data in a user-friendly manner, making it easy for students, educators, and policymakers to apply insights in daily activities and make informed decisions that positively impact educational outcomes.
While we can confirm that reports are sent and, in the best-case scenario, received (as indicated by download statistics), whether these reports are understood and become actionable, especially with the role of technology, is a major question this section will address.
Student Report Outreach. Students use mobile applications more than ever, and integrating reports into user-friendly web portals would give students and parents access in real time. Additionally, students prefer to tailor their own interface and menus for viewing information, so interactive dashboards allow them to customize their view of different metrics and receive automated explanations of their scores.
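As one possible shape for such a dashboard, the sketch below uses Streamlit (an assumed tool choice, not a prescribed one) to let a student select which subject areas to display and read a short automated explanation of the result. The data is invented; the script would be launched with `streamlit run report_dashboard.py`.

```python
import pandas as pd
import streamlit as st

# Sketch of an interactive student dashboard: the student picks which metrics
# to display and gets a short auto-generated explanation. Data is invented.
report = pd.DataFrame(
    {"This year": [72, 68, 80], "Last year": [65, 70, 74]},
    index=["Reading", "Writing", "Math"],
)

st.title("My Assessment Report")
chosen = st.multiselect("Choose the areas to display", list(report.index),
                        default=list(report.index))
st.bar_chart(report.loc[chosen])

overall = report.loc[chosen, "This year"].mean()
st.write(f"Your average across the selected areas is {overall:.0f}%. "
         "Last year's results are shown side by side for comparison.")
```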
Finally, being informed serves as a welcoming invitation to check updates, so students and parents would appreciate notifications that alert them when new reports are available. To protect data privacy, encrypted emails and secure web portals should be used.
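A minimal sketch of that notification step is shown below, assuming a standard SMTP server with STARTTLS encryption; the server address, credentials, and portal URL are placeholders, and the message deliberately links to the secure portal rather than including any results.

```python
import smtplib
from email.message import EmailMessage

# Sketch: notify a parent or student that a new report is available, over an
# encrypted (STARTTLS) connection, linking to the secure portal instead of
# attaching the report itself. Server and credentials are placeholders.
def notify_report_ready(recipient, portal_url):
    msg = EmailMessage()
    msg["Subject"] = "Your new assessment report is available"
    msg["From"] = "reports@assessment-body.example"
    msg["To"] = recipient
    msg.set_content(
        f"A new report has been published. Sign in to view it: {portal_url}\n"
        "For privacy, results are never included in this email."
    )
    with smtplib.SMTP("smtp.assessment-body.example", 587) as server:
        server.starttls()                         # encrypt the connection
        server.login("reports", "app-password")   # placeholder credentials
        server.send_message(msg)

# notify_report_ready("parent@example.com", "https://portal.assessment-body.example")
```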
Overview Report Presentation. Policymakers typically prefer concise summaries that provide a big-picture overview. So, data visualization software such as Tableau and Microsoft Power BI can greatly simplify complex data sets, transforming them into understandable graphs and charts for policymakers. Additionally, geographical information systems can map performance data by region, aiding policymakers in their analysis and quicker decision-making. Furthermore, with a proactive line of communication, automated report summaries can also be emailed directly to policymakers and other stakeholders to facilitate policy formation. Lastly, scheduling webinars or digital meetings where policymakers can directly interact with data analysts to discuss reports is an effective way to build bridges.
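The GIS idea can be sketched as follows with geopandas, assuming the jurisdiction has a boundary file and regional performance figures to join; the file name, region names, and percentages are placeholders for real data.

```python
import geopandas as gpd
import pandas as pd
import matplotlib.pyplot as plt

# Sketch: join regional performance figures to region boundaries and render a
# choropleth map. File name, region column, and figures are placeholders.
regions = gpd.read_file("region_boundaries.geojson")  # placeholder boundary file
performance = pd.DataFrame({
    "region": ["North", "South", "East", "West"],
    "pct_meeting_standard": [70, 64, 58, 75],
})
mapped = regions.merge(performance, on="region")

mapped.plot(column="pct_meeting_standard", legend=True, cmap="RdYlGn")
plt.title("Share of students meeting the provincial standard, by region")
plt.axis("off")
plt.savefig("regional_performance_map.png", dpi=200)
```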
Communicating Learner Engagement. The more a student is focused on, the more data points are collected, not only from assessments but also from the entire learning landscape. Integration with Learning Management Systems (LMS) facilitates updates and access, while sentiment analysis tools summarize qualitative feedback from surveys, making the learner engagement report a critical bridge. Additionally, organizing interactive forums or feedback sessions could support direct dialogue between students, teachers, and parents about engagement levels and strategies for improvement.
Sharing Teaching Confidence Report. Collaborative platforms can be used to allow teachers to access their reports, view peer benchmarks, and engage in community discussions, which would help increase engagement and share best practices in supporting student success. Additionally, within the engagement plan, we could provide recommendations based on report outcomes that support strategies for teaching transferable skills, further personalizing both teaching and learning. Finally, regular virtual workshops or seminars, where insights from the teaching confidence reports are discussed and strategies developed collaboratively, are essential for direct engagement.
In conclusion, this article aimed to reemphasize the importance of reporting and communication, presenting technology-driven solutions that support these two critical aspects of the assessment accountability domain, keeping both students and stakeholders informed and empowered to act. From individual student reports to systemic performance overviews, we've demonstrated how the use of data-driven insights and advanced technologies like machine learning and data analytics can transform raw data into actionable knowledge.
We can also conclude that communication contributes to improving the quality of reports through feedback received during student and teacher engagement opportunities, which is why, without proactive digital communication tools, reports risk becoming static and less impactful. So, the integration of real-time data analysis and dynamic communication platforms - from digital dashboards to interactive forums - facilitates a deeper understanding of student learning and engagement, allowing stakeholders to make informed decisions more quickly and drive meaningful change more effectively. Ultimately, these tools not only enhance the assessment process but also strengthen the communication channels between students, educators, and policymakers.
Finally, by developing technology-driven reporting platforms and employing open and responsive communication techniques aimed at fostering an informed, inclusive, engaged, and successful student body, we can ensure that our educational systems not only keep pace with technological advancements but also lead in innovative teaching and learning practices, fully equipped to face future challenges.
Vali Huseyn is an educational assessment specialist, recognized for his expertise in development projects across various aspects of the assessment cycle. His ability to advise on the improvement of assessment delivery models, the administration of assessments at different levels, innovation within data analytics, and the creation of quick, secure reporting techniques sets him apart in the field. His work, expanded by collaborations with leading assessment technology firms and certification bodies, has greatly advanced his community's assessment practices. At The State Examination Centre of Azerbaijan, Vali contributed significantly to the transformation of local assessments and led key regional projects, such as a unified registration and tracking platform for international testing programs, reviews of CEFR-aligned language assessments, PISA-supported assessment literacy trainings, and the institutional audit project, all aimed at improving the assessment culture across the country and the former USSR region.
Vali has received two prestigious scholarships for his studies: he completed an MA in Education Policy Planning and Administration at Boston University on a Fulbright Scholarship and also studied Educational Assessment at Durham University on a Chevening Scholarship.
Discover guided practices in modernizing assessments and gain insights into the future of educational assessments by connecting with Vali on LinkedIn.