Data for Learning: 5 lessons to make evidence-based education quality improvements

By Rebecca Ward

Teacher in a classroom helping students with computer skills

Data can be a powerful resource for education reform. Without it, education leaders can be immobilized: they lack the information to recognize that problems exist or to make the case for change. However, data provides value only when it is used to inform decisions. Unfortunately, many education systems struggle with this step. So, how can education leaders best be supported to use data to make better decisions in support of quality education improvements? In collaboration with local partners, IREX is supporting the development of data systems and tools that improve education systems through better, evidence-based decision making.

Below we share some lessons from our work: providing clarity on what data will be used for; balancing long-term integration with rapid modeling; institutionalizing clear decision-making points for data use; modeling inclusion; and drawing on context data.

Improving the use of data across education systems continues to be a major focus of governments and development partners worldwide. Good data can help actors better understand system inputs, outputs, outcomes, and contextual factors so that decisions about policy and practice are responsive and evidence-based. IREX has variously supported education partners to collect and use data about institutional capacity and performance, student satisfaction and learning outcomes, perceptions of the teaching profession, and teacher demand, supply, and career paths.

At this year’s Comparative and International Education Society Conference (CIES), I’ll be discussing what we have learned from developing a National Perceptions of Teaching Survey in Jordan and from our development and use of a Higher Education Institutional Capacity Assessment Tool. In case you can’t join me at these sessions, here are 5 lessons emerging from our work with partners to improve education system data.

Agree and provide clarity on what data will be used for

In higher education, institutional assessments can enable leaders to recognize and prioritize performance-improvement needs, foster organizational learning, and support planning and strategic decision making. They can also be used for benchmarking, accreditation, and accountability, and there is considerable tension between these functions. IREX has used its Higher Education Institutional Capacity Assessment Tool (HEICAT) with universities in Sub-Saharan Africa, Eurasia, and the Middle East for both internal organizational learning and external accreditation. We have learned that it is vital to engage partners before the assessment begins to agree, across all stakeholders, on its purpose, including who will have access to the data and what it will be used for, in order to build trust and facilitate honest dialogue.

Aim for integration but model user-friendly functionality to secure buy-in   

In our work to support the Government of Jordan to collect and use data and projections about teacher supply and demand, we agreed with core partners early on that the model should be housed in their existing EMIS, following the best practice of integrating new data initiatives within existing systems and avoiding a proliferation of databases. To avoid losing stakeholder interest and buy-in during the integration process, IREX and our data partner created a fully functional “sandbox” model that has been vital to maintaining momentum. It serves two purposes: as a resource for training Ministry staff on the model, its capabilities, and data-informed decision making; and as a demonstration of user-friendly functionality. That demonstration has been instrumental in fostering understanding and buy-in, including the Ministry’s decision to use the data to allocate teacher-preparation scholarships to areas of greatest need.

Identify and institutionalize clear decision-making points for data use

The introduction of new data collection regimes can stall in the absence of clear roles, responsibilities, and decision-making points. If the data user community does not have a clear plan for data use, fatigue and resistance are more likely to set in. In the West Bank, IREX adapted the HEICAT to align with national licensure and accreditation requirements, ultimately working with university partners and the Accreditation and Quality Assurance Commission to integrate the self-assessment into the Common Framework for Quality Assurance in Palestinian Higher Education. The framework includes clear roles and responsibilities, timelines, and procedures for data collection and use. In Georgia, IREX provided technical assistance to the National Center for Teacher Professional Development (TPDC) to develop and use data from a Training Management System in support of a nationwide rollout of training to over 18,000 teachers and 2,000 school principals. Use of the database was integrated into the program management cycle to assist planning for subsequent budget years, including the number of additional trainings, catch-up trainings, and trainer procurement.

Model inclusion and build data producer and user communities  

Data can be a source of power with the potential to exclude. Notably, youth are often absent in the production and use of data about education systems, despite being central to them. IREX is changing this by supporting youth and adults to generate and use data as a language to collaborate and inform decision-making on issues that affect them.

The Kisumu Issue-based Collaborative Network (ICON) is further enhancing youth work readiness by engaging higher education institutions, the public sector, and the private sector with diverse youth and youth-led/youth-serving organizations. A recent ICON data summit brought together 125 participants, providing a platform for young people to share and explore data on youth employment and work readiness with decision makers in national and county government. In response to research indicating that employers value soft skills as much as digital skills, summit participants advocated for soft skills such as empathy and communication to be incorporated into youth digital-skills programs. And upon learning that many youth do not see certification from a vocational education and training center as a catalyst for better economic opportunities, youth participants advocated for stronger bridges to employers and earning opportunities.

Don’t forget context data

Educational data tends to focus on system inputs (e.g., enrollment, teachers, resources, expenditure, infrastructure), outputs (e.g., attendance, attainment), and outcomes (e.g., learning outcomes, employability). However, it is widely acknowledged that contextual factors significantly impact the performance of education systems, and these factors can provide valuable data. In Jordan, we are conducting a biennial National Perceptions of the Teaching Profession Survey to understand how Jordanians view the profession and to learn what they know about how to become a teacher. The data is being used to inform the content and targeting of annual national campaigns by the Ministry of Education to improve the profession’s public image, with the goal of attracting more and better applicants to teacher training programs.