Part 6 Measuring Impact

“When providing acceptable data for judging the effects of professional learning, evaluators should always include multiple sources of evidence. In addition, evaluators must carefully match these sources of data to the needs and perceptions of different stakeholder groups.”

Guskey, 2014

Validity and Reliability

So, how do we know whether professional learning is having a positive impact? And what types of measures can we use?

As we explored earlier in this guide, school context, culture and norms, along with leadership, can undermine the impact that professional learning can have on changing professional practice. When measuring impact, validity and reliability lie in multiple measures; data triangulation is key. With this in mind, we invite you to consider the ways a collaborative leadership team might view and measure the impact of organizational professional learning, including:

  • Using a leadership professional development and learning continuum

  • Applying Guskey’s five-level model

  • Considering data sources on professional practice and teacher effectiveness

Using a Leadership Professional Development and Learning Continuum

International accreditation bodies, such as CIS, WASC, NEASC and IB Professional Standards and Practices, support schools in a developmental process using rubric-based continua as tools for continuous whole-school reflection and growth.

To complement a whole-school accreditation or reaccreditation growth process, leadership teams might also consider developing a professional learning continuum as part of a whole-school improvement rubric, to use as a tool for building a shared understanding of what leading impactful professional learning can look like.

Looking at the example below, we invite you to consider how a shared professional learning continuum might be used or adapted as part of a whole-school improvement plan:

Applying Guskey’s Five-Level Model

We invite you to consider Guskey’s effective professional learning evaluation model occurring at five critical stages or levels of information. “The five levels in this model are hierarchically arranged, from simple to more complex. With each succeeding level, the process of gathering evaluation data requires more time and resources. And because each level builds on those that come before, success at one level is usually necessary for success at higher levels” (Guskey, 2014).

Level 1: Participant reaction

  • Gather data on participants’ reactions to the learning experience, covering key areas such as presenter, content, and context.

Level 2: Participants’ learning

  • Gather data on participants’ learning of the new knowledge and skills intended to change practice, using sources such as observations, coaching conversations, and learning-log reflections.

Level 3: Organizational support and change

  • Data gathered will vary depending on professional learning goals, and can range from reviewing school records and meeting minutes to administering community questionnaires.

Level 4: Participants’ use of knowledge and skills

  • Data gathered on the extent to which professional learning is enacted will vary depending on professional learning goals. It can range from questionnaires to reflection logs and professional portfolios.

Level 5: Student Learning Outcomes

  • Gather data from multiple levels and sources to provide acceptable evidence for judging the effectiveness of professional learning. See the next section on professional practice and teacher effectiveness.

Considering Data Sources on Professional Practice and Teacher Effectiveness

"The leading professional learning role [promotes] a challenging learning culture […] through personal involvement in continued professional development and learning, knowing what content and learning activities [are] likely to be of benefit and promoting evidence-informed, self-regulated learning. Key to ensuring sustained impact from continued professional development and learning over time, leaders promote such activities as core business, so that teachers continuously monitor the impact of their work on student outcomes and examine the implications of such monitoring for their future teaching."

Cordingley et al., 2016, p. 10

a) Use frameworks.

Frameworks provide useful instruments for observations, support quality assurance, underpin self-reflection and self-directed learning, and foster professional conversations that highlight the strengths of professional practice and areas for development in both teaching and non-teaching staff. They can provide a formative springboard for individual professional learning and goal setting. As a ‘starter kit’ for considering effective pedagogy, and with its caveats acknowledged, the review of underpinning research What makes great teaching? (Coe et al., 2014) signals:

  • Danielson’s Framework for Teaching (1996; revised 2007, 2011, 2014)

  • Pianta’s Classroom Assessment Scoring System (CLASS)

  • Rosenshine’s Principles of Instruction (2010 and 2012)

  • Creemers and Kyriakides’ Dynamic Model

b) Use the best assessments available to measure student progress.

As noted, “the best feedback to guide the pursuit of effectiveness is to focus on student progress, and that requires high-quality assessment of learning” (Coe et al., 2014, p. 47).

c) Consider further data points

Further data points to consider when using multiple, multi-dimensional measures include:

  • Lesson observations by collaborative teams, peers, principals, and external evaluators;

  • Student ratings related to future achievement and formative teacher feedback;

  • Classroom artifacts such as lesson and unit planning;

  • Principal’s judgment supported by their considerable background knowledge and access to additional information.

To further reference the underpinning research review by Coe et al. (2014, p. 47):

“All these methods have potential value, but all have their problems. If they are done well, using the best available protocols, with awareness of how they can be biased or inaccurate, and with due caution about what inferences they can and cannot support, then they should be useful tools.”

Value-added models are not considered reliable as a single measure of teacher effectiveness; research findings highlight the controversy around using them in this way. See Cordingley et al., 2016, pp. 33–35, for further reading.

Sharing Our Impact

Within our data-driven approach, knowing and sharing our impact is key to continuous school improvement and provides us with moments to celebrate and elevate impact, progress and the purpose behind meaningful professional learning.

Co-construction and collaboration are essential for school improvement and sustainability. We have to understand what does and does not work in order to make our contextualized professional learning system and pathways responsive, flexible, and tailored enough to be truly meaningful and impactful.
