Catalyst @ Penn GSE convenes school and district teams to engage in a continuous improvement approach to research-practice partnerships. With Catalyst guidance and support, inquiry communities develop and test ideas to address complex challenges common across educational contexts. Through inter-district collaboration, practitioners build their capacity to utilize continuous improvement approaches in their ongoing work.
We support schools and districts in better understanding their problems of practice and in designing interventions to address them, using inquiry cycles built around four components: a focusing question, investigation, action, and evaluation.
By engaging with leaders from diverse school environments, participants learn about interventions other schools are deploying, as well as possible adaptations for their local contexts.
Learn more about our work with topical inquiry communities in the research briefs below.
The first Catalyst @ Penn GSE Inquiry focused on the following two questions: How do teachers of mathematics in our schools use questioning? And how can we support teachers to be more aware of the types of questions they ask students?
This inquiry focused on Teacher Questioning in Mathematics, a topic Catalyst chose based on a survey of key problems of practice shared with Penn GSE alumni practitioners who had expressed interest in participating. The single-cycle inquiry took place over the course of six months, from January to June 2018, and consisted of four sessions.
The inquiry was guided by two focusing questions: How do teachers of mathematics in our schools use questioning? And how can we support teachers to be more aware of the types of questions they ask students?
The first session (January 2018) introduced participants to an inquiry process using a four-part cycle (see the inquiry cycle figure under Our Approach) derived from Deming’s Plan-Do-Study-Act (PDSA) model, described the evidence base for effective mathematics teacher questioning, and introduced seven different frameworks for categorizing teacher questioning. Participants broke into small groups to investigate the different frameworks, then reconvened to discuss how aspects of each could be combined into a framework, specific to their needs, for assessing math teachers’ use of questioning in their classrooms. Based on participants’ suggestions, the Catalyst team developed a draft classroom observation protocol, which participants were asked to pilot in one or two classes and provide feedback on.
The Catalyst team then refined the observation protocol, and participants were asked to observe at least three teachers in their school or system: one whom they considered a struggling math teacher in their context, one whom they considered average, and one whom they considered strong. Participants collected data over the course of a month and submitted it to Catalyst. Catalyst analyzed the results for each school and comparatively for elementary and secondary grades, looking at question type (factual, task, procedural, conceptual, application), open- versus closed-ended questions, and teacher- versus student-initiated questions. Catalyst also compared the types of questions asked by struggling, average, and strong teachers.
Table 1: Question Type Framework Developed by Inquiry Group
| Question Type | Description |
|---|---|
| Task | Questions from the teacher to monitor progress or student behavior, or questions from teachers or students to clarify classroom procedures. |
| Factual | Questions that ask students to recall knowledge or that solicit a name or specific information from students. Often closed-ended. |
| Procedural | Questions that ask students to identify or name a mathematical process or solve a given equation. Often closed-ended. |
| Conceptual | Questions that offer students the opportunity to explain ideas, compare, contrast, identify trends, construct, generalize a process, or explain their approach. Often open-ended. |
| Application | Questions that require students to transfer knowledge or skill to a new context, or to evaluate, analyze, or justify their own response or another’s. Often open-ended. |
In session two (February 2018), participants analyzed the number and percent of questions their teachers asked within each question type, for their own school and for all schools combined. Participants discussed the results, including trends and the changes in questioning they hoped to see in their schools. They then began designing interventions to share the research, the question type framework, and school- and teacher-level question type data with their teachers. Each site developed an intervention around math questioning designed to increase teachers’ awareness of, and capacity to ask, meaningful questions.
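To illustrate the kind of tally described above (this is a minimal sketch with hypothetical observation labels, not the Catalyst team's actual analysis), the number and percent of questions in each type can be computed like this:

```python
from collections import Counter

# Hypothetical observation records: one question-type label per question
# observed, using the five categories from the inquiry group's framework.
observed = ["task", "factual", "factual", "procedural",
            "conceptual", "factual", "application", "conceptual"]

counts = Counter(observed)
total = sum(counts.values())

# Number and percent of questions within each question type
for qtype in ["task", "factual", "procedural", "conceptual", "application"]:
    n = counts.get(qtype, 0)
    print(f"{qtype:12s} {n:3d}  {100 * n / total:5.1f}%")
```

The same tally could be grouped by school, grade band, or teacher group (struggling/average/strong) to produce the comparisons described here.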
Table 2: Inquiry Cycle
| Month | Activity |
|---|---|
| January | Meeting 1 – January 19 |
| February | Create and test observation tool; Meeting 2 – February 21 |
| March | Design and implement intervention |
| April | Meeting 3 – April 4 |
| May | Collect post-intervention data |
| June | Meeting 4 – June 7 |
The third session (March 2018) was a virtual check-in to see how the work was going at each site and to collectively offer suggestions to participants. Participants were then asked to collect data again using the same protocol and, where possible, with the same teachers. Participants again submitted their data to Catalyst, which compared the two data collection rounds to identify any significant changes in question types after each school’s intervention.
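A pre/post comparison of this kind amounts to comparing each question type's share of all questions across the two rounds. A minimal sketch, using hypothetical counts for one school:

```python
# Hypothetical pre- and post-intervention question counts by type for one
# school, using the inquiry group's framework categories.
pre  = {"task": 20, "factual": 35, "procedural": 25, "conceptual": 15, "application": 5}
post = {"task": 18, "factual": 28, "procedural": 22, "conceptual": 24, "application": 8}

def shares(counts):
    """Convert raw counts to each type's proportion of all questions."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

pre_s, post_s = shares(pre), shares(post)
for qtype in pre:
    delta = post_s[qtype] - pre_s[qtype]
    print(f"{qtype:12s} {pre_s[qtype]:6.1%} -> {post_s[qtype]:6.1%} ({delta:+.1%})")
```

Comparing proportions rather than raw counts keeps the two rounds comparable even if a different number of questions was observed in each.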
The fourth session (June 2018) focused on comparing pre-post results and discussing the design and influence of the experience on teacher and leader practices. The goal of the fourth session was to allow participants to share learnings with each other and to identify possible ways to continue an inquiry in question types or launch other inquiries into a problem of practice at their school sites.
Through the analysis of survey, interview, and implementation data, we found that engaging leaders and teachers in an inquiry approach to evaluating their questioning can increase both open-ended and higher-order questioning.
Each school saw some change in either question type or open-ended questioning, which was a primary goal of the intervention. Three of the five schools also saw an increase in conceptual questions.
Inquiry participants reported that the process of inquiry was valuable and indicated their willingness to engage in more inquiry cycles while exploring ways to incorporate the process into other aspects of their work.
The South Jersey Data Leaders Partnership (SJDLP) is an organization of more than 40 public school districts that seeks to improve outcomes through the effective use of data. The goal of the partnership, now in its third year, has been to build a culture of using data to guide decision making and, at the same time, to introduce participants to the key tenets of continuous improvement.
The South Jersey Data Leaders Partnership (SJDLP), a membership organization composed of 46 public school districts in southern New Jersey, seeks to “improve outcomes for students, teachers, schools, districts, and state departments of education through the effective use of data.” SJDLP partnered with Catalyst @ Penn GSE in the fall of 2018 to conduct an inquiry into teachers’ and administrators’ attitudes toward data and their capacity to engage in data-driven decision-making. The goal of the partnership was twofold: to address a perceived lack of comfort among teachers in using data to inform instructional practices and to introduce participants to the key tenets of short-cycle inquiry and improvement science. The single-cycle inquiry took place over the course of nine months, from September 2018 to June 2019, and consisted of five sessions.
The first meeting of the inquiry cycle with SJDLP began with Catalyst providing an overview of the inquiry process and explaining the four key components of the inquiry cycle: the focusing question, investigation, action, and evaluation. Members then reviewed the two validated survey instruments used to collect data: the Data-Driven Decision-Making Efficacy and Anxiety Inventory (3D-MEA) and the Dimensions of Learning Organization Questionnaire (DLOQ). The Catalyst team shared a brief literature review highlighting the most current research on data use in schools in order to ground the inquiry in a basic understanding of data-based decision-making. SJDLP member districts, represented by teams of teachers, data specialists, and other building- and district-level administrators, had a six-week window to administer the survey to teachers and staff members in their schools.
At the second meeting in November, Catalyst shared preliminary South Jersey regional results as well as results for individual districts and schools. SJDLP members analyzed the data and hypothesized potential explanations and root causes for why some schools scored lower than others on their comfort level with analyzing and interpreting data. Meeting in district teams, participants discussed obvious and non-obvious explanations for the results and how to test their hypotheses. For example, one district hypothesized that the differences it found between elementary and middle school teachers’ comfort level with data were due to a higher number of new teachers at the middle school, while another district hypothesized that differences in attitudes between schools were due to building-level leadership styles. Other potential root causes suggested by SJDLP members included districts not sharing data with teachers in a way that was both timely and comprehensive, and the lack of an effective tool or platform to make the data accessible. The meeting concluded with a discussion of how to share findings with key district stakeholders by creating a data story.
At the SJDLP winter meeting, district teams focused on using the survey results to select and refine a problem statement. Several districts, for example, decided to address new teachers’ lack of confidence in using data to make decisions in the classroom. Catalyst introduced the Driver Diagram as a tool participants could use to develop a common understanding of key levers for change and a shared theory of action within each district. Participants concluded the meeting by brainstorming interventions and by considering how to test proposed theories of change quickly and easily (e.g., How will we know a change is an improvement, and what counts as data?). Participants left the meeting charged with developing and implementing a small-scale intervention and collecting data on its effectiveness before reconvening in March.
The fourth meeting, held in March, served as an opportunity for SJDLP members to share progress updates with one another and to troubleshoot challenges. Participants continued to develop an understanding of what constitutes evidence of improvement, which included a discussion about different types of data. The group also considered the differences between incremental evidence of improvement and indicators of systemic or long-term outcomes. While some districts focused on developing large-scale, long-term interventions, which they planned to implement over the spring and during the following school year, others focused on smaller, more targeted strategies, like introducing a small group of teachers at one school to a new protocol to use when reviewing data.
At the fifth and final meeting in May, each district team brought a one-page summary of the inquiry work they had accomplished during the school year as well as their plans for 2019-20. In addition to sharing the summaries and receiving feedback, participants discussed the role of evaluation in the inquiry cycle process. The meeting concluded with a whole-group reflection on what individuals had learned from participating in the inquiry cycle over the past year.
Through the analysis of survey and interview data as well as observations during inquiry meetings, we found that inquiry teams often focus improvement efforts on interventions related to systems and structures for using data in their schools. In the first year of the inquiry, however, many teams did not clearly articulate how those systems would be piloted, what implementation would entail, or how the new systems or routines would be evaluated. This observation has informed our learning at Catalyst about how we can better structure our inquiry process and facilitation to help improvement teams more fully use these important practices in continuous improvement.
All inquiry teams (school or district) identified systems, structures, and processes that needed to be created or implemented in order to improve data use. These ranged from technical systems to structures in school and district schedules.
While each inquiry meeting focused on a different aspect of continuous improvement, including the value of short inquiry cycles, most inquiry teams immediately identified potential solutions and approached implementing them in ways typical of large-scale improvement efforts. Rather than identifying ideas to test or pilot in low-impact areas, most improvement teams planned to implement their ideas system-wide and for the entire school year. This led the improvement facilitation team at Catalyst to reflect on when improvement teams did employ continuous improvement practices, and on ways Catalyst could adapt its facilitation of inquiry communities to foster the small-scale, early-stage experimentation that true continuous improvement requires.
New Approaches
The Equity and Student Belonging Inquiry Community began in fall 2018 and is now in its third year. In this inquiry community, district teams in Southeastern Pennsylvania are investigating the sense of belonging among their student populations and learning how to use improvement science to support their work.
In the fall of 2018, after surveying a host of K-12 practitioners to determine their highest priority problems of practice, Catalyst @ Penn GSE facilitated a short-cycle inquiry on student sense of belonging. The purpose of this offering was twofold: to help educators address this important problem of practice and to introduce them to an effective methodology for addressing a wide range of school-based challenges. This inquiry followed a partial cycle, taking place over the course of four months from September through December 2018 rather than over the course of a full academic year. The inquiry consisted of two sessions, with two optional sessions added in Spring 2019 to support districts with developing plans to respond to what they learned in the fall sessions.
To begin the inquiry process, participating district teams, comprising teachers, counselors, and administrators, attended an introductory meeting with Catalyst @ Penn GSE. Responding to an open invitation, 15 public school districts and two independent schools joined the inquiry group, with 13 teams attending in person at Penn and four joining virtually. At the first meeting, Catalyst, along with experts in student social development, reviewed the current research on student sense of belonging with participants. Catalyst then introduced the Psychological Sense of School Membership Survey (PSSM), a validated survey instrument intended to help districts develop a deeper understanding of how students in their school communities view issues related to belonging. With Catalyst’s support, district teams customized the survey to meet the needs of their unique contexts and decided which grade levels to survey. Districts then had a six-week window to administer the survey to students via an online link, after which Catalyst staff analyzed the results.
At the second meeting in December, each team received a customized report of survey findings, which included results for each grade level and school that participated, district-wide results, and the combined results of the 15 districts and two independent schools in the inquiry. Meeting in individual district teams as well as in cross-district support groups, participants analyzed their data for trends, unexpected outcomes, and potential root causes that might explain their results. Equipped with this information, participants returned to their respective districts and shared survey results with key stakeholders, including school board members, administrators, teachers, and parents. Based on what they learned from their survey results and stakeholder engagement, district teams developed interventions designed to increase student sense of belonging in school.
Catalyst hosted two optional virtual meetings in the spring of 2019 to provide interested participants with additional support. During these one-hour online meetings, participants gave progress updates on the work they were doing and brainstormed potential solutions to implementation challenges.
Analysis of surveys, interviews, and participant reflections yielded several interesting insights about how schools approached improving students’ sense of belonging, and how practitioner-led inquiry around continuous improvement could support developing and implementing these interventions.
To learn more about joining one of our inquiry communities, please contact Megan MacDonald at mmacdo@gse.upenn.edu.