The South Jersey Data Leaders Partnership (SJDLP), a membership organization composed of 46 public school districts in southern New Jersey, seeks to “improve outcomes for students, teachers, schools, districts, and state departments of education through the effective use of data.” SJDLP partnered with Catalyst @ Penn GSE in the fall of 2018 to conduct an inquiry into teachers’ and administrators’ attitudes toward data and their capacity to engage in data-driven decision-making. The goal of the partnership was twofold: to address a perceived lack of comfort among teachers in using data to inform instructional practices and to introduce participants to the key tenets of short-cycle inquiry and improvement science. The single-cycle inquiry took place over the course of nine months, from September 2018 to June 2019, and consisted of five sessions.
The first meeting of the inquiry cycle with SJDLP began with Catalyst providing an overview of the inquiry process and explaining the four key components of the inquiry cycle: the focusing question, investigation, action, and evaluation. Members then reviewed the two validated survey instruments used to collect data, the Data-Driven Decision-Making Efficacy and Anxiety Inventory (3D-MEA) and the Dimensions of Learning Organization Questionnaire (DLOQ). The Catalyst team also shared a brief literature review highlighting current research on data use in schools in order to ground the inquiry in a basic understanding of data-based decision-making. SJDLP member districts, represented by teams of teachers, data specialists, and other building- and district-level administrators, had a six-week window to administer the survey to teachers and staff members in their schools.
At the second meeting in November, Catalyst shared preliminary South Jersey regional results as well as results for individual districts and schools. SJDLP members analyzed the data and hypothesized potential explanations and root causes for why some schools scored lower than others on their comfort level with analyzing and interpreting data. Meeting in district teams, participants discussed obvious and non-obvious explanations for the results and how they might test their hypotheses. For example, one district hypothesized that the differences it found between elementary and middle school teachers’ comfort with data were due to a higher number of new teachers at the middle school, while another district hypothesized that differences in attitudes between schools were due to building-level leadership styles. Other potential root causes suggested by SJDLP members included districts not sharing data with teachers in a timely and comprehensive way and the lack of an effective tool or platform for making the data accessible. The meeting concluded with a discussion of how to share findings with key district stakeholders by creating a data story.
At the SJDLP winter meeting, district teams focused on using the survey results to select and refine a problem statement. Several districts, for example, decided to address new teachers’ lack of confidence in using data to make decisions in the classroom. Catalyst introduced the Driver Diagram as a tool participants could use to develop a common understanding of key levers for change and a shared theory of action within each district. Participants concluded the meeting by brainstorming interventions and by considering how to test proposed theories of change quickly and easily (e.g., How will we know a change is an improvement, and what counts as data?). Participants left the meeting charged with developing and implementing a small-scale intervention and collecting data on its effectiveness before reconvening in March.
The fourth meeting, held in March, served as an opportunity for SJDLP members to share progress updates with one another and to troubleshoot challenges. Participants continued to develop an understanding of what constitutes evidence of improvement, which included a discussion about different types of data. The group also considered the differences between incremental evidence of improvement and indicators of systemic or long-term outcomes. While some districts focused on developing large-scale, long-term interventions, which they planned to implement over the spring and during the following school year, others focused on smaller, more targeted strategies, like introducing a small group of teachers at one school to a new protocol to use when reviewing data.
At the fifth and final meeting in May, each district team brought a one-page summary of the inquiry work it had accomplished during the school year as well as its plans for 2019-20. In addition to sharing the summaries and receiving feedback, participants discussed the role of evaluation in the inquiry cycle process. The meeting concluded with a whole-group reflection on what individuals had learned from participating in the inquiry cycle over the past year.
Promising Practices & Lessons Learned
Through analysis of survey and interview data, as well as observations during inquiry meetings, we found that inquiry teams often focus improvement efforts on interventions related to systems and structures for using data in their schools. In the first year of the inquiry, however, many teams did not clearly articulate how these systems would be piloted, what implementation would entail, or how the new systems or routines would be evaluated. This observation has informed our learning at Catalyst about how we can better structure our inquiry process and facilitation to help improvement teams make fuller use of these core continuous improvement practices.
Finding 1: Inquiry teams often focus improvement efforts on interventions related to systems and structures for using data in their schools.
All inquiry teams (school or district) identified systems, structures, and processes that needed to be created or implemented in order to improve data use. These ranged from technical systems to structures embedded in school and district schedules.
Details
- Technical solutions. Approximately a third of participating teams identified a need for their schools or districts to implement a new technology to support data use. These technologies included tools for collecting qualitative data through staff surveys, quantitative data collection tools or assessment management systems for student assessment data, and data analysis and visualization tools. One improvement team framed this as an aim to make charts, graphs, and other forms of data “less intimidating for teachers.” This focus on technical solutions may be due in part to the membership of the SJDLP, where at least some members hold administrative responsibilities for assessment and data management. It may also reflect a genuine gap in schools’ and districts’ ability to generate and access data that, if not addressed, would prevent deeper work with teachers and leaders around data use.
- Structures and processes to enable data use. All improvement teams discussed creating processes within their schools and districts to support data use. These included creating teams specifically charged with looking at data, developing meeting structures between school and district leaders, and establishing data-focused meetings within and across schools. Other teams focused on providing training in data use and/or creating protocols for data analysis; one improvement team described its approach as providing “targeted training on data driven problem solving.” Interestingly, some improvement teams planned to introduce these protocols and trainings within existing meeting structures, such as Professional Learning Communities or district-wide quarterly meetings, while other schools and districts chose to design both the protocol or training and the space in which it would take place.
Finding 2: Inquiry teams need increased support to use continuous improvement approaches.
While each inquiry meeting focused on a different aspect of continuous improvement, including the value of short inquiry cycles, most inquiry teams immediately identified potential solutions and approached implementation in ways typical of large-scale improvement efforts. Rather than identifying ideas to test or pilot in low-impact areas, most improvement teams planned to implement their ideas system-wide and for the entire school year. This led the improvement facilitation team at Catalyst to reflect on when improvement teams did employ continuous improvement practices and on how Catalyst could adapt its facilitation of inquiry communities to foster the small-scale, early-stage experimentation that true continuous improvement requires.
Details
- Testing ideas. Fewer than a third of improvement teams discussed piloting or trialing their new systems or structures. Those that did plan to test their ideas primarily took one of two approaches: (1) testing with one or more grade-level teams, or (2) expanding an approach with prior evidence of success in one school or grade level to other schools or grade levels.
- Focusing on implementation. Most improvement teams discussed larger-scale systems and structures as implementation supports in and of themselves, for example, pairing training or professional development with a new data technology system. Rarely did improvement teams identify the specific ways in which a new structure or routine would be implemented as part of their overall improvement strategy, and as a result they did not identify how they would collect implementation data in addition to the survey data.
New Approaches
- Leveraging inquiry community meetings to plan short-cycle tests of change. In the second and third years of this inquiry community, Catalyst has begun using Plan-Do-Study-Act (PDSA) cycles as the short-cycle inquiry model. In the 2020-21 school year, Catalyst is supporting improvement teams in planning and testing change ideas on a smaller scale through pilots, in addition to or instead of implementing sweeping changes.
- Gathering evidence before scaling. By using consistent PDSA cycles, Catalyst facilitators will concentrate their support on consulting and feedback focused on gathering evidence about implementation and using that evidence to inform the next inquiry cycle, so that PDSA cycles build on one another and evidence is gathered before change ideas are implemented more widely.