In my first two posts, I described how we at Jackson State University (JSU) transformed our clinical practice and our course curriculum to strengthen our teacher candidates’ experiences so they are ready on day one to successfully teach all students, especially Black and Latinx students and students experiencing poverty. In this post, I’ll share how the process of collecting, analyzing and using data is the foundation for those efforts and for everything we do to strengthen our teacher preparation program.
I’ll start by focusing on clinical practice. As my first post describes, before we overhauled candidates’ clinical practice experience, we had a retired educator observe candidates a few times a year and then manually report those observation data to the Director of Teacher Quality, who kept the data centrally. So we collected data on how candidates were progressing, but we didn’t analyze what those data were telling us or collect additional data that would give us multiple perspectives on candidate progress. And once we had the data, we didn’t do anything differently as faculty members and program administrators. What we didn’t fully see then, and do see now, is that we were sitting on information that could help us provide differentiated support to each candidate based on where they stood on their trajectory toward becoming excellent beginning teachers. We also didn’t see that, viewed across a whole cohort, candidate observations could tell us as a program what we were doing well and where we needed to focus more energy to support candidate learning.
That all changed once we began to think about data as a critical resource we needed to strengthen our programming. Now, we have a candidate-centered model with data at the core. Here is what it now looks like at JSU to collect, analyze and use data to support clinical practice:
We also collect survey data directly from teacher candidates and mentor teachers who support candidates in their student teaching experiences.
We used a similar process to collect, analyze and use data to support our curriculum overhaul. Partnering with US PREP, we worked in design-based research teams (DBRTs), which paired JSU faculty with a diverse group of faculty researchers from other programs in the US PREP coalition. These teams gathered pre-course, in-design and post-course assessment data from five core courses over two semesters. We gathered those data from candidates about their experience with the courses, piloted a redesign based on those data and then implemented the revised courses the following academic year. This process unearthed key insights. For example, the DBRT process led us to determine that our math methods course did not meet the needs of elementary education majors because it placed a disproportionate emphasis on secondary math content. Similarly, DBRTs revealed that the required reading course for both elementary and secondary education did not adequately serve secondary education majors, who had not been required to complete the foundational prerequisite courses before taking it. These insights led us to change required courses and their sequencing across programs.
It is not always true that more data are better. Sometimes you can be swimming in a sea of data but lack the tools to make sense of them in a systematic way. Using a US PREP toolkit, and with help from researchers at the University of Washington, we implemented self-studies to understand how our faculty were using data for program improvement. This process involved gathering quantitative and qualitative survey data from all stakeholders, including teacher candidates, faculty, administrative team members and district personnel such as mentor teachers, school leaders and key office personnel. Pulling together data from all of these sources, we made two fundamental program changes:
All of this work required us as faculty to think differently about data. We had to shift our mindsets. In years past, we collected data when a compliance report was due. So we complied, and that was that. In our transformed model, we have come to view data as an essential tool supporting a continuous cycle of reflection and program improvement with our district partners. That shift has made all the difference.
As we worked together to make sense of the data, it became clear that we also had to create new structures to support data review and response. One example: we now have a standing agenda item in departmental meetings, with reporting to a newly established Teacher Preparation Task Force. That task force, charged with continuously improving the program, includes educator preparation faculty, department chairs, essential program staff and administrative leaders.
In short, data have completely changed how we work. Even with the new challenges of the ongoing COVID-19 pandemic, from juggling new priorities for virtual learning to managing new obligations in our personal lives, we have maintained our commitment to reviewing data together. That speaks volumes about the role of data in helping us and our partners sustain the improvements we have made for years to come. We look forward to the continued journey!