In this fireside chat, Peter Leonard, the Executive Director of Student Assessment and MTSS at Chicago Public Schools, and Ila Deshmukh Towery, a partner at Education First, explore how school districts can navigate decisionmaking around AI opportunities and challenges. They share lessons learned from Education First’s collaboration with the Chicago Public Schools Teaching & Learning team on an exploration rubric for AI decisionmaking, a project completed in March 2024. This conversation offers insights into how school districts can responsibly leverage AI to support teachers and students.
The transcript for this fireside chat, originally conducted at CCSSO’s fall collaborative in New Orleans, LA on September 24, 2024, has been edited for clarity and length.
Ila: So, Peter, tell us a little bit about the project we worked on together, and what led you to want to develop a rubric for AI decisionmaking.
Peter: Thanks for the question, Ila. The project originated from a sprint grant put out by the Gates Foundation. The timeline was tight—just around six months. We initially aimed to support the Skyline Curriculum, an initiative to ensure a high-quality, culturally responsive curriculum in every classroom. We wanted to explore how generative AI could support this effort. However, it quickly became apparent that the solution space was too broad to fully evaluate the feasibility of one or two specific use cases in that timeframe.
The project evolved in collaboration with the Education First team into developing the AI Exploration Rubric: not just a rubric, but a tool and an approach that can help our system determine when AI is appropriate to use. It involves answering key questions: Should we use AI for this purpose? Could we, from an operational, fiscal and ethical perspective? Only then would we think about the how. We wanted to go from testing one or two use cases to developing an approach, a tool and a process for the organization.
Ila: Tell us a little bit about one or two of the use cases you were interested in exploring.
Peter: One that we were interested in was ongoing content development for Skyline. Skyline was adapted from a curriculum originally created by a provider to make it more culturally relevant, Chicago-specific and digital. The curriculum is available across K-12, spanning five to six subject areas, with digital lesson resources and assessments. Given all of those domains, we’re always thinking about how we can be more effective and efficient. For this project, we were interested in content development specifically in the math space.
Ila: Can you talk a little bit about what you all learned about the AI use cases and anything that you’ve taken forward into implementing what you all were testing since our project ended?
Peter: Just as AI developments and the technical landscape are evolving quickly, so are the ways in which systems organize themselves around AI. CPS also stood up an AI steering committee that married the technical and instructional work. This was key to continuing to explore and work with partners to bring more use cases to the table for the rubric. For example, we’re a close Google partner, and we work with Microsoft a lot, too. We are continuing to explore how we might build our own medium language model for ongoing development of curriculum content. We know it’s technically possible, and we think there’s really strong value in curriculum content that is connected to an instructional course.
Ila: Given that you’re in the early stage of this project, what are some pain points that you experienced or see as potential pain points coming down the road?
Peter: While we’re eager to move on what we see as an opportunity, we need to be very deliberative once we get to the how because of the student privacy concerns and the technical considerations around which partner to choose, including what data goes into a model and whether a model trains on the data or protects it. We are doing our due diligence to ensure that what we do move forward with is robust, that every box has been checked, and that when we’re leveraging generative AI at a system level, we’re as deliberative as possible.
CPS developed an AI guidebook to provide structured guidance to different users around how to consider generative AI in their work. We very intentionally didn’t pursue developing a “capital P” policy. We know the landscape is going to evolve quicker than our policies could, so we are committed to updating this guidebook quarterly and pairing the guidebook with a professional learning plan by user and use case.
We’re trying to balance the areas where we know we need to be really protective, where the risk is high, with the areas where we can be much more innovative. We want to use AI responsibly and ethically, but also creatively, so that we can really deliver value to people across the system.
Ila: One thing that really struck our team in working with CPS and your team, Peter, is that you had put together a very cross-functional team of folks that wanted to work together. It really struck us how collegial and collaborative the relationships between departments were. How did that team come together? How did you think about who should be at the table to put together a rubric and analyze these use cases?
Peter: I like to think we have a pretty collaborative culture here at CPS. No matter what your function is—whether it’s assessment, content or legal—we’re all focused on the same thing, which is how do we strengthen the student experience so that every student’s daily learning experience is joyful, rigorous and equitable. I was the convener of this group and we brought together our leadership within our office of Teaching & Learning, including our chief and deputy chief, our math content leads and our curriculum, instruction and digital learning team (which is connected to education technology and the burgeoning work of generative AI).
So, the core group that helped initiate the project became the core team for the ongoing engagement. It was about ensuring the right voices were at the table to create a holistic approach, and kudos to the Education First team for also doing about 1,000 interviews across the organization and the field. Though that’s a little bit of hyperbole, it helped us develop a rubric and a tool that could be comprehensive for our approach.
Ila: The cross-functional nature of your team was crucial. Could you expand on why that was so important?
Peter: Absolutely. The diversity of perspectives made a big difference. When we first started, we thought we’d only be analyzing a couple of use cases. But it quickly became clear that we needed a broader framework to make sense of AI’s potential across different contexts. Having tech and content represented helped us think about the implications from multiple angles—whether it was ensuring cultural responsiveness in curriculum or safeguarding privacy.
Ila: We’d love to hear you talk about the role the state education agency (SEA) played. What do you see as the role of the SEA in supporting your work?
Peter: I often think that states are best for LEAs when they focus on the big things that they do absolutely excellently, and that creates the conditions for districts and others to be innovative. There are foundational pieces that the state already has in place, such as the strong stance on student data privacy and other technical requirements that set a good framework and starting point for how we’re approaching aspects of our work with AI.
Ila: And, what advice, from a district perspective, do you have for state leaders about supporting districts in making smart and coherent decisions about AI?
Peter: So, first, stay connected to the research and practices. Just from last year to this year, the growth in familiarity with and regular use of AI by students and teachers has been massive. I encourage you to familiarize yourself so that you are connected to the types of issues and opportunities that are coming up. Second, explore ways that you can be a connector or convener to build capacity or tap into the areas of innovation within your state, including working with partners. For smaller districts without a lot of central resources, the state can help by creating professional learning communities or task forces, providing a platform for collaboration and sharing knowledge. The work with generative AI is bubbling up to the level where there are a lot of great knowledge sharing opportunities, and the state could be of high value in that area. Finally, continue to set strong frameworks.
Peter: Our work with you was our first AI project, and I know it was your first AI project, too. I know you’ve learned a lot since then. What have you learned from working with us and other districts around AI?
Ila: One of the prior questions I asked touched on this—having a cross-functional team was super important. It represented the different perspectives that are all trying to work around a shared strategy for the system. That was one really important thing because of how collaborative and ready the team was to think about a larger framework for analyzing multiple use cases.
Besides having a strong cross-functional team, our team has talked about how a district can think about bringing more of a coherence lens to its AI strategy. Our Education First team developed some accompanying materials that showed you can’t just look at use cases one by one because they have to be part of the strategy.
Finally, and this is my hat tip to Peter and the team, I came in having a lot of questions about the goal of efficiency. Often with AI, the value proposition is that it saves you time. And we debated whether that should be the goal. Should the primary goal be efficiency, or are there other things that matter more than time saving? One thing I keep taking away from talking to the CPS team, especially Peter and his colleague Sasha Klyachkina, is that you also have to think about the counterfactual. What is it replacing? It could be a time saver. It could also be replacing something that is bad practice, because it’s not necessarily true that what’s happening at any level, whatever user you have in mind, is the right thing to do in the first place.
Peter: The thing I’d add in terms of efficiency is that time saving opens up space for other types of thinking and activities. As we help different users with AI, we always ensure that there’s a human in the loop, and it can also create opportunities to do different types of thinking and work that enhance your impact. We’re not trying to just squeeze every bit of productivity we can out of something, but if we’re able to allocate certain aspects of work to generative AI, then it opens up a range of activities that were perhaps closed off before, such as deeper planning or deeper engagement with students. We want to think creatively about how this could shift and enhance work for everyone involved.
Ila: Yes, I came into the project skeptical of AI, thinking of all the downsides, and I started to see the bigger picture of how to think about what it’s replacing. Is it replacing something that’s already going well, or is it replacing something that actually isn’t going well and this will also help save time? That was a really important lens, and we’re taking that into a district network we’re launching.
Stay tuned to learn more about the 10+ school districts joining Education First’s 2025 AIxCoherence Academy, a 9-month program to explore integrating AI into teaching, learning and system-level efforts to solve real-world challenges.