Case Study: Bridgewater Primary School
We had the pleasure of talking to Samantha Mawer about the school’s use of Insight to handle data meaningfully at a large, three-form-entry primary.
Driving Progress, Driving Learning
“Before levels went, we were using a different pupil tracking system. They came and gave us their suggested approach for the replacement of levels, which seemed to us to just be replacing levels with levels and steps to move through. I’d spent a good long while researching and thinking: ‘If we’re not going to use levels then what can we use instead?’
“What we wanted to use was something that was based in the classroom, that was actually going to be used quite purposefully by the teachers to drive progress, to drive learning, and then we were going to worry about the tracking bit afterwards.
Choosing a Tracking System
“I did a lot of research and I looked at what Michael Tidd had done with his Key Performance Indicators. For the first year, I simply used and adapted his Excel spreadsheets. I liked what they offered – we could track key concepts – but what I needed was something that was going to do it for me, something more reliable than my spreadsheets (where there was lots of opportunity for human error). We were looking for something that could handle all of this data from a three-form-entry school, but also be based on what was going on in the classroom and track the key concepts (rather than going the other way), and this is what Insight offered.
“I heard about Insight on Twitter via Jamie Pembroke. There was quite a buzz at all the Learning First Saturday meetings, where people would volunteer their time to talk about assessment and making it work in schools. Jamie said, ‘This place can do it’, so I looked at the system because of him. There are quite a few schools around here who have started to use Insight because I’ve shown them what it can do.
“The biggest challenge was finding what we wanted: to keep it about learning, but also to make sure that our data was going to be safe. If somebody made a mistake, we weren’t too worried because we could phone Insight up. We needed both parts: the technical side as well as the basis in learning.
Making it Our Own
“We didn’t need to explore other systems because we felt really confident that Insight would provide us with what we needed. There is very much a bespoke element to it. You think, ‘I need to see this… or how could I see this easily?’ and it’s just an email or a phone call and either it’s done within 24 hours or it’s on the road map and then you get notified when it comes up. The bespoke nature of it appealed to us because we really want the best information but also as little information as possible. When we spoke to Andrew at Insight initially and he told us what he could offer, we didn’t need to go anywhere else because we could make it our own.
Tables at the Touch of a Button
“The fact that we could produce lovely looking tables at the touch of a button was brilliant. The focus from the teacher’s perspective is going into assessing over a longer period of time, and into key concepts and thinking about those. Then there’s the ease of a click of a button and you’ve got a progress matrix up. What we’re doing now is trying to reduce teacher workload. All of the information is there at the click of a button – so our pupil progress meetings are about the teachers exploring a pupil progress matrix and coming up with actions as a result. There’s no need to calculate anything complex because it’s all there in front of us.
Getting Up and Running
“It was really quite simple to get up and running with Insight. We just sent across the existing data that we had (baseline data and our summative judgement information) and their team pre-populated the system with the information that we wanted to keep. In terms of the key concepts, we said to start us off with Michael Tidd’s and then we tweaked and adapted it as we went along. We’ll probably still do some more adapting of those concepts as we think more around them, but it was very quick and easy to get started.
Ofsted Loved It
“It continues to meet our expectations. Again, the pupil progress matrices are really good in that there’s no child hidden behind an average or percentage; you’re looking at all children from their starting points and how they’re progressing. We’ve got that granular detail of the key concepts, so we can really support those children who are working towards – not just to stay at working towards, but to make real progress in certain areas. Teachers can track that and celebrate that and focus on what’s important.
“It’s also really easy to track attainment differences within the key groups. It’s got everything that we asked for. Ofsted loved it. We had Ofsted in November and I went through how we used the system. I went through children who were working below the expected standard and how we make sure that we gap fill. We have an expectation of gap filling in the autumn term so that they’re building on something – it’s not just ‘working towards’ every single year of their lives – and the inspector said that it was music to her ears how teachers were using assessment to drive learning.
Taking Ownership of Insight
“All of the teachers are very positive about Insight and the majority of them get on well with the system. There are a couple who are never going to like a computer assessment system, but we support each other. We’re a big school and work in teams so really, it’s about making sure everyone is using it together so that everyone learns how to use it.
“Newer staff members are using Insight more thoughtfully; they have taken ownership of how they use the system. I don’t want it to be me just telling them how to use it; I want them to take ownership of it and that’s beginning to happen.
“We moderate teacher assessments across teams internally, as well as externally, and use low-stakes testing in the form of PiRA and PUMA, as well as Comparative Judgement for writing, to benchmark attainment, monitor progress and ensure consistent assessment judgements.
“We utilise gap filling for those who are below or working towards. I want pupils to build on firm foundations. We always look at the pupil progress matrices so that we can track pupils’ baselines against where they are currently. We look at depth of understanding and curriculum coverage so that we can have conversations around whether we’re on track to cover the curriculum over the year and to see how secure children are in that particular curriculum.
“Again, we use Insight’s ‘average depth’ measure to question and push those higher attainers, to think about whether we can demonstrate their understanding at a deeper, broader level within the age-related expectations. In terms of PiRA and PUMA, that’s good benchmarking, so we can see what they’re attaining in those test situations.
“I ask the teachers each term to put in what we call a ‘point-in-time assessment’, and that will just be how the children are performing generally in class. All of that sort of information goes home as well, so we’ve got that on the system.
The Biggest Impact
“The biggest impact is when we talk about progress in pupil progress meetings. I think with levels, there’s a lot of conversation around average point score – and a lot of numbers – whereas now we talk about real people because now we’re looking at the children. We can just bring that Progress Matrix screen up and talk about children’s progress. We can go into quite a lot of detail by clicking on a child: we can see how they’re accessing the curriculum so we can have deeper, better conversations around that child’s learning.
“That’s the biggest impact – from talking about percentages and decimal points’ progress, to talking about a particular child or a group of children and what we can do to drive their learning further. The conversations around data are now far more meaningful.
“There has been a definite reduction in workload. When we were using levels, we were calculating points progress and throwing information into different sheets; now, for pupil progress meetings, there’s no analysis of the data before we actually sit in the meeting and bring that information up on the screen. What we actually get as a result of the meeting is that our interventions are sorted by the end of it – there’s no prior work. The data is already there; we can have a look at it, explore it, interrogate it. We don’t need to print it out. We don’t need to regurgitate it. It’s on there, so: ‘Now what are our actions as a result?’
“The real impact has been being able to talk about real children, not percentages and average points, in those pupil progress meetings. It’s worked really well for us and we always get great support whenever we phone up; if we’re confused, any of us can call in and someone is always very quick to help. Several of our staff have phoned in, and that support has been invaluable.
“We’ve gone from talking about percentages and decimal points’ progress to talking about particular children or groups of children and what we can do to drive their learning further.”