Friday 31 October 2014

Can I be that little bit better at......'doing' data


I will be the first to admit that data and I have had a turbulent relationship over the years. When I talk about data in this instance I am talking about test scores, exam scores and assessments. I fully understand the importance of tracking where students are in order to identify those who are falling behind, those who are on track and those who are performing above initial expectations (if we even know what those are). After assessments, or at calendared data collection points over the various terms, I will happily input information to help build up a picture of my students. I create my own spreadsheets where I can add any additional information in an effort to demonstrate what might be happening over the year.

Here's the problem though. Once I have input all of this data I rarely do anything productive with it. Of course I do the obvious stuff and look over it to identify any trends or anomalies that crop up. There's the 'Who has hit their targets?' check. There's the 'Who is working above expectation so I can breathe a sigh of relief' check. There is also the 'Arrrggghhh! What on earth are they doing?!' check, which usually results in me crying inside before planning what to do to help that student. I can use it to talk to students about progress over time and even to inform parents of how students are doing. I do get the feeling, though, that I have drifted unconsciously into collecting data for the sake of collecting data, as if it were a way of compiling evidence which I can show line managers without really knowing why. It feels like I am collecting data for others rather than doing it to either a) improve students' learning, or b) improve an aspect of my teaching. In fact I get the overwhelming feeling at times (and it's my own fault) that I am simply 'doing' data rather than 'using' data for any real or significant purpose. And within a department context, are we using data collectively to bring about meaningful change? Who knows? It's got to the point where it's time to change how I tackle the data beast.

So what have we been doing?

The last few years (within our department) have seen us begin to approach data with more of a purpose. Initially we would input assessment results into a very large spreadsheet where we could do the basics. This would include things like:
  • Picking out those students who have done well
  • Who is on track?
  • Who is performing below expected standard?
  • Who would benefit from intervention?
  • Who is on the C/D borderline?
  • Who do I need to contact home about?
  • What topic areas did students perform badly in which I need to revisit?
The list looks a bit negative and reactive, dealing with what has already happened. There were a number of other things as well, but the primary focus of this was pretty much to highlight those not performing as they should and to do something about it. It could help drive conversations with line managers or within a department about the current state of play. It usually resulted in some interventions and occasionally ended in a few kicks up the bum (for staff and students). However, I feel that none of this significantly changes the thing that really matters: the teaching and learning. Remember that this is just my own opinion, but here's why.
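If you wanted to automate these basic checks rather than eyeball a large spreadsheet, a few lines of Python (pandas) would do it. This is only a sketch: the file name, the column names ('name', 'target', 'current') and the GCSE-style grade ladder are all assumptions for illustration, not what our department actually uses.

    # A minimal sketch of the checks above, assuming a hypothetical CSV with
    # one row per student and 'name', 'target' and 'current' grade columns.
    import pandas as pd

    # GCSE-style grade ladder, best first (an assumption for illustration)
    GRADES = ["A*", "A", "B", "C", "D", "E", "F", "G", "U"]
    rank = {g: i for i, g in enumerate(GRADES)}  # lower rank = better grade

    df = pd.read_csv("class_marks.csv")  # hypothetical file name
    df["gap"] = df["current"].map(rank) - df["target"].map(rank)

    below_target = df[df["gap"] > 0]        # candidates for intervention
    above_target = df[df["gap"] < 0]        # performing above expectation
    borderline = df[df["current"] == "D"]   # the C/D borderline check

    print(below_target[["name", "target", "current"]])

Of course, a script like this only reproduces the reactive checks; as the rest of this post argues, the list itself is not the point.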

Are we asking questions?

One of the most common sense things I've heard about data came from our Deputy Head in a meeting last year. In a conversation about how to create change from what we've collected, he said that "Data provides a great opportunity to ask questions about what is going on". Now I've heard a lot of thoughts around data, but this one sits firmly at the front of my mind. Data can show us a lot of things, but asking questions about what we see is, in my opinion, a more powerful strategy. Why is student x not performing as well as they did in terms one and two? Is there a link between underperformance and the seating plan in the class? How can group x make as much improvement as group y? Data doesn't always show you the full picture. In fact it shows you very little compared to the enormous number of factors at play in classrooms and lessons over a whole year or key stage. But its contribution can be very powerful if we use it to spot things and ask questions.


Are we just talking about data?

I've spent many a meeting simply talking about data. With my latest set of results and a spreadsheet sat in front of me, I can revel in high grades and cower at poor ones. We can compare and discuss how classes have been performing. We can find averages and trot out overused phrases like 'above national average', '3 levels of progress' and 'Ofsted would grill us with these results'. The question for me, though, is: what is this achieving? Yes, we are talking about data, but is that actually bringing about change? Yes, it may give me a wake-up call that my class is behind everyone else's, but do I know how to rectify that? Yes, I can put students forward for intervention, but are we missing the point as to why they need it in the first place? Does knowing that a certain class has a 12% higher A*-C rate help other teachers or simply demoralise them? Are we just talking about data simply to say we've talked about data?

Does data make us say things we're not quite sure of?

There are those times when the data in front of us doesn't read so well. At that moment (especially when sharing it with others) we may start saying things like "Well, what do you expect with those kids?" or "They are bottom set" or "Well, I must just be a terrible teacher!". The flip side is when results go very well and we revel in the glory. Sweeping statements like these don't actually help in the grand scheme of things; they mask the details. I've done it many a time and have also seen colleagues beat themselves up over how their classes have performed. For me, comments like those above help us deal with the disappointment we feel inside when things haven't quite gone as expected. But saying these things doesn't unpick areas we can work on. It generalises without focusing on the detail. It becomes a supposed factor or reason which, unless we look at it more closely, we can't be sure really is the case.

Does someone else do the analysis for us?

Workload is an issue and I know that delegation can ease it. But I do wonder whether having someone else analyse our data helps us understand the bigger picture ourselves. Although it can be time consuming, understanding your own class's performance helps, in some small way, to identify the steps to move forward.

Do we follow up the data?

I'm sure we all do, but I'll make the point anyway. If students perform poorly on a topic area, do we find time afterwards to close the gap between what they know and what they should know? If we pick a poorly answered topic area to reteach with the class afterwards, what about those who actually performed well in it? Do we give blanket coverage to everyone, or can we make it bespoke so that students work on the areas they need to?


Is data dominating our time unnecessarily?

I hope not but it can easily do so.  If the time it takes outweighs the benefits or impact it brings, does the system need to change? 

Are we finding time to collaboratively look at data?

Sometimes data can feel like an isolated task. I mark the assessments. I create the spreadsheet. I analyse the results. I react to the results. I then plan what to do with the results. I then do something about the results. I then also have a meeting with a line manager to discuss my results. The isolation can sometimes mean we work on problems and land on solutions which are probably no better than the initial idea. Take this as an example. A class does poorly on a particular component of a test and as a result we spend the next lesson reteaching it. But what if we teach it in much the same way as we did the first time? That was the way I taught it when students clearly didn't understand it, so why would they get it this time? Has there been a change in the way I taught it? If not, I shouldn't be surprised if the same misconceptions crop up or students still don't get it. As they say, practice doesn't make perfect; perfect practice makes perfect. Involving colleagues in the data analysis can be an enormously important approach and a great chance to learn from each other.

Is data improving teaching and learning?

And this is the main question on my mind. If we are simply 'doing' data then I'm not sure it is. If we just talk about it without bringing meaningful change then it certainly isn't. If we look at data and pick out weaknesses, but then still teach the same way, then once again I don't think it is. If we try to hide our data or make generalisations because results aren't great, then it isn't. Is data improving our teaching? Is data improving the learning in the classroom? After a meeting with Pete Pease, our Director of Learning for Maths and Science, I'm hoping that it begins to do so. Data can still be about trends and anomalies, but in a more powerful approach shouldn't it also be about developing us as teachers and improving the learning that goes on in classrooms?

How? - Data with a purpose

During that meeting I found that we've taken great leaps in the way we tackle data in our department. Although not foolproof (and the model I will propose isn't foolproof either), what we have put in place has created a great foundation.

When marking exams or tests we note common misconceptions and identify students who have performed differently than expected. This simply takes the form of a blank sheet of paper and a few scribbles as we go. Building a picture as we mark can be vital later on.

After each theory assessment or exam we then create a spreadsheet which breaks down the exam into its smaller components. These components are colour co-ordinated for cross-reference and tracking. For each component we enter the mark that each individual student got. It takes very little extra time. Although this is nothing new and not rocket science, it allows us to do a number of effective things. Firstly, we can get a better overview of each topic area. How have the students performed in general? Did they score well, or is this an area we need to look at more closely? Were there any strong areas? Is there a trend in results compared to question type (multiple choice, short answer or long answer)? Was it a technical aspect that was answered well or badly? Was it the written communication that was effective or ineffective? The spreadsheets allow us to quickly get a better understanding of where we are.
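For those who like to script this sort of thing, here is a rough pandas sketch of that component breakdown. The file name, the component columns and the maximum marks per component are all hypothetical; the point is simply the per-topic class overview.

    # A rough sketch of the component breakdown, assuming a hypothetical CSV
    # with one row per student and one column per exam component, plus
    # made-up maximum marks for each component.
    import pandas as pd

    marks = pd.read_csv("unit_test_components.csv", index_col="name")
    max_marks = pd.Series({"somatotypes": 6, "diet": 4, "training_zones": 8})

    pct = marks / max_marks * 100               # each student's % per component
    topic_overview = pct.mean().sort_values()   # class average, weakest first

    print(topic_overview.head(3))  # the two or three areas to look at closely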

Secondly, the data allows us to look at students performance over time in each topic area.  Our unit tests, for example, used to be block tests which just focused on what had been taught.  This is no longer the case and we now include questions from every taught topic in every exam.  Ultimately our final unit test will have questions from the full course.  With this colour co-ordinated on our spreadsheet we can quickly see if the topic of  'somatotypes' for student A is still a weak area or has it got better?  Is student B forgetting information over time?  Is it the same areas that are still causing us the most problems?  This allows us to keep our finger on the pulse and respond when needed.
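Again purely as a sketch, tracking one topic across successive unit tests might look like this. The file names and the 'somatotypes' column are assumed for illustration, building on the hypothetical percentage table above.

    # A sketch of tracking one topic across successive unit tests, assuming
    # hypothetical percentage tables (as built above) indexed by student name.
    import pandas as pd

    tests = {
        "unit_1": pd.read_csv("unit_1_pct.csv", index_col="name"),
        "unit_2": pd.read_csv("unit_2_pct.csv", index_col="name"),
        "unit_3": pd.read_csv("unit_3_pct.csv", index_col="name"),
    }

    # One column per test for 'somatotypes': improving, stable or fading?
    over_time = pd.DataFrame({t: df["somatotypes"] for t, df in tests.items()})
    fading = over_time[over_time["unit_3"] < over_time["unit_1"]]

    print(fading)  # students who may be forgetting the topic over time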

We are also very focused on follow-up, and after every test and data analysis we run two 'closing the gap' lessons. In the past we might have picked one poorly answered topic area and retaught it to all students. But what if some students did very well in it, making that a waste of their time? Instead we share the test information with individual students and they use it to pick weak topic areas to focus on. They revisit the targeted area, analyse their response, check mark schemes, lesson notes, textbooks and other resources, and attempt to answer the question again with a better outcome. Some students simply read the question again and instantly know what the answer should have been. This approach makes it bespoke for every student and gives us the opportunity to go round and work with students 1:1 or in small groups. Two lessons of work and hopefully that topic isn't a weakness in our next unit test.
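A quick sketch of how those bespoke focus lists could be generated automatically, assuming the hypothetical percentage table from above and an arbitrary 50% cut-off for what counts as a 'weak' component:

    # A sketch of building each student's bespoke focus list, assuming the
    # hypothetical percentage table from above and an arbitrary 50% cut-off.
    import pandas as pd

    pct = pd.read_csv("unit_test_pct.csv", index_col="name")
    THRESHOLD = 50  # assumed definition of a 'weak' component

    for student, row in pct.iterrows():
        weak = row[row < THRESHOLD].sort_values().index.tolist()
        if weak:
            print(f"{student}: revisit {', '.join(weak[:3])}")  # cap at three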

So what's different then?  How can data improve T&L?

Pete's message was very clear. Although we have a solid foundation and things are better than they previously were, can we raise our game and use this data to improve teaching and learning? In-depth analysis and collaboration within your department team can be a powerful tool, and one which is easily underused. Finding time in meetings to step away from 'doing' data should be high on the priority list.


Transparency and a supportive culture

The first area for improvement is transparency within the department, coupled with a supportive culture. Sharing everyone's data with one another may seem a scary prospect and may cause anxiety if an individual's class has underperformed, even more so during a department meeting. Why would we want others to see this? It can feel like we are opening ourselves up and bringing our reputation as a teacher into question. But it doesn't have to be like that. The ethos should be about using each other's experience to help move the department forward as a whole. It's a difficult culture to develop, but learning from each other through discussion and observation is a powerful driver of change in teachers' habits. Sharing a department's data with its own teachers allows us to sit down with each other and perform a detailed analysis. We can look across classes at different groups of learners. We can highlight topics that have been answered well across the different groups. We can see which question types have been answered better in different classes. All of these things, with each other, sat around a table, allow us to then ask the question 'why?'.

Bringing out the detail

When we ask why, a multitude of reasons is normally given. Sometimes, if a topic has gone well and the results highlight this, we say things like 'they were a bright class'. If the results are poor we say 'no matter what I said, they didn't get it'. These things might well be the case, but they don't give us much to work with. They aren't specific enough. Instead it's worth looking at the overall spreadsheet and identifying areas that weren't answered well. Picking out topics (only two or three at a time), we can then bring out the students' exams, tests and assessment papers for detailed analysis. What did students write on question two in all of our classes? Can we see what their thought process was? What specifically did they get wrong, miss or interpret incorrectly? Is it a trend across all classes, which may be down to our scheme of work or materials? If not, is it down to individual classes or specific groups? It is an interesting task, and by unpicking student answers to poorly answered questions we get to the core of the problems. I could hazard a guess and say 'well, it's because they don't use technical terminology', but how do I know unless, as a group of teachers, we look through the papers in detail to see whether that actually was the case?
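As a sketch of that cross-class comparison, assuming hypothetical per-class percentage files and class codes: a topic that is weak everywhere points at shared materials or the scheme of work, while a topic weak in only one class points at how that class was taught.

    # A sketch of the cross-class comparison, assuming hypothetical per-class
    # percentage files. A topic weak everywhere points at shared materials;
    # a topic weak in one class only points at how that class was taught.
    import pandas as pd

    classes = {c: pd.read_csv(f"{c}_pct.csv", index_col="name")
               for c in ["10A", "10B", "10C"]}  # hypothetical class codes

    by_class = pd.DataFrame({c: df.mean() for c, df in classes.items()})
    by_class["spread"] = by_class.max(axis=1) - by_class.min(axis=1)

    print(by_class.sort_values("spread", ascending=False))

A table like this only tells you where to look; the looking itself still means sitting around a table with the actual papers.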

A driver of change

Once we have looked at the papers and worked out what went wrong in these questions or topics, we can then begin to bring about change. Usually I would go back and reteach that piece of information so students were exposed to it again. But what if the way I taught it was the problem? What if my explanation was the issue? What if my examples confused students? Through collaborative analysis we can look at classes which did perform well on a topic and ask the teacher to share how they taught it. What did they say? How did they model it? What resources did they use? We rarely find the time to visit other classrooms, so this is a perfect, and very beneficial, time to develop each other's practice.

Making the change to the learning that takes place should then be the key. Knowing what my class needs to work on, and knowing how other teachers have successfully taught it, we can then spend time working together to improve students' understanding. This is where planning with colleagues comes into its own. Challenge each other, learn from experience, seek advice and reflect. If it doesn't work when we reteach or revisit the topic, we can easily come back and unpick it again. Has data ever allowed this to happen in your department? I know it can easily be missed.

Doing data?

We shouldn't simply be 'doing' data. What we collect and collate can be a powerful driver of change and, if used wisely and in a manageable way, can help us collaboratively improve the learning in our classrooms. It doesn't have to be more time consuming either. We mark the papers, so we simply input the final marks. We already look at our results, so it's now about looking at them as a department. We have the papers, so it's easy to bring them out and scrutinise specific answers from your class. We always try to correct the mistakes of our learners, but now it's about learning from each other and making that change to our practice. It's about a culture within a department, one which could contribute positively as we move forward. It's about using data rather than doing data.
