HCC governor services

Last night’s governor training was on “RAISEonline to improve our schools” and was led by Chris Martin, from Hampshire Inspection and Advisory Service:

Christine Gilbert, former head of Ofsted, said:

“It is data that will challenge thinking and stimulate discussion leading to improved practice.  It is data that enables progress to be monitored.”

Why should governors use data?

  • School self-evaluation
  • School accountability
  • Preparation for external inspection and accountability
  • Ensuring school is setting challenging and aspirational targets for the future (pupils, cohort and school)
  • Monitoring progress of pupils and cohorts in the school
  • Developing a vision and strategic direction
  • Performance managing the headteacher
  • Determining the allocation of resources: for example, Pupil Premium


It is important not to over-simplify data analysis: schools are complex organisations full of people, so the data needs to be handled sensitively.  RAISEonline focuses on self-evaluation.


RAISEonline: Key Processes

  • RAISEonline makes use of the existing data collected nationally through the school census and the KS2 testing agency.
  • The data is matched together using the Unique Pupil Number (UPN).
  • The October RAISEonline report is unvalidated; this is unamended data, and schools have a chance to check that data set.
  • After the school’s checking exercise a second set of validated data is due in March (including the Data Dashboard).
  • The validated data is used to construct the Performance Tables (normally 15th December).


Establishing Protocols

  • The National Governors Association recommends that, as a minimum, each Governing Body nominate a couple of governors to have access, so that all the data can be seen.
  • Each year in the autumn term, the school’s RAISE Summary Report should be presented by a member of the SLT to a Full GB meeting.
  • The Governing Body must decide how it will consider and analyse the more detailed data, and may set up a committee to consider this or ensure the monitoring of school performance data is within the remit of another committee (Strategy Committee).


Key Understanding

A purple G on a page marks data that is important for Governors to look at.

Significance: any piece of data can fluctuate naturally, purely by chance.  Significance is based on standard deviation: for example, 26.8 ± 1.0 gives a range of 25.8 to 27.8, which lies entirely outside the national average of 28.2, so the result is significant.  A dash (–) means that significance has not been calculated, often because the percentage is too near 0 or 100.  In the reading example, however, there was no test; the result was teacher-assessed in combination with writing and entered later.
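A rough sketch of that significance check (function name mine; in RAISEonline the band itself comes from the cohort's standard deviation, so this only tests whether the national figure falls outside a given band):

```python
def significant(school_score, margin, national_average):
    """Return True if the national average lies outside the
    school's band (school_score +/- margin), i.e. the
    difference is unlikely to be down to chance alone."""
    low, high = school_score - margin, school_score + margin
    return not (low <= national_average <= high)

# Worked example from the notes: 26.8 +/- 1.0 vs a national average
# of 28.2. The band 25.8-27.8 excludes 28.2, so it is significant.
print(significant(26.8, 1.0, 28.2))  # True
```

A wider band (a smaller cohort) makes significance harder to reach, which is why small sample sizes matter.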

Point Scores: Level × 6 + 3 = point score (with +2 for an 'a' sub-level and -2 for a 'c').
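A small illustration of that conversion (function name mine; sub-levels written as lowercase a/b/c, with 'b' as the midpoint):

```python
def point_score(level, sub="b"):
    """Point score from the notes: level x 6 + 3, then
    +2 for an 'a' sub-level and -2 for a 'c' sub-level."""
    score = level * 6 + 3
    if sub == "a":
        score += 2
    elif sub == "c":
        score -= 2
    return score

print(point_score(4))       # 27: a secure Level 4b
print(point_score(4, "a"))  # 29
print(point_score(4, "c"))  # 25
```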

Trends: look at the overall journey, but a dashed break between columns means that the way of calculating the APS changed:

2009: English, Maths, Science

2010: English, Maths

2012: Writing teacher assessed, combined with Reading to create English; Maths

2013: Writing, Reading, Maths

The formula for creating the APS for all subjects is:

APS = ((Reading + Writing) / 2 + Maths) / 2

So Maths has a higher weighting, and SPAG is excluded from the All Subjects APS.
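A minimal sketch of this weighting (function name mine), assuming Reading and Writing are first averaged into an English score and then English is averaged with Maths, so Maths counts double:

```python
def all_subjects_aps(reading, writing, maths):
    """All Subjects APS: reading and writing are averaged into
    English first, then English is averaged with maths, giving
    maths twice the weight of reading or writing alone."""
    english = (reading + writing) / 2
    return (english + maths) / 2

# With equal scores the APS equals that score; raising maths by
# 2 points moves the APS twice as far as raising reading by 2.
print(all_subjects_aps(27, 27, 27))  # 27.0
print(all_subjects_aps(27, 27, 29))  # 28.0
print(all_subjects_aps(29, 27, 27))  # 27.5
```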

Sample size: instead of FSM, the measure may use older FSM ratings or IDACI deprivation scores.

Types of Indicator:

  • Attainment: what pupils got, i.e. their point score.  A threshold measure records reaching a particular standard, e.g. Level 4: it doesn't matter whether a pupil got Level 4, Level 5 or Level 6, they passed the Level 4 threshold.  The point score focuses on the overall attainment of all pupils.
  • Progress: the difference from beginning to end.  The expectation is 2 levels, e.g. L2 at KS1 to L4 at KS2.
  • Value Added: takes all pupils who started on, say, a L2B, looks at how they did at KS2, and compares that to the school and national averages.  The benchmark rises each year to move the goalposts and increase progress.  With Levels going, the focus will shift much more towards Value Added.
  • Achievement: Ofsted's criterion combining attainment and progress.  High attainment with no progress isn't good enough; conversely, attainment may be really low, but if children arrived with such low levels it can still represent fantastic progress.


RAISEonline Context

  • FSM from 2012 onwards is FSM Ever6.
  • Stability is the percentage of children who remained at the school from 1st October to the last day they were able to be taught there.
  • IDACI – every postcode is ranked on the IDACI score; the figure shown is the average for where your children live, not where your school is based.
  • Gender balance – are we doing anything differently; do we need to tailor the curriculum for a particular class?
  • Attendance and exclusion data arrives painfully slowly and so is often missing from the first RAISEonline report, but the HT should be able to state those figures in the absence of the data in the Unamended Report.  The median trend line for schools at your FSM level shows schools with a similar context.  Do you have a few pupils absent a lot, or a lot of pupils absent a little?
  • Prior Attainment: pupils who were not in mainstream education, not in the country, etc., are not counted – check the coverage figure to see how many are counted.  School APS is based on the pupils' scores, not the school's score, so it reflects the scores of the pupils you have now even if they took KS1 at another school.  Prior attainment bands are: Low L1, Middle L2, High L3.


RAISEonline Attainment

  • In the 2013 tests, schools are below the floor standard if:
  • Fewer than 60% of pupils achieve Level 4 or above, and
  • The school is below the median for England for progression by 2 levels in reading (91%), in writing (95%) and in maths (92%).
  • For 2014 the attainment threshold rises to 65%.
  • If prior attainment was green, then attainment is more likely to be green too – in essence, that is not something to celebrate as much as moving from blue or blank prior attainment to green attainment.
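The floor-standard test above can be sketched as follows (function name mine; this assumes a school must fall below both the attainment threshold and all three progress medians to be below the floor):

```python
def below_floor_2013(pct_l4, prog_reading, prog_writing, prog_maths):
    """2013 floor standard as described in the notes: below the
    floor if under 60% of pupils reach Level 4+ AND the school is
    below the median for 2 levels of progress in reading (91%),
    writing (95%) and maths (92%)."""
    below_attainment = pct_l4 < 60
    below_progress = (prog_reading < 91
                      and prog_writing < 95
                      and prog_maths < 92)
    return below_attainment and below_progress

# 55% at L4+ and below all three progress medians: below the floor.
print(below_floor_2013(55, 90, 94, 91))  # True
# Same progress but 65% at L4+: not below the floor.
print(below_floor_2013(65, 90, 94, 91))  # False
```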


RAISEonline Progress

  • Key is to look at the trends: are we maintaining or growing success?
  • Ask HT for 2014 and 2015 predicted trends.
  • 2 levels of progress is no longer good enough; schools on average push their children further.
  • Key is to dissect and understand not just the whole school but, for example, how well the L1 pupils progress and how well the L3 pupils progress.
  • Good means being at or close to (normally within approximately 5% of) the National Average APS.
  • Value Added: coverage says how many of the cohort have a KS1 result.  A value added score of 100.8 with a range of 0.6 spans 100.2 to 101.4.  100 is the national average value added score; a score of 100.8 means every child got, on average, 0.8 points above the national average progress for that year, which is why it is a moveable figure.  If 100.00 sits within the range of the school's VA score, it won't be green or blue.
  • In Maths and Writing every mark literally contributes to VA.
  • Value Added Line – you want to be above the line – the pupils got higher than expected – the further the distance from the line shows someone who got much higher than expected.
  • Your school can produce any table or graph with any characteristics, e.g. FSM v non-FSM, SEN v non-SEN, boys v girls through the interactive reports.
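The value added banding described above can be sketched like this (function name and labels mine; it simply checks whether the national average of 100 sits inside the school's VA range):

```python
def va_rating(va_score, margin):
    """Band a school's value added score against the national
    average of 100. If 100 falls inside va_score +/- margin the
    result is not significant (no green or blue)."""
    low, high = va_score - margin, va_score + margin
    if low > 100:
        return "significantly above"
    if high < 100:
        return "significantly below"
    return "not significant"

# Worked example from the notes: 100.8 with a range of 0.6 spans
# 100.2 to 101.4, so 100 sits below the whole band.
print(va_rating(100.8, 0.6))  # significantly above
```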


RAISEonline Closing/Narrowing the Gaps

  • Focus on the tables with three years to look at the trend of closing the gap.
  • Within-School Gap: service children show little attainment difference, and their funding is linked to emotional support – that is why FSM and LAC are the groups highlighted in these tables.  Always check which characteristics you are analysing.  A gap of less than 10 percentage points is low; anything over 30 is extremely worrying.  Attainment has to be linked to progress: FSM pupils often have much lower prior attainment, so look at both attainment and prior attainment.
  • The national benchmark is important to ensure we don't focus so much on CLA/FSM pupils that it is to the detriment of non-CLA/FSM pupils.  We want gradual improvement for non-CLA/FSM pupils with higher improvement for CLA/FSM pupils – otherwise we will never close the gap.
  • Ask which characteristic drives the underperformance if a pupil group is, for example, both FSM and SEN.
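The rule of thumb on gap sizes can be sketched as follows (function name and the middle label are mine; the under-10 and over-30 thresholds are from the notes):

```python
def gap_concern(group_pct, comparison_pct):
    """Classify a within-school attainment gap in percentage
    points: under 10 is low, over 30 is extremely worrying."""
    gap = abs(comparison_pct - group_pct)
    if gap < 10:
        return gap, "low"
    if gap > 30:
        return gap, "extremely worrying"
    return gap, "worth investigating"

print(gap_concern(80, 85))  # (5, 'low')
print(gap_concern(50, 85))  # (35, 'extremely worrying')
```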


What other performance data is available to us?

  • School Data – are your predictions right?  If not, something is wrong in the system.  You want data similar to RAISEonline's, and the school should be able to give it to you.
  • OFSTED Data Dashboard
  • Performance Tables
  • Fischer Family Trust Data Dashboard – it has been around a lot longer than RAISEonline.  It compares a school to similar schools with the same characteristics (which risks creating lower expectations of schools with more FSM pupils, but that is not the world we live in), and projects what results might be for the next 3 years given the pupil information.  It does a lot of 3-year trends, which is more reliable, although in 3 years a school could be completely different, e.g. after a change in SLT.


What other data would we like?

  • Pupil attitudes
  • Engagement in community activities
  • Participation in sport
  • Behaviour


How effectively is the school using the data from RAISEonline?

  • Is it referenced in self-evaluation documents?
  • Is it referenced in the SIP?
  • Does the school use the question level analysis function?
  • Does it inform performance management processes?
  • Which staff members receive RAISEonline data?
  • Is it used to inform annual review meetings with curriculum leaders etc.?
  • What other data does the school use to triangulate with the data from RAISEonline?


Some key messages

  • RAISEonline may provide cause for celebration
  • RAISEonline is a tool for asking robust questions
  • RAISEonline provides insight into the performance of the school, but it is not the only source of evidence.
  • RAISEonline needs to be used sensitively.
  • RAISEonline’s greatest value will be the insights that it provides into plans for school improvement.
  • RAISEonline looks back, governors look forwards
  • Focus on trends
  • Watch your sample size
  • Understand significance.