MY SCHOOL and SIMILAR SCHOOLS

When he woke up on Thursday 28 January this year, Shaun Addy, Principal of Blacktown Boys’ High in Western Sydney, was amused to open the My School website and find that his school was “similar” to the Alice Springs School of the Air.  On the same day, the parents of Terrigal High School (NSW Central Coast, 1,274 students) would have scratched their heads when they saw that their school was “similar” to Buntine Primary School (WA wheat belt, 300 km north of Perth, 19 students).

You might have made one of the reportedly nine million hits on the website, checking to see how your local school was faring against other, similar schools.  But your first encounter with the My School website’s “Statistically Similar Schools” page was probably just as mystifying until – and unless – you read the fine print.  The clue is in the word “statistically”.

The Australian Curriculum, Assessment and Reporting Authority (ACARA) explains that – as unlikely as it may seem at first sight – the communities of the schools against which your school is compared are sufficiently alike in the things that influence success at school to make a valid comparison.  There is good evidence that the method can work well under the right conditions: community factors can indeed account for 60 to 70 per cent of a school’s results.

The lists and numbers on My School are a bit like the weekend sports results.  That’s probably because some very similar thinking lies behind them.  In fact, as offensive as the idea is to educators, it’s probably easier to understand some of its concepts in a sporting context.  For instance, there’s a strong suggestion that all of the “statistically similar schools” are on the same level playing field, equally kitted out, ready to play the learning game, and that any differences on the scoreboard at the end must be down to the coaches.  It’s a delightfully simple idea: the scores are all listed on the website in the NAPLAN table.  If you aren’t sure what the numbers mean, just look at the coloured bars: green means you’re ahead of the competition, red means you’re trailing.

To make the competition seem fair, each school has a grading, with the awesome title of “Index of Community Socio-Educational Advantage” – ICSEA, for short.  That’s on the website too, along with a breakdown (in “quartiles”, or one-quarter slices) of how your school’s community splits up on educational advantage.  If your community is mostly executive and professional folk in the top quartile, then your school will have a really high ICSEA and you’re definitely playing in the A Grade!  If your community has a lot of single parents or unemployed adults occupying the lower quartiles, then your ICSEA will be down and your school is relegated to the lower grades for “competition” purposes.

Of course, some people might find it offensive that their school and community are publicly rated this way, but competitions have to be fair: after all, you wouldn’t tolerate your ten-year-old stick-insect playing against beefy teenagers, would you?

But here’s where it starts to get tricky.  On the sporting field, everyone has a good idea if the grading system gets it wrong, just by looking at the players.  On the education paddock, it’s more obscure, so you must rely on the ACARA experts to get it right.

It’s hard to know if they’ve got it right.  Teachers generally understand the resources and challenges in their school community pretty well.  They might have “gut feelings” that something is wrong with the rating of their school community, but without the kind of data that’s available to ACARA, it’s almost impossible to argue with the experts.

The NSW Secondary Principals’ Council took up the challenge when several principals claimed – some with good reason, it appeared – that their school was mismatched on the similar schools web page.  The Council’s investigations turned up several anomalies – schools with the same ICSEA, but very different quartile arrangements.  Curious!

It’s time for some background.  The ICSEA is calculated using students’ addresses and information from the Australian Bureau of Statistics (ABS).  There are a couple of problems with that.  One is that the ABS data comes from the 2006 Census and is getting a bit out of date.  Another is that, for privacy reasons, the ABS does not provide data for the actual addresses where students live.  It provides average data for their census collection districts – blocks of around 200 to 250 families.

But calculating the ICSEA this way assumes that every student (or at least the total collection of students) from a particular census district matches the average for that district.

How likely is that assumption to be correct?  Fairly likely, but it’s certainly not guaranteed.  The more students that come from a particular census district, the more likely it is that their overall profile will match the district average.  Unless, that is – and it’s a big “unless” – some of the students from that district (the rich ones, the bright ones, the Catholics, the Muslims, the sportspeople, the musicians, and so on) are being shunted off elsewhere.  Our schooling system encourages parents to make these sorts of choices, and the more that it happens, the less reliable the “average” assumption becomes.
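To see how fragile that assumption can be, here is a toy sketch in Python.  Everything in it is invented for illustration – the numbers, the district size, the “advantage” scores – and it is not ACARA’s method; it simply contrasts what the census-district shortcut assumes with what two hypothetical intakes actually look like.

```python
import random

random.seed(1)

# Invented "advantage" scores for 200 families in one census collection
# district. (The real ICSEA inputs are ABS census variables; these
# numbers are made up purely for illustration.)
district = [random.gauss(1000, 100) for _ in range(200)]
district_average = sum(district) / len(district)

def actual_mean(students):
    return sum(students) / len(students)

# Case 1: the local school enrols a representative slice of the district.
representative = random.sample(district, 50)

# Case 2: the 50 most advantaged families send their children elsewhere,
# and the local school enrols from the families that remain.
remaining = sorted(district)[:150]
skimmed = random.sample(remaining, 50)

# The census-district method assigns EVERY student the district average,
# regardless of which families actually attend the school.
print(f"district average (what the method assumes): {district_average:7.1f}")
print(f"representative intake, actual mean:         {actual_mean(representative):7.1f}")
print(f"'skimmed' intake, actual mean:              {actual_mean(skimmed):7.1f}")
```

The representative intake sits close to the district average; the “skimmed” intake sits well below it, yet the method scores both schools identically.  The more that enrolment choices pull particular families out of a district, the further the assumed average drifts from the students actually in the classrooms.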

To try to rein this in, ACARA gives school systems (not the schools themselves) a chance to review the draft calculations before publication, using any better data that they might have, including data on the actual families of the students in the school.

As the peak Principals’ Council found, this is where the cracks start to appear. It seems that the anomalies they found are cases where the draft ICSEA was changed, but the original quartile information was left on the website. (Very soon after this was pointed out, the quartile data disappeared from the “adjusted” schools!)

It’s possible to figure out roughly where the quartile figures would have placed those schools initially and to see how far they were moved.  Some of the apparent differences are enormous – over 100 points on the ICSEA scale in some cases.  Changes of this size are more than enough to change those colours on My School that are supposed to tell you how your school compares.
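For the curious, the back-of-envelope arithmetic runs something like the sketch below.  The quartile “midpoint” values here are invented for illustration – they are not ACARA’s scale – but they show how two schools with very different quartile spreads imply rough index estimates around 100 points apart.

```python
# Invented midpoint score for each advantage quartile. These values are
# NOT ACARA's scale; they exist only to make the arithmetic concrete.
QUARTILE_MIDPOINTS = {"bottom": 850, "lower-middle": 950,
                      "upper-middle": 1050, "top": 1150}

def rough_index(quartile_shares):
    """Crude index estimate from a school's published quartile shares."""
    assert abs(sum(quartile_shares.values()) - 1.0) < 1e-9
    return sum(QUARTILE_MIDPOINTS[q] * share
               for q, share in quartile_shares.items())

# Two hypothetical schools: same headline ICSEA on the website, but very
# different quartile arrangements - the anomaly the Council noticed.
school_a = {"bottom": 0.10, "lower-middle": 0.20,
            "upper-middle": 0.30, "top": 0.40}
school_b = {"bottom": 0.40, "lower-middle": 0.30,
            "upper-middle": 0.20, "top": 0.10}

print(f"school A rough estimate: {rough_index(school_a):6.1f}")  # 1050.0
print(f"school B rough estimate: {rough_index(school_b):6.1f}")  #  950.0
```

If the published ICSEA for both of these schools were the same, one of the two figures must have been moved a long way from where the quartiles would have put it.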

Of course, we should be relieved that the ICSEA values for many NSW schools were changed: the NSW Department of Education had good enough data from its students’ enrolments to show that some of the draft values for its schools were wrong, and that headed off a more serious problem – in the case of those schools, at least.  We can only trust that other school systems were able to do the same, because the initial ACARA calculations apparently got some of the figures very wrong indeed.

Questions have to be asked: why didn’t ACARA use student family data in the first place instead of this elaborate and flawed census district method?  Exactly what criteria and data did the systems use to adjust the ICSEA values?  How many were adjusted and which ones?  How many were not adjusted and how big did the discrepancy have to be before it attracted attention?  What does any of this say about the integrity of the My School website?

If ACARA could not use actual student data for legal or logistical reasons, then the “statistically similar schools” comparison should not have been set up in the first place.  Of course, the Deputy Prime Minister wanted parents to be able to compare schools on their NAPLAN results, and it had to appear to be “fair”.  Was ACARA under pressure to get this site up and running this year, whatever the cost in terms of accuracy?

The NSW Principals’ Council has urged ACARA to redevelop the My School website to give parents and the public more – and far richer – information about their schools than the present site does, without the simplistic school comparisons that label and mislead.  What is important is how many students are reaching benchmarks in their learning at your school.  Check other schools’ results by all means, but don’t expect some spurious number to tell you which schools you should look at, because no single number can – even if it’s calculated with the best information.

At the end of it all, education is not a sport.  Surrounding it with the kind of attitudes and statistical trappings better suited to the ABC Cricket Book is demeaning, both to the schools and to the communities they serve.

Bernie Shepherd, Retired Principal
19 March 2010
