How to privatise the education system without people noticing – a step-by-step guide.


  1. First, design a performance system that compares every school’s outcome against the national average – thereby ensuring that there will always be a large proportion of schools whose results are deemed ‘broadly average’ (and a chunk of schools deemed significantly below average).
  2. Create an Inspection system that will declare any school that has consistently attained below-average results to be Inadequate.
  3. Where schools are found to be Inadequate, claim that this is proof that being “controlled by the local authority” is not working. Hand over all the publicly funded assets of such schools (buildings, land and all) to a private company – along with a huge cash boost to cover “conversion costs”.
  4. Wipe the slate clean, in terms of the previous years of poor results (because this is now a different school, so those legacy results no longer apply) and re-inspect it before the next set of results are published. With no official results to go on, declare that the school is now Good – now that it is no longer “controlled by the local authority”. (NB it helps if you have a very cosy relationship with the media, so that no-one actually explores the fact that schools haven’t been “controlled by the local authority” for years. Of course, you know full well the reality is that Headteachers and Governing Bodies determine how to run their school, including full control of the budget. And all statutory requirements, such as the National Curriculum, SATs tests etc, are determined by national – not local – government. Make sure this is not discussed by the media.)
  5. Change the Inspection system so that schools attaining average results are now deemed to “Require Improvement”, rather than be considered Satisfactory. (Sell this idea to the media using the very noble-sounding intention that you want every school to be a ‘Good’ school. Do not mention the fact that this is statistically impossible when you rely upon a norm-referenced data system.)
  6. Change the law so that schools that have consistently been judged to “Require Improvement” must now be removed from “local authority control” and handed over to a private academy trust.

Result: you now have a majority of ‘state’ schools owned and run by private companies, which can control decisions about school policy, curriculum, resources etc in such a way that profits are maximised. (NB the ‘academy trust’ itself may be not-for-profit, but it can have close links to educational suppliers, publishers, contractors etc.) Private investors can get richer, and those evil local authorities are decimated. (Handy hint: the canny politician can make a bit of extra income here, by investing in such companies prior to handing the assets over. Just watch those share prices soar.)

Follow these simple steps and you can achieve this aim without anyone realising that billions of pounds worth of public assets have been given away to your business associates.

If you have a particular ideological axe to grind, and you want to make things tougher for the socially and economically deprived areas of the country, abolish any data system that compares children’s progress with the progress made by children in schools in similar circumstances (a “contextual” approach), and insist instead on a system that ignores any such external factors. This will mean that schools that might have been considered to be doing “relatively well given their challenging circumstances” will likely be relegated to being ‘not significantly better than national average’. This will ensure a large stock of ‘Requiring Improvement’ schools situated in economically more deprived areas, where it will be easier to pursue the academisation agenda without parental protest.

You can further push the knife into these schools by giving them a poisoned chalice of extra cash (‘Pupil Premium’) for all the pupils that are from poorer families – but state that unless this money is spent in such a way that raises the attainment of the poorest children to equal the ever-rising national average, the leadership and management of said school will be deemed Inadequate. For a hard-working Head trying their best to serve a deprived community, it’s a lose-lose situation.

Before you know it, all those loony lefty local authorities up and down the land, who seem intent on trying to serve communities and help the disadvantaged, will have been cut back to nothing, and our schools will be safe in the hands of private enterprise and market forces.


Another sensationalised piece of garbage about school standards

So the London Evening Standard responds to this year’s publication of the Primary Performance Tables (otherwise known as ‘league tables’) with the headline ‘20% of London pupils leave primary school unable to read or count’.

Utter garbage.

I’m not sure whether Anna Davis, the article’s author, is deliberately trying to mislead the public or whether she has a gross misunderstanding of the facts – but either way, liar or moron, this is appalling journalism.

It would have been very easy to put a positive spin on this story – easy, because it is a positive story. The headline could have read:

‘Best results ever for England’s primary schools’ or ‘London children achieving above national average’.

But no – why tell a positive story about the excellent results achieved this year, thanks no doubt to the efforts of an extremely hard-working teacher workforce, when instead you can contribute to the media’s beloved pastime of teacher-bashing? The reality is that national standards at the end of primary school have never been higher than they are now. The pressure on schools to perform keeps increasing, the Ofsted framework is constantly being rewritten to raise the bar higher, the level thresholds of the KS2 tests keep rising – and yet, despite all this, 75% of our 11-year-olds, with the help and dedication of their teachers, have managed to achieve at least Level 4 in all 3 core subject disciplines of reading, writing and mathematics. Higher than ever before – although the Performance Tables do not provide a fair comparison, as previous years’ results are based on Level 4+ in English and maths, which could have been achieved without Level 4 writing. (To be fair, Anna Davis does make this point, albeit in her 5th paragraph.)

Let me just spell out that statistic again, for the likes of Anna Davis. That figure of 75% (or 79% of all London children) represents the proportion of children that have gained at least Level 4 in reading and writing and maths. If a child failed to get Level 4 in just one of those areas, by just a single mark on the test paper, they would form part of that 25% that did not meet the standard. Bear in mind that in reading the threshold for Level 4 was increased by one mark, so you could have children who, based on the 2012 mark scheme, would have got the Gold Standard Level 4+ in R, W & M, but fell short in 2013 because the bar was raised.

And what of those children that didn’t meet the standard? Is it correct to say that they are leaving primary school “unable to read or count”? Of course not. The majority of the general public of course has no idea about the curriculum requirements to achieve Level 4 in the Key Stage 2 tests, so it is all too easy for a journalist to make such a claim and for the general public to accept it unquestioningly – but believe me, it is considerably more demanding than ‘being able to read and count’. I suspect many adults would fail to achieve Level 4 in the reading test, as it is more than just a case of merely reading a passage and demonstrating a basic comprehension. The level of reasoning, critical thinking and analysis of text is way beyond anything I remember doing when I was at primary school. I have my doubts that Anna Davis herself would pass, given the lack of understanding and logic she has displayed in her article. Likewise, Level 4 maths is a bit more complex than ‘counting’.

So what other inaccuracies has Ms Davis trotted out? Let’s go through her article line by line.

More than one in five children in London left primary school this year without reaching basic levels in reading, writing and maths, new figures reveal today.

Well, if by ‘basic levels’ you mean Level 4, then this is true. But as I have argued, the use of the word ‘basic’ to describe Level 4 is debatable.

Almost 17,000 11-year-olds failed to get to ‘Level 4’, which is considered the minimum benchmark, in their SATs tests in all three subjects, according to analysis by the Evening Standard.

OK. Out of interest, what proportion of these 17,000 pupils would you actually not expect to have reached the standard? For instance, this figure includes pupils with all manner of special educational needs or disabilities, and also includes pupils that have recently arrived from abroad, not speaking the English language. Such children would be highly unlikely to achieve Level 4 in all 3 core subject areas. What would be more useful would be to see how the overall proportion of children reaching the standard compares with other areas nationally (and, as has already been established, London exceeded the national average), and then to filter within that to look at pupils by sub-group, e.g. those whose first language is English, those with/without special educational needs etc. But that would require some data literacy.
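To show what that kind of data literacy actually involves, here is a minimal sketch in Python on invented pupil records (the field names and figures are mine, not the Standard’s or the DfE’s): a pass rate computed overall, and then the same calculation filtered by sub-group.

```python
# Invented illustration: each record says whether a pupil met the Level 4
# standard, plus two sub-group attributes (EAL = English as an additional
# language, SEN = special educational needs).
pupils = [
    {"met_standard": True,  "eal": False, "sen": False},
    {"met_standard": True,  "eal": True,  "sen": False},
    {"met_standard": False, "eal": True,  "sen": False},
    {"met_standard": False, "eal": False, "sen": True},
    {"met_standard": True,  "eal": False, "sen": False},
]

def pass_rate(group):
    """Proportion of a group meeting the standard (None for an empty group)."""
    return sum(p["met_standard"] for p in group) / len(group) if group else None

print("overall:", pass_rate(pupils))
print("no SEN, first language English:",
      pass_rate([p for p in pupils if not p["sen"] and not p["eal"]]))
```

A headline built on the first number alone hides everything that the second kind of breakdown would reveal.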

Our league tables, published today based on data from the department for education, reveal that 21 per cent of children who took the tests started at secondary school in September already behind their peers. This compares to last year when 18 per cent of London children failed to reach Level 4.

Fairly meaningless statement. There is a normal distribution of levels of attainment. Unless you happen to be a child at the very top end of that distribution, then it could be said that you are ‘behind your peers’. This will always be the case. Children at level 5 are behind their peers too, given the numbers of children now achieving level 6 (but don’t let the facts about higher-than-ever levels of attainment get in the way of a good story).

But children in London schools continued to outperform those in other areas. In London, 79 per cent of pupils reached the benchmark this year, compared to 75 per cent nationally.

And now we get the good news – long after the casual glancer of the article has lost interest in reading to the end, having made their mind up that standards are clearly shockingly low.

The exams this year, which children took in May, were tougher to pass than in previous years. For the first time 11-year-olds had to reach Level 4 in both a reading and a writing test.

Indeed. I might add to that last sentence ‘in order to be counted in this particular statistic’. Any implication that, in the past, teachers didn’t need to bother about getting their children to level 4 in both reading and writing would be nonsense.

A spokesman for the Department for Education said the tougher target was brought in to drive up standards and end “years of entrenched failure”. Schools must ensure at least 60 per cent of children pass all three tests and meet pupil progress measures to pass the government’s floor target.

Years of entrenched failure. LOL. It’s just laughable. Ask any parent of a child currently at primary school (particularly upper KS2) – they will on the whole be amazed at the work their children are doing in school and remark that they are studying concepts that used to be taught at secondary school. No doubt about it – expectations of what should be learnt in the primary phase have risen dramatically over the last 2 decades. ‘Years of entrenched failure’ – just ridiculous.

The data reveals 81 London schools are in danger of falling below this goal. It means they are at risk of being put into “special measures” by Ofsted and turned into an academy.

This would be a good place for a serious education journalist to raise a discussion about whether forced academisation actually makes the slightest bit of difference to a school’s effectiveness.

Nationally, 767 primary schools fell below the floor target.

Including some academies!

A spokesman for the Department for Education said: “Schools with a long history of under-performance which are not stepping up to the mark will be taken over by an academy sponsor. The expertise and strong leadership provided by sponsors is the best way to turn around weak schools and give pupils there the best chance of a first-class education.”

Is this claim about “the best way” based on any kind of evidence? And who is this nameless DfE spokesman? Can we have some proper referencing please?

According to the tables, the best school in London is Grinling Gibbons primary school in Lewisham. It was one of 40 London schools with a 100 per cent pass rate for all three tests, and also scored well in the “value added” category which judges how well pupils improve after joining.

Well done Grinling Gibbons. But also well done to all the other schools that have done very well, but not been singled out by this writer (who, by the way, seriously needs to look up the definition of value added).

Harris Primary Academy Coleraine Park in Haringey was praised by the department for education after results improved by 23 percentage points.

Of course the flipside of being one of the ‘most improved’ schools is that it really just emphasises how poor you were last year. Schools that are consistently good can never show huge rates of improvement.

Today’s results come after Ofsted’s first annual report about London found that inspection outcomes in the capital are the best in the country, and that poorer pupils do well.

THIS IS GOOD NEWS! This could have been your headline.

But it also warned that children who do well at the age of 11 do not always go on to do well at GCSE.

Well, duh! Sometimes children might not make great progress between Year 6 and Year 11. Tell us something we don’t know. On the other hand, most of them do.

So, what has Anna Davis achieved with her piece? Chances are, if you were already of the mindset that standards are terrible, you will have just looked at the headline and felt that it confirmed your view. If, like me, you think that teachers and children are achieving amazing things, you will just have been frustrated at how well buried within the article the positive facts lay. And if you are a primary school teacher you will probably have automatically carried out an assessment of the standard of Ms Davis’ writing and placed it somewhere within level 3.

79% of London’s 11-year-olds could have done a better job.

The Travesty of Unintended Consequences: the abolition of National Curriculum levels

Since the Education Reform Act of 1988, we have had in the UK a National Curriculum, comprising a core set of knowledge and skills that all schools are expected to teach. Along with this curriculum came the political desire for school accountability: a system of final summative assessments at the end of each Key Stage, so that pupils’ learning could be assessed in a consistent manner and schools could be ranked in national league tables.

Thus were born ‘levels’. Each level in each curriculum subject was defined in terms of the skills and knowledge that pupils should acquire, organised into a hierarchy – i.e. the skills ascribed to ‘level 3’ build upon those labelled as ‘level 2’, which in turn build upon level 1. Expectations were set. The standard expected at the end of Key Stage 1 (7-year-olds) was Level 2; the standard at KS2 (11-year-olds) was Level 4.

And that’s pretty much been the system for the last 2 decades. The curriculum has been tinkered with from time to time, but the basic concept has remained the same.

The system has certainly had its critics – but criticism is mostly around the nature of league tables and the negative effects of high-stakes assessment, such as narrowing the curriculum to focus only on those areas that are tested, and ‘teaching to the test’ rather than providing quality learning opportunities. These are very good arguments, but they are not the focus of this blog post. I’m just setting the scene here to give a bit of background to the situation we find ourselves in currently. As I say, high-stakes assessment (‘SATs tests’ as they are often referred to) and league tables of results have attracted their critics. But what about the levels themselves? Are they the problem? This Government seems to think so. As they state here on their website, “this system is complicated and difficult to understand, especially for parents”. Really? Level 4 is more difficult than Level 3. Confusing? “It also encourages teachers to focus on a pupil’s current level, rather than consider more broadly what the pupil can actually do.” Well no – I would argue that it is the high-stakes assessment and league-table culture, which is based entirely upon pupils’ results in English and maths tests, that causes (in some cases) teachers to focus only on the level (in those subjects) and not on pupils’ broader achievements. Not the levels – the system.

It is worth knowing a little more of the background to the stance the DfE has taken (especially as it makes a mockery of what they are suggesting instead). Many professional and academic bodies and individuals were consulted by the DfE shortly after it took office in 2010, regarding changing the curriculum and the system of assessment. I would recommend this response in particular from the Cambridge Primary Review. It highlights the tension between formative assessment (the everyday judgements that teachers make about how pupils are learning, which determine the feedback that they give to pupils and inform their planning of future learning experiences) and summative assessment (evaluating what has been learnt – usually at the end of a particular unit of teaching). These two sides of assessment can get in the way of each other. Formative assessment (often called ‘AfL’ – Assessment for Learning) is all about pupils understanding what is next for them in their learning journey and how they are going to get there. It does not depend upon pupils knowing what level they are working at – rather, it is about them understanding their next steps, e.g. ‘I need to use more adjectives in my writing to set the scene effectively’ or ‘I need to check that my answer to a mathematical problem makes sense in the context of the problem’. In other words, specific bits of learning. Not, ‘I am at this level and my target is to get to the next level’. However, the summative assessment agenda – SATs and league tables – has caused many teachers (and Ofsted inspectors) to become confused, such that it is not uncommon to find primary school teachers regularly marking children’s work and putting a level on it, as well as writing about what was done well and where the pupil could improve.
This runs contrary to a wealth of research (from Butler, 1988, to Black & Wiliam, 1998, and beyond) demonstrating that when a pupil is given a numerical or ranked judgement of their work (a mark out of ten, a grade, a level) alongside some constructive comments about how to improve, they do not take the comments on board. All they see is the mark, grade or level. This is therefore not good practice for day-to-day learning experiences, where the main objective of feedback is to help the learner move on in their learning.

So we have seen over recent years a culture of level obsession – pupils and their parents regularly being told what their level is, which is often detrimental to their learning and encourages a sense of ranking and comparison, as pupils (and their parents) quickly realise that other children in the class are on a higher level than they are. Apart from those at the top of the class of course, who can also be adversely affected if they start to develop the ‘fixed mindset’ of ‘I’m doing well because I’m clever’, which can lead to feelings of anxiety – ‘What if the next piece of work I do isn’t levelled as highly as this one? I wouldn’t be clever anymore…’  (See the fantastic work of Carol Dweck to explore this more.)  So despite the fact that regularly assessing and telling children their levels undermines good formative assessment – and it is good formative assessment that helps pupils learn best – this is going on anyway, often because teachers think it is expected of them.

I suspect that it was this unfortunate situation, which bodies such as the Cambridge Primary Review managed to explain to the DfE, that then led to the DfE’s knee-jerk response: ‘We’re abolishing the levels’. Not a response of “Oh, I see now why the high-stakes SATs tests and league tables might actually be a bad thing”, or even a response of “I see that some people are using levels in a way that they weren’t originally intended – we must put out a statement explaining how they should and should not be used” – no, just abolish them. With no idea at all how to replace them. A complete vacuum. The politicians still want accountability measures. They still want to be able to publish league tables. They just don’t know how to measure what pupils have learnt, if not by using levels.

Until now.  The Primary Assessment & Accountability proposals have now been published, open to consultation until October 11th 2013.  What is the suggested approach to informing pupils and parents whether they have reached a good standard or not, to replace ‘Level 4’ or ‘Level 5’ etc? Telling them which decile they are in, based upon a national distribution.  Is saying to a child “Well done, you’re in the top 10%” any better than saying “Well done you got Level 6”?  Is saying to a child “You are in the bottom 10% of all pupils” better than saying “You have reached Level 3”?

I would argue that there is a fundamental difference, and that is this.  As I stated earlier, the level descriptors mean something in absolute terms. They relate to a given set of knowledge and skills. Anyone can find out what these are, by either talking to teachers or using a search engine. It is theoretically possible for every child in the land to reach Level 4 at the end of primary school, if they all acquire the necessary learning.  But of course (fundamentally obvious point coming up, but one often forgotten by politicians and the media) it is never going to be possible for all children to achieve in the top 10% or top 20% or any other measure that is defined relative to the distribution.  (We all remember the classic line about how shocking it is that we still have almost 50% of pupils achieving below average – although I can’t actually remember who said it…)
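The difference is easy to demonstrate with a few lines of Python. This is a sketch on simulated scores (the numbers are invented): a criterion-referenced pass rate can rise when every single pupil improves, but a norm-referenced measure, by construction, always leaves the same fraction at the bottom.

```python
import random

def criterion_pass_rate(scores, threshold):
    """Criterion-referenced: everyone can, in principle, clear a fixed bar."""
    return sum(s >= threshold for s in scores) / len(scores)

def bottom_decile_fraction(scores):
    """Norm-referenced: the fraction falling below the 10th-percentile cutoff."""
    cutoff = sorted(scores)[len(scores) // 10]
    return sum(s < cutoff for s in scores) / len(scores)

random.seed(0)
cohort = [random.gauss(100, 15) for _ in range(1000)]
improved = [s + 20 for s in cohort]  # every single pupil gains 20 marks

# The fixed-threshold pass rate genuinely rises...
print(criterion_pass_rate(cohort, 100), criterion_pass_rate(improved, 100))
# ...but the bottom decile is still the bottom decile, however high scores go.
print(bottom_decile_fraction(cohort), bottom_decile_fraction(improved))
```

However hard the whole cohort works, the second measure cannot budge – which is the whole point.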

So the proposed new system is one in which it is not possible for every child to be judged to have achieved at a predetermined acceptable level.

And, by implication, it will never be possible for all schools to be judged by Ofsted as having ‘good levels of achievement’, so Mr Wilshaw’s vision of ‘a good school for every pupil’ will remain a mathematical impossibility.

It’s a desperate situation. There are many losers here. Ultimately pupils, but also teachers and headteachers. Who would want to be a leader of a school in difficult circumstances, in the knowledge that the odds are stacked against you – the chances are that, however hard you work, however hard your teachers and pupils work, you are unlikely to ever move from the bottom half of the national distribution to the top half, because everyone else is raising their game too. (See this blog post from Mike Treadaway of FFT for an eloquent exposition of this, and also this piece from the Headteacher at King Edward VI Grammar School, Chelmsford.) And who would want to be the pupil, told at age 11 that, however hard they tried, there are still 70% or 80% or 90% of pupils doing better?

So this is where the Unintended Consequences come in.  The argument to get rid of levels may have arisen out of a very sensible and noble desire – a desire to get back to classrooms that are focused on learning – pupils acquiring life skills, developing as social human beings, collaborating with peers rather than competing against them.  But it has led to this. One of the most divisive ideas this current Education Department has come up with yet (and there’s some stiff competition).

This is a call to arms. Whether you work in education or not, whether you have school-age children or not, I would urge you to respond to this DfE consultation. In my opinion, of the many consultations the DfE has launched during this term in office, this is the most important.  We need to overwhelm them with the sheer volume of opposition. There is a lot at stake for our children.  Even more than when they sit their SATs tests.


You can stick your stickmen!

Throughout 2006 and most of 2007, the Conversion data aspect of the RAISEonline reports was missing. When schools produced their Full Reports, there was a page set aside for this data, but all it showed was a message saying that the data was not yet available.

Well, it has finally appeared – although it’s not quite what people might have been expecting. In place of the old-style conversion data – a table showing all the different percentages of pupils converting from one level to another between Key Stages – we have a diagram of stickmen (see below), each stickman representing a pupil (or 1% of pupils where cohorts exceed 100). These stickmen are colour-coded according to the progress they have made during the previous Key Stage.

Stickmen diagram

But this diagram gives far less information than the old conversion data. It says absolutely nothing about the amount of progress made by pupils who have reached the Government’s minimum expectation (e.g. Level 4 at Key Stage 2) – despite the fact that some of these pupils could have made poor progress. And it also says nothing about the amount of progress made by pupils with special educational needs, if their level of attainment is 2 levels or more below the national standard. It is only those pupils whose attainment is exactly one level below the expectation whose progress is indicated by colour-coding.

In other words, whoever has designed this report as a replacement for the old conversion tables has substituted very basic simplicity for useful and inclusive detail. Is the implication that people didn’t understand the previous data? In terms of the National Curriculum attainment targets for Data Handling, we’ve moved from data that required Level 4 skills to read it (a table of percentages) to data pitched at Level 2 (essentially a pictogram where one symbol equals one person). Inevitably, a lot of the usefulness has been lost. The phrase “dumbing down” springs to mind.
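For comparison, here is roughly what the old-style conversion table did, sketched in Python on invented pupil data (the level pairs are made up for illustration): a full cross-tabulation of KS1 level against KS2 level – exactly the detail the pictogram throws away.

```python
from collections import Counter

# Invented (ks1_level, ks2_level) pairs, one per pupil.
pupils = [
    ("2", "4"), ("2", "4"), ("2", "3"), ("3", "5"),
    ("3", "4"), ("1", "3"), ("1", "2"), ("2", "5"),
]

counts = Counter(pupils)
ks1_levels = sorted({ks1 for ks1, _ in pupils})
ks2_levels = sorted({ks2 for _, ks2 in pupils})

# Print each KS1 row as the percentage of that row converting to each KS2 level.
print("KS1\\KS2   " + "    ".join(f"L{l}" for l in ks2_levels))
for a in ks1_levels:
    row_total = sum(c for (x, _), c in counts.items() if x == a)
    cells = [f"{100 * counts[(a, b)] / row_total:3.0f}%" for b in ks2_levels]
    print(f"  L{a}    " + "  ".join(cells))
```

Every cell in that table answers a question the stickmen cannot – including how much progress was made by pupils who did reach the expected level.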

RAISEonline continues to raise a laugh.

Following on from last week’s post about RAISEonline, I can now report that we have this week entered Phase 1 of the release of the Data Management aspect of the software – hooray! – which means that schools can now create their own pupil attribute fields, by which they can then filter the reports. But you’ve got to be pretty good at working with different sorts of files to do so. Standard pupil details (name, date of birth etc.) can all be uploaded to RAISEonline via CTF (the Common Transfer File, an XML format), but the school-defined stuff has to go in via a CSV, which you can create in Excel, although it’s not the standard Excel file type. And it only works if you’ve set up the field names exactly as per the names you’ve specified in another area of the program. (And even when you’ve done it, the filtering that you can do is not very advanced.)
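To make the exact-match requirement concrete, here is a hedged sketch in Python – the field names are invented, and the real RAISEonline upload no doubt checks more than this – of the kind of validation that trips schools up: the CSV header has to reproduce the defined field names character for character.

```python
import csv

# Hypothetical school-defined attribute fields, exactly as entered elsewhere
# in the program (invented names, for illustration only).
defined_fields = ["UPN", "FSM_Ever", "Intervention_Group"]

def header_matches(path, expected):
    """True only if the CSV header equals the defined names exactly:
    same spelling, same case, same order."""
    with open(path, newline="") as f:
        header = next(csv.reader(f))
    return header == expected

# Write a CSV with a matching header and one pupil row.
with open("attributes.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(defined_fields)
    writer.writerow(["A123456789012", "Yes", "Group1"])

print(header_matches("attributes.csv", defined_fields))
# A single lower-case letter in one field name is enough to fail the check.
print(header_matches("attributes.csv", ["upn", "FSM_Ever", "Intervention_Group"]))
```

Which is precisely the sort of brittleness that makes this hard going for schools without a confident data person on the staff.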

You can also enter Question Level data – this time it must be via a standard Excel file (xls) – but (and here’s the comedy) having entered the information, there’s absolutely nowhere that you can view any tables, charts or graphs of it. So what exactly is the point? Maybe that will be in Phase 2.

I foresee a lot of schools having a lot of problems with this – especially primary schools, many of which simply do not have any members of staff sufficiently confident in manipulating CTFs, CSVs and XLSs to be able to do this stuff easily.

Again I come back to the point I’ve made on a previous occasion – RAISEonline is not as easy or intuitive to use as PAT, does not offer as many different options as PAT, and its reports are not as well designed as those in PAT.

But then, as they say, that’s progress.

RAISEonline – no raison d’être.

What an absolute farce RAISEonline continues to be. For anyone out there that doesn’t know, let me give you the background to this ridiculous tale. There is a perfectly excellent piece of software used by many schools for analysing their data, commonly known as PAT (Pupil Achievement Tracker), which was produced by the DfES (now DCSF) in 2003. In addition to this, every year schools received a “PANDA” report (Performance AND Assessment), which was produced by Ofsted.

At some point in 2005 a decision was made to merge these 2 concepts into a web-based tool that would contain (we were told at the time) “all the functionality of PAT” as well as being the vehicle by which schools’ PANDA reports would be produced.

Sounds OK in principle.

And thus was born RAISEonline, the bastard child of PAT and PANDA.

For the last 15 months this woefully inadequate piece of programming has continually failed to live up to expectation, with published timelines for the launch of its full capability being constantly re-written, in the vain hope that we won’t notice that another missed deadline has slipped by.

Initially PAT was going to be discontinued last summer, with RAISEonline due to take over. Luckily at least someone at the DfES had the common sense to throw PAT a lifeline and continue to provide the upgrades necessary for schools to be able to keep using it up to and including the present time. Without it, schools would have been scuppered.

Over the last year I have been regularly making enquiries about some of the promised but as yet undelivered aspects of RAISEonline – such as the ability for schools to analyse pupil results by their own customised attributes, and to analyse test papers at question level – features which were originally promised for the July 2006 launch of the software. Only last June I was assured they would definitely be there within the month. Are they present yet? No.

And even if the full capability of the software had been made available, the program is still not a patch on PAT. The graphs are nowhere near as clear, and many of the handy little features that PAT contained are missing – like the ability to quickly identify pupils on the scattergraph by simply hovering the mouse over the plotted point. And woe betide you if there are 2 or more pupils at the same point on the scattergraph. In RAISEonline it is utterly impossible to identify them. The full functionality of PAT? I don’t think so.

And then of course you realise that – whilst it may have sounded like a great idea for Ofsted and the DfES to work more closely together on the provision and use of data – they don’t actually seem to be interested in exactly the same set of indicators. So for all those schools that had identified through PAT a need to focus on Writing at Key Stage 2: now that RAISEonline no longer shows English split into Reading and Writing, if you were hoping to be able to measure the impact of what you’ve been doing, I’m afraid you’re screwed.

And don’t get me started on the utterly ludicrous fact that the techie people who have programmed the software clearly didn’t understand the order of National Curriculum levels (i.e. 1 is lower than 2c, which is lower than 2b, which is lower than 2a, which is lower than 3), so when you sort pupils into ascending order of level it comes out all wrong (1, 2a, 2b, 2c, 3). I don’t blame the programmers – the fault must lie with whoever has briefed them. But it just beggars belief that the software can be launched into the public domain whilst containing such a basic error. An error which, I might add, I reported to them 12 months ago now and which has still not been fixed.
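For what it’s worth, that bug looks exactly like a plain string sort – alphabetically, ‘2a’ comes before ‘2b’ and ‘2c’, the precise reverse of the curriculum order. A few lines of Python (my sketch, not RAISEonline’s actual code) show both the wrong result and the one-line fix: sort on a lookup of the true order rather than on the label itself.

```python
# National Curriculum levels in their true ascending order.
LEVEL_ORDER = ["1", "2c", "2b", "2a", "3", "4", "5"]
RANK = {level: i for i, level in enumerate(LEVEL_ORDER)}

levels = ["3", "2a", "1", "2c", "2b"]

print(sorted(levels))                 # plain string sort: 1, 2a, 2b, 2c, 3 (wrong)
print(sorted(levels, key=RANK.get))  # rank lookup: 1, 2c, 2b, 2a, 3 (right)
```

A one-line fix – which rather underlines how inexcusable it is that the error has sat there unfixed for a year.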

So where are we now? Still waiting for the full functionality, but the program does at least now contain the full set of data (Key Stages 1 – 4) for Summer 2006. It only took a year. How long will we have to wait for the analysis of the 2007 tests? Who can say? But don’t hold your breath…


KS2 SATs results – time for the annual farce!

And so we’ve reached that time of year again. That long-dreaded moment when the end-of-Key Stage 2 SATs results are delivered to all the primary schools in England and, once again, teachers up and down the land have their suspicions confirmed – that the system is a farce, a complete and utter waste of time.

Because yet again I’m hearing tales of very poor standards of marking – particularly in the creative writing papers – with pupils in some cases coming out a whole National Curriculum level lower than their teachers have assessed them to be working at. And in my experience, most of the time that is not due to teachers over-inflating their assessments. Teacher assessments are easily the most reliable and valid measurements of children’s attainment, particularly where there are rigorous procedures in place for cross-checking each other’s judgements. The ‘snapshot’ judgement of a SATs test, in which kids are often required to write about the most uninspiring of topics, is (by comparison to a teacher’s own assessment based on robust evidence) a poor reflection of a child’s ability.

All of which wouldn’t be such a big deal, were it not for the following points:

1 – such enormous emphasis is placed on these results by bodies such as Ofsted, whereas the teacher assessment data seems to be largely ignored (the dropping of teacher assessment data from the Ofsted ‘PANDA’ reports in 2004 sent out a clear signal that their significance had been devalued)

2 – the secondary schools to which these pupils are about to transfer tend, on the whole, to value the KS2 test data more than the teacher assessments, which can lead to the lowering of expectations etc – either that or they just take the view that the whole data-set is too inconsistent to be of any value, so the kids have to be re-tested on entry to secondary school to establish a baseline (i.e. a further waste of time)

3 – where tests have been poorly marked, the system of applying for re-marking only allows for cases where the overall subject level could potentially change, not where the issue is just a possible change in level of one constituent part of the subject (e.g. Writing, being just one part of the total English level). Yet in the calculation of a school’s ‘Value Added’ score, every single mark counts. So schools have been advised that they can make their own in-house changes to the marks without having to bother with the official remarking process – making a mockery of the whole system.

In short, therefore, these statutory national tests are a complete joke. And the travesty is that teachers of Year 6 pupils find themselves obliged to spend enormous amounts of time preparing and practising for these all-important tests, all of which erodes valuable curriculum time when these kids could actually be experiencing something of value. So our children are losing out under this system. And being placed under inappropriate amounts of pressure at the same time. And all at the age of only 11. The children of this country are the most over-tested in the world, according to a recent GTC statement. And it stands to reason that the more time you spend weighing your pig, the less time you have to feed it. Teachers should be able to spend their time teaching, not testing. The sooner we can do away with these high-pressure tests – and with league tables while we’re at it – the sooner teachers can get back to the business of teaching a rich and stimulating curriculum, nurturing children’s talents and preparing them for the real world.