NAPLAN, apples and oranges?

I am somewhat of a NAPLAN sceptic: see for example This is the Naplan post that wasn’t… and NAPLAN craplan… And on M’s anniversary. This year NAPLAN trialled online testing. In its FAQ, ACARA anticipates an issue with this but responds rather blandly: “Following extensive research undertaken by ACARA, NAPLAN online and paper forms have been explicitly designed to be comparable. Results for both paper and online tests will be reported on the same NAPLAN assessment scale for each test. The use of a common assessment scale, covering Years 3, 5, 7 and 9 in each of the areas of conventions of language, numeracy, reading and writing, allows for an individual student’s achievement to be mapped as the student progresses through his or her schooling.” The trouble, as I see it, is the cunning trick whereby a student’s responses in the online version actually add to and change the test itself, which makes me suspect the concern about comparability hasn’t really been answered.

As the Australian Education Union has said:

Victoria’s Education Minister James Merlino said he was “extremely concerned” about reports that the results from the pen-and-paper version of the test and NAPLAN online version may be “not comparable”. NSW Department of Education Secretary Mark Scott said that while parents would be able to see how their own children scored, ACARA might not be able to compare system-wide student performance from this year to last year or previous years.

The AEU has led the call for a comprehensive review of NAPLAN, and has been joined by parent and principal associations around the country. The AEU has also called on Minister Birmingham to immediately give a full explanation of what went wrong with the NAPLAN online trial, and whether the data comparison issue can be rectified.

In the Sydney Morning Herald:

Almost 200,000 Australian students sat NAPLAN online this year and the rest did a pen-and-paper version, but state education ministers and directors general are concerned that the two sets of results are not statistically comparable.

Ministers in two states said they were considering withdrawing from NAPLAN online until their confidence was restored….

The Herald understands the main problem with differing results relates to the grammar and punctuation test.

One of the innovations of NAPLAN online is that the test adapts to the child’s ability. If the students get the first set of answers correct, the questions get harder. These tests give a more accurate diagnosis of strengths and weaknesses.

But this year, strong performers in the reading test were given difficult questions from the beginning of the grammar and punctuation test. They did not get the so-called “confidence building” questions, a key part of test design.

The students who sat the written version did have those confidence-builders. As a result, the top-performing students in schools that ran the test online did not perform as well as the students who sat the written version.

Because the online version is more accurate, it also more effectively separates the very top and bottom-performing students across all the tests, so some of the highest performers might appear to have not performed as well as they did last time….
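
The branching mechanism the Herald describes can be illustrated with a toy sketch. Everything here is an illustrative assumption on my part (the function name, the difficulty scale, the one-step-up/one-step-down rule), not ACARA’s actual test design, but it shows how a different starting difficulty changes the whole sequence of questions a student sees:

```python
def adaptive_question_sequence(answers_correct, start_difficulty=1):
    """Toy model of a branching test: return the difficulty of each
    question served, given whether each answer was correct.
    Difficulty rises after a correct answer and falls (floored at 1)
    after an incorrect one. Purely illustrative, not ACARA's design."""
    difficulty = start_difficulty
    served = []
    for correct in answers_correct:
        served.append(difficulty)
        difficulty = difficulty + 1 if correct else max(1, difficulty - 1)
    return served

# A student who begins with easy "confidence-building" questions:
print(adaptive_question_sequence([True, True, True, False]))
# The reported glitch: strong readers were started on hard items,
# so they never saw the confidence builders at all:
print(adaptive_question_sequence([True, True, True, False], start_difficulty=4))
```

In this toy model, two students giving identical answers see entirely different question sets depending on their entry point, which is exactly why comparing their raw results is fraught.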

Apples and oranges?

See also NAPLAN is dangerous and limited: expert panelists, and NAPLAN results delayed over concerns national data could be invalid.