CHAPTER 3
The Impacts of NAPLAN
Unintended Consequences
3.1
The committee's interim report cites examples presented by a number of
submitters to the inquiry showing that a range of unintended consequences has
emerged as a result of NAPLAN testing. These include negative consequences
such as a narrowing of the curriculum or 'teaching to the test'; the creation
of a NAPLAN preparation industry, which compounds the perception that NAPLAN is
a 'high stakes' test; and adverse impacts on students.[1]
Teaching to the test
3.2
The Australian Curriculum, Assessment and Reporting Authority (ACARA)
acknowledges that there have been accusations that NAPLAN has resulted in a
number of unintended consequences. However, it does not accept there is evidence
that negative unintended consequences are endemic. ACARA also contends that
some perceived unintended consequences are a result of a misconception of what
NAPLAN is designed to achieve:
It is also important to note that some of these reports of unintended consequences have not accurately contextualised the purposes of the program and attributed unrealistic expectations to what NAPLAN should achieve. By way of example, the teacher survey undertaken by Murdoch University in 2012 invited participants to respond to statements such as: ‘NAPLAN promotes a socially supportive and positive classroom environment’ and ‘NAPLAN has meant that students have control over the pace, directions and outcomes of lessons in my class’. Both of these aspects of classroom environment and curriculum planning are clearly the responsibilities of teachers.[2]
3.3
The Australian Primary Principals Association quoted from an independently conducted survey of primary school principals that showed the curriculum can be altered, even inadvertently, in preparation for NAPLAN tests:
'Teachers, despite knowing that they should not be teaching to the tests, do alter the regular curriculum delivery to ‘train’ the students in the peculiarities of the tests. Much time is given over even in the previous year to NAPLAN, to enable the students to have the best opportunity to demonstrate their skills and knowledge.'[3]
3.4
ACARA differentiates between negative consequences of preparing for the
NAPLAN test, such as replacing the broader curriculum with rote teaching from
NAPLAN past papers, and a welcome focus on the development of students'
literacy and numeracy skills.[4]
3.5
The Tasmanian Department of Education (DoE) submitted there was no
evidence across its schools that the curriculum was being narrowed as a
result of NAPLAN testing. The Department fully acknowledges that there have
been changes to teaching methods, but considers these changes 'entirely appropriate':
[T]he DoE has no evidence to suggest that there is a narrowing of the curriculum in our schools or that teachers are ‘teaching to the test’ as a result of NAPLAN testing. Whilst the data supports schools in identifying areas for improvement and explicit teaching to address these areas may occur, these processes are entirely appropriate for improving students’ learning outcomes.
In fact, the term ‘teaching to the test’ is often used to refer to just these situations where identification of areas for focus and then implementing appropriate strategies to address needs has occurred. Tasmania asserts that this represents pedagogically sound practice.[5]
3.6
The Department of Education, Employment and Workplace Relations (DEEWR)
concurred with the view that NAPLAN has had a transformative and positive
impact on a student's education:
The impact on teaching practice is profound, and has led to culture change at schools as teachers have learned to use data and see the positive effects on their student’s learning.[6]
Committee View
3.7
The committee accepts there are going to be changes in pedagogy when
something as radical as standardised testing is introduced. The extent of
these changes and the impacts on other aspects of the curriculum are issues
that need careful consideration when analysing the effectiveness of NAPLAN.
3.8
The committee is also concerned by the comments made by ACARA that
suggest unintended consequences can be the result of a misunderstanding of
what NAPLAN's intended consequences are. The overarching objectives of NAPLAN
are clear, but as discussed in the previous chapter, they are not clear at an
operational level. To suggest that changes in the classroom are not a result
of NAPLAN is not taking full responsibility for the profound impact that
standardised testing can have. This in itself is not a reason not to test, but
it is something that educational authorities need to be cognisant of in
providing support to schools as part of the NAPLAN process.
NAPLAN's impact on students
3.9
One of the most contentious aspects of NAPLAN's introduction is the
impact national testing has on students. The committee received evidence from
numerous submitters who had experience of students being affected by the
testing regime, with the majority reporting negative impacts.
3.10
The Australian Primary Principals Association commissioned CANVASS to
carry out a survey to gauge the views of primary principals across the
country. The survey, Primary Principals: Perspectives on NAPLAN Testing &
Assessment, found that 'sixty-six percent of respondents said
NAPLAN testing has a negative impact on the wellbeing of students.'[7]
According to the survey, 'the greatest impact of NAPLAN is on student
wellbeing'.[8]
3.11
While the survey did not find that all students suffered adverse impacts
from the testing, it did cite examples of the types of impact that some
respondents claimed the testing had on their students:
- Pressures surrounding NAPLAN trigger self-esteem issues and anxiety, leading to disengagement, absenteeism, apathy and behavioural problems e.g. playground fights
- Particular logistical difficulties for disabled students sitting the tests
- The demands of extra-curricular tutoring for NAPLAN impacting on student welfare
- Student boredom and a lack of enjoyment in the NAPLAN preparation.[9]
3.12
The Whitlam Institute also carried out research into the impact of the
testing on students. This research reported that educators, as well as
parents, are concerned about the effects on students:
The evidence from the data suggests that a large proportion
of educators are reporting that at least some students are suffering health and
well-being issues as a result of the NAPLAN. Difficulties include physical
responses such as crying, sleeplessness, and feeling sick, as well as
psychological responses such as an inability to cope emotionally, feelings of
inadequacy, and concerns about the ways in which others might view them.
Respondents also reported significant numbers of parents raising concerns about
the impact of the tests on their children’s well-being.[10]
3.13
Epping Heights Public School reported the biggest impact on its
students has been an 'increase in anxiety'. It highlighted potential long-term
issues with students' activities being focussed on two elements of learning at
the expense of other activities which are also important for children to
develop:
For these students their life experience is already affected by less time being given to developing interests, talents and creative thinking. In a rapidly changing world, students need skills to become life-long learners, adaptive and multi-skilled. How will students who have been coached to a narrow set of criteria succeed at a tertiary level and beyond to become well educated, creative and well-rounded citizens?[11]
3.14
The Australian Council of State School Organisations submitted that
feedback from parents has indicated that their students are adversely impacted
by the testing, and that this can be more evident in younger students and in
rural and remote schools:
Clearly we see students display signs of stress or sickness leading up to the NAPLAN testing regime, parents have expressed their concern with regard to their child either not wanting to go to school or are anxious about the exam.
Anecdotally the negative impacts on wellbeing appear greater in rural and remote schools, this we believe is due to the smaller sizes of the community and the ease of which whole school results can identify individuals.[12]
3.15
However, the committee also received evidence that, while research showed
stress was the most reported impact, this is the likely response to any
test and therefore perfectly normal:
While test anxiety is of concern, NAPLAN testing has in no
way created hysteria beyond what would be expected of any test situation. Being
anxious about a test is quite normal and probably a useful emotion that all
humans experience as part of life’s great tapestry. To mount a case that
somehow NAPLAN is damaging a generation of children says more about parenting
than it does about the test itself. I am yet to be at a school that doesn’t
make every effort to support children through NAPLAN in a positive and
encouraging manner.[13]
Committee View
3.16
The committee accepts that anxiety will be a factor for some students in
any testing environment. What makes NAPLAN of particular significance is the
perception of NAPLAN as a 'high stakes' test. This issue is considered later
in this chapter, and the committee reserves its views for that section.
NAPLAN's impact on students with specific needs
3.17
The committee received submissions from a number of organisations on the
impact NAPLAN testing has on students with diverse needs, including students
with disabilities, Indigenous students, and students from non-English speaking
backgrounds, including refugees and migrants.
Students with disabilities
3.18
Adjustments are permitted for students with disability to support their
access to the tests and facilitate maximum participation, and are intended to
enable access to the tests on an equivalent basis to students without disability.[14]
3.19
ACARA's submission explained that students with disability are
'encouraged to take part in NAPLAN', and special provisions to support
individual students with disability and special needs are outlined in the National
Protocols for Test Administration.[15]
However, submitters raised a number of issues with these arrangements that can
affect both schools and individual students.
3.20
One of the issues raised by many of these submitters is the danger that
resources for a school are influenced by NAPLAN test results, which can be
affected by a number of factors, often outside the school's control.
According to evidence received, there is pressure on schools to ensure
they receive the best possible test results, and this can lead to the exclusion
of students with special needs. Epping Heights Public School raised this
issue:
There are a number of students in our school with learning difficulties that require additional support but we receive very little support because the majority of our students achieve good results. We wonder if it is fair that these students are judged by the results of others in their cohort. Alternatively, it seems wrong that schools prevent students with difficulties from sitting the NAPLAN as it will ‘bring down’ their results.[16]
3.21
Submitters argued that the inherent comparative nature of national
testing is incompatible with including students with varying degrees and types
of special needs. Professor Joy Cumming urged caution in comparing results
across schools, or even classes within schools, for this reason:
Schools may use results to compare class results in terms of
raw performance. Such results do not provide information on class differences
in achievement, proportion of students with disability, or from language
backgrounds other than English. Therefore, comparisons across classes need to
be made with caution.[17]
3.22
Professor Cumming's submission goes on to discuss the range of
adjustments used in other testing environments around the world that can
level the playing field, but that have not been introduced in Australia to the
necessary degree:
NAPLAN administration at present makes inadequate allowance
for the appropriate assessment of students
with disability. They must participate in the standard
NAPLAN tests with a small range of adjustments.
...
The question still arises as to whether the current approach
to NAPLAN testing is compliant with the Disability Standards for Education
2005 or whether students with disability are being indirectly discriminated
against in current practices, through being expected to meet participation
requirements that they are not able to meet but that students without disability
can meet. The test forms do not enable students with disability opportunity to
demonstrate what they know and can do.[18]
3.23
Data is also published on the My School website about the level of NAPLAN
participation for each school, broken down by exempt, absent and withdrawn.
This data is compared to the national average.[19]
Significant work is currently being undertaken by all governments to implement
a Nationally Consistent Collection of Data on School Students with Disability.[20]
Students from a non-English speaking background
3.24
While the committee understands students from non-English speaking backgrounds
are exempt from the test if they have been in Australia for less than one year,[21]
it received evidence that for students arriving from areas of conflict, or who
have experienced traumatic events, this may not be sufficient to put them on a
par with other students.
3.25
A submission from the Multicultural Development Association (MDA) and
Townsville Multicultural Support Group (TMSG) cited research that found
students from refugee backgrounds are particularly vulnerable:
Students of refugee background are particularly vulnerable due to the significant interruptions to schooling and the social and psychological impacts of their refugee journey. Many students may have experienced forced migration, significant loss, violence, and trauma. These experiences impact on students’ ability to learn in the school environment and require a whole of school response.[22]
3.26
The Victorian Association for the Teaching of English also reported
negative impacts on students from non-English speaking backgrounds:
I was confronted with having to watch a number of intelligent
and capable EAL students (who had been in Australia just over one year) not
cope at all with all the tests and feeling frustrated and upset. In the end,
they became so frustrated that they started making jokes about it and treating
it as a waste of time. The Naplan tests are all very well for those who are
successful academically, for children who do not meet the set benchmarks or are
disadvantaged in some way, the tests are an attack on their self-esteem as they
reinforce the message that they are failing.[23]
3.27
The Australian Council of TESOL Associations was supportive of the
government's efforts to improve educational outcomes for all students, and
accepted that standards are necessary to ensure a certain level of education
across the country. However, they argued the NAPLAN tests are not an
appropriate mechanism for measuring the skills of students whose first language
is not English, or students who are culturally diverse from the educational
mainstream in Australia. According to their submission, this is because the
tests make the following assumptions:
- students’ proficiency in English relates to their maturity and their grade level in Australian schools
- students’ development in English follows an English-as-a-first-language pathway
- students have a knowledge base related to “the curriculum in each state or territory” (as stated on the NAPLAN website)
- students are urbanised
and sometimes that:
- students are from middle class Anglo-Australian backgrounds.[24]
3.28
Mr Leonard Freeman, Principal of the Yirrkala School in the Northern Territory,
contended that culturally specific content in the NAPLAN tests places students
from Indigenous and refugee backgrounds at a disadvantage. His submission
provides some examples of where a student without similar cultural experiences
would not be able to comprehend the scenarios in question:
The stimulus texts [...] focussed on included a narrative about
a paperboy and an advertisement for a movie which included the title of the
movie, session times, classification details and the price of admission [].
While cinema posters and newspaper deliveries are common place in cities and
town across Australia, there are no cinemas or delivery boys in remote
Indigenous communities.
...
The story begins with the householder complaining to the
newspaper boy ‘you left the paper jutting out of the back of my box’ and we
also learn the owner had previously complained the paper needs to be left ‘in
line with the fence’. This question was designed to test whether students could
infer the meaning of new words and constructions. Yet to do so the students
need to be familiar with the cultural context, in this case the students need
to know that houses have a box on their fence line where mail and newspaper
deliveries are left. If the student has grown up in a remote community or
refugee camp where there are no letter boxes and few houses have fences they
will not be able to access the meaning of the text.[25]
Committee View
3.29
The committee shares the concerns of some submitters that, while
provision is made for students with disability and special needs, those
students are still at a disadvantage compared with students without
disability. This in turn can lead to a distortion in test results that can
impact both the student and the school, especially if the results affect
resource allocation.
3.30
This is one of the reasons the committee is considering whether national
testing based on samples of schools and students may be a better option than
the almost universal tests currently carried out. It may be that adaptive
testing introduced through NAPLAN Online could tailor the test to the abilities
and progress of the individual student, and if this is the case, the committee
would strongly support this approach.
Recommendation 2
3.31
The committee recommends that when designing adaptive testing for NAPLAN
Online the needs of students with disability are taken into account.
3.32
The issue of NAPLAN testing for students from a non-English speaking
background is one that elicits strong opinions. The committee understands
students are given a one year grace period before they are expected to take
part in the NAPLAN process, but in the committee's view, this is not sufficient
to provide a level playing field with native English speakers.
3.33
The committee also notes the assertion by ACARA that 'test developers
must ensure NAPLAN tests are not culturally biased against Aboriginal and
Torres Strait Islander students'. [26]
While this is welcome, it still places students from a culturally diverse
background (which includes Indigenous students) at a distinct disadvantage.
Similarly to students with disability and special needs, this has the potential
to impact both the individual student and the school. Again, the committee is
optimistic that the introduction of adaptive testing will have a positive
impact on this student cohort.
Recommendation 3
3.34
The committee recommends that when designing adaptive testing for
NAPLAN Online the needs of students from a non-English speaking background are
taken into account.
The impacts of the My School website – a 'high stakes' test?
3.35
The development of the NAPLAN testing and reporting regime has led to
frequent accusations that the test has become 'high stakes'. It is not the tests
themselves that make NAPLAN 'high stakes', or even the impact on individual
students, but the way the data is used. If substantial resources are allocated
to schools on the basis of NAPLAN results, then schools understandably deem
them to be of significant importance. If the test results are used to
construct league tables, the effects on schools and parents become
significant.
3.36
Almost all submitters described the tests as 'high stakes', and most
were critical of what they consider to be a disproportionate significance
placed on the results. The NSW Primary Principals Association commented that
the tests themselves are not high stakes for the pupils, because there are 'no
personal consequences'[27]
for them; however, this is not the case for the school or broader community:
Students are not denied advancement in the school system as a result of their performance in the tests. More and more however, the public and the education community are hearing that the tests are high stakes tests. What is done with the results and how the results are used in the public arena is consistent with a high stakes assessment.
In short, NAPLAN is a low stakes assessment, the results of which are being used in a high stakes methodology. The recognition needs to be that NAPLAN is one test on one day and a snapshot at best of individual performance.[28]
3.37
The Whitlam Institute's report, The Experience of Education: The
impacts of high stakes testing on school students and their families – An
Educator's Perspective, discussed earlier in this report, surveyed
educators on their perceptions of the test and whether they considered it to be
'high stakes'. Their responses concurred with other research:
Respondents’ perceptions of the purposes of NAPLAN and their
views of what impact reported poor results could have on schools strongly
suggest that NAPLAN is viewed by the teaching profession as ‘high stakes
testing,’ confirming views already expressed by Lingard (2010) and Lobascher
(2011).[29]
3.38
The Whitlam Institute had previously conducted a literature review into
the impact of high stakes testing and concluded there were serious impacts on
students and their families, arguing this was a consequence of publishing
the test data on the My School website:
[T]he publication of NAPLAN results on the My School website,
with the associated publicity and impact on schools and students, means that
NAPLAN may be defined as constituting high stakes testing.[30]
3.39
Other submitters argued that because NAPLAN is a test of literacy and
numeracy only, and the results are subsequently published on the My School
website, this marginalises other elements of the curriculum because schools are
under pressure to concentrate on those results that are published:
The almost complete attention to testing of only certain kinds of literacy and numeracy skills serves to marginalise the other curriculum areas, especially in primary schools. This is inevitable when the stakes are raised by publication of results on My School. This often causes standards in these curriculum areas to be compromised as they received insufficient attention in the school program.[31]
3.40
According to submitters, it is not only schools that are subject to the
pressure associated with NAPLAN being regarded as a 'high stakes' test. Some
parents also see the emphasis placed on NAPLAN through the press and feel
pressure to prepare their child to do well in the exams. Fintona Girls' School
submitted that in its experience:
...[i]t is difficult to reassure parents that these tests simply provide one measure of a child’s performance in a specific place in time when the results are made available on the My School website and performance league tables that use NAPLAN data appear in The Australian (as they did on June 1st, 2013).
...
As long as NAPLAN is the only measure used in the MySchool website to measure achievement, many schools will teach the tests and parents will do all that they can (including tutoring and purchasing commercial products) to enable their child to do as well in the tests as possible.
3.41
Other submitters also commented on the typical reportage of the NAPLAN
test results and how this compounded the perception of NAPLAN being 'high
stakes':
The publication of school NAPLAN results on the My School
website and the way that some parts of the media report and comment on these
results have caused many people to quite inappropriately view them as a valid
measure of whole-school performance. The tests probe only a very narrow slice
of the whole school curriculum and are not designed and are not adequate to
support this sort of evaluation.
A supplement entitled “Your School” issued with The Weekend Australian of 1-2 June, 2013 is a good example of the misleading practice described above. The supplement included pages headed “The nation’s top 100 primary schools” and “The nation’s top 100 secondary schools”.[32]
3.42
The effect on teaching has also been profound according to Jenny Cullen,
an experienced educator and Assistant Principal. She described the situation
before and after test results were published on the My School website:
Prior to the introduction of this comparative website, NSW
had been conducting Basic Skills Tests for many years each August. In primary
schools, children were usually given 2 or 3 practice test experiences so they
became familiar with test conditions and multiple choice questions.
After the introduction of My School website, the stakes were
instantly raised. For months in advance of NAPLAN, teachers in Yrs 3 and 5 now
shape their teaching to the test. Children sit many practice tests. In some
classrooms they write each week in the NAPLAN test condition of 40 minutes with
no assistance during this time [...] Teachers feel pressured and judged. The
results are pounced upon nervously.[33]
3.43
The School of Education at Deakin University described the publication
of NAPLAN results on the My School website as a 'summative judgement' that is
not a fair and true reflection of all the elements that make up an effective
learning environment. The publication of the results also detracts from the
positive aspects of a standardised test whose goal is to 'diagnose the
literacy and numeracy needs of individual students':
This goal, however, is being compromised by the publication
of results on the MySchool website, which reduces the test results to summative
judgments of a whole school’s performance, sometimes unfairly stigmatizing the
school in the eyes of a general public that does not fully appreciate the
complexities of teaching and learning and the complex judgments involved in
assessing students’ abilities, especially with respect to addressing the needs
of culturally diverse and disadvantaged communities.[34]
3.44
The Australian Education Union (AEU) discussed the committee's previous
inquiry into NAPLAN and emphasised that the research available to the committee
then has since been supplemented by further evidence, from both Australia and
overseas, of the negative impact of high stakes testing on teachers and
students:
[T]he existing body of evidence on high-stakes testing in
general, and NAPLAN in particular, has been supplemented by both international
research literature and recent Australian research.
3.45
The AEU urged the committee and the government to take notice of the
research and reconsider the publication of data on the website in its current
form:
We urge this Inquiry to heed the evidence-based findings
about the largely negative impact of high-stakes testing on teaching and
learning in the context of the NAPLAN program and publication of NAPLAN data on
the My School website.[35]
3.46
However, the perception of the My School website as a basis for
comparative analysis is very different for the education authorities, which
require various tools to address deficiencies across their areas.
Tasmania's Department of Education espoused the benefits of the My School
website, while being conscious of the potential pitfalls:
The DoE considers that overall the My School website supports processes of accountability, evaluation, collaborative policy development and resource allocation within school communities. It also provides valuable comparisons of a school’s data sets with those of a group of other schools with similar socioeconomic status (SES) profiles as a context.
However, the DoE does acknowledge the potential of My School data to be used to create simplistic league tables which do not take into account the local context and factors such as SES.[36]
3.47
ACARA also defended the publication of NAPLAN results. ACARA's
submission lists what it terms the 'clear advantages' of the publication of
NAPLAN data and emphasises that every effort is made to avoid the compilation
of league tables. The advantages it cites include:
- Encouraging discussion between parents, the wider community, and schools about school improvement.
- Illustrating the improvements that are being made in schools, sometimes in very difficult circumstances, and celebrating the positive contributions of teachers and schools. In particular, gain data can be used to highlight the improvements made by students between the test years.
- Encouraging professional discussions between schools on strategies that have been used to improve student literacy and numeracy skills. My School allows principals and teachers to identify schools with a similar student cohort which may be achieving a higher level student gain.
- Challenging schools in which students are achieving above average results to compare themselves to schools with similar student cohorts and examine whether students are improving at the rate expected or whether the school and its students are ‘coasting’.
- Providing a breakdown at a school level of the percentage of students achieving results in each proficiency band. This allows both the school and the community to monitor the progress of each student cohort and ensure that teaching practices are improving results for all students.[37]
3.48
Data is published, but sanctions are not applied to schools or teachers on this
basis. In many cases, additional funding or other forms of assistance have been
provided to schools that have not been performing as well as expected.
The Department indicated that schools identified as having below average
student outcomes in literacy and numeracy have been allocated additional funding
of $11 million in total, to assist them to raise literacy and numeracy performance.[38]
Committee View
3.49
The committee accepts that data obtained from NAPLAN is of significant
value to stakeholders including students, schools, parents, education
authorities, the wider community, and state and national governments, and that
the provision of this data through My School has enabled greater understanding
of the performance of schools. However, there are also significant drawbacks to
publishing the data in a manner that, as part of its functionality, either
compares schools directly or allows easy comparison between them.
3.50
The committee further accepts that the Department and ACARA discourage
the use of the data to develop league tables, but this does not diminish the
fact that this occurs. One of the core elements of the My School website
is the ability to compare schools, but given the number of variables involved in
the testing process, and the almost infinite variation in students, a true
comparison is not possible.
3.51
For this reason the committee would still like to see the data
published, but with some of the core ranking and comparative functionality
removed from the website. This would allow data to be published to schools,
parents and education authorities, while limiting the disingenuous use of
the data to rank schools.
Recommendation 4
3.52
The committee recommends that ACARA closely monitor the use of NAPLAN
results to ensure results are published to assist the Government to deliver
extra, targeted funding to schools and students who need more support, rather
than to develop league tables.