CHAPTER 2

Key issues

2.1        The committee has received a large number of detailed and carefully prepared submissions in relation to this inquiry, and has also received evidence from witnesses at its public hearing on 21 June 2013. It is apparent that the effectiveness of the National Assessment Program - Literacy and Numeracy (NAPLAN) is a matter of some controversy, and there is a full range of views on this issue. This chapter briefly identifies some of the key issues that are apparent to the committee thus far, as well as areas where further research and inquiry are necessary.

Is NAPLAN achieving its objectives?

Objectives of NAPLAN

2.2        The Department of Education, Employment and Workplace Relations (DEEWR) advised that five objectives have informed the development of NAPLAN testing:

1. that the reporting of literacy and numeracy test results is reliable and nationally comparable;

2. that the proposed national literacy and numeracy tests be rigorous;

3. the central aim of national assessment should be finding out what students can or cannot do and lifting the performance of every student in every school;

4. the tests should focus on the diagnosis of each student’s strengths and weaknesses as a means for planning educational interventions; and

5. the development of new standards to cover the full range of student achievement.[1]

2.3        The Australian Curriculum, Assessment and Reporting Authority (ACARA) submitted that the National Assessment Program 'is the means by which governments, education authorities and schools can determine whether or not young Australians are reaching important educational goals for literacy and numeracy'.[2] On its website ACARA advises that the primary objective of NAPLAN is to provide the:

[M]easure through which governments, education authorities, schools, teachers and parents can determine whether or not young Australians have the literacy and numeracy skills that provide the critical foundation for other learning and for their productive and rewarding participation in the community.

The tests provide parents and schools with an understanding of how individual students are performing at the time of the tests. They also provide schools, states and territories with information about how education programs are working and which areas need to be prioritised for improvement.[3]

2.4        ACARA noted that the NAPLAN tests are only one part of the assessment and reporting processes conducted by each school and do not replace 'extensive, ongoing assessments made by teachers about each student's performance'.[4]

2.5        Submitters to the inquiry varied in their responses to the objectives of NAPLAN. Some supported the objectives in full but argued that NAPLAN was unable to meet them, while others suggested that the objectives need to be revisited. Still others suggested that NAPLAN was being used for a much broader range of purposes than originally anticipated, a viewpoint elaborated on below.[5] Finally, some submitters questioned the utility of NAPLAN, as well as its expense and its potential to cause harm to students, the implication being that the NAPLAN regime should be discontinued.[6]

2.6        The Australian Education Union submitted that it is timely to reconsider the purpose of NAPLAN, and its style of delivery. During the Melbourne hearing the committee was advised:

In our most recent correspondence with the federal minister, we have argued and suggested that it is time for a thorough re-examination of the purpose and processes of delivery of NAPLAN. We do that because we do not believe there is any clarity about the purposes of NAPLAN. It is now starting to become the be-all and end-all for anyone, depending on what their hobby horse is at the time. For example, we are told it is a diagnostic test and then we are told it is not a diagnostic test. We are told you can prepare for it and then we are told you cannot prepare for it.[7]

2.7        It seems to the committee that over time the purposes of NAPLAN have expanded. This is in part because NAPLAN data is the only nationally consistent data on educational outcomes. The Australian Education Union noted that:

[O]ne of the great concerns for educators across the country [is] the fact that there is a lack of clarity around the purpose of NAPLAN, and there are increasing references to NAPLAN and the results of NAPLAN being used for a wider and wider variety of purposes...

...Data is important; it is the misuse of that data, or the incorrect use of that data, that concerns us greatly. That, in itself, contributes to growing anxieties, if you like, and the high stakes associated with the NAPLAN program.[8]

Achievement of objectives

2.8        The committee heard a range of concerns expressed by witnesses and submitters about the ability of the NAPLAN tests to achieve the original objectives. Among other things, NAPLAN was criticised for testing a very narrow field of the Australian curriculum, having extremely limited diagnostic ability, lacking rigour and containing a high margin of error.[9] A discussion of the full range of views is beyond the scope of this interim report; however, some examples follow.

2.9        A number of submitters questioned whether NAPLAN testing could achieve its objectives, on the basis that NAPLAN data is limited and at best provides an estimate of each child's literacy and numeracy at a particular point in time.[10] The committee heard that for these reasons NAPLAN was not a particularly useful tool to assess student ability and that a more useful assessment could be provided by teachers.[11] The committee also heard that the four month wait before NAPLAN results are released inhibits the usefulness of the test as a diagnostic tool, as does the time of year that the test is conducted.[12]

2.10      Many submitters and witnesses expressed concern that NAPLAN data was being used for purposes for which it was not originally intended. Dr Gloria Latham observed that while NAPLAN results are described by ACARA as providing a snapshot of children's learning, the results were being used by the Australian government and by schools for other purposes, including as a measurement of learning and to assess the quality of schools and teachers.[13]

2.11      Dr Suzanne Rice observed that it is 'not necessarily a disagreement with testing per se but rather with the uses to which the testing data might be put that is central to some of the key debates'.[14]

2.12      Following this theme, submitters were also critical of the decision to use NAPLAN data to determine the Schooling Resource Standard.[15] The Australian Primary Principals Association told the committee that NAPLAN data was not fit for this purpose and that using the data in this way 'privileges NAPLAN data to an almost unbelievable level'.[16]

Unintended consequences          

2.13      The committee heard that a range of unintended consequences have emerged as a result of NAPLAN testing. These include: adverse impacts on students, narrowing of the curriculum, creation of a NAPLAN preparation industry and the development of NAPLAN into a 'high stakes' test.

Impact on teaching and learning practices

2.14      The committee received evidence that the NAPLAN testing regime is having an adverse impact on teaching and student learning practices.

2.15      The committee heard that the curriculum has become narrowed as teachers teach to the test.[17] Submitters reported that a focus on NAPLAN preparation has resulted in limitations on creative learning in classrooms. For example, some school children are spending a disproportionate amount of time learning how to master persuasive writing pieces.[18] In response to this particular criticism, ACARA has decided not to give advance notice of the writing style to be tested in future years.[19]

2.16      These submissions are consistent with international academic studies on the impact of national testing regimes, particularly in the United Kingdom and the United States. Professor John Polesel summarised that this literature:

...has found that the testing can have distorting influences on the way in which teachers teach, changing teachers' pedagogy but also changing the way in which schools assign value to different parts of the curriculum. For example, the research has found that things like the arts, drama and music are given less prominence in the curriculum because they are regarded as further away from the main focus of the testing, which is literacy and numeracy. The other thing that came through from the international research is that the focus also had unintended consequences—for example, in the way in which schools recruit kids from different areas of society, and parents using the results of the testing, because it is high stakes and it is made public, in ways which were unintended.[20]

A 'high stakes' test and impact on student wellbeing

2.17      Numerous submitters provided evidence to support their claim that NAPLAN has become a 'high stakes' test, and that this has led to negative impacts on students, teachers and schools.[21] ACARA acknowledged that literacy and numeracy are high stakes, but argued that NAPLAN itself is low stakes for individual students.[22] Depending on their perspective, individuals and organisations may consider NAPLAN testing to be high (or low) stakes, either for themselves or for other stakeholders.

2.18      The committee heard that the test has become high stakes, in part, because of the publication of individual school results on the My School website and the subsequent development of league tables by the media.

2.19      In response to these concerns, the Whitlam Institute, along with its partners at the University of Melbourne, is conducting a project titled The Experience of Education: The impact of high stakes testing on school students and their families. The project approaches this question from the perspective of the best interests of the child. Its two objectives are (i) to determine the positive and negative impacts of NAPLAN on children and (ii) to examine the significance of these impacts for students and their learning environment.[23] Significantly, the Institute observed that 'almost 90 per cent of teachers reported students talking about feeling stressed prior to NAPLAN testing and significant numbers also reported students being sick, crying or having sleepless nights'.[24]

2.20      ACARA questioned whether NAPLAN alone is the cause of student stress, suggesting that the way the Whitlam Institute constructed its survey questions may have led to a particular outcome, and that a more expansive survey looking at student stress across the entire school year might deliver comparable results to those reported.[25]

2.21      Mr Phillip Heath, currently Principal of a Canberra private school, advised that even when schools take a low-key approach to NAPLAN, children can still become stressed:

My school does not talk about NAPLAN at all. We do not publish the results. We keep it very much to what it is designed for—that is, to give feedback to us and to an individual student. But, for about half of those present, their parents are giving them tests at home to prepare for the experience. That really surprised me. That is in a context where we say nothing, as a school, and I would suggest that is a pretty common picture around the country. Parents at home who are used to a testing regime—that is how they grew up—consider this a very high-stakes experience, much higher than, in fact, it was intended ever to be.[26]

2.22      Mr Heath was careful not to judge parents for this response, explaining that the way that NAPLAN is reported encourages some parents to view the test as high stakes:

If you set up a test in which the children get the results and their results are shown across bands with an arrow, an aspirational parent, particularly from a culture that values education very highly and achievement very highly, will want to see their child's arrow at the top and will do whatever it takes.[27]

2.23      The Independent Education Union of Australia agreed, noting that the very nature of NAPLAN made it high stakes – for children and teachers. During the Melbourne hearing Mr Chris Watt explained:

The fact that it involves people's children immediately makes it high stakes, because every parent wants the best for their child and if they are using these tests and their results for enrolment purposes other than the child's health it is hard to think of anything more high stakes than your child's education. By definition, testing is, whether you like it or not, high stakes. So, yes, there is unquestionably pressure being put on individual teachers, and some do not want to take those classes for that very reason.[28]

2.24      This evidence suggests that it may not be NAPLAN itself that causes stress, but rather the influence of parents, schools and, in particular, the media. The inclusion of NAPLAN data in the assessment of the Schooling Resource Standard for schools under the new funding arrangements is likely to contribute to the perception that NAPLAN is 'high stakes' for state and territory education jurisdictions.[29]

Growth of the NAPLAN preparation industry

2.25      ACARA advised that detailed preparation for NAPLAN testing was not necessary. However, evidence to the committee suggests that an industry of NAPLAN preparation has developed in Australia over the past five years. The Australian Education Union pointed to fish oil supplements, study aids and tutoring support, all targeted at NAPLAN preparation.[30]

2.26      ACARA acknowledged that it was aware of instances of 'excessive test preparation' and wanted to work with Principals and teachers to support them in their important role.[31] ACARA also acknowledged that feedback from stakeholders indicated 'the need to restate things like the purpose, to counsel and provide information about what is going on'.[32]

2.27      ACARA also advised that a number of reforms would be implemented over the next several years, including: online delivery of NAPLAN, linking NAPLAN to the National Curriculum, reducing the time between testing and the release of results, and introducing flexible delivery of the tests.[33]

2.28      Further investigation and inquiry are necessary to fully consider the impacts of the NAPLAN testing regime and to determine the appropriate action (if any) that should be taken to address any adverse consequences.

Publication of results on the My School website

2.29      While most submitters, particularly teachers and principals, offered in-principle support for NAPLAN testing and the careful[34] use of data, in conjunction with teacher assessment, strong criticism was reserved for the publication of individual school data on the My School website.[35] The committee heard that many of the unintended consequences discussed earlier arose not from NAPLAN itself but from the publication of results on the My School website.

2.30      Professor Joy Cummings, an experienced educator and expert consultant, observed that publication on My School 'has made NAPLAN high stakes for schools...Pressure to meet targets is top-down on schools from authorities, from principals to teachers, from teachers to students'.[36] Professor Cummings observed that some students attending schools with a balanced approach to NAPLAN are still reportedly stressed 'due to exposure to media reporting on NAPLAN and My School outcomes'.[37] These criticisms of the My School website were repeated by a large number of submitters to the inquiry.

2.31       The committee heard that the publication of data on the My School website encouraged the misuse of data. For example, Mr Norm Hart, President of the Australian Primary Principals Association, told the committee:

I have no problem with the data being collected and I have no problem with the proper use of the data. My problem, and the problem of primary school principals, is the misuse and inappropriate use—I think there are two different things here—and sometimes it is almost mischievous use, of the data. In the terms of league tables, that is the case. I think they are just so wrong that they should be stopped because a high NAPLAN score does not equal high efficiency. It does not equal high quality. It could well be an element of both, and probably is, but it is not equal. [38]

2.32      ACARA has been asked by the COAG Standing Council on School Education and Early Childhood to provide an assessment of any perverse incentives and unintended impacts as a result of the publication of NAPLAN data on My School.[39] Further inquiry into the purported benefits and negative impact of the My School website is necessary.

International best practice for standardised testing

2.33      Both DEEWR and ACARA submitted that NAPLAN represents international best practice for standardised testing, and cited Organisation for Economic Co-operation and Development (OECD) publications to support this statement.[40] However, a number of submitters disagreed.

2.34      The Australian Primary Principals Association praised the Finnish system because of its strong results and because of the trust it places in teacher professionalism.[41] Sample testing was praised by others because it makes possible more sophisticated tests of higher order skills, and enables governments to collect data without interfering with the role of the teacher in providing feedback.[42]

2.35      The committee is also aware of recent reforms to census testing in the United Kingdom intended to provide quicker feedback and make greater use of teachers' professional judgement, including the marking of exams by the classroom teacher.[43]

2.36      The committee believes that closer consideration of international best practice is necessary in order to properly assess the NAPLAN assessment program.

Potential improvements to the NAPLAN program

2.37      The committee heard a range of innovative measures which, it was submitted, had the potential to improve the NAPLAN testing program.

2.38      These suggestions are a representative sample of the reforms proposed by submitters. The committee has not had sufficient time to consider these and other suggestions closely; they nevertheless illustrate the broad range of changes that could potentially be made to the administration of NAPLAN to promote its effectiveness. The committee heard that ACARA and DEEWR were already acting on some of these reforms, but the committee was unable to explore which ones, or the extent to which their implementation has been effective.

Conclusion

2.39      Every year over a million Australian students complete three separate tests over a week, and four months later these results are published online at a school level as well as being reported in the media. The results are touted as an accountability measure for teachers and principals, and as a useful tool for parents. Given the large numbers of Australian students involved in NAPLAN, and also the numbers of teachers who are often judged by the results of their students, it is of the utmost importance that the Parliament is able to determine the extent to which NAPLAN is achieving its objectives.

2.40      In conducting such an inquiry on behalf of the Senate, the committee requires time to discover, explore and assess all relevant viewpoints as they impact on the Terms of Reference. Given the time constraints on this inquiry, such an analysis will only be possible if the committee is given the opportunity to continue its inquiry in the 44th Parliament.

2.41      In this context, the committee makes no recommendations of substance in relation to NAPLAN, but notes the potential for the committee to recommend to the Senate the re-adoption of this inquiry early in the next Parliament.

 

Senator Chris Back
Chair, References Committee
