NCTE: Avoid Machine-Graded Writing Assessments

Thank you, NCTE, for articulating a strong position against using computers to assess student writing in standardized testing. The National Council of Teachers of English published a position statement this past week that strongly denounces the shift toward having computers and software programs assess student writing, particularly in relation to the coming Common Core assessments that so many of our states are now part of.

The position paper notes:

… we can cost-effectively assess writing without relying on flawed machine-scoring methods. By doing so, we can simultaneously deepen student and educator learning while promoting grass-roots innovation at the classroom level. For a fraction of the cost in time and money of building a new generation of machine assessments, we can invest in rigorous assessment and teaching processes that enrich, rather than interrupt, high-quality instruction. Our students and their families deserve it, the research base supports it, and literacy educators and administrators will welcome it. – from NCTE

The position paper also cites the many reasons why computers often fail in these machine-scored scenarios, noting:

  • Computers are unable to recognize or judge those elements that we most associate with good writing
  • Computers are programmed to score papers written to very specific prompts, reducing the incentive for teachers to develop innovative and creative occasions for writing, even for assessment
  • Computer scoring favors the most objective, “surface” features of writing (grammar, spelling, punctuation)
  • Computer scoring systems can be “gamed” because they are poor at working with human language, further weakening the validity of their assessments
  • Computer scoring discriminates against students who are less familiar with using technology to write or complete tests

And last, but not least, perhaps most important of all:

Computer scoring removes the purpose from written communication — to create human interactions through a complex, socially consequential system of meaning making — and sends a message to students that writing is not worth their time because reading it is not worth the time of the people teaching and assessing them. — NCTE

The paper then goes on to cite alternative ways to assess student writing, including the well-researched method of portfolios. Whether the PARCC and Smarter Balanced folks are listening, or care to listen, is a whole other matter. If they need any help, the writers of the position paper helpfully provide a long list of annotated articles on the topic.

Peace (without the machine),
Kevin

PS — Thanks to Troy Hicks for sharing the link via Twitter. Troy is one of the authors of the position paper.