Formative Pain in the Assessment

This month marks a special anniversary for me.  It’s been four years since I started using the Developmental Reading Assessment (from Pearson) in my classroom.  Back then I was bright-eyed and full of undirected/misdirected teaching energy.  I accidentally signed up for a PD day that “trained” me on how to administer the DRA.  That accident led to piloting this assessment for my district, followed by its adoption district-wide.  As is so very often the case in my district, the few of us who had gotten any sort of training were then put in the position of passing it on to many colleagues in whatever pale imitation we could manage of a day-long training (condensed to 90 minutes).

And here we are.  The district requires all upper-el teachers to use this assessment tool in the fall and spring.  Everyone knows how to administer the testing.  Everyone is required to dutifully record scores in each student’s literacy file and report scores via spreadsheet to district administration.  We’ve all been “on board” for about two years now.  Here’s the interesting part–most teachers seem to have no idea what to do with the data this assessment collects.  I frequently hear comments to the effect that the assessment is useless and that teachers get all the information they need to guide instruction through informal assessment.  My only response to that is: who the hell do you think you’re fooling, you big fakers?

Don’t get me wrong–I deeply value the informal assessments we are always making with students: every conversation, the perusal of each written piece of work, sometimes just watching surreptitiously as their little faces screw up in frustration/concentration/dazedness during independent work.  But since I’ve yet to hear a single colleague or administrator try to discount the value of wide and varied assessments, this isn’t really at issue.  The issue is whether differentiated reading assessment can help inform and focus instruction for individual students.  Or maybe the issue is giving people a tool, telling them to use it, but never showing them HOW to use it.  Here’s a hammer, build a Frank Lloyd Wright house…

Four years.  It took me four years to go from staring glassy-eyed at the rubrics and “Focus for Instruction” sheets to using the data to inform the scope, sequencing, and grouping in my Readers Workshop in a way that acknowledges, on a daily basis, the specific needs of the students I am charged to teach.  To recognize who would need more conferring versus guided reading instruction, what was worth a series of focus lessons versus what could be handled with one or two reminders.  Teaching is slowly turning me into a systems person, and I had to slowly (and painfully, at times) figure out ways to analyze and organize this assessment data so that it spoke to me in clear and resonant tones.

The upside is that I know what I’m doing and why, and I have evidence to base my decisions on, not just instinct and bravado–which is how I now regard my previous “I’ve got it all up in the ol’ noggin” position.  DRA data often confirms what I’m assessing in other ways, formally and informally.  Hooray for me.  More importantly, this data holds me accountable to each student for providing the instruction they actually need, instead of what I most enjoy teaching or find easy to teach.  I hate teaching word attack skills–one of the many reasons I love fifth grade is that most kids come in pretty competent in decoding-type strategies.  But not all of them do, and DRA data shoves that in my face every time I plan my week of Readers Workshop.  Little Susie needs her word attack help, and it’s my gosh darn job to give it.  (It’s a giant snore-fest for me, but that must be why I’m being paid and not volunteering.)

So to the “I’ve got it all ‘up here’” reading teachers, I say–what if that were the attitude and response of your cardiologist, your lawyer, your financial advisor?  Professionals need to embrace accountability in reasonably transparent ways.  These assessments take a LONG time to administer, so we’ve got to make the most of what they tell us: use every drop of data to understand our students, to find those places to nudge them along, and to measure their success/failure in some part as our own.  Data like this enhances the art of teaching.  It is one piece of a very complicated endeavor–making a successful reader.

As part of my anniversary celebration, I’ll be sharing my strategies for interpreting and using DRA data with my colleagues at a meeting next week.  I don’t know if I’m doing it “right,” but I know we can’t keep regarding a good assessment as the sort of onerous obligation that we (fairly, I think) hang on the high-stakes, state-level tests (in Michigan, the MEAP).  We are all so tired, so inundated with work that takes us away from our vocation of teaching children, that I think some really worthy tools, the DRA included, got caught in the net of suspicion and acrimony that most of us understandably cast over the constant rainfall of “reform” initiatives.

Wish me luck as I try to salvage something good from the wreckage.
