On Data, Part Two: How Dumb Data Is Created In Schools

Gathered for our morning meeting, we sat around the language lab watching our assistant principal fumble with the projector. The school's thirteen social studies teachers were on the verge of learning what the data would tell us about our students' learning.

The projector blinked an image on the wall a few times and then, voilà, an Excel spreadsheet popped onto the screen. It didn't take long to figure out the news was not good. The AP had listed student names down the far-left column and the five open-ended items from a recent interim assessment across the top row. Each item had been scored on a scale from zero to three: a zero meant the student had not attempted the item, a one meant the student had attempted it but showed no evidence of understanding, and a three meant the student had answered it correctly in all aspects. Every cell holding a zero or a one had been filled with red.

It looked like something had just been murdered against the wall, and if it wasn't our egos or our students' futures, it was our collective trust in the administration to use data to make useful decisions.
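For readers who have never been on the receiving end of a sheet like this, here is a minimal sketch of how one could be generated with Python's openpyxl library. The student names and scores below are invented for illustration; nothing here reproduces our actual data.

```python
# A minimal sketch of the spreadsheet described above, assuming made-up
# student names and 0-3 scores. Requires the openpyxl package.
from openpyxl import Workbook
from openpyxl.styles import PatternFill

# Hypothetical scores: one row per student, one 0-3 score per item.
scores = {
    "Student A": [0, 2, 1, 3, 2],
    "Student B": [3, 3, 2, 1, 0],
}
items = ["Q1", "Q2", "Q3", "Q4", "Q5"]

wb = Workbook()
ws = wb.active

# Item labels across the top row, starting in column B.
for col, item in enumerate(items, start=2):
    ws.cell(row=1, column=col, value=item)

red = PatternFill(start_color="FFFF0000", end_color="FFFF0000",
                  fill_type="solid")

# Student names down the far-left column; flag any 0 or 1 in red.
for row, (name, row_scores) in enumerate(scores.items(), start=2):
    ws.cell(row=row, column=1, value=name)
    for col, score in enumerate(row_scores, start=2):
        cell = ws.cell(row=row, column=col, value=score)
        if score <= 1:
            cell.fill = red

wb.save("interim_assessment.xlsx")
```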

The interim assessment these data came from consisted of a primary source appropriate (at least ostensibly) for the student's grade level (World History 1 or 2, US History, or US Government) and five questions. In other words, students would read the source and answer the following questions:

1) From whose perspective are we viewing?

2) What's new and what's old?

3) How do we know what we know?

4) How are things, events, and people connected to each other over time?

5) So what? Why does it matter?

There were more than a few problems with the data this created.

First, although the questions were used with students in social studies classes across the school, teachers did not share a common understanding of what the questions meant. When students asked their teachers for interpretation, different teachers responded in different ways. Lacking consensus on what the questions asked, teachers were in further disagreement over what it meant to answer an item correctly "in all aspects."

Second, the assistant principal assigned the primary sources for each grade level without consideration of prior learning. Even though I was in the middle of teaching European imperialism, my students' assigned primary source was a Eugene Debs quote about World War I.

Third, the AP required that teachers score each other's students' tests, which becomes problematic when scoring items that require that the scorer be familiar with the course's prior content (e.g., questions two and four).

Fourth, the data this produced was said to align with the district's skill standards for social studies (as recommended by New Leaders for New Schools) even though it didn't. The questions and the skill standards were created independently of one another. Our assistant principal, in a last-minute effort, paired each of the district's thirteen social studies skill standards with one of the five questions. The data we gathered was then used to represent student proficiency in social studies generally, even though nobody had so much as pretended to address the content standards.

Fifth, frustrated by the administration's indifference to the legitimacy of the data being created, and fearing they might be arbitrarily punished as a result, some teachers agreed to assign lower scores on the first of the interim assessments and then, as the semester went on, higher and higher scores on subsequent ones. The thinking was that while the administration might have difficulty faulting teachers for low scores at the beginning of the semester, it would certainly complain if scores did not improve.

I hope this anecdote has clearly communicated how data-driven instruction can go wrong and has provided context for something I wrote yesterday: "Using quantitative data can be useful, but only when we can look under its hood for the reasons it emerged as it did, and not when we're using it merely as a means of satisfying a need for demonstrating accountability." If anyone had been allowed to evaluate the data creation described above ("look under its hood," so to speak), it would have quickly lost any legitimacy our administration pretended it had.

Yesterday I explained what inspired me to write this series of posts on data and what I hope to explore with them, described why I'm wary of those who harp on "data-driven instruction," and provided some outside reading material on the topic.

Tomorrow I'll attempt to illustrate similarities between schools like the one I just described and the system that brought down Wall Street in the financial meltdown.

For another teacher anecdote on the misuse of data from the same school I just described, see Mr Teachbad's post on The Answer Sheet.

Comments

  1. James: Your story here is a scary, but not surprising, example of how data-driven 'reform' can go very wrong. I wish more people who touted data-driven reform would look at real examples like yours to understand why so many educators decry its use in our schools (especially for high-stakes purposes).

  2. James, These are such good posts. I've been wanting to write about this myself, but just haven't gotten my thoughts together to do so. I'm so glad you did.

    I really appreciate your tone here as well as the clarity & accessibility.

    Sharing widely!

  3. I have a disturbing example of how data can be used to make bad decisions about student placement. An autism inclusion student I've had for the past 3 years is going to middle school this fall. He has an average-range IQ. He is bright in his areas of interest but has historically scored low on standardized tests. I described in detail the discrepancies between his test scores and his actual ability, as did the school psychologist in his IEP, along with the other information that should be used for middle school placement. For example, he can read close to grade level when he is interested. This student did his worst ever on his last computerized reading and writing assessment (MAP). He rushed and scored 20 points lower than his previous lowest score, which put him at the beginning reading level.

    The father came in and told me that his son had been placed in a low-functioning self-contained class. The middle school copied to downtown my emails expressing my concerns that this was the absolute wrong placement for this child. Downtown sent me a nasty email about how the placement was a done deal. The test score data won that round over the teacher who has known the child for three years. The parents are strong advocates, and I imagine the student will eventually get a more appropriate placement, but it will be a fight. Just think of the families who do not have strong advocacy skills.

    --Sorrel

  4. Here's a little something you might consider using for that next data-driven staff meeting: a Data Shield. Feel free to use, share, copy, etc. As with anything like this, its power to protect increases dramatically with the user's belief in it. Enjoy, and thanks for the great data series - Mark

