Techno-Performance Task Assessments

November 30, 2012

This post is the second in a series of three about implementing performance task assessments, an important part of the Common Core/Essential Standard shift.

The Challenge
Creating and administering common assessments is seldom easy work, and sometimes it is incredibly challenging. Take, for example, the challenge of assessing reading, writing, speaking, and listening in world languages. Spanish teachers must assess student development in these four areas at intervals throughout the semester. Of course, their daily work, through workbooks, projects, quizzes, tests, and general instruction, provides them a picture of student achievement in particular skills, but that big, four-strand picture is tough to assess.

Most difficult of all are the speaking and listening strands, as they seem to require one-on-one assessments (imagine having to test 30 individual students fairly as they explain why it will take two trains, one traveling east and the other west, four hours and thirteen minutes to meet in St. Louis). The Spanish I teachers figured it out.

Assessing Listening
To assess students’ ability to comprehend spoken Spanish, Ms. Haynes, Ms. Dunham, and Ms. Watson created a video of six native Spanish speakers (plus one Japanese student, just for kicks) talking about themselves and their preferences. For the assessment, students watched the video and charted details about any four of the speakers. This assessment told the teachers which students were able to listen to Spanish and extract information from a speaker, as they would need to do in an actual conversation.

Assessing Speaking
My wife likes to tell the story of how her friend Crystal got her entire class out of a speaking test in Spanish II. When it was Kerri’s turn to take the test (there was only one cassette machine for playing and recording), she pushed play, and all she heard was Crystal’s deep drawl, saying, “Hellllloooooo. I don’t hear anythang. Heeellllllooooooo.”

The Spanish teachers came up with a great solution to the challenge, one that eliminated what I will call the Crystal effect: they created Google Voice accounts. Google Voice provides you a phone number and can direct calls to all of your phones, so you never miss a call. The key with this Spanish assessment, however, was to miss the call. Google Voice redirects unanswered calls to voicemail and records the messages as MP3s.

The teachers had their students call their Google Voice numbers all at once and answer, in their best Spanish, two questions provided by the teacher. To assess students’ performances on the task, the teachers opened their Google Voice accounts, clicked on the files, and listened to them. Since the files are MP3s, the teachers can easily move them into students’ digital linguafolios to track student development throughout the year, or even as students progress through multiple levels of Spanish.
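One nice side effect of the MP3 format is that the filing step can even be scripted. Below is a minimal sketch in Python of how a teacher might sort a batch of downloaded recordings into per-student linguafolio folders. Everything here is an assumption for illustration: the folder names, and especially the convention that each downloaded file starts with the student's name followed by a date (e.g., maria_gomez_2012-11-15.mp3). Google Voice itself doesn't name files this way, so a teacher would rename the downloads, or adapt the name-parsing line, to match reality.

```python
# Sketch: file downloaded Google Voice MP3s into per-student folders.
# Assumes a hypothetical naming convention of studentname_date.mp3.
import shutil
from pathlib import Path

DOWNLOADS = Path("downloads/google_voice")    # where the MP3s were saved
LINGUAFOLIOS = Path("linguafolios/spanish1")  # one folder per student

for mp3 in DOWNLOADS.glob("*.mp3"):
    # Take everything before the final underscore (the date) as the name.
    student = "_".join(mp3.stem.split("_")[:-1])
    dest = LINGUAFOLIOS / student / "speaking"
    dest.mkdir(parents=True, exist_ok=True)  # create the folder on first use
    shutil.copy2(mp3, dest / mp3.name)       # copy, keeping the original intact
    print(f"Filed {mp3.name} under {dest}")
```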

So What?
So what? Are you kidding me? That’s awesome, and not just because it’s a cool use of technology that averted the Crystal effect. What’s really awesome is this: the assessments tell teachers whether their classes as a whole are on track in reading, writing, speaking, and listening, and they help teachers identify which students are not progressing in each of the four strands. By delivering a common assessment with a common rubric and collaborating on the evaluation, teachers cannot help but see their own strengths and weaknesses. It is inevitable, for example, that a teacher whose students’ listening skills fall noticeably below the average will seek to improve that area with the assistance of colleagues. The process ferrets out shortcomings and begs us to respond.

Taking Risks
The Spanish teachers will tell you this process was not without flaw. The sort of risk they took in creating, delivering, and evaluating the assessment was huge and uncomfortable. It is that kind of risk that inspires growth, and growing is a darn good thing.

Leave your comments if you wish, or contact me directly at flinchm@pitt.k12.nc.us to collaborate with your PLC or to discuss assessment, instruction, or technology.

Responding to Performance Task Assessments in Your PLC

November 27, 2012

On October 26, PLCs created culminating performance task assessments. The results were fantastic. My next few blog posts will describe examples. I hope you will take from these examples some ideas for your own PLC, along with the motivation to continue and increase this work on your own. The challenge is great, and time is precious, but the work seems well worth the outcome.

PTA in the C&E PLC (OMG)
It took several hours, but the Civics and Economics PLC produced a clever and, more importantly, meaningful performance task assessment on our 10/26 PD day. Their purpose was to assess students’ understanding of the US Constitution and Supreme Court precedents, and of the ways those sources shape legal decisions. In doing so, they assessed students’ recall of information as well as their ability to apply that knowledge and think critically about legal systems. Check out the C&E PTA.

In short, their assessment asked students to take a legal position either for or against Michael, a male student attempting to enroll in an all-women’s college, and to defend that position using constitutional amendments and SCOTUS cases. It clearly works on multiple levels of Revised Bloom’s Taxonomy and provides the PLC with an abundance of information about their students’ attainment of Common Core and NC Essential Standards.

After a collaborative assessment effort, here’s what they found:
1. Motivated responses: Almost every student, even those who rarely participate in class activities, participated in this written assessment, and many wrote more than their teachers expected.

2. Comprehension: Many students performed better than their teachers expected. The underlying assumption seemed to be that students who struggle (or don’t bother to struggle) to memorize the information they are taught would have no chance of applying that knowledge to a complex problem. In fact, students were frequently able to apply their knowledge to demonstrate some depth of understanding about a topic, like the First Amendment or Plessy v. Ferguson.

3. Areas of need: The purpose of assessment is to shape future instruction, whether that means remediation, review, or forward progress. The C&E teachers could tell which topics, like the First Amendment, their students could recall and understand, and which they could not. They also discovered a specific conceptual shortcoming in most responses: students applied Plessy v. Ferguson accurately on its own terms, but they failed to understand that the Brown v. Board decision overturned Plessy. This showed the teachers the need to review the Brown case and to clarify, perhaps in another context, what happens when a more recent SCOTUS decision eliminates or modifies an older one. When we talk about depth of understanding, as opposed to breadth of knowledge, this example is exactly what we mean.

4. More of it: The C&E teachers agreed that, though time to work together is painfully rare, they would like to continue with this sort of assessment by creating a PTA for their economics unit. Ultimately, this sort of assessment has tremendous potential. It focuses on depth and works at multiple levels of the taxonomy. When we assess in this way, we will also teach in this way, seeking a depth of understanding and pushing students to apply, analyze, evaluate and create, not just remember. It’s a goal worthy of our pursuit.

Share your own experiences with PLCs and performance task assessments by clicking the comments option above.