PLC

NAEP Questions Tool

December 12, 2012

Looking for assessment questions? NAEP’s Questions Tool has a ton, and they look like they should line up beautifully with North Carolina’s Common Exams (a.k.a. MSLs).

Check out the NAEP website.

Or watch this video tutorial.

Techno-Performance Task Assessments

November 30, 2012

This post is the second in a series of three about implementing performance task assessments, an important part of the Common Core/Essential Standards shift.

The Challenge
Creating and administering common assessments is seldom easy work, and sometimes it is incredibly challenging. Take, for example, the challenge of assessing reading, writing, speaking, and listening in world languages. Spanish teachers must assess student development in these four areas at intervals throughout the semester. Of course, their daily work, through workbooks, projects, quizzes, tests, and general instruction, gives them a picture of students’ achievement in particular skills, but that big, four-strand picture is tough to assess.

Most difficult of all are the speaking and listening strands, as they seem to require one-on-one assessments (imagine having to test 30 individual students fairly as each explains why it will take two trains, one traveling east, the other west, four hours and thirteen minutes to meet in St. Louis). The Spanish I teachers figured it out.

Assessing Listening
To assess students’ ability to comprehend spoken Spanish, Ms. Haynes, Ms. Dunham, and Ms. Watson created a video of six native Spanish speakers (plus one Japanese student, just for kicks) talking about themselves and their preferences. For the assessment, students watched the video and charted details about any four of the speakers. This assessment told the teachers which students were able to listen to Spanish and extract information from the speaker, as they might need to in an actual conversation.

Assessing Speaking
My wife likes to tell the story of how her friend Crystal got the entire class out of a speaking test in Spanish II. When it was Kerri’s turn to take the test–there was only one cassette machine for playing and recording–she pushed play, and all she heard was Crystal’s deep drawl, saying, “Hellllloooooo. I don’t hear anythang. Heeellllllooooooo.”

The Spanish teachers came up with a great solution to the challenge, and they were able to eliminate what I will call the Crystal effect. They created Google Voice accounts. Google Voice provides you a phone number and can direct calls to all of your phones, so you never miss a call. The key with this Spanish assessment, however, was to miss the call. Google Voice redirects unanswered calls to voicemail and records messages as MP3s.

The teachers had their students call their Google Voice numbers all at once and answer, in their best Spanish, two questions provided by the teacher. To assess students’ performance on the task, the teachers opened their Google Voice accounts, clicked on the files, and listened to them. Since the files are MP3s, the teachers can easily move them into students’ digital linguafolios, so they can track student development throughout the year, or even as students progress through multiple levels of Spanish.
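If your PLC wants to keep those recordings organized, here is a rough sketch of how a folder of downloaded MP3s might be filed into per-student linguafolio folders. The filename pattern (lastname_firstname_date.mp3) and the folder names are illustrative assumptions, not anything Google Voice produces on its own.

```python
# Sketch: file downloaded speaking-assessment MP3s into per-student
# linguafolio folders. Assumes files are named lastname_firstname_YYYY-MM-DD.mp3;
# adjust the pattern to match however your PLC names its downloads.
import shutil
from pathlib import Path

DOWNLOADS = Path("google_voice_downloads")   # where the MP3s were saved (assumed)
LINGUAFOLIO = Path("linguafolio/spanish_1")  # destination folders (assumed)

for mp3 in DOWNLOADS.glob("*.mp3"):
    try:
        last, first, date = mp3.stem.split("_")  # e.g. "garcia_ana_2012-11-30"
    except ValueError:
        print(f"Skipping {mp3.name}: expected lastname_firstname_date.mp3")
        continue
    student_folder = LINGUAFOLIO / f"{last}_{first}"
    student_folder.mkdir(parents=True, exist_ok=True)
    shutil.copy2(mp3, student_folder / mp3.name)  # copy, keeping the original download
    print(f"Filed {mp3.name} -> {student_folder}")
```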

So What?
So what? Are you kidding me? That’s awesome, and not just because it’s a cool use of technology that averted the Crystal effect. What’s really awesome is this: the assessments tell teachers whether their classes as a whole are on track with reading, writing, speaking, and listening, and they help them identify which students are not progressing in each of the four strands. By delivering a common assessment with a common rubric and collaborating on the evaluation, the teachers cannot help but see their own strengths and weaknesses. It is inevitable, for example, that a teacher whose students’ listening skills fall noticeably below the average will seek to improve that area with the assistance of colleagues. The process ferrets out shortcomings and begs us to respond.

Taking Risks
The Spanish teachers will tell you this process was not without flaw. The sort of risk they took in creating, delivering, and evaluating the assessment was huge and uncomfortable. It is that kind of risk that inspires growth, and growing is a darn good thing.

Leave your comments if you wish, or contact me directly at flinchm@pitt.k12.nc.us to collaborate with your PLC or to discuss assessment, instruction, or technology.

Responding to Performance Task Assessments in Your PLC

November 27, 2012

On October 26, PLCs created culminating performance task assessments. The results were fantastic. My next few blog posts will describe examples. I hope you will take from these examples some ideas for your own PLC, along with the motivation to continue and expand this work on your own. The challenge is great, and time is precious, but the work seems well worth the outcome.

PTA in the C&E PLC (OMG)
It took several hours, but the Civics and Economics PLC produced a clever and, more importantly, a meaningful performance task assessment on our 10/26 PD day. Their purpose was to assess students’ understanding of the US Constitution, Supreme Court precedents, and the ways these documents shape legal decisions. In doing so, they assessed students’ recall of information as well as their ability to apply that knowledge and think critically about legal systems. Check out the C&E PTA.

In short, their assessment asked students to take a legal position either for or against Michael, a male student attempting to enroll in an all-women’s college, and defend that position using amendments and SCOTUS cases. It clearly works on multiple levels of Revised Bloom’s Taxonomy and provides the PLC with an abundance of information about their students’ attainment of Common Core and NC Essential Standards.

After a collaborative assessment effort, here’s what they found:
1. Motivated responses: Almost every student, even those who rarely participate in class activities, participated in this written assessment, and many wrote more than the teacher expected.

2. Comprehension: Many students performed better than teachers expected. The underlying assumption seemed to be that students who struggle (or don’t bother to struggle) to memorize the information they are taught would have no chance of applying knowledge to a complex problem. Frequently, students were able to apply knowledge to demonstrate some depth of understanding about a topic, like the First Amendment or Plessy v. Ferguson.

3. Areas of need: The purpose of assessment is to shape future instruction, whether it be remediation, review, or forward progress. C&E teachers could tell which topics, like the First Amendment, their students could recall and understand, and which they could not. They also discovered a specific conceptual shortcoming in most responses. Students applied Plessy v. Ferguson appropriately, except that they failed to understand that the Brown v. Board decision upended Plessy. This information showed the teachers the need to review the Brown case and to clarify, perhaps in another context, what happens when a more recent SCOTUS decision eliminates or modifies an older one. When we talk about depth of understanding, as opposed to breadth of knowledge, this example is exactly what we’re talking about.

4. More of it: The C&E teachers agreed that, though time to work together is painfully rare, they would like to continue with this sort of assessment by creating a PTA for their economics unit. Ultimately, this sort of assessment has tremendous potential. It focuses on depth and works at multiple levels of the taxonomy. When we assess in this way, we will also teach in this way, seeking a depth of understanding and pushing students to apply, analyze, evaluate and create, not just remember. It’s a goal worthy of our pursuit.

Share your own experiences with PLCs and performance task assessments by clicking the comments option above.

Dealing with Data in Your PLC

October 15, 2012

Much has been said about PLCs and their use of data to inform instruction. What I discovered early in the process of working with PLCs on common assessments was that too much data is detrimental, as is data that is too detailed. I want to advocate a strategy for reporting proficiency that is easy to analyze and easy to respond to. It consists of two basic steps.

1. Identify general categories (solving equations, lab safety, brake systems)

You don’t want to know how every student did on every question. You really don’t even want to know how the class as a whole did on every question. What you do care about is student proficiency in broad categories. Does Jakeem know the process for changing an oil filter? Can he change brake pads? Keep the topics broad by basing them on your essential standards or on topics designated by the PLC as targets for the grading period.

2. Reduce proficiency data to YES/NO

The grade book generally requires definite grades, but in the process of determining how to respond to an assessment, whether formative or summative, you can make your life easier by reducing those numbers to a simple YES or NO. Is Jakeem proficient in lab safety or not? Does he understand food chains or not?

So you end up with something that looks like this Proficiency Table.

This report tells the teacher to deliver a whole new lesson on subject-verb agreement, perhaps with a new approach or just a lot more practice. For pronouns, however, only four students require intervention, while the others might focus on other topics.
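If your PLC keeps its scores in a spreadsheet export, that reduction can even be automated. Here is a minimal sketch, assuming each student has a raw percentage score per category and that 70 is the proficiency cutoff; the names, scores, and cutoff are placeholders, not a district standard.

```python
# Sketch: reduce raw category scores to a YES/NO proficiency table.
# The students, scores, and the 70% cutoff are illustrative assumptions.
CUTOFF = 70

scores = {
    "Jakeem": {"subject-verb agreement": 55, "pronouns": 82},
    "Maria":  {"subject-verb agreement": 61, "pronouns": 45},
    "Tyler":  {"subject-verb agreement": 68, "pronouns": 90},
}

categories = ["subject-verb agreement", "pronouns"]

# Print one row per student, YES/NO per category.
print(f"{'Student':<10}" + "".join(f"{c:<25}" for c in categories))
for student, by_category in scores.items():
    row = "".join(
        f"{'YES' if by_category[c] >= CUTOFF else 'NO':<25}" for c in categories
    )
    print(f"{student:<10}{row}")
```

Swap in your own categories and cutoff; the point is simply that every cell in the table answers one question: proficient or not.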

I hope this strategy will help your PLC become more efficient and more effective as you attempt to improve student learning by responding to data. As always, let me know how I can help.

A Model PLC

October 10, 2012

The math department has always supported each of its teachers and has long attempted to align, if not instruction, then at least outcomes from one class to the next. That history has made the transition to developing PLCs relatively natural, as the department’s teachers have essentially worked in PLCs for years. Their weekly PLC meetings this year, if not perfect, are a genuine model of what PLCs should do to make instruction work for students.

How Math I Began The Year
All Math I teachers began the school year by giving the same pretest. That test provided data on several mathematical skills–requisite skills for success in Math I–for every student enrolled in the course. The data was terrifying. It showed just how ill-prepared this new crop of students was for the course they had just begun. The important part, however, was not the data itself. It was the response to the data. The PLC found itself immediately revising curriculum, figuring out where they really needed to begin instruction, where they needed to plug holes, and what they could do to help students succeed in Math I.

The process of creating this assessment, implementing it, processing its data, and responding to that data through curriculum design and instructional strategy–that’s what PLCs do. And that’s what is going to help those students, no matter which teacher stands at the front of the class, learn all they can.

Six Weeks In
It’s that time, and it snuck up quickly. The geometry PLC took to heart the command to implement six-weeks assessments. They designed several very short, topical quizzes. Without warning, review, or preparation beyond the instruction that had taken place over the past six weeks, they administered those quizzes. One might show that an entire class comprehends parallel and perpendicular lines. Another might show that 48% of students don’t understand right triangles.

The first tells the teacher to keep moving. The second says review, reteach, intervene. And it even tells them who needs that intervention and who does not. Again, the assessments, and I promise you they are simple but well crafted, provide the data that shapes instruction. Perhaps more importantly, the need to respond prompts important conversations. A teacher will not sit with a PLC, reflect on unsatisfactory data, and decide to keep moving forward regardless. At the very least, that teacher will pursue solutions to the problem. And four great teachers tackling one problem are sure to produce a better response, one that impacts not just the 34 or 68 or 112 students fortunate enough to have a certain teacher, but every student supported by that PLC.
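To make that keep-moving-or-reteach decision concrete, here is a small sketch of how a PLC might summarize those topical quizzes. The topics, the results, and the 80% “keep moving” threshold are invented for illustration, not the geometry PLC’s actual data.

```python
# Sketch: summarize topical quiz results and flag topics that need reteaching.
# Results and the 80% threshold are illustrative assumptions.
KEEP_MOVING_THRESHOLD = 0.80

results = {
    "parallel & perpendicular lines": {"Ana": True, "Ben": True, "Cara": True, "Dev": True},
    "right triangles":                {"Ana": True, "Ben": False, "Cara": False, "Dev": True},
}

for topic, by_student in results.items():
    proficient = sum(by_student.values())          # True counts as 1
    rate = proficient / len(by_student)
    needs_help = [name for name, ok in by_student.items() if not ok]
    if rate >= KEEP_MOVING_THRESHOLD:
        print(f"{topic}: {rate:.0%} proficient -- keep moving")
    else:
        print(f"{topic}: {rate:.0%} proficient -- reteach; intervene with {', '.join(needs_help)}")
```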

It Ain’t Easy
Talking to math teachers, I don’t hear tales of glory. I don’t hear the teachers praising the measurable benefits of the PLC. I hear grit. I hear labor. But I also hear a unified effort to identify and overcome the challenges of everyday life in the mathematics classroom. It’s not pretty, but it makes a difference.

PLCs on September 20

September 19, 2012

Today is our first early release day, and I hope the time you take to work with your PLC will be productive. We want to start by making sure everyone is clear on a few common questions regarding the Common Core and Essential Standards.

Read this Common Core FAQ if you have questions about Common Core.

PLCs are going to be extremely important during the next few years in facilitating the transition from the old standards and assessments to the new ES, CCSS, EOCs, and MSLs. If you want to see today’s presentation again, click here.


If you haven’t completed your proposal in My Learning Plan, take care of that now. This video should help you if you need it:

PLC_Proposal_MLP

If that didn’t help, contact Mike Flinchbaugh.


Now for the real work of the PLC, which should grow from and be based on student learning. Here’s a little reminder of What PLCs Do and What Your PLC Might Do Today. And one last thing: resources. There’s a ton of material out there for us to use but never enough time to sift through it. The PCS core area curriculum specialists are doing their best to sort through materials and post them to the PCS Core Website.

Getting Started in Your PLC

September 11, 2012

Meeting with your PLC
At the beginning of the year, which now seems so far away, we talked about PLCs meeting regularly. Specifically, we identified the second Wednesday of each month (knowing that you might choose to meet more frequently or at different times), early release days, and teacher work days as times for PLCs to meet, collaborate, and produce. Tomorrow is the second Wednesday in September, so PLCs, unless you have identified a different time, should be meeting this week.

Remember that you should participate in a PLC for every course you teach that is also taught by another teacher on campus. Each course, not each course area, should work as a PLC. Those who are involved in PLCs with teachers at other schools, especially in CTE and the arts, should meet according to Ms. Trueblood’s or Ms. Behan’s instructions.

Goal Setting
PLCs should be actively and collaboratively engaged in pursuit of a limited number of SMART goals, focused on student performance.

S = specific
M = measurable
A = attainable
R = relevant
T = time bound

If you haven’t set goals for student performance, take care of that this week. You will also want to enter your goals into My Learning Plan. I made a Jing video to show you how to do that. Here’s the link: PLC Proposal. Call me if you need any help, or invite me to attend your PLC meeting so I can walk the group through the process. It’s pretty simple, and it should help you keep an eye on the goals you want to achieve and your tools for measuring that achievement.

Where we’re going with this...
Our end goal is to make sure every student learns what they need to learn in our classes, no matter who the teacher is, no matter what the class is. Ideally, a common set of goals, and collaboration toward the accomplishment of those goals, will help teachers succeed with every student. Our tools for doing this: 1) some form of common assessment, 2) data from those assessments, 3) analysis of and response to that data.

So, for example, every biology teacher gives a common assessment at the end of week five. That assessment shows which students can explain how cells reproduce, can make safe decisions about lab work, and can explain how plants make food. They chart the data, reteach the kids who weren’t proficient on any of the established goals, modify instructional strategies, adapt, improvise, overcome…or something like that. The idea is that the data guides that collaboration because it shows PLCs and individual teachers the gaps that exist in the development of students’ learning.

This semester–in fact very soon–PLCs should be developing assessment tools to measure whether students can do what they need to do. These tools–call them benchmarks or common assessments–should tie in directly to the goals set by the PLC, and will, most likely, focus on the five essential skills you identified at the beginning of the year. For now, take one step at a time. Meet. Set your goals. Discuss the pursuit of those goals.