January 23

January 23, 2013

Your work today will be in PLCs. You will need computers and each other. If you borrow a computer, please return it by noon. Here’s the plan for the day:

1. The first task for the day is to review the PCS Curriculum Guides. These guides were created last summer with the knowledge that they would change and grow as we become more adept with CCSS and NCES. Your input will help shape revisions of these documents. The best way to locate them is to start at tinyurl.com/pcscore. Click PCS Curriculum Guides and find your course.

The link for feedback is http://mbcurl.me/3C6N.

2. One of the concerns we’ve heard expressed over and over this year is a lack of available resources for teachers implementing the new standards. While we don’t have funds to purchase resources, there are literally millions of resources out there on the web (a recent Google search for “common core” resources turned up 5.5 million results).

Your task for the day is to review some high-quality sites for use in your classroom. Personnel within the EPS department of Pitt County Schools have identified 50 web resources, covering every subject and grade level, that we feel are high quality, updated regularly, aligned to the new standards, and reliable. The resources are stored on Delicious, a social bookmarking site, for easy access. Today you’ll pick 2 or 3 (and only 2 or 3) to examine in depth and figure out ways to use them in your classroom. We really hope you’ll spend a solid 20-30 minutes examining each individual resource (for a total of 60-90 minutes) rather than skimming 25 or 30 resources and simply bookmarking them. The goal of this activity, just like a goal of the new standards, is to examine something in depth rather than simply cover as many sites as you can.

The link to the Delicious site itself is https://delicious.com/pcsrttt. The worksheet for evaluating the websites, recording your thoughts, and determining how you might use them as a resource is at http://mbcurl.me/3CAD. Please print 2-3 copies of the worksheet for your PLC to use as you review the resources.

Copied from Thomas Feller, successforeverychild.wordpress.com

3. The final activity for your PLC today is collaborating on your own terms. I would like to throw out a few suggestions, but do what your PLC needs to do:

-Discuss and respond to MSLs, especially the constructed response questions. Will your teaching change? Your assessment strategies? How?

-Make an assessment plan for the semester.

-Design a data plan. How will you use data to inform instruction this semester? What kind of data?

-Design common assessments, especially performance task assessments.

-Share resources.

-Plan instruction for the first six weeks.

As with yesterday, our training today is designed to be completed by lunch time so you can spend the afternoon working in your classroom, grading papers, and getting ready for the new semester.

Connect Text Remix Event

December 18, 2012

This week, English students remixed their research papers into new formats. Some chose Xtranormal, some chose Voice Threads, and some chose Pictochart as their tools for reimagining their original text.

Here’s a Voice Thread that tells the story.

NAEP Questions Tool

December 12, 2012

Looking for assessment questions? NAEP’s Questions Tool has a ton, and they look like they should line up beautifully with North Carolina’s Common Exams (a.k.a. MSLs).

Check out the NAEP website.

Or watch this video tutorial.

The Common Core, in a Small Meaningful Way

December 8, 2012

This is Not a Homework Check
“Going over homework” in Jennifer Mabe’s class is far more than a right/wrong self-check. Mabe asks her students, as any teacher might, to announce their answers to questions from the previous night. She follows, not with correction or confirmation, and not with her own demonstration of the correct process for arriving at the best answer, but with an opportunity for other students to challenge the first response with their own. The most important part happens when Mabe asks students to justify why an incorrect answer is incorrect, and why a correct one is correct. Mabe’s students are able to identify where negatives were neglected, order of operations not followed, and concepts misconstrued.

Small as this homework check detail might seem, its value is substantial for two reasons. First, it promotes two of the eight mathematical practices:
3. Construct viable arguments and critique the reasoning of others.
6. Attend to precision.
While teaching her students algebra, Mabe is also teaching cross-curricular habits of mind (reasoning, precision) that will serve them well as lifelong learners. She’s teaching them to think math, and teaching them to think.

Beyond training her students to think math, Mabe requires her students to talk math. To explain why an answer is incorrect, they must, to some extent, use the language of mathematics. Math talk in the mouth of the teacher is fine; it’s like listening to a native speaker. In the mouths of learners, however, talking math develops ownership and mathematical fluency, the same way speaking Spanish helps develop fluency in that language.

Sharks Aren’t Like Dolphins
Clinton Todd wants his students to develop biology fluency as they learn to classify animals. In a recent bell ringer, students decided which animal (turtle, wolf, or shark) the dolphin most resembles, biologically speaking. After a few minutes of independent processing, the first student to respond answered, “shark.” When Todd prodded the student to justify his response, the student offered a detailed comparison of the physical similarities between dolphins and sharks. Todd then opened discussion. What other answers did anyone choose? Why not the shark? Why did you choose the wolf? (The correct answer is the wolf, which, like the dolphin, is a mammal, the biggest hint being the dolphin’s blowhole and lungs, as opposed to gills. Only one student selected the turtle.)

Todd’s line of questioning, and his reaction to correct and incorrect responses, made this bell ringer effective. His reaction remained even and inquisitive, whether the answer was correct or incorrect. What he valued, it appeared, were the reasoning and the thought process the student used to draw a conclusion.

Todd’s activity worked much the way Mabe’s did. His students engaged in science talk, using the language of biology themselves, instead of merely hearing it from the teacher, and they engaged in the thought processes of a biologist, observing, classifying species, and, just as Mabe’s students did, verbally justifying their responses with reasoned explanations.

Common to the Core
These strategies are not new, not by any stretch of the imagination, and they aren’t spectacular. The thing is, they don’t have to be. Nothing about Common Core has to be spectacular, flashy, or funky. What CC does have to do is place the challenge of critical thinking and of developing content-specific literacy on the student. It’s simple enough to see how all this fits into your own discipline, but here are a few questions that might guide you:

  • Are your students solving a problem, instead of mimicking, copying or regurgitating?
  • Have you asked/required/expected/taught your students to justify their solutions/answers?
  • Do you value the students’ reasoning process?
  • How many times during the course of a period does every student use the language of the course?

I would like to make another point about the Common Core standards. They are common. They are the expectations for all students. All students, that is, must be expected to demonstrate these thinking skills in the various courses they take. The two classes described above were not AP or honors courses. Todd’s was a typical standard biology class. Mabe’s was a year-long (read: not mathematically inclined) Math I class. Mabe and Todd expected these thinking skills from all students, and by this point in the school year, they can see the results.

Computers or iPads?

December 3, 2012

Our school has gone from 0 to 60, and fast. Ok, it’s actually 0 to 11: from 0 mobile labs to 10 mobile computer labs (8 of them dedicated to a single classroom) + 1 iPad cart. The challenge now is to figure out what to do with them. As this is our school’s first foray into student iPads, I thought I would share some thoughts on when to choose iPads and when to choose computers.

Ye Goode Olde Computre
Let’s start with the familiar: computers. Computers are still our best bet for so many purposes. They are great for conducting research and for producing documents, which should account for the majority of the work we do on computers. They remain an excellent choice for accessing online learning tools like Study Island, Edmodo, and Elements, and for using specialty programs like Photoshop. Computers also give students access to web-based programs like SAS, Xtranormal, and Voice Threads. The short of it, for now anyway, is that computers remain your best and most versatile bet for tech.

Ladies and Gentlemen, The iPad
So why are we even talking about iPads? They aren’t great for producing documents, though I admit that I am typing this blog on my iPad. And they don’t support Flash, which means you can’t run web apps like SAS Curriculum Pathways on them.

So what makes the iPad awesome for the classroom? Apps: apps for learning (Khan Academy, Nova Elements, Economist World Figures), and apps for assessment (Educreations, ShowMe, Penultimate). Apps are specialized, generally self-contained programs that allow the user to focus, typically, on a single, specific task. An app like World Figures, which provides an abundance of international statistical data, puts students directly in touch with the information they need.

The iPad also allows student mobility, which means students can collaborate easily, teachers can organize jigsaw activities, or groups can use the device’s photo and video capabilities to record their work. Computers, not so much. The other benefit of the iPad, and it is easy to understate this feature, is that it forces cloud thinking. You can’t just drop files on a flash drive, and you can’t just open My Documents. Instead, students must get used to cloud-based storage and transmission (think Google Docs and Dropbox), which will be the standard for file management before they finish college.

A Guide to Help You Decide

  • Use the iPad if you want students to move around and engage each other off screen; use the computer if students can be stationary and interaction is solely digital; use either if you don’t care how they interact.
  • Use the iPad if you have a specific iPad app that you want students to use in class; use the computer if you have a specific program, not available on the iPads, that you want students to use; use either if the program/app you want students to use has both web and iPad versions (Voice Threads).
  • Use the iPad if you want students to create videos or annotated recordings (Educreations) or images (Penultimate); use the computer if you want students to create documents (Word, Google) or presentations (Prezi, PowerPoint).
  • Use the iPad if you want students to gather information/ideas from specific sources best accessed through an app (Oyez Today, Nova Elements); use the computer if you want students to both conduct research and produce substantial written text about their findings; use either if you want students to research information widely available on the web.
  • Use the iPad if you want to engage students with interactive apps like Sketch Explorer or Tap Quiz Maps; use the computer if you want to use Flash-based programs like SAS Curriculum Pathways, which are not available for the iPad; use either if you want students to share in a common, digital space (Edmodo, Twitter).

 The fact is, these two options are growing closer and closer to each other in terms of their possibilities and usefulness. The question is not necessarily which device to use, but how to use the device at your disposal to accomplish your desired goal. Chances are your colleagues, your media coordinators or your IC can help you find a solution to whatever tech challenges you might have.

 

Questions? Comments? Suggestions? Post them to Comments (see link above) or e-mail flinchm@pitt.k12.nc.us 

Techno-Performance Task Assessments

November 30, 2012

This post is the second in a series of three about implementing performance task assessments, an important part of the Common Core/Essential Standard shift.

The Challenge
Creating and administering common assessments is seldom easy work, and sometimes it is incredibly challenging. Take, for example, the challenge of assessing reading, writing, speaking, and listening in world languages. Spanish teachers must assess student development in these four areas at intervals throughout the semester. Of course, their daily work, through workbooks, projects, quizzes, tests, and general instruction, gives them a picture of students’ achievement in particular skills, but that big, 4-strand picture is tough to assess.

Most difficult of all are the speaking and listening strands, as they require one-on-one assessments (imagine having to test 30 individual students fairly as they explain why it will take two trains, one traveling east, the other west, four hours and thirteen minutes to meet in St. Louis). The Spanish I teachers figured it out.

Assessing Listening
To assess students’ ability to comprehend spoken Spanish, Ms. Haynes, Ms. Dunham, and Ms. Watson created a video of six native Spanish speakers (plus one Japanese student, just for kicks) talking about themselves and their preferences. For the assessment, students watched the video and charted details about any four of the speakers. This assessment told the teachers which students were able to listen to Spanish and extract information from the speaker as they might need to in an actual conversation.

Assessing Speaking
My wife likes to tell the story of how her friend Crystal got the entire class out of a speaking test in Spanish II. When it was Kerri’s turn to take the test–there was only one cassette machine for playing and recording–she pushed play, and all she heard was Crystal’s deep drawl, saying, “Hellllloooooo. I don’t hear anythang. Heeellllllooooooo.”

The Spanish teachers came up with a great solution to the challenge, and they were able to eliminate what I will call the Crystal effect. They created Google Voice accounts. Google Voice provides you a phone number and can direct calls to all of your phones, so you never miss a call. The key with this Spanish assessment, however, was to miss the call. Google Voice redirects missed calls to voicemail and records messages as MP3s.

The teachers had their students call their Google Voice numbers all at once and answer, in their best Spanish, two questions provided by the teacher. To assess students’ performances on the task, the teachers opened their Google Voice accounts, clicked on the files, and listened to them. Since the files are MP3s, the teachers can easily move them into students’ digital linguafolios, so they can track student development throughout the year, or even as students progress through multiple levels of Spanish.

So What?
So what? Are you kidding me? That’s awesome, and not just because it’s a cool use of technology that averted the Crystal effect. What’s really awesome is this: the assessments tell teachers whether their classes as a whole are on track with reading, writing, speaking, and listening, and they help teachers identify which students are not progressing in each of the four strands. By delivering a common assessment with a common rubric and collaborating on the evaluation, the teachers cannot help but see their own strengths and weaknesses. It is inevitable, for example, that a teacher whose students’ listening skills fall noticeably below the average will seek to improve that area with the assistance of colleagues. The process ferrets out shortcomings and begs us to respond.

Taking Risks
The Spanish teachers will tell you this process was not without flaw. The sort of risk they took in creating, delivering, and evaluating the assessment was huge and uncomfortable. It is that kind of risk that inspires growth, and growing is a darn good thing.

Leave your comments if you wish, or contact me directly at flinchm@pitt.k12.nc.us to collaborate with your PLC or to discuss assessment, instruction, or technology.

Responding to Performance Task Assessments in Your PLC

November 27, 2012

On October 26, PLCs created culminating performance task assessments. The results were fantastic. My next few blog posts will describe examples. I hope you will take from these examples some ideas for your own PLC, along with the motivation to continue and increase this work on your own. The challenge is great, and time is precious, but the work seems well worth the outcome.

PTA in the C&E PLC (OMG)
It took several hours, but the Civics and Economics PLC produced a clever and, more importantly, a meaningful performance task assessment on our 10/26 PD day. Their purpose was to assess students’ understanding of the US Constitution, Supreme Court precedents, and the ways these documents shape legal decisions. In doing so, they assessed students’ recall of information as well as their ability to apply that knowledge and think critically about legal systems. Check out the C&E PTA.

In short, their assessment asked students to take a legal position either for or against Michael, a male student attempting to enroll in an all-women’s college, and defend that position using amendments and SCOTUS cases. It clearly works on multiple levels of Revised Bloom’s Taxonomy and provides the PLC with an abundance of information about their students’ attainment of Common Core and NC Essential Standards.

After a collaborative assessment effort, here’s what they found:
1. Motivated responses: Almost every student, even those who rarely participate in class activities, engaged with this written assessment, and many wrote more than the teacher expected.

2. Comprehension: Many students performed better than teachers expected. The underlying assumption seemed to be that students who struggle (or don’t bother to struggle) to memorize the information they are taught would have no chance of applying knowledge to a complex problem. In fact, students frequently applied their knowledge to demonstrate some depth of understanding about a topic, like the First Amendment or Plessy v. Ferguson.

3. Areas of need: The purpose of assessment is to shape future instruction, whether it be remediation, review, or forward progress. C&E teachers could tell which topics, like the First Amendment, their students could recall and understand, and which they could not. They also discovered a specific conceptual shortcoming in most responses. Students applied Plessy v. Ferguson appropriately, except that they failed to understand that the Brown v. Board decision upended Plessy. This information showed the teachers the need to review the Brown case and to clarify, perhaps in another context, what happens when a more recent SCOTUS decision eliminates or modifies an older one. When we talk about depth of understanding, as opposed to breadth of knowledge, this example is exactly what we’re talking about.

4. More of it: The C&E teachers agreed that, though time to work together is painfully rare, they would like to continue with this sort of assessment by creating a PTA for their economics unit. Ultimately, this sort of assessment has tremendous potential. It focuses on depth and works at multiple levels of the taxonomy. When we assess in this way, we will also teach in this way, seeking a depth of understanding and pushing students to apply, analyze, evaluate and create, not just remember. It’s a goal worthy of our pursuit.

Share your own experiences with PLCs and performance task assessments by clicking the comments option above.

Dealing with Data in Your PLC

October 15, 2012

Much has been said about PLCs and their use of data to inform instruction. What I discovered early in the process of working with PLCs on common assessments was that too much data is detrimental, as is data that is too detailed. I want to advocate a strategy for reporting proficiency that is easy to analyze and easy to respond to. It consists of two basic steps.

1. Identify general categories (solving equations, lab safety, brake systems)

You don’t want to know how every student did on every question. You really don’t even want to know how the class as a whole did on every question. What you do care about is student proficiency in broad categories. Does Jakeem know the process for changing an oil filter? Can he change brake pads? Keep the topics broad by basing them on your essential standards or on topics designated by the PLC as targets for the grading period.

2. Reduce proficiency data to YES/NO

The grade book generally requires definite grades, but in the process of determining how to respond to an assessment, whether formative or summative, you can make your life easier by reducing those numbers to a simple YES or NO. Is Jakeem proficient in lab safety or not? Does he understand food chains or not?

So you end up with something that looks like this Proficiency Table.

This report tells the teacher to deliver a whole new lesson, and perhaps a new approach or just a lot of practice, on subject-verb agreement. For pronouns, however, only four students require intervention, while the others might focus on other topics.
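If your PLC keeps its scores in a spreadsheet or a short script, the reduction itself takes only a few lines. Here is a minimal sketch in Python, assuming a hypothetical 70% cutoff and made-up scores (only Jakeem’s name comes from the example above), just to illustrate the two steps: broad categories, then a YES/NO call for each one.

```python
# Minimal sketch of the two-step reduction: broad categories, then YES/NO.
# The 70% cutoff, the second student, and all scores are assumptions for
# illustration, not a district standard.
CUTOFF = 0.70

# Step 1: raw percent-correct by broad category, not by individual question.
scores = {
    "Jakeem": {"subject-verb agreement": 0.55, "pronouns": 0.80},
    "Maria":  {"subject-verb agreement": 0.62, "pronouns": 0.45},
}

# Step 2: reduce every number to a simple YES/NO proficiency call.
proficiency = {
    student: {topic: ("YES" if pct >= CUTOFF else "NO")
              for topic, pct in topics.items()}
    for student, topics in scores.items()
}

for student, topics in proficiency.items():
    print(student, topics)
# Jakeem {'subject-verb agreement': 'NO', 'pronouns': 'YES'}
# Maria {'subject-verb agreement': 'NO', 'pronouns': 'NO'}
```

A spreadsheet with a simple IF formula accomplishes the same thing; the point is that the table your PLC responds to contains only YES and NO, not a grid of percentages.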

I hope this strategy will help your PLC become more efficient and more effective as you attempt to improve student learning by responding to data. As always, let me know how I can help.

A Model PLC

October 10, 2012

The math department has always supported each of its teachers and has long attempted to align, if not instruction, then at least outcomes from one class to the next. Their history has made the transition to developing PLCs relatively natural, as they have essentially worked in PLCs for so long. Their weekly PLC meetings this year, if not perfect, are a genuine model of what PLCs should do to make instruction work for students.

How Math I Began The Year
All Math I teachers began the school year by giving the same pretest. That test provided data on several mathematical skills–requisite skills for success in Math I–for every student enrolled in the course. The data was terrifying. It showed just how ill-prepared this new crop of students would be for the course they had just begun. The important part, however, was not the data itself. It was the response to the data. The PLC found itself immediately revising curriculum, figuring out where they really needed to begin instruction, where they needed to plug holes, and what they could do to help students succeed in Math I.

The process of creating this assessment, implementing it, processing its data, and responding to that data through curriculum design and instructional strategy–that’s what PLCs do. And that’s what is going to help those students, no matter which teacher stands at the front of the class, learn all they can.

Six Weeks In
It’s that time, and it snuck up quickly. The geometry PLC took to heart the command to implement six-weeks assessments. They designed several very short, topical quizzes. Without warning, review, or preparation beyond the instruction that had taken place over the past six weeks, they implemented those quizzes. One might show that an entire class comprehends parallel and perpendicular lines. Another might show that 48% of students don’t understand right triangles.

The first tells the teacher to keep moving. The second says review, reteach, intervene. And it even tells them who needs that intervention and who does not. Again, the assessments, and I promise you they are simple but well crafted, provide the data that shapes instruction. Perhaps more importantly, the need to respond prompts important conversations. A teacher will not sit with a PLC, reflect on unsatisfactory data, and come to the conclusion to keep moving forward, regardless. At the very least, that teacher will pursue solutions to the problem. And four great teachers tackling one problem is sure to produce a better response, and a response that impacts not just the 34 or 68 or 112 students fortunate enough to have a certain teacher, but every student supported by that PLC.

It Ain’t Easy
Talking to math teachers, I don’t hear tales of glory. I don’t hear the teachers praising the measurable benefits of the PLC. I hear grit. I hear labor. But I also hear a unified effort to identify and overcome the challenges of everyday life in the mathematics classroom. It’s not pretty, but it makes a difference.

PLCs on September 20

September 19, 2012

Today is our first early release day, and I hope the time you spend working with your PLC will be productive. We want to start by making sure everyone is clear on a few common questions regarding the Common Core and Essential Standards.

Read this Common Core FAQ if you have questions about Common Core.

PLCs are going to be extremely important during the next few years in terms of facilitating the transition from the old standards and assessments to the new ES, CCSS, EOCs and MSLs. If you want to see today’s presentation again, click here.

 

If you haven’t completed your proposal in My Learning Plan, take care of that now. This video should help you if you need it:

PLC_Proposal_MLP

If that didn’t help, contact Mike Flinchbaugh.

 

Now for the real work of the PLC, which should grow from and be based on student learning. Here’s a little reminder of What PLCs Do and What Your PLC Might Do Today. And one last thing: resources. There’s a ton of stuff out there for us to use but never enough time to sift through it all. The PCS core area curriculum specialists are doing their best to vet materials and post them to the PCS Core Website.