Welcome to my ramblings!

Welcome to my blog. Here you can find the ramblings of an old high school principal. I've created a number of blogs over the years for a variety of reasons; many of them I use with my staff, and those are password protected from the outside world. This blog is for my fellow educators and anyone else who wants to read the ramblings. My target audience is building administrators, future administrators, teachers, and educators in general.

Wednesday, February 5, 2014

Hess and Rigor


Last spring I was sitting around a table with my Instructional Coach, Fred Hollingshead, after completing a meeting with my Leadership Team about rigor. We had just developed a definition for rigor, and I asked, "How can we measure rigor in our classrooms?" Fred was quick to answer, "Hess Matrix." My reply: "How can the Hess Matrix measure rigor?" He reminded me of the Webb's training he had done the previous semester with our staff. Using our staff's knowledge of Bloom's and Webb's, we could apply the Hess Matrix to our teachers' assessments.

So in March (2013) we used our inservice to train our staff on the Hess Matrix. We identified each category on the matrix by a number pair: Bloom's level first, Webb's DOK level second. For example, "Bloom's Remember" at "Webb's DOK Level 1" became our 1-1. Our goal was to get our teachers to focus on 4-1 or higher. (See table.)
                Webb's 1   Webb's 2   Webb's 3   Webb's 4
Remember          1-1        1-2        1-3        1-4
Understand        2-1        2-2        2-3        2-4
Apply             3-1        3-2        3-3        3-4
Analyze           4-1        4-2        4-3        4-4
Evaluate          5-1        5-2        5-3        5-4
Create            6-1        6-2        6-3        6-4
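For anyone who wants to track this scheme in a spreadsheet or script, the labeling is easy to encode. Here is a minimal sketch; the level names and the "4-1 or higher" goal come from our scheme above, but the function and variable names are my own:

```python
# Illustrative sketch of the Bloom's x Webb's labeling described above.
# Level names follow the table; function names are hypothetical.

BLOOMS = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]
WEBBS = [1, 2, 3, 4]  # Webb's DOK levels

def cell_label(bloom: str, webb: int) -> str:
    """Return the matrix code, e.g. ('Analyze', 1) -> '4-1'."""
    return f"{BLOOMS.index(bloom) + 1}-{webb}"

def meets_goal(label: str) -> bool:
    """Our goal was Bloom's Analyze (level 4) or higher, at any Webb's level."""
    bloom_level = int(label.split("-")[0])
    return bloom_level >= 4

print(cell_label("Analyze", 1))  # 4-1
print(meets_goal("4-1"))         # True
print(meets_goal("2-3"))         # False
```

The point of the two-number code is simply that a teacher can classify every question on an assessment and then count how many land at 4-1 or higher.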
During the March inservice our teachers noticed that most of their questions fell in the lower levels (the upper-left cells of the table above). A number of teachers jumped the gun and started discussing how they could change their assessments to ask higher level questions. At that moment I stopped them. I told them they could make their changes, but only after we had some baseline data on where their assessments stood when they walked in that day. So totals were taken and submitted to the department chairs, who created a total for each department.

Then we started the conversations, and boy were there conversations. The conversations were rich, and in time I may blog on a few of them; for now, here are just a few of the topics:
  • The type of questions can create rigor.
  • How a simple word change could change the rigor of a question.
  • The role of lower level questions.
  • The science department felt their subject automatically lent itself to rigor.
  • Fine Arts got involved in what counts as "Create" versus "Apply" for a skill.
After walking through the matrix and allowing the discussions to take place for a while, we broke out into our departments after lunch. Teachers were asked to actually project questions on the board and have discussions about the rigor of their questions as well as ways to make them more rigorous.

In the meantime, I took the totals from each department and created our baseline data. It should be noted that while the chairs turned in the total number of questions for each category on the matrix, they also included totals for individual teachers. We chose not to look at individual teachers to begin with; our focus was on the departments, to get our school moving forward.
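For those curious, the roll-up itself is nothing fancy: each teacher's counts per matrix cell get summed into a department total. A minimal sketch, with made-up numbers and hypothetical names purely for illustration:

```python
from collections import Counter

# Hypothetical per-teacher tallies: matrix code -> number of questions.
teacher_tallies = {
    "Teacher A": Counter({"1-1": 12, "2-1": 8, "4-2": 3}),
    "Teacher B": Counter({"1-1": 9, "3-2": 5, "5-3": 2}),
}

# Department baseline: sum the counts without singling out any one teacher.
department_total = Counter()
for tally in teacher_tallies.values():
    department_total += tally

print(dict(department_total))
```

Keeping the individual tallies on file while only publishing the department sums is what let us revisit teacher-level results later, when the December finals raised questions.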

So starting in March (2013) we asked our staff to take one of their assessments each month and break it down on the Hess Matrix. To make sure our staff stayed on the same page, we revisited the matrix with a review during the August inservice.

Below is the growth for our building. We decided to lump categories together to make our growth easier to track visually.

Our results showed that our lower level questioning, Bloom's 1 & 2, declined. Even our Webb's 1 & 2 showed a little decline. On the other hand, we saw Bloom's 3 & 4 and 5 & 6, as well as Webb's 3 & 4, climb. Through the fall semester we were excited with the results. It should be noted that when we talked about increasing rigor on our assessments, our staff realized they would also have to increase rigor within their instruction. If we didn't increase rigor within our instruction, we couldn't expect our students to do well on our assessments.

As we enter our third semester of tracking, two issues have surfaced.
  • Tracking student scores - While we were tracking the assessments, we were only tracking what our teachers were doing. We did not have any data on how our students performed on these higher level questions. We understood that when we started, but we did not want to overload our staff with the challenge of tracking individual student data. There was some discussion about trying this during the second semester, but the staff wasn't ready. Now that we have entered the third semester, our core teachers seem to be ready. We have asked them to identify 1 or 2 higher level questions on their assessments and record how their students perform.
  • December final - We require our teachers to give a final at the end of each semester. We do allow each teacher to choose whether it is comprehensive or just a test over the last unit; most choose comprehensive. When the results from the finals were tabulated, we saw a major drop in all the categories that had seen growth over the past months, as well as an increase in the lower level questions. This led to even more conversations on why. While those discussions were taking place, we decided to take a hard look at individual teachers' results, which led us to a discovery: department scores did not represent the majority of the teachers. For example, the social studies department has 7 teachers. Two of those teachers had finals with at least 100 questions, many of which were low level. One of those teachers had actually given a two-part final, with the early portion given a week in advance; that early assessment had higher level questions that were not included in the matrix results. The other 5 teachers had created finals with very few questions, but the ones they created were higher level. These shorter finals were also comprehensive, like the ones with over 100 questions. One of those teachers even said that he had only 5 questions (all higher level) and his students did better on this final than in the past few years.
Side note: What do teachers think about all of this?
  • Some still see this as something they have to do and a waste of their time. But let me share one of those groups' comments. During their PLC they were asked, "How has the Hess Matrix impacted your assessments?" A number of them said, "Not at all." Yet, as they continued talking, it was clear they were far more focused on asking higher level questions than before.
  • We have one department that still feels their subject is rigorous by itself, and they are struggling to make changes. Many of their test questions continue to be lower level, with right and wrong answers.
  • On the positive side, many of our teachers are more focused than ever on finding ways to create higher level questions. This includes finding ways to ask higher level questions that still allow students to give those lower level answers.
  • Grading time continues to be a challenge. Clearly, moving from multiple choice, matching, and true/false to more open-ended questions requires more grading time from teachers. This is why we are working with our teachers to ask fewer questions or to break their assessments into multiple smaller assessments.
In conclusion, our plan is to continue to collect the matrix data monthly. I hope to have some conversations about the May final in hopes of getting better results. We will also have discussions over the summer about our next step as we enter the fall 2014 school year.
