Saturday, February 2, 2013

Pre-Assessments: Lots of Room to Grow!

This past week, my students completed pre-assessments to measure their current knowledge of the reading skills and science content that I will teach over the course of the study.  The results are in, and it looks like my student participants have a lot of room to grow, especially in science!


The reading pre-assessment tested several different skills, including using text features to read non-fiction, main idea and details, and cause and effect.  The bulk of the test focused on non-fiction text structures, both because that is such a broad topic and because most of the instruction during the unit will center on introducing and developing those skills.  The assessment included two page-long reading passages (both non-fiction) and was a combination of multiple choice questions, short answer questions, and questions requiring one- or two-word answers.  Students read the passages on their own and answered the questions with no help from me.  Data from the assessment are shown below, followed by a short sketch of how I summarized the scores:
  • Range of scores: 16% - 89%
  • Mean score: 64%
  • Median score: 65%
  • Average percentage of text feature questions answered correctly: 66%
  • Average percentage of main idea and details questions answered correctly: 65%
  • Average percentage of cause and effect questions answered correctly: 52%
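For anyone curious how I summarized the raw scores, here is a minimal sketch in Python. The numbers in it are placeholders, not my students' actual data; only the calculations mirror the summaries above.

```python
# Minimal sketch of how the summary numbers above can be computed.
# The score lists are placeholders, NOT my students' actual data.
from statistics import mean, median

# Each student's overall reading pre-assessment score (percent).
overall = [16, 42, 55, 65, 71, 78, 89]  # placeholder values

print(f"Range of scores: {min(overall)}% - {max(overall)}%")
print(f"Mean score: {round(mean(overall))}%")
print(f"Median score: {round(median(overall))}%")

# Per-skill averages: each student's fraction of that skill's
# questions answered correctly (placeholders again).
skills = {
    "text features": [0.50, 0.70, 0.80],
    "main idea and details": [0.55, 0.65, 0.75],
    "cause and effect": [0.40, 0.50, 0.65],
}
for skill, fractions in skills.items():
    print(f"{skill}: {round(mean(fractions) * 100)}% correct on average")
```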
When I looked at these scores, I noticed several things.  The students who scored at the top are the students who typically perform the best in my class, so these test results seem to correlate with typical student performance.  Likewise, the students who scored the lowest on the assessment are those who typically perform lower in my class.  On average, students performed worst on the cause and effect questions.  Is this because there were so few of them or because the students have a more difficult time understanding cause and effect relationships within non-fiction texts?

My science pre-assessment was similar to the one given in reading.  It covered the science concepts I will teach during the unit.  To develop the assessment, I took the targets from each unit (which I wrote using the Core Content for Assessment) and created questions to show whether or not a student met each standard.  Some standards had multiple targets, so they got multiple questions; other targets were narrower, so they got fewer.  The questions were a combination of multiple choice, true/false, listing, and short answer items.

One major difference between this pre-assessment and the reading one is that I read the science pre-assessment aloud.  I did not offer any help or hints about what words meant or what the answers were, but I did read the questions and answer choices (if applicable) to the whole class.  The reason for doing this was to measure only the students' knowledge of science content.  If students had been expected to read the assessment and answer the questions themselves, it would have assessed both their reading capabilities and their knowledge of science content.  I realize that in real life students will be required to read assessments (I'm thinking of the KPREP assessment), but for the purposes of this study, I wanted to measure only what they knew about science before and after the unit.  When I give students the post-assessment, I will follow the same procedure, reading the questions and answer choices to them but giving no help or guidance.  Data from this assessment are shown below:
  • Range of scores: 3% - 33%
  • Mean score: 21%
  • Median score: 22%
  • Average percentage of Unit 1 questions answered correctly: 29%
  • Average percentage of Unit 2 questions answered correctly: 23%
  • Average percentage of Unit 3 questions answered correctly: 97%
  • Average percentage of Unit 4 questions answered correctly: 9%
Again, I saw several patterns in the data.  Students performed worst on the Unit 4 questions.  As I considered why, I looked back at the questions.  Six of the seven questions for this unit required short answer responses, whereas other units contained more multiple choice questions.  Short answer questions are much more difficult for students to answer correctly when they don't know the content, because they can't just guess a letter choice or take a stab at writing down the correct term; they have to fully understand the concept.  Though I realize these questions are more difficult, I wrote them this way intentionally because they take several smaller concepts about the sun, moon, and Earth and combine them to discuss the effects these bodies have on each other.
Conversely, the students performed best on the Unit 3 questions.  These questions are all multiple choice, which tends to be easier because students can simply guess even if they don't know the correct answer.  Again, I tried to match the level of understanding needed for each concept to the level of difficulty of the question, and I hope that students show more understanding of both units' concepts by the end of this study.  Units 1 and 2 were similar in their averages.  Both had some listing questions and multiple choice questions.  Unit 2 also had three short answer questions; most students had no idea what to even guess on these, and it showed in their performance.
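To put the guessing advantage in perspective, here is a quick Python simulation of what pure guessing earns. It assumes four answer choices per question, which is an assumption for illustration rather than a description of my actual test:

```python
# Quick simulation of the "guessing floor" on multiple choice items.
# Assumes 4 answer choices per question -- an assumption for
# illustration, not necessarily my test's actual format.
import random

def guess_score(num_questions: int, num_choices: int = 4) -> float:
    """Percent correct when a student guesses randomly on every question."""
    correct = sum(1 for _ in range(num_questions)
                  if random.randrange(num_choices) == 0)
    return 100 * correct / num_questions

random.seed(1)
trials = [guess_score(10) for _ in range(1_000)]
print(f"Average score from pure guessing: {sum(trials) / len(trials):.1f}%")
# Prints roughly 25% -- a floor that short answer items don't provide.
```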
One last observation on these scores is that the students who scored highest on the science pre-assessment were not necessarily the students who typically perform highest in my class, and the same is true of those who scored lowest.  This is interesting, but since these students have likely had much less exposure to science content than to reading, I don't find it too out of the ordinary.

After looking at the data, perhaps the most important finding is that students performed far better on the reading pre-assessment than on the science pre-assessment.  This supports the idea that students receive a large amount of reading instruction but very little content area instruction.  I hope that by the end of my study, students are able to perform well on both assessments!
