Friday, February 1, 2008

What types of subjects and questions are included on the PSSA test?

The PSSA tests proficiency in Math and Reading, and Science will soon be added. The test includes two types of questions: multiple choice and open ended. (Open ended questions are ones that require a written response.) Multiple choice questions in both Math and Reading are worth one point each. Open ended Reading questions are worth 0-3 points, and open ended Mathematics questions are worth 0-4 points. (DRC Tech Report gr 4, 6, and 7 pg. 10)
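To make those point values concrete, here is a minimal sketch in Python of how a raw score could be tallied under those rules. The item records, the raw_score function, and the MAX_POINTS table are hypothetical illustrations, not the actual DRC scoring program.

# Maximum points per item, per the values described above (hypothetical layout).
MAX_POINTS = {
    ("Math", "multiple_choice"): 1,
    ("Reading", "multiple_choice"): 1,
    ("Reading", "open_ended"): 3,
    ("Math", "open_ended"): 4,
}

def raw_score(responses):
    """Sum the points earned on each item, capped at that item type's
    maximum value (an illustrative tally, not DRC's scoring code)."""
    total = 0
    for subject, item_type, points_earned in responses:
        total += min(points_earned, MAX_POINTS[(subject, item_type)])
    return total

# Example: two correct Math multiple choice items plus a Reading open
# ended response scored 2 of 3 gives a raw score of 4.
print(raw_score([("Math", "multiple_choice", 1),
                 ("Math", "multiple_choice", 1),
                 ("Reading", "open_ended", 2)]))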

In the 2006 PSSA there were 16-20 different test forms per grade level, and all of the different forms are used in any one classroom. (DRC Tech Report gr 5, 8, 11 pg. 10) (DRC Tech Report gr 4, 6, 7 pg. 10) Each form contains three kinds of questions: core items, matrix items, and field test items. Core items are identical on every form across the grade level and determine the individual student score. (This is the score appearing on the Parent Report.) Matrix items vary from form to form and give the school a random sample of how it is faring in teaching the Academic Standards for each grade level. Because there are many forms, each with different matrix items, more information can be gathered to decide whether the school curriculum is successfully teaching students the standards. A combination of the core item and matrix item scores is used for school level reporting. Field test items are not used in scoring; they are questions that may appear on future PSSA tests, depending on student responses. (DRC Tech Report 5, 8, 11) (DRC Tech Report 4, 6, 7)
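One way to picture this design is a sketch that assembles several forms from item pools: every form gets the same core items, while the matrix and field test items are split across forms. The function name, pool sizes, and per-form counts below are invented for illustration; the real form assembly is done by DRC.

import random

def build_forms(core_items, matrix_pool, field_pool,
                n_forms=16, matrix_per_form=10, field_per_form=5, seed=0):
    """Assemble n_forms test forms (illustrative only). Assumes the
    matrix and field pools are large enough to fill every form."""
    rng = random.Random(seed)
    matrix, field = matrix_pool[:], field_pool[:]
    rng.shuffle(matrix)
    rng.shuffle(field)
    forms = []
    for i in range(n_forms):
        forms.append({
            # Identical on every form; the only items scored for the
            # individual student (the Parent Report score).
            "core": list(core_items),
            # Rotated across forms; combined with core items for
            # school level reporting on the Academic Standards.
            "matrix": matrix[i * matrix_per_form:(i + 1) * matrix_per_form],
            # Unscored; tried out for possible use on future tests.
            "field": field[i * field_per_form:(i + 1) * field_per_form],
        })
    return forms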

Several commonly used statistics help determine whether the questions are fair. For field test items, for example, one statistic used is the percentage of students who answered the item correctly. If too many students answered correctly (above 90%) or too few (below 30%), the question is flagged and reviewed before it can be placed as a scored question on a PSSA test. Other statistics answer different questions: Are the students who are “more capable” answering the “easier” questions correctly, and vice versa? Are males and females responding to the items differently? Are Hispanic students responding differently than Caucasian students? If the statistics show a problem with any question, the item is either discarded or reviewed and revised. (DRC Tech Report 5, 8, 11)
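The percent-correct screen is simple enough to sketch. The code below flags items whose percent correct falls outside the 30%-90% band mentioned above; the data and function names are hypothetical, and the other checks (gender or ethnic group differences, and whether stronger students get the easier items right) involve more elaborate statistics than shown here.

def flag_items(item_responses, low=0.30, high=0.90):
    """Return (item id, percent correct) pairs for items answered
    correctly by too many or too few students."""
    flagged = []
    for item_id, responses in item_responses.items():
        p = sum(responses) / len(responses)  # fraction answering correctly
        if p < low or p > high:
            flagged.append((item_id, round(p, 2)))
    return flagged

# Hypothetical field test data: 1 = correct, 0 = incorrect.
data = {"A": [1] * 19 + [0],       # 95% correct -> flagged as too easy
        "B": [1] * 12 + [0] * 8}   # 60% correct -> acceptable
print(flag_items(data))            # [('A', 0.95)]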

After this process of statistical, committee, and expert review, acceptable items are entered into a computer system. From computer generated cards and graphics, DRC specialists develop the final tests, and the PDE gives final approval of the test as written and submitted by these specialists. (DRC Tech Report 5, 8, 11)