Saturday, April 26, 2008

Pennsylvania Applying for Differentiated Accountability?

I spoke with a source at the Pennsylvania Department of Education about the possibility of Pennsylvania applying for Secretary Spellings' pilot program of Differentiated Accountability. The source said that it appeared Pennsylvania was eligible, but "they" still were not sure whether they were going to apply for the pilot program. The program will accept only ten states, which makes it competitive, and there was only a short time until the suggested application due date of May 2, 2008. After the May deadline, there may be a press release regarding the decision. Since the source considers bloggers to have "reporter status", that was the only information I could obtain. I guess we will have to wait and see!

Thursday, April 17, 2008

Subjectivity in PSSA Scoring

The Technical Report for the 2007 PSSA is on the PA Department of Education website. This report is compiled by DRC to explain the PSSA test development and implementation process for a given year. After reviewing some items from the 2007 Technical Report, I am still upset that the open ended questions have a large degree of subjectivity involved in the grading process. As stated in an earlier post, 10% of the tests are reviewed by more than one "scorer" to determine how often the grade given by one scorer on an open ended question matches the grade another scorer would give the same answer. According to the Technical Report 2007, pg 51, two different scorers have only a 70 to 80 percent chance of giving the same score to the same Reading question. In other words, if two educated and trained professionals read the same answer, they are only 70 to 80 percent likely to give it the same score. Many times the readers will give adjacent scores (adjacent scores are scores that are next to each other, i.e. a score of 2 versus 3). Of course, that would happen when the answer can only receive a score of zero to three. (Three is the maximum number of points allowable for a Reading open ended question. Math open ended questions can receive scores of zero to four points.) To the state's credit, the Math open ended answers have a higher percentage of scoring agreement. Perhaps the agreement is higher because Math is a more cut-and-dried subject than Reading. Then again, perhaps the scorers were just tired the day they reviewed Reading. Maybe even as tired as the child who wrote the answer...

Friday, April 11, 2008

Researching Differentiated Accountability

I just want everyone to know that I am researching whether or not Pennsylvania will be applying for Secretary Spellings' pilot program of Differentiated Accountability. Basically, Differentiated Accountability will allow schools that do not make their AYP numbers to show why they are not making the numbers and follow up with interventions that are appropriate to those reasons. This makes sense in theory, as some schools have a large proportion of their student population identified as needing improvement, while others are not making AYP goals because of a small subgroup of students that may be having trouble reaching the targeted goal. In each of these cases, the changes that would need to be made would be quite different. As far as I have read, only 10 states will be accepted into the program. Two calls to the PA Dept. of Education have not been returned...

Wednesday, April 9, 2008

PSSA Is Finally Over!

Yay! My son finished his PSSA test and I feel like dancing! I am not sure how he did on the test. The teacher said that he really did not feel like working on it the first day and was rather lethargic about the whole thing. The second and third days went a little better, particularly in the Math portion of the test.



He is very glad to be finished with the PSSA prep booklets and vocabulary. In his words, "We get to go back to our regular books (textbooks) now." He is sooo right. Putting off learning the "normal" curriculum to make time for PSSA prep should not occur. After all, if the students are learning what they are supposed to learn, why do we need the PSSA prep?



I know, I know...too much at stake, too many students need the drills to "pass", etc. However, if we just drill the PSSA test for six weeks, the results the PSSA gives are actually distorted. It is really just cramming. There are no follow-up tests to see if the student retains this information.
What a sham! I am glad it is over, for this year anyway!

Monday, March 31, 2008

PSSA Day!

Dear Students,

The American Dream will become one step harder for you today. No longer can you work your way to success no matter what your background, because today you will take the PSSA test. Your parents and teachers will receive a report giving you a label of advanced, proficient, basic, or below basic. Your classes will be picked for you based on this label. Consequently, your career choices will be narrowed based on this one test. Your educational experience has already been narrowed so that you can prepare for these few hours. Some of you will have recess, art, and music taken away to help you score well on this test. Your written answers will be analyzed by "readers" who have their own opinions, resulting in a score that may or may not be fair. The true American traits of "thinking outside the box" and individuality will be demolished as you and everyone else must see the same thing in poetry and stories. Instead of going outside and figuring out how tall your school is using mathematics and formulas, you will stay inside and continue to fill in bubbles. You may work hard in school and get decent grades, but the PSSA test will still be more important. You may be a powerful speaker, deal with people well, be good at driving, be dependable, and be compassionate, or have any number of other important life skills. Yet the PSSA test does not take these into account. Your label will be stamped on your file, just as a post office stamps each letter with a postmark. Students, please do not carry the label around on your person and do not let it interfere with the American Dream. Your life is your own; your education is your entitlement. Let your government know that you demand more from them than a stamp from one test.

Tuesday, March 25, 2008

Proficiency on the PSSA Test

Why does my son do well on the PSSA Math exam and not on the Reading portion of the test? Why does he score well on his Accelerated Reader tests, Star Reader tests, and even the Woodcock-Johnson, but not the PSSA? After reviewing the PSSA test last week, I have solved this perplexing mystery.

Math is a subject that is presented in limited ways to a child of his age. The Math on the PSSA test very much parallels the practice sheets that he is assigned in his Academic Math class. On the other hand, by its very nature, reading has more room for variation. Not only are the stories going to be different from what is studied in class, but the interpretation of the story will be different from student to student. My son is a student who does not easily transfer the skills of interpretation and evaluation, but instead relies on classroom discussion; on the PSSA, he does not have the chance to study the reading and reap the benefits of the teacher/classmate discussion of the passage.

What to do with this newfound information? Absolutely nothing. More prep work is not going to help. We are already working on the skills of transference, inference, and evaluation. I am confident that we are working thoroughly with my son to make sure he learns as much as possible. Reviewing the PSSA test has confirmed that pursuing the "proficient" score will only interfere with the time needed to work on other skills that he will need in life. Proficiency on the PSSA test will not help him become successful. To all of you, I recommend reading the PSSA test so that you can come to your own conclusions about your child before the school and the government do that for you.

Wednesday, March 19, 2008

I read the PSSA today.

This morning I reviewed the PSSA test that my son will take in a few weeks. The Guidance Counselor sat with me in a small meeting room as I read through the exam. The total time to review the 7th grade test was approximately one hour. I did not have to sign a confidentiality agreement, but I was asked to note on a piece of paper that I had reviewed the test. If I wanted to, I could state why I was reading the test and then state my findings. I wrote the date and that my reason for reading the test was to determine whether the test was fair and appropriate.


The exam actually consists of two booklets. One contains all the Math and Reading questions. The other is the answer booklet, which has the infamous "bubbles" and the area where the students respond to open ended questions. The bubbles are much better than those of yesteryear. They are not crammed together onto one sheet. Instead, each section's bubbles are placed on separate pages. This really eliminates confusion and the possibility of "skipping" a bubble. Unfortunately, the exam is set up so that the students must go back and forth between the exam booklet and the answer booklet. As a first-time reader, I found the format a little confusing at first, but I became used to it over time. The open ended questions are in the answer booklet, so the exam booklet gives directions at the end of a section such as "go to page 23 in the answer booklet and answer question 15". I found this confusing because the answer booklet page numbers were printed in a light shade and did not stand out on the page. Attention DRC (Data Recognition Corp.): a bright shade of red would make the pages easier to find.

The math had some of the same problems (with the numbers switched) as what my son has done in class. The open ended questions are similar in format to those done in class, with the exception of some awkward wording. For example, a two-part question is often split by a page break and the directions are given as "refer to the specifics on the previous page to answer". I am sure I am not quoting this exactly, but I felt that for seventh graders the statement is hard to decipher. The math problems range from easy to complex.

The Reading test is difficult. Mechanically, I felt that there is not enough delineation between the two different passages that the students are supposed to compare. Moreover, I felt that some students' limited experience with the subject matter will hold them back in answering the open ended questions. Figurative language is tested in a way that is similar to the sample question on this website. (PSSA sample questions)

Most reading passages have underlined words that the students must refer back to in order to determine their meaning. It would help a Braille reader if it were allowable to have their hand guided back to the underlined word, since the sighted reader certainly has the advantage in this situation. If my child were visually impaired, it would be something I would ask about before he took the test.

Most of the open ended questions require the reader to cite examples from the text. The kinds of questions asked are ones to which I just want to say, "Who cares? Justify your existence in some way other than making up questions for this test." (Nasty, aren't I?) When reading the passages, I could always tell which parts a question would be about, because those paragraphs are particularly vague. After all, we need to separate the advanced from the proficient, the proficient from the basic, and allow the below basic to be totally blown away. Isn't that what the test does so well?

Some reading passages require the reading of recipes and step-by-step instructions. While I know these are important parts of our lives, I am sure that it is difficult for seventh graders to maintain interest in these subjects, much less answer questions about them.

Generally, I can see why the Reading open ended scores produce more variance than the Math open ended scores. That is, when the open ended questions are graded, the scores given by two separate scorers often differ even though they are reading the same answer from the same student. Much of the process is subjective, based on one's interpretation of the passage and what the ideal expected answer would be. From a child's perspective, differences in beliefs, experiences, interests, and ideas will influence the way any given student understands, interprets, and answers these questions.

After reviewing this test, I realize that the quest to learn even the most basic of skills entails so much more than what the PSSA or any other test could possibly reflect. Now more than ever, I truly cannot believe that we use this test to judge our children and label our schools. There are other ways to build student proficiency. They are called classroom exams, hands-on activities, individualized education, smaller classes, quality teachers, strong and involved parents, dedicated school boards, and government funding that rewards all of these components.

Tuesday, March 18, 2008

Presidential Candidates and the NCLB Act

Since the NCLB Act is the driving force behind the PSSA test and is up for reauthorization, we should know where the presidential candidates stand on education and the NCLB.

While Hillary Clinton voted for the NCLB the first time around, on her website, she bluntly states that she will end the NCLB Act. (Hillary Clinton Website) To improve education, she will enhance the IDEA Act with more funding, recruit and retain good teachers, improve early intervention programs, and create job programs.

On his website (Barack Obama Website), Barack Obama says that "teachers should not prepare students to fill in bubbles". He believes in reforming and funding NCLB to improve current assessments and create new ones in areas such as the ability to work with technology. He believes more funding from the government would help schools run a host of programs to close gaps in subgroups of students, recruit teachers, and run summer programs to help educationally disadvantaged students. The funding would come from various parts of the government, such as a reduction in a specific NASA program, improving government procurement systems, and savings from the Iraq war.

According to his website (John McCain Website), John McCain believes that the standardized testing required under the NCLB has helped us see that there is an achievement gap between schools and between students from different socioeconomic groups, and that school choice is important to narrowing these gaps. He states that schools should be "innovative and flexible" in teaching our students and that we should focus on addressing the underlying cultural problems in the educational system.

While I do not think any of these candidates is exactly an educational ball of fire, their plans sound good in theory. I am suspicious of Hillary since she was on the committee that gave us the report "A Nation at Risk," which, along with her husband's efforts, really created the environment for the NCLB to become law. I will give Obama some credit: his plan is in a PDF file and has a lot more detail than the other two candidates' plans. I like what John McCain says about schools becoming more innovative and flexible, but that is pretty hard to do when you have the PSSA looming over your shoulder.

Well folks, the vote is yours alone, so I encourage you to look at the websites to read the candidates' stances on different issues and the plans they have associated with their ideas. Let us strive to have past voting records and these facts decide our next president, instead of voting with whatever the media decides to bombard us with during the race.

Thursday, March 13, 2008

With the PSSAs only a few weeks away, the time has come to contact your principal and make an appointment to review the test. My appointment is next Wednesday. I am looking forward to reviewing my son's exact PSSA test. By reading his test, I hope to gain insight into his eventual test score. I look forward to describing that experience to all my readers.

Part of the No Child Left Behind Act expresses the need for family involvement in education so that the child's educational experience can be maximized. What chances for involvement do parents have? I have attended many PTO meetings. Trust me, "helping education" in the PTO means planning teacher luncheons, deciding on classroom parties, and raising money to purchase items for the school. Of course, there is always the "booster club" for your child's chosen sport, which does help to expand life beyond the three R's. While these things do enhance education in their own right, I am not sure the NCLB Act was alluding to events of this type. I think the NCLB Act was speaking of improving the parent/child teaching relationship by improving the parent's understanding that the process of teaching starts at home. To learn this, parents need knowledge of what their child is doing in school. (Is anyone familiar with this conversation? Parent: "What did you learn in school?" Child: "I don't know." Or how about the infamous "Nothing".) Any attempt to fix this common problem must include direct information from the teacher to the parent.

Most schools have websites to disseminate information efficiently. Some schools have real-time student grades that can be accessed through the computer. Teachers may even be required to have a web page with basic information pertaining to the class that they teach. Yet, in our district anyway, there is no requirement for teachers to post homework assignments, give teaching tips for parents, or show daily logs of what is being taught in the classroom. School districts are paying for well-designed websites, yet they are not utilizing the sites to meet one of the requirements of the NCLB.

I recall making some of these suggestions in one meeting. Of course, I was given a look of "are you some kind of idiot," but was politely told that the teachers are busy enough without adding more to their plate. The answer to that should be simple. Parents cannot help if they do not know what is happening in class. So I say, start making the teachers post their lesson plans online. Too much work, you say? How about electronic lesson planners that could be posted directly to the website, with a copy sent to the principal and another saved for the teacher? When I asked in yet another meeting whether this was possible, the answer was, "I don't know." Translation: "I don't know and I am not going to find out. Keep your suggestions to yourself."

Perhaps this attitude will change, since the new Kindergarten parents are at an age where they can only vaguely remember life without the Internet. Hopefully, they will demand, and get, more information than what we "older parents" are getting today. Their educational interaction with their children will benefit as more classroom details are given. Perhaps this communication will actually reduce the need for the PSSA!

Thursday, March 6, 2008

Questionable PSSA Practices

Remember to call your school today to schedule a time to review the PSSA test before your child's scheduled testing time!

Today, I want to touch on another way that school districts interrupt important education to prep students for the PSSA. Some schools have labeled a group of children "Bubble Students". These are the students who scored "Basic" on their PSSA but are close to the "Proficient" score. These kids are pulled out of class and given special small group instruction in hopes of pushing them into the "Proficient" category. At first glance, it seems okay. Let's give a little extra help to those who can benefit. Well, there can be no benefit without a cost. Let me tell you how our school manages this accomplishment. The "Bubble Students" are given instruction by the already overworked Learning Support teacher. As a result, the Learning Support teacher has less time to spend with the Learning Disabled children who legally and ethically are actually supposed to be the recipients of her services. With the LS teacher busy with the "Bubble Students", the truly disabled students, who so desperately need help, are put on the back burner until the PSSA is complete. The interruption of services is frustrating for the parents and the students. This is one of the many examples of how schools feel it is necessary to make morally questionable decisions in order to manipulate the numbers into showing progress. (In my opinion, saturating a kid with PSSA problems and calling it "progress" is a questionable practice.)

Monday, March 3, 2008

What causes the "false need" for the PSSA?

In a perfect world, the teachers would instruct students to the best of their ability, the students would receive the information and learn easily, the parents would be involved enough to help and support their child's educational process, and the government and school boards would provide enough money for a "free and appropriate education" for all children. If these conditions existed, the PSSA would be nonexistent. As we all know, none of the above always occurs in schools across our nation. Sometimes teachers are incompetent, and that incompetency is protected by other teachers and the union. Children have different learning needs and styles. All too often, parents are uneducated or careless when it comes to their children's educational needs. School boards claim they have no money, yet the funding of sports is often a priority in many districts. Governments, for many reasons, must put regulations in place, but neglect to give the money necessary for schools to follow through with those regulations. Some would say that the PSSA is a method of forcing school districts to address all of these issues that impede learning. I say that the PSSA is a coercive way of forcing school districts to obtain a false sense of student proficiency instead of correcting some of the reasons we have an inadequate educational system, one that (according to some, anyway) will force our nation into an uncompetitive position in the world. I look forward to discussing these issues in blogs to come.

Friday, February 29, 2008

Private school vs public school?

How does the PSSA affect the education a student receives in a public school versus a private school, where the PSSA test is not required? I am in the unique position of having one child in public school and another in private school. I can say absolutely, in the case of my family, that my child who attends private school receives a better education. Not only does the private school have an excellent curriculum that revolves around the basic subjects, but they still manage to fit in real life experiences. I just came back from chaperoning my daughter's class trip to the local soup kitchen, where students handed out care packages to the soup kitchen "clients". The class had done a fundraiser, bought the personal care items at a local store, bagged them, and then distributed the items at the soup kitchen. During the visit, they were required to pose questions to the director of the establishment about poverty, its causes, and possible solutions. They were encouraged to make eye contact with the needy people and to speak with them. The children were moved by this amazing experience, which could not have been duplicated in the classroom! (And certainly not tested on the PSSA test.) My son, in contrast, has participated in money collections and drives for charity, but has never followed through with the delivery of the items. Standardized tests such as the PSSA do not allow the time for these "extra" activities. I recently heard a public school teacher say that she was thinking of sending her children to private school because many public school teachers have cut back on "hands-on learning" simply because they are "too busy" with PSSA test preparation. How sad that our children will be schooled in neither the social problems of society nor the application of concepts, simply because of tests like the PSSA. Real life applications and experiences mold all of us in different ways. Our understanding of Math, Reading, Science, and Social Studies improves exponentially with every true-to-life experience. In these matters, public schools and government should follow the private school lead. Let us liberate our public schools from the PSSA test in order to help our children become better thinkers and allow them the chance to have "life moments" that will shape not only their future, but the future of our world.

Tuesday, February 26, 2008

A test on the test...

I can name several vices that the PSSA test brings to education. Today, I will discuss the one that is popping up in my son's seventh grade English and Math classes. I should preface this post by telling you that, historically, my son does not do well on the PSSA test. However, with certain supports, his grades in school are good. "At least he has good grades," I always think when I look at yet another year of PSSA test scores. Well, I can kiss that comforting thought good-bye, because this year the teachers decided to give "sample" PSSA tests and use them for grading purposes. Basically, the students are being tested on the Test! (Note that I have capitalized the word "Test". No, it is not really a proper noun, but it seems to be taking on a life of its own, so I thought it was the appropriate thing to do. Someone call Webster's; we've got a new word here!) Back to testing on the test. Is it not enough that we have set children up to fail on the test? Must we also have them fail in their grades, thereby making their future that much dimmer? As if devoting six weeks to PSSA test preparation isn't problematic enough, let's rub salt in the wound and make the prep part of the grade. Students should be learning concepts, the interrelationships among those concepts, and the practical usage of them. To use a standardized test such as the PSSA to teach and grade is a travesty of the spirit of education, a waste of tax dollars, and disrespectful to the diverse needs and learning styles of our students.

Monday, February 25, 2008

Important dates for the 2008 PSSA test!

March is soon upon us, and so too is the PSSA test. This year the window in which the Reading and Mathematics tests must be given is from March 31, 2008 to April 11, 2008. The Reading and Mathematics PSSA test will be shipped from Data Recognition Corp. (DRC) to the schools by March 3, 2008. If you are a parent and would like to review your child's PSSA test prior to the test date, then you can make your request known to the school any time after March 3, 2008. By March 17, 2008, the school will be sending a flyer to all parents which answers some basic questions about the PSSA test, with the last question covering the parent's right to review the test. If you are reviewing the test, make certain that you request the specific test that your child will be taking. (In case you missed the earlier information, there are many different tests per grade level.) You may meet with a little reluctance or hesitation, because the District Test Coordinator or the principal may not know how to handle such a request. In my case, I have already spoken with the District Test Coordinator. She was unsure what to do when I asked to review the specific test that my son would be taking. However, she was able to call the appropriate person at the PA Department of Education, who said it would be allowable to open a pack of tests, and whatever test was given to me for review from that pack would then be labeled as my son's test. The Coordinator said she would pass this information on to the principal of the school that my son attends. Remember, if you are working, you do not have to take time off work to review this test. The school must schedule a time for you to review the test that is at your convenience. Be warned that there are stipulations to reviewing the test. Among other things, a confidentiality agreement must be signed and a district representative must be present while you are reviewing the test. Despite this, I would love to see every parent reviewing this test! Let us show the lawmakers, the school personnel, and our children that we want information about this test and, more importantly, that we want a say in how it is used to judge our students.

Friday, February 22, 2008

Interfering again!

After many informative posts over the last month or so, this one differs in that it tends more toward a complaint. Once again, my 7th grade son will do nothing in Math over the next month except prepare for the PSSA test. He has more homework relating to the PSSA test than he had when he was following the regular curriculum. Students find this frustrating because of the increase in homework and the inconsistent material thrown their way. I find it detrimental to the learning cycle of the school year. The January-through-March stretch of the school year is considered by many to be the prime learning period for students. The beginning of the school year is a time for learning the rules and being eased into the routine of a new year. The holidays bring many vacation days. The end of the year is stacked with student excitement in anticipation of the summer holidays. The winter period is more conducive to learning than any other time. Unfortunately, the Pennsylvania Department of Education is interfering with this window of educational opportunity by mandating when the schools must schedule the test. Stop the madness and let the kids learn already!

Wednesday, February 20, 2008

Will the PSSA always carry so much weight?

In December 2007, a discussion paper released by the PA State Board of Education recommended that high school graduation requirements be standardized throughout the state. Presently, students can show proficiency in Reading, Writing, and Math by scoring in the proficient range on the PSSA or by passing a local assessment. (A local school district can use its own test if a student cannot score proficient on the PSSA. Presently, this is not subject to a rigorous state review.)

The Board is proposing the use of four different methods to show student proficiency. First, a score of Proficient on the PSSA can be used. Second, a local assessment can be used, with the additional requirement that the school district have the assessment independently reviewed to show that it is aligned with the state standards. Third, ten Graduation Competency Assessments (GCAs) would be given in the subjects of Mathematics, Language Arts, Social Studies, and Science. A showing of proficiency on one GCA in English, two in Mathematics, one in Science, and one in Social Studies would be necessary for graduation. Fourth, proficiency could also be determined by the use of Advanced Placement (AP) and International Baccalaureate (IB) tests in subjects tested by a GCA. Any combination of these assessments may be used to show proficiency in the subjects listed. (PDE Discussion Paper) In January 2008, these new regulations were unanimously adopted. If the new regulations survive the year-long review process, then they would take effect in 2014.
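For the technically inclined, here is a tiny sketch of how the proposed GCA combination could be checked. It only illustrates the rule as I have described it from the discussion paper; the subject labels and the input format are my own assumptions, not anything official.

```python
# A minimal sketch (not an official tool) of the proposed GCA rule described
# above: one passing GCA in English, two in Mathematics, one in Science, and
# one in Social Studies. Subject labels and the input format are my own
# illustrative assumptions.

REQUIRED_GCA_PASSES = {
    "English": 1,
    "Mathematics": 2,
    "Science": 1,
    "Social Studies": 1,
}

def meets_gca_requirement(passed_gcas):
    """passed_gcas: a list with the subject of each GCA the student passed."""
    counts = {subject: 0 for subject in REQUIRED_GCA_PASSES}
    for subject in passed_gcas:
        if subject in counts:
            counts[subject] += 1
    return all(counts[s] >= needed for s, needed in REQUIRED_GCA_PASSES.items())

# Two Math GCAs alone are not enough; the full combination is.
print(meets_gca_requirement(["Mathematics", "Mathematics"]))   # False
print(meets_gca_requirement(["English", "Mathematics", "Mathematics",
                             "Science", "Social Studies"]))    # True
```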

Though this provides more options for meeting the requirements for graduation, school officials are generally not supportive of this venture. Many questions can be raised: Would the money needed to implement this test be better used to reduce class sizes and provide other educational support to the schools? (Raffaele) Will the GCAs be just another kind of standardized test? Will they provide more opportunity for students to "pass", or simply have the school district decide which "test" to concentrate on? The list can go on!

Monday, February 18, 2008

Will there be any PSSA changes for special needs students?

In early 2007, the United States Department of Education authorized modified academic standards to be written by individual states for students who have disabilities and cannot reach the academic standards typically required. A new standardized test would be implemented that relied on these standards. (U.S. Department of Education, May 2007) This test would be a "middle ground" between the PSSA and the PASA. (Recall that the PASA is the test that might be taken by a student with severe cognitive issues.)

In Pennsylvania, there are, to date, no modified standards for students who have disabilities. However, sources within the PA Department of Education have confirmed that work has begun on modified standards and that completion is expected in three years.

Thursday, February 14, 2008

Will the laws governing the PSSA and other assessments change?

The NCLB Act is really the driving force behind the reliance on the PSSA test to measure student performance and school accountability. Presently, this act is being reviewed by our federal government. NCLB could possibly be reauthorized in 2008. Over 100 advocacy groups have signed the Joint Organizational Statement. (Joint Organizational Statement, Jan. 2008; Fair Testing, April 2007) This statement provides recommendations to lawmakers to improve the functioning of the NCLB. Among other things, the Joint Statement asks for the use of multiple measures of a student's achievement instead of relying on one all-encompassing test. The proposal includes counting a student's progress on standardized testing instead of relying only on the achievement of predetermined academic standards. There is also a push to test in fewer grades, allow several years for a school improvement plan to show success before punishment would result, generate more family involvement in school, and put more decisions about accountability into the hands of each state rather than the federal government. Other organizations are advocating for similar ideas to be considered in the reauthorization. (Joint Organizational Statement, Jan. 2008; Fair Testing, April 2007)

Tuesday, February 12, 2008

Do students need to become proficient on the PSSA in order to graduate from high school?

This depends on the school district. All students must demonstrate proficiency in Reading and Math in order to graduate from high school. If a student’s score on the eleventh grade PSSA is not high enough to be considered proficient, then the student can retake the test in the fall of their senior year. If the student is still not scoring proficient, then the individual school district can elect to use a local assessment to allow the student to demonstrate proficiency. The local assessment should be aligned with the state standards and thus, in theory anyway, the PSSA.

Since the school’s graduation rate influences the AYP, the school districts have a lot of incentive to help the student show proficiency. (Not to mention the fact that it is difficult to “flunk” all these kids.)

Sunday, February 10, 2008

Do special needs students have to take the PSSA?

Yes. In many circles, it hardly seems fair to compare a child with a disability to a typical child, but according to the IDEA Act of 2004 and the NCLB Act, IEP students must have equal access to any standardized testing given to regular education students, and they must have equal access to the materials needed to be successful in standardized testing. This subgroup of students was intentionally included so that the same high expectations would be applied to them as are applied to regular education students. (Cortiella, 2006) If an IEP student meets certain criteria, he/she can take an alternative test called the Pennsylvania Alternate System of Assessment (PASA). Generally, a student must have severe cognitive issues to be able to take this test instead of the PSSA. (Bureau of Assessment and Accountability Booklet 2007)

Students with IEPs and 504 plans can take the test with accommodations. (504 plans can qualify students for accommodations even when they don't quite meet the criteria for special education.) For example, extended time, having the Mathematics portion read aloud, and large print or Braille booklets are all allowable accommodations. The school records all IEP students, Title I students, students who have exited the IEP process, and students with gifted IEPs who take the PSSA test, along with any accommodations that they have used. This information is reported to the state to allow the PDE to report the progress of these groups of students and to make sure the schools are "leveling the playing field" for students with disabilities. (2008 Accommodation Guidelines)

Unfortunately, some accommodations that would help IEP students are not allowable. For example, during the Reading test, a question may refer back to a bold-faced word. Not being able to see bold-faced print, a visually impaired student is at a definite disadvantage when this occurs, yet no one is allowed even to guide the student's hand to the proper word. For a child whose disability affects reading, charts that help the student visually map a story are not an allowable accommodation, even if the student has this accommodation in their IEP.

During the development of the PSSA test, there is a sensitivity review in which committees with experts in the field of special education evaluate questions to ensure fairness to special needs students. Unfortunately, it is difficult to be an expert in all disabilities, so, invariably, some questions will always be unfair to specific groups. (For example, a child on the autism spectrum may or may not be able to answer a question dealing with the feelings of a character in a story.)

Friday, February 8, 2008

Can a student be excused from the PSSA?

There are several reasons why a student could be excused from the PSSA. If the parent reads the test and wants the child to be excluded from it based on religious beliefs, a letter can be written to the Superintendent of the school making the request known. "Parents do not have to defend their religion nor do they have to state specific parts of the test that are religiously disagreeable to them. A statement expressing a religious conflict is all that is necessary for exemption. Schools must provide an alternative educational setting for these students while testing is in progress." Any exclusion based on parental consent will negatively impact the school's AYP (Adequate Yearly Progress) numbers. (Bureau of Assessment and Accountability Booklet 2007, pg 8)

Students who have an extended absence during the PSSA testing window or have had a recent medical emergency can be excluded. Students who are uncooperative and refuse to participate in the test are excluded. Students who have no IEP and are placed in a court- or agency-appointed school or in an alternative education setting do not have to participate. Students who are first-year English Language Learners do not have to participate in the Reading sections of the test, but must participate in the Mathematics section. Students who will be participating in the Pennsylvania Alternate System of Assessment (PASA) are excluded from the PSSA. (Bureau of Assessment and Accountability Booklet)

Wednesday, February 6, 2008

Shouldn’t a parent or teacher know the exact result of a student's test so that they can further help the child move toward proficiency?

Of course, ideally, the answer is a resounding yes; however, this is a "secure test" and the exact questions cannot be revealed. Still, all parents do have the right to review the PSSA. Any parent or any interested citizen may read the tests prior to their administration. In other words, parents can call the school that their child attends and schedule a time to see the PSSA test before their child takes it. The school district must provide a convenient time for the parent to review the test. There are stipulations for doing this type of review. Confidentiality agreements must be signed, school personnel must be present, and no part of the test may be recorded in any way. (Handbook of Assessment, pg 8)

A parent should remember that a few of the PSSA test questions are different from student to student (remember that there are many different versions at any given grade level). At the time the appointment to view the test is scheduled, a request can be made to read the exact booklet that your child will receive. Not only can the test be reviewed for acceptable content, but, depending on your memory, the review may later provide limited insight into the answers your child got wrong. In 2008, this opportunity for viewing is fast approaching. I strongly encourage everyone to read this test. Let us show our schools and, in turn, our politicians, that we care and that we want a say in how this test is used.

Tuesday, February 5, 2008

How do parents know how students performed on the PSSA and who decides what is Proficient?

Parents are sent a report called the Pennsylvania Parent Report. The student's results are on the report. The first page shows the average score, which is 1300 for both Mathematics and Reading. The report shows your child's score for each subject and where the score falls among the performance categories (Below Basic, Basic, Proficient, and Advanced). There are two things to be aware of on this page.

First, student scores are reported in the thousands. The reason is not that there are hundreds of questions on the PSSA, but that the raw scores, which are the number of points the student actually got correct (e.g., 33 out of 72), are converted to scaled scores so that the statistics used to check the test have more meaning. The average score for the PSSA is arbitrarily set at 1300. (DRC Tech Report 4, 6, 7)
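The technical report spells out the actual scaling, which is more involved than anything I can reproduce here, but for the curious, this little sketch shows the basic idea of mapping raw points onto a scale centered at 1300. The slope and the statewide raw average in the example are made-up numbers, not PSSA figures.

```python
# Illustration only: the real PSSA scaling is defined in the DRC technical
# report and is more involved. This sketch just shows the idea of converting
# a raw score (points earned) onto a reporting scale whose average is 1300.
# The points_per_raw slope and the raw_mean below are invented numbers.

def raw_to_scaled(raw_score, raw_mean, scale_mean=1300.0, points_per_raw=15.0):
    """Map a raw point total onto a scale centered at scale_mean."""
    return scale_mean + points_per_raw * (raw_score - raw_mean)

# Example: 33 of 72 points correct, with a hypothetical statewide raw mean of 40.
print(raw_to_scaled(33, raw_mean=40))   # 1195.0, i.e., below the 1300 average
```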

Second, the four levels of performance that can be seen on the Parent Report are: Below Basic, Basic, Proficient, and Advanced. These labels are called Performance Level Descriptors (PLDs) because they describe the level at which a student is supposedly performing as measured by this test.

The PLDs are decided upon by a committee of teachers who review the Pennsylvania Academic Standards and decide, by consensus, on the definitions of the different PLDs. The State Board of Education has the final approval. (DRC Tech Report 4, 6, 7)

Once approved, the PLDs are used by a panel of teachers and educators to aid in the process of deciding which questions can be answered by a Below Basic, Basic, Proficient, or Advanced student. The PSSA test questions are presented to the panel from easiest to hardest, and the panel is asked to decide where the cut-off point is between each PLD. This process takes several rounds of discussion and consensus and is further validated using statistical information. Once decided upon, the cut-off points between the descriptors are presented to the State Board of Education for final approval. (DRC Tech Report 4, 6, 7)
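Once the cut-off points exist, turning a scaled score into a label is the easy part. Here is a small sketch of that last step; the cut scores in it are invented for illustration and are not the real PSSA cut scores, which vary by grade and subject.

```python
# A minimal sketch of how cut scores turn a scaled score into a performance
# level. These cut points are invented; the real ones come out of the panel
# process described above and differ by grade and subject.

HYPOTHETICAL_CUTS = [        # (minimum scaled score, performance level)
    (1400, "Advanced"),
    (1275, "Proficient"),
    (1150, "Basic"),
]

def performance_level(scaled_score):
    for cut, label in HYPOTHETICAL_CUTS:
        if scaled_score >= cut:
            return label
    return "Below Basic"

print(performance_level(1310))   # "Proficient" under these made-up cuts
print(performance_level(1149))   # "Below Basic"
```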

Once again, this is a subjective process made as objective as possible. However, it stands to reason that if a student is on the borderline of any level, then the score could have gone to either performance level depending on subjectivity in scoring, performance on the given day, etc. (Well, at least in my humble opinion.)

On the second and third pages, Mathematics and Reading are broken down into smaller categories. An unscaled score (a score based on actual points rather than adjusted into the thousands) is listed for each category. For example, out of 20 possible points in the category of Numbers and Operations, 12 points were received. However, there is neither a listing of the types of problems that a student answered incorrectly (open ended versus multiple choice) nor examples of problem types for each category. The website listed to assist parents in helping their children "grow" educationally is http://www.pagrow.com/. This website shows some more basic information and has links to pertinent areas of the Department of Education website that show some sample problems and how the tests are scored. However, once again, its usefulness in actually helping parents to help their children perform better on the PSSA test is limited.

All school district scores are posted on the Pennsylvania Department of Education website. http://www.pde.state.pa.us/.

Monday, February 4, 2008

If you just joined us...

We are working toward an understanding of the PSSA (Pennsylvania System of School Assessment). Each post contains an element of explanation. Read the post titles to find your area of interest. A bibliography is included.

Do all of these statistics, committees, readers, and writers make the PSSA a fair and valid test?

The hundred-million-dollar question! Though many statistics are used to ensure that test questions are as statistically fair as our mathematics can make them, subjectivity by "expert readers" and committees will inevitably occur. (After all, we are all human beings.) When reviewing the results of the 2006 PSSA in the Technical Analysis of grades 5, 8, and 11, there are differences in the scores of males versus females, as well as differences in the scores collected from the different subgroups. For example, students who were Asian had different scores from students who were White. (DRC Tech Report 5, 8, 11) In looking at these diverse scores, a troubling question crosses one's mind. Are the differences in the scores of the different groups based on a true educational disadvantage of one group, or are some of the lower scores, particularly in the Analysis and Interpretation of Fiction, a manifestation of how students from diverse backgrounds interpret questions? Must the government insist that all of our students view these works in the same way? Arguments can be made for and against.

To answer in a less thought-provoking manner, the PDE commissioned the Human Resources Research Organization (HumRRO) to study the validity of the PSSA. Basically, HumRRO reported that though average scores differ between different groups of students (white versus black), this difference is also seen in other standardized tests that are comparable to the PSSA. (Thacker, Dickinson, Koger) Additionally, students from a higher socioeconomic class have higher scores on the PSSA as well as on other standardized tests. Also, students who perform well on the PSSA tend to perform well on the SAT and university placement exams. (Thacker, 2004) (Sinclair, Thacker 2005)

The question now becomes: if our students do well on the SAT, will they succeed in college? The College Board would say an emphatic "YES", but evidence shows that the relationship between the academic SAT and freshman grades is rather weak. (Sinclair, Thacker 2005) In fact, between the introduction of the essay questions on the 2005 SAT and the SAT's general failure to make accurate predictions about college success, many colleges have ceased to require the SAT for admissions or have at least decreased the importance of the test in the acceptance process. Other factors, such as grades and class rank, are actually stronger predictors of freshman success. (Perez, 2002)

So, in other words, a student can get good grades, score low on the PSSA and SAT, and still do well in college or any other future endeavor. This supports what is common knowledge for most of us (but apparently not for our government): a motivated student can learn and succeed whether or not they can sit for a couple of hours and take a test that is supposedly fair and unbiased.

Saturday, February 2, 2008

How are the PSSA tests scored?

This is an important question for all of us. After schools have administered the PSSA test, the tests are returned to DRC (the company contracted to handle the PSSA test for PA) for scoring.

After many checks and cross-checks to ensure that the number of boxes received equals the number sent to each school and that the proper number of booklets is in each box, the tests are cut, divided, and scanned for scoring. (DRC Tech Report 5, 8, 11)

Multiple choice items are scored by computer. (Remember when you were young and took a standardized test and your teacher insisted you use a number 2 pencil? Remember when the teacher made you practice filling in the bubbles neatly? The computer needs these bubbles filled in nicely in order to read the selected answer.)

Open ended scoring is more complicated. Hundreds of expert readers are hired. The pool consists of educators, writers, editors, and other professionals who read and score the students' open ended responses. Training and education on the 3- or 4-point scoring systems (recall that Math open ended questions are worth 0-4 points and Reading open ended questions are worth 0-3 points), along with examples of writing at each score level, are given to all of the readers. (Examples of writing at each level are decided upon by Rangefinding committees consisting of Pennsylvania educators, PDE staff, and DRC employees.) (DRC Tech Report 5, 8, 11)

All open ended questions are scored at least once, with 10% randomly picked to be scored twice. Reports and other quality controls are used to ensure scoring accuracy. However, even with all the checks and balances, there are still discrepancies in scoring when two different scorers read the same material. For example, in the case of the 2006 PSSA Reading open ended questions for grades 5, 8, and 11, the same score on the same question could be duplicated only 71 to 83 percent of the time. In cases where the score given by the second reader was different from the score given by the first reader, "adjacent" scoring often occurred. (Adjacent scores are scores "next" to each other: i.e., receiving 2 points from one scorer and 3 points from another scorer for a given response.) (DRC Tech Report 5, 8, 11, pg 45) That is an amazingly low agreement rate considering all the student and school district implications of the PSSA test score! It is also a strong indicator that subjectivity and bias do exist, despite the fact that "readers were required to set aside their own biases about student performance and accept the scoring standards." (DRC Tech Report 5, 8, 11, pg 42)
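To make the agreement numbers concrete, here is a small sketch of how the exact-agreement and adjacent-agreement rates quoted above could be computed from the pairs of scores on double-scored responses. The sample scores are invented; only the idea of the calculation comes from the technical report.

```python
# A small sketch of the inter-rater statistics discussed above: given the two
# scores assigned to each double-scored response, compute the exact-agreement
# rate and the adjacent-agreement rate (scores differing by exactly one point).
# The sample data below are invented.

def agreement_rates(score_pairs):
    """score_pairs: list of (first_reader_score, second_reader_score) tuples."""
    exact = sum(1 for a, b in score_pairs if a == b)
    adjacent = sum(1 for a, b in score_pairs if abs(a - b) == 1)
    n = len(score_pairs)
    return exact / n, adjacent / n

# Ten hypothetical double-scored Reading responses (0-3 points each).
pairs = [(2, 2), (3, 3), (1, 2), (0, 0), (2, 3),
         (3, 3), (1, 1), (2, 2), (0, 1), (2, 2)]
exact_rate, adjacent_rate = agreement_rates(pairs)
print(f"exact: {exact_rate:.0%}, adjacent: {adjacent_rate:.0%}")  # exact: 70%, adjacent: 30%
```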

Friday, February 1, 2008

What types of subjects and questions are included on the PSSA test?

The PSSA tests proficiency in Math and Reading. Science will soon be added. The types of questions included on the test are multiple choice and open ended questions. (Open ended questions are ones which require a written response.) Multiple choice questions for both Math and Reading are worth one point. Reading open ended questions are worth 0-3 points. Math open ended questions are worth 0-4 points. (DRC Tech Report gr 4, 6, and 7, pg. 10)

In the 2006 PSSA there were 16-20 different tests per grade level. (DRC Tech Report gr 5, 8, 11, pg. 10) (DRC Tech Report 4, 6, 7, pg 10) All the different forms are used in any one classroom. Included in these tests are questions called core items, matrix items, and field test items. Core items are identical in all tests across the grade level and determine the individual student score. (This is the score appearing on the Parent Report.) Matrix items vary across the different forms and are used to provide the school with a random sample of how the school is faring in teaching the Academic Standards for each grade level. Since there are a variety of forms, each with different matrix items, more information can be gathered to decide if the school curriculum is successful in teaching the students the standards. A combination of the core item and matrix item scores is used for school-level reporting. Field test items are not used in scoring; they are questions that may be used on future PSSA tests depending on student responses. (DRC Tech Report 5, 8, 11) (DRC Tech Report 4, 6, 7)
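If it helps to see the core/matrix distinction spelled out, here is a rough sketch of the scoring idea as I understand it from the technical report: only core items feed the individual student score, while core plus matrix items feed school-level reporting. The item labels, point values, and data format are invented.

```python
# A rough sketch of the core/matrix distinction described above. Item labels,
# point values, and the data format are invented for illustration.

def student_score(points_earned, item_types):
    """The individual student score draws only on core items."""
    return sum(pts for item, pts in points_earned.items()
               if item_types.get(item) == "core")

def school_level_points(points_earned, item_types):
    """School-level reporting draws on core plus matrix items."""
    return sum(pts for item, pts in points_earned.items()
               if item_types.get(item) in ("core", "matrix"))

item_types = {"Q1": "core", "Q2": "core", "Q3": "matrix", "Q4": "field"}
points_earned = {"Q1": 1, "Q2": 0, "Q3": 1, "Q4": 1}

print(student_score(points_earned, item_types))        # 1 (core items only)
print(school_level_points(points_earned, item_types))  # 2 (field items never count)
```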

Several commonly used statistics help to determine whether the questions are fair. For example, for field test items, one statistic that is used is the percentage of students who answered correctly. If too many students answered correctly (above 90%) or too few answered correctly (below 30%), then the question would be flagged and reviewed before being placed as a scored question on the PSSA test. Other statistics answer different questions. Are the students who are "more capable" responding to the "easier" questions with the correct answer, and vice versa? Are males and females responding to the items differently? Are Hispanic students responding differently than Caucasian students? If the statistics show a problem with any question, the item is either discarded or reviewed and revised. (DRC Tech Report 5, 8, 11)
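The percent-correct screen described above is simple enough to sketch. The 30 and 90 percent thresholds come from the paragraph; the item names and results below are made up.

```python
# A minimal version of the field-test screening described above: flag any item
# whose percent-correct falls outside the 30-90 percent window. The thresholds
# come from the paragraph; the item data are invented.

def flag_items(percent_correct, low=30.0, high=90.0):
    """Return the ids of items whose percent-correct is outside [low, high]."""
    return [item for item, pct in percent_correct.items()
            if pct < low or pct > high]

field_test_results = {"item_101": 94.2, "item_102": 61.5, "item_103": 22.8}
print(flag_items(field_test_results))   # ['item_101', 'item_103'] go back for review
```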

After the process of statistical, committee, and expert review, acceptable items are entered into a computer system. From computer-generated cards and graphics, DRC specialists develop the final tests. The PDE gives the final approval of the test as written and submitted by these specialists. (DRC Tech Report 5, 8, 11)

Thursday, January 31, 2008

How is the PSSA developed?

Who are these "almighty gods of standardized testing"? Who actually makes up the test?
The process by which the PSSA is developed is a complex one, using expert writers, statistics, field tests, and Assessment Anchors to develop questions. The Pennsylvania State Board of Education contracts with a company called Data Recognition Corporation (DRC) to compose suitable questions, find and reprint suitable reading passages, field test the questions, gather statistical and subjective feedback on the questions, and then finally put the questions into use on the tests. Data Recognition Corporation distributes the tests, collects the tests, and grades the tests. (DRC Tech. Report 2006) (Yes, that is a lot of reliance on one company to get the job done. Let's hope they know what they are doing.)

DRC does not have total control. DRC works in conjunction with such organizations as the National Center for the Improvement of Educational Assessment (NCIEA) and the National Center on Educational Outcomes to create a test that adheres to basic principles of quality standardized testing. Making sure the questions and graphics are neat, readable, and measurable is part of these principles. During the test composition process and before the final test is given, all aspects of the PSSA tests are reviewed by PDE (Pennsylvania Department of Education) committees. The committees are comprised of teachers, educators, administrators, and some members of the PDE staff. Members of the Board of Education have the final formal approval. (DRC Tech Report gr 5, 8, and 11)

In addition, there are guidelines to ensure test questions are free of bias toward any particular group. "DRC's guidelines for bias, fairness, and sensitivity include instruction concerning how to eliminate language, symbols, words, phrases, and content that might be considered offensive by members of racial, ethnic, gender, or other groups. Areas of bias that are specifically targeted include, but are not limited to: stereotyping, gender, regional/geographic, ethnic/cultural, socioeconomic/class, religious, experiential, and biases against a particular age group (ageism) and against persons with disabilities." (DRC Tech Report 4, 6, 7, pg 14) A Bias, Fairness, and Sensitivity Committee, including PDE staff members and a diverse group of men and women trained by DRC, reviews the test items to, once again, ensure they are free of any biased language or sensitive material. (DRC Tech Report 5, 8, 11) (Are any of us bias-free? With that in mind, and according to sources who do not wish to be revealed, there are still some questionable items that sneak through.)

Wednesday, January 30, 2008

Legal Reasons for PSSA

My son is a special needs student who receives Learning Support. However, during the PSSA prep period, students who have no disabilities and do not qualify for Learning Support are pulled out of the regular classroom and taken to the Learning Support room, where they are tutored so that they can possibly score in the proficient range on the PSSA. Who suffers? My kid, of course, since he is not given the attention from the Learning Support teacher to which he is entitled by law. Why on earth is the school on the fringe of the law as it tries to get the borderline kids to score proficiently on the test? The information on the legal reasoning for the PSSA test follows. Use the Bibliography page for the sources.

The PSSA is required in Pennsylvania by the Pennsylvania Department of Education (PDE). Though the PSSA has been given in Pennsylvania since the early 1990s, the federal law that mandates testing in each state is the No Child Left Behind Act.

The goals in Pennsylvania are aligned with the No Child Left Behind Act. Schools are required by law to make Adequate Yearly Progress (AYP) toward indicators of performance. Indicators include proficiency in Reading and Mathematics, test participation, improvement in student attendance in Kindergarten through 8th grade, and improvement in the four-year graduation rate for secondary schools. Schools are required to meet preset proficiency goals for Reading and Mathematics. The proficiency targets rise in steps over the years, and then every year, until the year 2014, when every student tested must be proficient in both Math and Reading. In other words, a 100 percent proficiency rate is expected. (Can anything ever be 100 percent certain?) Pennsylvania uses the PSSA to meet these NCLB requirements.
Schools are intent on moving toward this goal because there are consequences for the district if the yearly goal is not met. The first year the school does not meet the goals, a warning is given. School choice must be offered, assistance teams are used, and a plan for improvement must be formulated. In the second year of no improvement, there is more of the same, with a few more supplemental services. The third year of no improvement brings even more of the same, but there must also be changes in leadership, curriculum, professional development, or other strategies. The fourth year of no improvement leads to big changes such as chartering or privatization. (Bureau of Assessment and Accountability March 2007) Because of these “punishments,” each school district has an incentive to make sure its students make AYP. (Now we know why the schools spend so much time on nothing else but how to take the PSSA. The students and the curriculum suffer as teachers resort to teaching to the test.)

Read more!

Monday, January 28, 2008

PSSA Testing Time AGAIN!

It’s that time of the year again... It happens from the middle of January just about to the middle of March. No, it’s not a holiday, not a vacation time, but the time when schools frantically prepare students for the PSSA test. It is a time when educators feel they must forget about field trips, hands-on experiments, history, geography, and sometimes even recess. The majority of the time is spent studying for one thing, and one thing only: the PSSA TEST! This infuriating approach is called “teaching to the test”. Does anyone have a story on the "prep process"? How is it done in your school?
Read more!

Bibliography

Click "Read More" to View the Bibliography for the information the older posts and those to come.



National Center for Home Education. (April 1, 2002). Home School Legal Defense Association. National Assessment of Educational Progress: Precursor to a National Test. Retrieved November 2007, from http://www.hslda.org/docs/nche/000002/00000215.asp.

National Center for Home Education. (September 2002). Home School Legal Defense Association. National Testing: A Federal Mandate?. Retrieved December 2007, from http://www.hslda.org/docs/nche/000010/200210230.asp.

Mathews, Jay. (November 14, 2006). The Washington Post. Just Whose Idea Was All This Testing?. Retrieved November 2007, from http://www.washingtonpost.com/wp-dyn/content/article/2006/11/13/AR2006111301007.html. Page A06.

Data Recognition Corp. (May 2007). Pennsylvania Department of Education. Technical Report for the Pennsylvania System of School Assessment 2006 Reading and Mathematics for Grades 4, 6, and 7. Retrieved October 2007, from http://www.pde.state.pa.us/a_and_t/lib/a_and_t/2006_ReadingMathGr4_6_7_Tech_Report.pdf

Data Recognition Corp. (May 2007). Pennsylvania Department of Education. Technical Report for the Pennsylvania System of School Assessment 2006 Reading and Mathematics for Grades 5, 8, and 11. Retrieved October 2007, from http://www.pde.state.pa.us/a_and_t/lib/a_and_t/2006_ReadingMathGr5_8_11_Tech_Report.pdf.

Bureau of Assessment and Accountability. (March 1, 2007). Pennsylvania Department of Education. Pennsylvania Accountability System. Retrieved December 2007, from
http://www.pde.state.pa.us/pas/cwp/view.asp?a=3&Q=94580&pasNav=6132&pasNav=6325.

Pennsylvania Department of Education. 2006-07 PSSA and AYP Results: 2006-07 State Level Math and Reading PSSA Results. Retrieved from http://www.pde.state.pa.us/a_and_t/cwp/view.asp?A=3&Q=129181.

Bureau of Assessment and Accountability. (2005). Pennsylvania Department of Education. Assessment Anchors and Eligible Content. Retrieved November 2007, from
http://www.pde.state.pa.us/a_and_t/lib/a_and_t/2005AnchorintoFINAL.pdf.


Bureau of Assessment and Accountability. (Nov. 14, 2007). Pennsylvania Department of Education. Grade Assessment Anchors. Retrieved December 2007, from http://www.pde.state.pa.us/a_and_t/cwp/view.asp?a=108&q=103127&a_and_tNav=6309&a_and_tNav=.


Bureau of Assessment and Accountability. PA Department of Education. 2008 PSSA Accommodations Guidelines. Retrieved January 2008, from http://www.pde.state.pa.us/a_and_t/lib/a_and_t/2008AccommodationsGuidelines.pdf.


Bureau of Assessment and Accountability. Pennsylvania Department of Education. The 2007 PSSA Handbook for Assessment Coordinators and Administrators, Grades 3-8 and 11. Retrieved November 2007, from http://www.pde.state.pa.us/a_and_t/lib/a_and_t/2007_PSSA_Handbook_for_Assessment_Coordinators_and_Administrators.pdf.

Cortiella, Candace. (August 2006). National Center on Educational Outcomes, University of Minnesota, Minneapolis, MN. NCLB and IDEA: What Parents of Students with Disabilities Need to Know and Do. Retrieved January 2008, from http://cehd.umn.edu/nceo/OnlinePubs/Parents.pdf.

Committee on Education and the Workforce. (February 17, 2005). Indiana Department of Education. Individuals With Disabilities Education Act Frequently Asked Questions. Retrieved from http://www.doe.state.in.us/exceptional/speced/pdf/idea_faq.pdf.

National Commission on Excellence in Education. (April 1983). U.S. Department of Education. A Nation at Risk. Retrieved January 2008, from http://www.ed.gov/pubs/NatAtRisk/risk.html. Recommendation B: Standards and Expectations #3.

Elert, Glenn. (May 1992; copyright 1992-2006). Virtual Empire. The SAT: Aptitude or Demographics?. Retrieved November 2007, from http://hypertextbook.com/eworld/sat.shtml#ramist. Validity, paragraph 3.

National Assessment of Educational Progress. Institute of Education Sciences, United States Department of Education. Retrieved December 2007, from http://nces.ed.gov/nationsreportcard/about/.

National Assessment Governing Board. Retrieved December 2007, from http://www.nagb.org/

National Assessment Governing Board and the Institute for Educational Leadership, Policy Exchange. (Nov. 19, 1998). National Assessment Governing Board. Retrieved January 2008, from http://www.nagb.org/naep/tenth.pdf.

The National Center for Fair and Open Testing. (January 2, 2008). Joint Organizational Statement on the NCLB Act. Retrieved December 2007, from http://www.fairtest.org/joint%20statement%20civil%20rights%20grps%2010-21-04.html.

The National Center for Fair and Open Testing. (April 2007). Organizational Proposals for NCLB Reauthorization. Retrieved December 2007, from http://www.fairtest.org/NCLBReformChartp1.pdf. pages 1-4.

PA Department of Education. (December 2007). Pennsylvania Department of Education. State Board of Education Discussion Paper, Proposed State High School Graduation Requirements. Retrieved December 2007, from http://www.pde.state.pa.us/stateboard_ed/lib/stateboard_ed/Chapter4RoundtablePaper.pdf.

Perez, Christina. (May 22, 2002). The National Center for Fair and Open Testing. The Truth Behind the Hype: A Closer Look at the SAT. Retrieved November 2007, from http://www.fairtest.org/truth-behind-hype-closer-look-sat.

Raffaele, Martha. (January 7, 2008). PennLive.com. Pa. Students Would Have More Options for Graduation Tests. Retrieved January 2008, from http://www.pennlive.com/newsflash/pa/index.ssf?/base/news-58/1199741661252880.xml&storylist=penn.

Sinclair, Andrea L., and Thacker, Arthur A. Human Resources Research Organization. (Sept. 2005). Pennsylvania Department of Education. Relationships Among Pennsylvania System of School Assessment (PSSA) Scores, University Proficiency Exam Scores, and College Course Grades in English and Math. Retrieved December 2007, from http://www.pde.state.pa.us/a_and_t/lib/a_and_t/HUmRRo_PSSA_report.pdf.

Thacker, Arthur A. Human Resources Research Organization (HumRRO). (May 2004). Pennsylvania Department of Education. PSSA Issues and Recommendations. Retrieved November 2007, from http://www.pde.state.pa.us/stateboard_ed/lib/stateboard_ed/PSSAIssues.pdf.

Thacker, Arthur A., Dickinson, Emily R., and Koger, Milton E. Human Resources Research Organization. (May 2004). Pennsylvania Department of Education. Relationships Among the Pennsylvania System of School Assessment (PSSA) and Other Commonly Administered Assessments. Retrieved November 2007, from http://www.pde.state.pa.us/stateboard_ed/lib/stateboard_ed/Final_PSSA_conv.pdf.

Toppo, Greg. (Nov. 11, 2003). USA Today. NAEP May Be Used As A Truth Serum. Retrieved January 2008, from http://www.usatoday.com/news/education/2003-11-10-neap-usat_x.htm.

U.S. Department of Education. (May 20, 2007) Elementary and Secondary Education Final Regulations on Modified Academic Achievement Standards. Retrieved January 2008, from http://www.ed.gov/policy/speced/guid/modachieve-summary.html.

Read more!

Friday, January 25, 2008

PSSA History

Read on for the history of the PSSA:
Read more!

The PSSA (Pennsylvania System of School Assessment) is a term we all know and most of us hate. (At least most of the parents, teachers, administrators, and students.) How did we get to this point? How did this one test, in the name of school improvement, come to define the teachers, label the students, and oftentimes punish the school districts? How in the world was learning taken out of the hands of the educators and placed into the hands of the legislators? The answer lies in a complicated array of world events, political platforms, underpaid teachers, and uninformed parents. How did the PSSA evolve into the all-consuming and seemingly all-encompassing test that it is today? Let’s start with a bit of history on standardized testing. Click below on "read more" to learn about the history of the PSSA.
In the United States, there are early examples of standardized testing revolving around IQ testing. But the real precursor of modern-day standardized testing came in 1900, when the College Entrance Examination Board was founded. The College Board (as it is now known) tested the three R’s (reading, writing, and arithmetic), plus science, foreign language, and history. (Interestingly enough, all of the questions were essay. Multiple choice questions were a later invention!) Simply because these were the areas that the College Board tested, most education began to center around these subjects. By the 1920’s, the College Board had created (may I have a drum roll please...) the SAT. (Mathews 2006) There you have it: the benchmark for standardized tests.

Now let us fast forward to the 1950’s and 1960’s. The “space race” between the United States and the USSR was on, and though national standardized tests were not used, demands began on schools to show improvement, especially in the area of Science. (Better scientists equals beating the USSR to the moon. Guess what? History will show that we did make it to the moon first without the cajoling of standardized tests.) In the 1960’s, 1970’s, and 1980’s, real change came about in standardized testing. It was during these years that the National Assessment of Educational Progress (NAEP) was born. (NAEP is also referred to as “The Nation’s Report Card.”) (Mathews, National Center for Home Education April 2002) This standardized test helped policy makers and states review the effectiveness of their schools. In 1983, a huge national revelation took place when a commission report, A Nation at Risk, was released. Basically, using a lot of the buzzwords that we hear today (“global village,” “information age,” etc.), the report stated that the public schools’ expectations were mediocre.
Interestingly enough, here is an excerpt from the report stating the commission stance on standardized testing:
“Standardized tests of achievement (not to be confused with aptitude tests) should be administered at major transition points from one level of schooling to another and particularly from high school to college or work. The purposes of these tests would be to: (a) certify the student's credentials; (b) identify the need for remedial intervention; and (c) identify the opportunity for advanced or accelerated work. The tests should be administered as part of a nationwide (but not Federal) system of State and local standardized tests. This system should include other diagnostic procedures that assist teachers and students to evaluate student progress.” (A Nation at Risk, Rec. B)
Note that the commission saw a need for other diagnostic testing (not just one test that can do it all), and the tests were not recommended every year!
In the late 1980’s, then-Arkansas governor Bill Clinton spearheaded the drive for more standardized testing and continued to promote it into his presidency. (Sorry, President Bush critics, this has been coming for a while.) (By the way, Hillary Rodham Clinton was on the 1986 NAEP study group which led to major changes in the NAEP. I wonder if her deep-seated presidential fantasy started then.) (Home School Legal Defense, April 2002)

In 1988, the National Assessment Governing Board (NAGB) was created to set policies and make decisions regarding the NAEP. Continuing through to today, tests are given yearly, but on a rotating basis. States participate in the tests on a voluntary basis. (nagb.org)

Now, once again, fast forward to 2001, and along comes the No Child Left Behind Act. This act is actually an updated version and reauthorization of the Elementary and Secondary Education Act of 1965. NCLB mandated annual standardized testing in Reading and Math in grades 3 through 8 and once in high school. The focus of NCLB is to provide accountability and high standards for all children. So high are the standards, in fact, that all children, from all backgrounds, must test proficient on a state assessment in Math and Reading by the year 2014. (That’s correct, 100 percent! A lofty goal to say the least.)

This is where the PSSA comes into play. Its evolution parallels many of these national events. Basically, during the “space race” years (1969-1970), the first state assessment was administered. This test, called the Educational Quality Assessment, was used in schools until the mid-1980’s. (Not coincidentally, right after the infamous “A Nation at Risk” report was generated.) During this time, the Testing for Essential Learning and Literacy Skills (TELLS) was created. The TELLS program was used until 1992, when it was replaced by the PSSA. (DRC 2006 Tech. Report, pg. 1-2)


In 1992, when the PSSA was first developed, school districts were required to participate every third year, but it soon became a yearly requirement. Along with that, a grade 6 and 9 writing assessment became mandatory on a three-year cycle. By 1999, the Pennsylvania Academic Standards for Reading, Writing, Speaking and Listening, and Mathematics were implemented by the state. With the adoption of these standards, the PSSA became, and still is, an assessment measuring a student’s mastery of the academics that are deemed important by the state. (DRC Tech. Report 2006, pg. 2)

To further link instruction to the Academic Standards, Assessment Anchors were developed. Assessment Anchors are fairly broad goals which state what students should be able to accomplish by a certain grade level. Underneath these are more specific goals called sub-Assessment Anchors, and under those are even more specific statements called Eligible Content statements. (DRC Tech. Report 2006 pg 5, Assessment Anchors and Eligible Content, 2005, Bureau of Assessment and Accountability 2007) Yes, ladies and gentlemen, our great state has laid out everything that a student should know, by when, and without deviation from the norm. (Has the Zombie Nation begun?) Because of NCLB requirements, the PSSA began to be administered in grades 3 through 8 and again in grade 11. Writing assessments are given in grades 6, 9, and 11.

Now, after that long, albeit watered-down, history, we know the evolutionary path of the PSSA is one filled with historic events, national politics, and a distortion of the original intent of standardized testing.

Read more!