This is part 3 of a four-part series on the quantitative approach to IEP evaluation. In this part, we examine questions that may arise when choosing an exam for the data-driven evaluation of an IEP described in parts 1 and 2 of this series. Part 4 will look at how standard deviation among scores can be used to evaluate the effectiveness of student achievement systems and how statistics software can be used for analysis and comparisons.

So far in this series, I've endeavored to show how administering a standardized test to all students in an IEP, particularly over time, can reveal a lot about a program. A key component of this process is giving all of the students in the program the same test.

Undoubtedly, IEP academic administrators and teachers will be concerned about using one test for students across all proficiency levels. Of course, teachers and administrators will need to be sure that students have the minimum proficiency to follow the test. They will also want to know that there are test items that give lower-level students an opportunity to score. IEPs using iTEP for this purpose have discovered that lower-level students can follow the test's straightforward instructions and approach. In addition, each iTEP skill section has lower-level items and tasks that give lower-proficiency students an opportunity to score on the test.

Whatever assessment is used, one more basic question must be addressed when employing a norm-based test such as iTEP to evaluate students' proficiency across levels: "Should we use a test that is not based solely and directly on our own student learning outcomes and does not use our rubrics or evaluators?" Interestingly enough, there is a good deal of variation in the field in response to this question. While some may readily answer no, others have decided that a close reading and understanding of a test's proficiency descriptors and scores allows them to align their levels with outcomes on the chosen proficiency test.

In other words, the IEP is reasonably confident that students who have gained the outgoing skills of a particular level should be able to attain a determined score on the proficiency test. In addition, most IEP administrators and teachers are familiar with the pressure to help students perform on other proficiency tests commonly used for university admissions. Indeed, savvy teachers help students draw clear lines between language skills attained in class and success on these tests, creating a positive washback effect. Lastly, some regard this third-party, independent testing as an exercise that promotes the good pedagogical health of the institution by ensuring that institutional concepts of proficiency and advancement are not formed in an organizational vacuum.

Certainly, there is much to consider before embarking on standardized proficiency testing across the board in an IEP. But once the decision is made, the data can be viewed from many different perspectives and can be a useful tool for programmatic evaluation and improvement. For instance, a benefit of using an independent proficiency test closely aligned with learning outcomes is the ability to view scores in juxtaposition to pass/fail rates, especially within particular sections of a level. The chart below shows a side-by-side comparison of overall iTEP scores and the percentage of students who passed a particular level and section. In this hypothetical six-level program, sections within a level are distinguished with an assigned letter, such as 2A, 3B, or 4C.
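For readers who want to build this kind of comparison themselves, here is a minimal sketch in Python using pandas. Every section label, score, and pass rate below is an invented placeholder for illustration, not data from a real program, and the column names are assumptions rather than any iTEP export format.

```python
import pandas as pd

# Hypothetical per-section results for a six-level program; every number and
# label here is an illustrative placeholder, not data from a real IEP.
records = [
    {"section": "3A", "avg_itep_score": 3.4, "pass_rate": 0.95},
    {"section": "3B", "avg_itep_score": 3.5, "pass_rate": 0.78},
    {"section": "3C", "avg_itep_score": 3.3, "pass_rate": 0.74},
    {"section": "3D", "avg_itep_score": 3.4, "pass_rate": 0.55},
]

df = pd.DataFrame(records).set_index("section")

# Side-by-side view of the standardized score and the in-house pass rate.
print(df.to_string(formatters={"pass_rate": "{:.0%}".format}))
```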

Those who have worked long enough in IEPs will recognize this chart and the issue that it highlights. Recalling the first article in this series, the review of quantitative data can "quantify" a phenomenon that one knows to be true only in an anecdotal or subjective sense. In this case, there might be a sense that the teacher or teachers in 3A are applying the rubrics in a way that allows for an inflated pass rate. Conversely, the teacher or teachers in 3D might be applying the rubrics in a way that does not reflect the students' true proficiency.

While there is a great deal of variation between these two classes in terms of the pass rate, the standardized iTEP score shows much less variation. The students in level/section 3D are scoring in line with their peers in other sections; however, they are not passing the class at the same rate. IEP administrators know that such conditions are not sustainable in an IEP where students will want to be confident that there is fairness in how they are evaluated across sections. However, administrators sometimes do not know how to start the conversation with teachers who believe, from an anecdotal perspective, that there is no problem. In the case of level/section 3D, demonstrating to teachers that their students perform on par with their peers in other sections might encourage thoughtful participation in norming exercises designed to mitigate such discrepancies. For level/section 3A, one might ask why these students were not able to score significantly above the average given the high rate at which students passed the class. Of course, pass rates do not tell the whole story on skill achievement. If percentage grades are calculated, that could be another factor to consider when examining this possible case of inflated grades.
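One way to start that conversation with numbers rather than impressions is to flag sections whose pass rate sits far from the level mean even though their standardized scores do not. The sketch below continues the hypothetical DataFrame from the earlier example; the one-standard-deviation threshold is an arbitrary choice made here for illustration, not a recommended cutoff.

```python
# Continuing the hypothetical DataFrame above: flag sections whose pass rate
# deviates from the mean by more than one standard deviation while their
# average iTEP score stays within one standard deviation of the mean.
pass_dev = (df["pass_rate"] - df["pass_rate"].mean()).abs()
score_dev = (df["avg_itep_score"] - df["avg_itep_score"].mean()).abs()

flagged = df[(pass_dev > df["pass_rate"].std()) &
             (score_dev <= df["avg_itep_score"].std())]

print(flagged)  # with the placeholder numbers above, 3A and 3D are flagged
```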

The analysis of percentage grades is what we will be looking at in greater depth in the next article. In essence, we will be seeking to discover whether the percentage grade issued in an IEP class is closely associated with skill achievement. In other words, do higher grade percentages represent higher proficiency? Conversely, do lower grade percentages reflect lower proficiency? Using a proficiency test across the board in an IEP can help to answer these types of questions.
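As a preview of that analysis, here is one hedged sketch of how the association could be checked, assuming each student record contains both a course percentage grade and an overall iTEP score. The column names and values are invented for illustration only.

```python
import pandas as pd

# Hypothetical student-level records; the column names and values are
# invented placeholders, not drawn from any real gradebook or iTEP export.
students = pd.DataFrame({
    "grade_pct":  [92, 88, 75, 81, 64, 95, 70, 58],
    "itep_score": [3.8, 3.6, 3.1, 3.4, 2.9, 3.9, 3.0, 2.7],
})

# Pearson correlation: values near +1 suggest higher course grades track
# higher measured proficiency; values near 0 suggest they do not.
print(students["grade_pct"].corr(students["itep_score"]))
```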

Dan Lesho is Executive Vice President of iTEP International. Prior to joining iTEP (International Test of English Proficiency), he was director of the Cal Poly Pomona English Language Institute and a professor at Pitzer College.

See this article as it originally appeared on LinkedIn.
