Monday, March 23, 2015

AMP's Adaptive Testing, Project AWARE--Why Should You Care?

The Alaska Measures of Progress (AMP) is an adaptive test. A "flat," non-adaptive version will be offered to limited areas and groups this year; in subsequent years, the test is supposed to be adaptive. Adaptive tests are not quite the same thing as "dynamic tests." In a dynamic test, every student sees similar questions, with side-loop tutorials for students who miss a question; students who demonstrate mastery bypass the side loops. Students in a class get the same material, but the pace of the material differs.

The AMP is not a dynamic test; it is adaptive, and the difference lies in the technology. Before discussing the technology, it is important to note that the Alaska Department of Education claims that no psychometric data is collected through the test. However, there is a concurrent project worth noting.

Related Projects 


If your school is offering the computer version of the test, it will be adaptive. The Alaska Department of Education and Early Development claims that the test focuses on academics. Perhaps this is so. However, a project that seems to overlap the test should give parents pause.

"Project Aware" is a $9.1 million project that seems to overlap the AMP test.  Somehow, it survived the legislature's surgical knife. It  is a  project in the south central region of the state known  aimed at mental health awareness issues. It is unknown precisely what this covers.  This project appears to involve the American Institute of Research. The project will capture over 60% of Alaska's testing population. It may be in subsequent years this project will be expanded to the rest of the state and will overlap the the AMP. The data could be readily correlated.

Perhaps it is a bit of paranoia to speculate about a mental health survey being implemented concurrently with adaptive testing. Given the American Institutes for Research's longstanding involvement with military and CIA contracts on psychological issues, it is probably rather innocent. Still, since the AK DEED won't release the AMP scores until October 2015, one wonders why a computerized scoring service is receiving the student data. It may be that they plan to correlate testing data with some of these mental health indicators.


Adaptive Testing Technology

Adaptive testing is nothing "new." What is "new" is the use of technology. The problem with adaptive tests is that it open to manipulation, and there is no way to determine how the test was manipulated or what questions your child was asked.  However, paper tests can be adaptive, and past efforts at adaptive testing were with pencil and paper. It is cumbersome and time consuming to do adaptive tests with paper and pencil, but there was an effort to do so in Pennsylvania years in the 1980s. When they were finally able to get the test released, activists discovered that these tests were not covering material that the average parent would have expected them to cover.





Adaptive testing means that subsequent questions change based on the response to the previous question. Questions vary in difficulty as well, and the student's ability to perform on one type of question is taken into account. Jon Cohen, Vice President of the American Institutes for Research and developer of the adaptive technology used in the SBAC and PARCC tests, describes adaptive testing in the video below.
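To make the mechanics concrete, here is a minimal sketch in Python of the kind of item-selection loop an adaptive engine runs. The item pool, the 1-10 difficulty scale, and the simple up-or-down rule are invented for illustration only; AIR's actual engine rests on a far more elaborate psychometric model, and nothing below is taken from AMP.

```python
# A minimal, purely illustrative sketch of computer-adaptive item selection.
# None of this comes from AMP, AIR, or any real test engine; the item pool,
# the 1-10 difficulty scale, and the up/down rule are invented for illustration.

import random

# Hypothetical item pool: question id -> difficulty (1 = easiest, 10 = hardest).
ITEM_POOL = {f"Q{i}": d for i, d in enumerate(
    [2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 9, 10], start=1)}

def next_item(target, asked):
    """Pick the unasked question whose difficulty is closest to the target."""
    remaining = {q: d for q, d in ITEM_POOL.items() if q not in asked}
    return min(remaining, key=lambda q: abs(remaining[q] - target))

def run_adaptive_test(answer_fn, num_items=6, start=5):
    """Give num_items questions; after each response, the next question's
    difficulty moves up (correct) or down (incorrect)."""
    target, asked, log = start, set(), []
    for _ in range(num_items):
        q = next_item(target, asked)
        asked.add(q)
        correct = answer_fn(q, ITEM_POOL[q])
        log.append((q, ITEM_POOL[q], correct))
        # The adaptive step: this response decides what the student sees next,
        # so two students in the same class can get entirely different questions.
        target = min(10, target + 1) if correct else max(1, target - 1)
    return log

if __name__ == "__main__":
    # Simulate a student whose chance of a correct answer drops with difficulty.
    student = lambda q, difficulty: random.random() < (11 - difficulty) / 10
    for q, difficulty, correct in run_adaptive_test(student):
        print(q, "difficulty", difficulty, "correct" if correct else "wrong")
```

The point to notice is that two students sitting in the same room can receive entirely different sets of questions, which is why a parent cannot reconstruct afterward what their child was actually asked.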





One of the reasons adaptive testing fell "out of fashion" with the education establishment was its known bias against lower socioeconomic groups. The litigation risk posed by that bias became clear in Allen et al. v. The Alabama State Board of Education (1999), among other cases. Has anyone in the state considered the litigation risk of giving a biased test?


Jon Cohen explains in the video below how adaptive testing is biased against test takers in lower socioeconomic groups. Andrew Halcro, a member of the Sustainable Education Task Force, noted on the task force's final day (December 29, 2014) that in Anchorage, lower socioeconomic groups tend to be more racially diverse. The implication, then, is that these tests will be biased against racially diverse populations.




One might wonder why the Alaska DEED would give such a test when race-based targets are set in its accountability measures. There is plenty of room for speculation, but nothing is certain. It does seem as if certain areas of the state might be set up to fail the test. The Alaska Department of Education established target test scores, by student race, in its Annual Measurable Objectives (AMOs), and it selected a technology known to hurt students from lower socioeconomic backgrounds. One can't help but think this is a setup of sorts.




The other question many have pondered is why the Native community isn't upset about these tests and the AMO targets. One can only speculate on what Elizabeth Peratrovich would say today. 





The reason, of course, has to do with the purpose of the test. The test isn't designed to measure student achievement. It is designed to measure whether the material the state wants covered has been covered, and to evaluate the teacher. Whether that teacher is in a brick-and-mortar school or in a homeschool, that is what the state is trying to measure.

That is why they can get around the bias issue in testing. They claim they are not measuring your child, but measuring the teacher. 




If they are measuring the teacher and not the student, why does the information go into the P-20W data system under the student's name?

So even if a parent is fine with the data security issues, the scores won't be comparable to those of other students in the class; they exist to check whether the school is covering the Common Core materials.

Programming Issues


It is bad enough that Alaskan students have had to suffer the indignities of the TerraNova and the programming errors associated with that test. For those unaware, the TerraNova had serious scoring problems, and its scoring was invalidated on several occasions. Of course, parents in Alaska never heard about it, but how many teachers, students, and parents were affected by falsely low reading scores on this test?

"Over time, McGraw Hill determined that a programming error caused the percentile rankings on the TerraNova to be too low at the lower end of the scale and too high at the upper end. As a result, approximately a quarter of a million students in six states were given the wrong national percentile scores."  (Errors inStandardized Tests:A Systemic Problem, National Board on EducationalTesting and Public Policy, Boston College, p. 11-12).

Alaska could have had its scores examined. However, the Alaska Department of Education at the time chose to ignore what was happening in the Lower 48.

Then there were the math errors on the exit exams in Minnesota that were the subject of lawsuits. It took parents months to get a copy of the test, and when they did, they found several errors stemming from a miscoding of the answer key in the programming (Boston College, p. 15). Indeed, after reading Errors in Standardized Tests: A Systemic Problem from the National Board on Educational Testing and Public Policy at Boston College, one might be hesitant about the use of computers at all. Those problems were found because there was a paper version of the test, and the test was the same for each student.

In computerized adaptive testing, there is no paper copy and no way to determine which questions a child was given, and therefore no way to tell whether a problem lies with the computer programming or with the student. One can believe the Alaska Department of Education and Early Development will be all too happy to tell a school that the student or the teacher needs remediation rather than check the validity of the test.


A Final Word on Adaptive Testing 


Adaptive testing was used in an earlier era, and its use prompted a slew of legislation that afterward restricted these tests. The most notorious of them was the Pennsylvania EQA. One of the more tenacious fighters against the EQA was a lady named Anita Hoge. Her video is over an hour long and worth every minute; she is the lady referenced by Peg Luksik in the video above. Watch it when you have time to really focus. These tests are backed by people who have an agenda.
