The State of Alaska recently entered into an agreement with the Smarter Balanced Assessment Consortium (SBAC). Questions posed by an economist on this matter were published in Utah. These questions were posed over a month ago and still have gone unanswered. They deserve an answer.
An Alaska economist, Dr. Barbara Haney, put together the following list of questions:
1) What elected officials were involved in the process of opting into SBAC?
1a) Upon what authority did the State of Alaska place our state’s education system under the control of the state of Washington and the SBAC consortium? Doesn’t this violate the Alaska Constitution?
1b) Isn’t SBAC an example of an Agenda 21-style regional board? In fact, isn’t this Agenda 21?
2) Isn’t it true that the real reason that SOA entered into agreement with SBAC is to get the RTTT money and the NCLB waiver? How much money exactly are we getting from RTTT? To whom will those funds be disbursed?
3) The Race to the Top grant defines “college- and career-ready” as follows:
According to the USDOE “College- and career-ready standards: Content standards for kindergarten through 12th grade that build towards college- and career-ready graduation requirements (as defined in this document) by the time of high school graduation. A State’s college- and career-ready standards must be either (1) standards that are common to a significant number of States; or (2) standards that are approved by a State network of institutions of higher education, which must certify that students who meet the standards will not need remedial course work at the postsecondary level.”
In other words, if you adopt the Common Core standards, you have college- and career-ready standards.
How do these new standards meet the needs of Alaska’s employers? (Specific references, specific industries, not platitudes). What career codes in Alaska’s economy are these standards keyed to? How does the SBAC test demonstrate this to Alaskan employers? How do these standards fit in with Alaska’s Manpower forecasts by AKDOL?
4) “Smarter Balanced is grounded in the notion that putting good information about student performance in the hands of teachers can have a profound impact on instruction and—as a result—on student learning.” http://www.edexcellence.net/commentary/education-gadfly-daily/common-core-watch/2013/by-the-company-it-keeps-smarter-balanced.html
Isn’t this teaching to the test?
Further, if that is so, then how will Alaska students perform well on the Common Core curriculum tests if they are not using the common core curriculum?
Isn’t this just the state’s way of bullying local districts into adopting the common core curriculum?
5) Another statement by SBAC to the State of MO on May 14, 2013: “This spring we are pilot testing the first 5,000 items and tasks we have developed with about a million students, engaging more than 5,200 schools drawn from all 21 of our governing states. The pilot test also serves as a beta test for our test delivery software. In addition to testing out our items, performance tasks, and software, the pilot test also gives us an opportunity to evaluate a variety of accessibility features for students with disabilities and English language learners.” http://www.edexcellence.net/commentary/education-gadfly-daily/common-core-watch/2013/by-the-company-it-keeps-smarter-balanced.html
Why is the state of Alaska not looking at established tests like ITBS and the ACT? Why are we using a test that doesn’t exist yet? Why are we using an experimental test?
How can SOA even argue that this is a test superior to other tests when the test hasn’t even been used anywhere?
Why was this test selected rather than ASPIRE, ITBS, or Alaska’s past NCLB test? Since that test is written for Alaska why couldn’t we continue to use it?
6) When SBAC was asked about its own cost structure on May 14, 2013, it stated:
“One element dominates the cost: approximately 70 percent of the vendor cost for summative assessments is tied to hand-scoring. Measuring the deeper learning required by the Common Core requires that students write extensively and much of that writing cannot yet be scored by technology. Paying teachers, faculty, and other content experts to score student responses is costly, but it is currently the only effective way to measure important elements of the Common Core.”
a) Will Alaska teachers be employed to grade Alaska students’ responses?
b) Isn’t this essentially what happened when the original Alaska test went to SBA testing? Didn’t we leave SBA testing due to this cost and the allegedly capricious nature of the grading system?
c) How, then, is the SBAC writing assessment actually cheaper than the Digitcorp writing test?
Isn’t it true that SOA adopted this for the NCLB waiver and not because it is a superior test?
How does this test then become a superior instrument of evaluating student success?
7) In the area of English Language Arts (ELA), Smarter Balanced places these capabilities within its claims for both writing and for speaking and listening. In rural village schools, some English speaking conventions are radically different from those on the road system. There is no way to avoid the obvious outcome that this test could discriminate against certain ethnic groups.
Has there been any effort to prepare these schools in speaking? Further, given that Hanley’s office indicates these schools will likely have a paper & pencil version of the test, how will the speaking component be evaluated?
8) SBAC funding ends Sept. 2014. In their comments to the state of MO on May 14, 2013, SBAC stated:
“At the conclusion of the federal grant, Smarter Balanced will transition to being an operational assessment system supported by its member states. The consortium does not plan to seek additional funds from the U.S. Department of Education.” http://www.edexcellence.net/commentary/education-gadfly-daily/common-core-watch/2013/by-the-company-it-keeps-smarter-balanced.html
How much of its own funds will Alaska be expected to commit in the future? How does this break out on a per-pupil basis? (Vermont was told it would be $300 per student for the test alone.) Where will this money come from?
Why did the state subject its residents to what amounts to a new taxing authority?
Given Governor Parnell’s commitment to SB21 (now signed) and the short-term revenue shortfall, where will the revenue come from in 2014 to pay for SBAC?
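To put the per-pupil question in concrete terms, here is a minimal back-of-the-envelope sketch. To be clear about the assumptions: the $300 figure is only what Vermont was reportedly quoted, and the 130,000 enrollment is a hypothetical round number used for illustration, not an official Alaska count.

```python
# Back-of-the-envelope statewide cost sketch.
# ASSUMPTIONS (not official figures): the $300-per-student quote given
# to Vermont, and a hypothetical round K-12 enrollment of 130,000.
PER_STUDENT_COST = 300      # dollars per student, Vermont's quoted figure
ENROLLMENT = 130_000        # hypothetical Alaska K-12 enrollment

total_cost = PER_STUDENT_COST * ENROLLMENT
print(f"Estimated statewide cost: ${total_cost:,}")  # $39,000,000
```

Even under these rough assumptions, the recurring bill runs well into the tens of millions of dollars per year, which is exactly why the question of where the money comes from matters.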
9) A Pioneer Institute study on implementation shows a staged acceleration in the costs of SBAC. On average, the costs are four times the amount provided by the Race to the Top (RTTT) grant monies.
Will Borough Governments be expected to pay a share to SBAC? If so, have borough governments been informed for budgetary purposes?
How much will property taxes have to increase to meet these costs?
10) A CRESST study of SBAC and PARCC by UCLA and the CA Board of Regents, dated May 2013 (http://www.cse.ucla.edu/products/reports/R823.pdf), states on page 9, second column:
“Smarter Balanced plans to refine its specifications as it develops items and tasks, a contract for item development has been established, and item and task development are currently underway, as is a contract for specifying the test blueprint (see http://www.smarterbalanced.org/smarter-balancedassessments/ for the preliminary blueprints).”
Why did the state of Alaska sign on to a test that is not yet written or tested? When there are clearly other tests available that are cheaper (by SBAC’s own admission) and comparable (according to Washington State’s own Washington Policy Center), why are we going with this far more expensive assessment?
11) The CRESST Report by UCLA on page 10 states, “However, collaboration may be incorporated into Smarter Balanced performance tasks, and metacognition may well be required in solving the complex, extended problems that both consortia plan as part of their performance task components.”
The use of group answers is a radical departure in Alaska State testing. How will group answers be used in scoring individual students? Will Alaska students be denied a diploma because they did not pass a group answer? Has the use of group answers been vetted in national testing norms? How will group answers be received by parents? Why does SOA DOE feel the use of group answers to be a superior measure of student performance over traditional methods of assessing individual students?
12) The CRESST Study further states on page 18 (http://www.cse.ucla.edu/products/reports/R823.pdf):
“Both consortia have been optimistic about the promise of automated constructed-response and performance task scoring and have incorporated that optimism into their cost estimates for the summative assessment. Both are estimating summative testing costs at roughly $20 per student for both subject areas. In the absence of promised breakthroughs, those costs will escalate, there will be enormous demands on teachers and/or others for human scoring, and the feasibility of timely assessment results may be compromised.”
(My note: “Optimistic” is the academic way of saying full of excrement…) How will these escalating costs be met by the State of Alaska, particularly given that the full results of SB21 may not be realized?
13) Continuing on page 17 (http://www.cse.ucla.edu/products/reports/R823.pdf), the study states:
“In addition to costs, extended performance tasks also offer a challenge in assuring the comparability of scores from one year to the next. Without comparable or equitable assessments from one year to the next, states’ ability to monitor trends and evaluate performance may be compromised.”
What this is saying is that this year’s scores cannot be compared to last year’s scores (of course, there is no test yet either). So if there is no ability to make time-series comparisons, how can you tell whether a school is doing better or worse over time? This is a radical departure from past assessments used by SOA, where there has been some degree of comparability over time. How can a school then compare last year’s results with this year’s results to measure improvement?
14) Continuing on page 19, the CRESST Study (http://www.cse.ucla.edu/products/reports/R823.pdf) states specifically that SBAC is going against the grain of deeper-learning assessments in its methodology.
“For example, Smarter Balanced content specifications include a relatively large number of assessment targets for each grade—on average 29 targets in mathematics and 35 targets in ELA. The claims, in contrast, reflect a reasonable number of major learning goals and represent the broad competencies that students need for college and career readiness. History suggests that focusing on discrete, individual standards is not the way to develop deeper learning, yet this is the strategy that states, districts, schools, and teachers have typically followed.”
Why is the State of Alaska then using an assessment of “deeper learning” that is designed in a way that history has shown will not reflect that deeper learning? Further, how will the curriculum used in schools reflect the acquisition of this deeper learning?
15) The CRESST Study on page 19 states, “Smarter Balanced has been very transparent in posting all of its plans and the results of its contracts. Yet, because its computer adaptive testing approach essentially individualizes test items to every student, it may be difficult to ascertain how well deeper learning is represented for every student or overall. The test blueprint will provide rules for item selection and presumably, those rules will include those for representing higher levels of depth of knowledge, but this is yet to be seen.”
If test questions are not the same for each student, then how can results be compared across students? Further, since the adaptive technology for the test does not yet exist, why is the state investing in it? Doesn’t this represent a radical departure from the traditional type of test given in SOA? Why does the state want to engage in this experimental test over other proven testing methods?
16) Many of the state’s schools do not have the equipment to offer this test online. Who will pay the cost of upgrading the schools’ network lines? The software? The purchase of additional computers?
The test hasn’t been field tested, validated, or normed. The test will not offer a result that is comparable from one year to the next for a given institution. The adaptive technology isn’t available yet. Many of the districts in Alaska do not have the technology to offer this test. The consortium is out of money in Sept. 2014. The test is using a strategy that history has shown does not develop the sort of knowledge it claims to test (deeper learning). The $20.00-per-test estimate is considered overly optimistic, and costs are expected to escalate. In contrast, there are validated instruments available with known costs. Further, as the study states on page 18, “… while built-in accommodations may be easier to accomplish, there will still be the validity challenge of establishing the comparability of accommodated and non-accommodated versions of the test.”
17) Further, if the state is not using the Common Core curriculum, then why are we using an assessment that reflects that curriculum?
Great questions. Thank you, Dr. Haney.
Good luck, Alaska.