RPA Data Center

Interim District Assessments Feedback Survey

The Interim District Assessments (IDAs) are being administered district-wide in SFUSD for the 2015-2016 school year. As the Achievement Assessments Office prepares for the 2016-17 IDAs, we would like to gather your feedback on both Component 1 (Smarter Balanced IAB and SLA Part A) and Component 2 (ELA Writing and Math Milestone Tasks and SLA Part B).

This survey takes about 5-7 minutes. Please provide your anonymous responses and suggestions by March 17th, 2016. Thank you for your feedback and support!
 
*Click Next to advance through the survey. Click Save to finish.

1. At what level do you work?

 Elementary School
 49 (56%) 
 
 Middle School
 19 (22%) 
 
 K-8 School
 2 (2%) 
 
 High School
 17 (20%) 
 

Total: 87

2. What is your role?

 Teacher
 70 (80%) 
 
 Site Administrator
 9 (10%) 
 
 Site Support (IRF, Literacy Coach, ARTIF, etc.)
 8 (9%) 
 
 Central Office Support
 0 (0%) 

Total: 87

2a. I received the testing and support materials in an organized and timely manner.

 Yes
 12 (71%) 
 
 No
 5 (29%) 
 

Total: 17

3. Did you participate in the administration of any of the IDAs?

 Yes
 77 (89%) 
 
 No
 10 (11%) 
 

Total: 87

3a. If no, why?

 They were not required for my students.
 2 (20%) 
 
 We used other site-based assessments instead.
 0 (0%) 
 
 I was not aware that they existed.
 1 (10%) 
 
 It was not part of my role to administer IDAs.
 7 (70%) 
 

Total: 10

Please rate the training and support you received when preparing to administer and score the IDAs.

4. The following training and support I received allowed me to easily administer the IDA Component 1 Assessment (Smarter Balanced IAB in Math and/or ELA and/or Illuminate Math IAB).

1= Strongly Disagree
2= Disagree
3= Agree
4= Strongly Agree

  
 Training at my site by the Achievement Assessments Office
 1: 43%   2: 12%   3: 9%   4: 6%   N/A: 30%
 
 How-to documents and other support materials
 1: 23%   2: 21%   3: 32%   4: 13%   N/A: 10%
 
 Phone or email support
 1: 16%   2: 23%   3: 19%   4: 16%   N/A: 26%
 
 Training at my site by the IRF, Test Coordinator or other staff
 1: 27%   2: 17%   3: 21%   4: 16%   N/A: 19%

Total: 77

5. The following training and support I received allowed me to easily administer and score the IDA Component 2 Assessments (ELA Writing and/or Math Milestone Tasks and/or SLA Part B).

1= Strongly Disagree
2= Disagree
3= Agree
4= Strongly Agree

  
 Training at my site by the Achievement Assessments Office
 1: 44%   2: 12%   3: 5%   4: 6%   N/A: 32%
 
 How-to documents and other support materials
 1: 22%   2: 29%   3: 25%   4: 18%   N/A: 6%
 
 Phone or email support
 1: 22%   2: 21%   3: 14%   4: 8%   N/A: 35%
 
 Training at my site by the IRF, Test Coordinator or other staff
 1: 23%   2: 19%   3: 17%   4: 18%   N/A: 22%

Total: 77

6. The following IDA was relevant to the teaching and learning occurring in my classroom.

1= Strongly Disagree
2= Disagree
3= Agree
4= Strongly Agree

  
 Component 1 (Smarter Balanced IAB or Illuminate 3rd Grade IAB) - Math
 1: 29%   2: 23%   3: 10%   4: 4%   N/A: 34%
 
 Component 1 (Smarter Balanced IAB) - ELA
 1: 27%   2: 21%   3: 10%   4: 4%   N/A: 38%
 
 Component 1 - SLA Part A
 1: 18%   2: 4%   3: 3%   4: 5%   N/A: 70%
 
 Component 2 - Math Milestone Task
 1: 21%   2: 18%   3: 23%   4: 12%   N/A: 26%
 
 Component 2 - ELA Writing Task
 1: 34%   2: 17%   3: 6%   4: 4%   N/A: 39%
 
 Component 2 - SLA Part B
 1: 18%   2: 4%   3: 6%   4: 1%   N/A: 70%

Total: 77

7. The following IDA results provided useful and actionable information.

1= Strongly Disagree
2= Disagree
3= Agree
4= Strongly Agree

  
 Component 2 - Math Milestone Task
 1: 24%   2: 18%   3: 26%   4: 8%   N/A: 21%
 
 Component 2 - ELA Writing Task
 1: 32%   2: 17%   3: 11%   4: 5%   N/A: 33%
 
 Component 2 - SLA Part B
 1: 18%   2: 8%   3: 3%   4: 3%   N/A: 64%

Total: 87

8. I believe the following IDA Component 1 provided an experience that prepares students for the SBAC Summative Assessment.

1= Strongly Disagree
2= Disagree
3= Agree
4= Strongly Agree

  
 Component 1 (Smarter Balanced IAB or Illuminate 3rd Grade IAB) - Math
 1: 14%   2: 23%   3: 24%   4: 13%   N/A: 26%
 
 Component 1 (Smarter Balanced IAB) - ELA
 1: 10%   2: 21%   3: 24%   4: 13%   N/A: 32%
 
 Component 1 - SLA Part A
 1: 10%   2: 6%   3: 13%   4: 3%   N/A: 68%

Total: 87

9. The reports that I accessed or viewed from Illuminate were useful for analysis and/or future planning. (Choose N/A if you have not accessed or viewed reports from Illuminate.)

 Strongly Disagree
 28 (32%) 
 
 Disagree
 16 (18%) 
 
 Agree
 18 (21%) 
 
 Strongly Agree
 5 (6%) 
 
 N/A
 20 (23%) 
 

Total: 87

10. How did you review and analyze results of the IDAs?

 Self-guided reflection using Illuminate reports
 36 (41%) 
 
 One-on-One meeting (eg: with coach/IRF/etc.)
 (0%)  
 Grade/department common planning meetings
 16 (18%) 
 
 Monthly faculty meeting
 (3%) 
 
 I did not review results
 15 (17%) 
 
 Sample
 (1%) 
 
 Self-guided reflection using Illuminate reports; One-on-One meeting (eg: with coach/IRF/etc.); Grade/department common planning meetings
 (1%) 
 
 on my own
 (1%) 
 
 Self-guided reflection using Illuminate reports; One-on-One meeting (eg: with coach/IRF/etc.); Grade/department common planning meetings; Monthly faculty meeting
 (1%) 
 
 Self-guided reflection using Illuminate reports; Grade/department common planning meetings
 (5%) 
 
 looked at work, but we hadn't covered the assessment material yet
 (1%) 
 
 NA
 (1%) 
 
 Grade/department common planning meetings; Monthly faculty meeting
 (1%) 
 
 Self-guided reflection using Illuminate reports; Grade/department common planning meetings; Monthly faculty meeting; Principal Meeting
 (1%) 
 
 We only got to look at milestone task data
 (1%) 
 
 Self-guided reflection using Illuminate reports; phone call with assessments office
 (1%) 
 
 Self-guided reflection using Illuminate reports; Monthly faculty meeting
 (2%) 
 
 Self-guided reflection using Illuminate reports; results were difficult to find and use
 (1%) 
 

Total: 87

11. Please provide general feedback you would like to share regarding any aspect of the IDAs, including what additional support, training, analysis tools, and/or reports you might find helpful.

 The fall IDA ELA component 1 - Edit/Revise - was 100% stupid. I would much prefer to have the option to choose which test to give my students. I nearly refused to give the spring component 1, but our testing coordinator and I looked at the Research questions and found them to be much more relevant tasks. More flexibility as to what would be useful for assessing my students at what point would be helpful. If there is an issue that there is only one set of high school questions, creating more of those would also be helpful.
 (2%) 
 
 I think the Component 1's are great for prepping the students for SBAC, but they would make more sense if they aligned with the spirals. The writing tasks should not use poetry.
 (2%) 
 
 The four-point score on the online test was generally unhelpful because it didn't provide actionable information. Milestone rubrics were unhelpful when the scores were almost entirely full-credit or no-credit, with very little information about the in-between.
 (2%) 
 
 As a school leader, I highly value meaningful, real-time, purposeful data to drive our instruction and ensure ALL students are able to progress with our curriculum. However, it is becoming clear to me and to many of my admin colleagues that there are just way too many assessments. They take too much time away from instruction. They do not sufficiently reflect the learning from the classroom instruction to warrant the amount of time dedicated. The data only tells the surface story of a child's learning. F&P and IWA are meaningful, useful and informative for classroom instruction re-planning. The IDAs have yet to prove to be much more than a superficial dipstick and a distraction from teaching and learning for even my most highly accomplished and veteran teachers. In fact, I have heard teachers discussing how the current trends in district assessment models are affecting the way they view their long-term plans for working within SFUSD. I feel it is a moral imperative that the district find a more effective means of evaluating students' growth and proficiency.
 (2%) 
 
 Sample
 (2%) 
 
 The student IDs were very frustrating. Many students did not have one. The spring IDA for Algebra 2 is not aligned to the curriculum, and the spring IDA for Algebra 1 was a repeat of the fall! That did not make sense unless the questions were different. The lack of helpful training for submitting Milestone scores in Illuminate was also frustrating.
 (2%) 
 
 Project based learning is more effective
 (2%) 
 
 Overall, navigation has been challenging. Administering assessments is confusing given the two modes of testing (Illuminate and SBAC). Granted, this is the first year officially using both, but the learning curve has been difficult.
 (2%) 
 
 I appreciate the interim assessments as an opportunity for students to learn how the Summative SBAC will work and as an opportunity to receive more student achievement data more regularly than once a year. However, I have several problems with the test and the data that is collected from it. Firstly, the types of questions are not as broad as the summative (which includes more interactive components in the math tests and more open response and interactive highlighting in the ELA). Because of that, it doesn't fully prepare students for the Summative experience, which in my opinion is one of the main benefits of the IDAs. In addition, I think the test is too long. For our GenEd students, most completed it in a single class period, but many who took it seriously needed two periods to complete. While these students may have an easier time decoding the questions and creating a response than our SDC and newcomer students, they had a difficult time experiencing breaks in their classes, especially with the two windows so close to each other. They just did them at the end of the Fall semester, and then we welcomed them back to school with more testing, since these were in addition to the IWA and the SRI. It is too much. And that experience for the GenEd students is much less anxiety-producing than it is for our students in SDC classes or our newcomer students who have very emergent English proficiency. This is a very soul-crushing and seemingly pointless test for them. To add to this, the data we receive only sorts students into 3 performance levels. This is not very helpful for our teachers, as students who are not proficient but are near proficiency are grouped together with students who are. In addition, there is no data available at the standard or strand level; all we receive is a score and an associated designation of Below Proficient, Near or At Proficient, and Above Proficient. This surface level of data has not been helpful for our teachers. It doesn't give them any feedback about which areas their students need more support in. The data mostly reaffirms what they already see in class, or when it doesn't coincide, is interpreted to mean the student didn't take it seriously and simply clicked through the test. Either way, it has been wholly unhelpful data. I would propose two options. First, that the test be much shorter and ask one of each type of question. One multiple choice, one free response, one computer-based manipulation, etc… This way, it actually prepares students for what they are going to see on the Summative test and it doesn't burn them out. Since the data reporting doesn't include any detail, a short test would be well aligned with a simple statement of proficiency. Alternatively, if the test were to remain the same length, I would propose the test questions be more varied and the level of data that can be made available be greatly expanded. Thank you for the chance to submit feedback. I look forward to learning how the AAO office responds to the feedback that they receive.
 (2%) 
 
 The rubric and what the students were asked to do in the instructions did not match. To get a 4, students were required to give information that was not asked for, and there was not a section to write the answer. The students should have been required to solve the problem using 1 or 2 methods, not all 4. Teaching the use of the 4 methods provides them a way to solve problems in a way that makes sense to them.
 (2%) 
 
 My site made major curricular changes---co-teaching and project based learning, so we had no time to review data nor to plan around the assessment expectations.
 (2%) 
 
 The IABs are completely useless to teachers and families. There are only 3 levels - below, at or near, and above grade level. The 2000 number has no relevance to future planning. The CLAs last year at least gave us data as to which standards were assessed, and we could analyze the data and support students with errors - with the ELA IAB we have no data about the types of questions they missed and how to guide their errors. We need to go back to using assessments that give teachers data they can actually use in the classroom. Also, the level of "at or near" grade level is unclear as to whether they met the standard or not.
 (2%) 
 
 Scoring Rubrics were poor in calibrating and focusing on how we should score them. Last year, there were mistakes in the rubrics and some questions didn't even have rubrics, but we thought we just had to score some things holistically. Easier on Illuminate where they can receive immediate feedback.
 (2%) 
 
 The math tasks are only doable one-on-one for kindergarten (and we hadn't really covered the material yet). I don't know when I would have that kind of time to assess my students one-on-one.
 (2%) 
 
 The IDAs and the standards on the report cards need to be aligned! It takes way too much time to score the IDA ELA Tasks and the Math Milestone Tasks for 33 students! Also, I could not access complete student reports or class summaries in Illuminate. Very disappointing.
 (2%) 
 
 I believe Component 1 ELA task for middle school is useful practice for the platform on which the kids will take the SBAC. However, the Milestone writing task had zero connection to what the kids had to do on the SBAC Performance Task last year. Put the time used for this back into the classroom. Ridiculous amount of time spent scoring these, as well. Collaborative scoring done by non-ELA teachers is often not particularly reliable. Ideally, a writing test should be scored by 2 people, not 1. But, again, this takes time away from the classroom. Trust teachers to teach writing and to assess it in our classes instead of doing more test prep. That's what these assessments are really about, no matter how the district spins them. They don't drive our instruction. Good English teachers won't let that happen. Nor will we teach to these tests. Spend money on additional literacy support for our students who read far below grade level. Spend money to pay us decent salaries. Re-think what the assessments office should be doing.
 (2%) 
 
 The ELA Writing Tasks were of no value at all, as we didn't teach the skills needed to complete the tasks. Would like to have copies of instructions on how to navigate CAASP and Illuminate. Test Coordinator should disseminate testing information in a timely manner so that we are better prepared in advance, instead of struggling at the last minute. Trying to access CAASP on laptops was a big problem at our school. Are the test coordinators supposed to ensure that the most updated CAASP browser is installed and easily accessed? Want to thank the AAO staff for always providing friendly, speedy services!
 (2%) 
 
 I would find these assessments useful if they were based on the pacing and spirals already provided by the district. For example, the 4th grade writing assessment was on Research, which is actually the 4th spiral but was assessed in the spring.
 (2%) 
 
 Getting the kids onto the tests is a nightmare. The process has too many glitches, their passwords are too long, and the tech is not up to it. By the time all of the kids are on, they are too stressed to do their best work. School sites should be provided with trained TAs to administer the tests.
 (2%) 
 
 I don't know what SLA stands for, so I answered N/A. Please provide an explanation of all acronyms. I also had to leave on personal leave because of my father's death and had to administer the tests before we went over the content. I feel as if the arbitrary dates the district assigns are rigid, inflexible, and ultimately ineffective at assessing students' understanding.
 (2%) 
 
 The new format was considerably different this year, but the support and instructions were inconsistent and confusing. It felt like a half-baked plan from the start, and the information continued to change throughout the year. My colleagues at AAO, including Jesch, were very helpful and responsive, but we did not feel like we were put in a position to implement this in an authentic way.
 (2%) 
 
 As an Assistant Principal of a large High School this took up an incredible amount of my time.
 (2%) 
 
 Because of the stress level with equipment and access, it's very difficult to equitably administer these tests. Also, the lack of time on the computer to practice reading and writing again makes it very inequitable, giving an advantage to those with a computer at home. Both the math and ELA IABs covered material we had not yet covered in class. The Writing Task for the ELA IAB #1 in 5th was not even a narrative, which is what we were teaching. It makes the results useless and the time wasted for the students.
 (2%) 
 
 The IDA is not aligned to the scope and sequence of Math and especially not to component 2 of ELA. Test window #2 for ELA writing task is exactly the same as the first focusing on poetry. This does not provide needed preparation in writing in other genres. BTW, having component 2 immediately before the 3rd grade IWA is redundant and not the best use of instructional time.
 (2%) 
 
 The deadline for taking the Math Component 2 is too early. It is a waste of time for students to be forced to take the test when they have not completed the unit yet. It will not provide accurate results and uses up valuable class time. I refuse to rush through the unit just so the timing makes more sense. All three teachers at my site were starting unit 6 when we gave the milestone for unit 6.
 (2%) 
 
 I would find it helpful to get specific information about which problems students missed in the IAB. It also would be good to see a bar graph from Cycle 1 to Cycle 2 to see if there were improvements, although I know they covered different subject matter.
 (2%) 
 
 The IDAs were not relevant in any way to what we were learning in class. Both writing tasks were based on my students' ability to analyze poetry and write about it. This is a skill that is commonly taught at the end of 4th grade. In addition, the second math IDA was for the end of a unit that we had just started. When my grade level team approached the assessments office about this, we were told that our pacing was off. I was extremely offended by this response, as I have been prompt with my pacing of the math curriculum and am actually ahead of my grade level team. These assessments have been causing unnecessary stress for my students. The first ELA writing task left 2 of my students with special needs in tears, and half of my class left with a poor self-image based on their ability to finish the assessment. I would push that these assessments be based on what curriculum is actually being taught in our classes.
 (2%) 
 
 More specific reports on standards per class.
 (2%) 
 
 Math Milestone Task is only a very small part of what students need to accomplish in CCSS Math Standards. It does not provide enough information to discuss with parents or for teachers to understand where students are in accomplishing CCSS!
 (2%) 
 
 Too many directions that were not clear. Too many choices online when taking test- it would be helpful to only have the test section that is required active rather than a series of assessments to choose from that are not required. Clearer communication between LEAD, schools and AAO about what test scores and terminology really mean. Alignment of the core curriculum timeline with the assessments- why assess just to assess with no real value of the results since teachers did not get to that standard?
 (2%) 
 
 The IABs, as they stand, are not aligned with my teaching, the district curriculum, our school's monthly plans, or our new "spiral" system, as required by SFUSD. If this continues, I'd rather opt out of taking them.
 (2%) 
 
 Prior to administering the IDA in the fall, students had had no computer training. They were very challenged by questions that asked them to highlight or drag information. The spaces in math sections where they were to calculate were also very awkward to use. Could tutorials on these aspects be given before giving the IDA?
 (2%) 
 
 All of these assessments, both on the computer and the written ones that I have to give, take A LOT of time. It is stressful for both the students and me. It takes a lot of time out of our regular instruction, and a lot of OUR OWN TIME to grade them. I already know how my students are doing; I do not need another tool to see how students are doing that takes so much time. If the District wanted to help, then they should get professional testers out here and test the students themselves instead of making us do it all. Additionally, the math curriculum that the District is currently having us do IS AWFUL. I have to scramble to teach what the students should know, and get extra practice. I do not feel that they have enough practice with what the District provides. It's truly awful, and my whole school site agrees.
 (2%) 
 
 The physical administration of these tests is still a major problem at our school site due to the lack of accessible technology and ongoing connectivity issues. Having to make complex schedules, moving computers and students in order to access computers, is very time consuming and disruptive to scheduled instruction. We simply do not have enough computers, and our teachers still lack training. The AAO office does not provide hands-on training - only lecture-style training. Many teachers experience a great deal of frustration with administration of the interim test and are vocal in their unwillingness to lose instructional time. We need to have training time that is required PD for Math and English, and that is centrally provided. When teachers are given options for PD they do not choose this training, yet lack fundamental skills to successfully troubleshoot all the problems that invariably occur during these administrations.
 (2%) 
 
 We have an essentially unusable math curriculum that does not evince strong development, logical progression, internal coherence, or editing. So it is not fun, useful, or meaningful to administer assessments from it, since the likelihood that these assessments match the skills taught is around zero. So in my abundant (hee hee) planning time, I have the exciting task of writing my own math program that teaches the skills the assessments expect. Since these skills aren't even aligned with the already remarkable Common Core standards but are higher, this means I am extremely far behind in math instruction. My students did very well on the milestone task at the cost of hours of frustration and not having enough time to teach other skills this year. The testing is driving instruction in terrible ways.
 (2%) 
 
 I teach beginner ELs. Almost everyone received 0 or 1 for all components.
 (2%) 
 
 The IDA is unnecessary. It eats up valuable instructional time to supposedly "give" teachers feedback that we already get from our other assessments and in-class work.
 (2%) 
 
 The support I would like is the flexibility to assess my students based on the completion of the unit. There are times when I have given the IDA when I had just given the Entry Task in math the day before. I would like to have the flexibility to take time to reteach my students content during a unit without having to assess them without any lessons and cause stress they don't need. I am finding that many of my students are not wanting to participate in the assessments now because they feel that they cannot do the task at the time you want them to. When I have given it to them after they are taught the lesson, they eagerly do it well, but it is too late for you to see the results because it is not within your timeframe. I need flexible assessment completion.
 (2%) 
 
 Teachers should have training before they need to give the test, then training on how to report the results, and training on how to use or look over the results. Since I didn't grade my test, the assessment was useless and a waste of students' time.
 (2%) 
 
 Student scores need to be broken down to address each standard assessed. The overall score didn't really tell any new information. I couldn't find a class report on each standard.
 (2%) 
 
 Knowing what, when, and how at the beginning of the school year; organizing test time at a school and department level and keeping the messaging consistent. A clear, consistent way to communicate this information is necessary. There has been some confusion about what gets administered when, and how the grading/scoring procedure will be done.
 (2%) 
 
 Spring IAB Math Feedback: Milestone Tasks. Hoover administers IAB Milestone Tasks and devotes a half day per grade level to collaboratively score the tests. We have the following concerns about the Milestone Tasks administered during the Spring window. Pacing: The staff is of the opinion that the test should be administered at the discretion of the teacher when the appropriate unit has been taught. The windows offered by the district have been too soon for the pace at which teachers at all grade levels are teaching the material. In addition, administering the test at the discretion of the teacher would allow teachers to adjust the order or pacing of units to allow for projects, as well as other deeper applications of the math. Finally, if the district seeks to compare the results across schools, it would make more sense to assess whether the material taught has been effectively learned by students. Tests administered too soon yield little, if any, summative information. The Hoover math staff is involved in the complex instruction work as well as the development of common core curricula for math, both of which seek to put student learning before "covering the material"; the assessment structure should do the same. 6th Grade: The task is decontextualized and provides little to engage a student's interest. The choice of "We expect at most 5 to attend" is needlessly confusing and does not reflect common or academic speech. Perhaps 5 examples, all of which needed to be shown graphed on a number line, would be easier to score, without opportunities to be confused by the directions to "circle" expressions that have 5 in the answer. 7th Grade: The length of the diagonal side is mathematically impossible; it can't be 7 given the lengths of the base and height. The directions should specify that the trapezoid is isosceles, and the other dimensions need to be changed. 7th grade teachers have expressed frustration with the out-of-order use of parts of CPM, when the groundwork for the CPM materials is laid out in the first units of the book. 8th Grade: The choice of Jet Ski rentals is culturally incompetent. Staff said it reminded them of similar choices about mowing lawns, etc., which are not reflective of the life experiences of most of our urban students.
 (2%) 
 
 The deadlines for the IDAs don't correspond to the pacing of the math program. For example, we have always been half a unit or more behind the task the students are being asked to complete. Essentially, students are being assessed on skills they have not acquired.
 (2%) 
 
 Reports breaking down student performance by standard would be useful.
 (2%) 
 
 Our Chromebooks took a while to load the test, and we didn't realize that some students just needed to wait it out for the fully white screen to disappear, so that the test can load. It would be helpful to have an extra person in the classroom to help students with the computers and troubleshoot the Chromebooks for taking the IAB. I am unhappy that I had to give tests twice now that covered units that I did not even teach yet. How is the data from these tests supposed to be useful if my students have not even learned the material? We are creating and feeding a mindset for students to hate tests instead of seeing the tests as useful feedback for them and for the teacher. I hate being pressured to a testing window that is unfair because we are trying to respond to the needs of our students.
 (2%) 
 
 This test asked my students about skills, like writing on the theme of a poem, that we had not covered yet. My students felt confused and frustrated. Furthermore, grading the assessments was time consuming and not helpful. The scoring system on Illuminate was slow to use, the rubric had too many drop down options to be scored, and scoring my students on something they have not learned did not feel like a good use of my time when I have so many other more urgent tasks to attend to as a classroom teacher.
 (2%) 
 
 The results of the computer-based assessments are not at all helpful to inform instruction because the reports do not give information on how students did on specific standards. And, often, the content being tested was not what the students have been working on in class. Also, the names Component 1 and Component 2 are not helpful for communication. For me, SBAC Predictor would be more helpful and accurate for Component 1 (or is it 2?) and the Milestone tasks and writing tasks could just be called that.
 (2%) 
 
 The SBAC test includes test questions about areas of the curriculum that were not covered at that time according to the district's own scope and sequence in math. The writing task covered a topic not yet covered according to the spirals in Language Arts and students learn the futility rather than the potential value of these assessments. They also take significant teaching time and redirect use of technology better spent as educational resources.
 (2%) 
 
 I find the IDA too time consuming. It is administered right during report card periods and during the 3rd grade writing test. I feel that there are too many tests that the district is requiring us to administer. I have difficulties inputting scores and accessing results on Illuminate. The results do not help me plan.
 (2%) 
 
 have not been able to find useful results on Illuminate--would need further training
 (2%) 
 
 1. No alignment between units taught and selected topics for computer-based portion means the data is not informative. 2. The reports should provide levels of proficiency for specific content standards, otherwise they do very little to inform my instruction. This is also true of the SBAC.
 (2%) 
 
 The assessments were very confusing to administer. The content was not aligned to what I was teaching and does not take into account children's developmental levels.
 (2%) 
 
 These were the most confusing and frustrating assessments I have ever given in all my 20 years of teaching! It was torturous to administer and at one point it took over an hour just to log in!!! In addition, they were inappropriate for third graders. Ridiculous demands on an 8 year old--passwords/case sensitive/not to mention that most "hunt and peck". Then, after we slogged through all the tests, our administrator told us that there were complaints that we didn't do it in a timely manner. I'd like people from the downtown office to come and try it!
 (2%) 
 
 Overall these assessments were confusing not only for students but teachers as well. They caused stress and frustration for all students. The assessments are not developmentally appropriate for the children, and do not show an accurate picture of what the students know. It truly saddens me that we were required to waste valuable learning time to administer these.
 (2%) 
 
 The results are useless. The CLA results offered a much clearer picture of student and class performance. Overall, the IDAs were futile for teachers, students, and parents. Please align assessments to what is being taught (use pacing guides).
 (2%) 
 
 I would not like to be pulled from my classes to score the math milestone tasks. I would prefer that they be scored by someone at our district, rather than on-site.
 (2%) 
 

Total: 56