Thursday, November 13, 2014

ESEA Flexibility Waiver Renewal Guidance is Posted

This afternoon (Thursday, November 13) the U.S. Department of Education posted the guidance on applications for renewal of state flexibility under the Elementary and Secondary Education Act and No Child Left Behind. States have until March 31, 2015, to file their applications for renewal. Those Window 1 and 2 states (which include SC) that want expedited review must submit by January 30, 2015.

States can request a three-year renewal, and those from Windows 1 and 2 that are "fully meeting" commitments to timelines and principles can request a four-year renewal to School Year 2018-19.

Tuesday, November 11, 2014

Legal Implications of Educator Performance Assessment

Dr. Diana Pullin will discuss the issues in her recent paper, Performance, Value, and Accountability: Public Policy Goals and Legal Implications of the Use of Performance Assessments in the Preparation and Licensing of Educators. The paper was published by the Council of Chief State School Officers (CCSSO) and the Stanford Center for Assessment, Learning, & Equity (SCALE). 

Webinar - Thursday, November 20
11:00 a.m. Eastern
To access the webinar you can click here starting at 10:45am on 11/20.
Meanwhile, you can review the paper here. It covers constitutional, statutory, and civil rights issues, as well as questions about the quality of assessments, privacy, intellectual property, and whistleblower situations. 


Meeting number:
719 710 419
Audio connection:
1-866-469-3239 Call-in toll-free number (US/Canada)
1-650-429-3300 Call-in toll number (US/Canada)
Access code: 719 710 419

Wednesday, October 29, 2014

$C Intent to Award to SAS

Following what appears to be the MMO MO (e.g., the ACT - DRC procurement), SC has issued an intent to award a contract to the highest - not lowest - bidder in the quest to have a value-added measure for principals and teachers of "tested" grades and subjects. Executive Information Systems, LLC, reseller for SAS EVAAS, bid $3.8 million for a three-year contract. Rumor has it at least one other contestant bid $2.1 million over three years.

Sunday, October 26, 2014

Test Topics

UPDATED
You've heard the rumbles about the amount of testing in schools, teaching to the test, test cheating scandals, testing boycotts by parents, and the use of test results to evaluate teachers. Now there is a move to reduce the volume and increase the quality of assessments, at the same time that districts are being asked to create more pre- and post-tests for every grade and subject for "student growth" measures used in educator evaluation. This movement is layered over state accountability testing requirements, and the federal ESEA requirement of testing ELA and math (grades 3-8 and once in high school) and science (once in elementary, middle, and high school).

On October 15, the Council of Chief State School Officers (CCSSO) and the Council of the Great City Schools (CGCS) issued a joint statement of commitments on high-quality assessments with statements of support from education leaders of states and large-city schools. While the organizations are not moving away from "assessments given at least once a year," they are saying that assessments should be

  • high quality
  • part of a coherent system and 
  • meaningful
What does that mean? In 2013, CCSSO issued its definition of "high quality" for ELA and math college- and career-ready standards, as did the U.S. Department of Education in its requirements for ESEA flexibility waivers ("high-quality assessment" is required in principle 1). As to coherence:
Assessments should be administered in only the numbers and duration that will give us the information that is needed and nothing more.
And meaningful relates to improving instruction and informing parents -  "timely, transparent, disaggregated, and easily accessible."

Both organizations are inventorying the assessments given. CGCS preliminarily reported that students take 113 assessments across grades K-12, with the most in 11th grade. The ultimate plan is to
Streamline or eliminate assessments that are found to be of low quality, redundant, or inappropriately used.
In the accompanying webinar there was explicit reference to eliminating multiple tests with overlapping purposes, or assessments that are no longer aligned to mastery of the content being measured.

Meanwhile, districts are scrambling for student growth measures to include in teacher evaluation student learning objectives (SLOs) for the ironically named "non-tested" grades and subjects. And the "next generation" innovators are moving away from one summative test towards learning progressions with mastery assessments. EdWeek reports New Hampshire is proposing a pilot in which the summative SMARTER Balanced Assessments would be used in some grades/subjects but the state-developed PACE performance assessments would be used in others. Linda Darling-Hammond, Gene Wilhoit, and Linda Pittenger recently published a call for a new paradigm for accountability and assessment (here are links to the long version  and the brief).

So here we are with the sometimes conflicting purposes of wanting better student measures, better information on teacher effectiveness, less "seat time," more "personalized" learning, and more meaningful accountability systems.

Meanwhile in SC, we've come up with an innovative way to avoid having teachers "teaching to the test" - don't select the test until ?November? The September intent to award a statewide contract to ACT was protested by DRC. The hearing started October 23. No one is predicting when a decision will be made on which assessments students will take in Spring 2015.

UPDATED 10/27: For an interesting take on what might be behind all of this, check out the EdWeek Politics K-12 team's post.
And apparently the ACT-DRC procurement protest hearing went late on Friday (10/24); SCDE posted a "Supplemental Statement" objecting to a few things.
The Consortium of Large Countywide and Suburban Districts (which includes Greenville, SC) has also come out with a letter to Secretary Duncan in support of fewer summative assessments of higher quality.

Thursday, October 2, 2014

SC Requests Flexibility on Use of Test Score Measures in Evaluation

The actual request for the flexibility to use the test score measures for informational purposes only was sent by the SC State Superintendent today.

Monday, September 22, 2014

ACT - The New SC Assessment Provider?

UPDATED
Although it appeared that ACT Aspire had won the bid to become the new vendor of English language arts and mathematics assessments for grades 3-8 and high school in SC, DRC filed a protest on September 30. The award would be $58.4 million through 2019. Act 200 of 2014 required that the State withdraw from the SMARTER Balanced consortium (which Dr. Zais had already done) and select a new assessment by September 30.
DRC submitted the only other proposal. Rumor has it that the DRC proposal was at substantially lower cost. 

SC would become the second state (first being Alabama) to have statewide adoption of ACT.

In terms of educator evaluation, according to its website the ACT Aspire suite uses student growth percentiles, and can aggregate growth statistics. Meanwhile, the vendors who submitted proposals to do value added measures are making presentations this week. 

Stay tuned for more fun changes. 

Wednesday, September 17, 2014

SC may hold off use of test scores in evaluation?


Updated
On September 16, Alyson Klein of Politics K12 - Edweek published a blog post on "Which NCLB Waiver States May Delay Using Test Scores in Teacher Evaluations?" In that day's version South Carolina was not mentioned other than as a state on the map at the bottom (the map is very helpful, btw).
Then on September 17, the article showed as "UPDATED" and SC was listed here:
Seventeen states told Education Week that they are likely to ask for the flexibility, or were already planning to hold off on using test scores in evaluations, including: Alabama, Arkansas, Connecticut, Delaware, the District of Columbia, Georgia, Idaho, Kansas, Maryland, Michigan, Mississippi, Missouri, Ohio, Oregon, Rhode Island, South Carolina, South Dakota, and Utah.
So the good news was that Dr. Zais is apparently thinking about applying for that additional flexibility.

But then on September 23, SC was moved to the "no-with an asterisk" category. "And South Carolina is a special case, in that the state uses multiple years of growth in student test scores."

So what could have been great news for SC educators is now back in limbo.  Keep sending those requests to Dr. Zais.

UPDATE: Dr. Zais did send in the request on October 2, 2014.

Monday, September 15, 2014

Revised Education Leader Standards

CCSSO and NPBEA have issued for public comment a draft of the refreshed education leader standards, aka "ISLLC." You can review the draft and make comments between now and October 10. The press release is here.

Friday, September 12, 2014

Weekend SLO Reading



SC educators - if you're not going to the USC-Georgia game tomorrow, you might want to browse through the SCDE's training materials on writing student learning objectives. Over 30 fun files chock full of information, including 12 sample SLOs.

District leaders - it includes checklists-worksheets for your planning and communication purposes.

You can find the entire package at:
https://drive.google.com/folderview?id=0BxMUlIE0XbmdY1paVG9JYXlYbkE&usp=sharing

Tuesday, September 9, 2014

Official Slow Down on SLOs

The SC Department of Education (SCDE) has received permission from the US Department of Education (USED) to delay until SY 15-16 the implementation of student learning objectives (SLOs) as the "student growth" piece of educator evaluation for "non-tested" grades and subjects. In her letter of August 21, Assistant Secretary Delisle indicated that USED was considering other flexibility on a state-by-state basis. The SCDE had requested the delay so that planning and training could be done in SY 14-15.

SCDE has not requested the other flexibility offered in that letter - the one year delay in using student test scores as the "student growth" measure in evaluation for those with "tested" grades and subjects.

The ESEA waiver originally required that both of these pieces be in place by the beginning of SY 14-15, i.e., right now. It's certainly a relief that SLOs are delayed. (The first official SCDE training starts September 11, 2014). Since we don't know what assessment will be given to students this spring (the law requires selection by September 30, 2014), it would make sense to request this other piece of flexibility. Value-added measures would still be calculated - they just would not be included in the "multiple years" of student growth used to evaluate teachers and principals. Only the State Superintendent is authorized to apply for this flexibility. Will he?

Friday, August 29, 2014

Waiver World

More news came out this week in NCLB Waiver World. Of interest are extensions for Indiana and Kansas, and the refusal to extend the waiver in Oklahoma. Lessons include: you don't need to specify a test score percentage for student growth as a "significant factor" in educator evaluations, and if you're not adopting common core, your higher ed institutions need to be ready to certify the standards you do adopt as "college ready."

"Significant Factor"
Waiver Principle 3 requires changes to teacher and principal evaluations. States must include student growth as a "significant factor," and in the ESEA-required tested grades and subjects (ELA and math in grades 3-8 and once in HS; science once in elementary, middle, and high) those tests must be part of student growth. Kansas proposed guidelines without specifying percentages. As reported in the Topeka Capital-Journal, "Test scores will be one of the indicators, but the state won’t mandate what percentage that must comprise of a given teacher or principal’s evaluation." 
This approval is interesting given representations made by some that the USED won't approve waiver amendments if certain percentages are not included in a state's guidelines. Now we have proof that USED will approve plans with more flexibility. 

College- & Career-Ready Standards
Both Oklahoma and Indiana moved away from the common core state standards and related assessments, which had been their method for complying with Principle 1 of the NCLB waiver.
Indiana went the "Virginia" route. Indiana's State Board adopted the 2014 Indiana State Standards and its Commission on Higher Education certified that students meeting those standards would not need remedial course work in post-secondary education.
Oklahoma decided to revert to its 2010 state standards. USED gave the state until August 12 to supply evidence that its higher ed institutions had certified the standards as "college- and career-ready." Oklahoma sent a letter stating it could not submit evidence by the deadline, and did not have a timetable for getting it - so the waiver was not extended for School Year 14-15.
Other states facing the anti-common core movement should take note if they want to keep their NCLB waivers: make sure your higher ed institutions are on board.
Oklahoma faces the challenges that Washington state is experiencing in undoing the waiver and reverting to NCLB metrics.

Tuesday, August 26, 2014

Data From the Changing Evaluation Systems

Interesting reads in the educator evaluation sphere included two that came out today.

In Do Evaluations Penalize Teachers of Needy Students?,  Stephen Sawchuk at EdWeek pulls together data analysis from DC and Pittsburgh and concludes:
                
There are a few takeaways from all of this: 
  • Critics have lambasted "value added" systems based on test scores as favoring teachers of better-performing students. But alternative measures, like observations and surveys, appear to be just as susceptible.
  • It's hard to know based on currently available data whether these patterns reflect flawed systems or a maldistribution of talent; in fact, it could be a combination of both—but as Di Carlo writes about D.C., "none of the possible explanations are particularly comforting."
  • Could the likelihood of lower scores discourage teachers from wanting to work in schools with more minority students or disadvantaged students? 
At a time when states are to develop new Title II equity plans (due to USED in April 2015), these takeaways are especially troubling. 

And Bellwether Education Partners released today Teacher Evaluations in an Era of Rapid Change: From "Unsatisfactory" to "Needs Improvement." Of their 5 conclusions, one of concern is the variation among districts within the same state. Here's one example - their graphic for some Florida counties: 
Hillsborough or Manatee are looking like better places to work than Pasco, right? I wonder whether graphs like these will become recruitment tools. 

Thursday, August 21, 2014

Yea for Delay

USED's Secretary Delisle is offering states the opportunity to postpone using new state assessments as part of "student growth" in teacher and principal evaluation systems. (Excerpted below.) Many have asked for this delay for many reasons, including that most states will be implementing new "college- and career-ready" assessments. There's also a hint that other flexibility might be available.

Dear Chief State School Officer:
 ...
The 2014-15 school year is an important one as we continue an essential but complex transition period. Most States will be fully implementing new, rigorous academic standards while also transitioning to new State assessments and implementing educator evaluation and support systems. We continue to hear from educators– just as you do – about the importance of ensuring this transition occurs in a thoughtful and strategic manner. Thus, I am writing to inform you of an important new element of flexibility we will offer States based on your experiences in implementation.
 ...
 However, the U.S. Department of Education (Department) also recognizes that building these systems is complex and that effectively implementing them can pose challenges. As this work has evolved, although they have the necessary authority to implement the systems fully, many SEAs have indicated a need for additional time to incorporate student growth based on State assessments into educator ratings for teachers and principals of tested grades and subjects during the transition to new assessments in 2014–2015.  Still, other SEAs have informed the Department that they need to modify their implementation plans in other ways due to lessons learned or challenges facing their LEAs.   The Department has heard those concerns, and will grant the following additional flexibilities to individual States that need them:

  • SEAs that need flexibility to delay inclusion of student growth on State assessments in evaluation and support systems during the transition to new assessments aligned with college- and career- ready standards.  The Department is offering SEAs transitioning to new assessments the flexibility of additional time to incorporate student growth on State assessments for one year, during the transition to new assessments, which most States plan for 2014-2015.  This flexibility is available if the SEA provides two assurances:

1.   In addition to continuing to implement their educator evaluation systems using multiple measures of student growth, the SEA or its LEAs will calculate student growth data based on State assessments during the transition year for all teachers of tested grades and subjects, in order to ensure and improve SEA and/or LEA capacity to make these calculations in an accurate manner going forward; and
2.   Each teacher of a tested grade and subject and all principals will receive their student growth data based on State assessments for the 2014-2015 school year in order to provide educators with all available information and build a deeper understanding of the information and its uses. 
 ...
Sincerely,


Deborah S. Delisle

Thursday, August 7, 2014

NJ Reducing Test Scores to 10% of Ed Eval

On August 6, 2014, the NJ State Board of Education approved regulation changes that would reduce the weight of test scores in the evaluation of teachers of math and English from 30% to 10% in SY 14-15. Under the changes, "student growth" for those teachers would also include Student Growth Objectives (SGOs, aka Student Learning Objectives) weighted at 20%. Observations will be 70% of their evaluations. Teachers of the "non-tested" grades and subjects will have SGOs at 20% and observations at 80%. The reg changes also include an expedited review for teachers who contest SGO scores that drop their overall ratings to "partially effective" or "ineffective."
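Under those weights, a teacher's summative score is simply a weighted sum of the component scores. Here is a minimal sketch of that arithmetic; the function names are mine, and it assumes (an assumption, not something the NJ regs spell out here) that all components are reported on a common numeric scale:

```python
def composite_rating(test_growth, sgo, observation,
                     weights=(0.10, 0.20, 0.70)):
    """Composite for a SY 14-15 NJ math/English teacher under the
    revised weights: test-score growth 10%, SGOs 20%, observations 70%.
    Assumes all three components use the same numeric scale."""
    w_test, w_sgo, w_obs = weights
    return w_test * test_growth + w_sgo * sgo + w_obs * observation

def composite_rating_nontested(sgo, observation):
    """Non-tested grades and subjects: SGOs 20%, observations 80%."""
    return 0.20 * sgo + 0.80 * observation
```

For example, a teacher scoring 2.0 on test growth but 3.0 on SGOs and 4.0 on observations still lands near 3.6 overall, which illustrates the point of shrinking the test-score weight.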


Just another example of how states have the flexibility to define "student growth" in a way that does not overly rely on the test score measures from new tests that will be coming online in SY 14-15.

Wednesday, July 23, 2014

USED Grants MD an Extension on Use of Test Scores in Evaluation

The Baltimore Sun has reported that Maryland has received a one-year extension on the use of test scores in its educator evaluation system while they work through the change to a new assessment.

Wednesday, July 9, 2014

Evaluation Frequency

How frequently is educator evaluation required?

The federal incentives. To receive an ESEA waiver states must require evaluation of teachers and principals "for continual improvement of instruction . . . on a regular basis. . . ."  (FAQ C-51). That was a change from the Race to the Top grant competition, which required "annual evaluation of teachers and principals," so in looking at what "most" states are doing, it is important to distinguish the Race to the Top states from the other waiver states.

States. State law also impacts how frequently educators must be evaluated. Although many states began with an annual, full evaluation, some are now pulling back to require limited, informal, or less frequent review. For example, Rhode Island's General Assembly amended its laws to require review every three years for "highly effective" teachers, every two for "effective" rated teachers, and annual review for all others. Another proposal would have stretched the interval to every 4 or 5 years. Arkansas and Virginia have reviews every three years.

South Carolina. In SC, "continuing contract teachers must be evaluated on a continuous basis," which may be "formal or informal." S.C. Code 59-26-30(B)(5); S.C. Reg. 43-205.1.V.B. The 2006 ADEPT guidelines interpret "continuous basis" as "every year." Those guidelines refer to goals-based evaluation as "informal" evaluation, which may be on a 5-year cycle with annual review. Continuing contract teachers recommended for "formal" evaluation "must be notified in writing no later than April 15," with written notice of the reasons and description of the process. S.C. Reg. 43-205.1.V.B. The June 11, 2014 guidelines require that "[a]ll educators must be evaluated on an annual basis." "However, the type and extent of the evaluation must be based on the intended purpose of the evaluation (see Section 7 below), the educator's level of experience, the educator's prior effectiveness rating(s), and the educator's current performance." (pp. 25-26)

Although the SC statute calls for principals to be evaluated "at least once every three years," the regulations for tier 2 principals require a full evaluation of all principal standards "every other year" with at least Standard 2, Instructional Leadership, in the alternate year. S.C. Reg. 43-165.1.III.B.(1). Again the June 2014 guidelines require "on an annual basis" subject to purpose, experience, prior ratings, and current performance.

Questions. While a full, annual evaluation might be desirable for giving educators feedback to improve instruction, questions remain about how that gets implemented well and whether evaluators have capacity to complete that number of annual evaluations. How the evaluation elements are defined also matters. E.g., how many observations by how many people for how long? The experiment that is underway because of Race to the Top and the ESEA waivers could give us interesting information on these questions.

Monday, July 7, 2014

Equity Plans and Effectiveness

The US Department of Education announced its  Excellent Educators for All Initiative today. This is a new version of the old equity plans that states have had on file for a long time, but those plans related to distribution of highly qualified teachers. The USDE wants states to file new plans by April 2015.  Before writing the plans state educator agencies are to gather data: "To prepare a strong plan, each SEA will analyze what its stakeholders and data have to say about the root causes of inequities and will craft its own solutions."

What does this have to do with evaluation? 

The old plans looked at distribution of highly qualified educators. The new plans will likely look at distribution of both highly qualified and highly effective educators - the ones with the highest evaluation ratings, including student growth as a significant factor. So the issue will be how will states encourage districts to encourage the highest rated teachers to go to the schools with the biggest equity issues? 

The Great Teachers and Leaders Center has been preparing for this announcement - it has an entire "Learning Hub" called Moving Towards Equity that is available for states to use for free. They also offer states free technical assistance. 
The USDE letter also mentions a support network, but Edweek's Politics K-12 reports that this is only getting $4.2 million to be used nationwide - less than would be given to three schools under the School Improvement Grant program. 

Despite the free technical assistance, this is likely to be one more un- or under-funded mandate layered on states, that trickles to districts and impacts teachers, especially the highest rated ones. 

And don't forget that the USDE was asking for equity-effectiveness plan updates in the original ESEA waiver extension guidance - but backed off of that when states pushed back. 


Thursday, July 3, 2014

What if SC's ESEA flexibility waiver isn't renewed?

The US Department of Education announced July 3 that it has extended the ESEA-NCLB flexibility waiver for six states until the end of school year 14-15. SC was not in this first announcement. See the Politics K-12 Edweek blog  on this. Technically SC's waiver ended with SY 13-14. 

Changes to the evaluation systems occurred so SC could keep its ESEA waiver. Will those changes matter if SC's waiver is not renewed? What are the issues related to renewal for SC? 

To get a waiver, states had to agree to: 

1. Adopt "college- and career-ready standards" and "high-quality assessments." Both terms in quotes are defined in the waiver materials. 

  • SC's legislature has required new standards starting in SY 15-16. Will those be "college- and career-ready"? Standards must be either "common to a significant number of states" (aka common core state standards), or "approved by a State network of institutions of higher education, which certify that students who meet the standards will not need remedial course work at the postsecondary level." So whatever standards are developed would need to be "certified" by SC's higher ed institutions. 
  • The legislature also prohibited use of the SMARTER Balanced assessment. A new assessment is required by statute to be selected by September 30 - but will it meet the "high-quality" definition? The assessment must cover the standards and be "valid, reliable, and fair for its intended purposes," but it must also cover the full range of standards no matter how difficult to measure; elicit complex student demonstrations or applications of knowledge and skills; provide an accurate measure for the "full performance continuum, including high- and low-achieving students"; provide accurate growth measures over a full course/year; provide growth data that can be used to determine whether students are on track to being college- and career-ready; assess ELLs and students with disabilities; provide alternate assessments; and produce data "that can be used to inform: determinations of school effectiveness...; determinations of individual principal and teacher effectiveness for purposes of evaluation; determinations of principal and teacher professional development and support needs; and teaching, learning, and program improvement."
  • SC has pending proposed changes to its application concerning assessment. 
2.  Adopt a State-developed differentiated recognition, accountability, and support system. That's the A-F grades that are assigned to schools and districts based primarily on mean scale score changes in testing. SC recently (June 2) received approval for 22 changes, all related to this element 2. 

3.  Support effective instruction and leadership - primarily through changes to principal and teacher evaluation and support systems. SC has a pending amendment request from March, which is when SC's extension was requested. 

4. Reducing duplication and unnecessary burden - states were to reduce red tape. Most people have forgotten that this requirement even exists, and USED did not ask for state plans on it. 

So how likely is it that SC's extension request will be granted? USED has frequently said it wants to work with states to get to "yes" on waivers. Given the uncertainty on principle 1 - standards and assessments - it may be a while before we know whether SC will receive an extension for SY 14-15. And if the extension is not granted, what will happen to the evaluation changes?  Stay tuned. 

Wednesday, July 2, 2014

Help from Washington, DC?

Assistant Secretary Deb Delisle (from Ohio and recently a resident of Myrtle Beach) sent a notice today to state school chiefs that the US Department of Education will implement a process "that provides SEAs [State Education Agencies] with support that will enable them to meet the requirements" of the educator evaluation sections of their waivers. 

What does that mean? 

Later in the notice she says States will "continue to progress with implementation of their teacher and principal evaluation and support systems," while USED is "offering flexibility where needed for targeted, State-specific adjustments to implementation steps, timelines, and sequencing."

What does that mean? 

Probably that so long as SC is implementing and progressing on its evaluation systems, it can extend a few timelines - like the one that has been requested related to SLOs. 

So maybe we don't have to implement SLOs in August 2014. 

Tuesday, July 1, 2014

SLOs in SC - Part IV: Growth Targets

SLOs are being used to measure student growth in "non-tested" grades and subjects. The theory is that educators will be able to look at baseline student achievement assessment data, set a year's growth target that is rigorous and appropriate for a student based upon that data, implement instructional strategies and progress monitoring, collect post-assessment data, and determine whether a student met the growth target. And measures of student achievement must be "comparable" within the district.

Assumptions about setting growth targets include that -

  1. A valid, reliable baseline assessment exists that covers the course content standards (and a range of standards above and below that grade level?); 
  2. Sufficient information exists about the assessment to determine what one year's growth (or perhaps "typical growth"? Or "rigorous"?) should be for students whose baselines are at varying levels; 
  3. A valid, reliable post-assessment exists that covers the course content standards for that grade level (+/- other grade levels?); and
  4. The course content standards for all grades and subjects are of equal difficulty (is high school World Geography equal to high school Calculus?).
The American Institutes for Research (AIR) have materials posted on the site for the Center for Great Teachers & Leaders that discuss various methods (and pitfalls) for setting growth targets.


  • Basic Growth Targets - All students have the same growth target - e.g., increase 20 points between the pre- and post-assessment. 
  • Formula Growth Targets - E.g., all students will grow by half the difference between 100 and their pre-assessment score. A student with a baseline of 50 would have a target of 75 ((100 - 50)/2 = 25; 50 + 25 = 75). 
  • Performance Level Targets - A student's performance level will increase by 1 between fall and spring. Although PASS courses will use VAM, an example would be a Below Basic 1 student moving to Below Basic 2, or a Below Basic 2 student moving to Proficient. 
  • Individualized Targets - For example, NWEA provides various targets on MAP (average expected, comparison groups) based upon a student's pre-test score. 
  • Tiered Growth Targets - Students are grouped based on the pre-assessment and given tiered targets. Either similar scores or similar increases can be used. Advanced Tiered Growth Targets set expectations at a baseline or +X points, whichever is greater. For example:
      Pre-assessment score → Growth target
      0 - 45 points → 65
      46 - 70 points → 75
      70+ points → 85
None of these methods for setting growth targets is perfect. For example, an "average" expected growth target means that roughly half of the students (and therefore teachers with that target) will reach it, and half will not. Basic growth targets do not account for closing achievement gaps. Formula targets may not provide enough rigor and "stretch" for top-performing students.
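To make the mechanics concrete, here is a minimal sketch of three of the target-setting methods described above. The function names, the fixed 20-point gain, and the 100-point ceiling in the formula method are illustrative assumptions, not anything specified by AIR or SCDE:

```python
def basic_target(pre, gain=20):
    """Basic growth target: every student gains the same fixed amount."""
    return pre + gain

def formula_target(pre, ceiling=100):
    """Formula growth target: close half the gap to the ceiling.
    A baseline of 50 yields (100 - 50)/2 = 25, so a target of 75."""
    return pre + (ceiling - pre) / 2

def tiered_target(pre):
    """Tiered growth target using the example bands from the post."""
    if pre <= 45:
        return 65
    elif pre <= 70:
        return 75
    return 85
```

Note how the formula method builds in more "stretch" for low scorers (a baseline of 20 must reach 60) than for high scorers (a baseline of 90 only needs 95), which is exactly the rigor concern raised above.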

Because SC teachers have (hopefully) a year to get ready for implementing SLOs in SY 15-16, it is highly recommended that during SY 14-15 assessments be reviewed, and data on "typical one year's" growth be collected.  This year when it doesn't "count," why not administer a pre-test, try setting goals, collect post-data, and see how students performed?  That way you'll have at least one year's data on that assessment to inform setting goals for SY 15-16.





Monday, June 30, 2014

SLOs in SC - Part III: When

The guidelines themselves do not specify a number of years, but Appendix A, referring to changes, refers to a three-year rolling average (p. 3). (It refers to a three-year rolling average value-added measure (VAM) and to sanctioning licenses, but the guidelines themselves do not limit the multiple-year review to VAM.) Assuming assessment results for SLOs will be available before the end of the school year, three years of data would first exist in SY 17-18 (year 1: SY 15-16; year 2: SY 16-17; year 3: SY 17-18). 


When will SC educators be required to implement SLOs?
When will the SLO student growth measure impact ratings?
When will SLOs need to be submitted, approved, and reviewed?

When will SC educators be required to implement SLOs?  Probably School Year (SY) 15-16. Assuming that the US Department of Education grants the SCDE an extension, SLOs will be implemented as part of the revised SC educator evaluation system in SY 15-16. (If an extension is not granted, then the ESEA/NCLB flexibility waiver requires implementation in SY 14-15 - this August.) 

When will the SLO student growth measure impact ratings? Probably SY 17-18. The amended evaluation guidelines adopted on June 11 state in the introduction to "Individual Student Growth": 
A teacher's impact on student growth will be determined by looking at student growth data over multiple academic years. 
(Page 20.)


When will SLOs need to be submitted, approved, and reviewed? Districts will need to set timelines for SLO data collection, pre-assessment, writing, approval, and review. The Guidelines (p. 22) refer to three required phases: goal-setting conference, mid-year check-in, and end-of-year conference. 

Next - SLOs in SC - Part IV: Growth Targets

Friday, June 27, 2014

SLOs in SC - Part II: What

What are SLOs?
What needs to be in SLOs in SC?
What assessments will be used?

What are SLOs? A student learning objective (SLO) is a measurable, long-term academic goal, informed by available data, that a teacher or a teacher team sets at the beginning of the year for all students. (Additional SLOs may be set for subgroups of students.) 

Most educators are familiar with S.M.A.R.T. goals, and SLOs are similar. Although different formulations exist for the letters, generally a SMART goal is: 

  • Specific, strategic
  • Measurable
  • Action-oriented, and Attainable
  • Rigorous, Reasonable, Results-Focused
  • Timed and Tracked. 
What needs to be in SLOs in SC? The South Carolina Department of Education (SCDE) has drafted an "Anatomy of an SLO" that defines the different sections, which are:
  • Objective
  • Student Population
  • Standards/Content
  • Interval of Instruction
  • Assessment (Pre and Post)
  • Progress Monitoring
  • Baseline and Trend Data
  • Instructional Strategies
  • Growth Targets
  • Rationale
SCDE has also developed a draft template for use in creating an SLO.
 SLO Template

What assessments will be used? There are currently no guidelines on what assessments are appropriate for SC SLOs. Teachers, schools and districts need to inventory available assessments that might measure student growth, identify gaps, and create or purchase assessments to fill those gaps. The Reform Support Network of Race to the Top grantee states has prepared a summary of how states are approaching assessments (pp. 14-15). Other states have developed assessment lists, guidelines, and processes for SLO assessment approval. 


    Next up: Part III: When

    Thursday, June 26, 2014

    Student Learning Objectives (SLOs) in SC - Part I: Who

    Which SC educators must do student learning objectives starting in SY 15-16? 

    In South Carolina, principals and some teachers will be required to have value-added measures (VAM) as the "student growth" component of their evaluations. (See the box in the June 24 post.) 

    All other "educators" will be required to use Student Learning Objectives (SLOs) to demonstrate "student growth." "Educator" is defined as "any individual who works in one or more South Carolina public schools in a position that requires licensure [aka a certificate] by the South Carolina State Board of Education." The guidelines specify that "all other educators" includes: 


    • Other classroom-based teachers (in the non-statewide tested grades and subjects)
    • Speech-language therapists
    • School guidance counselors
    • Library media specialists
    • "etc." (Does that include assistant principals? District office staff?)
    The ESEA waiver requires that the revised educator evaluation system be in place by the start of SY 14-15, but the S.C. Department of Education has requested an extension on implementing SLOs until SY 15-16. 

    Even if the extension is granted, teachers, schools and districts should begin planning now for implementation of SLOs next school year. Tips on planning are available courtesy of the Ad Hoc Educator Evaluation Group (educators, PSTA, SCASA, SCEA, SCSBA) funded by SCASA and Palmetto State Teachers Association. The SCEA has obtained a grant to develop and provide SLO training in SC.

    Up next: SLOs in SC - Part II: What

    Wednesday, June 25, 2014

    Growth, Not Just "Proficiency" (updated)

    Teachers who focus most on helping students reach "proficiency" need to shift to a "growth" mindset to score well under the new educator evaluation system. Thirty percent of the annual rating is based on "student growth."

    Since No Child Left Behind (NCLB) was adopted, educators have been focused on helping all students become "proficient" in reading, math, and science. ("Proficient" was not defined in NCLB, so each state had its own standards, assessments, and cut scores for the "proficiency" level.)

    The goal under NCLB was that 100% of students would reach proficiency by 2014. Uh, yeah - that's this year, and, duh, no - we didn't make it. That's one of the reasons why 45 states asked for waivers from the US Department of Education.

    One of the shifts in policy focus with the waivers is towards ensuring that all students achieve at least one year's growth each academic year. To receive a waiver, states agreed to add "student growth" to their educator evaluation systems (and at some point will need to add it to their accountability systems).

    So now instead of one test that determines whether students reached the "proficiency" bar, educators must have at least two tests, a pre- and a post-assessment, that measure whether a student grew during the year.

    There are a lot of assumptions that are not adequately addressed at this point, including:
    • The assumption that we have assessments that measure growth (under NCLB assessments were required to only measure whether a student mastered the grade-level content, so few if any were adaptive and measured growth).
    • The assumption that we have assessments for which we define "one year's growth" no matter what the student's starting point is, and no matter how far behind or ahead the student is. 
    • The assumption that "one year's growth" is the same across grades and subjects (is one year of 4th grade math equal to one year of 4th grade English or 5th grade math?) 
    • The assumption that SC even has selected an assessment (see The Unknown in an SCDE presentation)
    • The assumption that we know what standards SC will assess.
    UPDATE: EdWeek's Politics K-12 team announced today that USED is expected to release criteria for new assessments this summer. 
    • "What the department ultimately requires states to prove when their tests are examined, then, is of intense interest to the assessment field right now. Will they feel the criteria are fair, and take into account the complex landscapes on which they operate, with teacher evaluation and state accountability casting long shadows? Which tests will meet the mark and which won't? And will states have to scramble to revise their assessments?"
    The waiver application and guidelines define "student growth":
"Student growth" is the change in student achievement for an individual student between two or more points in time.  For the purpose of this definition, student achievement means—

      • For grades and subjects in which assessments are required under ESEA section 1111(b)(3):  (1) a student’s score on such assessments and may include (2) other measures of student learning, such as those described in the second bullet, provided they are rigorous and comparable across schools within an LEA.
      • For grades and subjects in which assessments are not required under ESEA section 1111(b)(3):  alternative measures of student learning and performance such as student results on pre-tests, end-of-course tests, and objective performance-based assessments; student learning objectives; student performance on English language proficiency assessments; and other measures of student achievement that are rigorous and comparable across schools within an LEA. 

    Tuesday, June 24, 2014

    Nominations for the SC Educator Evaluation Advisory Team

    SCDE is looking for nominations of educators to serve on the SC Educator Evaluation Advisory Team. Those who serve must be available July 22-23.

    Fill out the nomination form by July 3.

    New SC Educator Evaluation Guidelines


    • 30% student growth for teachers, 20% District Choice, 50% professional performance
    • 50% student growth for principals, 50% professional performance (PADEPP)
    • Value-added measures for all principals and teachers of statewide-tested subjects


    • Student learning objectives for all other educators. 
    • 5 rating levels: Exemplary, Highly Effective, Proficient, Needs Improvement, Ineffective
    • Annual evaluation
    • Student growth data will be reviewed "over multiple academic years." 
    • In districts that elect not to use the "District Choice" option, teacher student growth will be 50%.
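The weighting rules above are simple enough to express as arithmetic. A hypothetical worked example follows; the guidelines supply only the weights (30/20/50 with District Choice, 50/50 without), and the 0-100 component scores here are invented for illustration:

```python
def teacher_composite(growth, district_choice, professional,
                      use_district_choice=True):
    """Weighted teacher composite under the June 11 guideline weights:
    30% growth / 20% District Choice / 50% professional performance,
    or 50% growth / 50% professional where District Choice is not used."""
    if use_district_choice:
        return 0.30 * growth + 0.20 * district_choice + 0.50 * professional
    return 0.50 * growth + 0.50 * professional

# Hypothetical component scores on a 0-100 scale
print(teacher_composite(80, 90, 85))  # 0.30*80 + 0.20*90 + 0.50*85 = 84.5
print(teacher_composite(80, 0, 85, use_district_choice=False))  # 82.5
```

One design consequence worth noticing: dropping District Choice shifts its entire 20 points onto student growth, so the same growth score counts substantially more in those districts.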

    At its meeting on June 11, 2014, the SC State Board of Education adopted guidelines presented by the S.C. Department of Education for evaluation of "any individual who works in one or more South Carolina public schools in a position that requires licensure by the South Carolina State Board of Education." That includes not only principals and classroom-based teachers, but also "speech-language therapists, school guidance counselors, library media specialists, etc."

    The guidelines were adopted as one of the requirements of the ESEA flexibility waiver from the US Department of Education.