Student Learning Objectives
From the AEC's blog:

From June 11-13, we were in Harrisburg working with groups of music, theatre, and visual arts teachers (along with teachers in other non-tested content areas) on Student Learning Objectives. The PA Department of Education is interested in crafting these Student Learning Objectives as models for districts to use when evaluating teachers in non-state-tested content areas.
This session was planned because there is legislation moving through the Pennsylvania legislature right now that would require school districts to evaluate teachers based on the results of student assessment. In content areas currently tested on the PSSA, those results are easily accessible. For the rest of us, there must be valid, reliable assessment results in order to make decisions about teacher effectiveness. The legislation (HB 1980) proposes the following structures for determining teacher effectiveness:
Subject areas tested on PSSAs (approx. 20% of teachers):
- 50% administrator evaluation
- 15% school-level data
- 15% PSSA scores
- 20% teacher-level data

Subject areas not tested on PSSAs (approx. 80% of teachers):
- 50% administrator evaluation
- 15% school-level data
- 35% teacher-level data
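To make the weighting concrete, here is a minimal sketch of how the proposed percentages for non-tested subject areas would combine into a single effectiveness number. This is only an illustration of the arithmetic; HB 1980 does not prescribe any code, and the component scores below are invented.

```python
# Weights from HB 1980's proposed structure for non-PSSA-tested subjects.
WEIGHTS = {
    "administrator_evaluation": 0.50,
    "school_level_data": 0.15,
    "teacher_level_data": 0.35,
}

def composite_score(scores):
    """Combine component scores (each on a 0-100 scale) using the weights."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical component scores for one teacher:
example = {
    "administrator_evaluation": 80,
    "school_level_data": 70,
    "teacher_level_data": 90,
}
print(round(composite_score(example), 2))  # 82.0
```

Note that for PSSA-tested subjects the teacher-level weight drops to 20% and a 15% PSSA-score component is added, so the same idea applies with four weights instead of three.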
The process was facilitated by staff from the Center for Assessment. In the room were K-12 teachers, higher ed folks, and PDE staff in the areas of family/consumer sciences, early childhood (K-2), science, social studies, environment and ecology, technology, health/physical education, and the arts (music, theatre, and visual arts – no members of the dance education field were available to attend this session).
The session started with some definitions:
- Non-tested subject areas and grades – courses, subject areas, and grade levels without at least two consecutive years of state-level standardized tests
- Student Learning Objectives (SLOs) – establish goals for students and then evaluate the extent to which the goals have been achieved. The SLOs were described throughout the session as a “three-room house”, with the three rooms being:
- Objective – learning goal
- Target – the level of performance that students achieve in relation to the learning goal
- Assessment – how that level of performance is measured
The teams then started by crafting draft objectives (room 1 in the SLO house). When the teams shared out, it was clear that some were very specific and others very general. What the teams struggled with throughout the three days was this idea of specificity: how specific should these objectives be, keeping in mind that these are models for all 500 school districts in Pennsylvania? They must meet the needs of educators in Cornell SD (one building, K-12) as well as Philadelphia SD (the largest district in the state). One lens that we used to focus the work was the idea that these are written both for student learning and for teacher evaluation. Through that lens, the SLOs have to be clear and specific enough that an administrator who may not have a background in that content area can understand them.
When we got to the “2nd room” of the SLO – assessment, we started the work with this list of steps:
- Identify what part(s) of the SLO should be assessed.
- Select an authentic task. The task should be directly related to the SLO.
- Identify criteria. Focus on the essential elements of the task that are worthy of being assessed.
- Create a rubric or other scoring tool for the criteria.
One thing that we struggled with in relation to assessment was how specific the assessment task should be. Again, knowing that this is to be a model for districts across Pennsylvania, it was difficult to project how often teachers see students, for how long, and which materials they have access to. In the end, we took our best guess at these items based on the circumstances of the teachers in the room, but knowing that we could not possibly plan for every situation.
Our final step was to set the targets. Now that we had the objectives and all of the pieces of the assessment, we could more accurately predict what percentage of students should be proficient in order for the teacher to be successful. A particularly interesting conversation occurred at the music table: if teachers see students, for example, for 45 minutes each week 36 times during the year, do they really end up seeing students 36 times? More likely, they see all students from 30-32 times due to snow days, assemblies, etc., and some students even less. What does that mean for setting targets?
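The music table's concern can be put in back-of-the-envelope terms. The numbers below are assumed for illustration (the post's draft documents do not specify them), but they show how quickly cancelled sessions eat into the instructional time a target implicitly relies on:

```python
# Assumed schedule: one 45-minute class per week across a school year.
scheduled_sessions = 36
minutes_per_session = 45
cancelled = 5             # snow days, assemblies, field trips, etc. (assumed)

actual_sessions = scheduled_sessions - cancelled
actual_minutes = actual_sessions * minutes_per_session
percent_lost = round(cancelled / scheduled_sessions * 100)

print(actual_sessions)  # 31 sessions actually taught
print(actual_minutes)   # 1395 minutes, about 23 hours of instruction
print(percent_lost)     # 14 -- roughly 14% of planned contact time lost
```

And that 14% loss is an average: individual students who are absent or pulled from class see even less, which is the heart of the target-setting question.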
Finally, so you can see what all this means, here are links to two of the documents that the music team developed during this three-day session. They are presented here not because they are perfect and final…in fact, they are drafts and will more than likely undergo significant revisions before they’re finalized. But we feel like it’s easier to see how all these parts fit together with an example. We welcome feedback on these drafts.
As a final note, the music team spent considerable time debating the purpose of these SLOs and the structure that should guide them. This example is for a course at the high school level that is NOT a performing ensemble (although the argument could be made that it could be modified for an ensemble setting). The team also worked on SLOs for elementary and middle level general music, as well as for instrumental and choral ensembles at all levels. The visual arts team worked on one SLO for middle level art, and the theatre team worked on one SLO for HS theatre, focusing on acting.
Draft Student Learning Objective - This is the document that lays out the three rooms in the SLO “house”: objective, target, and assessment.
Draft Assessment Plan - This document fleshes out the target and assessment pieces in more detail and shows how they align to the objective.