Learning made to measure: Embedded assessment and educator growth
By Giselle O. Martin-Kniep with Rebecca Schubert

Thirty years ago, I served as a program evaluator for the California International Studies Project, a consortium led by Stanford University that included world affairs organizations, colleges, universities, and county education offices. One of my responsibilities was to evaluate the impact of the Project’s programs, mostly on teachers but sometimes on students. Many of the programs aimed to improve cross-cultural awareness, perspective taking, and conflict resolution.

I found I was limited by tools that relied on perceptual and attitudinal data rather than on evidence of whether adults or students could understand that others think differently, recognize the value of an alternative perspective, or assume a perspective other than their own.

After experimenting with alternative measures, I learned that the best way to assess such outcomes was to provide learners with experiences that elicited them, including simulations, role-plays, and other performance tasks. I discovered the value of authenticity and the constructs of assessment for learning and assessment as learning. I recognized then, as I do now, that assessment is the most powerful lever for learning and that it can capture dispositional and other hard-to-measure outcomes. This is true for adult learners as well as students. This article describes how one set of educators in an LCI (Learner-Centered Initiatives) program used the same approach to enhance their own learning and practice.

***

A significant portion of LCI’s work is directed toward helping educators attend to and assess students’ ability to communicate, collaborate, think deeply, and apply and reflect on what they know and can do. But, inspired by our understanding of assessment for and as learning, we also design our professional learning experiences with embedded assessment opportunities that enable educator-participants to assess their own growth toward the intended outcomes as they learn.

This approach can be illustrated through one of LCI’s professional learning experiences, designed for about 50 teachers and administrators from 10 school districts in New York, New Jersey, and Connecticut. Over four full-day sessions between December 2015 and February 2016, these educators sought to learn about and assess critical thinking, metacognition, and problem solving. They worked collaboratively in small teams, first to uncover their understandings of these outcomes, then to determine what specifically to assess and with what metrics, and finally to engage in peer reviews as they completed successive drafts of their work. This professional learning experience illustrates how program- or curriculum-embedded assessment can help facilitators and learners document their learning.

Pre- and post-assessments provide value for adult learners, too
Being able to assess our impact, or the impact of what we experience as learners, begins with a clear sense of what we know and don’t know. All participants worked in districts that had made an explicit commitment to critical thinking, problem solving, and metacognition, as evidenced in their mission or vision statements, district goals, and their participation in a Consortium dedicated to promoting these outcomes. It would therefore have been easy to assume a high level of readiness and understanding of these outcomes.

We launched the design work by reviewing and discussing different definitions and conceptualizations of each of the outcomes, sharing individuals’ assumptions about these conceptualizations, and exploring how these outcomes manifest themselves in teachers’ and students’ discourse, behavior, and work, using videos and assessment examples.

To track changes in participants’ understandings of the outcomes as they engaged in these learning experiences, we asked them to complete a concept map of each outcome before and after the first set of activities. The concept map below illustrates some of the changes one team experienced. Words in blue were added before the activities, and words in black were added after the activities.
http://www.lciltd.org/images/GMKConceptMapPhoto.jpg

As the map demonstrates, these individuals came to the program recognizing that thinking entails multiple components, including skills (such as comparing), knowledge, and processes (such as questioning and revising), and that it requires instruction. The revised map shows more nuanced changes, such as an emerging awareness of perspectives and of additional skills and processes.

As they examined the revised maps, participants realized that there was more to thinking than they had understood. In fact, the more they learned about the outcomes, the more they recognized the limits of their knowledge and what the outcomes entail. As one participant reflected:

“I have a better understanding of the different dimensions of problem solving. I clearly see how it can be broken down into subcategories. In the past, I did not view it this way. This clarifies our next steps. … We have a lot of work ahead.”

As they explored these outcomes further in videos and other examples, participants were humbled by the limitations of their instructional repertoire and discovered that helping students acquire and use these outcomes required more, and perhaps different, strategies than those they knew:

“It turns out we don’t do a great job asking students to think about their thinking and we don’t help them know what thinking entails.”

Having access to pre- and post-assessment experiences, such as the concept maps, helped participants assess their own growth and motivated them to learn more about the outcomes. Their motivation increased even more once participants began to design specific metrics that assigned levels of development or quality.

A design process provides opportunities for authentic, program-embedded assessments
Having the opportunity to design school and classroom assessment tools for their own use gave participants an authentic purpose for their learning and deepened their understanding of these outcomes even further. As they drafted tools, participants discovered the importance of clear and precise language for communicating what to expect from students, and how this differs from relying on evaluative and relative terms:

“We realized the importance of describing behaviors rather than relying on evaluative words. We also realized how common it is to use quantitative words in a rubric. Now, we try to focus on using what is visible.”

As the design process progressed, participants were eager to bring their work to their students and teachers. Some of the tools, like the rubric below, unpacked the behaviors associated with problem-solving indicators and the prompts that could elicit such indicators.
http://www.lciltd.org/images/RubricAssessmentProfessionalLearning.JPG
Using these tools deepened participants’ attention to and understanding of these outcomes, whether they used the tools with students, teachers, or across the system:

“Using this tool with my 3rd graders helped me stay super-focused on what it is I’m looking for as evidence of them engaging in problem solving.”

The tool can be used at all grade levels because it can be flexibly adapted to differences in the complexity of subject-area content as well as in the time frame for moving from stage 1 to stage 5 in the progression. Younger students may progress over the course of a unit or school year, whereas older students may move through the stages more quickly.

Different tools provided different types of insights for their creators. The chart below contains excerpts from a critical thinking rubric that includes sample student answers to anchor each level. Developing these examples helped participants gauge how well they had conceptualized, and could describe, the desired outcomes at each level.
http://www.lciltd.org/images/RubricCriticalThinkingAssessmentProfessionalLearning.JPG

As teachers and administrators used the tools, they uncovered more nuanced behaviors associated with the outcomes and realized how the tools needed to be refined. As one participant said, “When I piloted the tool, I realized students who are amazingly metacognitive might not score very high on the tool. That means the rubric needs work, not them… I need to revise my dimensions to account for different kinds of thinkers.”

They also realized, especially after analyzing student work against the tools and metrics, that the tools sometimes communicated higher expectations than the learning opportunities they provided could support, or that student work did not meet the expected standards.

Administrators and teachers came to understand that fostering these outcomes is not only about teaching students how to think or solve problems; it also requires a culture that promotes thinking and problem solving in students. This, in turn, demands contexts in which teachers see themselves as thinkers and problem solvers. As one leader observed:

“Learning about the stages of the problem-solving process and the nuances of each stage underscored for me that problem solving can be taught. We often think of some students as being ‘natural’ problem solvers and the leaders in group tasks. I learned that, as leaders, we have to provide opportunities for teachers to be involved in authentic problem-solving tasks so they can identify what it looks and sounds like and, in turn, support students in the process.”


Our program-embedded assessment experiences enabled me to learn about and from the participants in the program, while they learned about their own learning and impact on others. I realized that assessing difficult-to-measure outcomes requires a rich and elaborate language that attends to the nuances and developmental range of these outcomes, an instructional repertoire that honors their development, and the experiences and opportunities for educators to cultivate and practice these outcomes in themselves and in their practice.

***
This article was adapted with permission from a version that appeared in the April 2017 issue of The Learning Professional, published by Learning Forward (www.learningforward.org). All rights reserved.