
Strategies for Assessing Social Emotional Competence in Out of School Time Settings

By: Sophie Shang
Center for Prevention Research in Social Welfare
University of California, Berkeley
[email protected]

 

Playworks is a national, play-based Out-of-School-Time (OST) program focused on enhancing recess in low-income schools. Full-time, trained staff are placed in schools to structure recess, integrate physical activity throughout the day, coordinate sports leagues, and provide lessons on health, fitness, and safety. Additionally, staff run an after-school Junior Coach Leadership Program to train a group of about 15 students per school to become peer recess leaders. Evaluations of Playworks programming have indicated many positive outcomes, including improvements in school climate (London, Westrich, Stokes-Guinan, & McLaughlin, 2015), reductions in bullying (Fortson, James-Burdumy, Bleeker, Beyler, London, Westrich, Stokes-Guinan, & Castrechini, 2013), and improvements in learning/academic performance (Fortson et al., 2013). Yet, despite reports from teachers, parents, and principals that Playworks promotes student Social Emotional Competence (SEC), there have not yet been studies to demonstrate this. This is unfortunately a common occurrence, because it can be difficult to measure SEC efficiently and effectively in OST settings.

This article briefly describes one effort by Dr. Valerie Shapiro and Dr. Sarah Accomazzo (Center for Prevention Research in Social Welfare at UC Berkeley), Jennette Claassen (Playworks), and Jennifer Fleming Robitaille (Devereux Center for Resilient Children) to pilot measures of SEC within Playworks OST programming. During the 2014-2015 academic year, they worked together, and in partnership with Apperson SEL and Rush Neurobehavioral Center, to assess SEC among 4th and 5th grade students participating in the Junior Coach Leadership Program (JCLP) at 31 Playworks schools in Northern California.

Two primary lessons were taken from the Handbook of Social and Emotional Learning (Durlak, Domitrovich, Weissberg, & Gullotta, 2015) for selecting assessments to pilot. First, because different assessment modalities offer distinct advantages, “multi-method, multi-rater assessment is preferred over mono-method, mono-rater assessment” (McKown, 2015, p. 332). Second, criteria for selecting tools should include the adequacy of user documentation, strength of psychometric properties, relevance to the populations of students to be assessed, practicality of administration, inclusivity of multiple perspectives and dimensions, and potential for interpretable and meaningful information to guide decisions (Denham, 2015). Since these recommendations were made with school-based programs in mind, the following attributes were also considered for utility within OST environments:

Does the tool…

  • assess strengths rather than pathology in the spirit of a youth development framework?
  • have evidence of appropriateness for OST staff to administer, inform, or interpret?
  • have practical requirements for familiarity with the child, given the constraints of OST programming?
  • use consistent forms and norms for heterogeneous age and ability groups?
  • lend to efficient assessment in play environments with time and budget constraints?
  • yield individual scores which can be used to tailor programming for children’s unique profiles?

This feasibility study of social emotional assessment in OST environments piloted two instruments. The Devereux Student Strengths Assessment (DESSA, LeBuffe, Shapiro, & Naglieri, 2009/2014) is a strength-based behavior rating scale (an indirect assessment) that requires only four weeks of adult familiarity with the child for a minimum of six hours a week, uses a consistent form from K-8th grade, has a brief version which takes only 1 minute per child to complete, and includes norms for teachers, parents, and program staff. The Virtual Environment for Social Information Processing Tool (VESIP, Russo-Ponsaran, McKown, Johnson, Allen, & Knudsen, 2012) is a direct assessment conducted through a computerized simulation in which children adopt the role of an avatar, interact with other avatars, respond to challenging social situations, and engage in social decision-making. The VESIP can be group-administered as an activity within OST programming.

The pilot revealed several “lessons learned” that may be useful to others conducting a social emotional assessment in OST settings. For indirect assessments, these include:

  • When program administrators asked their staff to complete the assessments, compliance was better when the effort was called a “program evaluation” rather than a “pilot.”
  • Paper copies of behavior rating scales were preferred by some OST program administrators, given that they could be completed by staff, teachers, and caregivers anywhere, without the need for specific technology or connectivity.
  • Electronic administration of behavior rating scales was preferred by some OST program administrators when the primary objective was formative assessment, as results could be generated and used in real-time.
  • Since teacher participation rate ranged from 38% to 72%, using an assessment that could be fully completed by program staff was essential.

For direct assessments, these include:

  • If possible, allow at least an hour for set-up to transform an OST space into a group administration test site (e.g. locate needed equipment and outlets, rearrange furniture, set up mobile wireless devices, laptops, mice, and headsets).
  • For group administration, staff may assign seating to mitigate student attempts to compare their responses or time-to-completion with other students. Merely having students “spread out” as much as possible may be difficult to manage in some OST environments.
  • OST staff were tremendous resources in implementing direct assessments, helping with room set-up, troubleshooting challenges, engaging students during unexpected delays, and modeling enthusiasm and attentiveness towards the assessment.
  • Since attendance is not required in OST programs, it was difficult to predict how many students would be present or would leave early.
  • Students reacted best to the direct assessment when it was called an “activity” (following administration guidelines), which avoided pre-existing student associations with both “tests” and “computer games.”
  • It may be useful to provide a quiet, in-seat activity option for students to complete after a group administration to account for different completion times (e.g. once several students were finished, the rest seemed to “hurry up” in order to join their peers in the next activity).

A full report describing the experience of conducting this feasibility study, entitled “The choices, challenges, and lessons learned from a multi-method social-emotional / character assessment in an out of school time setting,” is forthcoming in the Journal of Youth Development: Bridging Research and Practice.

References

Beyler, N., Bleeker, M., James-Burdumy, S., Fortson, J., London, R. A., Westrich, L., … & Castrechini, S. (2013). Findings from an experimental evaluation of Playworks: Effects on play, physical activity and recess (No. 7781). Mathematica Policy Research.

Fortson, J., James-Burdumy, S., Bleeker, M., Beyler, N., London, R. A., Westrich, L., … & Castrechini, S. (2013). Impact and implementation findings from an experimental evaluation of Playworks: Effects on school climate, academic learning, student social skills and behavior. Mathematica Policy Research.

Denham, S.A. (2015). Assessment of SEL in Educational Contexts. In J.A. Durlak, C.E. Domitrovich, R.P. Weissberg, & T.P. Gullotta (Eds.), Handbook of Social and Emotional Learning: Research and Practice. New York: Guilford.

Durlak, J. A., Domitrovich, C. E., Weissberg, R. P., & Gullotta, T. P. (Eds.). (2015). Handbook of Social and Emotional Learning: Research and Practice. New York: Guilford.

LeBuffe, P.A., Shapiro, V.B., & Naglieri, J.A. (2009/2014). The Devereux Student Strengths Assessment (DESSA) Assessment, Technical Manual, and User’s Guide. Charlotte, NC: Apperson, Inc.

London, R. A., Westrich, L., Stokes‐Guinan, K., & McLaughlin, M. (2015). Playing fair: The contribution of high‐functioning recess to overall school climate in low‐income elementary schools. Journal of School Health, 85(1), 53-60.

McKown, C. (2015). Challenges and opportunities in the direct assessment of children’s social and emotional comprehension. In J.A. Durlak, C.E. Domitrovich, R.P. Weissberg, & T.P. Gullotta (Eds.), Handbook of Social and Emotional Learning: Research and Practice. New York: Guilford.

Russo-Ponsaran, N.M., McKown, C., Johnson, J.K., Allen, A., & Knudsen, K. (2012). Usability & likability of the Virtual Environment for Social Information Processing (VESIP) for children with and without autism spectrum disorders. Presented at the International Society for Autism Research, Toronto, Ontario.
