The audience for today's episode is social work faculty, specifically practice instructors who are interested in learning more about how to objectively evaluate their students' skills. Today's episode reviews the origins of the OSCE adapted for social work, how it is implemented in different types of social work programs, some findings from the research that has been conducted on the OSCE, and some recommendations for faculty who are interested in learning more about this approach.
Disclosure: I served as a CSWE Council on Publications liaison with Marion Bogo for the text Using Simulation in Assessment and Teaching: OSCE Adapted for Social Work (Objective Structured Clinical Examination).
Download MP3 [52:22]
Bios
Marion Bogo, MSW, Adv. Dip. SW is Professor at the Factor-Inwentash Faculty of Social Work, University of Toronto. She is the former Field Director, Dean, and first appointee to the first Endowed Chair in Social Work. Her research focuses on social work field education and the conceptualization and assessment of professional competence. She has published over 100 journal articles and book chapters, and 5 books. Professor Bogo is a member of numerous journal editorial boards and was Associate Editor North America for Social Work Education: The International Journal. She has consulted to schools of social work in North America, Asia, and Europe. In 2013 she was awarded the Significant Lifetime Achievement in Social Work Education Award from the Council on Social Work Education, USA in recognition of her contributions to social work education and to improving assessment of professional competence. In 2014 she was appointed as an Officer of the Order of Canada for her achievements in the field of social work as a scholar and teacher, and for advancing the practice in Canada and abroad.

Mary Rawlings, PhD, LCSW is a professor at Azusa Pacific University and Chair of the Department of Social Work, and currently serves as director of the MSW program. Her teaching and research interests are in competency-based education and in the development and testing of the use of simulation in training and assessment of student practice skill. She is interested in experiential learning models, such as service-learning, that can enhance student educational outcomes. She currently conducts research on the development of OSCE exams for social work. She is a licensed clinical social worker with more than 10 years of practice experience. Her practice interests are in women's issues, and chronic and persistent mental illness.
Transcript
Introduction

Hey there podcast listeners. You're pretty good at your job, right? I mean, if your job is being a student, I suspect you're getting mostly As and Bs. If you are a practitioner, you know, out in the field, then the fact that you're listening to this episode purely for your own edification suggests that you are a dedicated social worker. That, or you'll listen to just about anything now that the first season of the Serial Podcast is over. I know, I'm right there with you. But I digress. When it comes to your performance, unless I'm your field instructor or your professor, I don't actually know how well you're doing. If I were in the room with you, like actually live in the room with you, not just in your ear, I could ask you, how good are you? If you've had to fill out your own field evaluation or annual employee self-evaluation you've thought about how you would rate yourself. The problem with rating ourselves is that, well, we don't do a very good job of it. We think we're doing better at everyday tasks than we actually are, and we think we're doing worse on difficult or unusual tasks than we actually are. A classic 1981 study by Ola Svenson (Svenson, 1981) found that 93% of American drivers rated themselves as above average. I know, you're thinking that Svenson must have studied drivers in Garrison Keillor's fictional Minnesota town of Lake Wobegon where "all the women are strong, all the men are good looking and all the children are above average."
So, if your self-evaluation isn't particularly valid (not to be confused with reliable - it might be reliably over- or underestimated, but not a valid assessment), who could provide an objective evaluation of your skills? In social work education, the assumption is that your professors and field instructor, or "practice teachers," as they are called in England, are the best people to ask.
But what if they are not?
Professors have been criticized for contributing to grade inflation - the so-called "Lake Wobegon effect," where all student grades are above average. Consider this: in 1940, the most common grade received by university students was a "C." Thirty-five percent of students received "Cs," compared with 15% who received "As." In 2008, the most common grade was an "A," with 42% of students receiving an "A" and only 15% earning a "C" (http://www.gradeinflation.com/). Now, this isn't an indictment of social work faculty. This study was based on undergraduate grades across all programs. And I've heard social work faculty argue that it is unethical to pass students who can't do work at an "A" level. So, it might be that social work faculty are working really hard to make sure those C students become A students. But maybe not.
What about field instructors? The news isn’t so good here either. According to a study by Marion Bogo and colleagues, “field instructors had extremely high agreement when evaluating performance of students they did not know” (Bogo, 2014, p. xiv). But, these same field instructors had great difficulty evaluating their own students, in part because of the relationships they had with students. For example, if I’m your field instructor and I really like you or I really dislike you it is going to be harder for me to objectively evaluate your skills. The tendency to “go easier” on likeable students is what researchers call a “leniency bias.” And for a profession that considers field education a “signature pedagogy,” valid and reliable evaluations of student performance in field is paramount.
And that brings me to today's topic: objective structured clinical examinations, also known as OSCEs. For over a hundred years, educators have struggled with how best to provide social workers with the knowledge AND skills needed to provide the best possible services to clients. The division in most schools is that knowledge is provided in the classroom and specific practice skills are gained in the field. But starting in 2008, the Council on Social Work Education (CSWE), the national body that accredits schools of social work in the USA, required programs to evaluate a student's competence in 10 areas. Competency-based outcomes emphasize what a student can do, not just what they know. If evaluations by students, professors, and field instructors fall short in the ways I've just mentioned, what can we do?
Today's episode looks at one approach that is intended to provide a more objective and structured evaluation of social work skills - the OSCE, which stands for the Objective Structured Clinical Examination. My guests, Marion Bogo and Mary Rawlings, and their co-authors Ellen Katz and Carmen Logie, are pioneers in the development, implementation, and evaluation of the OSCE adapted for social work. Marion Bogo is professor at the Factor-Inwentash Faculty of Social Work, University of Toronto. She has published over 100 journal articles and book chapters, and 5 books. In 2013 she was awarded the Significant Lifetime Achievement in Social Work Education Award from the Council on Social Work Education, USA in recognition of her contributions to social work education and to improving assessment of professional competence. In 2014 she was appointed as an Officer of the Order of Canada for her achievements in the field of social work as a scholar and teacher, and for advancing the practice in Canada and abroad. Mary Rawlings is a professor at Azusa Pacific University and Chair of the Department of Social Work, and currently serves as director of the MSW program. Her current research focuses on the development of OSCE exams for social work. She is a licensed clinical social worker with more than 10 years of practice experience. Their 2014 CSWE Press text, Using Simulation in Assessment and Teaching: OSCE Adapted for Social Work (Objective Structured Clinical Examination), reviews how to conceptualize competence and its implications for assessment; how to design, prepare, and implement an OSCE; how to develop practice competence through teaching with simulation; and how to engage university partners in delivering the OSCE adapted for social work. For instructors who are interested in learning how to implement the OSCE adapted for social work, the text includes over 50 pages of appendices with practical resources.
Additionally, Marion and her colleagues have been providing Faculty Development Institutes at CSWE’s annual program meeting for the past several years, and will conduct annual roundtable discussions at APM.
I spoke with Marion and Mary in October 2014 at CSWE's 60th annual program meeting in Tampa, Florida. Unlike most guests, whom I schedule months in advance, Marion and Mary responded to a last-minute invite. We came up with the questions on the spot. And then something unusual happened. They just sort of answered the questions without me. Which, in retrospect, wasn't surprising when I realized they had spent much of the previous four days talking at length about the OSCE. So, you'll hear a lot of them and not a lot of me.
And now, without further ado, on to episode 94 of the Social Work Podcast: Objective Structured Clinical Examination (OSCE) Adapted for Social Work: Interview with Marion Bogo and Mary Rawlings.
Interview
(forthcoming)
References
(provided by Marion Bogo and Mary Rawlings)
Assessing Competence Using OSCE Adapted for Social Work
- Bogo, M., Rawlings, M., Katz, E., & Logie, C. (2014). Using Simulation in Assessment and Teaching: OSCE Adapted for Social Work (Objective Structured Clinical Examination). Alexandria, VA: CSWE Press.
- Logie, C., Bogo, M., & Katz, E. (in press). "I didn't feel equipped": Social work students' reflections on a simulated client "coming out." Journal of Social Work Education.
- Katz, E., Tufford, L., Bogo, M., & Regehr, C. (2014). Illuminating students' pre-practicum conceptual and emotional states: Implications for field education. Journal of Teaching in Social Work, 34, 96-108.
- Logie, C., Bogo, M., Regehr, C., & Regehr, G. (2013). A critical appraisal of the use of standardized client simulations in social work education. Journal of Social Work Education, 49(1), 66-80. DOI: 10.1080/10437797.2013.755377
- Bogo, M., Katz, E., Regehr, C., Logie, C., Mylopoulos, M., & Tufford, L. (2013). Toward understanding meta-competence: An analysis of students' reflection on their simulated interviews. Social Work Education, 32(2), 259-273. DOI: 10.1080/02615479.2012.738662
- Bogo, M., Regehr, C., Katz, E., Logie, C., Tufford, L., & Litvack, A. (2012). Evaluating the use of an objective structured clinical examination (OSCE) adapted for social work. Research on Social Work Practice, 22(4), 428-436. DOI: 10.1177/1049731512437557
- Bogo, M., Regehr, C., Katz, E., Logie, C., Mylopoulos, M., & Regehr, G. (2011). Developing a tool to assess student reflections. Social Work Education, 30(2), 186-195.
- Bogo, M., Regehr, C., Logie, C., Katz, E., Mylopoulos, M., & Regehr, G. (2011). Adapting objective structured clinical examinations to assess social work students' performance and reflections. Journal of Social Work Education, 47, 5-18.
Simulation in Teaching
- Bogo, M., Shlonsky, A., Lee, B., & Serbinski, S. (2014). Acting like it matters: A scoping review of simulation in child welfare training. Journal of Public Child Welfare, 8(1), 70-93. DOI: 10.1080/15548732.2013.818610
Conceptualizing and Assessing Competence
- Bogo, M., Rawlings, M., Katz, E., & Logie, C. (2014). Using Simulation in Assessment and Teaching: OSCE Adapted for Social Work (Objective Structured Clinical Examination). Alexandria, VA: CSWE Press.
- Bogo, M. (2010). Achieving Competence in Social Work through Field Education. Toronto, ON: University of Toronto Press.
- Bogo, M., Mishna, F., & Regehr, C. (2011). Competency frameworks: Bridging education and practice. Canadian Social Work Review, 28(2), 275-279.
- Bogo, M., Regehr, C., Power, R., & Regehr, G. (2007). When values collide: Providing feedback and evaluating competence in social work. The Clinical Supervisor, 26(1/2), 99-117.
- Bogo, M., Regehr, C., Woodford, M., Hughes, J., Power, R., & Regehr, G. (2006). Beyond competencies: Field instructors' descriptions of student performance. Journal of Social Work Education, 42(3), 191-205.
- Bogo, M., Regehr, C., Power, R., Hughes, J., Woodford, M., & Regehr, G. (2004). Toward new approaches for evaluating student field performance: Tapping the implicit criteria used by experienced field instructors. Journal of Social Work Education, 40(3), 417-426.
- Bogo, M., Regehr, C., Hughes, J., Power, R., & Globerman, J. (2002). Evaluating a measure of student field performance in direct service: Testing reliability and validity of explicit criteria. Journal of Social Work Education, 38(3), 385-401.
- Regehr, G., Bogo, M., Regehr, C., & Power, R. (2007). Can we build a better mousetrap? Improving measures of social work practice performance in the field. Journal of Social Work Education, 43(2), 327-343.
- Regehr, C., Bogo, M., Donovan, K., Anstice, S. & Kim, A. (2012). Identifying student competencies in macro practice: Articulating the practice wisdom of field instructors. Journal of Social Work Education, 48, 307-319.
- Regehr, C., Bogo, M., Donovan, K., Lim, A., & Regehr, G. (2011). Evaluating a scale to measure student competencies in macro social work practice. Journal of Social Service Research, 38(1), 100-109. DOI: 10.1080/01488376.2011.616756
- Taylor, I., & Bogo, M. (2013). Perfect opportunity–perfect storm? Raising the standards of social work education in England. British Journal of Social Work. DOI: 10.1093/bjsw/bct077
Resources
- The International Meeting for Simulation in Healthcare: http://www.ssih.org/Events/IMSH-2015
- Building Professional Competence: http://research.socialwork.utoronto.ca/hubhomepage?hub=building_prof_competence
- Svenson, O. (1981). Are we all less risky and more skillful than our fellow drivers? Acta Psychologica, 47(2), 143–148. doi:10.1016/0001-6918(81)90005-6
APA (6th ed) citation for this podcast:
Singer, J. B. (Producer). (2015, January 5). #94 - Objective structured clinical examination (OSCE) adapted for social work: Interview with Marion Bogo and Mary Rawlings [Audio Podcast]. Social Work Podcast. Retrieved from http://www.socialworkpodcast.com/2015/01/OSCE.html
Great episode, thank you. I'm planning on following up with some of the faculty members with me at my University and see about options regarding implementing this for our program.
I do have role play activities that my students have been graded on, but I've never really felt sure of their reliability.