MATH VALUES


Effective Professional Development: What we can learn from PRODUCT

By Audrey Malagon

A few months back, Stan Yoshinobu told us about the PRODUCT (PROfessional Development and Uptake through Collaborative Teams) program’s efforts to get IBL into classrooms across the country through a robust professional development program for faculty. A key part of their project was evaluating the effectiveness of their approaches to professional development. In this month’s blog, we talk with Sandra Laursen about the research and evaluation side of the PRODUCT project.

1) Many faculty recall professional development that they found personally helpful, or in some cases, not helpful. How did your team measure the effectiveness of professional development for faculty in the PRODUCT project?

For our intensive workshops, we were most interested in the workshop attendees’ implementation of IBL practices. We compared their teaching practices pre-workshop and one academic year after the workshop, emphasizing initial changes rather than expertise. When instructors are trying more IBL-like approaches, they are shifting how they use class time, and we can measure that. It takes time to develop expertise, but we can tell if they are moving in the intended direction. We also gathered survey data about these practices and conducted some classroom observations.

The PRODUCT traveling workshops had different goals, so for these our measures of effectiveness focused on learning, awareness, and interest in learning more. After each workshop, we gathered data on participants’ self-reported growth in IBL-related knowledge and skills, motivation to use IBL, and beliefs in the effectiveness of IBL. These measures helped us improve the workshops over time and served as short-term outcome measures, letting us connect immediate learning with the longer-term implementation attributed to workshop participation.

2) What conclusions were you able to draw about effective professional development? What can others hoping to implement successful professional development programs learn from your experience?

One important conclusion is that this model of professional development worked! For the intensive workshops, the average change in teaching practices yielded a pre/post effect size of nearly a full standard deviation, a big change for any educational intervention.
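To make that number concrete, here is a minimal sketch of how a pre/post effect size might be computed from paired survey scores. The scores and variable names are hypothetical, for illustration only; PRODUCT’s actual instruments and analysis choices may differ (for example, in which standard deviation goes in the denominator).

    import statistics

    # Hypothetical per-instructor scores on an IBL-practice survey scale,
    # measured before the workshop and one academic year after.
    pre  = [2.1, 2.5, 1.8, 3.0, 2.2, 2.7, 1.9, 2.4]
    post = [2.6, 2.8, 2.3, 3.3, 2.6, 3.2, 2.2, 2.9]

    # Per-instructor change scores.
    diffs = [b - a for a, b in zip(pre, post)]

    # One common convention: mean change divided by the SD of the pre scores
    # (other conventions divide by the SD of the change scores instead).
    d = statistics.mean(diffs) / statistics.stdev(pre)
    print(f"pre/post effect size: d = {d:.2f}")  # about 1.0 here, i.e. ~1 SD

An effect size near 1 means the average instructor moved about one standard deviation along the measured scale, which is why a result of this size counts as large.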

We believe this sizable effect is due to the carefully designed workshops and the support we provided workshop instructors. You can read more about the design of the workshops here, but some key features were:

  • Thorough advance planning and thoughtful logistics.

  • The use of video as a backbone of the workshop, which helped participants build mental images of IBL and discuss variations suited to different environments.

  • Intentional community-building and activities designed to foster interaction over important ideas.

  • Useful content and support resources provided by the workshop leaders.

  • Modeling of IBL practices in the workshop itself.

We organized the workshops into four strands that included videos with examples of IBL teaching, help with course design and logistics, use of educational literature on student learning, and support for course content creation. We trained facilitators to work in different roles around these strands and with different teammates, which helped them deploy a versatile toolkit to address workshop needs in the moment. Read more in our practical handbook.

3) One of the conclusions in your impact report is the importance of support for faculty when first implementing a new pedagogical technique like IBL. What different forms could support take?

The literature on professional development makes clear that support after the workshop is essential to implementation. The intensive workshops used a form of “e-mail mentoring,” a deliberately proactive, cohort-based list. Facilitators did not just share announcements, but invited people, cheerfully and relentlessly, to chime in – to share what they were trying and how it was going. These check-ins served as conversation starters, offering social support as the group celebrated successes and normalized the difficulties of trying something new. The list has become a supportive community for inquiry about teaching – it doesn’t rest on the facilitators sharing their expertise. Read more about this support mode here.

4) Your outcome report mentions that traveling workshops allowed the PRODUCT team to reach non-tenure track instructors and faculty at two-year institutions. With the MAA's increased focus on supporting VITAL faculty (Visitors, Instructors, TAs, Adjuncts, and Lecturers), what can we learn from PRODUCT about how best to do this?

The traveling IBL workshops did reach a greater variety of people, in part due to deliberate outreach and in part due to their accessibility. In fact, their success prompted us to create the MAA OPEN Math project, funded by NSF and supported by MAA, to offer intensive professional development workshops online. The workshops take advantage of what we’ve all learned about effective and interactive online teaching, and they reduce both personal costs and our collective carbon footprint. The feedback on these workshops has been very positive!

5) For those who are interested in seeking NSF funding for a professional development program, what is your advice on creating a successful research and evaluation plan? How important was this part of your project both to your successful application as well as to implementing the PRODUCT program?

Research and evaluation are two sides of one coin, and we design studies so one data set can serve multiple purposes. Evaluation focuses on improving a specific program and documenting its impact; research questions must have relevance beyond the program itself and contribute to general knowledge. Sometimes when I review proposals, I see writers getting a little grand about the aims of their research. A small, thoughtful study, executed well, can be really valuable. It’s also important to do your homework in the literature and to learn from people who are trained in evaluation or education research. In education, it takes many studies in many different contexts to build transferable knowledge, so figure out what is interesting about your context that might add a bit of insight to what is known. People can also get hung up on comparison groups. I often encourage people to pose questions about their work in terms of “what is good?”, “why?”, and “what could be better?” rather than whether it is “better than” something else.

For PRODUCT, I do think a good research and evaluation plan was critical to getting funded. Formative evaluation was crucial to improving the workshops as we went. It’s even more exciting when we can also demonstrate a clear and compelling impact on instructors’ practice and can contribute to the literature on effective professional development.