Cutting EDge: A Collaborative Approach to Faculty Development

Author: Heidi Marsh
May 2019
Volume: 22
Number: 5
Learning Abstracts

Technology is increasingly seen as a cost-effective and innovative solution to issues in higher education, including the challenge of meaningfully engaging students during class time (Hooper & Rieber, 1998; Kirkwood & Price, 2014). However, barriers to technology integration remain, in terms of faculty time, resources, and experience. Moreover, while the addition of technology may enhance teaching practices, there is also the risk of integrating technology for technology’s sake; to be effective, faculty need to consider how digital tools can engage students from both a technological and a pedagogical standpoint (Johnson, Wisniewski, Kuhlemeyer, Isaacs, & Krzykowski, 2012). Finally, although the integration of new technology is widely encouraged, its impact and effectiveness have received relatively little empirical attention (Kirkwood & Price, 2014). This issue of Learning Abstracts describes a new collaborative approach to educational development, pioneered at Humber College Institute of Technology and Advanced Learning, that aims to address these issues. In this model, the college’s Professional Learning, Digital Learning, and Scholarship of Teaching and Learning (SoTL) teams united to provide a support-rich opportunity for faculty to experiment with and integrate a new technology-enhanced approach (in-class polling) into their courses, while gathering empirical evidence about its impact.

Barriers to Technology Integration

Faculty are increasingly being encouraged to integrate educational technology into their teaching practices, but doing so can pose challenges. For one, the adoption of new technology can be time-consuming (Ely, 1990), particularly if institutions have insufficient resources and are unable to provide proper faculty development opportunities (Buchanan, Sainter, & Saunders, 2013; Johnson et al., 2012). Moreover, innovation with new technology can feel risky and even induce anxiety related to a lack of confidence and/or expertise (Kotrlik & Redmann, 2005; Nkonge & Gueldenzoph, 2006). In addition, because educational technologists, digital specialists, and educational developers often work independently of one another, faculty may receive disconnected guidance on how to incorporate a new educational tool technologically versus pedagogically.

Beyond the question of how, there is also the question of why. Although innovation is to be encouraged, it is critical to investigate the effectiveness of digital tools, particularly as they may be costly. Unfortunately, the acceleration of technological innovation makes this challenging, and research on the efficacy of these tools has tended to lag behind (Kirkwood & Price, 2014) for reasons such as faculty time constraints, a prioritization of disciplinary research over SoTL research, and a lack of experience in conducting classroom research (Kenny & Evers, 2010). There are also ethical considerations for faculty investigating the impact of their own teaching practices, which can pose a barrier to engagement with SoTL (Martin, 2013). Finally, much of the SoTL research that has been conducted on educational technology has been limited to a single classroom and/or a specific context, with a single tool. Thus, our knowledge of the impact of many digital tools is piecemeal, at best.

In-Class Polling

One notable exception is student response systems (SRS), originally introduced in the form of clickers. These devices allow faculty to gather real-time feedback from their students through in-class polling or quizzes. Typically, an instructor projects a question and students use clickers to respond. Compiled responses are displayed and, sometimes, discussed. Unlike many other technologies, the impact of clickers has been well documented, with evidence that they support improved student interactivity, attendance, retention, engagement, and, occasionally, grades (for a review, see Aljaloud, Gromik, Billingsley, & Kwan, 2015).

Although the original SRS were clickers, the past five years have seen an explosion of phone- and web-based apps for in-class polling. These tools offer an expanded opportunity to exercise authentic pedagogy (Kivunja, 2015). However, these ever-evolving, emerging apps have been much less researched than their original counterparts. Moreover, while some studies have compared the newer polling apps to traditional clickers (e.g., Stowell, 2015), most of this work has involved only one or two polling tools in a handful of classrooms (cf. Iwamoto, Hargis, Taitano, & Vuong, 2017; Wash, 2014). Given that each app varies in its features, affordances, and look and feel, it is unclear whether they are equally effective at stimulating learning and engagement.

The Solution: Cutting EDge

In response to the need to support the integration of new digital tools in a way that is both technologically and pedagogically appropriate, and within a framework that allows for the collection of empirical data, Humber College Institute of Technology and Advanced Learning developed Cutting EDge. In this model, interested faculty were invited to two lunch-and-learn sessions, in which they were introduced to five in-class polling tools: Plickers, Kahoot!, Mentimeter, Socrative, and original clicker devices. Faculty were trained not only in how to use each tool, but also in best practices for integrating it into coursework (cf. Caldwell, 2007). Each faculty member then selected their three favorite polling tools and was assigned to use each one for a separate three-week period of the semester (see Figure 1). During each three-week period, they were asked to use only the assigned tool, in at least two lessons. The order in which the tools were used was counterbalanced across faculty members as much as possible to account for potential novelty effects (Lantz, 2010). It should be noted that no faculty members chose physical clickers; all of them selected phone- and web-based apps.

Figure 1. Cutting EDge Training, Support, and Research Model
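To make the counterbalancing step concrete, the sketch below shows one way such a rotation could be generated. It is a minimal illustration only: the function name, the faculty count, and the assumption that all faculty share the same three tools are ours, not part of the study (in practice, each faculty member chose their own three favorites, so counterbalancing was only approximate).

```python
from itertools import permutations

def counterbalance(tools, n_faculty):
    """Assign each faculty member one ordering of the tools, cycling
    through all possible orderings so no tool always appears first."""
    orders = list(permutations(tools))  # 3 tools -> 6 possible orders
    return [orders[i % len(orders)] for i in range(n_faculty)]

# Hypothetical example: six faculty members sharing three of the
# study's tools, each tool used for one three-week block.
schedule = counterbalance(["Kahoot!", "Mentimeter", "Socrative"], n_faculty=6)
for i, order in enumerate(schedule, start=1):
    print(f"Faculty {i}: block 1 {order[0]}, block 2 {order[1]}, block 3 {order[2]}")
```

Rotating the order in this way spreads any novelty effect (Lantz, 2010) evenly across the tools, so that no single app benefits from always being the first one students encounter.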

When faculty tried a tool in class for the first time, a member of the Digital Learning team was present (if requested) to mitigate the risk involved in experimenting with new technology. Meanwhile, the SoTL team recruited each faculty member’s students to complete brief surveys about their experiences with each of the polling tools. Students were asked to report on the quality of classroom interactions, their perceptions of their own learning and understanding, and the features of each polling tool. These surveys were administered at the end of each three-week window. A set of final, longer surveys was administered to students and faculty at the end of the semester to gather overall impressions of in-class polling. All surveys were voluntary, online, and anonymous. Finally, focus groups were conducted with faculty members after each three-week block to explore instructors’ perceptions of classroom interactions, their students’ engagement, and the impact of the experience on their teaching practices.

The Cutting EDge framework provided an opportunity for faculty to learn several new digital tools in an evidence-informed way, within a supportive group setting, and in an environment that mitigated risk and allowed for experimentation. It also allowed for the collection of data on four digital tools from over 400 students across twenty faculty members’ classrooms. Finally, it allowed faculty to take part in a SoTL research project and get feedback from their students without having to conduct the research independently and navigate ethics approval (which was managed centrally by the SoTL team).

Results/Impact

Although the focus of this initiative was on technology-enhanced learning, an in-depth description of the data on in-class polling is beyond the scope of this discussion and will be reported elsewhere. Suffice it to say that the polling tools were overwhelmingly popular with both faculty and students. Levels of engagement were high, and the tools improved instructors’ awareness of students’ conceptual understanding. Interestingly, students showed a strong and nearly unanimous preference for one app (Kahoot!), while faculty preferences were much more varied.

In this article, we focus on the lessons learned about educational development, which were equally illuminating. The benefits to faculty were immediate, explicit, and ongoing; teaching practices were transformed. This was evident from their survey responses, summarized in Table 1. Seventeen of the twenty faculty members completed the survey, and every one of them agreed or strongly agreed that using the tools was a positive teaching experience, that they would recommend the tools to other faculty, and that they would use them again in the future. Moreover, more than three in four faculty (76.5 percent) reported changing their teaching in response to students’ poll responses. Many (41.2 percent) used the tools not only to assess understanding, but also to get feedback on their own teaching. More than four in five (82.4 percent) reported that getting feedback during class helped them to teach better.

Changes in teaching practice were also apparent from the focus groups. One faculty member reported that he had reconfigured an entire lecture from a PowerPoint presentation into a series of questions in Kahoot! to guide his students through a new skill. Others noted that the experience challenged them to rethink their lesson plans. As reported by one instructor,

Sometimes it made me do something in a slightly different way . . . because I was thinking more about how to stimulate discussion or incorporate opinions into this very scientific class. So I really liked that; it stretched me a little bit.

Another said,

The new way that I’ve now changed . . . my slides and the flow of the classroom, I think I’m going to keep that. . . . By the second or third week, I started being able to use the tool at the beginning of classroom more liberally. I let my lecture flow a little more liberally, because I wanted to see what the reaction of the students would be first.

In some cases, this flexibility was in direct relation to an increased awareness of students’ levels of understanding (or lack thereof). As one faculty member reported, “I actually found myself, even at break, going through the slides and either taking out or moving things around because I now kind of got a feel of the classroom, which was new.”

Many faculty expressed a sense of renewal, excitement, and experimentation with their teaching, as is evident from this statement:

Now that I am aware of [the tools] I will have more time [to] explore different options and really integrate them into my lesson planning so I’m excited about it. If I’m excited about it then my students are going to be excited.

But perhaps this observation best encapsulates the faculty experience:

I love it. I’m lucky that I’m part of [this initiative]. . . . As someone who’s not technology savvy it was important to me to bring this into my classroom. . . . It’s just opened my eyes to more tools that are out there. I think it’s almost just as fun for me to incorporate them in my lesson as a PowerPoint, right? . . . You also get to see the benefits of what the students have to offer with the tools, so for me I absolutely loved being a part of this study and I know my students did too and I think they’re happy for me that I’m learning more technology in the classroom. So yeah, it’s definitely going to bring my future classes different ways of teaching.

Lessons Learned and Recommendations

Despite these positive results, we also learned some valuable lessons about the process of implementing Cutting EDge. First, the coordination required between teams was extensive. Because we took a personalized approach to development (with 82.4 percent of faculty reporting that the training and support were sufficient), faculty members occasionally got out of sync with one another, meaning that research recruitment, in-class support, and the deployment of surveys were not always perfectly aligned. We have since employed a project manager to oversee the process and coordinate between teams. This has improved the flow of the model substantially and has allowed for further scalability.

Subsequent iterations have also taught us that the choice of technology is important; in-class polling tools worked well because they could easily be integrated into a class session without being tied to assessment. In other words, the tool could easily fit into the course, rather than the course having to fit the technology.

We have also learned to allow more flexibility with the timeline. Faculty often had a specific lesson plan in mind for the tools that didn’t always align with the timeline we had originally proposed. Exams, internships, and study breaks were also important to factor in. We now give faculty a wider window of time in which to integrate the tool and build in some contingency room in case of unanticipated changes.

Next Steps

This model represents an innovative approach to educational development, with rich collaboration between the Professional Learning, Digital Learning, and SoTL teams. The model allowed for evidence-enhanced integration of technology in a way that is scalable and that generated a wealth of new data from hundreds of students, across several disciplines, all within a single semester. At the same time, faculty were supported at a personalized level, which allowed for meaningful experimentation. We plan to continue and expand collaboration among these teams, for this and other professional development initiatives.

Bolstered by the success of the first iteration, we have continued to use the model, but with new technologies, including collaborative learning tools and interactive video apps. We are also exploring the possibility of piloting the initiative across institutions to gather even more widespread, generalizable data about the impact of technology in the classroom.

References

Aljaloud, A., Gromik, N., Billingsley, W., & Kwan, P. (2015). Research trends in student response systems: A literature review. International Journal of Learning Technology, 10(4), 313-325.

Buchanan, T., Sainter, P., & Saunders, G. (2013). Factors affecting faculty use of learning technologies: Implications for models of technology adoption. Journal of Computing in Higher Education, 25(1), 1-11.

Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9-20.

Ely, D. P. (1990). Conditions that facilitate the implementation of educational technology innovations. Journal of Research on Computing in Education, 23(2), 298-305.

Hooper, S., & Rieber, L. (1998). Teaching with technology. In A. C. Ornstein (Ed.), Teaching: Theory into practice. Needham Heights, MA: Allyn and Bacon.

Iwamoto, D. H., Hargis, J., Taitano, E. J., & Vuong, K. (2017). Analyzing the efficacy of the testing effect using Kahoot™ on student performance. Turkish Online Journal of Distance Education, 18(2), 80-93.

Johnson, T., Wisniewski, M. A., Kuhlemeyer, G., Isaacs, G., & Krzykowski, J. (2012). Technology adoption in higher education: Overcoming anxiety through faculty bootcamp. Journal of Asynchronous Learning Networks, 16(2), 63-72.

Kenny, N., & Evers, F. (2010). Responding to the challenging dilemma of faculty engagement in research on teaching and learning and disciplinary research. Collected Essays on Learning and Teaching, 3, 21-26.

Kirkwood, A., & Price, L. (2014). Technology-enhanced learning and teaching in higher education: What is ‘enhanced’ and how do we know? A critical literature review. Learning, Media and Technology, 39(1), 6-36.

Kivunja, C. (2015). Innovative methodologies for 21st century learning, teaching and assessment: A convenience sampling investigation into the use of social media technologies in higher education. International Journal of Higher Education, 4(2), 1-26.

Kotrlik, J. W., & Redmann, D. H. (2005). Extent of technology integration in instruction by adult basic education teachers. Adult Education Quarterly, 55(3), 200-219.

Lantz, M. (2010). The use of ‘clickers’ in the classroom: Teaching innovation or merely an amusing novelty? Computers in Human Behavior, 26(4), 556-561.

Martin, R. C. (2013). Navigating the IRB: The ethics of SoTL. New Directions for Teaching and Learning, 2013(136), 59-71.

Nkonge, B., & Gueldenzoph, L. (2006). Best practices in online education: Implications for policy and practice. Business Education Digest, 15, 42-53.

Stowell, J. R. (2015). Use of clickers vs. mobile devices for classroom polling. Computers & Education, 82, 329-334.

Wash, P. D. (2014). Taking advantage of mobile devices: Using Socrative in the classroom. Journal of Teaching and Learning with Technology, 3(1), 99-101.

Dr. Heidi Marsh is Director, Scholarship of Teaching and Learning; Dr. Carol Appleby is Director, Professional Learning; Senay Habtu is Research Administrator; and Dr. Siobhan Williams is Program Administrator for the Centre for Teaching & Learning. Mark Ihnat is Director, Digital Learning, and faculty, Liberal Arts and Sciences and Innovative Learning, at Humber College Institute of Technology and Advanced Learning in Toronto, Ontario, Canada.

Opinions expressed in Learning Abstracts are those of the author(s) and do not necessarily reflect those of the League for Innovation in the Community College.