It was a long, rough day, but we were able to learn in lab and troubleshoot together. Tonight, we watched a session entitled “When support and quality assurance collide: A learning technologists’ journey to maintain staff wellbeing in a world full of data” by Laura Hollinshead from the University of Derby. Hollinshead and their team developed a quality assurance process for raising and maintaining course quality. They wanted to establish baselines and share data with the institution. The baselines included:
- develop a high-level module plan,
- prepare a module induction,
- present learning content in structured, manageable segments,
- include a blend of synchronous and asynchronous approaches,
- ensure the online socialization of students,
- facilitate the learning through group work and collaborative activities,
- actively use a range of multimedia resources, and
- apply a consistent approach to reviewing and improving the accessibility of course materials.
The challenge, Hollinshead stated, was creating “manageable, fair, and supportive” environments. They wanted the workload of assessing programs to be manageable alongside instructors’ academic workload, the process to be fair and to take into account the different approaches instructors use, and the overall environment to be supportive.

For the design, they created different modules and “personas” for different users. The process began by notifying program leaders that their program had been selected. The program leaders then worked with learning technologists, who conducted a review and planned next steps. Finally, there was a follow-up to learn how programs had improved.

They used a rubric to review programs, rating the baseline items above on a three-point, color-coded scale: red (low), yellow (mid), and green (high). Accessibility, Hollinshead mentioned, came up very often, so the team created more resources to share with instructors and staff. The team requested feedback from participants, and responses indicated that most were satisfied with the materials and support provided. Through the reviews, the team also identified and shared “best practices” as well as improvements to their approaches.
What didn’t work so well, according to Hollinshead, was the workload for their team and academic staff. In response, the team streamlined the review process, and Hollinshead mentioned they have since refined the baseline and the guidelines for reviews. One audience question was whether workloads were discussed with staff, such as module leaders. The answer was not initially, but the team now has a better understanding of the time constraints and workload; Hollinshead mentioned several times that these were challenges for staff. Another audience member asked how much follow-up was performed. It seems follow-up was done using online forms, and there are still opportunities to learn more from reviewers and participants.

As I continue to take Quality Matters workshops and think about course improvement, I appreciate learning what other programs and institutions have done and the challenges they encountered. While I am honestly scared to share some of the courses I teach with others… or should I say embarrassed? … the feedback and examples from others are very useful. I have learned so much from the opportunity to view other course sites and setups. One struggle we have right now for the online molecular biology course is how to organize and display online labs in the learning management system. Maybe external reviewers (external to the program and course) will have suggestions!
