Evaluating New Learning Technologies


It is Friday! I had a chance today to talk with Melanie about Hypothes.is. It was a lot of fun! Also, student researchers worked in the lab. Tonight, we watched an ALT 2021 session entitled “Making your mind up: Formalising the evaluation of learning technologies” by Marieke Guy and Tharindu Liyanagunawardena. Guy is a learning technology production manager at UCEM.

Reviewing the transitions of the past two years, from the pivot to emergency remote teaching to a year of “online teaching,” Guy spoke about the shifts and student satisfaction. Citing NSS scores, Guy emphasized that it was a rocky year, with 75% overall satisfaction, and that there is a need for investment in digital education. Guy also discussed the many factors involved in making technology decisions on higher education campuses: evaluation, stakeholders, compromises, administration… and a lot of problem solving.

Guy provided examples of existing frameworks: the Educause Rubric for E-Learning Tool Evaluation, the Technology Acceptance Model, Jakob Nielsen’s usability testing, and Tony Bates’s SECTIONS model (which looks at several “sections,” including students, ease of use, costs, affordances of different media, interaction, networking, and security and privacy). There are also models that take change management approaches, Guy explained; examples include ADKAR (Awareness, Desire, Knowledge, Ability, and Reinforcement), the McKinsey 7S Model, and Lewin’s model. CMALT looks at the constraints and benefits. UCEM looked at several different systems and did systematic testing and evaluation using the Educause rubric, which addresses functionality, accessibility, technical aspects, mobile design, privacy/data protection/rights, and social/teaching/cognitive presence. I thought it was interesting to see mobile design and the Community of Inquiry presences included.
Guy explained that at UCEM, after testing on the VLE, IT staff checked integration and safety before the group provided recommendations. Guy asked, “How can you ensure a thorough and timely evaluation?” This made me think about the timeframe for evaluating new technologies. No wonder it often takes years! The group used Mural to crowdsource ideas, and the audience quickly filled the board. I wonder how expensive Mural is; the free version allows up to five murals. Guy answered several questions about the process and improvements. One consideration I had not thought about was that pilots may not offer the entire range of functions a tool has to offer. As we get ready to use new software next spring and have been testing it with a group of undergraduate scholars, I now have a couple more questions to ask the company!

[Image: computer screen with a circle and numbers on the sides.] How are new learning technologies evaluated before implementation? Photo by Tima Miroshnichenko on Pexels.com