Data Analytics and Open Practices at the Institutional Level

I continue watching OERxDomains21 sessions this week. Tonight, I started with a session by Nicole Allen entitled “Open Education, Data Analytics, and the Future of Knowledge Infrastructure” that addressed open education and surveillance analytics. Allen gave a talk without slides and began by explaining the new market for data analytics. She mentioned several examples I didn’t know about, including mergers and the analysis of research collaborations and student movement on campus. Allen also talked about new legislation and initiatives that encourage institutions to adopt electronic textbooks, effectively requiring students to accept the terms and sign up for subscriptions. She explained that proctoring software relies on algorithms that are not free of bias and often disproportionately affects marginalized students. The use of data analytics by companies shapes how universities function and make decisions. Allen encouraged investing in course redesign and professional development instead of surveillance software. She asked, “Does higher education start from the posture of trust?” and related this to openness and closedness. Allen encouraged us to talk more about curriculum and openness, since they are the infrastructure of our institutions. During the question session, several points were raised about increasing awareness and communication.

Another session, entitled “Exploring the Possibilities for an Institutional OEP Self-assessment Tool to Shift OEP from Grassroots to Institution-Wide,” was presented by Elizabeth Childs and Tannis Morgan. Morgan started with an explanation of their land acknowledgement and why it is customary to begin with an acknowledgement in higher education in Canada. It was refreshing to learn the rationale. Morgan then talked about the tool they designed for evaluating OEP impact. The tool consists of 23 Likert-scale questions and is available online. They used a critical case self-study approach to “examine the similarities and differences between our institutional approaches and evolution.” The convenience sample included five British Columbia post-secondary institutions with open practices such as zero textbook cost degrees, OER use, open courses, open journals, encouraging open… They transcribed and examined the sessions. The main themes they identified in the surveys and transcript analyses were advocacy, policy, leadership, and institutional culture. It was interesting to hear about grassroots initiatives, leading from the middle, and how institutional culture affects OEP adoption. Childs talked about the impact that using and evaluating the tool had on the research team. It was interesting to hear that the focus was on institutional changes in blended open practices.

Both of these sessions highlighted how institutional change and the adoption of open practices require raising awareness of faculty practices, institutional culture, and interactions with vendors. The data analytics considerations brought up by Allen, and the reframing toward course redesign and professional development opportunities, are issues we are grappling with here. I think that with our current reassessment and adoption of OEP, we may be in a position to do a site evaluation using the tool described by Childs and Morgan. In the process of evaluating OEP and institutional culture, we could learn how to improve openness and increase student access to affordable education. I’m interested in OEP and in changing how we consider its impacts. What can I do to get more traction and truly help students learn, contribute, and share?

What happens on our campus when Open Educational Practices begin to take root? What institutional changes are required to adopt openness and critically analyze potentially harmful data analytic practices? Photo by James Wheeler on Pexels.com