Strategies for implementing AI for better student engagement

92% of students are already using AI, yet only 36% receive institutional support. Here, we look at practical ways institutions can support students' use of AI, promote AI literacy, and harness AI to improve student engagement.

3 min read | Published: 5 Dec 2025
Author: Phoebe Hoar

The rapid integration of AI is reshaping higher education, moving beyond the initial fear of plagiarism to a focus on harnessing its potential to increase student engagement and success. For higher education professionals, this shift presents a powerful opportunity to deploy AI not just for compliance, but as a core tool for equity and deep learning.

In our recent insightful webinar, "Effective Strategies for Implementing AI to Increase Student Engagement," Genio's Chief Technology Officer, Josh Nesbitt, chaired a discussion with pioneering experts:

  • James Moore (Director of Online Learning at DePaul University)
  • Amanda Hagman (Chief Scientific Officer at Atana and Adjunct Professor at Utah State University)
  • Jon Louviere (Chief Officer for Online Enterprise and Extended Learning at Idaho State University)

The panel explored how institutions can build a foundation of trust, compliance, and effective strategy, particularly important given that 92% of students are already using AI, while only 36% receive institutional support.


Using AI to augment the learning process

The panelists consistently emphasized that effective AI implementation must be guided by the principle of augmentation rather than replacement.

The goal is to remove "unproductive friction" (the administrative or repetitive tasks that don't add learning value) while maintaining the "productive friction" necessary for developing real knowledge and skills.

  • Scaffold the learning process: AI tools should always help to scaffold the learning for students, rather than doing it for them. For instance, Genio's approach uses AI to automate and offload tasks, like generating multiple-choice quizzes or outlines from existing notes, but never automatically generates the notes themselves, as the process of honing notes is critical.
  • Encourage reflection and presentation: Two of the most powerful parts of learning are reflection and presentation, which often happen outside the classroom. Students can use generative AI systems (often via voice assistants) to practice presenting and reflect on feedback, bringing that learning back into the classroom.

James Moore noted an interesting disconnect: older generations are embracing generative AI, while some younger students are pushing back on the technology due to ethical concerns. Providing a "safe environment" for students to experiment, critique, and form a meaningful argument for rejecting AI is crucial for preparing them for the modern workplace.

Encourage students to use AI for personalized support and burnout management

AI is highly effective as an equalizer, providing customizable and scalable support that directly benefits marginalized or underserved student populations, including those with disabilities, first-generation students, and veterans.

Personalized AI support for students

  • Personal learning assistant: 65% of students don't disclose their disabilities. AI allows students to self-disclose their learning needs (such as dyslexia or ADHD) to an AI assistant, which many find "safer" and less psychologically taxing than telling a professor or advisor. The AI can then reframe course notes and information in a form the student can absorb more easily, a level of personalization most professors don't have the time or capacity to provide.
  • Universal design for learning (UDL): AI should be used to promote universal design in all course products and learning experiences. By addressing the needs of disabled learners, institutions transform the experience for everyone.
  • Increased accessibility: AI can help make lecture content more accessible through efficient and appropriate transcription, language translation, and content adaptation for students with disabilities and/or different language backgrounds.

Use AI to reduce burden on advisors

Student advisors and support teams face increasingly high pressures and demands, which can lead to burnout. AI can help reduce this pressure.

  • 24/7 support: AI-powered agents can be configured to provide support when human advisors can’t. This is particularly critical for online students, or new majority learners, who work weekends or antisocial hours, providing immediate answers to common questions.
  • Early intervention and data prioritization: Institutions hold a range of data across different systems. By using machine learning and AI, institutions can analyze this data to predict which students are most vulnerable to dropout and need outreach, allowing advising teams to prioritize their most meaningful and impactful projects.
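The early-intervention idea above can be sketched in a few lines of Python. Everything here is illustrative: the feature names, weights, and student records are hypothetical, and a real system would learn its scoring from the institution's own historical retention data (typically with a machine learning model) rather than hard-coding a heuristic.

```python
# Illustrative sketch: combine signals from different campus systems into a
# single dropout-risk score, then rank students so advisors can prioritize
# outreach. All names, weights, and thresholds are hypothetical.

def dropout_risk(logins_per_week: float, assignments_missed: int,
                 gpa: float) -> float:
    """Heuristic risk score; higher means higher estimated dropout risk."""
    score = 0.0
    score += 0.4 * max(0.0, 1 - logins_per_week / 5)   # disengagement signal
    score += 0.3 * min(1.0, assignments_missed / 5)    # missed-work signal
    score += 0.3 * max(0.0, (2.5 - gpa) / 2.5)         # academic-struggle signal
    return round(score, 3)

# Hypothetical records pulled from LMS, SIS, and gradebook systems
students = {
    "student_a": dropout_risk(logins_per_week=0.5, assignments_missed=4, gpa=1.8),
    "student_b": dropout_risk(logins_per_week=6.0, assignments_missed=0, gpa=3.6),
    "student_c": dropout_risk(logins_per_week=2.0, assignments_missed=2, gpa=2.4),
}

# Advisors work the list from highest risk down
outreach_order = sorted(students, key=students.get, reverse=True)
```

The point of the sketch is the workflow, not the formula: data already held across systems is distilled into a ranked outreach list, so limited advising time goes to the students most likely to need it.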

Tips for ensuring students use AI fairly

The panelists agreed that the major challenge is integration and mastery, which requires proactive institutional guidance.

1. Transparency and trust

  • Be transparent in your own AI usage: The panel discussed how professors should be transparent about their own AI use, avoiding "double standards" where they demand attribution from students but not for their own syllabus or rubrics.
  • Show vulnerability: Professors can build trust by demonstrating areas where they are still learning or using AI, which positions them not as an all-powerful figure, but as a learner alongside the student.
  • Ethical grading: If using AI for grading, faculty must be upfront with students and leave the door open for students to seek a human review. James Moore also suggests using AI after grading to check for human bias (like grading differently when tired) rather than as the primary grading tool, which preserves the value of the human-student relationship.

2. Mitigating bias 

  • Challenge algorithm bias: Generative AI models are trained on the web and are not free from bias. Institutions must actively test and mitigate this.
  • Diverse testing groups: The group responsible for testing AI systems must be representative of the student demographics to ensure they notice issues that those in leadership or faculty positions might not see due to self-selection bias.

3. Institutional AI literacy

The biggest ethical issue discussed was the lack of AI literacy. Institutions must take a strategic, system-wide approach:

  • AI literacy training: Institutions should prioritize creating an AI literacy course for all staff and faculty that results in a certificate. This ensures everyone is equipped with the foundational knowledge to question, control, and define the tools they use, a skill we know is valued by employers too.
  • Strategic mission alignment: Every institution should strategically assess how AI can advance its mission. Amanda Hagman called this moment "as unprecedented as when we all got sent home for COVID," meaning institutions must adapt or risk falling behind.

Ready to turn these strategies into action? Watch the full webinar recording now to hear the panel share their successful integration stories, best practices, and innovative use cases for AI in student engagement.

Rewatch the webinar
Time for a simpler, smarter note taking accommodation?

Genio Notes is the online note taking tool that makes compliance simple, reduces cost and admin burden, and improves student outcomes.
Learn More