The Impact of AI on Education: Enhancing or Undermining Learning?
Key points:
AI is rapidly changing education, but not always in ways that support learning. A growing number of AI tools promise to “help” students by completing assignments, writing essays, solving problems, or even taking exams automatically.
Although these tools may seem convenient, they raise an important question: Do they remove barriers to learning or do they eliminate learning itself?
When AI Replaces Effort, Learning Pays the Price
AI is becoming increasingly integrated into academic life. According to Tyton Partners’ Time for Class 2025 report, 30 percent of teachers and 42 percent of students report using generative AI on a weekly or daily basis. As AI becomes a routine part of the learning experience, the question is no longer whether it will be used, but rather how these tools impact the learning process.
The rapid rise of AI doing coursework for students poses real risks for learners, teachers, and institutions. Widely accessible automation tools like Google’s Homework Helper, Companion’s Einstein, Quick Solver AI, or Eduhack.ai can bypass the effort required to learn. Learning shouldn’t be effortless. It is inherently iterative and time consuming. Progress comes through practice, application, and repetition. This effort is not a flaw in the system; productive struggle is the foundation of how people learn.
As we evaluate AI in education, we should ask a simple question: Does this technology remove barriers to learning or replace the desirable effort that makes learning possible? If the answer is the latter, the educational process itself risks being undermined. When students rely on AI to generate answers rather than solve complex problems, the learning loop breaks down. Without application and repetition, learning simply cannot be achieved.
At the same time, we should be clear about why students use these tools in the first place. Too often, learning experiences feel passive or disjointed. When courses are engaging, personalized, and interactive, students are far more likely to make the effort to learn critically.
AI should support these efforts, not replace them. Tools that help learners better understand concepts, provide guidance, or reduce unnecessary friction in the learning process can be incredibly powerful. In fact, the Time for Class report shows that 84 percent of students still prefer human-centered instruction, underscoring the importance of AI supporting—not replacing—classroom instruction.
The Biggest Challenge for AI Is Not Technology, but Privacy and Trust
The risks of automated AI tools go beyond learning outcomes. Privacy and security are equally important concerns. We are seeing tremendous innovation in the AI ecosystem, and some of it is truly transformative. However, there are still many unknowns with these new solutions. Many newer or less mature vendors lack strong security measures, and unfortunately some tools are designed to take advantage of students looking for quick workarounds. Students may download applications without fully understanding how their data will be used, stored, or shared.
Educational institutions have a solemn responsibility to protect learners’ data, and AI developers must share this responsibility. AI systems should be transparent about how they work, what data they collect, and how that data is used. Institutions should always be clear about what technology they are using and how it interacts with their learning environments.
I have always taken the protection of learner data seriously and believe that it should not be used to train large language models. Protecting student data isn’t optional – it’s essential to building trust in education technology.
Look for AI to Transform Learning
The goal should not be to avoid AI in education. The goal should be to use AI that strengthens learning rather than replacing it. One of the most promising approaches I’ve seen from educators is a shift toward more active learning and assessment. Instead of tasks and tests that reward memorization, active experiences challenge learners to analyze, interpret, and apply knowledge in meaningful and engaging ways. This deeper engagement naturally discourages shortcut tools and promotes true understanding.
AI designed specifically for learning can support this change.
When used carefully, it can help educators transform their lectures or static materials into interactive content that encourages participation and application. For example, AI can transform recorded lectures or slides into dynamic modules in which students actively engage with concepts, test their understanding, and apply what they have learned – all directly related to learning outcomes.
AI can also provide learners with support tailored to their needs: answering course questions, guiding them back to relevant material after a quiz, or helping them explore concepts outside of traditional office hours. In this way, AI becomes a learning companion and not a replacement.
The AI Decisions Institutions Make Now Will Shape Learning for Generations
Institutions are still working to establish guidelines for AI use. According to the latest Time for Class report, 45 percent of teachers say preventing cheating is one of their biggest teaching challenges. As AI tools become more integrated into academic life, thoughtful governance and transparency will be critical to ensure that these technologies strengthen learning rather than undermining it.
When evaluating AI solutions, institutions should use a few principles to guide their decisions.
AI should be learning-oriented and based on robust learning science foundations. Technology should reinforce best teaching practices, not circumvent them. AI should be designed to help educators work more efficiently while maintaining the quality and integrity of course content and aligning with learning outcomes. In short, educators must remain in control.
Human supervision should always be integrated into AI systems. Educators – not algorithms – should make the final decisions about course content, feedback, and assessments. Keeping people at the center remains a top priority for leaders. As Justin Rose, associate vice president for information management and digital learning at Southeastern University, puts it: “People working in harmony with new technologies will always be the key recipe for success.”
Transparency and governance are also essential. Institutions should control how AI interacts with their data. They should be able to set permissions, define guardrails, and understand exactly how systems work. The educational impact must be measurable.
AI should improve the results we see: increased engagement, deeper understanding, and significant time savings for educators.

AI will undoubtedly play an important role in the future of education. The question is not whether it will be used, but how it will be used.
If we develop AI systems that shortcut the learning process, we risk undermining the very purpose and value of education. But AI that deepens engagement, makes learning more personal, and supports meaningful effort has the potential to become one of the most powerful tools ever available to educators, learners, and institutions.
The future of AI in education should never be about replacing learning; it should always be about strengthening it.
Christian Pantel, D2L
Christian Pantel is Chief Product Officer at D2L.

