Students’ relationship with AI: between support for learning and threat to self-esteem


How can generative AI be integrated into education responsibly and effectively, while taking into account the needs of students? This is a challenge facing higher education institutions, given that AI not only impacts work habits but also self-esteem.


Generative artificial intelligence tools are disrupting education at breakneck speed. In 2025, 74% of 18- to 24-year-olds in France use generative AI, 19 percentage points more than the 25- to 34-year-old age group.

With just one click, these technologies can write a text, summarize a complex concept, or create a unique image. They are a real asset for students seeking efficiency, and a promising tool for teachers looking to energize their classes and diversify their teaching approaches. But doesn’t this efficiency conceal certain risks?

While AI may provide immediate benefits, cognitive outsourcing could come at a high cost in the long run. A preliminary study from the Massachusetts Institute of Technology (MIT) introduces the concept of “cognitive debt”: the more tasks we delegate to AI, the more our brains risk going idle, ultimately leading to the atrophy of essential learning skills such as memory and critical thinking.

However, as some researchers point out, these preliminary results, while opening up important avenues for reflection, could also partly reflect the specificities of the methodological approach used.

This observation highlights a central issue: the need to reflect on students’ relationship with AI and how to integrate and supervise its use.

Use of AI and self-construction

Beyond its impact on cognitive activity, AI also alters students’ emotional relationship with learning. A study conducted at a leading French business school showed that students with high academic anxiety perceived AI as an essential support for their success, while fearing that its use would be deemed illegitimate or that they themselves would become replaceable.

Conversely, students with higher self-esteem were less likely to question the legitimacy of AI and the work it produced, and less likely to fear being replaced. However, they too reported a high level of dependence on these tools. For these students, AI is primarily perceived as an effective assistant, whereas for more anxious students it appears both as an essential aid and a potential threat.

This ambivalence reveals a real paradox: while AI enhances perceived effectiveness to the point of becoming indispensable for many students, it also raises deep concerns about the academic legitimacy of its use and the fear of being replaced in the long term.

These feelings reflect a more global transformation: AI is not just a neutral tool, it is part of a relationship of trust and identity that profoundly impacts the construction of oneself as a learner and future professional.

A tool with multiple potentialities

These findings raise a key question: how can we integrate generative AI into education responsibly and effectively, taking into account the cognitive and emotional needs of students?

Beyond the risks it may pose, AI allows for personalized learning paths by adapting to the needs, pace, and level of each student. It can provide additional explanations, reformulate a difficult concept, or offer tailored examples, thus offering ongoing support, available 24/7. This availability allows students to progress outside of class hours and strengthen their independence.

Studies also highlight that, if properly managed, AI has considerable potential as a catalyst for creativity and academic performance, by generating unprecedented synergies between humans and machines.

What new skills for students and teachers?

The challenge is to position AI as a genuine educational lever while taking into account students’ psychological differences and varied patterns of use.

In this context, the role of teachers is undergoing a transformation. Integrating these tools into courses requires in-depth pedagogical reflection to address questions of academic legitimacy, alleviate fears related to human replacement, and, above all, ensure that AI is perceived as a support for learning, not as a threat to future skills. Furthermore, it is essential to prevent the risks of dependency and overconfidence in AI. Some researchers emphasize the importance of developing metacognition: the psychological ability to observe, analyze, and regulate one’s own thoughts and behaviors.

Metacognition encompasses the explicit understanding of oneself as a learner—one’s strengths, weaknesses, and effective strategies—as well as the ability to plan, monitor, and adjust one’s work methods. In concrete terms, it allows students to take a step back from their learning, assess the quality of their reasoning, and reflect on their worldview.

In a world where collaboration between humans and AI is becoming the norm, it will be more necessary than ever to cultivate our deeply human skills and values. In this regard, the World Economic Forum’s “Future of Jobs Report 2025” highlights that, well beyond technical skills, abilities such as creative thinking, resilience, flexibility, empathy, curiosity, and the capacity to keep learning throughout life will occupy a more central place than ever.

Author Bio: Giulia Pavone lectures on artificial intelligence (AI), consumer behavior, technology adoption, conversational agents, autonomous vehicles, and the ethics of AI at Kedge Business School.
