College students are noticing their AI‑smoothed writing sounds strong — and not like them

Generative AI has become a part of everyday student life in Canada. While institutions focus on misconduct and detection, a deeper shift is happening, one that concerns identity.

A recent KPMG Canada report finds that 73 per cent of students use generative AI for schoolwork, and nearly half say it is their “first instinct.” Also significant: many students report feeling uneasy, worried that their use may be seen as cheating.

The study is based on a survey of 684 university, college, vocational and high school students, drawn from a larger sample of 3,804 Canadians aged 18 and over, about how people are adopting generative AI.

In my doctoral research on STEM education in Ontario colleges, I’m exploring how AI is transforming not only how students write but also how they perceive voice, legitimacy and what it means to be themselves.

Academic policies can define what constitutes cheating, but they do not address a more subtle concern: if AI helped write my assignment, will I still be seen as capable, and will my work represent me?

Identity takes shape through writing

Writing is more than a technical skill. It is one of the primary ways students structure and elaborate ideas, demonstrate competence and position themselves as emerging professionals.

This is particularly significant in STEM, where programs are often closely linked to specific career paths. Students are expected to begin positioning themselves as future professionals through how they communicate and present knowledge.

At the same time, STEM fields are often seen as primarily technical or data-driven, with writing treated as secondary. Yet research shows that communication is central to scientific practice, shaping how knowledge is constructed, interpreted and shared.

AI is part of envisioning career paths

Even beyond this, when science students write assignments, they also undertake what social and cultural theorists describe as “identity work.”

Through writing, students build narratives that let them explore how they might belong in particular worlds or professional fields. In my research, I examine how STEM programs operate as cultural worlds with implicit rules about what counts as smart, credible and legitimate participation.

Students interpret rules and adjust how they portray themselves in their work. This identity work is shaped by prior experiences, confidence with disciplinary language and alignment between personal interests and the STEM career paths they see as being available to them. AI is now part of that process.

‘Kinda generic’

In my research, I have observed college STEM classes, taken field notes and spoken with a cohort of students multiple times over a two-year period about their work.

I often hear a version of the same concern: the AI-generated draft is technically strong, but “it does not sound like me.” This concern reflects the insight that “voice” or “sound” in writing is a signal of legitimacy.

In my collaborative work on cultivating student agency, I use the idea of “becoming alive within science education” to describe moments when students can bring more of themselves — their perspectives, ways of thinking and experiences — into how they learn and express ideas.

Yet institutions often favour more standardized forms of writing. AI can intensify this by making a fluent, generic style instantly available. For some students, this lowers barriers and supports access. For others, it feels like self-erasure.

One student put it this way:

“It’s better writing, yeah, it sounds good and helps get a better grade. But it’s kinda generic. Like anyone could’ve written it, not just me.”

This recurring pattern in the data points to a broader tension: phrasing, structure and tone in writing carry traces of identity, traces AI can smooth or erase.

How we think about ourselves

Many of us have likely noticed that AI tools can improve the quality and efficiency of writing. They may also lead to more uniform outputs, reducing variation in how ideas are expressed. These concerns are echoed in education guidance.

UNESCO warns that AI systems can shape how knowledge is produced and expressed, raising questions about human agency and originality. Canadian policy discussions similarly highlight both the opportunities and risks of AI for student learning and authorship.

Taken together, these insights suggest that AI does more than assist human writing: it shapes how voice is expressed and how we think about ourselves.

Policy catching up

Canadian post-secondary institutions are still determining their approach to AI.

Many policies aim to balance flexibility with oversight, allowing limited AI use while emphasizing disclosure and addressing risks such as fabricated citations, bias and privacy issues.

Yet institutions also acknowledge challenges in enforcement.

As policies evolve, uncertainty remains. Students must navigate what is permitted, what constitutes their work and whether it truly reflects who they are.

STEM and belonging

In Canada, participation in STEM fields remains uneven across gender and other social dimensions such as race, Indigenous identity, socioeconomic status and immigrant background.

Many students already question whether they belong, making recognition deeply consequential.

If AI-generated writing becomes the implicit standard for “good work,” students may begin to locate competence in the tool rather than in themselves.

Students who rely on AI may question the authenticity of their success, while those who avoid it may feel at a disadvantage.

What can educators do?

Rethinking learning design is important. Students should not have to guess what is acceptable. Assessments should focus on the process that makes students’ thinking visible, not just the product.

Significantly, writing in one’s own voice must be treated as a skill worth developing.

In practice, this can be as simple as asking students to explain how they used AI in an assignment, or compare an AI-generated paragraph with their own and discuss what changed in tone, clarity and reasoning.

Instructors might also ask students to revise AI-polished text so it reflects their own thinking, or to identify where their interpretation and uncertainty matter. These and other small shifts help foreground not only what students produce but also how they think and position themselves in their work.

AI is here to stay. The question is whether STEM classrooms will help students use these tools without losing their voice, their agency and their sense of belonging.

Author Bio: Nurul Hassan Mohammad is a PhD Candidate, Ontario Institute for Studies in Education at the University of Toronto