Scared Of Tech? UAE Students Fear False Accusations Of AI Help In University Assignments
- PUBLISHED: Mon 23 Feb 2026, 6:41 PM
- By: Nandini Sircar
For many university students across the UAE, submitting a major assessment now comes with an unexpected layer of anxiety - the fear of being misunderstood by a machine.
A new YouGov survey, which gathered 10,330 responses from students across countries including Canada, the United States, the United Kingdom, Australia, New Zealand, Malaysia, Singapore, Saudi Arabia and the UAE, found that 81 per cent of students here report stress about being “wrongly flagged by AI detection tools” when submitting major assessments.
Of the total responses, 527 were from students in the UAE and 515 from Saudi Arabia. At campuses in Dubai and Abu Dhabi, that anxiety is no longer abstract.
Prof A. Somasundaram, Associate Dean-Academic Undergraduate Studies Division at BITS Pilani Dubai Campus, said the concern is visible among students.
“We are seeing this concern among students. Many genuinely worry about being wrongly flagged, even when they have completed their work independently. The uncertainty around how AI detection tools operate can create anxiety, particularly for high-stakes assessments. However, our faculty deals with students with empathy rather than completely relying upon AI tools. We prioritise academic dialogue and review processes over automated judgments, which helps reduce unnecessary stress.”
Blurred lines in an AI-augmented classroom
The emotional toll should not be underestimated, said an expert.
“The stress is real. Even the perception that a system may misclassify their work creates uncertainty, particularly for students who are trying to use AI responsibly. At the same time, many institutions do not rely on detection tools, and students in those environments often report greater confidence in their learning process,” highlighted Dr Zeenath Reza Khan, Computer Science Associate Professor at University of Wollongong in Dubai and Founding President of the Centre for Academic Integrity in the UAE.
She added that the issue runs deeper than plagiarism detection alone, pointing to the rapid evolution of technology embedded into everyday tools.
“What makes this moment more complex is the rise of agentic AI embedded directly into browsers and productivity platforms. Tools integrated into search engines, writing environments, and learning systems are no longer separate applications students consciously open.”
Khan emphasised that these tools are ambient, embedded, and increasingly autonomous. “We are no longer simply discussing students copying from a chatbot. We are dealing with systems that suggest, draft, restructure, and anticipate responses in real time. The boundary between student cognition and machine augmentation is becoming blurred.”
If universities respond only with stricter surveillance, she cautioned, they may miss the bigger picture.
“If institutions respond to this shift with detection alone, they risk addressing only a narrow part of a much broader transformation. The question is no longer 'Did the student use AI?' but 'How do we preserve human agency in an AI-augmented environment?'”
Impact on performance, wider economy
At the Canadian University Dubai, Assistant Professor of Artificial Intelligence Najla Al Futaisi said the psychological impact can directly affect learning outcomes.
“Such stress can significantly alter student performance and their general learning experience. When students anticipate false accusations of academic misconduct, their focus may shift from intellectual development and assimilation to anxiety management and defensive writing strategies. In some cases, this fear can paradoxically push students toward the very misconduct they wish to avoid.”
She explained that some students may even turn to AI in an attempt to outsmart detection systems.
“For example, students may use AI tools to generate assignments and then attempt to 'humanise' the output to avoid detection. This approach undermines the learning process, weakens critical thinking skills, and encourages overreliance on technology rather than genuine intellectual engagement.
“Over time, such patterns can erode students' confidence in their own abilities, diminish originality, and undermine the development of analytical and problem-solving skills - competencies essential to academic success and employability in the UAE's knowledge-driven economy.”
For policymakers, the findings also carry implications beyond the classroom. Dr Ashraf Mahate, Chief Economist for Trade and Export Market Development at the Dubai Department of Economy and Tourism, described the survey as a call to action.
“There is a collective opportunity here for our entire ecosystem - from governing bodies and university executives to our technology partners - to move beyond a surface-level engagement with these findings. The data provides us with a clear roadmap to refine high-level policy and address the 'integrity anxiety' and mentorship gaps that are currently part of the student experience.”
He added that clarity and human connection will be key.
“By providing clear, easy-to-follow guidance and maintaining a human-centred focus, we can ensure our teaching and assessment methods remain strong and prepare our national workforce for the challenges ahead.”
