Oxford University Press finds half of UK students struggle to judge AI content

Oxford University Press has warned that schools must do more to help students navigate the world of AI after a survey showed alarming gaps in students’ ability to judge AI content.

A new OUP survey of 2,000 UK pupils found that only 47 per cent of students feel confident in identifying trustworthy AI-generated information.

A further 32 per cent admitted they couldn’t tell whether AI content was accurate, while 21 per cent were unsure.

Despite the findings, eight in ten students report regularly using, if not relying on, AI tools in their schoolwork.

The report found that nearly half (48 per cent) said they want more support from teachers to spot reliable content, and 51 per cent called for clearer rules around when AI tools should be used.

Student fears about AI

The report findings feed into broader concerns about how AI might reshape study behaviours.

A total of 60 per cent of students said AI encourages copying rather than original work.

Meanwhile, 62 per cent admitted that AI has had an adverse effect on their studies in some way, with some saying tasks had become ‘too easy’ or that it dulled their creativity.

Amie Lawless, OUP’s Secondary Product Director, described this as a positive sign.

She said: “It’s encouraging to see how aware young people are of the challenges surrounding AI and how eagerly they want to collaborate with their teachers to address them”.

Policy and practice in UK schools

This comes as the UK government has rolled out new AI guidance for schools and colleges, emphasising that AI tools must be used safely and effectively. 

The guidance, published in June 2025, includes materials for staff and school leaders created in partnership with the Chiltern Learning Trust and the Chartered College of Teaching. 

It stresses that AI should be ‘teacher-led’, with outputs systematically checked for accuracy and student data safeguarded. 

Elsewhere, the Department for Education has doubled down on AI tools as a route to free up teachers’ time, encouraging use of AI for administrative tasks such as lesson planning and writing feedback, while making clear school leaders must remain in control of decision-making. 

New school ‘AI toolkits’ published by the DfE also recommend using generative AI for lower-stakes tasks first, while building up governance and standards. 

Meanwhile, many UK schools are still waiting for training and policy support.

A Bett survey cited by the DfE found that 69 per cent of teachers said their school had not yet implemented AI, and 32 per cent of school leaders were not even planning changes to accommodate AI. 

At the same time, Jisc’s Student Perceptions 2025 report underscores that pupils across the UK are anxious about misinformation, bias, and the ethical risks of AI content. 

Among teachers, a recent Literacy Trust study found 45.2 per cent now report concerns about pupil misuse of generative AI, and 51.4 per cent are worried about the impact on student engagement.

Jisc’s 2025 survey of staff also flags that only 44 per cent of further education institutions, and 37 per cent of higher education ones, have delivered AI training for staff – a sign that policy often precedes readiness. 

“Expanding AI in schools without care could lead to a drop in standards as pupils stop thinking hard about the tasks they complete”, he added.
