Following a series of Freedom of Information (FOI) requests, Roar can exclusively reveal that King’s College London has spent £35,013 on Microsoft Copilot licenses since September 2024.
King’s purchased 80 licenses for Copilot, Microsoft’s flagship generative AI chatbot, as part of its Enrolment for Education Solutions renewal in September 2024, at a cost of £28,396. A further 13 licenses have been purchased on an ad-hoc basis since then for £6,617.
Ten Students Expelled for AI Misuse since September 2022
Roar can also reveal that, since September 2022, ten students have been expelled from King’s in disciplinary cases where AI misuse was cited as a major factor.
Since 2022, AI has been mentioned in 55 students’ cases that have reached the misconduct committee. Of those, 42 students faced some form of disciplinary action.
In the 2024/25 academic year alone, 32 students were investigated, 17 disciplined and five expelled for AI-related reasons.
In comparison, UCL disclosed to Roar in an FOI that fewer than five students were excluded last year, while the London School of Economics (LSE) saw 40 students investigated in 2024/25, withholding further information due to privacy concerns.
King’s Students Hold Mixed Views on AI
Students surveyed by Roar offered different experiences of how AI has affected their academic life.
The most common usage of AI among students was to summarise readings, with 60% of students surveyed by Roar acknowledging using AI for this purpose. Some told Roar they used AI as they felt they did not have enough time to complete the readings independently.
Similarly, 57% of those surveyed admitted to using AI to generate ideas and help with planning for assignments. One student told Roar, “The fact that AI exists makes it so difficult not to use it – when it seems like an essay is gonna take ages, or you feel stuck, it’s too tempting.”
However, fewer students (32%) said they used AI to help write essays. One student told Roar they feel there is a “concerning” over-reliance on AI among current students.
“I think King’s probably doesn’t realise the extent to which students are using it. Some people on my master’s course literally can’t put a thought together.”
Some students also noted AI usage goes deeper than assisting university work.
“I’ve met people who say they use ChatGPT for advice if they feel unwell, to answer questions for their seminar work, to summarise current events and happenings.”
King’s AI Policy
Currently, King’s policy on AI states that generative AI may only be used by students at the module convenor’s discretion, with limited exceptions. Copying and pasting AI-generated content into assignments is forbidden, and any use of generative AI must be acknowledged.
In spite of this, the College’s Academic Board has accepted that it is “clear that students would use it regardless”, with some noting that for academics its use continues to require a “mindset shift”.
Twenty surveyed students told Roar that they, or someone they knew, had been the subject of an investigation for academic misconduct involving AI use. Multiple students argued King’s should adopt a harsher anti-AI stance, citing environmental concerns and the need to ensure students remain academically capable.
A King’s College London spokesperson said:
“AI tools are increasingly integrated into everyday life and we support students in developing their AI literacy. Our guidelines clearly set out where AI can be used in learning, however when it comes to assessments, it is essential submitted work is a genuine representation of each student’s own work, skills and subject knowledge to maintain academic rigour and integrity.”
KCLSU Concerns
The findings come amid growing concern about AI usage within King’s. This month KCLSU began drafting an ‘AI Manifesto’ to clarify the position of the Union and the University on the use of AI in assignments.
The Union claims that many students are being wrongly accused of using AI in their assignments and that in many cases the existing policy is unclear.
Universities continue to struggle to identify AI-generated content in coursework submissions. The higher education ombudsman reminded institutions in July to be aware of the “limitations” of AI detection software like Turnitin, after a series of complaints were upheld involving overzealous adherence to the software’s analyses.
Some universities have rowed back on institutional subscriptions to AI software. In the Netherlands, SurfNET, which provides IT services to academic institutions, recommended that institutions do not use Microsoft Copilot due to “privacy risks” and a “lack of transparency from Microsoft”.
For guidance on permissible use of AI, visit here.
Grace Holloway is Roar's editor-in-chief managing the editorial side of our operation. She has gained valuable experience from Bloomberg as well as writing for Breaking Media, the Non-League Paper and Politics UK.
Kaveh Kordestani is a staff writer for Roar.

