Artificial Intelligence (AI) in Higher Education: Opportunities, Risks and Assessment Design
Oct 24, 2025
A short lecture on Artificial Intelligence in Higher Education. It begins with a definition, followed by the applications of AI, the risks and limitations, how assessment design has to change, and the way forward. WEBPAGE: https://academic-englishuk.com/ai-in-education/ LECTURE QUESTIONS DOWNLOAD: https://academic-englishuk.com/ai-in-education/
Video Transcript
0:04
Hello and welcome to this short lecture
0:07
on AI in higher education. Today we'll
0:10
specifically look at AI opportunities,
0:13
risks and assessment. So we'll start
0:15
with the definition followed by AI
0:18
applications, then limitations, then on
0:21
to how assessments will have to change
0:23
to become more resilient, and finally a
0:26
conclusion with final thoughts and a
0:28
question to ponder.
0:31
Just before we begin this lecture, I'd
0:33
like to highlight that this research was
0:36
created by analyzing current guidance
0:38
and evidence from the UK government
0:40
alongside policies and practice papers
0:43
from leading UK universities.
0:46
All sources will come at the end of the
0:48
lecture.
0:50
So let's begin with the definition.
0:53
Generative artificial intelligence, or
0:55
GenAI, refers to tools like ChatGPT,
1:00
Copilot, DeepSeek, Dowo, and Grammarly
1:03
that can create new content such as
1:05
texts, images, code, and simulations. In
1:08
education, these tools can help students
1:11
by providing instant feedback,
1:13
personalized study support, and
1:15
realistic practice scenarios.
1:18
However, they also present challenges.
1:22
For example, they can spread false
1:24
information, raise ethical concerns, and
1:27
reduce genuine critical thinking if used
1:29
without care or proper evaluation.
1:33
Just so we're clear from the start, AI
1:36
is an umbrella term for all forms of
1:38
artificial intelligence, while
1:40
generative AI is a specific type of AI
1:44
designed to produce new data, not just
1:46
process or recognize it. In this
1:48
lecture, I will primarily refer to
1:50
generative AI, but at times I will
1:53
simply use the term AI to mean the same
1:56
thing.
1:58
So then, how prevalent is AI in higher
2:01
education? A recent survey by the
2:04
Oxford-based Higher Education Policy Institute,
2:06
which carried out an online questionnaire
2:09
with a representative sample of 1,041
2:12
undergraduates from across UK higher
2:14
education, found that 88% of UK
2:17
university students admitted to using
2:20
generative AI tools for assessments.
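As a rough guide to the precision of a figure like this, assuming simple random sampling (a simplification of how a representative sample is actually constructed), the 95% margin of error for 88% of 1,041 respondents works out to roughly two percentage points:

$$\text{MoE} \approx 1.96\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.88 \times 0.12}{1041}} \approx 0.02$$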
2:25
So what do students use AI for at
2:27
university?
2:30
AI tools can enhance students' learning
2:33
by offering personalized support,
2:35
creative ideas, and instant feedback.
2:38
The following examples show how AI can
2:41
be used to develop understanding,
2:43
improve productivity, and make studying
2:46
more interactive and inclusive.
2:49
One, it helps with personalized learning
2:52
where AI tools adapt study pace, review
2:55
key topics, and focus on areas students
2:58
find difficult, or content exploration
3:01
to generate study notes, quizzes,
3:04
summaries, and multimedia explanations
3:06
to support understanding. There's
3:09
virtual tutoring where students can ask
3:11
AI for one-to-one explanations,
3:13
examples, or feedback when revising or
3:16
preparing assignments.
3:19
Two, it can create simulations and
3:22
scenarios to help practice real-world
3:24
situations or case studies to apply
3:26
theory to practice.
3:29
Three, it can help with assessment and
3:32
feedback, used for self-checking,
3:34
receiving formative feedback, or tracking
3:36
your learning progress.
3:39
Four, it can improve accessibility by
3:42
getting complex texts simplified,
3:44
translated, or converted to accessible
3:47
formats such as alt text or audio.
3:51
Five, it can be used for text
3:54
manipulation to summarize readings,
3:56
paraphrase key points, or reformat notes
3:59
into tables or outlines.
4:02
Six, of course, there is idea generation
4:05
to brainstorm essay topics, argument
4:08
structures, or report outlines to plan
4:10
your writing. And that would include
4:12
data analysis and visualization where it
4:16
can turn data into clear charts, graphs,
4:18
or infographics for presentations or
4:21
reports.
4:23
And finally, seven, productivity support
4:26
to organize tasks, plan study schedules,
4:29
and manage time more efficiently.
4:33
So although there are many benefits, it
4:35
is not entirely without flaws.
4:39
Let's now look at the limitations and
4:41
risks of generative AI in education.
4:45
The most serious issue is reliability
4:47
and accuracy.
4:49
AI often produces errors or misleading
4:52
outputs because it cannot reliably
4:54
distinguish truth from falsehood. In
4:57
some cases, it fabricates information
4:59
known as hallucinations, including false
5:02
references that appear credible but do
5:05
not exist. Performance can also decline
5:08
over time through a process called
5:10
algorithmic drift. This
5:13
occurs when accuracy and reliability
5:15
decrease because the system continually
5:18
repeats and recycles the same
5:20
information, reinforcing patterns and
5:22
errors rather than adapting to new or
5:25
changing data.
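To make the recycling effect more concrete, here is a minimal Python sketch under toy assumptions: a "model" that simply fits a normal distribution is retrained each generation only on samples drawn from its own previous fit. It illustrates the feedback loop described above, not how real generative systems are actually trained.

```python
# Toy sketch of algorithmic drift: each generation, the "model" is refitted to
# samples drawn from its own previous output instead of fresh real-world data.
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=0.0, scale=1.0, size=50)   # generation 0: real data

for generation in range(1, 31):
    fitted_mean, fitted_std = data.mean(), data.std()     # "train" on current data
    data = rng.normal(fitted_mean, fitted_std, size=50)   # next generation sees model output only
    if generation % 5 == 0:
        print(f"gen {generation:2d}: mean = {fitted_mean:+.2f}, std = {fitted_std:.2f}")

# Typically the spread shrinks and the mean wanders away from 0: sampling errors
# are repeated and reinforced rather than corrected by new data.
```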
5:28
Now, let's turn to the second serious
5:29
issue, which is bias and fairness.
5:33
AI systems are trained on vast data sets
5:36
that may include stereotypes, cultural
5:39
biases, or discriminatory patterns,
5:41
which are often reproduced and even
5:43
amplified in their outputs. There is a
5:46
lack of transparency in how these
5:48
systems generate responses, often
5:51
described as the black box problem, which
5:53
limits accountability and clear
5:56
insight into decision-making processes.
5:59
Generally, AI presents information in a
6:02
confident and authoritative manner which
6:05
can encourage users to accept flawed or
6:07
misleading content without applying
6:10
sufficient critical evaluation.
6:14
Now, the third point is academic
6:16
integrity.
6:18
By producing work with AI that does not
6:21
reflect their own understanding,
6:23
students risk committing plagiarism or
6:26
academic dishonesty and graduating
6:28
without genuine expertise or
6:30
professional competence. In fact, such
6:33
excessive reliance contradicts
6:35
university pedagogical principles as it
6:38
weakens critical thinking, problem
6:40
solving, and independent learning
6:42
skills.
6:44
Then of course there's data privacy and
6:46
compliance.
6:48
AI systems often process sensitive
6:51
personal or research data, raising risks
6:54
of breaches, surveillance, or misuse.
6:57
Involving third-party vendors, which are
7:00
external organizations that store,
7:02
manage, or analyze data on behalf of an
7:05
institution, introduces additional risks
7:08
of interception, unauthorized access, or
7:11
exposure outside the institution's
7:13
direct control. Many AI systems are
7:16
trained on copyrighted material without
7:19
permission, which raises questions about
7:21
intellectual property, unclear ownership
7:23
of outputs, and potential legal
7:26
disputes. Furthermore, accessibility and
7:29
equality requirements must be addressed
7:31
to ensure full compliance with the General
7:34
Data Protection Regulation (GDPR) and
7:37
Data Protection Act (DPA) obligations.
7:42
And finally, there are ethical and social
7:44
concerns.
7:46
AI outputs may conflict with
7:49
institutional values or misalign with
7:51
ethical standards. In particular,
7:54
overuse of AI, where automated systems
7:57
are relied upon more than human judgment,
7:59
can restrict academic freedom and
8:02
discourage independent perspectives.
8:04
This dependence may also reduce
8:06
creativity and undermine professional
8:09
expertise by replacing original thought
8:12
and specialist knowledge with formulaic
8:14
outputs.
8:18
So bearing all of that in mind,
8:20
educators will need to start to think
8:22
about designing AI-resilient
8:24
assessments.
8:28
AI-resilient assessment design will now
8:30
need to combine selective invigilation,
8:33
live components, contextualized and
8:36
process-based tasks, collaboration,
8:39
ethical judgment, experiential learning,
8:41
and guided AI use to promote integrity,
8:45
critical thinking, and authentic
8:47
engagement. Let's look at these in more
8:49
detail.
8:51
Number one, invigilated and on-campus
8:54
components.
8:56
This will be to incorporate supervised
8:59
or invigilated elements such as
9:01
in-person exams, vivas, or practical
9:04
demonstrations to confirm authorship and
9:07
understanding.
9:10
Number two, contextual and localized
9:12
assessment tasks.
9:16
Assignments will have to draw on
9:18
module-specific readings, seminars, or case
9:21
studies.
9:23
Also, it may require reference to local
9:25
data examples or institutional contexts
9:29
that AI systems are less able to
9:31
replicate accurately.
9:35
Number three, interlinked and developmental
9:38
assessment.
9:41
Students will be required to create
9:43
portfolios or linked assessments across
9:45
a module or program to be assessed on
9:48
coherence and sustained learning.
9:52
This could include asking students to
9:54
provide annotated drafts, research
9:56
notes, or reflective commentaries that
9:58
demonstrate process and progression.
10:03
Number four is higher-order thinking and
10:05
critical engagement.
10:09
An emphasis on skills such as
10:11
analysis, synthesis, and evaluation in
10:14
both task design and marking rubrics
10:16
will be required,
10:19
especially the use of criteria that reward
10:22
original argumentation, integration of
10:24
theory, and engagement with course
10:27
materials.
10:30
Number five, authentic and
10:32
scenario-based tasks.
10:36
This will mean framing assignments around
10:38
realistic case studies, simulations, or
10:41
applied problems that include specific
10:43
constraints, defined time limits,
10:46
limited data sets, local examples, or
10:49
scenario-based conditions.
10:52
Again, opportunities for personal
10:54
insight or experience connected to
10:56
academic evidence and theoretical
10:58
frameworks will be necessary.
11:03
Number six, incorporation of AI within
11:06
assessment.
11:09
This will become more prevalent as AI
11:12
grows and will require students to
11:14
critically evaluate AI outputs or AI
11:17
produced work.
11:19
So this now leads me into how
11:21
universities are dealing with AI at the
11:24
moment.
11:26
One new approach being introduced in
11:28
assessment and task guides is the use of
11:30
a coding system that tells students when
11:33
and how they are permitted to use AI.
11:37
Some universities are now using a
11:39
traffic light assessment system like
11:41
this one on the slide. It highlights a
11:44
clear policy on academic integrity by
11:46
defining when AI use is prohibited (red),
11:50
allowed in an assistive role (amber), or
11:53
required as part of the assessment
11:56
(green). It ensures students use AI
11:59
responsibly, supporting learning without
12:01
undermining originality, fairness or
12:04
academic standards.
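As a purely illustrative sketch (the wording and task names below are hypothetical, not taken from any specific university's policy), such a traffic-light scheme could be encoded once and reused across assessment briefs so the permitted level of AI use is stated consistently:

```python
# Hypothetical encoding of a traffic-light AI-use policy for assessment briefs.
from enum import Enum

class AIUse(Enum):
    RED = "Prohibited: no generative AI may be used at any stage of the task."
    AMBER = "Assistive only: AI may support planning or proofreading and must be acknowledged."
    GREEN = "Required: using and critically evaluating AI output is part of the task."

# Example tasks for one module (illustrative only)
assessments = {
    "Invigilated in-person exam": AIUse.RED,
    "Reflective portfolio": AIUse.AMBER,
    "Critique of an AI-generated report": AIUse.GREEN,
}

for task, policy in assessments.items():
    print(f"{task} [{policy.name}]: {policy.value}")
```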
12:07
Interestingly, traditional pen and paper
12:09
tests are making a comeback and becoming
12:12
more common as universities respond to
12:15
AI use.
12:17
The University of Liverpool emphasizes
12:19
that assessments must never replace
12:22
original thought, independent research,
12:24
and the production of original work,
12:27
recommending traditional invigilated
12:29
handwritten formats as part of a wider
12:32
strategy to safeguard authenticity and
12:35
uphold academic integrity.
12:39
And Newcastle University recommends
12:41
using assessment formats that are less
12:43
vulnerable to AI misuse, such as
12:46
invigilated in-person pen and paper
12:48
exams, to preserve academic integrity in
12:52
a world where generative AI is widely
12:54
accessible.
12:56
This slide shows a range of paper and
12:59
pen tests with the test type, what it
13:01
assesses, and how it resists AI. Have a
13:04
good look at how these tests are
13:06
assessing understanding and knowledge.
13:15
Okay, now to the final conclusion.
13:19
So to conclude, generative AI is rapidly
13:22
transforming higher education, offering
13:25
powerful opportunities to enhance
13:27
learning, personalize feedback, and
13:29
improve accessibility. Yet, as we have
13:32
seen, it also introduces significant
13:34
risks related to accuracy, ethics, and
13:38
academic integrity. Universities are
13:40
therefore rethinking how they design
13:43
assessments to protect the core values
13:45
of higher education: genuine
13:47
understanding, originality, and
13:49
fairness.
13:52
As we move forward, the challenge is not
13:55
to resist AI, but to integrate it
13:57
responsibly, ensuring that technology
14:00
supports rather than replaces human
14:02
judgment, creativity, and independent
14:05
thought.
14:07
I'll leave you with a final question to
14:09
reflect on.
14:11
Is generative AI helping us learn more
14:14
deeply or simply making it easier to
14:16
appear as though we have?
14:21
Here is the reference list for this
14:23
lecture.
14:27
Thank you and good evening.

