What does the chatbot "ChatGPT" have to say on its use in the educational system?
With the ongoing debate about the benefits of AI in the educational system, and
faced with the possibility of student plagiarism (without access to
plagiarism-detection software), I asked ChatGPT how it would go about detecting
whether students are using ChatGPT during written exams.
ChatGPT recognises that detection can be challenging, but it suggested some
language patterns and characteristics that might indicate its usage:
- Unusual or advanced vocabulary: ChatGPT has a wide-ranging vocabulary and can
suggest advanced or uncommon words and phrases. NB: look out for language that
is not typical for a given generation, while keeping in mind that there are
always exceptions, such as students studying literature who are exposed to
different writing styles and perspectives.
  Written assignment example: if you provided a simple prompt and received a
written assignment with complex language, it could signal that some students
sought external assistance, such as from an AI language model like ChatGPT, or
consulted other sources.
- Complex sentence structures: ChatGPT can generate complex and well-structured
sentences. If a student's writing suddenly becomes more sophisticated than
usual, it might raise suspicion.
- Inconsistencies in writing style: Since ChatGPT generates text based on
context, there might be inconsistencies in writing style if students switch
between their own writing and text generated by ChatGPT.
- Lack of personal voice or originality: ChatGPT tends to lack personal voice
and originality. If a student's writing suddenly lacks their usual voice or
seems too generic, it could be a red flag.
- Rapid improvement or sudden changes in writing quality: If a student's
writing quality suddenly improves dramatically or changes significantly in
style, it could indicate the use of external assistance.
- Unusual sources or references: If students reference sources or information
that are not typically covered in the course material or are too advanced for
their level of understanding, it could suggest they used ChatGPT for
assistance.
- Consistency in errors or inaccuracies: If students consistently make similar
errors or include inaccurate information that seems out of place, it could be a
sign of reliance on AI-generated content.
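As a purely illustrative aside (not something ChatGPT proposed): a couple of the signals above, such as vocabulary variety and sentence complexity, can be roughly quantified and compared against a student's earlier work. The sketch below uses two crude stylometric proxies, type-token ratio and average sentence length; the function name and thresholds are my own assumptions, and no such simple metric can reliably identify AI-generated text on its own.

```python
import re

def style_metrics(text):
    """Compute two crude stylometric proxies for a piece of writing:
    type-token ratio (vocabulary variety, 0..1) and average sentence
    length in words. Both are hypothetical illustrations, not a detector."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    ttr = len(set(words)) / len(words) if words else 0.0
    avg_len = len(words) / len(sentences) if sentences else 0.0
    return {"type_token_ratio": round(ttr, 3),
            "avg_sentence_length": round(avg_len, 1)}

# Compare a student's earlier writing (baseline) against a new submission;
# a large jump in both metrics would merely be a prompt for conversation,
# never proof of AI use.
baseline = style_metrics("I liked the book. It was fun. The end was sad.")
submission = style_metrics(
    "The novel's denouement, suffused with melancholy, ultimately "
    "subverts the reader's expectations while affirming its themes.")
print(baseline)
print(submission)
```

Even as an illustration, the point stands that such numbers only flag a change in style; the heuristics in the list above all require human judgement and knowledge of the individual student.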
ChatGPT agreed with me on two major points:
- It is not a substitute for critical thinking,
- It is not a substitute for personal voice or originality. As an AI language
model, it does not possess personal experiences, emotions, or individuality
the way humans do.
Final thought: While AI models can be valuable tools for learning, they are
most effective when integrated into a broader educational framework that
includes diverse learning experiences, human interaction, critical thinking,
and active engagement. 😉
Copyright © 2024 Third Culture Professionals
Photo Pexels: Olga Lioncat