Chatbots as a learning tool: obstacles and opportunities

Whether it’s healthcare or industry – or something as trivial as creating personalized birthday wishes – artificial intelligence (AI) is now part of many people’s daily lives.

What about higher education? Can AI in the form of large language models (LLMs) such as the chatbot ChatGPT be used to prepare for exams? And is it allowed?

Take Germany. Most universities there are only just starting to draw up guidelines or instructions regarding AI, says Jens Tobor, project manager of the University Forum for Digitalization (HFD) at the Center for Higher Education (CHE) in Gütersloh.

They are mainly recommendations and not yet binding, he notes, especially given the many gray areas left by the recent passage of the European Union’s Artificial Intelligence Act, the first of its kind in the world.

As Jannica Budde, senior project manager of the HFD, notes: “Unlike the exams themselves, for the time being, what you like and works for you is allowed in preparing for them.” But it’s important to realize, she adds, that ChatGPT is a language model, not a knowledge model, so “you need to be aware that the information could be false.”


“Apart from data protection and copyright, there is not yet a binding legal framework that universities can adhere to,” says Tobor. However, these laws pose the biggest hurdle to using AI as a personal learning assistant, because students must first feed the model with the knowledge they need for a particular subject.

Feeding copyrighted learning materials or old exams into a chatbot is problematic, however. "It would amount to reproduction and could be illegal," notes Tobor, who says it is not yet clear whether, and to what extent, the AI companies behind the software applications further process the uploaded data.

Instead, he recommends using the chatbot as a kind of Socratic dialogue partner. For example, ChatGPT asks the student reflective questions, tailored to the individual, about a set of facts, and checks whether the student has understood them.

"The beauty of this is that the AI does not supply potentially suspect information itself, but rather promotes a more in-depth – and therefore more educational – engagement with the subject," says Tobor.

Katharina Opper, educational scientist and e-learning developer, has experience with this method. It was practiced by the ancient Greek philosopher Socrates and is about “asking questions without giving answers,” Opper writes.


She has developed a prompt that gets the AI to ask targeted questions and thus stimulate independent thinking. The person entering the prompt is asked what the topic of conversation is, and the dialogue can begin.

According to Opper, this approach is less susceptible to false information, because the chatbot gives no statements that could be accepted uncritically.

A bit of background: chatbots powered by LLMs, such as ChatGPT, are prone to "hallucinations" – generating plausible-sounding falsehoods. These arise because the models simply predict the most probable next word in a sentence, based on patterns in their extensive training data.

It is also possible to have ChatGPT play dumb, so to speak, and have the student explain the material to it. "Explaining the material also consolidates learning," says Tobor. The chatbot is assigned the role of, for example, a fellow student who has no understanding of the subject and is told what it needs to know.

Another option is to have the chatbot ask exam questions that the student must answer. “AI can do this quite well, but the amount of false information it provides about fact-based exam knowledge is still significant,” says Malte Persike, scientific director of the Center for Teaching and Learning Services at RWTH Aachen University in Germany.


“In terms of specialized content and knowledge, especially numbers and dates, I would caution everyone not to rely on AI,” he says.

The results are much better when the AI is linked to a database from which the specialized content is retrieved, but false information is still possible. "This happens either because the question is misunderstood and the correct information is not retrieved from the database, or because the information from the database is misinterpreted," Persike explains.

However, if you download documents from a digital learning space as PDF files and upload them to the AI, you will run into the aforementioned copyright restrictions – but only if you are using a commercial system. According to Persike, there are now AI tools you can install on your own laptop that run locally and don't transfer data to the internet.

“In all likelihood, this would be legally permissible,” he says. These models so far lack the quality and capacity of ChatGPT-4, “but if you want a dialogue partner, the quality of open-source alternatives is good.”

Should we use artificial intelligence as a learning tool or not? Many universities do not yet have precise regulations about this. Markus Hibbeler/dpa
