ChatGPT doesn’t do magic


ChatGPT became hugely popular during 2023. Its ability to generate content from a series of prior textual instructions, not necessarily very long ones, has caused both astonishment and alarm in education.

In the academic world (schools, secondary institutes, vocational training centers and universities) there has been concern that a generative chatbot could replace all or part of the educational work, discourage student effort, or present a false image of excellence to society and the labor market.

Many students may think that, thanks to ChatGPT (or one of its alternatives, such as Google Bard or Opera Aria), they can do their homework in no time or pass exams without any real mental effort. But it is not that simple.

Know what is being asked

The cornerstone of generative AI is prompting: preparing the instructions from which content (textual or audiovisual) is subsequently generated by the model that has been trained and integrated into the corresponding application.

These instructions can take different forms: chained reasoning (chain of thought), verification of a statement, a question-and-answer sequence that converges on a final result, or simply a set of supplementary indications or examples (observations).
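As a minimal sketch of what this looks like in practice, the snippet below composes a prompt that chains worked examples (observations) before the real question and asks explicitly for step-by-step reasoning. The exercises are invented for illustration, and the resulting text could be sent to any chat model; what matters is the structure.

```python
# Minimal sketch: composing a few-shot, chain-of-thought style prompt as plain text.
# The example exercises and the final question are hypothetical illustrations.

EXAMPLES = [
    ("If a fair die is rolled once, what is the probability of an even number?",
     "The die has 6 equally likely faces; 3 of them (2, 4, 6) are even, so P = 3/6 = 1/2."),
    ("A bag holds 4 red and 6 blue balls. What is the probability of drawing a red ball?",
     "There are 10 balls in total and 4 are red, so P = 4/10 = 2/5."),
]

def build_prompt(question: str) -> str:
    """Chain worked examples before the real question and request explicit reasoning."""
    parts = ["Solve the following probability exercises, reasoning step by step.\n"]
    for q, a in EXAMPLES:
        parts.append(f"Exercise: {q}\nReasoning and answer: {a}\n")
    parts.append(f"Exercise: {question}\nReasoning and answer:")
    return "\n".join(parts)

print(build_prompt("Two fair coins are tossed. What is the probability of exactly one head?"))
```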

That said, the user has to know what they want to obtain from the program (or rather, what they want it to generate). In fact, even after presenting the context, they still have to lay out a kind of route or sequence for the model to follow.

Channel, frame and contextualize

For example, if you want to pose a mathematical problem, you must clearly state the context and specify, with the necessary precision, the constraints that steer how the statement should be interpreted. Make clear, for instance, whether you are asking for a probabilistic analysis or for help programming a solution in a particular language.
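To make the difference concrete, here is a hypothetical contrast between an under-specified request and a properly framed one for the same question; the figures and wording are invented for illustration only.

```python
# Sketch: the same underlying question, first under-specified and then framed with
# context, constraints and an expected output. Both prompts are hypothetical.

VAGUE_PROMPT = "Solve this problem about defective parts."

FRAMED_PROMPT = (
    "Context: a factory produces parts, and on average 2% of them are defective.\n"
    "Task: treat the number of defective parts in a random sample of 50 as a "
    "probabilistic question, not a programming exercise.\n"
    "Constraints: name the discrete distribution you use and justify why it applies; "
    "compute the probability of finding at most one defective part; show each step.\n"
    "Output: a short derivation followed by the numerical result rounded to 4 decimals."
)

print(FRAMED_PROMPT)
```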

Every response must be reviewed, and not only because of possible “hallucinations”. These errors, which lead to an off-target response, can be “our fault” (a poorly framed prompt), but not always: they also arise when the model has not been sufficiently trained.

Beyond that, the answer needs to be reviewed simply because you have to understand its reasoning in order to build on it in later work.
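As an illustration of that review step, a generated result can be cross-checked against a direct computation before being reused. The “claimed” formula below is just a stand-in for whatever the chatbot might return; the check itself is the point.

```python
# Sketch: never take a generated result on faith; re-derive or re-compute it.
# Here we check a closed form the model might have produced for the sum
# 1^2 + 2^2 + ... + n^2 against a brute-force computation.

def claimed_sum_of_squares(n: int) -> int:
    # Formula as returned by the chatbot (to be verified, not trusted).
    return n * (n + 1) * (2 * n + 1) // 6

def brute_force_sum_of_squares(n: int) -> int:
    return sum(k * k for k in range(1, n + 1))

for n in range(1, 101):
    assert claimed_sum_of_squares(n) == brute_force_sum_of_squares(n), f"mismatch at n={n}"

print("The claimed formula matches a direct computation for n = 1..100.")
```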

In short, if you do not know what you are asking for, or do not understand what is being generated, it is difficult to make truly profitable and productive use of these generative tools.

Save time, but not effort

For these reasons, students may save time when writing a text or working through a mathematical proof, but they still have to exercise their minds. Let’s look at some concrete examples of use:

  1. If you are writing a scientific paper, you can ask ChatGPT for a revision, especially if it is in English, so that it adapts your text to a more formal register, closer to that of a native speaker and to the usual style of certain prestigious publications.
  2. If someone wants a summary to help them prepare for an exam by focusing on what matters most, they can ask the program about the main aspects of a topic (for example, the concept of metabolism and how its different types function).
  3. If someone wants to solve a complex practical case in marketing, they can ask questions that yield hints to guide the calculation, for example identifying a discrete probability distribution on which a posteriori calculations then have to be made (see the sketch after this list).
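The sketch below shows the kind of calculation the third example points to: a marketing case where the response to a campaign follows a discrete (Bernoulli) distribution and Bayes’ theorem gives the a posteriori probabilities. All segment names and figures are invented for illustration.

```python
# Sketch: a posteriori calculation on a discrete distribution in a marketing case.
# The segments and rates are made-up figures used purely as an example.

# Prior share of each customer segment in the mailing list.
priors = {"students": 0.30, "professionals": 0.70}

# Probability that a member of each segment responds to the campaign (likelihoods).
response_rate = {"students": 0.20, "professionals": 0.05}

# Total probability of observing a response (law of total probability).
p_response = sum(priors[s] * response_rate[s] for s in priors)

# A posteriori probability of each segment, given that a response was observed.
posteriors = {s: priors[s] * response_rate[s] / p_response for s in priors}

for segment, p in posteriors.items():
    print(f"P({segment} | response) = {p:.3f}")
# With these made-up figures, a response is more likely to come from a student
# (about 0.63) even though students are the smaller segment.
```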

For these reasons, ChatGPT, far from being a “wizard” that solves problems by itself, is an important opportunity for students to strengthen their critical spirit, their ability to cross-check information, and their analysis and research skills.

Author Bio: Angel Manuel Garcia Carmona is Associate Professor of Mathematics and Data Science at CEU San Pablo University
