
The ChatGPT cyclone and the school: 5 key points

How will ChatGPT affect students and teachers? Challenges and opportunities of artificial intelligence in schools. An analysis by Professor Enrico Nardelli of the University of Rome Tor Vergata, director of the National Laboratory "Informatics and School" of CINI and former president of Informatics Europe

In recent months there has been enormous turmoil in the world of education, especially among teachers, caused by generative artificial intelligence tools (which I will call IAG for brevity in what follows), the best known of which is ChatGPT. Why is this happening?

To begin with, IAG tools are objectively capable of performances (in producing texts, but also images and many other cognitive products typically created by humans) that in many cases are indistinguishable from those of people. As a result, what in Stefano Quintarelli's apt definition are merely SALAMIs (Systematic Approaches to Learning Algorithms and Machine Inferences) are commonly, and mistakenly, perceived as truly intelligent entities, owing to the projection process typical of human beings, who see meaning everywhere.

Then there are the enormous commercial interests at stake: billions of euros (or dollars) invested every year in an attempt to grab market shares projected to be worth at least a thousand times as much. These interests incessantly push into the media scenarios that range from science fiction (we will solve all of humanity's problems) to the apocalyptic (we will all lose our jobs), but that always aim at building a sense of inevitability: there is no alternative! In particular, since the best way to convince someone to use a product is to get them used to it from an early age, it is on the school sector that the greatest pressure is concentrated.

Furthermore, IAG tools are now integrated into the technological appendage we all carry, the smartphone, and therefore, like it or not, everyone ends up using them. "It is inevitable!", we are continually told. The term "innovative" is then used as a passe-partout to convince non-experts, while neglecting to consider what the real long-term consequences may be. Let us remember how, over the last twenty years, minors, even very young ones, have been given digital devices to use without any real control, only for us to realize later that, as documented in the June 2021 final report of the fact-finding investigation "On the impact of digital technology on students, with particular regard to learning processes" carried out by the "Public Education and Cultural Heritage" Commission of the Italian Senate, this has created problems, and not insignificant ones.

Now, limiting ourselves to textual IAG tools, i.e. those that produce texts in response to a request (e.g. "What are the salient facts of the Punic Wars?", "Write a 20-line summary of 'The Betrothed'", or "Describe the process of pollination in plants"), the immediate consequence for schools is that one of the most traditional, everyday ways of verifying skills, namely assigning homework, completely loses its effectiveness. It is obvious that, unless a prohibitionist climate is imposed (which in any case would only be of use for a short period), all schoolchildren would do their homework using IAG.

However, the answer is not to invent improbable ways of making students use these tools anyway; I will discuss five critical points of such an approach below. The answer is to give greater value to orality and to face-to-face relationships: less homework and more work in class, which means smaller classes and more teachers. Of course, this requires money, but the future of a country depends on its education. Among other things, an approach of this kind restores value to the relational component of the bond between teacher and student. We know well, at least since the analysis made by Plato in his Dialogues, that the emotional component of the educational relationship between didàskalos and mathetés, teacher and pupil, is a fundamental aspect of paideia, the ethical and spiritual growth of the disciple. This component can be enriched by technology, if used appropriately, but never, ever replaced, on pain of impoverishing and destroying our humanity.

However, some say: "let us turn the availability of IAG into an opportunity for better learning by students; let us let them experiment with technologies they will end up using when they grow up." There are some very important critical points in this approach that are worth examining.

1) These are tools still under development, which often produce answers that seem correct but are inaccurate. Only if we know well the topic on which the IAG has produced a text can we recognize what is wrong. Since at school, especially at the youngest ages, the acquisition of knowledge is one of the fundamental educational goals, it is clearly not sensible to risk pupils learning incorrect knowledge (or knowledge tainted by prejudice and stereotypes).

2) They are tools controlled by the usual Big Tech companies, with a complete lack of oversight of how they have been developed and how they work, what data has been used for their training, and what safety tests have been conducted. For every potentially harmful technology, regulations have been introduced; in this case the European Union is trying with the so-called "AI Act", for which a political agreement was recently announced, though the details are not yet known. In the meantime, however, their use is being encouraged as much as possible, regardless of the possible risks. A first problem exploded in the last days of 2023, with the news that the New York Times, one of the best-known and most reputable newspapers in the world, sued OpenAI and Microsoft for copyright infringement, because they allegedly trained their IAG tools on its articles without permission. Other cases are expected soon concerning visual IAG tools, i.e. those capable of producing images and videos, since several users on X (formerly Twitter) have shown that they can produce copyright-protected images even when the user formulates the request in generic terms.

3) Using tools that are still in development means working for free for those who are developing them and who may one day sell them back to us. In other words, we are repeating the mistake we made when we gave ourselves over, without much thought, to intrusive and abusive social platforms, which have collected enormous amounts of data about us and use it for commercial purposes. What do we get in return? When it is said that we must teach teachers how to ask the IAG questions to get the help they may need (so-called prompt engineering), I shudder: do we really want to turn them into a free workforce at Big Tech's disposal?

4) Even if IAG tools always gave correct answers (and this is not the case), their use at the lower levels of school would not only expose our children to the risk of acquiring incorrect knowledge, but could also undermine their chances of cognitive growth, since in this way those capacities are not exercised. Summarizing texts, arguing a position, and presenting a point of view are fundamental skills for any citizen; if we do not have children practice them at school, they will never acquire them. It is a bit like what happens if we travel only by car and never on foot or by bicycle: our physical abilities weaken. Moreover, whereas studying from books trains one in the plurality of points of view and of ways of presenting a given topic, with very few sources of knowledge the risk of social indoctrination, especially in the humanities, is very strong.

5) The contrast between the constant invitations to use these tools and the lack of any serious ethical evaluation of involving minors in the use of still-experimental technology seems incredible to me. How is it that any experimentation involving children rightly requires ethical approval, yet in this case nothing is required? For a recent research project in primary schools comparing two methods of teaching one of the fundamental constructs of computer science, my colleagues and I had to obtain the approval of an ethics committee, whereas here I see exhortations for minors to use IAG without any reflection on these aspects, or on those relating to privacy. Can you imagine if, in the early 1900s, the Wright brothers had started letting people fly in their airplanes while they were still developing them?

They say: but there is no alternative, IAG is here and is part of our lives. It is true that it is already part of our lives, but there is no absolute need to use it in classrooms. That does not mean we should not talk about it or should pretend it does not exist. Collective use in the classroom, under the teacher's control and guidance and accompanied by critical discussion, at the level the children's age allows, provides useful awareness of a technology they will encounter anyway. But in the current state of the technology we really do not need students to use such tools regularly. For teachers, some uses are possible, as I described in a recent article, but extreme caution is required, given the still experimental nature of these tools and the dangers associated with the dissemination of minors' personal data. In this regard, I would like to know whether the Italian Data Protection Authority has assessed whether urging teachers to use these tools to improve their teaching, personalizing it for the needs of individual students, could jeopardize the latter's privacy.

Let me add, to be even clearer: in a context in which the scientific basis of computer science is taught in schools, teaching the principles of artificial intelligence, which is a very important branch of computer science itself, can certainly also find its place. However, teaching what machine learning is, i.e. the fundamental technique behind IAG, to those who do not know what an automaton or an algorithm is, is like trying to explain trigonometry to someone who knows only the four arithmetic operations. One can certainly say that trigonometry allows you to measure the distance between two trees on the other side of a river without crossing it, but in the absence of an adequate mathematical foundation this remains mere popularization. That is certainly useful for citizens who have little time to study and who still need to keep up with scientific and technological advances, but school must provide the scientific basis for understanding the world around us, both the natural world and the artificially constructed one, which to an ever greater extent is digital.

Rethinking the entire school curriculum in light of the transformation that has now taken place from an industrial society to a digital society is an indispensable step, and one far more important than chasing technological trends by training students and teachers in the use of IAG tools.

(Interested readers can engage in a dialogue with the author, starting from the third day after publication, on this interdisciplinary blog.)


This is a machine translation from the Italian of a post published on Start Magazine at the URL https://www.startmag.it/innovazione/il-ciclone-chatgpt-e-la-scuola-5-punti-chiave/ on Thu, 04 Jan 2024 06:21:17 +0000.