
Why you don’t need to say “thank you” and “please” to ChatGPT

Sam Altman points to the high computational cost of courtesy words such as “please” and “thanks” in interactions with AI. (Reuters/Dado Ruvic/Illustration/File Photo)

Courtesy is an inherent virtue of human interaction. Saying “please” and “thanks” is a habit, but with artificial intelligence, as in the case of ChatGPT, it may not be the best approach.

According to several studies and statements from experts, including OpenAI CEO Sam Altman, using these two expressions carries a considerably high cost in terms of energy consumption, and, although in some cases courtesy can improve AI responses, there are reasons to reconsider whether it is worth it.

ChatGPT, like many other artificial intelligence models, works through a complex process involving thousands of computational operations to generate responses to user queries. Every time a request is made, the AI must process the input in its entirety, dividing it into tokens (smaller units of text), which increases the time and resources required, especially when interactions grow longer due to courtesy.
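The token effect described above can be illustrated with a minimal sketch. Note the assumption: real models use subword (BPE) tokenizers such as OpenAI's tiktoken, so a naive whitespace split is only a stand-in, but it already shows how courtesy words lengthen the input the model must process.

```python
# Illustrative only: production models use BPE tokenizers, not whitespace
# splitting. This stand-in shows the principle: more words, more tokens.

def count_tokens(prompt: str) -> int:
    """Rough stand-in for a tokenizer: one token per whitespace-separated word."""
    return len(prompt.split())

plain = "Summarize this article"
polite = "Please summarize this article, thanks"

print(count_tokens(plain))   # 3
print(count_tokens(polite))  # 5
```

With a real BPE tokenizer the exact counts differ, but the direction is the same: every extra courtesy word adds tokens, and the model's compute cost grows with the token count.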

Sam Altman, CEO of OpenAI, addressed this issue in a conversation with a user on X. The user asked about the costs of polite interactions with language models, to which Altman responded with humor, estimating that OpenAI has spent “tens of millions of dollars well spent” on the extra words users include out of courtesy.

Each query to ChatGPT uses an average of 0.3 watt-hours, highlighting the scale of its energy impact. (Reuters/Dado Ruvic/Illustration)

Although the CEO’s response was made in jest, the reality is that interactions containing words such as “please” or “thanks” require more computational processing, which translates into higher energy consumption and, therefore, greater costs for the company.

This phenomenon is a direct result of how the ChatGPT language model works. If additional words such as those mentioned above are included in a query, the number of tokens the AI must process increases, which requires more computation time, greater use of graphics processing units (GPUs) and, ultimately, a heavier load on the servers that run the system.

This additional consumption of resources may seem insignificant in a single case, but when these interactions are multiplied across millions of daily users, the costs skyrocket.

The energy consumption of AI is not limited to monetary costs. It also has a considerable ecological footprint, because the data centers that power models such as ChatGPT require enormous amounts of electricity.

AI data centers will consume up to 945 TWh of electricity in 2030, according to the International Energy Agency. (Infobae Illustrative Image)

According to estimates from the specialized outlet Tom’s Hardware, each average ChatGPT query consumes approximately 0.3 watt-hours (Wh) of electricity. Although this is a tenth of what was previously estimated, it remains a considerable figure once the scale of the platform’s user base is taken into account.
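To see why a figure as small as 0.3 Wh matters at scale, a back-of-the-envelope calculation helps. The daily query volume below is a hypothetical assumption for illustration, not a figure from the article:

```python
# Back-of-the-envelope scaling of the 0.3 Wh per-query estimate cited above.
# QUERIES_PER_DAY is a hypothetical assumption, not a confirmed figure.

WH_PER_QUERY = 0.3               # watt-hours per average query (Tom's Hardware estimate)
QUERIES_PER_DAY = 1_000_000_000  # hypothetical: one billion queries per day

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_mwh = daily_wh / 1_000_000      # 1 MWh = 1,000,000 Wh
yearly_gwh = daily_mwh * 365 / 1_000  # 1 GWh = 1,000 MWh

print(f"{daily_mwh:,.0f} MWh per day")    # 300 MWh per day
print(f"{yearly_gwh:,.1f} GWh per year")  # 109.5 GWh per year
```

Under that assumed volume, a figure that is negligible per query adds up to hundreds of megawatt-hours per day, which is why extra tokens from courtesy words register at the company level.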

The concern over the environmental impact of AI is not new. The International Energy Agency (IEA) projects that by 2030, data centers will consume about 945 terawatt-hours (TWh) of electricity, a significant increase over the current 415 TWh.

In addition, AI models demand electricity not only to run, but also to cool the servers and prevent overheating, which implies additional water consumption. A report by the University of California revealed that even a response of two to three words from an AI can consume between 40 and 50 milliliters of water in the computational process.

Despite the high energy cost, some research suggests that being polite when interacting with AI could, in certain cases, improve the quality of the responses generated. A study conducted by Cornell University in the United States evaluated how interactions with AI varied depending on whether courteous or impolite language was used.

Formality in interactions with AI reflects cultural differences, being more effective in languages such as Japanese. (Infobae Illustrative Image)

The results showed that, as in human interactions, polite language usually generates more effective and satisfactory responses, while rude prompts can degrade the model’s performance.

However, the study also pointed out that not all polite language is equally effective. In certain contexts, excessive courtesy may not be beneficial. In particular, language models seem to respond better to a moderate level of courtesy in languages such as English, while in other languages, such as Japanese, a high degree of formality seems to be more favorable.

This finding highlights that AI models are not only influenced by human norms, but also reflect the linguistic and cultural peculiarities they acquire through their training data.

The study concluded that, although polite language can improve interaction with AI, the effect is not always consistent and depends largely on the cultural context and the level of courtesy. What is clear is that polite interactions, such as those that include “please” and “thanks”, can influence how the model responds, but do not guarantee a significant improvement in the quality of the answers.
