A Simple Act of Being Polite with an Artificial Intelligence May Be Generating Millions in Costs and Impacting Global Electric Energy Consumption, Revealing a Dilemma Between Digital Etiquette and Technological Sustainability in Times of Advanced AI.
In the digital world we live in, interaction with virtual assistants, such as ChatGPT, has become an essential part of the routine.
Whether to answer questions, search for information, or simply to chat, these tools are increasingly present in our daily lives.
However, a curious detail stands out: politeness when communicating with ChatGPT, such as saying “please” or “thank you,” may be generating a significant energy cost for OpenAI.
In a surprising revelation made by Sam Altman, CEO of OpenAI, it became clear that the use of such expressions, considered polite in many cultures, is indeed increasing energy consumption on the company’s servers.
Altman commented on this curiosity after a post on X (formerly Twitter), where a user questioned the impact these formalities might have on ChatGPT’s operational costs.
Energy Cost of Polite Words: The Impact of Digital Etiquette
Altman’s comment came in response to a post by user @tomieinlove, who asked, “How much money is OpenAI potentially losing due to energy costs generated by people saying ‘please’ and ‘thank you’ to its models?”
The CEO did not waste time and replied: “It was tens of millions of dollars well spent — you never know.”
This exchange between users and the CEO reveals an unusual aspect of using artificial intelligence technologies: the politeness of users, who, by behaving courteously, end up generating a real impact on operational costs.
Although exact numbers have not been disclosed by Altman, it is a fact that any word or expression included in a ChatGPT prompt requires data processing.
Each word adds tokens to the model’s context window, all of which must be processed when the message is sent to OpenAI’s servers.
This means that when a user types a “please” or “thank you,” the model processes those words like any other input, treating them as data that can influence the assistant’s response.
This seemingly simple behavior generates a larger computational cost than one might imagine.
Language models like ChatGPT generate text through probabilistic calculations, and predicting each token of a coherent response consumes a significant amount of computational resources.
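The overhead described above can be sketched with a toy token count. Real models use subword tokenizers (such as byte-pair encoding), so the whitespace split below is only a rough stand-in, and the example prompts are invented for illustration:

```python
# Rough sketch: how many extra tokens polite phrasing adds to a prompt.
# NOTE: whitespace splitting is a simplification; production models use
# subword tokenizers, so real counts differ somewhat.

def count_tokens(prompt: str) -> int:
    """Approximate token count by splitting on whitespace."""
    return len(prompt.split())

terse = "Summarize this article in three bullet points."
polite = "Hello! Could you please summarize this article in three bullet points? Thank you!"

extra = count_tokens(polite) - count_tokens(terse)
print(f"terse: {count_tokens(terse)}, polite: {count_tokens(polite)}, overhead: {extra}")
```

Every one of those extra tokens occupies context-window space and must pass through the model, which is where the marginal cost of courtesy comes from.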
Energy Consumption in Processing Simple Interactions
The analysis of the energy impact generated by polite interactions on ChatGPT has become a topic of interest among researchers and users.
According to a 2024 survey conducted by the publisher Future, 67% of users in the United States habitually interact politely with chatbots.
About 55% of respondents said they adopt this behavior because they consider it “the right thing to do,” while 12% believe that politeness can help avoid reprisals in a potential machine uprising, a concept that emerged with the rise of AI technologies.
But how much energy does this actually consume?
To put the energy impact of a simple message in perspective, Future highlighted that a single AI-generated email consumes about 0.14 kWh of energy, enough to keep 14 LED light bulbs lit for one hour.
Although this amount seems modest compared to other types of energy-consuming operations, such as generating complex images with AI or running models that require substantial amounts of data, it still indicates how even the simplest interactions can contribute to high energy consumption.
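The bulb equivalence quoted above can be sanity-checked with simple arithmetic. The article does not state the bulb wattage, so a 10 W LED (a common rating) is assumed here:

```python
# Sanity check of the "14 LED bulbs for one hour" comparison.
# Assumption: a 10 W LED bulb, which the article does not specify.

EMAIL_KWH = 0.14        # energy attributed to one AI-generated email
LED_BULB_W = 10         # assumed LED bulb power draw, in watts

kwh_per_bulb_hour = LED_BULB_W / 1000        # 0.01 kWh per bulb per hour
bulbs_for_one_hour = EMAIL_KWH / kwh_per_bulb_hour

print(f"{bulbs_for_one_hour:.0f} bulbs lit for one hour")
```

With that assumed wattage the numbers line up: 0.14 kWh divided across 0.01 kWh per bulb-hour yields exactly 14 bulbs.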
An Astronomical Amount of Energy in Simple Interactions
Now, imagine the energy consumption to process the politeness of over 400 million ChatGPT users.
These numbers are impressive and reveal how an apparently trivial interaction can result in a high cost.
The question is: to what extent does this digital politeness, which may seem simple and unimportant, become an additional burden for the companies that develop these technologies?
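The scale question above can be made concrete with a back-of-envelope estimate. Every figure below beyond the article's 0.14 kWh-per-email and 400-million-user numbers is an illustrative assumption, not a measured value:

```python
# Back-of-envelope sketch, NOT official figures. The per-prompt overhead
# and prompt frequency are assumptions layered on the article's estimates.

USERS = 400_000_000             # ChatGPT users cited in the article
POLITE_SHARE = 0.67             # share who interact politely (Future survey)
PROMPTS_PER_USER_PER_DAY = 1    # assumed: one polite prompt per user per day
KWH_PER_POLITE_PROMPT = 0.14 * 0.05  # assumed: courtesy adds ~5% of an email's energy

daily_kwh = USERS * POLITE_SHARE * PROMPTS_PER_USER_PER_DAY * KWH_PER_POLITE_PROMPT
print(f"~{daily_kwh:,.0f} kWh per day under these assumptions")
```

Even under these deliberately modest assumptions the total lands in the millions of kilowatt-hours per day, which illustrates how trivial per-message costs compound at this user base.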
To better understand this point, it is necessary to think about the operational costs of ChatGPT.
OpenAI’s artificial intelligence requires vast computational resources, and the amount of data processed with each interaction is not just a small number.
Processing words and phrases requires executing complex calculations, allocating data, and manipulating information in real-time.
Therefore, every word the user types is, in effect, part of a large-scale operation that consumes a significant amount of energy.
The Hidden Costs of Polite Interactions with AI
Contrary to what many might imagine, it is not just a matter of “word economy” or “process optimization.”
Digital politeness, manifested through expressions like “please” and “thank you,” adds an extra layer to processing, making it more complex and, consequently, more costly.
Although AI has been designed to interact smoothly and in a human-like way, it must also process this courteous phrasing, which requires additional computational resources.
This impact may seem small in each individual interaction, but when we add up the billions of daily interactions, the cumulative effect results in significant energy costs, something that is being carefully monitored by OpenAI.
It is worth noting that, while the company benefits from a robust business model and is constantly seeking innovation, this type of cost inevitably generates debates about the sustainability and environmental costs of AI-based technologies.
The Sustainability of AI Technologies
With the increasing use of AI, environmental issues have started to be raised more intensely.
The energy consumption to process responses, images, or other types of interactions generated by AI models is immense.
This raises a crucial point: how can companies balance the need to provide fast and accurate responses to users while managing the environmental costs of that operation?
Researchers and engineers at OpenAI are aware of the impact these technologies can have on the environment.
The company has been investing in ways to optimize its systems and reduce energy consumption without compromising service quality.
But it is important to remember that as the demand for AI grows, energy consumption also increases, becoming a challenge for companies that wish to minimize their carbon footprint.
How to Interact Efficiently with ChatGPT?
In light of this scenario, an interesting reflection arises: should we continue to interact with virtual assistants as we do with people, using courteous expressions that generate additional costs?
Or would it be more efficient, for the environment and for OpenAI, to adopt a more direct and straightforward approach when interacting with these technologies?
These are questions that may seem small in daily life, but when analyzed more deeply, reveal the impact of our digital interactions on the real world.
When we reflect on these issues, we realize that even the simplest attitudes, like being polite to a chatbot, can have unforeseen and significant consequences.