
ChatGPT Electrical Usage Equivalence: Length of Time a Kitchen Oven Must Run for One Query

In a recent blog post, OpenAI CEO Sam Altman discussed the future development of artificial intelligence and the power consumption associated with ChatGPT.

OpenAI CEO Sam Altman discusses upcoming AI advancements and ChatGPT's energy consumption in a Tuesday blog post.

Servin' Up ChatGPT: A Delightful Energy Breakdown


In the post, Altman shed some light on the energy costs of the popular AI chatbot. It turns out an average query requires a surprisingly small amount of energy: approximately 0.34 watt-hours, just slightly more than what a low-energy lightbulb would burn through in a few minutes.

To put things in perspective, imagine you're baking a soufflé in your oven. If your oven chugs along at about 2,000 watts, it would take under a second (around 0.6 seconds) to burn through the same amount of energy as that single ChatGPT query. If, on the other hand, you prefer an energy-saving LED lightbulb (which typically draws around 10 watts), you'd need to leave it on for a leisurely two minutes and change (about 122 seconds) to hit the same energy mark.

And guess how much water gets used up during a query? Just a smidgen: around one-fifteenth of a teaspoon. Water consumption per query is certainly nothing to raise an eyebrow at, but the cumulative impact on our planet grows as ChatGPT's user base and query volume keep climbing. For the average user, though, the energy footprint per query amounts to a few minutes of lightbulb use and far less energy than an oven burns in the blink of an eye.

Looking ahead to the future of AI and its energy demands, Altman envisions the 2030s as an era of energy surplus. He speculates that as data center automation becomes more prevalent, the cost of artificial intelligence will approach the cost of electricity itself. Many questions and challenges remain, however, on the path to a sustainable, energy-efficient AI future.

OpenAI did not respond to Business Insider's request for comment.

By The Numbers: A Closer Look

To give you a better grasp of the figures:

  • ChatGPT Query: 0.34 watt-hours
  • Oven (example): at about 2,000 watts, an oven uses 0.34 Wh in just under one second (roughly 0.6 seconds).
  • Lightbulb (LED, example): a 10-watt LED bulb takes over two minutes (about 122 seconds) to consume 0.34 Wh.
  • Water Usage: roughly one-fifteenth of a teaspoon per query
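The oven and lightbulb figures above are simple watt-hours-to-runtime conversions. A quick sketch of the arithmetic (the 0.34 Wh figure comes from Altman's post; the 2,000 W oven and 10 W LED are the article's illustrative examples, not measurements):

```python
# Back-of-the-envelope check of the article's equivalences.
QUERY_ENERGY_WH = 0.34  # average ChatGPT query, per Altman's blog post

def seconds_at_power(energy_wh: float, power_watts: float) -> float:
    """Seconds a device drawing `power_watts` takes to use `energy_wh` of energy."""
    return energy_wh / power_watts * 3600  # Wh / W = hours; x3600 -> seconds

oven_seconds = seconds_at_power(QUERY_ENERGY_WH, 2000)  # ~0.61 s
led_seconds = seconds_at_power(QUERY_ENERGY_WH, 10)     # ~122.4 s

print(f"Oven (2,000 W): {oven_seconds:.2f} s")
print(f"LED (10 W): {led_seconds:.1f} s")
```

This matches the article's "about 0.6 seconds" and "about 122 seconds" figures.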

While a single query’s energy use is relatively minor, the cumulative impact is significant given ChatGPT’s large user base and query volume. For an individual query, though, the energy used compares to running an LED lightbulb for about two minutes, or an oven for well under a second[1][2][3].

  • The energy cost of an average ChatGPT query, approximately 0.34 watt-hours, is comparable to a few minutes of an LED lightbulb's usage, providing a clear picture of the minimal energy consumption per query.
  • Conversely, if we consider the energy consumption of an oven as an example (roughly 2,000 watts), just under a second (about 0.6 seconds) would suffice to burn up the same amount of energy as a single ChatGPT query.
  • Water consumption per query is small (roughly one-fifteenth of a teaspoon) and negligible for an individual query; the cumulative impact on our planet, however, is a matter of concern given ChatGPT's ever-growing user base and query volume.
