Toot

Written by Bas Schouten on 2024-12-12 at 19:55

@NatureMC @kate I'm not sure what you mean by that. Training GPT-4 by most estimates took ~50 GWh, but that cost is 'done': running a query does not trigger any further training of the model you're using. I.e. for every query you do -now- on an already developed model, you only count the current marginal power use, which for GPT-4 is ~3.6 Wh (https://balkangreenenergynews.com/chatgpt-consumes-enough-power-in-one-year-to-charge-over-three-million-electric-cars/ i.e. 278 GWh / 78 billion queries ≈ 3.6 Wh/query).
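
A minimal sketch of that arithmetic in Python, using only the figures quoted above (278 GWh/year of inference energy, 78 billion queries/year, ~50 GWh one-off training cost; the amortized line is an added illustration, not part of the original claim):

```python
# Back-of-envelope check of the per-query energy figures from the toot.
# All inputs are the estimates quoted in the toot, not measured values.

ANNUAL_INFERENCE_WH = 278e9  # ~278 GWh of inference energy per year, in Wh
ANNUAL_QUERIES = 78e9        # ~78 billion queries per year
TRAINING_WH = 50e9           # ~50 GWh one-off training cost, in Wh

# Marginal cost: what one extra query adds, ignoring the sunk training cost.
marginal = ANNUAL_INFERENCE_WH / ANNUAL_QUERIES

# Amortized cost: spreading the one-off training energy over a single
# year of queries, to show how little it shifts the per-query figure.
amortized = (ANNUAL_INFERENCE_WH + TRAINING_WH) / ANNUAL_QUERIES

print(f"marginal:  {marginal:.1f} Wh/query")   # -> 3.6 Wh/query
print(f"amortized: {amortized:.1f} Wh/query")  # -> 4.2 Wh/query
```

Even charging all of the training energy against a single year of traffic only moves the figure from ~3.6 to ~4.2 Wh/query, which supports the point that the marginal cost dominates.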

=> More information about this toot | View the thread | More toots from Schouten_B@mastodon.social

Mentions

=> View NatureMC@mastodon.online profile | View kate@fosstodon.org profile

Proxy Information
Original URL: gemini://mastogem.picasoft.net/toot/113641608780352902
