@NatureMC @kate I'm not sure what you mean by that. Training GPT4 by most estimates took ~50GWh, but that's 'done': running a query does not trigger any more training on the model you're using. I.e. for every query you run -now- on an already-developed model, only the current marginal power use counts. Which for GPT4 is roughly 3-4Wh per query (https://balkangreenenergynews.com/chatgpt-consumes-enough-power-in-one-year-to-charge-over-three-million-electric-cars/ i.e. 278GWh / ~78 billion queries ≈ 3.6Wh per query).
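For anyone who wants to redo the division: a minimal sketch of the arithmetic behind that per-query figure, using the estimates from the linked article (the variable names are my own, not from the article).

```python
# Back-of-the-envelope check of the marginal energy per ChatGPT query.
# Inputs are the linked article's estimates: ~278 GWh of electricity per year
# for serving queries, and ~78 billion queries per year.

ANNUAL_INFERENCE_GWH = 278        # estimated yearly electricity use for inference (GWh)
ANNUAL_QUERIES_BILLIONS = 78      # estimated yearly number of queries (billions)

# Convert GWh -> Wh (x1e9) and billions of queries -> queries (x1e9), then divide.
wh_per_query = (ANNUAL_INFERENCE_GWH * 1e9) / (ANNUAL_QUERIES_BILLIONS * 1e9)

print(f"~{wh_per_query:.1f} Wh per query")  # prints ~3.6 Wh, i.e. the "~3Wh" ballpark above
```

Note this only covers the marginal inference cost; the ~50GWh training cost is a one-off that isn't re-incurred per query.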