Ancestors

Toot

Written by Wim🧮 on 2025-01-23 at 16:46

I wrote a new article:

"The real problem with the AI hype"

https://wimvanderbauwhede.codeberg.page/articles/the-real-problem-with-AI/

tl;dr: even if the AI hype falls flat, it will have caused emissions to go up considerably.

The article has the best estimate I have been able to arrive at for the growth in global emissions from AI data centres.

[#]FrugalComputing

=> More information about this toot | More toots from wim_v12e@scholar.social

Descendants

Written by Niall Winters on 2025-01-23 at 17:16

@wim_v12e Thanks @janet for sharing. Super insightful - appreciate the depth of the research. I also found this article interesting: https://about.bnef.com/blog/liebreich-generative-ai-the-power-and-the-glory/

=> More information about this toot | More toots from nwin@mastodon.ie

Written by Wim🧮 on 2025-01-23 at 18:05

@nwin Thanks for the link. I think the main difference with the previous hypes is that there is a lot of government buy-in. I think that as a result it might take a bit longer to die out. But I would be really surprised if we saw 20% growth year on year for more than 5 years (that would already be bad enough).

@janet

=> More information about this toot | More toots from wim_v12e@scholar.social
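
A minimal sketch of the compounding implied by that 20% year-on-year figure (the baseline of 1.0 below is an arbitrary placeholder, not a number taken from the article): five years of 20% growth multiplies the starting value by roughly 2.5.

```python
# Illustrative sketch only: compound growth at 20% year on year.
# The baseline of 1.0 is an arbitrary placeholder, not a figure from the article.

def compound(baseline: float, rate: float, years: int) -> float:
    """Value after compounding `rate` growth for `years` years."""
    return baseline * (1 + rate) ** years

for year in range(1, 6):
    print(f"year {year}: {compound(1.0, 0.20, year):.2f}x baseline")
# Year 5 prints about 2.49x, i.e. roughly 2.5 times the starting emissions
# if 20% year-on-year growth held for five years.
```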

Written by Niall Winters on 2025-01-23 at 19:17

@wim_v12e Agreed. Even though Trump's $500bn AI announcement had shades of Boris's hospitals about it, it nevertheless illustrates your point nicely.

@janet

=> More information about this toot | More toots from nwin@mastodon.ie

Written by Alex@rtnVFRmedia Suffolk UK on 2025-01-23 at 20:28

@nwin @wim_v12e Whilst I think a lot of the generative AI side is indeed hype, there is unfortunately a lot of potential for automating relatively mundane surveillance functions (processing ANPR and CCTV footage, monitoring social media, etc.). The data centres will get used more and more for this (especially for scraping large datasets to process offline), and the LLM/generative stuff can be used as a smokescreen.

=> More information about this toot | More toots from vfrmedia@social.tchncs.de

Written by Wim🧮 on 2025-01-23 at 20:29

@vfrmedia @nwin That's true. LLMs are being used to push the notion of AI, but this will undoubtedly lead to machine learning in general becoming even more widely used.

=> More information about this toot | More toots from wim_v12e@scholar.social

Written by Alex@rtnVFRmedia Suffolk UK on 2025-01-23 at 20:34

@wim_v12e @nwin LLMs seem to be "bread and circuses" at the moment (i.e. letting gammons on Facebook make fake pictures of an idealised life in the 1970s). Maybe governments will use them in public services to triage customer service queues rather than hire real humans, but setups like Starmer's "pothole robot" are dual use (police in England are already evaluating combined AI/ANPR cameras to check for lane deviations that suggest a driver is less than sober, but which could also show drivers dodging potholes).

=> More information about this toot | More toots from vfrmedia@social.tchncs.de

Written by Christian on 2025-01-23 at 20:17

@wim_v12e

Thanks. Those were very insightful calculations. Coarse, but with the advantage that it's possible to follow them through.

And it's very nice to have the numbers in context! Very nice read indeed.

=> More information about this toot | More toots from christian_zerfass@fediscience.org

Written by Wim🧮 on 2025-01-23 at 20:27

@christian_zerfass

Thanks, that's why I did the rough estimate as well. The better model is too complicated to explain in a blog post. I'm in the process of writing a paper about it.

=> More information about this toot | More toots from wim_v12e@scholar.social

Written by Christian on 2025-01-23 at 20:47

@wim_v12e

Excellent. Good luck with the paper!

=> More information about this toot | More toots from christian_zerfass@fediscience.org

Proxy Information
Original URL
gemini://mastogem.picasoft.net/thread/113878682171339307