I wrote a new article:
"The real problem with the AI hype"
https://wimvanderbauwhede.codeberg.page/articles/the-real-problem-with-AI/
tl;dr: even if the AI hype falls flat, it will have caused emissions to go up considerably.
The article has the best estimate I have been able to arrive at for the growth in global emissions from AI data centres.
[#]FrugalComputing
Coda: the figures in the above thread are quite handwavy. I did a more nuanced analysis and will write up an article about it. The results are a little less dramatic but the overall conclusion remains the same: the AI hype is resulting in increased emissions even if it does not deliver on its promises. And this at a critical time when emissions must come down urgently and drastically.
[#]FrugalComputing
(*) For example, SegNet, a network shown to have 99% accuracy for colon cancer detection, is 7.6M parameters (https://www.sciencedirect.com/science/article/pii/S0010482521005242). An LLM with 7B parameters (1000x more than that) can run easily on a laptop (https://www.hardware-corner.net/llm-database/LLaMA/).
Compared to GPT-4 or Gemini, these are tiny models that do not require huge amounts of compute, memory or storage. Where GPT-4 is more than a trillion parameters (10^12), a typical CNN for medical applications (*) is of the order of tens of millions of parameters (10^7). So it needs a hundred thousand times fewer resources. And many of the ML techniques that do not use neural networks, like SVM or random forest, are even smaller than that. (2/2)
[#]FrugalComputing
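The orders of magnitude in the toot above can be checked with a quick sketch (the parameter counts are the ones quoted in the thread; the variable names are mine):

```python
# Parameter counts quoted in the thread (rough orders of magnitude).
gpt4_params = 1e12        # GPT-4: more than a trillion parameters
medical_cnn_params = 1e7  # typical medical-imaging CNN: tens of millions
segnet_params = 7.6e6     # SegNet, 99% accuracy for colon cancer detection
llm_7b_params = 7e9       # a 7B-parameter LLM that runs on a laptop

# GPT-4 vs a medical CNN: five orders of magnitude.
print(gpt4_params / medical_cnn_params)   # 100,000x

# A laptop-sized 7B LLM vs SegNet: roughly three orders of magnitude.
print(llm_7b_params / segnet_params)      # ~1000x
```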
The UK's "AI Opportunities Action Plan" (https://www.gov.uk/government/publications/ai-opportunities-action-plan/ai-opportunities-action-plan) perpetuates the fallacy that we should embrace the "AI revolution" and its huge increase in demand for compute to make society better through e.g. better diagnostic tools in healthcare, smarter electricity grids, etc.
The growth in "AI" is entirely driven by generative AI based on Large Language Models. It has little or nothing to do with the Machine Learning approaches that already help with so many aspects of society. (1/2)
"2. Change lives by embracing AI"
Some key examples are given:
"Using AI assistants" !
"Drafting structured reports and forms with AI" !
"28. Require all regulators to publish annually how they have enabled innovation and growth driven by AI in their sector."
"25. Commit to funding regulators to scale up their AI capabilities, some of which need urgent addressing. Government should also ensure all sponsor departments demonstrate how they are funding this capability within their budgets through the Spending Review process."
So what this says is, "we won't give you more money for research but you have to fund more AI research."
To me, as an academic, "15. Support Higher Education Institutions to increase the numbers of AI graduates and teach industry-relevant skills." sounds like bad news too. "Support" could so easily turn into "pressurise" by making funding contingent on it.
Reading further through this 50-point plan, the whole thing about unlocking data assets and a copyright-cleared British media asset training data set sounds rather ominous.
Ah, they do mention sustainability, once:
"5. Mitigate the sustainability and security risks of AI infrastructure, while positioning the UK to take advantage of opportunities to provide solutions"
It's not clear whether this refers to environmental sustainability or not, but we can hope. It's again silver bullet thinking though: "government should look to support and partner with UK companies who can demonstrate performance, sustainability or security advancements."
Dan McQuillan has written a great commentary on this Plan:
https://www.computerweekly.com/opinion/Labours-AI-Action-Plan-a-gift-to-the-far-right
Needless to say, 100x growth would be a disaster: the data centres alone would consume 50,000 TWh/year. That would be 24 GtCO2e, double the entire global emissions budget for meeting the climate targets, purely from powering the servers. And that is without even taking into account the embodied carbon of all that infrastructure. (5/5)
[#]FrugalComputing
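The 100x scenario can be sanity-checked with a few lines; the 0.48 kgCO2e/kWh grid intensity is my assumption (it is the intensity implied by the thread's own 5,000 TWh → 2.4 GtCO2e figures), not a number stated explicitly:

```python
# Current figures from the thread, scaled up 100x.
dc_twh_now = 482      # TWh/year, current data centre electricity use
chips_twh_now = 21    # TWh/year, current chip production for data centres
intensity = 0.48      # kgCO2e/kWh, assumed global grid average (implied by the thread)
budget_gt = 12        # GtCO2e/year global emissions budget

total_twh = 100 * (dc_twh_now + chips_twh_now)   # ~50,000 TWh/year
# TWh -> kWh is *1e9; kg -> Gt is /1e12, so net factor is /1000.
emissions_gt = total_twh * intensity / 1000      # ~24 GtCO2e/year

print(emissions_gt, emissions_gt / budget_gt)    # roughly double the budget
```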
If you think that all sounds crazy: the UK government has just said we need to increase the UK's AI research compute capacity by 20x in five years. Dell's CEO has said that global data centre capacity must increase by 100x over the next 10 years. And OpenAI's Altman has said the world needs 100x more GPU production capacity, which amounts to the same.
And even if none of this happens, the damage is already being done.
(4/5)
[#]FrugalComputing
To put some figures on this: the current global data centre capacity is 55 GW, which means 482 TWh/year. Global electricity generation is 30,000 TWh/year. Production of chips for data centres is currently around 21 TWh/year.
Suppose this grows by 10x in ten years (about 25% per year, which is within current projections). Then we get 4,820 TWh/year for the data centres and 210 TWh/year for the chips, so 5,000 TWh/year in total, or an extra 2.4 GtCO2e (on a total global CO2 budget of 12 GtCO2e). (3/5)
[#]FrugalComputing
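The projection in the toot above can be reproduced as a short sketch; as before, the 0.48 kgCO2e/kWh carbon intensity is my assumption, chosen because it makes the thread's 5,000 TWh → 2.4 GtCO2e conversion consistent:

```python
# 10x growth scenario for data centre electricity and chip production.
capacity_gw = 55
hours_per_year = 8760
dc_twh = capacity_gw * hours_per_year / 1000   # ~482 TWh/year today

chips_twh = 21                                 # TWh/year today
growth = 1.25 ** 10                            # 25%/year for 10 years ~ 9.3x, i.e. roughly 10x

total_twh = 10 * (dc_twh + chips_twh)          # ~5,000 TWh/year
intensity = 0.48                               # kgCO2e/kWh, assumed global grid average
# TWh -> kWh is *1e9; kg -> Gt is /1e12, so net factor is /1000.
gt_co2e = total_twh * intensity / 1000         # ~2.4 GtCO2e/year

print(dc_twh, growth, gt_co2e)
```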
The real problem with the current AI hype is not "AI" in itself but what it is causing:
[#]FrugalComputing
The UK's "AI Opportunities Action Plan".
Not a single word on climate, emissions or even net zero. Instead,
"As AI infrastructure providers seek access to land and power, governments who move quickly and mirror the pace of growth and innovation in the AI data centre market will be best placed to secure investment."
https://www.gov.uk/government/publications/ai-opportunities-action-plan/ai-opportunities-action-plan
(6/5) Addendum: what about the end user device? I had tacitly assumed this was a phone and therefore negligible; but suppose it is an average 40" TV, then the power consumption is about 50 W, and that would result in 16 gCO2e, still much less than the 450 gCO2e of a 10,000-word GPT-4 query.
[#]FrugalComputing
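The 16 gCO2e figure in the addendum can be reconstructed under two assumptions of mine that the toot does not state: a reading speed of 250 words per minute, and the same 0.48 kgCO2e/kWh grid intensity used elsewhere in the thread:

```python
# Screen emissions while reading a 10,000-word answer on a 40" TV.
tv_watts = 50
words = 10_000
reading_wpm = 250                      # assumed reading speed (not in the toot)
intensity_g_per_kwh = 480              # assumed grid intensity, gCO2e/kWh

hours = words / reading_wpm / 60       # ~0.67 h of screen-on time
kwh = tv_watts * hours / 1000          # ~0.033 kWh
g_co2e = kwh * intensity_g_per_kwh     # ~16 gCO2e

print(g_co2e)
```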
How does a query to GPT-4 compare to driving a car?
A single query to GPT-4 with a 10,000-word answer produces 450 gCO2e (*). An average UK car emits 132 gCO2e/km. So this one query is equivalent to driving a car 3.4 km.
[#]FrugalComputing
(*) https://wimvanderbauwhede.codeberg.page/articles/google-search-vs-chatgpt-emissions/ and assuming 100 words for the queries in that article; in reality it is probably closer to 30 words, but I prefer to be conservative.
Car stats from https://www.nimblefins.co.uk/average-co2-emissions-car-uk
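The car comparison is a one-line division, using the two figures cited in the toot above:

```python
# GPT-4 query vs driving, using the thread's cited figures.
query_g = 450          # gCO2e per 10,000-word GPT-4 answer
car_g_per_km = 132     # gCO2e/km for an average UK car

km_equivalent = query_g / car_g_per_km   # ~3.4 km
print(km_equivalent)
```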