Ancestors

Written by Wim🧮 on 2025-01-15 at 09:20

The UK's "AI Opportunities Action Plan" (https://www.gov.uk/government/publications/ai-opportunities-action-plan/ai-opportunities-action-plan) perpetuates the fallacy that we should embrace the "AI revolution", and its huge increase in demand for compute, in order to make society better through, e.g., better diagnostic tools in healthcare, smarter electricity grids, etc.

The growth in "AI" is entirely driven by generative AI based on Large Language Models. It has little or nothing to do with the Machine Learning approaches that already help with so many aspects of society. (1/2)

Written by Wim🧮 on 2025-01-15 at 09:21

Compared to GPT-4 or Gemini, these are tiny models that do not require huge amounts of compute, memory or storage. Where GPT-4 has more than a trillion parameters (10^12), a typical CNN for medical applications (*) is of the order of tens of millions of parameters (10^7), so it needs a hundred thousand times fewer resources. And many of the ML techniques that do not use neural networks, such as SVMs or random forests, are even smaller than that. (2/2)
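
A back-of-the-envelope sketch of that ratio in plain Python. The parameter counts are the orders of magnitude from the toot; the bytes-per-parameter figures (16-bit LLM weights, 32-bit CNN weights) are illustrative assumptions, not published specifications:

```
# Rough scale comparison: frontier LLM vs. medical-imaging CNN.
# Parameter counts are orders of magnitude only; precisions are assumed.

GPT4_PARAMS = 1e12        # ~a trillion parameters (10^12)
MEDICAL_CNN_PARAMS = 1e7  # ~tens of millions of parameters (10^7)

BYTES_PER_LLM_PARAM = 2.0  # assumption: fp16 weights
BYTES_PER_CNN_PARAM = 4.0  # assumption: fp32 weights

llm_gb = GPT4_PARAMS * BYTES_PER_LLM_PARAM / 1e9
cnn_gb = MEDICAL_CNN_PARAMS * BYTES_PER_CNN_PARAM / 1e9

print(f"LLM weights alone: ~{llm_gb:,.0f} GB")  # ~2,000 GB
print(f"CNN weights alone: ~{cnn_gb:.2f} GB")   # ~0.04 GB
print(f"Parameter ratio: {GPT4_PARAMS / MEDICAL_CNN_PARAMS:,.0f}x")  # 100,000x
```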

[#]FrugalComputing

Toot

Written by Wim🧮 on 2025-01-15 at 09:21

(*) For example, SegNet, a network shown to have 99% accuracy for colon cancer detection, has 7.6M parameters (https://www.sciencedirect.com/science/article/pii/S0010482521005242). An LLM with 7B parameters (roughly 1000x more than that) can still run easily on a laptop (https://www.hardware-corner.net/llm-database/LLaMA/).
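
The same arithmetic for these two concrete models, as a minimal sketch. The parameter counts come from the links above; the precisions (fp32 for SegNet, 4-bit quantisation for the LLM, a common setup for laptop inference) are illustrative assumptions:

```
# Memory needed just to hold each model's weights.
# Parameter counts from the cited sources; precisions are assumed.

SEGNET_PARAMS = 7.6e6  # SegNet for colon cancer detection
LLM_PARAMS = 7e9       # a LLaMA-class 7B model

segnet_mb = SEGNET_PARAMS * 4.0 / 1e6  # assumption: fp32 (4 bytes/param)
llm_gb = LLM_PARAMS * 0.5 / 1e9        # assumption: 4-bit quantised (0.5 bytes/param)

print(f"SegNet weights: ~{segnet_mb:.0f} MB")  # ~30 MB
print(f"7B LLM weights: ~{llm_gb:.1f} GB")     # ~3.5 GB: laptop-sized, but ~100x SegNet
print(f"Parameter ratio: ~{LLM_PARAMS / SEGNET_PARAMS:.0f}x")  # ~921x, i.e. roughly 1000x
```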
