The UK's "AI Opportunities Action Plan" (https://www.gov.uk/government/publications/ai-opportunities-action-plan/ai-opportunities-action-plan) perpetuates the fallacy that we should embrace the "AI revolution", and its huge increase in demand for compute, in order to make society better through better diagnostic tools in healthcare, smarter electricity grids, and so on.
The growth in "AI" is entirely driven by generative AI based on Large Language Models. It has little or nothing to do with the Machine Learning approaches that already help with so many aspects of society. (1/2)
=> More information about this toot | More toots from wim_v12e@scholar.social
Compared to GPT-4 or Gemini, these are tiny models that do not require huge amounts of compute, memory or storage. Where GPT-4 has more than a trillion parameters (10^12), a typical CNN for medical applications (*) is on the order of tens of millions of parameters (10^7). So it needs a hundred thousand times fewer resources. And many of the ML techniques that do not use neural networks, such as SVMs or random forests, are even smaller than that. (2/2)
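The scale gap above is easy to check with back-of-envelope arithmetic. A minimal sketch (the parameter counts are the rough order-of-magnitude figures quoted in the thread, not exact values; the 2-bytes-per-parameter figure assumes 16-bit weights):

```python
# Rough comparison of model scale, using the order-of-magnitude
# figures from the text (not exact parameter counts).
gpt4_params = 1e12   # "more than a trillion parameters" (10^12)
cnn_params = 1e7     # typical medical-imaging CNN, tens of millions (10^7)

ratio = gpt4_params / cnn_params
print(f"Parameter ratio: {ratio:,.0f}x")  # 100,000x: a hundred thousand times

# Memory footprint assuming 16-bit (2-byte) weights:
bytes_per_param = 2
print(f"GPT-4-class model: ~{gpt4_params * bytes_per_param / 1e12:.0f} TB")
print(f"Medical CNN:       ~{cnn_params * bytes_per_param / 1e6:.0f} MB")
```

This prints a ratio of 100,000x and footprints of roughly 2 TB versus 20 MB, which is where the "hundred thousand times fewer resources" figure comes from.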
[#]FrugalComputing
(*) For example, SegNet, a network shown to have 99% accuracy for colon cancer detection, is 7.6M parameters (https://www.sciencedirect.com/science/article/pii/S0010482521005242). An LLM with 7B parameters (1000x more than that) can run easily on a laptop (https://www.hardware-corner.net/llm-database/LLaMA/).
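The "1000x" factor in the footnote can be sanity-checked the same way (a sketch using the two parameter counts quoted above):

```python
# Sanity check of the scale factor quoted in the footnote.
segnet_params = 7.6e6   # SegNet: ~7.6M parameters
llm_params = 7e9        # a 7B-parameter LLM

factor = llm_params / segnet_params
print(f"{factor:.0f}x")  # ~921x, i.e. roughly 1000x more parameters
```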