Compared to GPT-4 or Gemini, these are tiny models that do not require huge amounts of compute, memory, or storage. Where GPT-4 has more than a trillion parameters (10^12), a typical CNN for medical applications (*) is of the order of tens of millions of parameters (10^7). So it needs roughly a hundred thousand times fewer resources. And many ML techniques that do not use neural networks, such as SVM or random forest, are smaller still. (2/2)
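The "hundred thousand times fewer" figure follows directly from the orders of magnitude above; a minimal sketch, using the post's illustrative round numbers (not exact published parameter counts):

```python
# Rough parameter-count comparison using the post's order-of-magnitude figures.
gpt4_params = 1e12  # GPT-4: more than a trillion parameters (10^12)
cnn_params = 1e7    # typical medical-imaging CNN: tens of millions (10^7)

ratio = gpt4_params / cnn_params
print(f"ratio: {ratio:.0e}")  # 10^12 / 10^7 = 10^5, i.e. a hundred thousand
```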
[#]FrugalComputing
=> More information about this toot | View the thread | More toots from wim_v12e@scholar.social