Toot

Written by Q.U.I.N.N. on 2024-11-20 at 02:10

@KuteboiCoder i'll take a look, but yeah. especially some of the stuff like liquid state machines being tested on like, 2,000-neuron "reservoirs" and detecting features in video. or KAN networks that are more expensive to train (because they use multi-point B-splines as activation functions) but need so many fewer units that you still come out ahead. (rough reservoir sketch below.)
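
A minimal echo-state-style sketch of the reservoir idea, as I understand it, not the specific liquid state machine experiments mentioned above: a large fixed random recurrent layer, with only a linear readout trained. The sizes and ridge parameter here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N_RES = 2000        # reservoir size, matching the rough figure in the toot
N_IN = 64           # hypothetical input dimensionality (e.g. a feature vector per video frame)

W_in = rng.normal(scale=0.1, size=(N_RES, N_IN))           # fixed input weights, never trained
W_res = rng.normal(size=(N_RES, N_RES)) / np.sqrt(N_RES)   # fixed recurrent weights
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))    # keep spectral radius below 1

def run_reservoir(inputs):
    """Drive the fixed reservoir with a sequence of input vectors; collect states."""
    x = np.zeros(N_RES)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ u + W_res @ x)
        states.append(x.copy())
    return np.array(states)

def fit_readout(states, targets, ridge=1e-3):
    """Train only the linear readout, by ridge regression on the collected states."""
    A = states.T @ states + ridge * np.eye(N_RES)
    return np.linalg.solve(A, states.T @ targets)   # readout weights, shape (N_RES, n_targets)
```

Only fit_readout ever touches a training signal; the reservoir itself stays at its random initialization, which is the appeal of the approach.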

it also looks like deep learning is a meme. decoder/encoder networks seem to be fixed by genetics or some one-off initialization process, and only one or two layers at the back actually handle interpreting (and, maybe, communicate with themselves through loops--as is theorized with the phonological loop, esp. people talking to themselves to solve problems). a toy version of that fixed-features idea is sketched below.
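
A hedged sketch of that "fixed features, trained readout" idea: a randomly initialized (never trained) encoder stands in for the genetics / one-off initialization, and only the small head at the back is optimized. The architecture, input size, and class count are assumptions for illustration.

```python
import torch
import torch.nn as nn

encoder = nn.Sequential(                                       # frozen after random init
    nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(),
)
for p in encoder.parameters():
    p.requires_grad_(False)

head = nn.Linear(64 * 8 * 8, 10)          # the "one or two layers at the back"; assumes 32x32 inputs

opt = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(images, labels):
    with torch.no_grad():
        feats = encoder(images)           # fixed random features
    logits = head(feats)
    loss = loss_fn(logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```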

=> More information about this toot | View the thread | More toots from icedquinn@blob.cat

Mentions

=> View KuteboiCoder@subs4social.xyz profile

Tags

Proxy Information
Original URL
gemini://mastogem.picasoft.net/toot/113512849406782771