Toot

Written by b4ux1t3 :trek_ds9_sisko:#1️⃣ on 2025-01-20 at 13:42

@xgebi me, explaining to AIBros that that's not how it works: an LLM can never have sufficient context to replicate human understanding. There is no context you can pass an LLM that will make it capable of thought. Context, which is required for an LLM to generate meaningful output, can never be comprehensive enough for the LLM to think.

=> More information about this toot | View the thread | More toots from b4ux1t3@hachyderm.io

Mentions

=> View xgebi@hachyderm.io profile

Proxy Information
Original URL
gemini://mastogem.picasoft.net/toot/113860970636513673