@freddy @sunfish No, I think that is missing the point completely.
LLMs (no, they're not AI) don't make the puzzle simple. They don't help you understand the problem, and they don't further your understanding of similar problems; in that way, their use prevents simplification.