@judell Wondering about this:
"But I do worry about perverse incentives. Why engineer understandable systems when we can outsource the understanding of them?"
LLMs capture knowledge from humans discussing configuration. If those human discussions stop happening, how would future LLM assistants be trained?