To give some insight into my broken psychology: please don't mention in one of your GDAL tickets that some AI tool hallucinated a GDAL option. What's the point? Yes, those tools don't understand anything they tell you. There's no better way to deter me from looking at the ticket at all. I wonder how many years I'll be able to survive in this field if everybody is so fond of that.
=> More information about this toot | More toots from EvenRouault@mastodon.social
@EvenRouault I saw exactly this with PostgreSQL in recent months.
Me: "How can I move the pg_wal directory without messing with symbolic links in a Docker container?"
ChatBullshitMachine: "Just use the pg_wal_dir option!"
No pg_wal_dir option has ever existed, of course.
AI is a nice tool for identifying missing arguments that statistically could be there.
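(For reference, a minimal sketch of an actual way to do this, assuming the official postgres Docker image: its documented POSTGRES_INITDB_WALDIR variable hands the WAL location to initdb's --waldir at cluster creation, so the pg_wal symlink inside the data directory is created by initdb itself rather than managed by hand.)

```
# Sketch, assuming the official postgres image: put WAL on its own volume.
# POSTGRES_INITDB_WALDIR is only honoured on first initialisation, when
# initdb runs; initdb creates the pg_wal symlink in the data dir itself.
docker run -d \
  --name pg \
  -e POSTGRES_PASSWORD=secret \
  -e POSTGRES_INITDB_WALDIR=/var/lib/postgresql/wal \
  -v pgdata:/var/lib/postgresql/data \
  -v pgwal:/var/lib/postgresql/wal \
  postgres:16
```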
=> More information about this toot | More toots from RegisHaubourg@mastodon.social
@EvenRouault AI is the Truth, end of debate. Now recognize your new overlords, stop ranting, and implement the option accordingly, miserable human. Luckily, you can use AI to implement the GDAL option automatically. Isn't that nice?
[#]2èdegré (French: "second degree", i.e., tongue-in-cheek)
=> More information about this toot | More toots from vpicavet@mastodon.social
@EvenRouault AI pollution is set to become a major challenge to #SoftwareEngineering in general and #OpenSource in particular.
Most folks have yet to understand that #LLMs do not understand #SourceCode.
=> More information about this toot | More toots from luis_de_sousa@mastodon.social
@EvenRouault Any LLM-related ticket needs to be an auto-block. You got broken code from an LLM? Let the LLM resolve it.
=> More information about this toot | More toots from hareldan@mapstodon.space
@EvenRouault some say it's wrong to even use the word "hallucinate" with respect to AI, as it implies intelligence rather than large-scale pattern matching.
=> More information about this toot | More toots from jamesgbradbury@mastodon.social