Ancestors

Toot

Written by Even Rouault on 2025-01-24 at 14:04

To give some insight into my broken psychology: please don't mention that some AI tool hallucinated a GDAL option in one of your GDAL tickets. What's the point? Yes, those tools don't understand anything they tell you. There's no better way to deter me from looking at the ticket at all. I wonder how many years I'll be able to survive in this field if everybody is so fond of that.

=> More information about this toot | More toots from EvenRouault@mastodon.social

Descendants

Written by Régis Haubourg on 2025-01-24 at 14:36

@EvenRouault I saw exactly this with PG in recent months.

Me: "How can I move the pg_wal dir without messing with symbolic links in a Docker container?"

ChatBullshitMachine: "Just use the pg_wal_dir option!"

No pg_wal_dir option ever existed, of course.

AI is a nice tool for identifying missing arguments that statistically could exist.

=> More information about this toot | More toots from RegisHaubourg@mastodon.social
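For what it's worth, the hallucinated pg_wal_dir option conflates two real mechanisms: initdb's documented --waldir flag, and the official Docker postgres image's POSTGRES_INITDB_WALDIR environment variable, which passes it through. A hedged sketch, with illustrative volume names and paths:

```shell
# Placing WAL on a separate volume with the official postgres Docker
# image, at cluster-initialization time only. POSTGRES_INITDB_WALDIR
# is forwarded to initdb as --waldir. Names and paths are illustrative.
docker run -d --name pg \
  -e POSTGRES_PASSWORD=secret \
  -e POSTGRES_INITDB_WALDIR=/var/lib/postgresql/wal \
  -v pgdata:/var/lib/postgresql/data \
  -v pgwal:/var/lib/postgresql/wal \
  postgres:17
```

Note this only helps when creating a new cluster; relocating WAL for an existing one still means stopping the server and moving pg_wal (typically via a symlink), which is exactly the friction the original question was about.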

Written by Vincent Picavet on 2025-01-24 at 14:56

@EvenRouault AI is the Truth, end of debate. Now recognize your new overlords, stop ranting, and implement the option accordingly, miserable human. Luckily, you can use AI to implement the GDAL option automatically. Isn't that nice?

#2èdegré

=> More information about this toot | More toots from vpicavet@mastodon.social

Written by Luís de Sousa on 2025-01-24 at 18:07

@EvenRouault AI pollution is set to become a major challenge to #SoftwareEngineering in general and #OpenSource in particular.

Most folks have yet to understand that #LLMs do not understand #SourceCode.

=> More information about this toot | More toots from luis_de_sousa@mastodon.social

Written by HD on 2025-01-25 at 07:14

@EvenRouault Any LLM related ticket needs to be an auto-block. You got broken code from an LLM, let the LLM resolve it.

=> More information about this toot | More toots from hareldan@mapstodon.space

Written by James Bradbury on 2025-01-25 at 07:53

@EvenRouault some say it's wrong to even use the word "hallucinate" with regard to AI, as it implies intelligence rather than large-scale pattern matching.

=> More information about this toot | More toots from jamesgbradbury@mastodon.social

Proxy Information
Original URL
gemini://mastogem.picasoft.net/thread/113883707419488797