Ancestors

Toot

Written by Karin Dalziel on 2024-10-29 at 21:52

In my library we had a discussion of AI earlier, and a coworker that has done cool things with it talked about how he thinks of it like using the dryer: bad for the environment, and he thinks of that every time he dries his clothes, but he still uses the dryer. He feels he can't be excited and share the cool things because of environmental concerns.

I personally have avoided using any of the very large (Open AI, Claude, etc) LLM stuff because I feel guilty based on all the dire warnings I have heard about how bad it is for the environment.

I am curious how others are justifying their use of these tools to themselves given the publicized environmental costs, in part because I feel like I have to start experimenting with them if only to clearly state what they are NOT useful for. Feel free to respond privately!

(I had a hard time phrasing this, please know I am not trying to judge anyone's use of AI any more than I would judge people driving to work)

=> More information about this toot | More toots from nirak@hcommons.social

Descendants

Written by Wendy Robertson on 2024-10-29 at 22:27

@nirak I successfully avoided them until recently, when it occurred to me that I had been unsuccessful at getting values out of a JSON array via GREL in OpenRefine, there was no one I could ask (or felt I could ask), and I used Copilot successfully. It was a very specific query, and I learned something. Environmentally I don't know the cost of that interaction, but given how much of our stuff is in AWS and the cloud (i.e. always-on computer banks), it may not be worse than what I already do

=> More information about this toot | More toots from Wendycr@glammr.us
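The OpenRefine task described above (pulling values out of a JSON array with GREL) can be sketched in plain Python. The cell contents and field names below are invented for illustration, and the GREL expression in the comment is one plausible equivalent, not the exact expression from the thread:

```python
import json

# Hypothetical cell value: a JSON string containing an array field,
# like the data a GREL value.parseJson() call would operate on in OpenRefine.
cell = '{"title": "Example record", "subjects": ["maps", "history", "atlases"]}'

# A roughly equivalent GREL expression (field names invented):
#   forEach(value.parseJson().subjects, v, v).join("; ")
parsed = json.loads(cell)
joined = "; ".join(parsed["subjects"])
print(joined)  # maps; history; atlases
```

The join step mirrors a common OpenRefine pattern: flatten the array into a single delimited string so it fits back into one cell.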

Written by Karin Dalziel on 2024-10-29 at 22:54

@Wendycr that's what I am kinda leaning towards. Use it sparingly, only when I really have a purpose, and otherwise just talk to other people about their use.

The warnings I have seen about resource use are pretty dire, and it's really hard to judge since the companies are not sharing info that would actually let us estimate use.

=> More information about this toot | More toots from nirak@hcommons.social

Written by Wendy Robertson on 2024-10-30 at 02:30

@nirak the energy use is just terrifying. Someone is talking about recommissioning our local nuclear power plant, and I assume it is for this kind of thing. On the plus side, it isn't fossil fuel. On the downside are accidents, intentional destruction, and an end product we can't dispose of safely.

=> More information about this toot | More toots from Wendycr@glammr.us

Written by Karin Dalziel on 2024-10-30 at 14:51

@Wendycr Right, if we are to use nuclear (which I'm certainly not against) I don't want it only to be for AI stuff. (and/or crypto mining)

Really distressing how much we've ramped up energy use for these new things right as we need to be cutting it :(

=> More information about this toot | More toots from nirak@hcommons.social

Written by Wendy Robertson on 2024-10-30 at 15:56

@nirak Yes. So much this.

=> More information about this toot | More toots from Wendycr@glammr.us

Written by David Colarusso on 2024-10-29 at 23:26

@nirak FWIW, my back-of-the-envelope calculations put 12 hours of heavy GPT use at around 3,000g of CO2 production, which is roughly the same as one cheeseburger. And I’ve seen estimates that a single query is about 10x a traditional Google search. For me the question is one of tradeoffs. You can see some of my sources here: https://suffolklitlab.org/protective-randomness-artificial-intelligence/ Just Ctrl-F “CO2” to find the relevant section.

=> More information about this toot | More toots from Colarusso@mastodon.social

Written by Karin Dalziel on 2024-10-30 at 00:16

@Colarusso thanks!

=> More information about this toot | More toots from nirak@hcommons.social

Written by David Colarusso on 2024-10-30 at 02:47

@nirak to help me better get a sense of scale, I decided to compare output with transit, and one GPT query (~4g) seems to generate about as much CO2 as driving an average passenger vehicle 53 feet (~400g per mile). See https://www.epa.gov/greenvehicles/greenhouse-gas-emissions-typical-passenger-vehicle#:~:text=including%20the%20calculations.-,How%20much%20tailpipe%20carbon%20dioxide%20(CO2)%20is%20emitted%20from,of%20CO2%20per%20mile. This of course doesn’t take into account other impacts (e.g., water usage et al), but it makes things a little more human scale while really making the case for biking more.

=> More information about this toot | More toots from Colarusso@mastodon.social
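The feet-per-query comparison above is simple arithmetic on the thread's own estimates (~4 g CO2 per query, ~400 g per mile from the EPA figure); a quick check:

```python
# Back-of-the-envelope check of the numbers in the thread. The per-query
# figure is the toot's estimate, not a measurement.
grams_per_query = 4.0     # estimated CO2 per GPT query
grams_per_mile = 400.0    # EPA average passenger-vehicle tailpipe CO2 per mile
feet_per_mile = 5280

feet_equiv = grams_per_query / grams_per_mile * feet_per_mile
print(round(feet_equiv, 1))  # 52.8 feet, i.e. roughly the 53 feet quoted
```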

Written by Karin Dalziel on 2024-10-30 at 14:54

@Colarusso I already mostly bike for commuting and errands, and keep my heat/ac set low, and try not to buy much. Part of my conundrum is that normally I would bargain with myself and go "ok, I can use this thing if I don't drive..." but I've already cut most things as much as I can at this time.

=> More information about this toot | More toots from nirak@hcommons.social

Written by David Colarusso on 2024-10-30 at 15:05

@nirak I hear you. I stopped eating mammals to offset some flights et al. As the saying goes, it's all tradeoffs. Another option you might consider is to make use of small or local open models. For example, you can use something like https://lmstudio.ai/ to run LLMs on your personal machine. Power usage will vary, but you would actually know what you're using, and if you're able to make choices about where you get your power from, you could pull that lever too.

=> More information about this toot | More toots from Colarusso@mastodon.social
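LM Studio serves whatever model you load behind an OpenAI-compatible HTTP API, by default at http://localhost:1234/v1. A minimal sketch of querying it from the standard library, assuming the local server is running; the model name and prompt here are placeholders:

```python
import json
import urllib.request

def build_request(prompt, model="local-model"):
    """Build an OpenAI-style chat-completions payload for the local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }

def ask_local(prompt, url="http://localhost:1234/v1/chat/completions"):
    """POST a chat request to a locally hosted model and return its reply."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_local("Write a GREL expression that joins a JSON array field."))
```

Because everything runs on your own machine, you can meter the actual power draw, which is the point made above.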

Written by Karin Dalziel on 2024-10-30 at 15:23

@Colarusso That is something we're definitely going to look into (thanks @adr) :D

I am excited to learn more about it

=> More information about this toot | More toots from nirak@hcommons.social

Proxy Information
Original URL
gemini://mastogem.picasoft.net/thread/113392925995148846

This content has been proxied by September (ba2dc).