Ancestors

Written by Karin Dalziel on 2024-10-29 at 21:52

In my library we had a discussion about AI earlier, and a coworker who has done cool things with it said he thinks of it like using the dryer: bad for the environment, and he thinks of that every time he dries his clothes, but he still uses the dryer. He feels he can't be excited about and share the cool things because of environmental concerns.

I personally have avoided using any of the very large (OpenAI, Claude, etc.) LLM stuff because I feel guilty based on all the dire warnings I have heard about how bad it is for the environment.

I am curious how others are justifying their use of these tools to themselves given the publicized environmental costs, in part because I feel like I have to start experimenting with them if only to clearly state what they are NOT useful for. Feel free to respond privately!

(I had a hard time phrasing this, please know I am not trying to judge anyone's use of AI any more than I would judge people driving to work)

Written by David Colarusso on 2024-10-29 at 23:26

@nirak FWIW, my back-of-the-envelope calculations put 12 hours of heavy GPT use at around 3,000g of CO2 production, which is roughly the same as one cheeseburger. And I’ve seen estimates that a single query is about 10x a traditional Google search. For me the question is one of tradeoffs. You can see some of my sources here: https://suffolklitlab.org/protective-randomness-artificial-intelligence/ Just ctrl-F “CO2” to find the relevant section.
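
To make those figures concrete, here is a rough sketch of the arithmetic in Python. The 3,000g-per-12-hours and one-cheeseburger comparisons come from the post above; the ~4g-per-query figure is taken from a later reply below, and the cheeseburger value is an assumed ~3 kg CO2e, so treat everything as a back-of-the-envelope check rather than a measurement.

```python
# Back-of-the-envelope comparison using the rough figures quoted in this thread.
HEAVY_DAY_G = 3000        # ~12 hours of heavy GPT use (from the post above)
CHEESEBURGER_G = 3000     # assumed CO2e for one cheeseburger
GPT_QUERY_G = 4           # assumed grams of CO2 per GPT query (see the reply below)
SEARCH_RATIO = 10         # "a single query is about 10x a traditional Google search"

google_search_g = GPT_QUERY_G / SEARCH_RATIO          # implied ~0.4 g per search
queries_per_heavy_day = HEAVY_DAY_G / GPT_QUERY_G     # ~750 queries in a heavy day

print(f"12 h of heavy use ≈ {HEAVY_DAY_G / CHEESEBURGER_G:.1f} cheeseburger(s) of CO2")
print(f"Implied Google search ≈ {google_search_g:.1f} g CO2")
print(f"3,000 g/day works out to about {queries_per_heavy_day:.0f} queries at {GPT_QUERY_G} g each")
```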

Toot

Written by Karin Dalziel on 2024-10-30 at 00:16

@Colarusso thanks!

Descendants

Written by David Colarusso on 2024-10-30 at 02:47

@nirak To help me get a better sense of scale, I decided to compare output with transit: one GPT query (~4g) seems to generate about as much CO2 as driving an average passenger vehicle 53 feet (~400g per mile). See https://www.epa.gov/greenvehicles/greenhouse-gas-emissions-typical-passenger-vehicle#:~:text=including%20the%20calculations.-,How%20much%20tailpipe%20carbon%20dioxide%20(CO2)%20is%20emitted%20from,of%20CO2%20per%20mile. This of course doesn’t take into account other impacts (e.g., water usage), but it makes things a little more human scale while really making the case for biking more.
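
For anyone who wants to reproduce the 53-feet figure, the arithmetic is just grams per query divided by grams per foot, using the same rough numbers:

```python
# Check the "one query ≈ 53 feet of driving" comparison from the post above.
GRAMS_PER_QUERY = 4        # rough estimate for one GPT query
GRAMS_PER_MILE = 400       # EPA figure for an average passenger vehicle
FEET_PER_MILE = 5280

grams_per_foot = GRAMS_PER_MILE / FEET_PER_MILE        # ~0.076 g per foot
feet_per_query = GRAMS_PER_QUERY / grams_per_foot      # ~53 feet
print(f"One query ≈ driving {feet_per_query:.0f} feet")
```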

Written by Karin Dalziel on 2024-10-30 at 14:54

@Colarusso I already mostly bike for commuting and errands, and keep my heat/ac set low, and try not to buy much. Part of my conundrum is that normally I would bargain with myself and go "ok, I can use this thing if I don't drive..." but I've already cut most things as much as I can at this time.

Written by David Colarusso on 2024-10-30 at 15:05

@nirak I hear you. I stopped eating mammals to offset some flights and the like. As the saying goes, it's all tradeoffs. Another option you might consider is to make use of small or local open models. For example, you can use something like https://lmstudio.ai/ to run LLMs on your personal machine. Power usage will vary, but you would actually know what you're using, and if you're able to make choices about where you get your power from, you could pull that lever too.
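
As a sketch of what that looks like in practice: LM Studio can serve a loaded model over an OpenAI-compatible HTTP endpoint on your own machine, so a "query" becomes an ordinary POST to localhost rather than a call to a cloud API. The port (1234), endpoint path, and model name below are defaults/assumptions; check your own setup before relying on them.

```python
# Minimal sketch: query a locally served model (e.g. via LM Studio's local server).
# Assumes the server is running and a model is loaded; port and path are defaults.
import json
import urllib.request

URL = "http://localhost:1234/v1/chat/completions"  # local machine, no cloud round trip

payload = {
    "model": "local-model",  # placeholder; the server answers with whichever model is loaded
    "messages": [{"role": "user", "content": "Summarize this paragraph: ..."}],
    "temperature": 0.7,
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

print(reply["choices"][0]["message"]["content"])
```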

Written by Karin Dalziel on 2024-10-30 at 15:23

@Colarusso That is something we're definitely going to look into (thanks @adr) :D

I am excited to learn more about it
