An excellent overview of developments in the world of #LLMs over the last year, put together by @simon in his "Things we learned about LLMs in 2024": https://simonwillison.net/2024/Dec/31/llms-in-2024/. Remember the YouTube paradox, where engineers made the site faster, but global load times went up because suddenly more people could use it? I wonder if something similar could happen with LLMs and the environmental impact of prompts: individual prompts get cheaper, but overall energy consumption goes up.
=> More information about this toot | View the thread | More toots from tomayac@toot.cafe
=> View simon@simonwillison.net profile
=> View llms tag