Toots for the theawely account

Written by David on 2025-02-04 at 10:24

The administration of French-language Wikipedia is so disastrous and so mired in nepotism, complacency toward lies, and intimidation that it is, IMO, beyond salvaging, and it no longer has much in common with the English-language version. Some admins almost make me regret having lent credibility, through my (substantial) contributions, to a project that no longer respects its own principles.

=> More information about this toot | View the thread

Written by David on 2025-01-28 at 12:33

"Am I learning about Hopfield networks or Wicca?"

=> View attached media

=> More information about this toot | View the thread

Written by David on 2025-01-11 at 12:31

How do you avoid dependency hell in Python? Does conda or mamba help?
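
Purely as an illustrative sketch of one common answer (not part of the original toot): pinning exact versions inside a per-project environment avoids most dependency hell, whether that environment comes from venv, conda, or mamba. The file name requirements.lock below is hypothetical.

```python
# Illustrative sketch: one isolated environment per project, with exact pins,
# so projects never clobber each other's dependencies.
import subprocess
import sys
from pathlib import Path

env_dir = Path(".venv")

# Create the environment with the standard-library venv module.
subprocess.run([sys.executable, "-m", "venv", str(env_dir)], check=True)

# Install from a lock-style file with exact pins, e.g. "numpy==1.26.4" per line
# (the file name "requirements.lock" is illustrative).
pip = env_dir / ("Scripts" if sys.platform == "win32" else "bin") / "pip"
subprocess.run([str(pip), "install", "-r", "requirements.lock"], check=True)
```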

=> More information about this toot | View the thread

Written by David on 2025-01-02 at 15:41

https://www.lexpress.fr/monde/europe/petrole-russe-cette-dependance-qui-fait-honte-a-leurope-C7XT37BOY5GSJF57BDMSGNBGPU/

=> More information about this toot | View the thread

Written by David on 2024-12-28 at 11:37

Chinese open-weight models laundering US LMs, which themselves launder GPL code. There is some irony to it.

=> More information about this toot | View the thread

Written by David on 2024-12-15 at 12:22

I’ve added caching and revamped the results interface of https://onthecitesof.info.

=> More information about this toot | View the thread

Written by David on 2024-12-13 at 19:03

Can you recommend a Mastodon instance that supports >500 characters and Markdown, has a gentle community, and offers transparent, apolitical moderation?

=> More information about this toot | View the thread

Written by David on 2024-12-13 at 18:48

Excited about the new xLSTM model release. There are many well-thought-out design choices compared to transformers: recurrence (which should allow composability), gating (like Mamba and the LSTM it builds on, which keeps time complexity independent of input size), and state tracking (unlike Mamba and transformers). For now, these advantages aren’t apparent on benchmarks, but most training techniques are kept secret, and recent advances in LLMs have shown that they matter a lot.
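
As a toy illustration of the recurrence and gating point above (my own sketch, not the actual xLSTM cell): a gated recurrent update only touches a fixed-size hidden state, so the cost per token stays constant no matter how long the context already is.

```python
# Toy gated recurrent step (illustrative only, not xLSTM): per-token cost
# depends on the hidden size d, not on how many tokens came before.
import numpy as np

def gated_step(h, x, W_h, W_x, W_g):
    """Blend the previous state with a candidate update through a sigmoid gate."""
    gate = 1.0 / (1.0 + np.exp(-(W_g @ x)))      # input-dependent gate in (0, 1)
    candidate = np.tanh(W_h @ h + W_x @ x)       # candidate new hidden state
    return gate * candidate + (1.0 - gate) * h   # constant-size state update

rng = np.random.default_rng(0)
d = 8
W_h, W_x, W_g = [0.1 * rng.normal(size=(d, d)) for _ in range(3)]
h = np.zeros(d)
for x in rng.normal(size=(1000, d)):  # 1000 tokens, identical cost per step
    h = gated_step(h, x, W_h, W_x, W_g)
```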

=> More information about this toot | View the thread

Written by David on 2024-11-28 at 13:42

I have used some excellent niche services this year:

• https://elicit.com: summarisation of scientific publications using an LLM (I don’t know whether it is trained on them or uses RAG). It cites them, and I didn’t experience hallucinations.

• https://rtings.com: in-depth reviews & a comparator of multimedia devices.

• https://vectorbuilder.com: manufacture & approachable design of bespoke genetic vectors; their excellent customer service can even design them for you gratis.

Which ones have you discovered?

=> More information about this toot | View the thread

Written by David on 2023-08-05 at 09:07

‘Fighting two enemies’: Ukraine’s female soldiers decry harassment https://www.theguardian.com/world/2023/aug/04/fighting-two-enemies-ukraine-female-soldiers-decry-harassment?CMP=Share_iOSApp_Other

=> More information about this toot | View the thread

Written by David on 2022-11-11 at 13:30

Search only works on your own instance 😕

=> More information about this toot | View the thread

=> This profile with reblogs | Go to the theawely account
