A poster highlighting our main research this year. Taking a few steps towards low-cost AI.
The most exciting work is scaling up variational (Bayesian) methods to GPT-2.
We also did some work on model-merging, which I believe will change federated and continual learning.
I truly believe that we can train AI models cheaply and safely (without needing nuclear power plants or stealing everybody's data). We are working towards demos that will change minds.
Looking forward to 2025.