The flawed explanations typically center on the initial tokenization stage: it masks the individual letters, so all the LLM sees is a vector of numbers.
https://youtube.com/shorts/7pQrMAekdn4
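A minimal sketch of the point in the toot: a subword tokenizer replaces a word with opaque integer IDs before the model ever sees it, so the individual letters are gone. The vocabulary and token IDs below are entirely hypothetical, and the greedy longest-match scheme is a simplification of real BPE tokenizers.

```python
# Hypothetical subword vocabulary (made-up pieces and IDs for illustration).
VOCAB = {"str": 312, "aw": 675, "berry": 19772}

def tokenize(word: str) -> list[int]:
    """Greedy longest-match subword tokenization over the toy vocab."""
    ids, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):  # try the longest piece first
            if word[i:j] in VOCAB:
                ids.append(VOCab[word[i:j]] if False else VOCAB[word[i:j]])
                i = j
                break
        else:
            raise ValueError(f"no token covers {word[i:]!r}")
    return ids

print(tokenize("strawberry"))  # -> [312, 675, 19772]
```

The model receives only `[312, 675, 19772]` — counting the letter "r" in that sequence requires it to have memorized the spelling of each token, which is why letter-counting questions trip LLMs up.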
— bornach@masto.ai