Lost in the Middle: How Language Models Use Long Contexts
https://arxiv.org/pdf/2307.03172
You can provide long contexts (prompts) to language models, but how reliably the model uses a given piece of information varies with where that information sits in the context: performance is highest when the relevant information appears near the beginning or the end, and drops when it is in the middle. This mirrors the primacy and recency biases of human memory.
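A minimal sketch of how one might probe this effect, in the style of the paper's key-value retrieval experiments. The fact, filler sentence, and function names here are hypothetical, not from the paper; the idea is to build prompts that differ only in where the key fact sits, then query any language model and compare answer accuracy by position.

```python
# Hypothetical "needle in a haystack" prompt builder: place one key fact at
# varying positions in a long context to measure positional sensitivity.

FACT = "The access code for the vault is 7421."   # hypothetical key fact
FILLER = "Nothing notable happened on this day."  # hypothetical distractor

def build_prompt(position: int, total: int = 21) -> str:
    """Place FACT at `position` (0-based) among `total` sentences of filler."""
    sentences = [FILLER] * total
    sentences[position] = FACT
    context = " ".join(sentences)
    return f"{context}\nQuestion: What is the access code for the vault?"

# Prompts identical except for the fact's position; per the paper's findings,
# models tend to answer most reliably for the first and last variants.
start_prompt = build_prompt(0)
middle_prompt = build_prompt(10)
end_prompt = build_prompt(20)
```

Feeding each variant to the same model and scoring the answers would trace out the U-shaped accuracy curve the paper reports.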
From tomstafford@mastodon.online