Zeek@lemmy.world · 1 point · 1 month ago

Not really. The transformer architecture was designed precisely to get around this limitation: its attention heads let every token attend directly to every other token in the context, rather than passing information through one step at a time. Copilot, like any other modern LLM, has this capability.
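
To make the claim concrete, here is a minimal sketch of scaled dot-product attention (the core operation behind attention heads, per "Attention Is All You Need"). It's an illustrative toy in NumPy, not Copilot's actual implementation; the shapes and names (`heads`, `seq_len`, `d_k`) are just for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Each query scores against every key at once, so the model can
    # relate any two positions in the sequence directly, regardless
    # of how far apart they are.
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)   # (heads, seq, seq)
    # Softmax over the key axis turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                  # (heads, seq, d_k)

# Toy example: 4 attention heads over a sequence of 6 tokens.
rng = np.random.default_rng(0)
heads, seq_len, d_k = 4, 6, 8
Q = rng.standard_normal((heads, seq_len, d_k))
K = rng.standard_normal((heads, seq_len, d_k))
V = rng.standard_normal((heads, seq_len, d_k))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 6, 8): one output vector per head per token
```

The key point is in the `scores` line: it's a full token-by-token matrix, so no position is "out of reach" the way it would be after many recurrent steps.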