??? The top-level commenter wants an LLM with a big context window, and the other commenter responded with an LLM that has a 200k-token context window, which is waaaaaay more than "100 lines of code".
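For scale, here's a rough back-of-the-envelope comparison. The ~4 characters per token and ~60 characters per line figures are just common heuristics, not exact tokenizer output, and the numbers below are assumptions for illustration:

```python
# Rough sketch: how many lines of code fit in a 200k-token context window.
# Assumptions (heuristics, not exact): ~4 chars per token, ~60 chars per line.

CONTEXT_WINDOW_TOKENS = 200_000  # advertised window size
CHARS_PER_TOKEN = 4              # rule-of-thumb; real tokenizers vary
CHARS_PER_LINE = 60              # assumed average line length of source code

tokens_per_line = CHARS_PER_LINE / CHARS_PER_TOKEN        # ~15 tokens per line
lines_that_fit = CONTEXT_WINDOW_TOKENS / tokens_per_line  # ~13,000+ lines

print(f"~{tokens_per_line:.0f} tokens per line of code")
print(f"~{lines_that_fit:,.0f} lines of code fit in the window")
```

Even with generous assumptions, 200k tokens works out to thousands of lines of code, not 100.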
Yeah sorry, I thought that was clear. It’s how context is measured.
Claude is 200k
Those are tokens, not lines of code...