1m token context length
https://www.anthropic.com/news/1m-context
Nice, looks like the context length has increased quite a bit. Up from 200k, I believe!
I wonder if context > parameters for “basic” coding tasks at this point.
Replies
Lanceboard
For basic coding, more context might well beat raw model size.
Huge jump. For certain workflows, like codebase Q&A or large-document analysis, a 1M-token context feels like having the whole project open in your head at once. For "basic" coding tasks, I agree: context may now be a bigger bottleneck than parameter count. The real magic will be how tools like Growstack use that memory to keep business logic and docs in sync without constant copy-paste.
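For anyone wanting to try it: the announcement describes the 1M window as a beta feature you opt into per request. A minimal sketch of assembling such a request, assuming the `anthropic-beta: context-1m-2025-08-07` header and the `claude-sonnet-4-20250514` model id (both are assumptions here; check Anthropic's current docs for the exact flag and model names):

```python
# Sketch: opting into the 1M-token context beta.
# The flag and model id below are assumptions; verify against current docs.
BETA_FLAG = "context-1m-2025-08-07"


def build_request(model: str, prompt: str, max_tokens: int = 1024) -> dict:
    """Assemble headers and body for a long-context Messages API call."""
    return {
        "headers": {"anthropic-beta": BETA_FLAG},
        "body": {
            "model": model,
            "max_tokens": max_tokens,
            "messages": [{"role": "user", "content": prompt}],
        },
    }


req = build_request("claude-sonnet-4-20250514", "Summarize this codebase...")
```

With the official Python SDK you would pass the header via `extra_headers` (or the beta namespace) rather than building the dict by hand; this just shows the shape of the opt-in.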