
Testing Anthropic Claude’s 100k-token window on SEC 10-K Filings

Contents: High-Level Findings; Overview of Methodology; Tutorial Setup; Analyzing a Single Document; Analyzing Multiple Documents; Conclusion

Anthropic’s 100K-token context window for Claude, released just yesterday, has taken the AI community by storm. A 100k-token limit corresponds to roughly 75k words (about 3x GPT-4-32K’s context window and about 25x that of GPT-3.5/ChatGPT); this implies that...
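As a quick sanity check on those numbers, here is a minimal back-of-envelope sketch, assuming the common rule of thumb of ~0.75 words per token and a 4K-token window for GPT-3.5/ChatGPT (both are assumptions for illustration, not figures from the original post):

```python
# Back-of-envelope comparison of context window sizes (assumed values).
CLAUDE_100K = 100_000      # Claude's expanded context window, in tokens
GPT4_32K = 32_000          # GPT-4-32K context window, in tokens
GPT35_4K = 4_000           # GPT-3.5/ChatGPT context window, in tokens (assumed)
WORDS_PER_TOKEN = 0.75     # rough rule of thumb for English text

print(f"~{CLAUDE_100K * WORDS_PER_TOKEN:,.0f} words")      # ~75,000 words
print(f"~{CLAUDE_100K / GPT4_32K:.1f}x GPT-4-32K")          # ~3.1x
print(f"~{CLAUDE_100K / GPT35_4K:.0f}x GPT-3.5/ChatGPT")    # ~25x
```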
