Anthropic has announced that its large language model, Claude, can now analyze far more content at once: the context window has been expanded to 100,000 tokens, equivalent to approximately 75,000 words. Tokens are fragments of words that make text easier for AI models to process. The larger context window makes it possible to feed the chatbot an entire book and have it analyze or discuss the text, or to hold lengthy conversations that could last hours or even days. It also allows companies to extract important information from large sets of documents through conversational interaction, an approach that, for complex queries, can be faster than vector-search-based methods. By comparison, OpenAI's models offer a context window of 4,096 tokens (approximately 3,000 words) in ChatGPT and 8,192 or 32,768 tokens via the GPT-4 API, which is still in beta.
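For a rough sense of how these token and word counts relate in practice, here is a minimal sketch that estimates whether a document fits in a 100,000-token window. It uses the tiktoken library (OpenAI's tokenizer) purely as an approximation, since Claude's own tokenizer will produce somewhat different counts, and the file path at the end is a hypothetical placeholder.

```python
# Rough sketch: estimate whether a document fits in a 100,000-token
# context window. tiktoken is OpenAI's tokenizer, used here only as an
# approximation; Claude's tokenizer will give somewhat different counts.
import tiktoken

CONTEXT_WINDOW = 100_000  # Claude's expanded context window, in tokens

def fits_in_context(text: str, limit: int = CONTEXT_WINDOW) -> bool:
    """Return True if the text's approximate token count is within the limit."""
    enc = tiktoken.get_encoding("cl100k_base")
    n_tokens = len(enc.encode(text))
    n_words = len(text.split())
    print(f"~{n_tokens} tokens for {n_words} words "
          f"(≈ {n_words / max(n_tokens, 1):.2f} words per token)")
    return n_tokens <= limit

# A 75,000-word book at roughly 0.75 words per token lands near the
# 100,000-token limit; "book.txt" is a placeholder path.
with open("book.txt", encoding="utf-8") as f:
    print(fits_in_context(f.read()))
```

In practice the words-per-token ratio varies with the language and style of the text, which is why the 100,000 tokens ≈ 75,000 words figure is only an approximation.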