Tag: context window size

  • Simon Willison’s Weblog: Kimi-K2-Instruct-0905

    Source URL: https://simonwillison.net/2025/Sep/6/kimi-k2-instruct-0905/#atom-everything
    Source: Simon Willison’s Weblog
    Title: Kimi-K2-Instruct-0905
    Feedly Summary: Kimi-K2-Instruct-0905: a new not-quite-MIT-licensed model from the Chinese lab Moonshot AI, a follow-up to the highly regarded Kimi-K2 model they released in July. This one is an incremental improvement – I’ve seen it referred to online as “Kimi K-2.1”. It scores a little higher on a…

  • Hacker News: 3x Improvement with Infinite Retrieval: Attention Enhanced LLMs in Long-Context

    Source URL: https://arxiv.org/abs/2502.12962
    Source: Hacker News
    Title: 3x Improvement with Infinite Retrieval: Attention Enhanced LLMs in Long-Context
    Summary: The text discusses a novel approach called InfiniRetri, which enhances the long-context processing capabilities of Large Language Models (LLMs) by using their own attention mechanisms for improved retrieval accuracy. This…
    (A toy sketch of this attention-guided retrieval idea appears after this list.)

  • Hacker News: Comparison of Claude Sonnet 3.5, GPT-4o, o1, and Gemini 1.5 Pro for coding

    Source URL: https://www.qodo.ai/blog/comparison-of-claude-sonnet-3-5-gpt-4o-o1-and-gemini-1-5-pro-for-coding/
    Source: Hacker News
    Title: Comparison of Claude Sonnet 3.5, GPT-4o, o1, and Gemini 1.5 Pro for coding
    Summary: This text provides a comprehensive analysis of various AI models, particularly focusing on recent advancements in LLMs (Large Language Models) for coding tasks. It assesses the…
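
The InfiniRetri entry above describes using an LLM’s own attention, rather than an external retriever, to decide which parts of a very long context matter for a query. As a rough illustration of that general idea only – not the paper’s actual algorithm, which the excerpt does not detail – here is a minimal, self-contained sketch in which a toy bag-of-words “attention” score from the query selects which chunks of a long document to retain. Every function and name below is a placeholder invented for this sketch, not an API from the paper or any library.

```python
# Toy sketch only: score chunks of a long document by a crude proxy for
# query->chunk attention, then keep the highest-scoring chunks as the
# retained context. The real method presumably uses actual attention
# weights inside the transformer; this is just the selection step.
import numpy as np


def embed(tokens, vocab):
    """Placeholder embedding: one-hot bag-of-words counts (stand-in for model states)."""
    vec = np.zeros(len(vocab))
    for t in tokens:
        if t in vocab:
            vec[vocab[t]] += 1.0
    return vec


def attention_weights(query_vec, chunk_vecs):
    """Softmax over dot products: a crude proxy for attention mass from the query."""
    logits = np.array([float(query_vec @ c) for c in chunk_vecs])
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()


def attention_retrieve(document, query, chunk_size=20, keep=2):
    """Split the document into fixed-size chunks, score each chunk by the
    proxy attention it draws from the query, and return the top `keep`
    chunks in document order as the retained context."""
    tokens = document.split()
    chunks = [tokens[i:i + chunk_size] for i in range(0, len(tokens), chunk_size)]
    vocab = {t: i for i, t in enumerate(sorted(set(tokens) | set(query.split())))}
    q_vec = embed(query.split(), vocab)
    chunk_vecs = [embed(c, vocab) for c in chunks]
    weights = attention_weights(q_vec, chunk_vecs)
    top = sorted(np.argsort(weights)[::-1][:keep])
    return [" ".join(chunks[i]) for i in top]


if __name__ == "__main__":
    # The relevant sentence is buried in the middle of filler tokens;
    # the chunk containing it should attract the most proxy attention.
    doc = "alpha " * 30 + "the secret passphrase is swordfish " + "beta " * 30
    print(attention_retrieve(doc, "what is the secret passphrase"))
```

In an attention-based setup like the one the abstract describes, the scores would come from the model’s own attention over its input while processing long contexts segment by segment; the sketch only mirrors the “keep what the query attends to” selection under that assumption.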