Tag: riding

  • Simon Willison’s Weblog: gpt-image-1-mini

    Source URL: https://simonwillison.net/2025/Oct/6/gpt-image-1-mini/#atom-everything
    Summary: OpenAI released a new image model today: gpt-image-1-mini, which they describe as “A smaller image generation model that’s 80% less expensive than the large model.” They released it very quietly – I didn’t hear about this in the DevDay keynote but I later…
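
    To try it, here is a minimal sketch using the OpenAI Python SDK's image generation endpoint. The model name comes straight from the announcement; the prompt, size, and output handling are illustrative assumptions.

      # Minimal sketch: generate an image with gpt-image-1-mini via the OpenAI Python SDK.
      # Assumes OPENAI_API_KEY is set; the size parameter is an illustrative choice.
      import base64
      from openai import OpenAI

      client = OpenAI()
      result = client.images.generate(
          model="gpt-image-1-mini",
          prompt="A pelican riding a bicycle",
          size="1024x1024",
      )
      # The gpt-image-1 family returns base64-encoded image data.
      with open("pelican.png", "wb") as f:
          f.write(base64.b64decode(result.data[0].b64_json))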

  • Simon Willison’s Weblog: GPT-5 pro

    Source URL: https://simonwillison.net/2025/Oct/6/gpt-5-pro/
    Summary: Here’s OpenAI’s model documentation for their GPT-5 pro model, released to their API today at their DevDay event. It has similar base characteristics to GPT-5: both share a September 30, 2024 knowledge cutoff and a 400,000-token context limit. GPT-5 pro has maximum…
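
    A minimal sketch of calling it through the OpenAI Responses API follows; the model identifier "gpt-5-pro" is assumed from the post title, so check the linked model documentation for the exact ID before relying on it.

      # Minimal sketch: prompt GPT-5 pro via the OpenAI Responses API.
      # Assumes OPENAI_API_KEY is set; the model ID is an assumption based on the post.
      from openai import OpenAI

      client = OpenAI()
      response = client.responses.create(
          model="gpt-5-pro",
          input="Summarize the trade-offs between GPT-5 and GPT-5 pro in two sentences.",
      )
      print(response.output_text)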

  • Simon Willison’s Weblog: Two more Chinese pelicans

    Source URL: https://simonwillison.net/2025/Oct/1/two-pelicans/#atom-everything
    Summary: Two new models from Chinese AI labs in the past few days. I tried them both out using llm-openrouter: DeepSeek-V3.2-Exp from DeepSeek. Announcement, Tech Report, Hugging Face (690GB, MIT license). As an intermediate step toward our next-generation architecture, V3.2-Exp builds upon…
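
    A rough sketch of trying one of these via the llm-openrouter plugin's Python API, assuming pip install llm llm-openrouter and an OpenRouter key configured with llm keys set openrouter; the OpenRouter model slug below is an assumption and worth checking against their catalogue.

      # Rough sketch: prompt DeepSeek-V3.2-Exp through the llm-openrouter plugin.
      # The model slug is an assumption; verify it against the OpenRouter model list.
      import llm

      model = llm.get_model("openrouter/deepseek/deepseek-v3.2-exp")
      response = model.prompt("Generate an SVG of a pelican riding a bicycle")
      print(response.text())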

  • Simon Willison’s Weblog: Claude Sonnet 4.5 is probably the "best coding model in the world" (at least for now)

    Source URL: https://simonwillison.net/2025/Sep/29/claude-sonnet-4-5/
    Summary: Anthropic released Claude Sonnet 4.5 today, with a very bold set of claims: Claude Sonnet 4.5 is the best coding model in the world. It’s the strongest model for…
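
    For reference, a minimal sketch of calling it with the Anthropic Python SDK, assuming ANTHROPIC_API_KEY is set; Anthropic publishes dated model IDs, so the short alias used here is an assumption.

      # Minimal sketch: call Claude Sonnet 4.5 with the Anthropic Python SDK.
      # The model string is an assumption (Anthropic also publishes dated IDs).
      import anthropic

      client = anthropic.Anthropic()
      message = client.messages.create(
          model="claude-sonnet-4-5",
          max_tokens=1024,
          messages=[{"role": "user", "content": "Write a Python function that merges two sorted lists."}],
      )
      print(message.content[0].text)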

  • Simon Willison’s Weblog: Improved Gemini 2.5 Flash and Flash-Lite

    Source URL: https://simonwillison.net/2025/Sep/25/improved-gemini-25-flash-and-flash-lite/#atom-everything
    Summary: Two new preview models from Google – updates to their fast and inexpensive Flash and Flash-Lite families: The latest version of Gemini 2.5 Flash-Lite was trained and built based on three key…
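
    A minimal sketch of trying the updated Flash-Lite with the google-genai Python SDK, assuming GEMINI_API_KEY is set; the exact dated preview model name may differ from the short name used here.

      # Minimal sketch: prompt Gemini 2.5 Flash-Lite with the google-genai SDK.
      # The model name is an assumption; the preview may use a dated identifier.
      from google import genai

      client = genai.Client()
      response = client.models.generate_content(
          model="gemini-2.5-flash-lite",
          contents="Explain the difference between Flash and Flash-Lite in one sentence.",
      )
      print(response.text)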

  • Simon Willison’s Weblog: GPT-5-Codex

    Source URL: https://simonwillison.net/2025/Sep/23/gpt-5-codex/#atom-everything
    Summary: OpenAI half-released this model earlier this month, adding it to their Codex CLI tool but not their API. Today they’ve fixed that – the new model can now be accessed as gpt-5-codex. It’s priced the same as regular GPT-5: $1.25/million input tokens, $10/million…
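
    A quick back-of-envelope cost sketch at those rates: the $1.25/million input figure is quoted in the summary, while the $10/million figure is presumably the output-token rate (the summary is truncated, so treat that as an assumption).

      # Back-of-envelope cost estimate for a gpt-5-codex call at the quoted prices.
      INPUT_PER_M = 1.25    # USD per million input tokens (from the post)
      OUTPUT_PER_M = 10.00  # USD per million output tokens (assumed; summary is truncated)

      def estimate_cost(input_tokens: int, output_tokens: int) -> float:
          """Estimated USD cost for a single call."""
          return (input_tokens * INPUT_PER_M + output_tokens * OUTPUT_PER_M) / 1_000_000

      # e.g. a medium-sized refactoring prompt with a long response:
      print(f"${estimate_cost(12_000, 3_000):.4f}")  # -> $0.0450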

  • Simon Willison’s Weblog: Grok 4 Fast

    Source URL: https://simonwillison.net/2025/Sep/20/grok-4-fast/
    Summary: New hosted reasoning model from xAI that’s designed to be fast and extremely competitive on price. It has a 2 million token context window and “was trained end-to-end with tool-use reinforcement learning”. It’s priced at $0.20/million input tokens and…
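
    xAI's API is OpenAI-compatible, so a rough sketch using the OpenAI Python SDK pointed at x.ai looks like this; the base URL and the model identifier are both assumptions to verify against xAI's documentation.

      # Rough sketch: call Grok 4 Fast through xAI's OpenAI-compatible API.
      # Base URL and model name are assumptions; check xAI's docs for exact values.
      from openai import OpenAI

      client = OpenAI(
          api_key="YOUR_XAI_API_KEY",      # placeholder
          base_url="https://api.x.ai/v1",  # assumed endpoint
      )
      response = client.chat.completions.create(
          model="grok-4-fast",             # assumed identifier
          messages=[{"role": "user", "content": "Give me three facts about pelicans."}],
      )
      print(response.choices[0].message.content)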

  • Simon Willison’s Weblog: GPT‑5-Codex and upgrades to Codex

    Source URL: https://simonwillison.net/2025/Sep/15/gpt-5-codex/#atom-everything
    Summary: OpenAI half-released a new model today: GPT‑5-Codex, a fine-tuned GPT-5 variant explicitly designed for their various AI-assisted programming tools. I say half-released because it’s not yet available via their API, but they “plan to make…

  • Simon Willison’s Weblog: Kimi-K2-Instruct-0905

    Source URL: https://simonwillison.net/2025/Sep/6/kimi-k2-instruct-0905/#atom-everything
    Summary: New not-quite-MIT-licensed model from Chinese AI lab Moonshot AI, a follow-up to the highly regarded Kimi-K2 model they released in July. This one is an incremental improvement – I’ve seen it referred to online as “Kimi K-2.1”. It scores a little higher on a…
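
    A rough sketch of trying it through a hosted endpoint via the llm-openrouter plugin again; the OpenRouter slug used here is an assumption and should be checked before use.

      # Rough sketch: prompt Kimi-K2-Instruct-0905 via a hosted endpoint (OpenRouter).
      # The model slug is an assumption; confirm it against the OpenRouter catalogue.
      import llm

      model = llm.get_model("openrouter/moonshotai/kimi-k2-0905")
      response = model.prompt("Summarize what changed from the original Kimi-K2 release.")
      print(response.text())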