Tag: plugin
-
Simon Willison’s Weblog: GPT-4.1: Three new million token input models from OpenAI, including their cheapest model yet
Source URL: https://simonwillison.net/2025/Apr/14/gpt-4-1/
Source: Simon Willison’s Weblog
Title: GPT-4.1: Three new million token input models from OpenAI, including their cheapest model yet
Feedly Summary: OpenAI introduced three new models this morning: GPT-4.1, GPT-4.1 mini and GPT-4.1 nano. These are API-only models right now, not available through the ChatGPT interface (though you can try them out…
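These models are API-only for the moment, so the quickest way to try one is through the OpenAI SDK. A minimal sketch in Python, assuming the published model id gpt-4.1-mini, an OPENAI_API_KEY in the environment, and an illustrative prompt:

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # gpt-4.1-mini is one of the three new models; gpt-4.1 and gpt-4.1-nano are the others
    response = client.chat.completions.create(
        model="gpt-4.1-mini",
        messages=[
            {"role": "user", "content": "In one sentence, what is a million-token context window useful for?"},
        ],
    )
    print(response.choices[0].message.content)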
-
Simon Willison’s Weblog: Using LLMs as the first line of support in Open Source
Source URL: https://simonwillison.net/2025/Apr/14/llms-as-the-first-line-of-support/
Source: Simon Willison’s Weblog
Title: Using LLMs as the first line of support in Open Source
Feedly Summary: Using LLMs as the first line of support in Open Source From reading the title I was nervous that this might involve automating the initial response to a user support query in an issue…
-
Cloud Blog: 229 things we announced at Google Cloud Next 25 – a recap
Source URL: https://cloud.google.com/blog/topics/google-cloud-next/google-cloud-next-2025-wrap-up/
Source: Cloud Blog
Title: 229 things we announced at Google Cloud Next 25 – a recap
Feedly Summary: Google Cloud Next 25 took place this week and we’re all still buzzing! It was a jam-packed week in Las Vegas complete with interactive experiences, including more than 10 keynotes and spotlights, 700 sessions,…
-
Simon Willison’s Weblog: llm-fragments-rust
Source URL: https://simonwillison.net/2025/Apr/11/llm-fragments-rust/#atom-everything
Source: Simon Willison’s Weblog
Title: llm-fragments-rust
Feedly Summary: llm-fragments-rust Inspired by Filippo Valsorda’s llm-fragments-go, Francois Garillot created llm-fragments-rust, an LLM fragments plugin that lets you pull documentation for any Rust crate directly into a prompt to LLM. I really like this example, which uses two fragments to load documentation for two crates…
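The example itself is truncated above, so here is a hedged sketch of the same idea, driving the llm CLI from Python after llm install llm-fragments-rust. The rust:<crate> fragment syntax and the crate names are assumptions for illustration; check the plugin's README for the exact form:

    import subprocess

    # Assumed fragment prefix "rust:<crate>"; two fragments pull docs for two crates,
    # mirroring the example described in the post.
    result = subprocess.run(
        [
            "llm",
            "-f", "rust:serde",
            "-f", "rust:serde_json",
            "Show how to serialize a struct to JSON and read it back",
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    print(result.stdout)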
-
Simon Willison’s Weblog: llm-fragments-go
Source URL: https://simonwillison.net/2025/Apr/10/llm-fragments-go/#atom-everything
Source: Simon Willison’s Weblog
Title: llm-fragments-go
Feedly Summary: llm-fragments-go Filippo Valsorda released the first plugin by someone other than me that uses LLM’s new register_fragment_loaders() plugin hook I announced the other day. Install with llm install llm-fragments-go and then: You can feed the docs of a Go package into LLM using the…
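The summary cuts off before the usage example, so here is a comparable sketch, again calling the llm CLI from Python after llm install llm-fragments-go. The go:<import-path> fragment syntax and the package path are assumptions; consult the plugin's documentation:

    import subprocess

    # Assumed fragment prefix "go:<import-path>"; the package path is illustrative.
    result = subprocess.run(
        [
            "llm",
            "-f", "go:golang.org/x/mod/sumdb/note",
            "Write a short program that verifies a signed note using this package",
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    print(result.stdout)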
-
Cloud Blog: Introducing Firebase Studio and agentic developer tools to build with Gemini
Source URL: https://cloud.google.com/blog/products/application-development/firebase-studio-lets-you-build-full-stack-ai-apps-with-gemini/
Source: Cloud Blog
Title: Introducing Firebase Studio and agentic developer tools to build with Gemini
Feedly Summary: Millions of developers use Firebase to engage their users, powering over 70 billion instances of apps every day, everywhere — from mobile devices and web browsers, to embedded platforms and agentic experiences. But full-stack development…
-
Cloud Blog: What’s new with Google Cloud networking
Source URL: https://cloud.google.com/blog/products/networking/networking-innovations-at-google-cloud-next25/
Source: Cloud Blog
Title: What’s new with Google Cloud networking
Feedly Summary: The AI era is here, fundamentally reshaping industries and demanding unprecedented network capabilities for training, inference and serving AI models. To power this transformation, organizations need global networking solutions that can handle massive capacity, deliver seamless connectivity, and provide robust security. …
-
Simon Willison’s Weblog: Long context support in LLM 0.24 using fragments and template plugins
Source URL: https://simonwillison.net/2025/Apr/7/long-context-llm/#atom-everything
Source: Simon Willison’s Weblog
Title: Long context support in LLM 0.24 using fragments and template plugins
Feedly Summary: LLM 0.24 is now available with new features to help take advantage of the increasingly long input context supported by modern LLMs. (LLM is my command-line tool and Python library for interacting with LLMs,…
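The headline feature is the -f/--fragment option, which lets you attach long context (local files, URLs, or plugin-provided fragments) to a prompt without pasting it inline. A small sketch driving the CLI from Python; the README URL is illustrative:

    import subprocess

    # Fragments can be file paths or URLs; per the release notes they are stored
    # de-duplicated, so repeated long context does not bloat the prompt log.
    result = subprocess.run(
        [
            "llm",
            "-f", "https://raw.githubusercontent.com/simonw/llm/main/README.md",
            "What plugin hooks does this tool expose?",
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    print(result.stdout)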
-
Simon Willison’s Weblog: Note on 5th April 2025
Source URL: https://simonwillison.net/2025/Apr/5/llama-4-notes/#atom-everything
Source: Simon Willison’s Weblog
Title: Note on 5th April 2025
Feedly Summary: Dropping a model release as significant as Llama 4 on a weekend is plain unfair! So far the best place to learn about the new model family is this post on the Meta AI blog. You can try them out…
-
Simon Willison’s Weblog: smartfunc
Source URL: https://simonwillison.net/2025/Apr/3/smartfunc/
Source: Simon Willison’s Weblog
Title: smartfunc
Feedly Summary: smartfunc Vincent D. Warmerdam built this ingenious wrapper around my LLM Python library which lets you build LLM wrapper functions using a decorator and a docstring: from smartfunc import backend @backend("gpt-4o") def generate_summary(text: str): """Generate a summary of the following text: """ pass summary…
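The code in that summary was flattened into one line by the feed. A reconstructed sketch of what it appears to show; the {{ text }} placeholder and the final call are guesses at the truncated parts, based on smartfunc treating the docstring as a prompt template over the function's arguments:

    from smartfunc import backend

    # The decorator binds the function to a model; the docstring serves as the prompt template.
    @backend("gpt-4o")
    def generate_summary(text: str):
        """Generate a summary of the following text: {{ text }}"""
        pass

    # Assumed usage, completing the truncated "summary…" line:
    summary = generate_summary("some long document text")
    print(summary)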