Simon Willison’s Weblog: LLM 0.26a0 adds support for tools!

Source URL: https://simonwillison.net/2025/May/14/llm-adds-support-for-tools/#atom-everything
Source: Simon Willison’s Weblog
Title: LLM 0.26a0 adds support for tools!

Feedly Summary: LLM 0.26a0 adds support for tools!
It’s only an alpha so I’m not going to promote this extensively yet, but my LLM project just grew a feature I’ve been working towards for nearly two years now: tool support!
I’m presenting a workshop about Building software on top of Large Language Models at PyCon US tomorrow and this was the one feature I really needed to pull everything else together.
Tools can be used from the command-line like this (inspired by sqlite-utils --functions):
llm --functions '
def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y
' 'what is 34234 * 213345' -m o4-mini
Or from the Python library like this:
import llm

def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y

model = llm.get_model("gpt-4.1-mini")
response = model.chain(
    "What is 34234 * 213345?",
    tools=[multiply]
)
print(response.text())
There’s also a new plugin hook so plugins can register tools that can then be referenced by name using llm --tool name_of_tool "prompt".
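For illustration, here is a minimal sketch of what such a plugin might look like. The register_tools hook name is an assumption modeled on LLM's existing register_commands and register_models plugin hooks, and the upper tool is purely hypothetical:

# hypothetical_tools_plugin.py
# Sketch only: register_tools is assumed by analogy with LLM's other plugin hooks.
import llm


def upper(text: str) -> str:
    """Convert text to upper case."""
    return text.upper()


@llm.hookimpl
def register_tools(register):
    # Makes the function available as a named tool, e.g. llm --tool upper "shout this"
    register(upper)

Packaged and installed into the same environment (for example with llm install -e .), a plugin along these lines could then expose its tools by name to the CLI as described above.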
There’s still a bunch I want to do before including this in a stable release, most notably adding support for Python asyncio. It’s a pretty exciting start though!
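As a purely speculative sketch of where that could go, the following assumes an async chain() API that mirrors the synchronous example above; nothing here is present in the 0.26a0 alpha:

import asyncio

import llm


def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y


async def main():
    # Assumption: async models gain a chain() with the same shape as the sync API.
    model = llm.get_async_model("gpt-4.1-mini")
    response = model.chain(
        "What is 34234 * 213345?",
        tools=[multiply],
    )
    print(await response.text())


asyncio.run(main())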
Tags: llm, generative-ai, projects, llm-tool-use, ai, llms

AI Summary and Description: Yes

Summary: The text discusses a new feature in the LLM project (a command-line tool and Python library for working with large language models): tool support, introduced in the 0.26a0 alpha release. This enables the integration of plugins and user-defined function calls within LLM-based applications, which is significant for developers working on AI-based software.

Detailed Description: The text outlines recent advancements in the LLM project, specifically the alpha version 0.26a0, which introduces tool support: an addition that lets language models invoke user-defined functions. This development is particularly relevant to professionals in AI, software development, and generative AI security who are focused on leveraging LLMs for varied applications.

* Key Points:
- **Tool Support Feature**: Functions can now be exposed to the model as tools from both the command line and the Python library, allowing users to integrate custom functions seamlessly.
- **Workshop Presentation**: The author is presenting a workshop on building software on top of LLMs at PyCon US, which indicates growing interest in and real-world application of LLMs.
- **Example Usage**: The provided examples demonstrate both command-line and Python integration for a simple mathematical operation, showcasing the ease of incorporating tools.
- **Plugin Hook**: A new plugin hook allows tools to be registered and referenced by name, indicating a move towards expandability and flexibility in LLM applications.
- **Future Development**: The author hints at future enhancements, most notably support for Python asyncio, which matters for LLM applications running in concurrent execution environments.

In summary, the introduction of tool support in the LLM project has implications for developers and security professionals who are actively working with AI and looking to build on LLM frameworks for various applications. It opens the door to more interactive and functional AI-driven solutions.