Source URL: https://simonwillison.net/2025/May/29/llm-tools-exa/
Source: Simon Willison’s Weblog
Title: llm-tools-exa
Feedly Summary: llm-tools-exa
When I shipped LLM 0.26 yesterday one of the things I was most excited about was seeing what new tool plugins people would build for it.
Dan Turkel’s llm-tools-exa is one of the first. It adds web search to LLM using Exa (previously), a relatively new search engine offering that rare thing, an API for search. They have a free preview; you can grab an API key here.
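To get a feel for what the plugin is wrapping, here is a minimal sketch of calling Exa's search API directly from Python. It assumes the exa_py SDK's Exa client and its search_and_contents() method, plus an API key in the EXA_API_KEY environment variable; the plugin's own internals may differ.

# Illustrative direct call to the Exa search API (not the plugin's own code)
# Assumes the exa_py SDK is installed and EXA_API_KEY is set
import os
from exa_py import Exa

exa = Exa(api_key=os.environ["EXA_API_KEY"])

# Ask for a handful of results with the page text included
results = exa.search_and_contents(
    "What's in LLM 0.26?",
    num_results=3,
    text=True,
)
for result in results.results:
    print(result.title, result.url)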
I’m getting pretty great results! I tried it out like this:
llm install llm-tools-exa
llm keys set exa
# Pasted API key here
llm -T web_search "What's in LLM 0.26?"
Here’s the full answer – it started like this:
LLM 0.26 was released on May 27, 2025, and the biggest new feature in this version is official support for tools. Here’s a summary of what’s new and notable in LLM 0.26:
LLM can now run tools. You can grant LLMs from OpenAI, Anthropic, Gemini, and local models access to any tool you represent as a Python function.
Tool plugins are introduced, allowing installation of plugins that add new capabilities to any model you use.
Tools can be installed from plugins and loaded by name with the --tool/-T option.
[…]
Exa provided 21,000 tokens of search results, including what looks to be a full copy of my blog entry and the release notes for LLM.
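For a sense of what "tools as Python functions" looks like in practice, here is a minimal sketch using LLM's Python API, following the interface described in the 0.26 release notes (the multiply function and model ID are illustrative):

import llm

def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y

# Any plain Python function passed via tools= can be called by the model
model = llm.get_model("gpt-4.1-mini")
response = model.chain(
    "What is 34234 * 213345?",
    tools=[multiply],
)
print(response.text())

model.chain() is used rather than prompt() so that the tool call and the model's follow-up answer are handled as one chained exchange.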
Tags: llm, generative-ai, llm-tool-use, apis, search, ai, llms
AI Summary and Description: Yes
Summary: The text discusses the launch of LLM 0.26, highlighting its new feature that allows integration with tool plugins, particularly focusing on the addition of web search capabilities through the llm-tools-exa plugin. This development is significant for professionals in AI and cloud computing as it enhances the functionality and usability of language models.
Detailed Description: The release of LLM 0.26 introduces noteworthy advancements that expand the capabilities of Large Language Models (LLMs). Tool plugins let models call out to additional functionality, such as web search, extending what they can do in practical applications.
Highlights of LLM 0.26:
– **Tool Integration**:
– The most anticipated feature is the ability of LLMs to run external tools introduced through plugins.
– LLMs from providers such as OpenAI, Anthropic, Gemini, and local models can now access any tool represented as a Python function.
– **Plugin System**:
– Tool plugins can be installed to add new capabilities to any model in use.
– Installed tools can be loaded by name on the command line with the --tool/-T option (a sketch of such a plugin appears after this list).
– **Web Search Capability**:
– The introduction of llm-tools-exa exemplifies the potential uses of the plugin system, bringing a web search functionality directly to LLMs.
– This tool utilizes Exa, a new search engine that provides an API for conducting searches.
– **User Experience**:
– A short command-line example walks through installation and usage, making the new features straightforward for developers to adopt.
– **Performance**:
– The author reports very good results: a single query returned roughly 21,000 tokens of search results, including a full copy of the relevant blog entry and release notes, demonstrating the tool’s ability to pull substantial relevant context into the model.
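As a rough illustration of the plugin mechanism described above, here is a sketch of what a tiny tool plugin module might look like, assuming the register_tools() plugin hook added in LLM 0.26 (the shout tool is made up for the example):

# hypothetical_tools.py - an illustrative tool plugin module
import llm

def shout(text: str) -> str:
    """Return the input text in upper case."""
    return text.upper()

@llm.hookimpl
def register_tools(register):
    # Each registered function becomes a named tool,
    # loadable on the command line with --tool/-T shout
    register(shout)

Once packaged and installed as a plugin, the tool would be loaded by name in the same way llm-tools-exa exposes web_search, e.g. llm -T shout "say hello loudly".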
These advancements in LLM 0.26 open up new possibilities for customizing models and equipping them with practical functionality for real-world applications. This is particularly relevant for professionals working in AI development and cloud services, as well as those responsible for security and compliance, since such tools can help improve operational workflows.