Source URL: https://simonwillison.net/2025/Apr/3/smartfunc/
Source: Simon Willison’s Weblog
Title: smartfunc
Vincent D. Warmerdam built this ingenious wrapper around my LLM Python library which lets you build LLM wrapper functions using a decorator and a docstring:
```python
from smartfunc import backend

@backend("gpt-4o")
def generate_summary(text: str):
    """Generate a summary of the following text: """
    pass

summary = generate_summary(long_text)
```
It works with LLM plugins so the same pattern should work against Gemini, Claude and hundreds of others, including local models.
It integrates with more recent LLM features too, including async support and schemas, by introspecting the function signature:
```python
from pydantic import BaseModel
from smartfunc import async_backend

class Summary(BaseModel):
    summary: str
    pros: list[str]
    cons: list[str]

@async_backend("gpt-4o-mini")
async def generate_poke_desc(text: str) -> Summary:
    "Describe the following pokemon: "
    pass

pokemon = await generate_poke_desc("pikachu")
```
Vincent also recorded a 12-minute video walking through the implementation and showing how it uses Pydantic, Python's inspect module and the typing.get_type_hints() function.
Tags: llm, python, generative-ai, ai, llms
Summary: The text introduces a Python library called “smartfunc,” developed by Vincent D. Warmerdam, which facilitates the creation of wrapper functions for large language models (LLMs) through decorators. This approach enhances LLM integration and functionality, particularly with features like asynchronous support.
Detailed Description: The provided content highlights a novel approach to interfacing with large language models through a Python library named “smartfunc.” This library allows developers to create LLM wrapper functions using a decorator pattern, which simplifies the process of invoking LLM capabilities while maintaining clean and organized code.
Key points from the text include:
– **LLM Wrapper Functionality**: The smartfunc library provides a way to wrap functions around LLMs, allowing for easier and more seamless interaction with LLM capabilities in Python applications.
– **Integration with Various LLMs**: Because it builds on LLM's plugin system, the wrapper works with many models, including GPT-4o, Gemini, Claude, and locally hosted models, showing versatility in application.
– **Asynchronous Support**: The library includes support for asynchronous programming, crucial for handling tasks that may take variable amounts of time, thereby improving overall application responsiveness.
– **Function Signature Introspection**: Utilizing Python’s inspect module, the library can handle function signatures dynamically, allowing for structured outputs (like summaries with pros and cons) that are defined using Pydantic models.
– **Practical Demonstration**: Vincent recorded a 12-minute video walkthrough detailing the implementation and features of the smartfunc library, contributing to community learning and showcasing usage scenarios.
– **Use Cases**: Examples of use cases such as generating summaries from text and providing descriptive outputs for objects (like Pokémon descriptions) indicate its practical applications in generative AI and LLM contexts.
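As a sketch of the structured-output piece described above: a Pydantic (v2) model used as a return annotation can be turned into a JSON schema that a library could pass to the model when requesting structured output. The `Summary` model mirrors the one in the post; the schema-extraction step is illustrative, not smartfunc's verbatim implementation.

```python
from typing import get_type_hints

from pydantic import BaseModel

class Summary(BaseModel):
    summary: str
    pros: list[str]
    cons: list[str]

def generate_poke_desc(text: str) -> Summary:
    "Describe the following pokemon: "

# Read the return annotation off the signature, then derive a JSON schema
# from it; a wrapper library could send this schema along with the prompt.
return_type = get_type_hints(generate_poke_desc)["return"]
schema = return_type.model_json_schema()
```

`model_json_schema()` is Pydantic v2's API; in Pydantic v1 the equivalent call is `Summary.schema()`.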
In conclusion, the smartfunc library represents an approach that can improve productivity for developers working with LLMs, helping to accelerate the development of generative AI applications by reducing the boilerplate needed to call a model and parse its output.