Simon Willison’s Weblog: Locally AI

Source URL: https://simonwillison.net/2025/Sep/21/locally-ai/
Source: Simon Willison’s Weblog
Title: Locally AI

Feedly Summary: Locally AI
Handy new iOS app by Adrien Grondin for running local LLMs on your phone. It just added support for the new iOS 26 Apple Foundation model, so you can install this app and instantly start a conversation with that model without any additional download.
The app can also run a variety of other models using MLX, including members of the Gemma, Llama 3.2, and Qwen families.
Tags: apple, ios, ai, generative-ai, local-llms, llms, mlx

AI Summary and Description: Yes

Summary: The text discusses a new iOS app called “Locally AI” created by Adrien Grondin, which allows users to run local large language models (LLMs) on their smartphones. It notably supports Apple’s new iOS 26 Foundation model and various other models, highlighting its relevance in the landscape of AI and mobile applications.

Detailed Description: The announcement of the “Locally AI” app presents considerable implications for AI accessibility and local computation capabilities. Here are the key points regarding this development:

– **Functionality**: The app enables users to engage with large language models directly on their iOS devices, eliminating the need for cloud-based processing and thereby improving both privacy and responsiveness.

– **Model Support**: It supports a range of models (see the Swift sketch after this list), including:
  – Apple’s iOS 26 Foundation model
  – Models from the Gemma, Llama 3.2, and Qwen families via Apple’s MLX framework

– **Privacy and Security**: Running models locally on a device can enhance user privacy as data does not need to be sent to external servers for processing. This emphasizes the growing focus on local AI solutions that prioritize user data security.

– **User Experience**: The integration of models directly into a mobile app offers seamless interaction and could appeal to developers, researchers, and AI enthusiasts looking for easier access to advanced machine learning capabilities on personal devices.

– **Industry Implications**: This development reflects a broader trend in AI toward decentralization and local processing, which has significant implications for security—reducing the risks associated with data transmission and cloud storage.
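
The “no additional download” experience for the Apple model comes from iOS 26 shipping the Foundation model with the operating system and exposing it to apps through the FoundationModels framework. Below is a minimal Swift sketch of what prompting that on-device model looks like, assuming the publicly documented `SystemLanguageModel` and `LanguageModelSession` APIs; it is illustrative only and not code from Locally AI itself.

```swift
import FoundationModels

/// Minimal sketch of prompting the on-device Apple Foundation model.
/// Assumes the iOS 26 FoundationModels framework (`SystemLanguageModel`,
/// `LanguageModelSession`); this is a sketch, not Locally AI's implementation.
enum OnDeviceModelError: Error {
    case modelUnavailable
}

func askOnDeviceModel(_ prompt: String) async throws -> String {
    // The system model can be unavailable (e.g. unsupported hardware or
    // Apple Intelligence turned off), so check before prompting.
    guard case .available = SystemLanguageModel.default.availability else {
        throw OnDeviceModelError.modelUnavailable
    }
    // A session holds conversational context across multiple turns.
    let session = LanguageModelSession()
    let response = try await session.respond(to: prompt)
    return response.content
}
```

The MLX-based models (Gemma, Llama 3.2, Qwen), by contrast, are open-weight models whose files must be downloaded to the device once before they can run locally.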

Overall, the “Locally AI” app by Adrien Grondin not only represents an innovative step in mobile AI applications but also raises important considerations regarding privacy, security, and the future of AI deployment in consumer technology.