Source URL: https://simonwillison.net/2025/Jul/18/how-to-run-an-llm-on-your-laptop/
Source: Simon Willison’s Weblog
Title: How to run an LLM on your laptop
I talked to Grace Huckins for this piece from MIT Technology Review on running local models. Apparently she enjoyed my dystopian backup plan!
Simon Willison has a plan for the end of the world. It’s a USB stick, onto which he has loaded a couple of his favorite open-weight LLMs—models that have been shared publicly by their creators and that can, in principle, be downloaded and run with local hardware. If human civilization should ever collapse, Willison plans to use all the knowledge encoded in their billions of parameters for help. “It’s like having a weird, condensed, faulty version of Wikipedia, so I can help reboot society with the help of my little USB stick,” he says.
The article suggests Ollama or LM Studio for laptops, and new-to-me LLM Farm for the iPhone:
My beat-up iPhone 12 was able to run Meta’s Llama 3.2 1B using an app called LLM Farm. It’s not a particularly good model—it very quickly goes off into bizarre tangents and hallucinates constantly—but trying to coax something so chaotic toward usability can be entertaining.
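For the laptop route the article suggests, Ollama runs a local server that exposes a REST API on port 11434, so a downloaded model can be queried from a few lines of Python's standard library. A minimal sketch, assuming Ollama is installed and serving; the `llama3.2:1b` model tag is an assumption based on the Llama 3.2 1B model named above:

```python
import json
import urllib.request

# Default address of a locally running Ollama server
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming generate request for Ollama's REST API."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    """POST the prompt to the local Ollama server and return the model's reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama pull llama3.2:1b` and a running server):
# print(ask("llama3.2:1b", "Name three ways to purify water."))
```

Everything here talks to localhost, so no prompt or response leaves the machine — the autonomy argument for running models locally in the first place.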
Tags: ai, generative-ai, local-llms, llms, ollama, lm-studio, press-quotes
AI Summary and Description: Yes
Summary: The text discusses running large language models (LLMs) locally on personal hardware, highlighting a creative approach: storing open-weight models on a USB stick for potential future use. It offers insights into practical applications of LLMs on laptops and smartphones, merging concepts of AI and infrastructure flexibility.
Detailed Description: The text serves as both a practical guide and a thought-provoking commentary on the potential utility of language models in hypothetical scenarios where society may need to be rebooted. Key points include:
– **Local LLMs:** The article introduces the concept of running language models directly on personal devices, such as laptops and smartphones. This offers accessibility and autonomy over AI resources.
– **Emergency Planning:** Simon Willison’s idea of using a USB stick with encoded knowledge represents a novel backup plan to utilize AI models for potential societal rebuilding, symbolizing the preservation of human knowledge.
– **Available Tools:** The mention of specific applications like Ollama, LM Studio, and LLM Farm illustrates practical tools for users interested in deploying local LLMs, expanding the infrastructure capabilities of personal computing.
– **Performance Notes:** The text notes that running an LLM on a device such as an iPhone 12 exposes real limitations — the small model goes off on bizarre tangents and hallucinates constantly — highlighting the gap between small local models and current frontier generative AI.
Overall, this content is relevant for professionals interested in AI, particularly in exploring how generative models can be run locally to enhance infrastructure resilience and maintain control over AI applications amid broader societal challenges. Potential security implications include the need to consider data privacy and ethical use when deploying AI in these novel ways.