Tag: coding security

  • Simon Willison’s Weblog: Hallucinations in code are the least dangerous form of LLM mistakes

    Source URL: https://simonwillison.net/2025/Mar/2/hallucinations-in-code/#atom-everything

    Feedly Summary: A surprisingly common complaint I see from developers who have tried using LLMs for code is that they encountered a hallucination – usually the LLM inventing a method or even a full software library…
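    The summary names the two typical shapes of a code hallucination: an invented method and an invented library. As a hypothetical sketch (the library name below is made up for illustration and is not taken from the post), both mistakes surface the moment the code is actually run, which is the sense in which the post's title calls them the least dangerous kind of LLM error:

    ```python
    import importlib
    import json

    # Hallucinated library: the import fails immediately.
    # "super_markdown_magic" is a hypothetical name assumed not to exist.
    try:
        importlib.import_module("super_markdown_magic")
    except ModuleNotFoundError as e:
        print("Invented library caught at import time:", e)

    # Hallucinated method on a real module: json provides loads(), not parse().
    try:
        json.parse('{"key": "value"}')
    except AttributeError as e:
        print("Invented method caught at call time:", e)
    ```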