Tag: clark
-
Slashdot: Spotify Publishes AI-Generated Songs From Dead Artists Without Permission
Source URL: https://entertainment.slashdot.org/story/25/07/21/2037254/spotify-publishes-ai-generated-songs-from-dead-artists-without-permission?utm_source=rss1.0mainlinkanon&utm_medium=feed
Feedly Summary: The incident involving Spotify’s unauthorized publication of AI-generated songs attributed to deceased artists raises significant concerns about content verification and security within music streaming platforms. This situation emphasizes the need for robust…
-
Simon Willison’s Weblog: Quoting Jack Clark
Source URL: https://simonwillison.net/2025/Jan/28/jack-clark-r1/#atom-everything
Feedly Summary: The most surprising part of DeepSeek-R1 is that it only takes ~800k samples of ‘good’ RL reasoning to convert other models into RL-reasoners. Now that DeepSeek-R1 is available people will be able to refine samples out of it to convert any other…
-
The Register: DeepSeek’s R1 curiously tells El Reg reader: ‘My guidelines are set by OpenAI’
Source URL: https://www.theregister.com/2025/01/27/deepseek_r1_identity/
Feedly Summary: Despite impressive benchmarks, the Chinese-made LLM is not without some interesting issues. DeepSeek’s open source reasoning-capable R1 LLM family boasts impressive benchmark scores – but its erratic responses raise more questions about how…
-
Simon Willison’s Weblog: Quoting Jack Clark
Source URL: https://simonwillison.net/2025/Jan/20/jack-clark/
Feedly Summary: [Microsoft] said it plans in 2025 “to invest approximately $80 billion to build out AI-enabled datacenters to train AI models and deploy AI and cloud-based applications around the world.” For comparison, the James Webb telescope cost $10bn, so Microsoft is spending eight…
-
Simon Willison’s Weblog: Quoting Jack Clark
Source URL: https://simonwillison.net/2024/Dec/23/jack-clark/#atom-everything
Feedly Summary: There’s been a lot of strange reporting recently about how ‘scaling is hitting a wall’ – in a very narrow sense this is true in that larger models were getting less score improvement on challenging benchmarks than their predecessors, but in a…
-
Simon Willison’s Weblog: Quoting Jack Clark
Source URL: https://simonwillison.net/2024/Nov/18/jack-clark/
Feedly Summary: The main innovation here is just using more data. Specifically, Qwen2.5 Coder is a continuation of an earlier Qwen 2.5 model. The original Qwen 2.5 model was trained on 18 trillion tokens spread across a variety of languages and tasks (e.g., writing,…