Tag: multilingual
-
Simon Willison’s Weblog: New Pleias 1.0 LLMs trained exclusively on openly licensed data
Source URL: https://simonwillison.net/2024/Dec/5/pleias-llms/#atom-everything Source: Simon Willison’s Weblog Title: New Pleias 1.0 LLMs trained exclusively on openly licensed data Feedly Summary: I wrote about the Common Corpus public domain dataset back in March. Now Pleias, the team behind Common Corpus, have released the first family of…
-
Simon Willison’s Weblog: QwQ: Reflect Deeply on the Boundaries of the Unknown
Source URL: https://simonwillison.net/2024/Nov/27/qwq/#atom-everything Source: Simon Willison’s Weblog Title: QwQ: Reflect Deeply on the Boundaries of the Unknown Feedly Summary: Brand new openly licensed model from Alibaba Cloud’s Qwen team, this time clearly inspired by OpenAI’s work on reasoning in o1. I love how they introduce the new…
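For readers who want to try QwQ themselves, here is a minimal sketch of loading it with Hugging Face transformers. The Hub id "Qwen/QwQ-32B-Preview" and the chat-template flow are assumptions based on the Qwen team's usual release pattern, not details taken from the post.

    # Sketch: run QwQ locally via transformers (model id is an assumption).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Qwen/QwQ-32B-Preview"  # assumed Hugging Face Hub id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    # Reasoning-style models emit long chains of thought, so leave plenty
    # of room in max_new_tokens.
    messages = [{"role": "user", "content": "How many r's are in 'strawberry'?"}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=512)
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))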
-
Slashdot: AI Lab PleIAs Releases Fully Open Dataset, as AMD, Ai2 Release Open AI Models
Source URL: https://news.slashdot.org/story/24/11/16/0326222/ai-lab-pleias-releases-fully-open-dataset-as-amd-ai2-release-open-ai-models?utm_source=rss1.0mainlinkanon&utm_medium=feed Source: Slashdot Title: AI Lab PleIAs Releases Fully Open Dataset, as AMD, Ai2 Release Open AI Models Feedly Summary: AI Summary and Description: Yes Summary: The text outlines PleIAs’ commitment to open training for large language models (LLMs) through the release of Common Corpus, highlighting the significance of open data for LLM…
-
Simon Willison’s Weblog: Releasing the largest multilingual open pretraining dataset
Source URL: https://simonwillison.net/2024/Nov/14/releasing-the-largest-multilingual-open-pretraining-dataset/#atom-everything Source: Simon Willison’s Weblog Title: Releasing the largest multilingual open pretraining dataset Feedly Summary: Common Corpus is a new “open and permissibly licensed text dataset, comprising over 2 trillion tokens (2,003,039,184,047 tokens)” released by French AI lab PleIAs. This appears to be the largest available…
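At 2 trillion tokens the corpus is far too large to download casually, so streaming mode is the natural way to inspect it. A minimal sketch, assuming the dataset is published on the Hugging Face Hub as "PleIAs/common_corpus" (the exact id and record fields are assumptions):

    # Sketch: stream a few Common Corpus records without a full download.
    from datasets import load_dataset

    # streaming=True iterates over the hosted files instead of fetching
    # the entire 2-trillion-token corpus up front.
    corpus = load_dataset("PleIAs/common_corpus", split="train", streaming=True)
    for i, record in enumerate(corpus):
        print(record)  # inspect the fields (text, licensing metadata, ...)
        if i >= 2:
            break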
-
Hacker News: DeepSeek v2.5 – open-source LLM comparable to GPT-4o, but 95% less expensive
Source URL: https://www.deepseek.com/ Source: Hacker News Title: DeepSeek v2.5 – open-source LLM comparable to GPT-4o, but 95% less expensive Feedly Summary: Comments AI Summary and Description: Yes Summary: The text discusses DeepSeek-V2.5, an open-source model that has achieved notable rankings against leading large models such as GPT-4 and LLaMA3-70B. Its specialization in areas like math,…
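DeepSeek serves its models behind an OpenAI-compatible API, which makes cost comparisons like the one above easy to test directly. A minimal sketch using the openai Python client; the base URL and the "deepseek-chat" model name are assumptions drawn from DeepSeek's public docs:

    # Sketch: call DeepSeek's OpenAI-compatible endpoint (details assumed).
    from openai import OpenAI

    client = OpenAI(
        api_key="YOUR_DEEPSEEK_API_KEY",      # hypothetical placeholder key
        base_url="https://api.deepseek.com",  # assumed compatible endpoint
    )
    response = client.chat.completions.create(
        model="deepseek-chat",  # assumed to route to the V2.5 release
        messages=[{"role": "user", "content": "Summarize the CAP theorem."}],
    )
    print(response.choices[0].message.content)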