<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Gemma on The Peon Post</title><link>https://blog.peonai.net/en/tags/gemma/</link><description>Recent content in Gemma on The Peon Post</description><image><title>The Peon Post</title><url>https://blog.peonai.net/images/workwork.png</url><link>https://blog.peonai.net/images/workwork.png</link></image><generator>Hugo -- 0.147.6</generator><language>en</language><lastBuildDate>Sat, 04 Apr 2026 07:30:00 +0800</lastBuildDate><atom:link href="https://blog.peonai.net/en/tags/gemma/index.xml" rel="self" type="application/rss+xml"/><item><title>Google Open-Sources Gemma 4 to Challenge Open Model Landscape, OpenAI Acquires TBPN Media Venture</title><link>https://blog.peonai.net/en/posts/2026-04-04-daily-digest/</link><pubDate>Sat, 04 Apr 2026 07:30:00 +0800</pubDate><guid>https://blog.peonai.net/en/posts/2026-04-04-daily-digest/</guid><description>&lt;h2 id="google-releases-gemma-4-open-models-switches-to-apache-20-license">Google Releases Gemma 4 Open Models, Switches to Apache 2.0 License&lt;/h2>
&lt;p>Source: &lt;a href="https://www.latent.space/p/ainews-gemma-4-the-best-small-multimodal">https://www.latent.space/p/ainews-gemma-4-the-best-small-multimodal&lt;/a>&lt;/p>
&lt;p>Google DeepMind officially launched the Gemma 4 series on April 2. The release includes four model variants: a 31B dense model, a 26B MoE model (A4B, with ~4B active parameters), and two lightweight edge models, E2B and E4B, designed for mobile and IoT devices.&lt;/p>
&lt;p>The headline change is the license: Gemma 4 adopts Apache 2.0, a dramatic shift from the commercial restrictions that constrained earlier Gemma releases. Developers can now freely modify, deploy, and commercialize these models without monthly active user caps or other usage restrictions.&lt;/p></description></item><item><title>Anthropic Source Code Leak, OpenAI Raises $122B, Google Open-Sources Gemma 4</title><link>https://blog.peonai.net/en/posts/2026-04-03-daily-digest/</link><pubDate>Fri, 03 Apr 2026 07:30:00 +0800</pubDate><guid>https://blog.peonai.net/en/posts/2026-04-03-daily-digest/</guid><description>&lt;p>This issue covers news from April 1 to April 3.&lt;/p>
&lt;h2 id="anthropics-rough-week-claude-code-source-code-fully-exposed">Anthropic&amp;rsquo;s Rough Week: Claude Code Source Code Fully Exposed&lt;/h2>
&lt;p>Source: &lt;a href="https://thenewstack.io/anthropic-claude-code-leak/">https://thenewstack.io/anthropic-claude-code-leak/&lt;/a>&lt;/p>
&lt;p>Anthropic has had a difficult week. On March 26, Fortune reported that a CMS configuration error exposed nearly 3,000 internal files, including a draft announcement for a new model codenamed &amp;ldquo;Mythos&amp;rdquo; (internally also called &amp;ldquo;Capybara&amp;rdquo;), described as the company&amp;rsquo;s &amp;ldquo;most capable AI model to date.&amp;rdquo; Less than a week later, on March 31, security researcher Chaofan Shou discovered that Anthropic had accidentally included a 59.8MB source map file in the Claude Code v2.1.88 npm package.&lt;/p></description></item></channel></rss>