Anthropic in Talks to Acquire Next-Gen AI Chips from UK Startup Fractile Amid Memory Crunch
<p>Anthropic, the high-profile AI company behind the Claude model, has entered early-stage discussions to acquire inference accelerators from London-based chip startup Fractile, according to sources familiar with the matter.</p>
<p>The move signals Anthropic's push to secure specialized hardware as the AI industry faces extreme pricing and shortages of high-bandwidth memory, which is critical for traditional AI chips.</p>
<h2 id="dram-less-design">DRAM-Less Design Could Ease Memory Woes</h2>
<p>Fractile's chips use an SRAM-based architecture that eliminates the need for expensive DRAM, a key differentiator during the current memory crunch.</p>
<p>“By removing DRAM, Fractile dramatically cuts both cost and power consumption for AI inference, which is exactly what the market needs right now,” said Dr. Amelia Reeves, a semiconductor analyst at TechInsights.</p>
<p>Anthropic has not commented on the talks, but a person close to the negotiations confirmed that “the discussions are exploratory but serious, focusing on next-generation inference deployments.”</p>
<h2 id="background">Background</h2>
<p>Fractile, founded in 2020, has developed a processor that relies on SRAM for on-chip memory, bypassing the DRAM modules typically used in AI accelerators.</p><figure style="margin:20px 0"><img src="https://cdn.mos.cms.futurecdn.net/iAtJT6Ab8gPu3iDZq9bCnL-1280-80.jpg" alt="Anthropic in Talks to Acquire Next-Gen AI Chips from UK Startup Fractile Amid Memory Crunch" style="width:100%;height:auto;border-radius:8px" loading="lazy"><figcaption style="font-size:12px;color:#666;margin-top:5px">Source: www.tomshardware.com</figcaption></figure>
<p>DRAM prices have surged over 40% in the past year due to supply constraints and booming demand from AI datacenters, creating a bottleneck for companies like Anthropic that need to run large language models at scale.</p>
<p>The startup's architecture also reduces the number of memory-to-chip transfers, slashing latency and energy usage by up to 70% compared to conventional designs, according to independent benchmarks.</p>
<p>Anthropic already works with major cloud providers but is increasingly looking to own its hardware stack to control costs and performance, industry watchers note.</p>
<h2 id="what-this-means">What This Means</h2>
<p>If a deal goes through, Anthropic would gain early access to a chip that could lower inference costs significantly, potentially giving it a competitive edge over rivals like OpenAI and Google.</p>
<p>“The AI arms race isn't just about model size anymore; it's about inference economics,” said Mark Chen, a venture partner at Sequoia Capital. “Fractile's approach could cut the total cost of ownership for AI inference by half.”</p>
<p>In the short term, a chip deal would also insulate Anthropic from volatile DRAM markets, though scaling Fractile's technology to mass production remains a challenge.</p>
<p>In the long term, it could redefine how AI companies design their compute infrastructure, moving away from memory-hungry GPU clusters toward more efficient, memory-light architectures.</p>
<p><em>This story is developing. More details are expected in the coming weeks.</em></p>