
Anthropic in Talks to Acquire Next-Gen AI Chips from UK Startup Fractile Amid Memory Crunch

Last updated: 2026-05-04 03:02:20 · Hardware

Anthropic, the high-profile AI company behind the Claude model, has entered early-stage discussions to acquire inference accelerators from London-based chip startup Fractile, according to sources familiar with the matter.

The move signals Anthropic's push to secure specialized hardware as the AI industry faces soaring prices and shortages of high-bandwidth memory (HBM), a critical component of conventional AI accelerators.

DRAM-Less Design Could Ease Memory Woes

Fractile's chips use an SRAM-based architecture that eliminates the need for expensive DRAM, a key differentiator during the current memory crunch.

“By removing DRAM, Fractile dramatically cuts both cost and power consumption for AI inference, which is exactly what the market needs right now,” said Dr. Amelia Reeves, a semiconductor analyst at TechInsights.

Anthropic has not commented on the talks, but a person close to the negotiations confirmed that “the discussions are exploratory but serious, focusing on next-generation inference deployments.”

Background

Fractile, founded in 2020, has developed a processor that relies on SRAM for on-chip memory, bypassing the DRAM modules typically used in AI accelerators.

Source: www.tomshardware.com

DRAM prices have surged over 40% in the past year due to supply constraints and booming demand from AI datacenters, creating a bottleneck for companies like Anthropic that need to run large language models at scale.

The startup's architecture also reduces the number of memory-to-chip transfers, slashing latency and energy usage by up to 70% compared to conventional designs, according to independent benchmarks.


Anthropic already works with major cloud providers but is increasingly looking to own its hardware stack to control costs and performance, industry watchers note.

What This Means

If a deal goes through, Anthropic would gain early access to a chip that could lower inference costs significantly, potentially giving it a competitive edge over rivals like OpenAI and Google.

“The AI arms race isn't just about model size anymore; it's about inference economics,” said Mark Chen, a venture partner at Sequoia Capital. “Fractile's approach could cut the total cost of ownership for AI inference by half.”

In the short term, a deal would also insulate Anthropic from volatile DRAM markets, though scaling Fractile's technology to mass production remains a challenge.

In the long term, it could reshape how AI companies design their compute infrastructure, shifting away from memory-hungry GPU clusters toward more efficient, memory-light architectures.

This story is developing. More details are expected in the coming weeks.