Literal Labs Raises $6.2M to Build Ultra-Efficient AI Chips Based on the Human Brain
June 20, 2025
by Fenoms Startup Research
Literal Labs, a UK-based semiconductor startup developing brain-inspired AI chips, has raised $6,223,938 in pre-seed funding to build ultra-efficient neuromorphic processors for the next generation of edge and autonomous systems. The round was backed by Northern Gritstone, Mercuri, Sure Valley Ventures, Cambridge Future Tech SPV, and several angel investors.
Founded by a team with deep experience at ARM, Graphcore, and top academic labs, Literal Labs is targeting the next frontier in AI hardware: drastically reducing power consumption by mimicking the way neurons fire in the human brain.
What Literal Labs Is Building
Literal Labs is designing neuromorphic processors - AI chips built not just to run deep learning models faster, but to process information more like the brain does. In practice, that means:
- Event-based computing rather than clock-based processing
- Spiking neural networks (SNNs) that activate only when needed
- Ultra-low-power operation at the edge with minimal battery draw
- Native support for temporal inference tasks (like vision, robotics, and autonomous decision-making)
These chips are not just about speed - they’re about efficiency. Literal Labs believes the future of AI won’t be won by brute-force compute, but by smarter, leaner silicon that works in real-world environments, not just data centers.
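To make “activate only when needed” concrete, here is a minimal, generic sketch of a leaky integrate-and-fire (LIF) neuron, the textbook building block of a spiking neural network. The code and its parameter values are illustrative assumptions, not Literal Labs’ actual hardware or software; the point is simply that downstream work is triggered only when the membrane potential crosses a threshold, so sparse inputs produce sparse computation.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a generic illustration of
# event-based, spiking computation. All values are made-up examples, not
# Literal Labs' design.
from dataclasses import dataclass

@dataclass
class LIFNeuron:
    threshold: float = 1.0   # membrane potential at which the neuron fires
    decay: float = 0.9       # leak factor applied each timestep
    potential: float = 0.0   # current membrane potential

    def step(self, input_current: float) -> bool:
        """Advance one timestep; return True only when a spike is emitted."""
        self.potential = self.potential * self.decay + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True
        return False

neuron = LIFNeuron()
inputs = [0.0, 0.0, 0.6, 0.0, 0.7, 0.0, 0.0]  # sparse input: mostly silence
spike_times = [t for t, current in enumerate(inputs) if neuron.step(current)]
print(spike_times)  # downstream work happens only at these spike times
```

In silicon, that same sparsity is what lets event-driven designs sit largely idle between spikes rather than spending a full clock cycle on every input - which is where the power savings described above come from.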
Why Neuromorphic Computing Matters
AI workloads are exploding - but so is energy use. Training and running modern models can burn massive amounts of power, and traditional GPUs are increasingly hitting their limits in cost, size, and sustainability.
That’s why neuromorphic computing is gaining ground:
- The global neuromorphic chip market is projected to grow from $74 million in 2023 to $550 million by 2030, a CAGR of roughly 33% (Allied Market Research) - see the quick check after this list
- Traditional AI chips (like GPUs and TPUs) require 10x–100x more energy than neuromorphic processors for certain sparse or edge tasks
- Companies like Intel (Loihi), IBM (TrueNorth), and SynSense are already pursuing neuromorphic silicon as a complement to cloud-based AI
- The edge AI hardware market is projected to reach $22.2 billion by 2027, driven by autonomous vehicles, robotics, smart sensors, and on-device inferencing (Grand View Research)
- Regulatory trends in Europe and Asia are pushing for energy-efficient AI in industrial IoT and mobile devices
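As a quick sanity check on that market projection (assuming the projection spans the seven years from 2023 to 2030 and taking the article’s own endpoints at face value), the implied compound annual growth rate works out to roughly 33%:

```python
# Implied CAGR for a market growing from $74M in 2023 to $550M in 2030 (7 years)
start, end, years = 74, 550, 7
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~33.2%
```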
Literal Labs is positioning itself at the intersection of these trends - offering AI compute that doesn’t compromise on power or portability.
Why This Round Stands Out
$6.2 million in pre-seed funding is a statement - especially in semiconductor deep tech, where investors often wait for tape-out before they commit.
But Literal Labs isn’t just pitching a better chip. They’re offering a better worldview on what intelligence at the edge should actually look like.
And that’s where the real founder insight kicks in: most startups chase the dominant curve. Literal Labs questioned the shape of the curve itself.
They didn’t ask how to shrink GPUs. They asked whether neural compute needs to be continuous at all, or whether spiking, event-based architectures could unlock orders of magnitude better efficiency by simply aligning with how biology already solved this problem.
Here’s the takeaway: If you’re building foundational tech, your edge won’t come from being slightly better. It’ll come from being structurally different - because that’s what breaks cost curves, energy ceilings, and architectural inertia.
Literal Labs didn’t just design new silicon. They changed the mental model of how compute should behave. And when your product forces a reframe, adoption becomes inevitable - not because you convince people, but because the old assumptions stop making sense.
Market Outlook: A New Era for AI Hardware Is Taking Shape
The exponential growth of AI has revealed a fundamental bottleneck: current compute architectures weren’t designed for intelligence - they were designed for throughput. As a result, power-hungry GPUs and server farms are quickly becoming unsustainable for both edge and planetary-scale workloads.
This is where neuromorphic and event-driven chips step in - not as successors to GPUs, but as a new class of compute entirely.
Key signals in the market:
- The neuromorphic computing market is projected to grow from $74 million in 2023 to over $550 million by 2030, with enterprise AI, autonomous systems, and defense driving demand (Allied Market Research).
- Traditional GPUs consume up to 10x more energy than neuromorphic architectures for sparse, event-based tasks like gesture recognition, audio processing, or autonomous sensor fusion (MIT Technology Review).
- Edge AI hardware - spanning drones, wearables, smart cameras, and medical diagnostic devices - is expected to hit $22.2 billion by 2027, with a CAGR of 19.8% (Grand View Research).
- By 2030, over 60% of AI inference is expected to occur on the edge, not in centralized data centers - creating massive demand for ultra-low-power, high-efficiency chips (McKinsey).
- Governments are now prioritizing chip energy profiles. The EU Chips Act, the US CHIPS and Science Act, and the UK’s Semiconductor Strategy all allocate R&D resources to emerging architectures that reduce carbon impact and supply-chain fragility.
With these forces converging, Literal Labs is positioned to define a new tier in the AI stack: not brute-force compute, not cloud-bound AI - but biologically aligned silicon that can live anywhere intelligence is needed.