Datawizz Raises $12.5M Seed to Redefine How AI Models Are Built, Owned, and Deployed
October 16, 2025
by Fenoms Startup Research

Datawizz, a startup pioneering specialized AI model infrastructure, has raised $12.5 million in seed funding to make artificial intelligence more efficient, economical, and private.
The round was led by Human Capital, with participation from BGV, 91VC, and other investors. Founded by Iddo Gino - the entrepreneur behind API marketplace RapidAPI - Datawizz aims to solve one of AI’s biggest scaling problems: the unsustainable cost and inefficiency of running massive, one-size-fits-all language models.
The Problem: The Hidden Tax of AI Adoption
Large language models (LLMs) may have democratized access to AI, but they’ve also introduced a new form of operational debt. The average enterprise now spends 20–40% of its AI budget on model inference alone, with total global spending on LLM API calls surpassing $8.4 billion in the first half of 2025 (PitchBook, 2025).
Even well-funded startups are discovering that the economics of AI scale poorly. As workloads increase, unit costs often rise faster than revenue. Teams can’t easily optimize, because most models are too large and too generalized for specific business needs.
That’s where Datawizz steps in. Its platform enables developers to break away from monolithic LLMs and instead use Specialized Language Models (SLMs) - smaller, targeted models tuned for individual tasks.
The result, according to the company: up to 85% lower compute costs, up to 15× faster inference, and complete ownership of both models and data.
The Strategic Shift: From Bigger Models to Smarter Systems
The company’s insight is simple but transformative: the future of AI isn’t about size - it’s about precision.
Most AI startups compete on scale - bigger datasets, bigger models, bigger promises. But as Gino points out, “Bigger models don’t always mean better outcomes.” The true opportunity lies in orchestrating dozens of smaller, sharper models that specialize, collaborate, and learn from usage patterns.
This architecture not only cuts costs but makes AI deployment more predictable, explainable, and domain-aware.
Here’s where founders should pay attention - because this is the inflection point that separates sustainable AI businesses from short-lived hype cycles.
Many teams today still treat AI models as static assets: something to build, fine-tune, and ship. Datawizz flips that mindset. It treats AI as a dynamic supply chain - where every model, no matter how small, becomes a node that can be optimized, replaced, or redirected based on real-time performance and cost.
That’s the quiet revolution hiding in Datawizz’s product: it’s not building another “model.” It’s building the routing intelligence that determines which model runs, when, and for what.
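To make the idea of routing intelligence concrete, here is a minimal sketch of a cost-aware model router. This is an illustrative assumption, not Datawizz's actual product or API: every name here (`ModelSpec`, `REGISTRY`, `route`, the pricing and quality numbers) is invented. The router picks the cheapest registered model whose tracked quality on a task type clears a threshold, and falls back to a large generalist otherwise.

```python
from dataclasses import dataclass

@dataclass
class ModelSpec:
    name: str
    cost_per_1k_tokens: float  # illustrative pricing
    quality: dict              # task_type -> rolling quality score (0..1)

# Hypothetical registry: two small specialized models plus one large generalist.
REGISTRY = [
    ModelSpec("slm-sentiment", 0.02, {"sentiment": 0.94}),
    ModelSpec("slm-extract",   0.03, {"extraction": 0.91}),
    ModelSpec("llm-general",   0.60, {"sentiment": 0.95,
                                      "extraction": 0.93,
                                      "other": 0.90}),
]

def route(task_type: str, min_quality: float = 0.9) -> ModelSpec:
    """Pick the cheapest model whose tracked quality for this task
    clears the threshold; fall back to the generalist otherwise."""
    candidates = [m for m in REGISTRY
                  if m.quality.get(task_type, 0.0) >= min_quality]
    if not candidates:
        return REGISTRY[-1]  # generalist fallback for unknown tasks
    return min(candidates, key=lambda m: m.cost_per_1k_tokens)

print(route("sentiment").name)  # → slm-sentiment (cheapest qualifying model)
print(route("summarize").name)  # → llm-general (no specialist registered)
```

The design point is that the router, not any single model, encodes the business logic: swapping a specialist in or out is a registry change, not a product rewrite.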
In AI, control of orchestration beats control of generation.
You don’t win by having the largest model - you win by deciding which model to use at the right time. That’s how you transform a product into infrastructure.
Think about it this way: most AI companies build static capability (a chatbot, a generator, a summarizer). But as the market matures, those will commoditize fast. What doesn’t commoditize is decision logic - the ability to route data, tasks, and compute intelligently.
This is the layer where margins hide. It’s where defensibility lives. Because every query that passes through your system improves your orchestration layer, not just your output.
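The claim that every query improves the orchestration layer can also be sketched as a feedback loop. Again, all names here are hypothetical illustrations: after each routed call, the orchestrator updates a rolling per-model, per-task quality score (an exponential moving average), so future routing decisions reflect observed performance rather than static assumptions.

```python
from collections import defaultdict

class RouterStats:
    """Rolling per-(model, task) quality learned from live feedback.
    Uses an exponential moving average so recent queries dominate."""

    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha
        self.scores = defaultdict(lambda: 0.5)  # neutral prior for unseen pairs

    def record(self, model: str, task: str, success: bool) -> None:
        """Nudge the score toward 1.0 on success, 0.0 on failure."""
        key = (model, task)
        signal = 1.0 if success else 0.0
        self.scores[key] += self.alpha * (signal - self.scores[key])

    def quality(self, model: str, task: str) -> float:
        return self.scores[(model, task)]

stats = RouterStats()
for _ in range(20):  # simulate 20 consecutive successful calls
    stats.record("slm-sentiment", "sentiment", True)
print(round(stats.quality("slm-sentiment", "sentiment"), 2))  # → 0.94
```

This is the compounding asset the article describes: the scores live in the orchestration layer, so they persist even as individual models are replaced underneath it.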
Founders building in AI should ask themselves: are we trying to be the model, or the memory that decides which model runs?
The second path is slower to market but exponentially harder to replace - and that’s exactly why Datawizz’s strategy is so powerful.
The Team and Early Momentum
Led by Iddo Gino, who scaled RapidAPI to over 4 million developers before exiting, Datawizz brings deep experience in developer tools and distributed computing. His team, drawn from major AI labs and infrastructure startups, is now building a platform that gives enterprises the freedom to own and train models locally - with full compliance and privacy control.
The startup has already piloted its system with select enterprise clients, showing measurable cost savings and latency reductions.
According to Gino, the mission is clear: “We want to make AI modular, so companies don’t have to choose between capability and control. You should be able to train, deploy, and manage your own specialized models - just like running microservices.”
The Market Momentum: Efficiency Is the Next Frontier
The timing couldn’t be better. The global AI infrastructure market, currently valued at $49 billion (2025), is projected to reach $160 billion by 2030, driven by the need for optimization, not just expansion (Allied Market Research, 2025).
Analysts predict that by 2028, over 70% of AI inference will come from models under 5 billion parameters, signaling a clear pivot away from today’s oversized architectures (CB Insights, 2025).
At the same time, regulators are cracking down on data governance and model transparency, pushing companies toward solutions that emphasize traceability, control, and energy efficiency - exactly the space Datawizz occupies.
With major investors like Human Capital and BGV backing this shift, it’s clear that “lean AI” isn’t a niche trend - it’s the next infrastructure wave.
What the Funding Will Power
The $12.5M seed round will be used to:
- Expand engineering teams in San Francisco and Tel Aviv.
- Refine routing intelligence for multi-model orchestration.
- Build integrations for enterprise AI monitoring and observability.
- Strengthen on-prem and private cloud deployments for data-sensitive clients.
By 2026, Datawizz aims to become the de facto standard for multi-model AI orchestration, empowering developers and enterprises to build AI systems that are cheaper, faster, and fully under their control.
Why This Raise Matters
The funding signals more than investor confidence - it reflects a shift in how the AI ecosystem defines progress. The next generation of AI infrastructure won’t be measured by the size of a model, but by the intelligence of its architecture.
For founders, the lesson is timeless: building the biggest model makes you impressive - but building the smartest system makes you indispensable.