Chapter 3: What are you going to build?

Simon Woodhead

20th January 2026

Every once in a while I get to write a business plan. Maybe I’m sad, but I love it – imagining the future and our part in it gives me the same kind of satisfaction I expect a painter or writer gets. Yes, considerably sadder, I know. I still have the plan written in 1995, which talks about the Internet and Mobile Phones – how both would catch on and converge, and the creation of eSMS: our Chapter 1. Chapter 2 was about voice, how the Internet would displace the PSTN, and the convergence of voice and data; it has been a longer, slower chapter than we thought but, again, has otherwise panned out as expected. And that brings us to Chapter 3, which I wrote this weekend.

Chapter 3 is about AI, and you might find that odd given we launched our first Conversational AI agent in April 2025, having been experimenting with related things for 18 months before that – those who were there will recall the demos we took to ITW 2023. ‘The Potato’ is not new, and we’ve talked about it for at least as long, but the origin story is different. Those were exploratory and evolutionary, like the many other things we innovate around, even if the Potato was something of a Eureka moment in itself.

Writing a business plan is another level though. For that, something has to be revolutionary, not a mere evolution: something so big I’d otherwise be starting a new business around it, or putting it at the core of Simwood going forward. Not just the tail wagging the dog, but the tail and dog swapping places altogether. The sample size is small, but I think my track record on these calls is pretty good – some might say I should buy an award. This isn’t self-praise, by the way; I don’t go in for that. But I want to really make the point that this is something I do rarely, and it is a big deal for me when I do.

When we first announced the Potato, others laughed. Yet, years later, one of our more glacial competitors has announced that they are exploring something that sounds remarkably similar, without of course admitting as much. Meanwhile, the dino-carriers dismissed our AI agents and WhatsApp integration, yet we’re told by mutual customers that they’ve “got one coming”. I doubt we’ll ever see anything functional from either, but I take heart that the conversation has moved on: they now recognise they have to at least say they’re following.

The really big change, though, is internal – the conversations I’ve had with our amazing team and what Charles and co have done with the v2 beta. If you haven’t looked or played with it yet, be sure to do so, because it is game-changing. The goal was feature parity with the heavily funded startups, but we kind of overshot. Moreover, Charles and I have been workshopping ways we can go so much further, not just from a front-end feature perspective but in what the underlying infrastructure looks like. That’s where the lightbulbs started going off for me. It is very clear that the existing players are heavily dependent on public cloud, and their economics reflect a combination of healthy margins for AWS and VC funding at ridiculous valuations. Their set-up is not truly scalable, which shows in heavy per-account concurrency limits or hideous concurrency charges, and the compromises of their technology choices are very obvious. There are so many ways we can better what they’ve done, both technically and commercially – yes, a better product with better economics. We’ve been here before, twice as it happens.

The market opportunity for AI agents and conversational AI is enormous. It is expected to exceed USD 200bn by 2034, growing roughly 2.5x faster than CPaaS/UCaaS, with 93% of business leaders expecting early adopters of AI agents to gain a competitive edge. Voice AI specifically attracted USD 2.1bn in venture capital in 2024 alone (7x growth from 2022). Gartner predicts that by 2035, agentic AI will drive USD 450bn+ in enterprise software revenue, representing 30-40% of the total market. If this market were as saturated as UCaaS/CPaaS/CCaaS is today, it’d still represent a compelling opportunity for Simwood and our customers, but it isn’t saturated at all, despite appearances.

There’s a book called “Blue Ocean Strategy” which is one of those books you barely need to read the back cover of to get the message: oceans are red with blood where competitors are fighting over the same customers, so swim to where it is blue. On the face of it, the Conversational AI space looks like a red ocean, with massive operators holding hundreds of millions in funding and huge infrastructures (cough). How could a real business possibly compete?

The first thing you need to realise is that these guys do not have the customers; you do. They want them, but every policy and commercial term is geared to take them directly, not to work with or through you. I tried having sensible conversations with ElevenLabs, who were too arrogant to even respond, and the weirdos at Vapi ghosted me at the prospect of remotely real-world commercials. I’m so glad for both now. I genuinely believe every single SME everywhere will have a) WhatsApp enablement and b) voice agent(s). From a plumber taking diary appointments while busy under a sink, to a builder sending quotes while on a tea break, to a bank taking credit card applications, every business has a place for them. Despite Rachel Reeves’ best efforts, I’m told there are still 5.7m private sector businesses in the UK alone, and apparently 360m worldwide, and guess what: they’re already your customers. Every one of your customers will consume these services over the coming years and, furthermore, every single one of your competitors’ customers will too. Disagree with me if you like, but the costs of being wrong are pretty heavy yet the opportunity is huge – the risk/reward ratio is very favourable. Imagine a simple message-handling agent, nothing clever, and there’s a £20-a-month product there for every UK sole trader (3.1m of them, apparently) – you do the maths. We haven’t even got into call centres and enterprise, an enormous sector in its own right, albeit one sold something of a pup by providers so far. We’ve had some huge successes amongst our customer base in previous Chapters (read: some of the most notable exits in the sector in recent years were Simwood customers from startup), and I’m convinced the next one will be of this ilk.
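If you do want that maths spelled out, here is a quick back-of-envelope sketch using only the figures quoted above; the take-up percentage at the end is a purely hypothetical illustration, not a forecast:

```python
# Back-of-envelope maths from the figures quoted above (illustrative only).
sole_traders = 3_100_000        # UK sole traders, as cited above
price_per_month_gbp = 20        # simple message-handling agent at £20/month

monthly = sole_traders * price_per_month_gbp
annual = monthly * 12
print(f"Full UK sole-trader opportunity: £{monthly:,}/month (£{annual:,}/year)")

# A 5% take-up is a hypothetical figure for illustration, not a forecast.
print(f"At 5% take-up: £{int(monthly * 0.05):,}/month")
```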

The second thing you need to appreciate is that this technology is new, and there are some very clever people doing amazing things at huge companies like xAI and Google. How could the likes of Simwood compete with them, and shouldn’t you just deal directly? There are so many moving parts to anything AI, and every single one of them is an optimisation opportunity. Sure, some bundle the entire stack, but go and try them – they’re crap and expensive. In our world that mistake is like thinking BT are the PSTN, so you can only deal with them directly and the likes of Simwood add no value; the market proves otherwise. There are some utterly awesome open-source projects out there doing a better job than the majors, for those who know how to work with them.

Then there’s infrastructure, or the lack thereof. To truly lead in this space as it evolves I think you need your own GPUs, and you need them close to customers to minimise latency. That’s not only expensive, but requires a network and know-how of the kind Simwood has accrued over 30 years. Sure, those players can use AWS, but then they have economics as unworkable as ElevenLabs’, or they can spunk some of that VC money all over infrastructure they’ve no experience managing and end up with a broken model that way instead. It turns out, and this was another lightbulb for me, that the economics and pressure points of running GPUs are not all that different to those of running a TDM network – something we know well and learned the hard way.

Oh, and I haven’t even mentioned telephony, which is kind of important in the real world – your customers want phone calls, not to ask their own customers to chat in a web browser. That brings a CPaaS tax of up to 2c/minute and funds the underlying dino-carriers, unless you use someone who has the whole stack. Someone like Simwood, perhaps.

So here’s our plan: AI and all things Potato are going to become more and more central here at Simwood. We’re going to continue to bring more of our AI solution (which currently consumes certain third-party APIs for some of the many moving parts) on-net, and we’re going to end up in a place where it is all on-net, running on our own GPUs, for maximum control, efficiency and security/privacy for your customers. Along the way, while you’ll see new features evolve, behind the scenes we’ll be optimising continuously to give us collectively the best stack possible, at the best economics. That puts the best technology solution in the hands of the end-users, which is what it has always been about for me.

Still want to resell lines and minutes from someone who doesn’t get it, or are we going to go and put a dent in the universe together?
