The hidden cost of ignoring brokerage infrastructure


For years, the brokerage industry poured resources into where returns were most visible: the front end. Trading platforms became the competitive arena of choice — multi-asset interfaces, sleek dashboards, and mobile-first experiences attracted the capital and the marketing budgets. The back end, by contrast, was largely ignored. That neglect is now catching up with the industry.

A recent panel discussion brought together Muinmos CEO Remonda Kirketerp-Møller and Integral director of product Julian Elliott, moderated by Andrew Saks, to examine what this imbalance means for brokerages today — and what must change.

Muinmos recently explored this brokerage technology reckoning and why the back end can no longer be an afterthought.

Spanning topics from technology stack strategy through to real-time regulatory oversight, the conversation produced a consistent message: brokerages that treat infrastructure as a strategic asset, rather than an operational cost, will be the ones that endure.

The legacy of front-end investment

Kirketerp-Møller was candid about how the industry arrived at this point. Large institutional players such as Saxo Bank, Interactive Brokers, and CMC Markets invested early and deeply in proprietary trading infrastructure, securing a durable competitive advantage in the process. The wave of retail brokerages that followed took a markedly different path.

Kirketerp-Møller said, “Enormous amounts of money went into the front end. Multi-asset trading platforms became a core requirement, and that is where massive IT resources were invested. It unfortunately neglected a great deal of back-office operations — and that neglect is one of the reasons we see so many issues today with regulatory reporting and the volume of manual work that continues.”

Many of those retail brokerages entered the market from adjacent sectors — affiliate marketing among them — without the capital or the FinTech foundations to build from scratch. The white-label, off-the-shelf route was the natural choice. Vendors hosted everything; the broker retained little meaningful control over the technology or client data and had limited capacity to adapt when conditions shifted.

As regulatory demands have intensified and new asset classes have emerged, the fragmentation that approach creates has become a serious liability. Siloed operations, disparate back-end systems, and manual data consolidation for regulatory submissions represent inefficiency that is both operational and reputational.

Stop building what you are not built to build

One of the sharpest observations in the discussion concerned the persistent tendency of institutions to invest in bespoke infrastructure that sits outside their core competency.

Kirketerp-Møller said, “There is no question that institutions should stop building this type of infrastructure themselves. Expert technology companies exist precisely to do this and they do it at scale. Many institutions do not realise that the system they have built cannot simply be extracted and deployed elsewhere. It is not portable.”

The ramifications extend well beyond operational inconvenience. When an institution built on a proprietary bespoke system considers a sale or public listing, auditors will probe where the client base sits and where the IP resides. If the answer points to a third-party vendor’s server, or a system entangled with legacy architecture, the impact on valuation is material.

The alternative — a well-structured platform from a specialist software provider, deployed on the broker’s own infrastructure — delivers genuine IP ownership, full flexibility, and the ability to integrate any front end for any asset class, without the ongoing development burden or the time-to-market constraints of building from scratch. Ground-up development cycles frequently outlast the window between a trend emerging and regulators responding to it.

The expertise gap

A second structural challenge Kirketerp-Møller identified is the shortage of professionals who understand both domain and technology. Domain expertise alone was once sufficient. It is no longer.

AI agents, open APIs, and MCP servers are not peripheral concerns for specialist technology teams — they are now central to how brokerages interact with clients, manage data, and engage with regulators. Elliott noted that at a recent industry event in Dubai, the majority of brokerages in attendance had no familiarity with MCP servers, even as technology providers are already integrating that capability into their back-end architecture.

Kirketerp-Møller said, “Hiring domain experts who cannot engage with the technology will still leave significant gaps. Institutions need to hire people who understand and can leverage technological capabilities — not just people who understand the domain.”

The implication for brokerage leadership is direct: the technology strategy conversation cannot remain the exclusive territory of a CTO or development team. Business leaders must understand what is being built, what is being procured, and whether it will still serve the business 18 months from now.

New asset classes and the first-mover dilemma

The panel also examined the challenge posed by new asset classes — crypto perpetuals being the most current example — and the tension between first-mover risk and the danger of falling behind.

Elliott observed that broker hesitancy around crypto perpetuals is largely driven by uncertainty: insufficient revenue visibility, unclear regulatory treatment, and the difficulty of launching a new product on a technology stack not built for rapid adaptation. But the flip side is equally concerning.

Should regulators move to restrict or substantially alter the treatment of CFDs, crypto perpetuals are a credible alternative product. Critical mass is not yet present, but it could arrive within 12 to 18 months. Brokers who have not had the technology conversation by then will find themselves at a pronounced disadvantage.

The broader principle, as Saks observed, is what might be termed the Tesla effect: disruption originates outside an industry, arrives faster than incumbents anticipate, and raises the user experience bar for everyone. The crypto exchange experience — intuitive, API-first, rapid — is already reshaping what a new generation of traders expects. A brokerage anchored to a legacy-dependent platform will struggle to keep pace.

What real-time regulatory oversight will look like

Perhaps the most forward-looking element of the discussion concerned the trajectory of regulatory oversight itself.

Kirketerp-Møller described a near-term future in which regulators do not merely audit retrospectively but monitor in real time — scrutinising not just trading behaviour but code architecture. The question evolves from what a platform did to how it was built and whether it conforms to expected standards.

Kirketerp-Møller said, “If I am a regulator auditing institution A, I will expect certain behaviour and certain standards. I will expect the same in institution B. That creates a very strong incentive for institutions to use compliant, standardised platforms from expert technology providers.”

The implications for the build-versus-buy debate are significant. A bespoke system built to proprietary standards may not conform to emerging regulatory expectations of standardised, auditable behaviour. A platform built by a specialist RegTech provider, working directly with regulators to translate requirements into code-ready logic, occupies a fundamentally different position. Muinmos works on exactly this: embedding compliance into the technology stack from the outset, rather than layering it on retrospectively.

The connectivity imperative

A recurring theme throughout the panel was connectivity — and the fragmentation that its absence produces across every dimension of brokerage operations. Regulatory reporting, onboarding, KYC, AML, investor protection: all require data to flow coherently across systems.

AI agents, increasingly the vocabulary through which the industry discusses next-generation automation, cannot function effectively without coherent underlying communication infrastructure. Without it, fragmentation persists regardless of how sophisticated the individual tools are.

The efficiency gains from resolving this are substantial. Tasks that might occupy a compliance professional for an entire day — data extraction, analysis, reporting — can be delivered in seconds by a well-integrated system. These are not incremental improvements. They are competitive differentiators.

The brokerages that succeed will not necessarily be those with the most polished front ends. They will be those with the most coherent, connected, and compliant back ends — infrastructure capable of adapting as both market conditions and regulatory expectations continue to evolve.

The conversation that needs to happen now

The panel’s conclusion was practical. For brokerages evaluating their current stack, whether in-house-built or vendor-reliant, the strategic questions are clear: Does the current technology stack serve today’s needs and those of 18 months from now? Is the infrastructure genuinely portable, scalable, and capable of being valued as an asset? Are there people in the business who understand both the domain and the technology? Are back-end systems sufficiently connected to support AI integration, multi-asset reporting, and real-time regulatory compliance?

These are not technical questions. They belong in the boardroom and the chief executive’s office — not only in the CTO’s.


Copyright © 2026 RegTech Analyst
