Citisoft Blog

Designing Flexibility in the Asset Management Operating Model

Written by Andy Earl | Mar 12, 2026

As the saying goes, “the only constant in life is change.” In asset management, that change can feel relentless: new products, new asset classes, new regulations, acquisitions, new data sources, and new technologies. Flexibility to accommodate this change is not accidental. It is instead the outcome of conscious design decisions.

When an operating model cannot adapt, growth slows. This is not because investment opportunity is lacking, but because operational capability cannot keep pace with strategic ambition.

In this context, the operating model – how people, processes and technology are organised to deliver value – must remain efficient and reliable while also being deliberately designed to absorb change.

What rigidity looks like and why it matters

Most firms recognise the symptoms of a rigid operating model.

    • Projects take longer and cost more than expected, delaying product launches or entry into new markets.
    • Initiatives fail to deliver expected benefits as compromises are made to work around constraints.
    • Business volumes cannot grow without a commensurate increase in cost, eroding the scalability of the business model.

These symptoms are often rooted in familiar challenges: legacy platforms that are difficult to integrate, diminishing expertise in ageing technology, infrastructure that does not scale, and data locked in silos with weak governance. Taken together, these issues create an environment where change is possible, but slow, expensive, and high-risk.

The four pillars of a flexible operating model

Flexibility emerges when firms make deliberate design decisions across four areas, balancing stability with the ability to transform. Together, these represent what should remain stable and what must be allowed to change.

  • A relatively stable, modular core backbone built from interoperable platforms
  • A data architecture that reduces system interdependencies and facilitates integrating new applications and data sources
  • Clear separation between standardised functions and differentiated capabilities
  • The ability to innovate quickly without introducing long-term fragility

1. Core backbone: Stable does not mean rigid

Every asset manager relies on a set of foundational capabilities – or table stakes – such as trade processing, corporate actions, and fund accounting. These functions underpin value-generating activities and provide the platform on which differentiated investment processes operate.

The priority of this core backbone is stability, but stability should never come at the expense of growth.

Standardisation is essential to achieve robustness, efficiency, and interoperability. However, this is not to say that the backbone should be rigid. A backbone that cannot adapt will eventually become a structural constraint, forcing firms into costly workarounds, delayed initiatives, or wholesale system replacement.

The objective is therefore a stable but modular core that can absorb change incrementally.

Applications within the backbone should be loosely coupled, with clearly defined interfaces (such as APIs), allowing individual components to be enhanced or replaced without disrupting the wider estate. This modularity is what allows stability and agility to coexist.
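As an illustration of what "loosely coupled, with clearly defined interfaces" can mean in practice, the sketch below uses Python's structural typing to separate a core backbone interface from its implementations. The service names and methods are hypothetical, not taken from any specific platform:

```python
from typing import Protocol

class FundAccountingService(Protocol):
    """Hypothetical interface for one core backbone capability."""
    def net_asset_value(self, fund_id: str) -> float: ...

class LegacyFundAccounting:
    def net_asset_value(self, fund_id: str) -> float:
        return 100.0  # stub: in reality, query the legacy platform

class VendorFundAccounting:
    def net_asset_value(self, fund_id: str) -> float:
        return 100.0  # stub: in reality, call the vendor's API

def publish_nav(service: FundAccountingService, fund_id: str) -> float:
    # Downstream code depends only on the interface, so either
    # implementation can be swapped in without disrupting callers.
    return service.net_asset_value(fund_id)
```

Because `publish_nav` depends only on the interface, a legacy component can be replaced by a vendor platform without changes rippling across the wider estate, which is the essence of a stable-but-modular backbone.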

Outsourcing can reinforce this flexibility: providers bring scale and technical capability that individual firms may struggle to sustain internally. It will come as no surprise that this pattern is evident in how operating models are structured across the industry. Data from our recent survey profiling the industry’s transformation agenda shows that 86% of solutions across Middle Office, Asset Servicing, Fund Accounting, and Settlements are outsourced or delivered via third-party platforms, reflecting their role as standardised, foundational capabilities within the core backbone.

Data management stands in contrast. With only 60% of solutions outsourced or managed via a third party, many firms have yet to establish the same level of stability and efficiency in this area. This gap is particularly pronounced among asset managers with over $1T in AUM, where fragmented data landscapes remain a key constraint on operating model flexibility.

Ultimately, the leadership challenge is not whether to standardise, but what should remain stable over the long term and how that stability is designed to accommodate change. Decisions around which capabilities belong to the core, and which should remain closer to the firm’s differentiated edge, are central to designing an operating model that can evolve without becoming brittle.

2. Data architecture: Where flexibility is won or lost 

A modern data architecture is one of the decisive enablers of operating model flexibility.

In practice, many constraints that appear to be application problems are actually data architecture limitations. Common symptoms include excessive point-to-point integrations, inconsistent definitions, and data that cannot be reused with confidence. A well-designed unified data architecture reduces these constraints by decoupling data sources from data consumers. Core sources such as security masters or IBORs are integrated into a common structure, from which downstream functions including performance, risk, and analytics can consume data without hard-wired dependencies.

This materially reduces fragility and makes it easier to implement new systems or onboard new data sources, as opposed to building point-to-point interfaces.
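The decoupling described above can be sketched in a few lines. This is illustrative only: the canonical fields, source schemas, and function names are hypothetical stand-ins for real feeds such as a security master and an IBOR:

```python
# A canonical data layer decouples sources from consumers.
canonical_positions = {}  # the shared structure consumers read from

def ingest_security_master(records):
    # Each source maps its own schema into the canonical one once...
    for r in records:
        canonical_positions.setdefault(r["isin"], {})["name"] = r["desc"]

def ingest_ibor(records):
    for r in records:
        canonical_positions.setdefault(r["isin"], {})["quantity"] = r["qty"]

def market_value(prices):
    # ...and consumers (performance, risk, analytics) read the canonical
    # layer, with no hard-wired dependency on any upstream system.
    return sum(pos["quantity"] * prices[isin]
               for isin, pos in canonical_positions.items())
```

Adding a new data source here means writing one new `ingest_*` mapping into the canonical structure, rather than a point-to-point interface to every downstream consumer.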

From our transformation survey, we see that nearly 90% of firms plan to overhaul their data-related functions within three years, signalling an appetite for more integrated, future-ready data operating models.

Equally important is governed access. When data is standardised, well defined, and readily available, teams spend less time reconciling information and more time using it to make decisions. This foundation is also critical for advanced analytics and AI, since automation and GenAI are only as effective as the data they rely on.

The above does not imply a single data repository. Many effective architectures are federated, combining multiple repositories and data marts, both structured and unstructured, into a coherent whole. What matters is strong governance, consistent definitions, standardised pipelines, and the ability to integrate new systems cleanly, both upstream and downstream.

3. Differentiation and the secret sauce

Change is most frequent in the areas that contribute to a firm’s differentiated edge, such as research, idea generation, portfolio modelling, and portfolio construction.

These are the areas where investment teams continuously refine their processes, launch new products, and expand into new strategies and asset classes. As a result, these areas commonly evolve faster than the rest of the operating model and are less amenable to heavy standardisation.

Larger asset managers in particular tend to rely on proprietary or specialised platforms in these areas. In some cases, maintaining separate platforms by team or strategy is a deliberate design choice to reduce contagion risk and allow change in one area without destabilising others. Our Transformation Survey reflects this reality: proprietary systems continue to play a pivotal role in portfolio management, particularly among firms with over $1T AUM, where 38% rely on internally developed platforms.

The challenge is not whether to allow this differentiation, but how to support it without introducing fragility. This is where the underlying data architecture is key.

It should allow specialised systems to be integrated with the core backbone, and enable data produced by these platforms to be shared where needed. Where multiple systems exist across investment teams, their outputs must still be brought together to support a coherent total portfolio view.

The following illustration shows how a deliberately designed operating model separates what must remain stable from what must be able to change, using data architecture as the connective tissue between a modular core and differentiated capabilities.

4. The pace of innovation, and avoiding future rigidity 

Certain demands necessitate a faster change response, and not all can wait for a perfect solution. In these situations, tactical or temporary solutions may be appropriate, but they carry risk, especially where the core is impacted.

That risk arises when speed becomes an excuse for permanence. Tactical solutions that are not designed with a clear path to integration quickly harden into critical components, increasing fragility, technical debt, and long-term cost. To avoid this, any temporary solution should be implemented with an explicit plan to transition it into the target architecture and integrate it with the data layer.

A similar approach could apply to end-user computing (EUC), which remains a source of innovation, particularly in the front office. For example, an investment analyst may build a lightweight Python tool to combine market data, positions, and risk metrics to support intraday decision-making. Rather than seeking to prevent this, firms should enable it by providing governed access to data through the core data architecture. Crucially, when an EUC-based solution proves valuable, there must be a defined process to industrialise it.
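The kind of lightweight tool described above might look like the sketch below. The data-access functions are hypothetical stand-ins for governed access to the firm’s core data architecture, with stubbed values in place of real feeds:

```python
def get_positions(portfolio: str) -> dict:
    return {"AAPL": 1_000, "MSFT": 500}    # stub: governed position feed

def get_prices(tickers) -> dict:
    return {"AAPL": 190.0, "MSFT": 410.0}  # stub: governed market data

def intraday_exposures(portfolio: str) -> dict:
    """Combine positions and prices into per-name market exposure."""
    positions = get_positions(portfolio)
    prices = get_prices(positions)
    return {t: qty * prices[t] for t, qty in positions.items()}
```

The value of governed access is visible even in this toy: because positions and prices come from the same well-defined source, the analyst’s tool can later be industrialised by swapping the stubs for production data services, without rewriting its logic.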

That is not to say that all innovation needs to be in-house. Software vendors and outsourcing providers are also investing heavily in new capabilities, and firms are increasingly partnering with them to adopt rather than build innovative solutions.

The key takeaway: encourage innovation and experimentation, but ensure that, when an experiment succeeds, there is a method for industrialising it and integrating it into the operating model.

Building a platform for change 

Flexible asset management operating models are not built through wholesale transformation, but through deliberate design choices applied consistently over time.

At the core is a stable, modular backbone that supports the value-generating parts of the business. This backbone must be robust and efficient, but also adaptable. A well-designed data architecture sits alongside it, reducing dependencies between systems and enabling firms to plug and play applications that support differentiated investment capabilities – the firm’s “secret sauce” – without introducing fragility.

In reality, most firms cannot redesign their operating model in one step. The leadership challenge is maintaining direction amid constant change. This requires:

  • clarity on what should remain stable 
  • discipline in how tactical initiatives are approved
  • the ability to ringfence investment that improves long-term flexibility

Defining a clear target operating model, informed by business strategy and the design principles outlined above, provides a reference point for decision-making. It ensures that incremental change consistently moves the organisation toward a more resilient and adaptable platform.

Ultimately, operating model flexibility is not accidental. It is the cumulative outcome of intentional decisions about stability, change, and the architecture that connects the two.