Alveo Blog Data Management

Total cost of ownership for data management

New technologies can significantly reduce the TCO for financial data management platforms – what are the key aspects to consider when selecting one?

When exploring potential technology solutions for your financial data management challenges, it’s crucial to recognize that the vendor license fee is just one component of the total cost of ownership (TCO). To gain a comprehensive understanding of your data management TCO, it is important to consider the following factors when evaluating solution providers:

  • What is the underlying technology stack and is it best in class?
  • What is the disaster recovery process and how much does it cost?
  • What does a change management process look like and how many Business Analysts and IT personnel will I need?
  • Is the vendor's strategy to offer a self-serve model to the underlying business, or will I need specialized resources to help the business get what they need?
  • What are the distribution capabilities of the product?

The specifics of the technology stack matter both for a deployed model (on internal infrastructure or an external cloud – for example, AWS, GCP or Azure) and for a managed service, where the vendor runs the instance and operations on your behalf. The latter is typically wrapped up in additional service charges.

You may be asking yourself, why does it make any difference to me what the technology is if I use the vendor’s managed service?

A good question, but consider the situation where a vendor's underlying technology stack is based on a client-server architecture. In a monolithic deployment, the vendor must size the platform's resources for the point of maximum load or utilization – for example, loading back-office files at the start of the day – even though those resources sit underutilized for the rest of the day.

The customer ends up paying for capacity that is not needed 95% of the time. If you are managing the cloud deployment yourself, this means higher running costs; if the vendor is managing the platform, those higher running costs are passed on to you.
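To make the point concrete, here is a back-of-the-envelope comparison of a monolith sized for peak load against a deployment that scales down outside the peak window. All figures (instance sizes, hourly rate, peak duration) are hypothetical and purely illustrative:

```python
# Illustrative cost comparison: a monolith sized for peak load vs. an
# autoscaled deployment. All figures are hypothetical.

HOURS_PER_MONTH = 730

def monthly_cost(vcpus: int, hours: float, rate_per_vcpu_hour: float = 0.05) -> float:
    """Cost of running `vcpus` virtual CPUs for `hours` at a flat hourly rate."""
    return vcpus * hours * rate_per_vcpu_hour

# Monolith: provisioned for the peak (e.g. the morning back-office file load)
# and left running all month at that size.
monolith = monthly_cost(vcpus=32, hours=HOURS_PER_MONTH)

# Autoscaled: peak capacity needed ~5% of the time, a small baseline otherwise.
peak_hours = HOURS_PER_MONTH * 0.05
autoscaled = (monthly_cost(vcpus=32, hours=peak_hours)
              + monthly_cost(vcpus=4, hours=HOURS_PER_MONTH - peak_hours))

print(f"Monolith:   ${monolith:,.2f}/month")
print(f"Autoscaled: ${autoscaled:,.2f}/month")
print(f"Saving:     {100 * (1 - autoscaled / monolith):.0f}%")
```

Even with conservative assumptions, paying for peak capacity around the clock dominates the bill.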

Sadly, this is not the only additional cost you will incur.

Impact of architecture on hardware cost

With a client-server architecture you cannot simply add hardware resources to the specific services that need them. Instead, you must replace the current virtual server with a larger one, which can again significantly increase infrastructure costs.

A microservices architecture is a step forward in solving these challenges: it lets you scale the solution up at peak times and shut resources down when they are not needed. Because individual services scale independently, the vendor's overall cloud and compute costs are lower, which should make their solution more cost-effective.

Microservices as piecemeal solution components

Another advantage of a microservices architecture is that specific microservices from a vendor can be blended with in-house developed applications. This means institutions don't need to replace complete in-house built solutions with a vendor's monolithic client-server platform. Instead, they can enhance the functionality of an in-house application by using vendor components to solve a particular problem.

For example, suppose you need a component to acquire data. This may already be available from a vendor as a separate microservice, offered as a managed service with regular updates as data sources change their feeds. Equally, you can use a vendor's microservice components to address requirements including distribution, cross-referencing, quality reporting and consumption monitoring, and benefit from best-in-class capabilities without a wholesale overhaul of your existing data management infrastructure. This way, you can augment your data management setup piecemeal to address specific pain points.
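The blending described above can be sketched in a few lines. The vendor endpoint and its response shape are assumptions for illustration – here the hypothetical cross-referencing microservice is stubbed with a lookup table, while the in-house enrichment logic stays in-house:

```python
# Sketch of blending a (hypothetical) vendor cross-referencing microservice
# into an in-house pipeline. In production this would be an API call; here
# it is stubbed with a lookup table for illustration.

def vendor_cross_reference(identifier: str) -> dict:
    """Stand-in for a vendor microservice that maps one security
    identifier to its equivalents in other symbologies."""
    table = {
        "US0378331005": {"ticker": "AAPL", "figi": "BBG000B9XRY4"},
    }
    return table.get(identifier, {})

def enrich_position(position: dict) -> dict:
    """In-house logic stays in-house; only cross-referencing is delegated."""
    xref = vendor_cross_reference(position["isin"])
    return {**position, **xref}

print(enrich_position({"isin": "US0378331005", "quantity": 100}))
```

The in-house application keeps ownership of its data flow and swaps in the vendor component only for the problem it solves best.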

My point is that a stateless microservices architecture enables customers to use vendor-specific functionality within their own in-house-built applications, which is more cost-effective, lets them use best-in-class components and delivers faster time to market and ROI.

Operational resilience

Disaster recovery is important for any regulated financial services organization. However, this requirement can lead to very high cloud infrastructure or internal resource costs, as typical client-server architectures require a hot-hot or hot-cold standby.

Not ideal for those looking for a lower TCO. Microservices architectures allow customers to deploy a single instance; through well-defined DevOps deployments across multiple zones and cloud providers, institutions can achieve a fault-tolerant solution for less than the cost of the standby arrangement a client-server architecture requires.
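The fault-tolerance pattern is simple in principle: a stateless service can be reached through any healthy zone, so the client falls through to the next endpoint on failure. A minimal sketch, with the zone names and the simulated outage as illustrative assumptions:

```python
# Minimal sketch of zone-level failover for a stateless service: try each
# zone's endpoint in order and fall through on failure. Zone names and the
# simulated outage are illustrative assumptions.

def call_with_failover(endpoints):
    """Invoke the first endpoint that responds; fail over on error."""
    last_error = None
    for endpoint in endpoints:
        try:
            return endpoint()
        except ConnectionError as exc:
            last_error = exc  # zone unavailable; try the next one
    raise RuntimeError("all zones unavailable") from last_error

def zone_a() -> str:
    raise ConnectionError("zone A unreachable")  # simulate an outage

def zone_b() -> str:
    return "served from zone B"

print(call_with_failover([zone_a, zone_b]))
```

Because no single instance holds state, losing a zone costs a retry rather than a full standby environment kept warm around the clock.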

Another factor that is generally overlooked is the use of open-source technology, which is an important part of any TCO decision. Open-source technology – data storage in particular – can dramatically affect the TCO of a financial data management platform.

Using open-source data storage is significantly cheaper and can still offer the same performance. However, one factor that is often overlooked is whether the open-source technology is a tier-one supported service with your chosen cloud provider. Cloud providers such as GCP, AWS and Azure offer services to help clients run these technologies – for example, Amazon Keyspaces (for Apache Cassandra) or Amazon RDS for PostgreSQL.

These services can significantly reduce TCO as you will not need to hire internal staff to run them. Check that your data management vendor offers tier-two support for any open-source technology underlying their product.

Lowering the cost of change

Data management processes in financial organizations are always in flight, and you cannot fight the tide of change coming from the business. It is therefore important to consider that an organization's change processes affect not only the effectiveness of delivery to the business but also its costs.

In most instances the change process can be the largest component of TCO if you make the wrong vendor decision. To ensure you do not fall into this trap, ask any prospective data management solution provider the following questions:

  • What are the underlying technology and data storage methods, and how easy is it to update the schema in the underlying database?
    Where possible avoid a normalized data structure, as it adds significant overhead when changing the schema. Furthermore, it requires the vendor to be involved when extending custom relationships, and they will typically charge you for this.
  • Do you manage the data model centrally and can you extend the data model without asking the vendor?
    Not controlling your own data model could significantly impact your ability to deliver to the business in a timely manner.
  • Can the vendor extend the data model with new attributes without impacting your customizations?
    Typically you may need to add fields of your own, such as internal IDs, taxonomies or other fields used for downstream processing. It is important to select a vendor who can deliver maintenance updates without impacting customizations; otherwise you will incur a high testing overhead within your organization.
  • Does the vendor offer standard support for the integration of data sources and managed feeds whereby the delivery of regular updates is included out of the box?
    A managed service for data feeds can significantly reduce your TCO. Market data vendors regularly change their feeds, and keeping up can be a full-time job; putting that burden on the vendor reduces your ongoing TCO and ensures better accuracy, as the vendor specializes in this task.
  • Does the vendor offer an out of the box data model for normalization and golden copy creation?
    An out of the box model for golden copy creation can significantly reduce the implementation lead time and overall cost.
  • When extending your data model for the business, do data attribute changes automatically become available for distribution?
    When selecting a solution it is important to have the vendor show you how easily data model changes pass through to data distribution. In my view, changes should flow through automatically to technologies like Kafka, or be easy to extend through the user experience for bulk delivery files.
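One of the questions above – whether vendor updates can land without breaking customizations – comes down to keeping client-specific attributes in their own namespace, separate from the vendor's model. A minimal sketch of the idea, with all field names illustrative:

```python
# Sketch of keeping client-specific attributes separate from the vendor's
# data model, so a vendor upgrade never collides with local customizations.
# All field names are illustrative.

vendor_model_v1 = {"isin": str, "issuer": str}
vendor_model_v2 = {"isin": str, "issuer": str, "lei": str}  # vendor adds a field

client_extensions = {"internal_id": str, "desk_taxonomy": str}

def effective_model(vendor_model: dict, extensions: dict) -> dict:
    """Merge vendor model and client extensions, refusing any name clash."""
    overlap = vendor_model.keys() & extensions.keys()
    if overlap:
        raise ValueError(f"extension clashes with vendor model: {overlap}")
    return {**vendor_model, **extensions}

# The vendor upgrade lands without touching the customizations:
model = effective_model(vendor_model_v2, client_extensions)
print(sorted(model))
```

When extensions live in a separate namespace like this, a maintenance release only needs regression testing where names genuinely clash, not across every customized attribute.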

Aspects to include when selecting a data management solution

A trend we are seeing in buy- and sell-side organizations is the rising adoption of technologies such as Kafka to distribute market data. Adopting such technologies enables customers to focus on a self-serve model. Enabling consumers of data to request their universe of interest and attributes revolutionizes an organization's ability to adapt to the needs of its internal customers.

However, this concept, though revolutionary for most, creates challenges in some data management platforms. In my experience, you need to focus on these important questions:

  1. How seamless and timely is it for a data management platform to distribute datasets to new users and feeds?
  2. What is the timeliness of adding new attributes to a feed?

The answers to these questions may surprise you. It is important to understand that distribution – whether through technologies like Kafka or fixed-schema flat files – can be a significant proportion of the total cost of ownership, and, more importantly, that a self-serve platform can really transform your business users' use of financial data.
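The self-serve idea can be illustrated with a simple filter: each consumer registers a universe of interest and a set of attributes, and the distribution layer trims a golden-copy stream accordingly. In production this logic would sit in front of a Kafka topic; here plain Python stands in for the streaming layer, and the sample records are illustrative:

```python
# Illustration of self-serve distribution: consumers declare a universe of
# interest and the attributes they need, and the distribution layer filters
# the golden-copy stream accordingly. Sample data is illustrative.

golden_copy = [
    {"isin": "US0378331005", "price": 178.5, "rating": "AA+", "sector": "Tech"},
    {"isin": "DE0007164600", "price": 121.3, "rating": "A",   "sector": "Tech"},
    {"isin": "US4581401001", "price": 43.9,  "rating": "BBB", "sector": "Tech"},
]

def subscribe(records, universe, attributes):
    """Yield only the instruments and fields this consumer asked for."""
    for record in records:
        if record["isin"] in universe:
            yield {k: v for k, v in record.items()
                   if k == "isin" or k in attributes}

# A risk team asks for two instruments and two attributes – no IT ticket needed.
risk_feed = list(subscribe(golden_copy,
                           universe={"US0378331005", "DE0007164600"},
                           attributes={"price", "rating"}))
print(risk_feed)
```

The point is that adding a consumer, widening a universe or adding an attribute becomes a subscription change rather than a change project.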

When selecting a financial data management vendor it is important to get answers to these questions. These, in my view, are the hidden costs within any TCO, and you only find out about them once you are managing your platform. They affect you whether you are hosting the platform yourself or using the vendor's managed service.

In conclusion, choosing a financial data management platform involves more than assessing license fees. Understanding the total cost of ownership (TCO) requires scrutiny of factors like the technology stack, disaster recovery, open-source usage and change management processes. It is important to note the impact of the underlying technology on costs, particularly in a hosted scenario but also in a managed service, where the cost is simply passed on to you.

The benefits of a microservices architecture can be an overlooked aspect of the decision-making process, but they can save you significantly in the long run through lower running and testing costs, ensure better business user satisfaction and – simply put – help you get the most out of your data.