
Sunday, September 27, 2009

High Performance Computing with RESTFul Excel in Financial Services

If you have ever developed front-office systems for financial services, you know that traders love MS Excel. There are good reasons to use Excel as the viewer: manipulating and visualizing data in a grid is both sound and logical. Still, computations beyond what one computer can handle are better offloaded to a grid computing platform, and that is what we have done for one of the largest banks in China.

Today, I'm not going to talk about the grid computing platform we've modernized for the bank; instead I will focus on version control for MS Excel. You might think that is easy because you can use SharePoint. You are absolutely right, and that is the technology we selected to manage the different versions of the Excel workbook. However, this is only part of the story. The other part, which is constantly overlooked, is managing the different versions of "the data" in the Excel workbook.

Version control of data in Excel is a problem where even Microsoft leaves the door open for others to contribute, because it requires domain knowledge of the field (in this case, financial services) to understand what the data being versioned actually means. In financial services, the quants build their pricing models into Excel spreadsheets. The pricing models are mathematical models that require model parameters and predefined inputs in order to calculate their outputs. Once the quants have validated the models, the Excel spreadsheet is handed off to the traders and the sales desks to perform their daily operations. The problem is that when the quants update the pricing models with new parameters and new inputs, the traders and the sales suddenly face a difficult situation: all the deals made in the past with the old models need to be ported to the new version of the Excel spreadsheet. This porting activity includes the inputs, the model parameters and the outputs of the model, PLUS all the information about the deal. They need a flexible way to use the new Excel spreadsheet with the data that lives in the old version. The solution we proposed and implemented uses RESTful services to facilitate the version control and migration of data in the Excel spreadsheet.
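To make the idea concrete, here is a minimal sketch of how such a versioned deal-data resource could be exposed over REST, written with plain JAX-RS annotations. It is illustrative only: the URI layout, the DealDataResource/DealDataStore names and the returned XML are assumptions for this post, not the actual services we built.

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

// Hypothetical resource: each deal's data is addressable by deal id and
// spreadsheet (model) version, so old and new versions coexist side by side.
@Path("/deals/{dealId}/data")
public class DealDataResource {

    private final DealDataStore store = new DealDataStore();

    // GET /deals/{dealId}/data/v/{version} returns the deal data as XML
    // in the schema that matches the requested spreadsheet version.
    @GET
    @Path("/v/{version}")
    @Produces(MediaType.APPLICATION_XML)
    public String getDealData(@PathParam("dealId") String dealId,
                              @PathParam("version") String version) {
        return store.loadAsXml(dealId, version);
    }

    // Placeholder store; a real implementation would query the data grid.
    static class DealDataStore {
        String loadAsXml(String dealId, String version) {
            return "<deal id=\"" + dealId + "\" schemaVersion=\"" + version + "\"/>";
        }
    }
}
```

Because the version is part of the address, a client built against an old spreadsheet can keep requesting the old representation while newer clients request the new one.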

RESTful services are great for system integration. As discussed in this presentation, they allow each system to upgrade at its own pace without disturbing the systems it depends on or the systems that depend on it, as long as the upgraded system keeps serving the older versions of its services. In our case, the data in the Excel spreadsheet is stored in GigaSpaces, which provides real-time risk hedging functionality based on market changes. We developed an XML schema for the data and represented the data in XML so that all client applications know exactly what is in the XML and how to use it (we built a library that uses XPath to pull the data we need from the XML and populate the Excel spreadsheet on the fly). If the new version of the Excel spreadsheet does not require new inputs and model parameters, the traders and the sales can benefit from it immediately. If it does require new inputs and parameters, the quants add new entries to the XML to describe the new data. When the traders and the sales open the new spreadsheet, the data from the old version of the Excel spreadsheet is filled in automatically. Once they have filled in the new data and submitted the new version of the data back to the data grid, everything is up and running again. This is just one of the use cases for RESTful Excel, but you can imagine other use cases that can make good use of this technology.
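Below is a rough sketch of the XPath-based extraction step described above, using the standard javax.xml.xpath API. The element names (deal, input, parameter) and the idea of returning a name/value map to be pushed into the spreadsheet are assumptions for illustration, not our actual schema or library.

```java
import java.io.StringReader;
import java.util.LinkedHashMap;
import java.util.Map;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class DealXmlReader {

    // Parse the versioned deal XML and collect named values that the
    // Excel add-in would then write into the corresponding cells.
    public static Map<String, String> extractValues(String dealXml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(dealXml)));

        XPath xpath = XPathFactory.newInstance().newXPath();
        // Hypothetical schema: each <input> and <parameter> carries a name and a value.
        NodeList nodes = (NodeList) xpath.evaluate(
                "/deal/*[self::input or self::parameter]", doc, XPathConstants.NODESET);

        Map<String, String> values = new LinkedHashMap<String, String>();
        for (int i = 0; i < nodes.getLength(); i++) {
            Node n = nodes.item(i);
            values.put(n.getAttributes().getNamedItem("name").getNodeValue(),
                       n.getTextContent());
        }
        return values;
    }
}
```

When a new spreadsheet version adds entries to the XML, the same extraction code simply picks up the extra names, which is what lets old deal data flow into the new workbook with no manual re-keying.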

Bear in mind that system integration is crucial in financial services; having a technology that facilitates system integration allows banks to adopt new technologies much faster, which can improve their systems' reliability and performance. That has a direct impact on everyone's life (assuming you also keep your money in a bank).

Friday, July 3, 2009

My Thought on Extreme Transaction Processing (XTP) in Financial Services

Today, I had an email discussion with a customer regarding an Extreme Transaction Processing (XTP) system for an Order Management System (OMS) using GigaSpaces XAP. I found the discussion interesting and would like to share it with you all. The discussion was about measuring system performance in XTP, which usually involves latency and throughput. Here are the details from the email:

Although latency is an important performance metric for an OMS, we also need to consider other aspects such as system throughput. Measuring latency and throughput independently, using different test cases, will not reflect the performance of the system in the real world, because a test case might optimize one system parameter (in this case latency) by trading off other important aspects of performance (such as throughput and the ACID properties). In optimization theory, the complexity of an optimization problem grows with the number of variables in the system. In real-world applications this complexity arises frequently, and GigaSpaces does a good job, if not the best, of solving it.

We believe that the single-threaded approach, which optimizes away object locking in order to shave off any possible latency, will actually hurt the throughput of the system, since it can only update ONE order at a time, limiting concurrency and utilization. Also, the single-threaded approach DOES NOT satisfy all the ACID properties, even in this simple test case, which affects the reliability of the system.

In fact, GigaSpaces can achieve the same latency using the single-threaded approach (i.e. a polling container with a single consumer in GigaSpaces). It can achieve even lower latency using an embedded space. In addition to the single-threaded approach, GigaSpaces provides standard transaction support. It has several implementations of Spring's PlatformTransactionManager, which allow users to take advantage of Spring's rich support for declarative transaction management (which is reliable and standard), instead of coding an in-house transaction manager that might be error-prone and complex.
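As a rough illustration of the declarative style referred to above, the sketch below marks an order-processing method as transactional using plain Spring annotations. The OrderService, the Order class and the processing logic are invented for this example; in a GigaSpaces deployment the configured transaction manager would be one of the GigaSpaces-provided PlatformTransactionManager implementations, and the method would typically be driven by a polling container rather than called directly.

```java
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical order entity used only for this sketch.
class Order {
    String id;
    String status;
}

@Service
public class OrderService {

    // The whole update runs in a single transaction: if anything throws,
    // the changes are rolled back by the configured PlatformTransactionManager,
    // not by hand-written rollback code.
    @Transactional
    public void process(Order order) {
        // ... business logic: validate, enrich, match, etc. ...
        order.status = "PROCESSED";
        // ... write the updated order back to the space / data grid ...
    }
}
```

The point is that transactional behavior is declared, not coded, so it stays consistent across every consumer that touches the order data.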

In reality, there will be more than one application using the XTP system. The common denominator for all these applications boils down to data consistency. It is very easy to achieve weak consistency, which is all that many XTP solutions provide. However, for some applications, strong consistency is a must. Therefore, we need to evaluate an enterprise XTP solution from a broader perspective: how much flexibility does it provide to achieve the desired performance, manageability and security? When the usage of an XTP solution goes beyond the basics and many different applications rely on it, these capabilities are not just "nice-to-have" but essential foundations for any enterprise application.

At the end of the day, what we would like to achieve in Extreme Transaction Processing is to keep latency low and throughput high while the processing (the business logic) is done transactionally (i.e. with ACID guarantees). Therefore, the performance of an XTP solution should be measured against all three properties as a whole in order to get a better idea of its capability.
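To show what measuring the properties together (rather than in separate, specialized test cases) might look like, here is a minimal, self-contained sketch that records per-order latency and overall throughput in the same run while calling a transactional processing step. The OrderProcessor interface and the workload are placeholders; a real benchmark would run against the actual XTP deployment.

```java
public class XtpBenchmark {

    // Placeholder for the transactional processing step under test.
    interface OrderProcessor {
        void process(long orderId);
    }

    public static void main(String[] args) {
        OrderProcessor processor = orderId -> { /* pretend transactional work */ };

        int orders = 100_000;
        long totalLatencyNanos = 0;
        long runStart = System.nanoTime();

        for (long id = 0; id < orders; id++) {
            long start = System.nanoTime();
            processor.process(id);               // one transactional unit of work
            totalLatencyNanos += System.nanoTime() - start;
        }

        long elapsedNanos = System.nanoTime() - runStart;
        double avgLatencyMicros = totalLatencyNanos / 1_000.0 / orders;
        double throughputPerSec = orders / (elapsedNanos / 1_000_000_000.0);

        // Both numbers come from the same run, so a latency "win" that costs
        // throughput (or vice versa) shows up immediately.
        System.out.printf("avg latency: %.2f us, throughput: %.0f orders/s%n",
                avgLatencyMicros, throughputPerSec);
    }
}
```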

Sunday, June 21, 2009

High Performance Computing in Financial Services (Part 2)

This installment is a follow-up to the previous installment on HPC in Financial Services. It discusses how to deal with a "legacy system" properly. In fact, it is an art rather than a science.

Legacy System in Brief

In the HPC project, the traders and sales have been using the legacy system for over 10 years. Note that even with all the benefits we mentioned in this post, they are not optimistic about the change, for the following reasons:
  • There is a lot of functionality built up over the years, and it is difficult to port it all to the platform we propose.
  • Behavior change is a big NO NO. They are used to working with Excel the way it works.
  • The algorithms they have built are tightly coupled with Excel (i.e. Microsoft), so it is risky to port them onto the cloud computing platform.
  • The legacy system requires other legacy systems to function properly. Changing the legacy system might affect other applications that depend on it.
To solve the above challenges, it is important to understand the big picture (i.e. the competitive advantage that we are offering directly to their business).

Approaches for Integration with Legacy System

First of all, let's go through the available approaches for migrating to the new system:

1. Redesigning the Legacy System from Scratch

This will not work, for many reasons. The main reason is that hardly any company has enough resources to redesign the legacy system from scratch. It is a very time-consuming process without any observable benefit to the bank. Also, the traders and sales will continue to develop new features on top of the legacy system in order to run their business. They cannot stand still while the new system fails to deliver quickly enough.

2. Replacing the components in the Legacy System Incrementally

This approach is better than the first one, but it has some shortcomings. First of all, every component that is going to be replaced in the legacy system may have other applications depending on it (i.e. on its behavior, its use cases, etc.). Changing it is not as easy as it might sound, because it may involve designs from other teams that simply cannot be changed in this period of time.

3. Delivering the Required functionality with Minimum Impact to the Legacy System

This is the approach we adopted in this project, and it works great. The idea is, and always has been, "Do not change anything (from the GUI to the user workflow) unless it is absolutely necessary". We bring the core functionality into the legacy system seamlessly, so that the traders and sales can benefit from the new features while continuing to work with their legacy system, without knowing that the mechanism underneath has changed. We deliver those new features in the shortest amount of time (time-to-market), improving their productivity after as little as two weeks. In order to succeed, we focus on efficiently extracting and isolating the part of the legacy system that the new feature requires and redesigning that part directly. To be able to build just that part, we isolate it with an anti-corruption layer that reaches into the legacy system, retrieves the information that the new feature needs, and converts it into the new model. In this way, the impact on the legacy system is minimized and the new features can be delivered in days rather than months. If a new feature malfunctions, they can always switch back to the legacy system and continue to do business without interruption. Once the feature is fixed, they can put it back in, and voilà.
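For the curious, here is a bare-bones sketch of the anti-corruption layer idea described above. The LegacyTradeRecord, PricingDeal and the field mapping are made up for illustration; the point is simply that the adapter is the only place that knows about the legacy representation, so the new feature works purely against the new model.

```java
import java.math.BigDecimal;

// What the legacy system hands us (made-up fields for illustration).
class LegacyTradeRecord {
    String tradeRef;
    String ccyPair;       // e.g. "USDCNY"
    double notionalAmt;
}

// The model the new feature is written against.
class PricingDeal {
    final String dealId;
    final String currencyPair;
    final BigDecimal notional;

    PricingDeal(String dealId, String currencyPair, BigDecimal notional) {
        this.dealId = dealId;
        this.currencyPair = currencyPair;
        this.notional = notional;
    }
}

// Anti-corruption layer: translates the legacy representation into the new
// model so the new feature never touches legacy structures directly.
class LegacyTradeAdapter {
    PricingDeal toPricingDeal(LegacyTradeRecord legacy) {
        return new PricingDeal(
                legacy.tradeRef,
                legacy.ccyPair,
                BigDecimal.valueOf(legacy.notionalAmt));
    }
}
```

If the new feature ever misbehaves, the adapter is simply bypassed and the legacy path keeps running untouched, which is exactly the rollback property described above.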