Sunday, June 21, 2009

High Performance Computing in Financial Services (Part 2)

This installment is a follow-up to the previous post on HPC in Financial Services. It discusses how to deal with a "Legacy System" properly; in fact, doing so is an art rather than a science.

Legacy System in Brief

In the HPC project, traders and sales have been using the legacy system for over 10 years. Note that even with all the benefits we mentioned in the previous post, they are not optimistic about the change, for the following reasons:
  • A lot of functionality has been built up over the years, and it is difficult to port it all to the platform we propose.
  • Behavioral change is a big NO NO. They are used to working with Excel, the way it works.
  • The algorithms they have built are tightly coupled with Excel (i.e. Microsoft), so it is risky to port them onto the cloud computing platform.
  • The legacy system requires other legacy systems to function properly. Changing the legacy system might affect other applications that depend on it.
To solve the above challenges, it is important to understand the big picture (i.e. the competitive advantage that we are offering directly to their business).

Approaches for Integration with Legacy System

First of all, let's go through the available approaches for migrating to the new system:

1. Redesigning the Legacy System from Scratch

This will not work, for many reasons. The main reason is that neither company will have enough resources to redesign the legacy system from scratch. It is a very time-consuming process without any observable benefit to the bank. Also, traders and sales will continue to develop new features on top of the legacy system in order to run their business; they cannot stand still waiting for a new system that cannot deliver quickly enough.

2. Replacing the Components of the Legacy System Incrementally

This approach is better than the first, but it has some shortcomings. For every component of the legacy system that is going to be replaced, there may be other applications that depend on it (i.e. its behavior, its use cases, etc.). Changing it is not as easy as it might sound, because it might involve designs from other teams that simply cannot be changed in this period of time.

3. Delivering the Required Functionality with Minimum Impact to the Legacy System

This is the approach we adopted in this project, and it works great. The guiding idea is, and always has been, "do not change anything (from the GUI to the user workflow) unless it is absolutely necessary". We bring the core functionality into the legacy system seamlessly, so that traders and sales can benefit from the new features while continuing to work with their legacy system, without knowing that the underlying mechanism has changed. We deliver those new features in the shortest amount of time (time-to-market), improving their productivity after just two weeks.

To succeed, we focus on efficiently extracting and isolating the part of the legacy system that is required by the new feature, and redesigning that part directly. To be able to build just that part, we isolate it with an anti-corruption layer that reaches into the legacy system, retrieves the information that the new feature needs, and converts it into the new model. In this way, the impact on the legacy system is minimized, and new features can be delivered in days rather than months. If a new feature malfunctions, traders and sales can always switch back to the legacy system and continue to do business without interruption. Once the feature is fixed, it can be put back in, and voila.
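
To make the anti-corruption layer concrete, here is a minimal Python sketch. The legacy row layout, the field names (DealRef, Notional(k), Maturity), and the unit conventions are all hypothetical, invented for illustration; the real layer sat between the Excel-based legacy system and the new pricing components:

```python
from dataclasses import dataclass
from datetime import date, datetime


@dataclass
class Trade:
    """The clean model used by the redesigned feature."""
    trade_id: str
    notional: float
    maturity: date


class LegacyTradeAdapter:
    """Anti-corruption layer: reaches into the legacy representation
    (here, a raw dict as it might come out of an Excel extract) and
    converts it into the new model, so legacy quirks never leak into
    the new code."""

    def to_trade(self, legacy_row: dict) -> Trade:
        # Hypothetical legacy quirks: maturity stored as DD/MM/YYYY text,
        # notional stored in thousands. The adapter normalizes both.
        return Trade(
            trade_id=str(legacy_row["DealRef"]).strip(),
            notional=float(legacy_row["Notional(k)"]) * 1_000,
            maturity=datetime.strptime(legacy_row["Maturity"], "%d/%m/%Y").date(),
        )


if __name__ == "__main__":
    adapter = LegacyTradeAdapter()
    row = {"DealRef": " FX-001 ", "Notional(k)": "250", "Maturity": "21/06/2019"}
    print(adapter.to_trade(row))
```

The point of the pattern is that only the adapter knows about the legacy layout; everything downstream works with the clean model, which is what makes switching back to the legacy path safe.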

High Performance Computing in Financial Services (Part 1)

For the past 8 months, I have been leading a team of three delivering a High Performance Computing (HPC) professional service for one of the largest banks in China. I think it is a good idea to summarize what I have learned along the way, so that others can make use of it if they find themselves in similar situations. It will be nice to hear comments from others as well.

A Brief Description of the HPC Professional Service

The goal of the HPC professional service is to enable the bank, which uses Excel for pricing structured products, to:
  • improve the manageability of their IT assets
  • scale their use of Excel to prevent bottlenecks
  • parallelize Monte Carlo pricing computation (see the sketch after this list)
  • automate Excel batch revaluation
all of the above on top of a highly available and fault-tolerant platform.
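
To illustrate the parallel Monte Carlo idea from the list above, here is a minimal, self-contained Python sketch that spreads the paths of a European call pricing over worker processes. The contract and market parameters are made up for illustration, and the real platform distributed batches over a cloud grid rather than local processes:

```python
import math
import random
from multiprocessing import Pool

# Hypothetical contract/market parameters, for illustration only.
S0, K, R, SIGMA, T = 100.0, 105.0, 0.03, 0.2, 1.0


def price_batch(args):
    """Simulate one batch of paths and return the summed call payoffs."""
    n_paths, seed = args
    rng = random.Random(seed)  # independent seed per worker
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        # Terminal price under geometric Brownian motion.
        s_t = S0 * math.exp((R - 0.5 * SIGMA ** 2) * T + SIGMA * math.sqrt(T) * z)
        payoff_sum += max(s_t - K, 0.0)
    return payoff_sum


if __name__ == "__main__":
    n_workers, paths_per_worker = 4, 250_000
    with Pool(n_workers) as pool:
        sums = pool.map(price_batch,
                        [(paths_per_worker, seed) for seed in range(n_workers)])
    # Discounted average payoff across all batches.
    price = math.exp(-R * T) * sum(sums) / (n_workers * paths_per_worker)
    print(f"Monte Carlo price: {price:.4f}")
```

Monte Carlo pricing parallelizes almost perfectly because paths are independent; the only coordination is combining the batch sums at the end, which is why it is such a natural fit for a compute grid.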

Traditionally, traders and sales run their pricing models in Excel on standalone workstations. While this environment worked well in the past, it can no longer cope with the growing demand from their customers, since the computation power is limited to a single workstation. In fact, growing customer demand increases the computation power required for revaluation in risk management exponentially.

Additionally, with computation power limited to a single workstation, new algorithms that require more computation take longer to run. This reduces the number of deals one can make in a day and also hurts the team's productivity.

While Excel provides an ideal GUI for traders and sales, I argue that it is not meant for heavy computation, as it was definitely not designed for it. Keeping the computation inside Excel will not scale in the long run. Also, the data in Excel cannot easily be shared with other applications, which makes the cross-application integration that the financial industry often requires difficult.

Last but not least, the manageability of the Excel spreadsheets is an important aspect of this project. When every trader and salesperson has his/her own copy of a spreadsheet, version control becomes a nightmare. Each person can make changes to his/her own copy, which poses a serious challenge for management.

The HPC professional service that we offer addresses the above challenges by distributing and parallelizing the computation using cloud computing technologies, and by XML-izing the data to decouple it from the Excel GUI. Additionally, we take version control to the next level by providing version control for both the data and the Excel spreadsheet, so that when the Excel spreadsheet changes, the data can be imported into the new spreadsheet without any manual procedure.
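
As a rough illustration of what decoupling the data from the Excel GUI looks like, here is a small Python sketch using XML serialization. The schema (trade/notional/maturity) is hypothetical, not the actual format we used; the point is that once the data lives outside the spreadsheet, it can be version-controlled separately and re-imported into a new spreadsheet version automatically:

```python
import xml.etree.ElementTree as ET


def trades_to_xml(trades):
    """Serialize trade records to XML, independent of any spreadsheet layout."""
    root = ET.Element("trades")
    for t in trades:
        node = ET.SubElement(root, "trade", id=t["id"])
        ET.SubElement(node, "notional").text = str(t["notional"])
        ET.SubElement(node, "maturity").text = t["maturity"]
    return ET.tostring(root, encoding="unicode")


def xml_to_trades(xml_text):
    """Read the records back; a new spreadsheet version imports from here."""
    root = ET.fromstring(xml_text)
    return [
        {
            "id": node.get("id"),
            "notional": float(node.findtext("notional")),
            "maturity": node.findtext("maturity"),
        }
        for node in root.findall("trade")
    ]


if __name__ == "__main__":
    xml_text = trades_to_xml(
        [{"id": "FX-001", "notional": 250000.0, "maturity": "2019-06-21"}]
    )
    print(xml_to_trades(xml_text))
```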

My Responsibility in Brief

I assumed the following responsibilities along the timeline of the project:
I started as a technical sales engineer, demonstrating the feasibility of the solution and performing the proof of concept mentioned above. I was actively involved in negotiating the project agreement (manpower, functional deliverables, delivery schedule, training, etc.).

After the agreement was settled, I designed the software based on user requirements and started to lead a team of three to implement the required functionality. Toward the end of the project, my project-management role grew, in order to secure the expected delivery schedule, manage user expectations, and assure solution quality.

Each of the roles I played in this project offers interesting lessons, and I will talk about them individually in the next installments.

~Salute~