Sunday, June 21, 2009

High Performance Computing in Financial Services (Part 2)

This installment is a follow-up to the previous post on HPC in Financial Services. It discusses how to deal with a "Legacy System" properly; in fact, doing so is more of an art than a science.

Legacy System in Brief

In the HPC project, traders and sales have been using the legacy system for over 10 years. Note that even with all the benefits mentioned earlier in this series, they are not optimistic about the change, for the following reasons:
  • There is a lot of functionality that has been built up over the years, and it is difficult to port it all to the platform we propose.
  • Behavior change is a big NO NO. They are used to working with Excel, exactly the way it works today.
  • The algorithms they have built are tightly coupled with Excel (i.e. Microsoft), and it is therefore risky to port them onto the cloud computing platform.
  • The legacy system requires other legacy systems to function properly, and changing it might affect other applications that depend on it.
To solve the above challenges, it is important to understand the big picture (i.e. the competitive advantage that we are offering directly to their business).

Approaches for Integration with Legacy System

First of all, let's go through the available approaches for migrating to the new system:

1. Redesigning the Legacy System from Scratch

This will not work for many reasons. The main one is that hardly any company has the resources to redesign the legacy system from scratch; it is a very time-consuming process without any observable benefit to the bank along the way. Also, traders and sales will continue to develop new features on top of the legacy system in order to run their business. They cannot stand still while the new system fails to deliver quickly enough.

2. Replacing the components in the Legacy System Incrementally

This approach is better than the first, but it has some shortcomings. For every component that is to be replaced in the legacy system, there may be other applications that depend on it (on its behavior, its use cases, etc.). Changing it is not as easy as it might sound, because it may touch designs owned by other teams that simply cannot be changed in this period of time.

3. Delivering the Required functionality with Minimum Impact to the Legacy System

This is the approach we adopted in this project, and it works great. The guiding principle is, and always has been, "do not change anything (from the GUI to the user workflow) unless it is absolutely necessary". We bring the core functionality into the legacy system seamlessly, so that traders and sales benefit from the new features while continuing to work with their legacy system, without noticing that the mechanism underneath has changed. We deliver those new features with the shortest possible time-to-market, improving their productivity after just two weeks.

To succeed, we focus on efficiently extracting and isolating the part of the legacy system that the new feature requires, and redesigning that part directly. To be able to build just that part, we isolate it behind an anti-corruption layer that reaches into the legacy system, retrieves the information the new feature needs, and converts it into the new model. In this way, the impact on the legacy system is minimized, and new features can be delivered in days rather than months. If a new feature ever malfunctions, the users can always switch back to the legacy path and continue to do business without interruption. Once the feature is fixed, it can be switched back on, and voila.
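To make the idea concrete, here is a minimal sketch of what such an anti-corruption layer can look like. It is written in Java purely for illustration: every name in it (LegacyPositionStore, PositionTranslator, PricingFacade, and so on) is hypothetical and not part of the actual system. The point is only to show how the translation into the new model and the "switch back to legacy" behavior fit together.

    import java.util.logging.Level;
    import java.util.logging.Logger;

    /** The slice of the legacy system that the new feature needs to read from. */
    interface LegacyPositionStore {
        // Returns a raw record exactly as the legacy (Excel-backed) system stores it.
        String[] rawPositionRecord(String tradeId);
    }

    /** The new model used by the redesigned feature. */
    class Position {
        final String tradeId;
        final double notional;
        final String currency;

        Position(String tradeId, double notional, String currency) {
            this.tradeId = tradeId;
            this.notional = notional;
            this.currency = currency;
        }
    }

    /**
     * Anti-corruption layer: reaches into the legacy store, translates its raw
     * records into the new model, and keeps the legacy representation from
     * leaking into the new code.
     */
    class PositionTranslator {
        private final LegacyPositionStore legacy;

        PositionTranslator(LegacyPositionStore legacy) {
            this.legacy = legacy;
        }

        Position toNewModel(String tradeId) {
            String[] raw = legacy.rawPositionRecord(tradeId);
            // The field layout below is an assumption made for the sketch.
            return new Position(tradeId, Double.parseDouble(raw[1]), raw[2]);
        }
    }

    /** Hypothetical pricing interfaces, only here to make the sketch self-contained. */
    interface NewPricer { double price(Position position); }
    interface LegacyPricer { double price(String tradeId); }

    /**
     * The "switch back" behavior: if the new path fails for any reason, fall back
     * to the legacy calculation so the business continues without interruption.
     */
    class PricingFacade {
        private static final Logger LOG = Logger.getLogger(PricingFacade.class.getName());

        private final PositionTranslator translator;
        private final NewPricer newPricer;        // redesigned, HPC-backed pricer
        private final LegacyPricer legacyPricer;  // existing Excel-based pricer
        private volatile boolean newPathEnabled = true;

        PricingFacade(PositionTranslator translator, NewPricer newPricer, LegacyPricer legacyPricer) {
            this.translator = translator;
            this.newPricer = newPricer;
            this.legacyPricer = legacyPricer;
        }

        double price(String tradeId) {
            if (newPathEnabled) {
                try {
                    return newPricer.price(translator.toNewModel(tradeId));
                } catch (RuntimeException e) {
                    LOG.log(Level.WARNING, "New pricer failed; switching back to the legacy path", e);
                    newPathEnabled = false;  // disabled until the new feature is fixed
                }
            }
            return legacyPricer.price(tradeId);
        }
    }

The key design choice is that nothing outside the translator ever sees the legacy representation, so the legacy system can keep evolving on its own while the new feature depends only on the small, well-defined model it actually needs.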
