Borrowing from Wikipedia, Computing is usually defined as the activity of using and improving computer technology, both hardware and software. It is clear that both are important to the development of Computing, but the pace of their development is significantly different. Due to the physical limits of the technology, over the past decade the chip industry has shifted its focus from building a single, ever more powerful processor to putting many processor cores on one chip. While multi-core processors were being developed, a new hardware paradigm emerged in Computing: the GPU. Although the concept is similar (both are based on a multi-core architecture), the underlying hardware is fundamentally different, and it is this difference that allows them to serve different needs: some applications run better on the CPU and some run better on the GPU. To make this new hardware useful, the industry is now looking for better software to take advantage of the processing power it offers.
Currently, many applications fail to take advantage of the multi-core architecture because doing so requires developers to learn a new set of skills. Programming on the multi-core architecture is challenging for many developers because it requires them to think in "parallel" and to handle many pitfalls inherent to parallel programming that simply do not exist in serial programming (a small example of such a pitfall is sketched below). Not to mention that developing applications for a multi-core CPU is different from developing for a multi-core GPU, so taking full advantage of what today's hardware offers requires roughly twice the effort. Many universities have recognized the gap between the development of computer hardware and computer software and have decided to teach the skills required for programming parallel processors, and a new textbook has been published recently to support this movement. Personally, I think this is a good move and I would love to see more universities engage in it. However, I would argue that it is not going to help the general public much unless the movement focuses on solving the real pain point in the Computing world.
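To make the "pitfalls" point concrete, here is a minimal sketch of a data race, a class of bug that serial code never has to worry about. The thread count and iteration count are made up for illustration:

```cpp
#include <iostream>
#include <thread>
#include <vector>

int main() {
    long counter = 0;              // shared, unsynchronized state
    std::vector<std::thread> workers;

    // Four threads each increment the counter a million times.
    for (int t = 0; t < 4; ++t) {
        workers.emplace_back([&counter] {
            for (int i = 0; i < 1'000'000; ++i) {
                ++counter;         // data race: read-modify-write is not atomic
            }
        });
    }
    for (auto& w : workers) w.join();

    // One would expect 4,000,000, but racing increments lose updates,
    // so the printed value is typically smaller and varies from run to run.
    std::cout << counter << "\n";

    // The serial version of this loop is trivially correct; the parallel
    // fix is to use std::atomic<long> or guard the counter with a mutex.
    return 0;
}
```

Bugs like this are nondeterministic and often invisible in testing, which is exactly why "thinking in parallel" is a skill that has to be learned.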
The real pain point is that developers should spend most of their development time delivering business value rather than wrestling with the underlying hardware. Developers are looking for a way to build software for the multi-core architecture without a steep learning curve on the hardware itself. In fact, software engineers experienced a similar problem in the past and found a solution for it. The solution is well known in the Engineering domain and it is called "Virtualization". Virtualization has been used in many places. For example, by virtualizing the operating system, a physical machine can be sliced into multiple virtual machines, which allows several applications to be deployed on the same physical machine. I think "Virtualization" can also be applied to the multi-core architecture so that multi-core applications can be developed with "less" pain. It is also a logical and economical way to make use of the available resources. With a group of experts specializing in multi-core virtualization, a large population of developers can make use of the multi-core architecture to build highly responsive applications.
One solution available today that I know of approaches the concept of multi-core virtualization: Ct Technology. Code can be written once using the development platform provided by Ct Technology and then run in parallel on any of the processors the platform supports; currently that includes Cell BE, GPUs, and CPUs. I'm hoping to see more of this kind of technology in the future because I believe it revolutionizes the computing industry. It abstracts the complexity of the hardware away from the application code, freeing developers to implement the applications that matter to the user. The sketch below illustrates the style of programming this enables.
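To give a feel for this write-once, data-parallel style, here is a minimal sketch. It is not Ct's actual API; I'm using the C++17 parallel algorithms as a stand-in. The point is that the code states what to compute over the whole array and leaves the how (threads, vector units) to the runtime:

```cpp
#include <algorithm>
#include <execution>
#include <iostream>
#include <vector>

int main() {
    std::vector<float> in(1 << 20, 1.5f), out(in.size());

    // The element-wise computation is expressed over the whole container.
    // The std::execution::par_unseq policy tells the runtime it may
    // parallelize and vectorize; the code itself says nothing about
    // cores, threads, or which processor ends up running it.
    std::transform(std::execution::par_unseq,
                   in.begin(), in.end(), out.begin(),
                   [](float x) { return x * x + 1.0f; });

    std::cout << out.front() << "\n";  // prints 3.25
    return 0;
}
```

The standard execution policies only target the CPU, whereas Ct also aims at GPUs and Cell BE, but the programming model is the same: describe the data-parallel operation once and let the platform map it onto whatever hardware is available.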
As blogged in DevCentral, virtualization is on its way to becoming the most disruptive infrastructure technology. Especially once Cloud Computing takes off, virtualization will become even more important than one can imagine. Therefore, I'm positive that one day we will have something called "Computing as a Service" in the Cloud Computing stack, and everyone will be able to access it without knowing what hardware executes the actual instructions (be it Cell, GPU, or molecules).
References: