Sunday, May 11, 2008

Parallel Computing: The Case for Universality

Bill McColl on Domain-Specific Parallel Programming

Bill McColl has just posted a well-argued, albeit unconvincing, article on his blog making the case that domain-specific parallel programming tools and languages will have the biggest impact on parallel computing in the short term. McColl writes:

The central unresolved question in this area is whether a single general-purpose parallel programming language can emerge that will achieve the kind of universality that languages such as C, C++ and Java have achieved in sequential computing. My own view, based on more than 20 years of research in this area, is that this is very unlikely, certainly in the short to medium term. Instead I expect that we will see a number of powerful new domain-specific parallel languages emerging that quickly capture a large user base by being high-level and easy-to-use, with all of the complex implementation aspects of mapping, scheduling, load balancing and fault tolerance being handled automatically.
It is hard to tell whether McColl is making a prediction about the near-term direction of parallel computing, based on his experience in the field and his familiarity with various ongoing research projects, or whether this is the direction he is personally promoting. I suspect it is both, since McColl's company, Parallel Machines, is in the business of developing domain-specific programming tools for parallel computers. Although I agree that the industry seems to be moving in that direction (and I wish Bill McColl the best of luck in his business venture), I disagree that this is where it should spend its research money. Indeed, I am convinced that it would be a colossal mistake, one whose end result will be yet another huge legacy of soon-to-be-obsolete applications and tools.

The Need for Universality

Bill McColl is mistaken, in my opinion. If you have spent the last twenty years researching only domain-specific parallel programming tools, as opposed to a universal computing model, you can see only one side of the picture. You are therefore in no position to decide, or to advise others, that universality is not the way to proceed, even in the short term. Domain-specific tools are designed to treat the symptoms of the malady, not to cure its cause once and for all. As I argued in my Nightmare on Core Street series, universality should be the primary objective of multicore research, for the simple reason that once you have achieved universality, you know that you have solved the problem. Universality must not be limited to programming tools, however; it should be the direct and natural outcome of a universal computing model.

The old sequential approach to computing is obviously not universal; otherwise the industry would not be in the mess it is in. It is not as if people in the business have failed to notice the high costs of non-universality: the transition from sequential computing to massive parallelism is turning out to be a very costly nightmare. Should the industry embark on another costly adventure in non-universality? Of course not. It would be the ultimate exercise in foolishness. As they say, a scalded cat fears even cold water. At least, it should.

The COSA Model Is Universal Now

I have my biases and Bill McColl has his, but I can honestly claim that I understand both sides of this debate (universality vs. domain-specificity) as well as or better than anyone else in the business, and I am not saying this to boast. My primary area of interest is artificial intelligence, and I see the need for a universal parallel computing model to support the massively parallel artificial brains of the future. I have spent the better part of the last two decades researching a universal computing model called the COSA software model. Certainly, it will require a radical change in the way we currently build and program our computers, but it is not rocket science. A COSA-compatible multicore processor can be made pin- and signal-compatible with existing motherboards. Given the right resources, it could be designed and implemented in as little as two years using current fabrication technology.

The proposed COSA development environment offers many advantages over domain-specific tools besides universality: the model is inherently deterministic and implicitly parallel, and the programming environment is graphical. Determinism is the icing on the parallel cake because it leads to secure, rock-solid applications that do not fail. In addition, it makes it possible to implement plug-compatible components effectively, an essential requirement for drag-and-drop programming and massive code reusability. COSA will usher in the age of programming for the masses and the era of the custom operating system: drag'm and drop'm. In fact, I believe that COSA programming will be so easy that rewriting existing applications for the COSA environment will be a breeze.
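To make the determinism claim concrete, here is a minimal sketch, in Python, of the kind of synchronous, signal-driven execution a network of COSA cells performs. COSA itself is graphical, not text-based, so the Cell class, the connect() helper and the names below are illustrative assumptions only. The key idea is the two-buffer cycle: every cell that receives a signal during cycle N is buffered and processed together in cycle N+1, so behavior can never depend on thread timing.

    # A minimal sketch of a synchronous, signal-driven execution model.
    # This is NOT actual COSA code: the Cell class, connect() and the
    # two-buffer cycle are illustrative assumptions, meant only to show
    # why a lockstep signal model is deterministic.

    class Cell:
        """A cell reacts to an incoming signal and may signal other cells."""
        def __init__(self, name, action):
            self.name = name
            self.action = action          # action(cell) -> cells to signal
            self.targets = []

        def connect(self, *cells):
            self.targets.extend(cells)    # wire this cell to its targets
            return self

    def run(initial, cycles):
        """Process signals in lockstep: every cell signaled during cycle N
        is buffered and processed in cycle N+1, so the outcome never
        depends on thread timing or scheduling order."""
        current = list(initial)
        for n in range(cycles):
            nxt = []
            for cell in current:          # cells in the same cycle are independent
                print(f"cycle {n}: {cell.name} fires")
                nxt.extend(cell.action(cell))
            current = nxt                 # swap buffers for the next cycle

    # Hypothetical example: one sensor signals two effectors in parallel.
    fan    = Cell("start_fan",   lambda c: [])
    alarm  = Cell("sound_alarm", lambda c: [])
    sensor = Cell("overheat_sensor", lambda c: c.targets).connect(fan, alarm)

    run([sensor], cycles=2)

Running the sketch fires the sensor in cycle 0 and both effectors together in cycle 1; processing the two effectors in either order changes nothing, and that timing-independence is precisely the property that makes plug-compatible, drag-and-drop components possible.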

See also:

Nightmare on Core Street
Parallel Programming, Math and the Curse of the Algorithm
Why Parallel Programming Is So Hard
Parallel Computing: Why the Future Is Non-Algorithmic
