Tuesday, February 23, 2010

Computer Scientists Created the Parallel Programming Crisis

It Pays to Be Incompetent

The computer industry is spending large sums of money to come up with a solution to the parallel programming crisis but the very people who created the crisis are the direct beneficiaries. About two years ago, the major players in the computer industry (Intel, Microsoft, AMD, Nvidia, etc.) poured tens of millions of dollars into the coffers of major university research centers. Has there been any progress in finding a solution since? Answer: No. Is there any indication that progress will be made in the near future? Answer: No. Now, everyone knows that two years is a long time in the computer business. A lot can and should happen in two years. The amazing thing about parallel programming research is that computer scientists have been working on the problem for the last thirty years! They are just as clueless now as to what the solution might be as they were when they first started working on it! What is wrong with this picture?

Seismic Paradigm Shift Ahead

Computer scientists are not ashamed to admit that they have no idea what the solution to the crisis might be. Why should they be? Long-term failure has never stopped the money from flowing in. In fact, research money for parallel computing has markedly increased in the last few years because the industry is beginning to panic. Nobody messes with Moore's law and walks away to brag about it. In a way, it makes sense that the industry should approach academia for help. I mean, who else but academia is qualified to find a solution to this pressing problem? But how much longer can the crisis continue? Can the industry fund unsuccessful research indefinitely? Can we continue to live forever with hideous monstrosities like heterogeneous processors and multithreading? I don't think so. Sooner or later, something will have to give. The captains of the industry will eventually realize that they are pouring money into a black hole, and many will wise up. A seismic upheaval in the way computer science is conducted will ensue. Many scientists who are now placed on a pedestal will see their work discredited. The computer science community may think it is immune to hard times, but the market is known to be rather cruel when profits are in the balance.

Wrong From the Start

If you asked me who is most to blame for the current crisis, I would tell you without hesitation that it is the mathematicians. All the major participants who helped shape the history of computing, people like Charles Babbage, Lady Ada Lovelace, Alan Turing and John von Neumann, were mathematicians. Their vision of the computer is that of a machine built for the computation of mathematical functions or algorithms. Even today, after years of failure to solve the parallel programming problem, mathematicians are still clamoring for the use of functional programming as the solution. Everything looks like a nail when all you have is a hammer. The only reason that the functional programming community has not succeeded in pushing FP into the mainstream is that reality keeps kicking them in the ass. The truth is that functional programming languages are counterintuitive, hard to learn and a pain to work with.

Behaving Machines, Communication and Timing

Mathematicians notwithstanding, a computer is a class of automaton known as a behaving machine. As such, it must not be seen as a function calculator that takes input arguments and returns an answer. It should be seen as belonging to the category of machines that includes brains and neural networks. The proper paradigm for describing the working of such machines is not mathematics but psychology. We should be using terms like stimuli, responses, sensors, effectors, signals, and environment when describing computers. This way, an operation becomes an effect performed by an effector on the environment (data) and a comparison operator becomes a sensor. Once the computer is seen in its proper light, it immediately becomes clear that, like a nervous system, a computer program is really a communication network that senses and effects changes in its environment. Nothing more and nothing less. And, as with all communication systems, the deterministic timing of sensed events (stimuli) and operations (responses) becomes critical. Most of the problems that plague the computer industry (e.g., unreliability and low productivity) stem from the inability to precisely determine the temporal order (concurrency or sequentiality) of all events in the machine. Temporal determinism is a must. This, then, is the biggest problem with the Turing computing model: timing is not an inherent part of the model.
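To make the sensor/effector picture concrete, here is a minimal, hypothetical sketch in Python. All the names and the two-phase clock are my illustrative assumptions, not an established model or API: every sensor samples the environment in one phase, and every triggered effector fires in the next, so the temporal order of all stimuli and responses is fully determined on every cycle.

```python
# Hypothetical sketch of a "behaving machine": a network of sensors
# and effectors driven by a deterministic two-phase clock.

class Sensor:
    """Detects a condition in the environment and signals its targets."""
    def __init__(self, condition):
        self.condition = condition
        self.targets = []          # effectors to signal when triggered

    def sense(self, env):
        # Return the effectors to fire, or nothing if the stimulus is absent.
        return list(self.targets) if self.condition(env) else []

class Effector:
    """Performs an operation (an effect) on the environment."""
    def __init__(self, action):
        self.action = action

    def effect(self, env):
        self.action(env)

def tick(env, sensors):
    # Phase 1: all sensors sample the same environment snapshot, so
    # stimuli are detected concurrently and deterministically.
    fired = []
    for s in sensors:
        fired.extend(s.sense(env))
    # Phase 2: all triggered effectors respond before the next cycle,
    # fixing the temporal order of every operation.
    for e in fired:
        e.effect(env)

# Example: an effector that increments a counter while a sensor
# detects that the count is below three.
env = {"count": 0}
inc = Effector(lambda e: e.update(count=e["count"] + 1))
below_three = Sensor(lambda e: e["count"] < 3)
below_three.targets.append(inc)

for _ in range(10):
    tick(env, [below_three])

print(env["count"])  # the effector stops firing once the sensor goes silent
```

Note that there is no function being "called" to produce a return value: behavior emerges from signals flowing between sensors and effectors, and the outcome is identical on every run because the clock fixes the order of events.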

How to Solve the Crisis

We must reinvent the computer. We must turn it from a mathematician's wet dream into a psychologist's wet dream, from a glorified function calculator into a universal behaving machine. And we must do so at the fundamental level. It is time to stop feeding money into the academic black hole for the reason that the academics have failed. It is time to stop encouraging mediocrity among the baby boomer generation who created this mess. It is time for the boomers to gracefully retire and allow a new generation of thinkers to have their turn at the wheel. Industry leaders should simply say to the Turing Machine worshipers, “thank you very much, ladies and gentlemen, for your great service; here's a nice retirement check for all your troubles.” Then they should forget the whole mess and forge a bold new future for computing. And then the next computer revolution will make the first one pale in comparison.

See Also:

How to Construct 100% Bug-Free Software
How to Solve the Parallel Programming Crisis
Parallel Computing: The End of the Turing Madness
Why Parallel Programming Is So Hard
Parallel Computing: Why the Future Is Non-Algorithmic
Half a Century of Crappy Computing
Why Software Is Bad and What We Can Do to Fix It
