Saturday, March 20, 2010

Computer Scientists Created the Parallel Programming Crisis (repost)

[I seem to be in a permanent computer science bashing mode. I am reposting this article because it reflects my current mood.]

It Pays to Be Incompetent

The computer industry is spending large sums of money to come up with a solution to the parallel programming crisis but the very people who created the crisis are the direct beneficiaries. About two years ago, the major players in the computer industry (Intel, Microsoft, AMD, Nvidia, etc.) poured tens of millions of dollars into the coffers of major university research centers. Has there been any progress in finding a solution since? Answer: No. Is there any indication that progress will be made in the near future? Answer: No. Now, everyone knows that two years is a long time in the computer business. A lot can and should happen in two years. The amazing thing about parallel programming research is that computer scientists have been working on the problem for the last thirty years! They are just as clueless now as to what the solution might be as they were when they first started working on it! What is wrong with this picture?

Seismic Paradigm Shift Ahead

Computer scientists are not ashamed to admit that they have no idea what the solution to the crisis might be. Why should they be? Long term failure has never stopped the money from flowing in. In fact, research money on parallel computing has markedly increased in the last few years because the industry is beginning to panic. Nobody messes with Moore's law and walks away to brag about it. In a way, it makes sense that the industry should approach academia for help. I mean, who else but academia is qualified to find a solution to this pressing problem? But how much longer can the crisis continue? Can the industry fund unsuccessful research indefinitely? Can we continue to live forever with hideous monstrosities like heterogeneous processors and multithreading? I don't think so. Sooner or later, something will have to give. The captains of the industry will eventually realize that they are pouring money into a black hole and many will wise up. A seismic upheaval in the way computer science is conducted will ensue. Many scientists who are now placed on a pedestal will see their work discredited. The computer science community may think they are immune to hard times but the market is known to be rather cruel when profits are in the balance.

Wrong From the Start

If you asked me who is most to blame for the current crisis, I would tell you without hesitation that it is the mathematicians. All the major participants who helped to shape the history of computing, people like Charles Babbage, Lady Ada Lovelace, Alan Turing and John von Neumann, were mathematicians. Their vision of the computer is that of a machine built for the computation of mathematical functions or algorithms. Even today, after years of failure to solve the parallel programming problem, mathematicians are still clamoring for the use of functional programming as the solution. Everything looks like a nail when all you have is a hammer. The only reason that the functional programming community has not succeeded in pushing FP into the mainstream is that reality keeps kicking them in the ass. The truth is that functional programming languages are counter-intuitive, hard to learn and a pain to work with.

Behaving Machines, Communication and Timing

Mathematicians notwithstanding, a computer is a class of automaton known as a behaving machine. As such, it must not be seen as a function calculator that takes input arguments and returns an answer. It should be seen as belonging to the category of machines that includes brains and neural networks. The proper paradigm for describing the working of such machines is not mathematics but psychology. We should be using terms like stimuli, responses, sensors, effectors, signals, and environment when describing computers. This way, an operation becomes an effect performed by an effector on the environment (data) and a comparison operator becomes a sensor. Once the computer is seen in its proper light, it immediately becomes clear that, like a nervous system, a computer program is really a communication network that senses and effects changes in its environment. Nothing more and nothing less. And, as with all communication systems, the deterministic timing of sensed events (stimuli) and operations (responses) becomes critical. Most of the problems that plague the computer industry (e.g., unreliability and low productivity) stem from the inability to precisely determine the temporal order (concurrency or sequentiality) of all events in the machine. Temporal determinism is a must. This, then, is the biggest problem with the Turing computing model: timing is not an inherent part of the model.
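To make the sensor/effector picture concrete, here is a minimal sketch of the idea in Python. All names (`Environment`, `sensor_too_hot`, `effector_cool_down`, `run`) are hypothetical, invented for illustration; the point is only that the program is a wired network of sensors and effectors driven by a deterministic tick, so the temporal order of every event is fixed by the wiring rather than left to a scheduler.

```python
class Environment:
    """Shared data that the machine senses and effects changes on."""
    def __init__(self):
        self.data = {"temperature": 20}

def sensor_too_hot(env):
    # A comparison operator acting as a sensor: it emits a stimulus (True)
    # when it detects its condition in the environment.
    return env.data["temperature"] > 25

def effector_cool_down(env):
    # An operation acting as an effector: its response changes the environment.
    env.data["temperature"] -= 5

def run(env, wiring, ticks):
    """Advance the network one deterministic step per tick.

    `wiring` is a list of (sensor, [effectors]) pairs. Sensing and effecting
    happen in two phases per tick, so every sensor observes the same snapshot
    of the environment: concurrency vs. sequentiality is decided by the
    wiring, not by a nondeterministic thread scheduler.
    """
    for _ in range(ticks):
        # Phase 1: sense (gather all stimuli against the current state).
        fired = [effectors for sense, effectors in wiring if sense(env)]
        # Phase 2: effect (apply all responses after sensing completes).
        for effectors in fired:
            for effector in effectors:
                effector(env)

env = Environment()
env.data["temperature"] = 40
run(env, [(sensor_too_hot, [effector_cool_down])], ticks=5)
print(env.data["temperature"])  # → 25 (sensor stops firing at the threshold)
```

Because each tick is an explicit, synchronous step, one can state in advance exactly which events are concurrent (same tick) and which are sequential (different ticks) — the temporal determinism the paragraph above argues the Turing model lacks.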

How to Solve the Crisis

We must reinvent the computer. We must turn it from a mathematician's wet dream into a psychologist's wet dream, from a glorified function calculator into a universal behaving machine. And we must do so at the fundamental level. It is time to stop feeding money into the academic black hole for the reason that the academics have failed. It is time to stop encouraging mediocrity among the baby boomer generation who created this mess. It is time for the boomers to gracefully retire and allow a new generation of thinkers to have their turn at the wheel. Industry leaders should simply say to the Turing Machine worshipers, “thank you very much, ladies and gentlemen, for your great service; here's a nice retirement check for all your troubles.” Then they should forget the whole mess and forge a bold new future for computing. And then the next computer revolution will make the first one pale in comparison.

See Also:

How to Construct 100% Bug-Free Software
How to Solve the Parallel Programming Crisis
Parallel Computing: The End of the Turing Madness
Why Parallel Programming Is So Hard
Parallel Computing: Why the Future Is Non-Algorithmic
Half a Century of Crappy Computing
Why Software Is Bad and What We Can Do to Fix It

1 comment:

Hector A. Maquieira said...

Hi, I apologize for being off-topic but I actually wanted to comment about your theory of physical motion.

First of all, I think your combative tone is unnecessary because many of the ideas you talk about ARE embraced by the physics academic community. That the fundamental nature of the universe is discrete is probably the opinion of half of the scientific community. You also talked about the fact that matter is moving at the speed of light along the fourth dimension, and in fact, many peer-reviewed papers do propose that matter is moving at the speed of light through time, aka the fourth dimension! Finally, I would guess that at least 95% of physicists would agree with you that space is a bubbling sea of particles; indeed, that is a basic corollary of quantum theory!

If you look at the orbital model of electrons, it explicitly describes electrons jumping from one orbital to another, instantaneously and with no in-between positions. Indeed, the entire science of astronomy is based on this principle. (cf. the Lyman-alpha line)

Finally, I wanted to address the fundamental question of motion and why Newton was more correct than Aristotle.

Aristotle was wrong because he completely ignored air friction. That is the fundamental reason why, on Earth, things 'naturally' come to rest. Of course, he is right that nothing will move unless first moved, but once something is moved, why can't it keep moving?

The reason it MUST keep moving is to conserve energy. An object in motion carries a measurable amount of energy (K.E. = ½mv²), and for it to stop spontaneously would be to destroy that energy.

Please let me know if you have any issues with what I wrote, but keep in mind that if you defend Aristotle you are defending a man who thought that something weighing twice as much should fall twice as fast.

I hope that you can try to be more open minded towards academia and also recognize that academia itself is quite open minded.