Sunday, February 21, 2010

Computer Scientists Suck

Rebelling Against Chicken Shit Science

I feel the same toward computer scientists as I do toward physicists who espouse such absurdities as continuity, time travel, and the like. Theirs is what I call a chicken shit science. “Well, you would not be using your computer to post articles to your blog on the internet if it weren't for us,” complains the nearest computer scientist. And, lo and behold, I agree. Instead, I would be sitting by the pool smoking a fine cigar while my robotic chef prepares fresh sashimi in the kitchen and my mechanical secretary fetches me a good bottle of sake or some fine wine from the cellar. Come to think of it, this is what really pisses me off about computer scientists. They have deprived, not just me, but the entire world of the good life that we should be enjoying right now. There would be no parallel programming crisis had those scientists really understood what the hell they were doing all those years.

The Incompetence of the Labs

As you can tell, I am in what you might call full rebellion mode. I am rebelling against the computer scientists at the UC Berkeley Parallel Computing Lab, the Stanford Pervasive Parallelism Lab and the Universal Parallel Computing Research Center of the University of Illinois at Urbana-Champaign. Those state-of-the-art labs pocketed millions of dollars almost two years ago to come up with a solution to the parallel programming crisis. If you thought that, with all those brilliant savants working on the problem, they would have something interesting and novel to show by now, you would be wrong. They have zilch, nada, zero, not even the promise of a breakthrough. And those three labs are just the tip of the iceberg of incompetence. Billions of dollars are being wasted right now in parallel computing research labs around the world. That's not counting the untold billions of dollars computer scientists spent over the last forty years working on the problem to no avail. They are as clueless now about what the solution might be as they were when they first started thinking about the problem back in the sixties and seventies!

Chicken Shit to the Extreme

What really frustrates me is that every one of the labs I mentioned above has visited my blog and the Rebel Science site countless times, along with IBM, AMD, Intel, Nvidia, Motorola, and many other companies and universities around the world. They are all familiar with my ideas on parallel programming. Not once did any of those folks bother to contact me to discuss these ideas. But it gets worse. Some are now trying to surreptitiously claim my (and other people's) ideas as their own. The University of Illinois's Josep Torrellas is a case in point. As most of my readers know, I have been calling for a solution to the parallel programming problem that involves using buffers to hold instructions to be processed in parallel. While one buffer is being processed, another is filled with instructions to be processed in the next cycle. Guess what? Not too long ago, Torrellas and a few other computer scientists published a paper titled The Bulk Multicore Architecture for Improved Programmability (pdf). Torrellas thinks he can get away with calling instruction buffers “chunks” and buffer processing “bulk processing” even though buffer processing is a well-known parallelization technique. Here is an excerpt from the December 18 UIUC announcement:
In the Bulk Multicore Architecture, the default execution mode of a processor is to commit chunks of instructions at a time. Torrellas explains, “Such a chunked mode of execution and commit is a hardware-only mechanism, invisible to the software running on the processor. Moreover, its purpose is not to parallelize a thread, but to improve programmability and performance.”
And here's what I wrote in my parallel programming blog article in July 2008:
The solution to the parallel programming problem is to do away with threads altogether. Threads are evil. There is a way to design and program a parallel computer that is 100% threadless. It is based on a method that has been around for decades. Programmers have been using it to simulate parallelism in such apps as neural networks, cellular automata, simulations, video games and even VHDL. Essentially, it requires two buffers and an endless loop. While the parallel objects in one buffer are being processed, the other buffer is filled with the objects to be processed in the next cycle. At the end of the cycle, the buffers are swapped and the cycle begins anew.
I then went on to explain in the next paragraph that the use of instruction buffers within the processor is completely transparent to the program, which is exactly what Torrellas is claiming. Even though I wrote my blog article in July of 2008, these are ideas that I have been promoting on the internet since the 1990s. I used to call the instruction buffers “update lists” in those days. My point is, you can call them buffers, chunks, or lists, but it's all the same thing: they contain independent instructions that can be executed in any order but preferably in parallel. And besides, I don't claim to have invented the method, as it is commonly used in video games, neural networks and cellular automata to simulate parallelism, albeit at a coarse-grain level. However, I think I may have been the first to propose their use in parallel processors at the instruction level.
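For readers who have never seen the technique in action, here is a minimal sketch in Python of the two-buffer approach applied to a toy one-dimensional cellular automaton. Everything in it (the names WORLD_SIZE and step_cell, the update rule, the neighbor scheduling) is my own illustrative choice for this example, not anything taken from the Torrellas paper or anyone else's code:

# A minimal sketch of the two-buffer "update list" technique:
# while the items in one buffer are processed, the other buffer
# collects the items to be processed in the next cycle.

WORLD_SIZE = 16

world = [0] * WORLD_SIZE
seed = WORLD_SIZE // 2
world[seed] = 1  # one live cell to start

# 'current' is the update list for this cycle; 'next_buf' collects
# the cells to be processed in the next cycle.
current = {seed - 1, seed, seed + 1}
next_buf = set()

def step_cell(i, snapshot):
    # Toy rule: a dead cell with exactly one live neighbor comes alive.
    left = snapshot[(i - 1) % WORLD_SIZE]
    right = snapshot[(i + 1) % WORLD_SIZE]
    return 1 if snapshot[i] == 0 and left + right == 1 else snapshot[i]

for cycle in range(5):
    snapshot = world[:]  # freeze this cycle's state; every cell reads
                         # only from the snapshot, never from 'world'
    for i in current:
        # The entries in the buffer are mutually independent, so a
        # parallel machine could execute all of them simultaneously.
        new_state = step_cell(i, snapshot)
        if new_state != snapshot[i]:
            world[i] = new_state
            # Schedule the changed cell and its neighbors for next cycle.
            next_buf.update({(i - 1) % WORLD_SIZE, i, (i + 1) % WORLD_SIZE})
    current, next_buf = next_buf, set()  # swap buffers; the cycle begins anew
    print(cycle, world)

Notice that the swap at the end of each cycle is the whole trick: because every item reads from the frozen snapshot and only the next cycle's buffer is written, the items can be dispatched in any order, or all at once in parallel.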

Torrellas (et al.) and UIUC Owe Everybody an Apology

So how do Torrellas and UIUC get away with pretending that they're introducing something novel? After two years and millions of dollars, the brilliant scientists at UIUC figured out that it is best to process parallel instructions in batches? Haysoos Martinez! All they had to do was ask me and I would have given them the answer for free. The truth is that Torrellas and the other members of the parallel computing lab at UIUC probably read my blog or my web site and decided to use deceptive phrases like “chunk processing” to hide where they got their ideas. This is chicken shit to the extreme. Torrellas and his co-authors are fakes who are claiming other people's ideas on parallel processing as their own. They and UIUC owe the programming community and me an apology.

Ass-Kissing Cult

Needless to say, I am not feeling very forgiving at the moment. The computer science community invented the parallel programming crisis. They did it by creating a cult around a man named Alan Turing. The Turing machine is a purely sequential machine. It has absolutely nothing to do with providing a solution to the parallel programming crisis (see Parallel Computing: The End of the Turing Madness). I have known since 1980 that the way we build and program our computers is fundamentally flawed. It was flawed from the start. Why? Because computer scientists were busy kissing Turing's ass instead of doing what they were paid to do. Kissing ass is the norm in academia. Peer review enforces it. They're still at it to this day.

Rebel Now or Live in Mediocrity

In my opinion, the only way to liberate computing from the grip of the ass-kissing cult that currently controls it is to rebel against them. There is a dire need for a grassroots rebellion amongst the younger generation of programmers, chip engineers and managers in the computer industry. It's time to tell the baby boomers (the Turing machine worshipers) who gave us this mess to eat shit. It's time to tell them to move the hell out of the way so the rest of us can fix this crisis and put computing on the right track. We are not going to take it anymore. So my advice to my readers is: rebel now or resign yourselves to living in mediocrity.

Next: Half a Century of Crappy Computing.

See Also:

How to Solve the Parallel Programming Crisis
UC Berkeley's Edward Lee: A Breath of Fresh Air
Parallel Computing: The End of the Turing Madness

1 comment:

Tango said...

The first question I ask those self-deception-filled minds is: who says the computer is a science???
The computer is the next generation of the calculator. In over 50 years, no real improvement has happened in computers. The computer is still a calculator, with just one feature that calculators don't have. Just one.
The conditional expression, if-else. That is the secret behind the success of the computer.
Computer Scientists SUCK