Sunday, June 15, 2008

Parallel Computing: The End of the Turing Madness. Part I

Part II

Hail, Turing

Alan Turing is, without a doubt, the most acclaimed of all computer scientists. He is considered to be the father of modern computer science. Soon after his untimely death in 1954 at the age of 41, the computer science community elevated him to the status of a god. Books are written, movies are made and statues are erected in his memory. Turing is so revered that the most coveted prize in computer science, the A. M. Turing Award, was named after him. It is worth noting that Turing’s sexual inclinations and religious convictions did little to diminish his notoriety. Homosexual and atheist movements around the world have wasted no time in transforming the man into a martyr and the sad history of his life into a veritable cause célèbre. The end result is that nothing negative may be said about Turing. It is all right to talk about the Von Neumann bottleneck but mentioning that the bottleneck was already part and parcel of the Turing machine is taboo. The adulation of Turing is such that criticizing him or his ideas is guaranteed to deal a deathblow to the career of any scientist who would be so foolish.

Unabashedly Bashing Turing and Loving It

It is a good thing that I don’t lose any sleep over my reputation in either the scientific community or the computer industry, otherwise I would be in a world of hurt. I am free to bash Turing and his supporters to my heart’s content. I am free to point out that the emperor is buck-naked and ugly even while everyone around me is fawning over his non-existent clothes. As someone with a strong iconoclastic bent, I admit that I enjoy it. Free speech is a wonderful thing. I have argued forcefully and unabashedly in the past that academia’s strange and pathological obsession with Turing is the primary cause of the software reliability and productivity crisis. I now argue even more forcefully that the parallel programming crisis can be directly traced to our having swallowed Turing’s erroneous ideas on computing, hook, line and sinker. Had the computer science community adopted the correct computing model from the start, there would be no crisis and you would not be reading this article and getting offended by my irreverent treatment of Mr. Turing. In fairness to Turing, my criticism is directed mostly toward those who have turned the man into the infallible god that he never was and never claimed to be.

The Not So Universal UTM

Unfortunately for Turing’s admirers, for many years now, the laws of physics have been quietly conspiring against their hero. Sequential processors have reached the limits of their performance due to heat dissipation problems. The computer industry is thus forced to embrace parallel processing as the only way out of its predicament. Parallelism is a good thing but it turns out that programming parallel processors is much easier said than done. In fact, it is such a pain that the big players in the multicore industry are spending hundreds of millions of dollars in research labs around the world in the hope of finding a viable solution. They have been at it for decades without any hint of success in sight. Worse, the leaders of the industry, out of sheer desperation, are now turning to the very people who got everybody into this mess in the first place, none other than the Turing worshippers in academia. It would be funny if it weren’t so pathetic.

Consider that Turing’s ideas on computing, especially the Universal Turing machine (UTM), were strongly influenced by his love of mathematics. He was, first and foremost, a mathematician and, like Babbage and Lady Ada a century before him, he saw the computer primarily as a tool for solving serial math problems. It is highly unlikely that he ever thought of the concept of parallel processing or the idea that a computer might be used for anything other than problem solving and the execution of instruction sequences. The time has come for the computer industry to realize that the UTM is the quintessential sequential computer and, as such, it is ill suited as a model for parallel computing. It is time to devise a truly universal computing machine, an anti-UTM machine if you will, one that can handle both sequential and concurrent processing with equal ease.

Smashing the Turing Machine

The Turing machine cannot be said to be universal because it is a strictly sequential machine by definition whereas the universe is inherently parallel. It is certainly possible to use a UTM to emulate parallel processes. Programmers have been emulating parallelism in such applications as neural networks, video games and cellular automata for decades. However, the emulation is not part of the Turing computing model. It is an ad hoc abstraction that a programmer can use to pretend that he or she is performing parallel processing even though the underlying model is sequential. Programmers get away with the pretense because processors are extremely fast. Slow the processor down to a few hundred Hertz and the illusion of parallelism disappears.
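The double-buffering trick behind such emulation is easy to show concretely. Below is a minimal sketch in Python (the rule and cell values are purely illustrative): conceptually, every cell of a cellular automaton updates at the same instant, but the sequential loop merely fakes simultaneity by computing the entire next generation from a frozen snapshot of the current one.

```python
def step(cells):
    """One generation of a toy 1-D cellular automaton (Rule 90:
    each cell becomes the XOR of its two neighbors).

    The loop below runs strictly sequentially, one cell at a time,
    yet it pretends all cells update simultaneously by reading only
    from the old generation and writing only to a fresh buffer."""
    n = len(cells)
    nxt = [0] * n  # second buffer: never read during this pass
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0       # fixed zero boundary
        right = cells[i + 1] if i < n - 1 else 0  # fixed zero boundary
        nxt[i] = left ^ right
    return nxt

gen = [0, 0, 1, 0, 0]
gen = step(gen)  # -> [0, 1, 0, 1, 0]
```

The parallelism here is an abstraction maintained by the programmer's discipline (never read the buffer being written), not something the underlying sequential model provides or enforces.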

One can always argue that having two or more Turing machines running side by side is an example of parallel computation. However, two Turing machines running independently cannot be construed as representing one Universal Turing machine, as defined by Turing. Parallelism is not synonymous with temporal independence. On the contrary, a truly parallel system is one in which all events can be unambiguously determined as being either concurrent or sequential. Even if one could extend the definition of the UTM to include multiple parallel machines, deciding which operations are performed concurrently requires even more ad hoc additions such as a communication channel between the two machines. Alternatively, one can imagine a Turing like machine with an infinite number of read/write heads on a single infinitely long tape. Still, the Turing model does not provide a deterministic mechanism for the detection of either concurrency or sequentiality and the problem becomes worse with the addition of multiple heads. There is only one way to solve the parallel programming crisis. We must smash the un-universal Turing machine and invent a truly universal alternative.

In Part II of this two-part article, I will describe what I mean by a true universal computing machine and do a direct comparison with the UTM.

See Also:

Jeff Raskin - Computers Are Not Turing Machines (pdf)
Half a Century of Crappy Computing
How to Solve the Parallel Programming Crisis
Parallel Computing: Why the Future Is Non-Algorithmic


Unknown said...

I say you should cut the poor guy some slack. Until when was he alive?
1954. He himself can't be blamed for not inventing a solution for today's problems. (Indeed, some people are to be blamed for the fact that we're STILL using bloated sequential processors.) It's like blaming the inventors of the gas lamp because they couldn't make a flashlight. You wouldn't expect the GUI to have been invented before the command line, yet the inventors of the CLI shouldn't be blamed. Before evolution or revolution can occur, initial steps must be made.
Besides, had it not been for Turing's work on computability, we would still be trying to make miracle machines that could solve everything, wouldn't we?

Louis Savain said...

I say you should cut the poor guy some slack.

No way. I refuse to give Turing any slack. I can forgive Babbage and Lady Ada for not thinking too far ahead, but Turing has no excuse. Boolean algebra and De Morgan's theorems were well-known subjects in Turing's day. The work of Spanish neurobiology pioneer Ramón y Cajal was also well known in academia. All these things pointed to the importance of parallelism in computation. Heck, the universe around us is a perfect example of parallel processes producing both concurrent and sequential events.

Turing came out with nothing really new on the computer scene that Babbage had not already invented a century before him: a sequential calculating machine. His work on computability is utterly useless as far as the major problems of the computer industry are concerned. If it weren't, we would not be in the sorry mess that we are in right now.

Having said that, I reserve my greatest criticism, not for Turing, but for the hordes of sycophants in the academic community who created a stupid pathological cult around the man. Had it not been for this undue infatuation with the Turing machine, we would have had an inherently parallel and deterministic computing model fifty years ago, even with single-core processors. Software unreliability would have been non-existent and the transition from single-core to multicore processors would have been a mere evolution, a hardware engineering problem, not the painful paradigm-changing revolution that it is turning out to be. As always, I tell it like I see it.

Unknown said...

Sorry, but science is all about hypothesizing, seeing where the hypothesis takes you, and then adjusting accordingly. It is awfully arrogant to look back almost a century, and say "hey, that guy should have known what many people still don't see as obvious truth nowadays." It's kind of like saying Aristotle should have made Newton's discoveries, or Newton should have seen the "obvious" signs of relativity (which was primarily discovered through thought experiments, right?)...

Sorry that science made a wrong turn, but it always does. You cannot blame the scientist, or even his followers, for giving imperfect information, or overlooking what seems to you to be obvious over half a century later. For as problematic as sequential processing might be, we have done a hell of a lot with it.

You can say that Turing offered nothing beyond the ideas of Babbage all you want, and we will never see a world in which Turing didn't exist, or in which a "Parallel Turing" did exist, or even a world in which Turing was ignored... But it doesn't make sense to speculate that Turing did not help the advancement of computational technology. After all, if parallelism were so obvious 50 years ago, someone else would have been there advocating it.

You're not a kook, but you surely are arrogant and hungry for attention. The ideas which you advocate could easily be presented without being wrapped up in barbed wire. It sounds like you just have a personal bone to pick with the academic community, so you try to wrap it up in a package of character assassination. I suppose it works as a ploy for getting some attention paid to your ideas, but it reeks of immaturity and amateurism. If you really want your ideas taken seriously (in any context, not just academic), then you should clean up your act and stop acting like a wailing baby.

Vitrolos said...

I never cared much for Turing's work, other than as a curious academic exercise. However SISD (Von Neumann) machines have been around for a while, and your arguments don't convince me that parallel machines would have avoided the software quality problems. The cause of poor software is poor programming by poor programmers - in short, people. Anyways, I always value new ideas and fresh views. In my eyes, the breakthroughs come from eccentrics working alone, not hordes of mediocre people.

Louis Savain said...

Yo, Rob.

Why are you reading my stuff? You are obviously a Turing worshipper and/or a wannabe academic. You took offence, didn't you? Good. Just stay off my blog, man. It's not meant for you.

Vitrolos said...

The problem is that nobody gives any money (vulture capitalists at least) without patents or working prototypes. What about putting some of these ideas onto real hardware, say a $3-4k PC-based FPGA board? Just my 2 cents. Regards, h

Louis Savain said...


Deterministic timing is the key to software quality. It is absolutely crucial to be able to determine and predict the temporal relationships between all events in a program. This is something that Turing and his fan club never thought about since the Turing machine is strictly about what is mathematically computable, as opposed to how to implement deterministic reactive behavior in a machine.

It just so happens that, in order to enforce deterministic timing, one needs a computing model that handles both concurrent and sequential events at the fundamental level. This is why a parallel computing model is necessary. In addition, some type of timing mechanism or clock (real or virtual) must be an integral part of the model.
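To make the idea concrete, here is a toy sketch in Python of what a tick-based reactive model might look like. This is my own illustrative construction, not COSA or any existing system; the class name, method names and event names are all invented for the example. A virtual clock advances in discrete ticks; reactions scheduled for the same tick are concurrent by construction, reactions at different ticks are sequential, and the resulting trace of events is fully deterministic.

```python
from collections import defaultdict

class TickScheduler:
    """Toy synchronous model: a virtual clock advances in discrete ticks.
    All reactions scheduled for the same tick are treated as concurrent;
    reactions at different ticks are sequential. The same inputs always
    produce the same tick-by-tick trace, so timing is deterministic."""

    def __init__(self):
        self.queue = defaultdict(list)  # tick -> list of (name, action)
        self.now = 0
        self.trace = []  # (tick, name) pairs, in firing order

    def at(self, tick, name, action):
        """Schedule a named reaction to fire at a future tick."""
        self.queue[tick].append((name, action))

    def run(self, until):
        while self.now <= until:
            # Everything scheduled for this tick fires against the same
            # value of `now` -- concurrent by construction. Reactions may
            # schedule new reactions at later ticks (sequential events).
            for name, action in self.queue.pop(self.now, []):
                self.trace.append((self.now, name))
                action(self)
            self.now += 1

sched = TickScheduler()
sched.at(0, "sensor", lambda s: s.at(s.now + 2, "effector", lambda s: None))
sched.at(0, "logger", lambda s: None)
sched.run(until=3)
# sched.trace == [(0, "sensor"), (0, "logger"), (2, "effector")]
# "sensor" and "logger" are unambiguously concurrent (same tick);
# "effector" is unambiguously sequential to both (later tick).
```

The point of the sketch is only that concurrency and sequentiality are decidable properties of the model itself (same tick vs. different tick), rather than accidents of how fast a sequential processor happens to interleave things.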

The math nerds have shot computing in the foot big time. It is time to rescue the field from their control. Computing is not about solving math problems. It is about behavior. Solving math problems is just one of many possible types of behavior.

Unknown said...

I'm sure you wish I took offense. On the contrary, you gave me a good laugh.

Quit being so paranoid. Not everyone is either with you or against you. If you would stop forcing that worldview, you would realize that most people are a little from column A and a little from column B.

Louis Savain said...


Are you kidding me man? If anything, I must be the only non-paranoid blogger left on the web. I always tell it like I see it, regardless of the consequences. Lots of people in the academic community have been badmouthing me for years.

I am still here and I haven't changed my tune in the least. I am still in their faces and telling them that they're full of shit. I give them a taste of their own medicine. Guess who is more paranoid?

The way I see it, you're either with me or you're not. I can't stand lukewarm people. So make your choice because this blog is a dictatorship and I am liable to delete your comments just for fun. You're lucky I let you post anonymously.

Unknown said...

Hi Louis,

You might be interested in:

Why interaction is more powerful than algorithms


The Interactive Nature of Computing: Refuting the Strong Church-Turing Thesis

Louis Savain said...


What can I say? This is a breath of fresh air. You got major huevos, amigo. Wow! I have always said that what the computer industry and the computer science community need more than anything is fewer ass kissers and more people with backbones.

You and I are fighting the same battle. You're on the inside while I'm on the outside. Being on the inside, you must tread carefully though, otherwise you will be forcibly ejected or worse. Besides, you're taking a risk by publicly posting a comment on my blog. Bravo! I will give your papers a careful reading and, with your permission, I would like to post a critique in a future article. And thanks for bringing your work to my attention.

Louis Savain said...


Apparently, very few people have been listening to you and Dina Goldin because, judging from the publication dates on your home page, it seems like you've been ringing the non-algorithmic/reactive bell since the mid 90s. If the industry had listened, there would be no parallel programming crisis. I, too, have been fighting this battle for a long time (see The COSA Saga). It is not an easy thing to shift scientific paradigms, especially when you are contradicting what many consider to be infallible doctrine. The good (or bad, depending on how you want to look at it) thing is that the industry has a nasty problem on their hands that can only be solved by adopting a non-algorithmic reactive computing model.

It would be nice if you, Dina and I could somehow cooperate in a business venture. There is a fortune to be made by incorporating the reactive/parallel paradigm into the design of a multicore processor and showing the industry how it should be done. But I am afraid that any association with me would bring you nothing but hardship. There are a few people in the academic community who hate me with a passion. LOL. Hell, it's good to fantasize, once in a while.

PS. Your contention that the Church-Turing thesis is a myth is an eye opener. It seems that some people have been guilty of putting words in Turing's mouth. That's fifty long years of crappy computing! Wow! Those folks owe the world an apology, in my opinion. And, if you are right, I will apologize to Turing for assuming that what has been said in his name is true.

Unknown said...

> I am still here and I haven't changed
> my tune in the least. I am still in
> their faces and telling them that
> they're full of shit.

And that is the core of your credibility problem. I've been watching you for quite a while now, off and on. I like iconoclasts and I like people who want to smash the way the world makes software now because, frankly, software sucks so hard it could remove matter from black holes.

But all you're doing is talking shit. You're not producing shit. Instead of telling people they're wrong, why don't you prove that they're wrong?

Lots of people have pointed you in the direction you have to take to do that proof: build a proof-of-concept processor in a cheap FPGA that shows your concepts at work. You've not done this yet. The same part of me that likes iconoclasts because they question things starts its own line of questioning when I note this. The question boils down to "why is this asshat spending so much time talking and so little time doing?".

So here's the bottom line: shit or get off the pot. Your going from forum to forum ranting against everything that isn't you gets tiresome when your grandiose claims have absolutely, positively zero backing them up. Do the work and show the world or just finally shut the fuck up.

Countdown to being accused of Turing worship and of being an academic running-dog starts now. Ten... nine... eight... seven...

Louis Savain said...

Eat shit, Richter. If you write me a check for a few million dollars, I'll build a prototype COSA multicore processor and a set of dev tools. Until then, you can kiss my ass, man. How about that? And if you don't like the way I promote my ideas on the web, that's tough shit, isn't it?

I'm gonna continue to do it and there's nothing you can do about it, is there? The last time I checked, there is still freedom of expression on the internet in most of the world.

PS. Don't bother responding. You've got nothing constructive to contribute to the parallel programming problem that I can see.

VicDiesel said...

And still I'd say Turing deserves some slack. He was not designing a computer: he was settling the purely mathematical Entscheidungsproblem. The Turing Machine was a hypothetical device, call it a "thought experiment", to resolve a theoretical question regarding computability.

Unknown said...

Interesting discussion. In the English-speaking world, Turing and von Neumann are well known. Kind of heroes.

Have you ever heard of Konrad Zuse? He built the first freely programmable computer. The program was stored on punched movie tape. The hardware was based around relays.

US and UK engineers came up with their computers years later.

And guess what, all of these machines were ... sequential.

And why is this so? Because it was already difficult enough in those days to build a computer at all. Demanding that Mr. Zuse build a parallel machine is ... a joke (he built the machine during wartime, when the supply of relays was short).

Since then, we have seen an evolution of ... sequential machines. They evolved so fast because they are much easier to design and easier to program than parallel machines. If parallel machines were easy to design and program, we would already have them.

It is not a matter of 'these guys are all idiots'; it's all about the complexity people can handle.

The Wright brothers did not come up with a Boeing 747, right?

Shiva Kaul said...

You seem to think that Turing machines entrenched sequential computing. That doesn't make any sense mathematically, so perhaps you are referring to some social or psychological legacy of sequential thinking. That legacy exists, but it has nothing to do with Turing machines, which are low-level constructs that even theoretical computer scientists rarely appeal to. You may wish to investigate e.g. circuit complexity and Nick's Class before embarrassing yourself with a bogus claim about how existing models of computation can't accommodate parallelism.

Louis Savain said...

Shiva Kaul wrote:
You seem to think that Turing machines entrenched sequential computing. That doesn't make any sense mathematically, so perhaps you are referring to some social or psychological legacy of sequential thinking. That legacy exists, but it has nothing to do with Turing machines, which are low-level constructs that even theoretical computer scientists rarely appeal to. You may wish to investigate e.g. circuit complexity and Nick's Class before embarrassing yourself with a bogus claim about how existing models of computation can't accommodate parallelism.

Wow. If a Turing machine model is a model of parallel computing, then I got a bridge to sell you, because the TM is a sequential computer by definition. Whether or not one can simulate a parallel computer on a TM is irrelevant to parallel computing for the same reason that the ability to simulate a tic-tac-toe machine does not say squat about tic-tac-toe. A true parallel computer, on the other hand, does not have to simulate a TM. It can emulate it directly.

At any rate, if the TM was the solution to the parallel programming crisis, you would not be reading this comment. So, obviously it isn't. In fact, the TM is the cause of the crisis.

Kynszch Didales said...

They say the Internet is full of rubbish and I must say, from what I've read here, I'd have to agree.

You are confusing theory with practice. The practical limits of serial devices have nothing to do with what can in theory be computed by a TM with an infinitely long tape.

So bash away; your blows fall on thin air.

On the subject of parallelism, there is much to be found out.

But you haven't contributed anything useful here.

It would be useful to the collective if you were to delete your blog, so that the parallel channels of internet communication aren't clogged by messages such as this one.