Saturday, September 15, 2007

The Age of Crappy Concurrency: Erlang, Tilera, Intel, AMD, IBM, Freescale, etc…

I’ll get right to the point. If your multicore CPU, concurrent programming language, or operating system does not support fine-grain, instruction-level parallelism in a MIMD (multiple instruction, multiple data) environment, it is crap. OK, I know you’re offended and you don’t think your company’s CPU, language, or OS is crap, but it is. And I mean Crap with a capital C. The computer market is looking for fine-grain parallel systems that are fast, secure, easy to program, auto-scalable and bug-free. The crap I see out there does not even come close to delivering what the market wants. Let me twist the knife a little deeper, because I don’t think you people (I’m tempted to say, you idiots :-), but I’ll let it slide) are getting the message. Here is a list of highly desirable things that the market wants right now, things that your crappy products should be supporting but are not.


  • Fast, fine-grain, instruction-level parallelism using MIMD. What is the point of parallelism otherwise? And please, don't tell your customers that it can only be done using SIMD. As I wrote elsewhere on this blog, using SIMD to develop software is like pulling teeth with a crowbar. You don't know how to do it with MIMD because you are a bunch of thread monkeys, that's all. (The first sketch after this list spells out the contrast.)
  • Easy software composition. This means a graphical interface for non-algorithmic programming, among other things. It also means plug-compatible components. Just drag’m and drop’m. No more (text-based) computer languages, por favor! Computer languages are a primitive, awkward, unnecessarily complex and messy legacy from the 20th century. They are a stupid way to program computers. Only computer geeks love computer languages. The market is not about what geeks like. It's about profit, reliability, security and productivity. To give you an idea of what I'm talking about, take a look at this parallel QuickSort using COSA cells. All labels can be created by the designer/developer using a natural language of his/her choosing.
  • Deterministic timing of events at the elementary-operation level. Deterministic temporal order is a must for reliability. It is possible only under a convention whereby all instructions (elementary operations) are parallel reactive processes with equal durations, timed by a virtual system-wide clock: each one executes in exactly one virtual cycle. This is called ‘synchronous reactive computing’ (not to be confused with synchronous messaging). See the lockstep sketch after this list.
  • Implicit parallelism at the design level, and by this I don’t mean compilers that extract parallelism from crappy sequential programs. I mean that the programming environment should use objects that are inherently parallel. Otherwise, you're just fooling yourself into thinking you're doing parallel programming. Implicit parallelism is the natural way to program, and that's the way it should have been from the start, even with single-core CPUs.
  • Explicit sequential order. Either you’re doing parallel programming or you’re not. If you are, then sequences should not be implicit but explicitly specified by the developer. Otherwise, things become hard to organize and understand.
  • Automatic resolution of data dependencies. This eliminates blind code and the otherwise hidden side effects of code modification. It is an essential ingredient for reliability, and it can only be done with a synchronous reactive software model; the lockstep sketch after this list hints at why.
  • Fast asynchronous message passing using shared memory structures. Copying an entire message into a queue à la Erlang is a joke; see the reference-passing sketch after this list.
  • Automatic, transparent scalability. The developer should never have to think about cores. Programs should take advantage of all available cores automatically. (The last sketch after this list shows the principle, if only at a coarse grain.)
  • Impregnable security. This is possible only in a system that enforces deterministic timing.
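
A quick illustration of the SIMD/MIMD contrast from the first bullet. This is my own sketch in Python, with threads standing in for cores purely for illustration; nothing here comes from any vendor's toolchain or from COSA:

```python
import threading

data = [3, 1, 4, 1, 5, 9, 2, 6]

# SIMD style: ONE instruction stream applied to many data items in
# lockstep. Every element gets the same operation; items that need
# different treatment don't fit the model.
simd_result = [x * 2 for x in data]

# MIMD style: MANY independent instruction streams running at once,
# each free to do something entirely different.
results = {}

def square_evens():
    results["evens"] = [x * x for x in data if x % 2 == 0]

def sum_odds():
    results["odds"] = sum(x for x in data if x % 2 == 1)

threads = [threading.Thread(target=square_evens),
           threading.Thread(target=sum_odds)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(simd_result, results)
```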
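
The deterministic-timing bullet is the one that most needs unpacking, so here is a minimal lockstep sketch. To be clear, this is my reading of synchronous reactive computing, not code from the COSA project; the cell names and the two-buffer scheme are mine. Every cell reads the previous tick's state and writes the next tick's state, so every elementary operation takes exactly one virtual cycle and the temporal order is identical on every run. It also hints at automatic dependency resolution: a cell fires on every tick as a reaction to its inputs, so a change propagates with no hidden side effects:

```python
# Each "cell" is an elementary operation that fires in exactly one
# virtual cycle. Two buffers (prev/next) make the outcome independent
# of the order in which the cells are listed.

def make_adder(a, b, out):
    # Cell: out := a + b, computed from the PREVIOUS tick's values.
    def fire(prev, next_):
        next_[out] = prev[a] + prev[b]
    return fire

def make_comparator(a, b, out):
    # Cell: out := 1 if a > b else 0.
    def fire(prev, next_):
        next_[out] = 1 if prev[a] > prev[b] else 0
    return fire

cells = [
    make_adder("x", "y", "sum"),
    make_comparator("sum", "limit", "alarm"),
]

state = {"x": 2, "y": 3, "sum": 0, "limit": 4, "alarm": 0}

for tick in range(3):              # the virtual system-wide clock
    next_state = dict(state)
    for cell in cells:             # conceptually, all cells fire at once
        cell(state, next_state)
    state = next_state
    print(f"tick {tick}: {state}")
```

Note that the alarm responds exactly one virtual cycle after the sum crosses the limit, on every run, on any machine. That repeatability is the whole point.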
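
On the message-passing bullet: the contrast is between copying a payload into the receiver's private heap (which is what Erlang does, by design, for the sake of process isolation) and handing over a reference to a shared structure. A rough sketch of the reference-passing alternative, assuming a single shared address space:

```python
import queue
import threading

mailbox = queue.Queue()

def producer():
    big_message = bytearray(10_000_000)  # ~10 MB payload
    # Only the reference goes into the queue; the 10 MB buffer is
    # never copied. An Erlang-style send would copy the entire
    # payload into the receiver's heap.
    mailbox.put(big_message)

def consumer():
    msg = mailbox.get()                  # receives the very same object
    print(len(msg), "bytes received without copying")

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```

The obvious catch is that shared references invite races, which is precisely why Erlang copies. A deterministic execution model of the kind sketched above is what would make reference passing safe.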
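
Finally, on transparent scalability: today's runtimes can already hide the core count at a coarse grain, which at least demonstrates the principle. The sketch below is nowhere near the instruction-level transparency demanded above, but the developer never mentions a core count; the same program uses two cores on a laptop and sixty-four on a server:

```python
from concurrent.futures import ProcessPoolExecutor

def work(n):
    return n * n

if __name__ == "__main__":
    # max_workers defaults to the machine's core count, so the degree
    # of parallelism is chosen by the runtime, not the developer.
    with ProcessPoolExecutor() as pool:
        print(list(pool.map(work, range(10))))
```
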
I could go on, but that’s enough for now. What is my point? My point is that you people in the computer industry who insist on putting out one crappy product after another have no excuse. And please, don’t say anything about customers wanting to preserve their legacy tools and software. It’s bullshit. If you gave your customers the things I list above, they would switch as fast as they could. Why? Because it’s not crap, that’s why. It would save them a shitload of money and eliminate the headaches. It would let them develop systems of arbitrary complexity without having to worry about unreliability and insecurity. We would finally be able to implement truly automated transportation systems (no, you don’t need AI for this) and eliminate the need for human drivers on the road, thereby saving 40,000 lives a year in the US alone! And don’t tell me it can’t be done, because I know otherwise.

The public wants cars, trains and buses that drive themselves, and planes that fly themselves. They want automated air traffic control that never fails. Your crappy products are killing people out there, dammit, by default and otherwise! You have no excuses! If you don’t get off your sorry asses :-) right now and investigate the soundness of the COSA software model, you should all be prosecuted for gross negligence. I mean it.

PS. There is one cool thing that happened in the industry recently, and that's Jeff Han's multi-touch screen technology. You people should take a good look at Jeff's stuff and study that user interface closely, because fast, touch 'n drag composition is part of the future of parallel programming.

See Also:

Erlang Is Not the Solution
Nightmare on Core Street

Parallel Programming, Math and the Curse of the Algorithm
Half a Century of Crappy Computing
Parallel Computers and the Algorithm: Square Peg vs. Round Hole

8 comments:

Anonymous said...

Congratulations on your new fab! So, when are you going to start churning out CPUs that meet the spec? I can't wait. I can't believe you're going to do this AND develop the operating systems and programming languages to go with it!! Thank you so much, you're a superman!

Louis Savain said...

Anonymous wrote: I can't believe you're going to do this AND develop the operating systems and programming languages to go with it!!

I normally trash anonymous comments but yours merits a reply. You got it backwards. The OS and development tools (the software model) must be in place first and all the kinks ironed out before the CPU can be designed and fabbed. The fabs that can make the chips already exist. A fabless company can design the CPU. It's not easy but it can all be done in less than two years with the right people and resources.

Ian said...

And how are you going to write the OS and development tools? Without the CPU, you'll need to emulate it, and it'll have to be on a traditional, sequential CPU programmed with an algorithmic language. Doesn't that at least show that older languages are powerful enough to emulate "better" languages?

Your model is interesting, but smack-talking so-called algorithmic languages is just controversy for controversy's sake. Get those right people and resources and show us how it's done.

Louis Savain said...

Ian wrote, "And how are you going to write the OS and development tools? Without the CPU, you'll need to emulate it, and it'll have to be on a traditional, sequential CPU programmed with an algorithmic language. Doesn't that at least show that older languages are powerful enough to emulate 'better' languages?"

This is like asking a prisoner, "Why do you eat prison food if you believe you're innocent?" Look, I'm the first to acknowledge the power and usefulness of the algorithm for solving all sorts of problems. The problem is that the algorithm is a hopelessly flawed method of software construction. Then again, non-algorithmic software is not entirely alien to algorithmic software; they are close cousins, since they have some things in common.

"Your model is interesting, but smack-talking so-called algorithmic languages is just controversy for controversy's sake. Get those right people and resources and show us how it's done."

Well, there is controversy all right, but only because the old ways are deeply entrenched and some people feel genuinely threatened by these ideas (I don't blame them). It's like religion. But hey, send me your resume. I'm looking for experienced hardware and software managers, a CEO, a CFO, etc... With the right people on the team to back me up, I could easily convince some venture capitalists to fork over some substantial money for a startup.

Joe Ardent said...

I notice that your COSA quicksort is not even finished. You've had many years to build your software creation system, and yet there is no functioning model. There are not even "wishful" models, i.e., a library of COSA components that *would* run if a COSA interpreter were available. This is because creating software, even in COSA, with graphics and whatnot, is still programming, and programming is hard.

I point this out not to say, "What you are proposing is impossible," as it certainly is not impossible. In fact, what I wish to point out is that the COSA system you've described would not be too difficult to create, and even if it were strictly single-processor, non-shared-memory, it could still be used to demonstrate your ideas very effectively.

But really, there is already no shortage of reactive, concurrent, data-aware systems; search the internet for "reactive concurrent system" or "dataflow language". Many even feature graphical composition of software. Then there are the ideas and work of Alan Kay, who is busy creating more systems like the one you describe; check out his NSF proposal from this page.

So, there you go. There is no vast conspiracy intended to keep "programming" to the realm of the few. What there is is a huge base of code, a culture, and a propensity in most people to devote their scarce cognitive resources to things that interest them more than computers (everyone, including programmers, has scarce cognitive resources, and they spend them on things that interest them).

Really, I don't mean to be snarky. There are plenty of examples of systems that could be called "COSA" (just as there are plenty of examples of systems that could be called "Lisp"), already out there in the world, thereby demonstrating that COSA is not an unrealizable fantasy. As well, there's plenty of improvement to be made in the way that we create software, in terms of having computers assist us as we create it. But for you, it's long past the time to put up or shut up, if you value your integrity.

truenorth said...

Wow!
Wow!, Wow!!
Wow!,Wow!!, Wow!!
Million times WOW!!!!!!!!!

I mean, really, what a refreshing article. I truly wish you the best of luck and sincerely hope those ancient CPU manufacturers, and their equally antique computer-language programmers, take your advice and develop the spectacular machines you have so cleverly described.

Anyway, thanks a lot again, and PLEASE keep up the good work, which I believe will lead to a better world for us all to live in.

Voxel said...

Those money-milking MOTHERFUCKERS!!11 >:@§!1111

Arcadia said...

Bravo to this part: " Just drag’m and drop’m. No more (text-based) computer languages, por favor! Computer languages are a primitive, awkward, unnecessarily complex and messy legacy from the 20th century. They are a stupid way to program computers. Only computer geeks love computer languages. The market is not about what geeks like. It's about profit, reliability, security and productivity." How much of my life have I wasted on them! I've enjoyed it, but it never had to be like that.