Monday, July 18, 2011

A Fundamental Flaw in Numenta's HTM Design, Part I

Part I, II

Abstract

A couple of days ago, I reread Numenta's latest document (pdf) on HTM. I found it strange that it never mentions a maximum number of nodes for an HTM sequence, because a sequence in Rebel Cortex (RC) has a maximum of seven nodes. I tried and failed to understand why HTM sequences have no maximum size. Then I began to meditate on all the differences between HTM and RC, and that's when I noticed another fundamental one: every level in Numenta's knowledge hierarchy works exactly the same way. This is not true in Rebel Cortex. I explain why below.

Nodes and Sequences of Nodes

There are two types of nodes in RC: bottom level and upper level. A bottom level node (BLN) is a group of concurrent sensory inputs connected to the bottom level sequences of the memory hierarchy; it is similar to what Numenta calls a spatial pattern in HTM. There is no limit to the number of concurrent inputs a BLN can have, and a bottom level sequence can have up to seven BLNs. An upper level node (ULN), by contrast, is nothing but a sequence of up to seven lower-level nodes. In other words, there are no concurrent inputs anywhere above the bottom level of the hierarchy. The sketch below shows roughly what these two structures might look like in code.
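
Here is a minimal sketch of the two node types in Python. The class names, the constant, and the constructor checks are mine, purely for illustration; the RC document does not prescribe any particular implementation.

    # Illustrative sketch of RC's two node types; names are my own.

    MAX_SEQUENCE_LENGTH = 7  # every RC memory sequence is capped at seven nodes

    class BottomLevelNode:
        """A BLN: a group of concurrent sensory inputs, similar to an
        HTM spatial pattern. The number of concurrent inputs is unlimited."""
        def __init__(self, input_ids):
            # All of these inputs are taken to arrive at the same time.
            self.inputs = frozenset(input_ids)

    class UpperLevelNode:
        """A ULN: nothing but an ordered sequence of up to seven
        lower-level nodes. No concurrent inputs above the bottom level."""
        def __init__(self, children):
            if not 1 <= len(children) <= MAX_SEQUENCE_LENGTH:
                raise ValueError("an RC sequence holds one to seven nodes")
            self.children = list(children)  # ordered: a temporal sequence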

Rebel Cortex Memory Hierarchy

The bottom level, level 0, receives its inputs indirectly from the sensory layer by way of the signal separation layer (see the RC document for a description of the SSL). Only the bottom level nodes receive concurrent inputs; everything above level 0 is built purely out of sequences, as the continuation of the sketch below illustrates.
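
Continuing the sketch above, this is roughly how the lower levels might be wired together. The SSL here is reduced to a pass-through stub of my own invention; the real signal separation layer, described in the RC document, does far more than this.

    # Only level 0 sees concurrent inputs; every level above it is
    # built purely out of sequences. Uses the classes defined earlier.

    def signal_separation_layer(sensor_groups):
        # Stub: route each group of concurrent sensory signals to a level 0 BLN.
        return [BottomLevelNode(group) for group in sensor_groups]

    def build_next_level(lower_nodes):
        # Package lower-level nodes into ULN sequences of at most seven.
        return [UpperLevelNode(lower_nodes[i:i + MAX_SEQUENCE_LENGTH])
                for i in range(0, len(lower_nodes), MAX_SEQUENCE_LENGTH)]

    # Example: fourteen groups of concurrent sensor ids.
    raw = [{"s%d" % i, "s%d" % (i + 1)} for i in range(0, 28, 2)]
    level0 = signal_separation_layer(raw)  # concurrent inputs stop here
    level1 = build_next_level(level0)      # two ULNs of seven nodes each

The question that comes to mind is, why only the bottom level? The answer is not immediately obvious; otherwise the super-smart and math-savvy intelligentsia at Numenta and elsewhere would have discovered it. Not that I am any smarter, mind you, not by a long shot; I did not discover it either. So before I reveal the answer to the question I posed, let me explain how I came to know it was the way to go long before I understood the actual reason. Check out what happens when you flip the image below vertically.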

[Figure: the Rebel Cortex memory hierarchy diagram]

What you see above is a faithful representation of the metaphorical vision of the golden lampstand (menorah) seen by the Old Testament prophet Zechariah, whose name means 'Yahweh remembers' in Hebrew. Here is how Zechariah described the symbolic lampstand:
Zechariah 4:2
And he said unto me, What seest thou? And I said, I have seen, and, behold, a candlestick all of gold, with its bowl upon the top of it, and its seven lamps thereon; there are seven pipes to each of the lamps, which are upon the top thereof. (American Standard Version)
The only thing missing from the diagram is the bowl, the meaning of which I will explain in a future post. The design of Zechariah's lampstand is so strange that several translators, including those of the King James Version, mistakenly assumed that he meant to write that he saw seven pipes in all, one for each of the seven lamps, which would describe a normal seven-branch menorah. Indeed, why would every lamp need seven pipes? Nevertheless, that is what Zechariah described.

[Figure: Menorah]

Here is the English Standard Version:
Zechariah 4:2
And he said to me, "What do you see?" I said, "I see, and behold, a lampstand all of gold, with a bowl on the top of it, and seven lamps on it, with seven lips on each of the lamps that are on the top of it."
There are many more translations like these, in many languages, using words like pipes, tubes, channels, or conduits to describe the seven ducts attached to each of the seven lamps. I figured out years ago (around 2002) that Zechariah's vision was a symbolic description of the brain's memory architecture. The clues were unmistakable then, and any lingering doubts I had in the beginning have long since vanished. Now it is mostly a matter of implementation. I am coming out publicly with all of it because I want there to be a record of it, prior art, as intellectual property lawyers would call it. To those who want to corner the market in true artificial intelligence by acquiring a huge patent portfolio, all I can say is this: it will not work.

Coming Up

I think that Numenta's HTM design is fundamentally flawed. In Part II, I will explain why the upper level nodes of the memory hierarchy must not receive concurrent inputs. I will also explain why memory sequences are limited to seven nodes.

See Also:

How Jeff Hawkins Reneged on his Own Principles
Jeff Hawkins Is Close to Something Big
The Rebel Science Speech Recognition Project

1 comment:

Robert said...

The answer is not immediately obvious... see above is a faithful representation of the metaphorical vision... 'Yahweh remembers' ... And he said to me, "What do you see?" I said, "I see, and behold...

Psalm 119:105; Zechariah 1:1-3; John 1:1-3; 8:12; Revelation 2:5; 19:12-16




https://archive.org/details/namestitlesoflor00spea_0

https://archive.org/details/christteacherofm00pitz

https://archive.org/details/parablesoflordje00rich

https://archive.org/details/notesonmiracleslond00tren

https://archive.org/details/tabernacleinsina00rand

https://archive.org/details/cu31924029132284

https://archive.org/details/TheTabernacleThePriesthoodAndTheOfferingsByHenrySoltau