Sorry for the long hiatus. I posted the last installment of this multi-part article way back in March. I have been meaning to write more on the topic, but a series of unfortunate events in my life has slowed me down a bit.
In Part II, I explained how to do pattern learning and how to prevent patterns in the hierarchy from getting bigger than they need to be. In today's post, I explain sequence learning and the organization of sequence memory. However, be advised that there are a couple of things about sequence memory that I want to keep secret for the time being.
A Few Observations
A sequence is a string of consecutive nodes representing successive pattern detections. Sequence memory is organized hierarchically, like a tree. A sequence is divided into seven-node sections, although sections at either end of a sequence may have fewer than seven nodes. These sections are the building blocks of memory. Why seven nodes per section? Because seven items is the capacity of short-term memory. But regardless of its level in the hierarchy, a block is ultimately a sequence of patterns.
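As a minimal sketch of the chunking described above (the function and constant names are my own, not the author's), splitting a detected sequence into seven-node building blocks might look like this. Note that this simple version only allows the trailing section to be short, whereas the article allows short sections at either end:

```python
# Hypothetical sketch: split a sequence of pattern IDs into
# seven-node sections, the building blocks of sequence memory.
SECTION_SIZE = 7  # the capacity of short-term memory, per the article

def split_into_sections(sequence):
    """Split a list of pattern IDs into sections of at most seven nodes."""
    return [sequence[i:i + SECTION_SIZE]
            for i in range(0, len(sequence), SECTION_SIZE)]
```

For example, a ten-pattern sequence would be stored as one full seven-node section followed by a three-node section.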
- A sequence is used to detect a unique transformation in the sensory space which is manifested as a number of consecutive pattern detections.
- A sequence is a recording mechanism. It records a memory trace, that is, the precise timing of its last activation.
- A sequence is a predictive mechanism. The firing of a single node in a sequence is enough to predict the firing of subsequent nodes.
- A sequence can be used for pattern completion and fault tolerance. The firing of a node in the sequence is enough to compensate for missing signals. This is important when working with noisy and imperfect sensory streams.
- Sequences, together with the branch mechanism (see Part IV), are part of the invariant recognition mechanism of an intelligent system.
- A sequence is a sensory motor unit, i.e., an integral part of the goal-oriented behavior mechanism of an intelligent system.
- The temporal interval between any two consecutive pattern signals can vary.
- Some sequences recur less often than others. Indeed, many sequences will occur only once or twice.
- Several sequences can share one or more patterns. This is used to join otherwise unrelated sequences together and is part of the mechanism of invariant recognition.
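The predictive and pattern-completion roles listed above can be sketched in a few lines. This is my own illustrative code, not the author's implementation; the class and method names are assumptions:

```python
class Sequence:
    """Hypothetical sketch: a stored sequence of pattern IDs."""

    def __init__(self, patterns):
        self.patterns = list(patterns)

    def predict_from(self, pattern_id):
        """The firing of a single node predicts the firing of all
        subsequent nodes in the sequence."""
        if pattern_id in self.patterns:
            i = self.patterns.index(pattern_id)
            return self.patterns[i + 1:]
        return []

seq = Sequence(["A", "B", "C", "D"])
# A single detection of "B" predicts the rest of the sequence,
# which can also stand in for missing or noisy signals.
print(seq.predict_from("B"))  # ['C', 'D']
```

The same `predict_from` call doubles as pattern completion: if the signals for "C" and "D" are lost to noise, their predicted firing can compensate.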
This may sound counterintuitive, but patterns (see Part II for a description of patterns) are the key to sequence learning. This is because patterns are inherently predictive. Patterns are distinctive enough that they normally have just a few predecessors and successors; most will have only one of each. This is important because it dictates a crucial aspect of sequence learning: unlike pattern learning, which requires many frequent repetitions, learning a sequence (a predecessor-successor link) needs only one example. In other words, sequence learning can be extremely fast.
Dealing with Imperfection
How can an intelligent system learn a sequence if pattern signals do not always arrive on time? In my opinion, it does not matter that the signals are imperfect as long as they arrive on time some of the time. A single instance of two consecutive signals is sufficient to learn a new sequence. Sequences that lead to a contradiction (I'll explain this in Part IV) or that are not reinforced over time are simply discarded.
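A sketch of this one-shot learning with later pruning, under my own assumptions about the data structures (a link is a predecessor-successor pair with a reinforcement count):

```python
# Hypothetical sketch of one-shot sequence learning with pruning.
# A single instance of two consecutive pattern signals creates a
# link; links that are never reinforced afterwards are discarded.
class SequenceLearner:
    def __init__(self):
        self.links = {}   # (predecessor, successor) -> reinforcement count
        self.prev = None  # last pattern seen

    def observe(self, pattern_id):
        if self.prev is not None:
            pair = (self.prev, pattern_id)
            # One example is enough to create the link...
            self.links[pair] = self.links.get(pair, 0) + 1
        self.prev = pattern_id

    def prune(self, min_count=2):
        # ...but links not reinforced over time are simply discarded.
        self.links = {p: c for p, c in self.links.items() if c >= min_count}

learner = SequenceLearner()
for p in ["A", "B", "C", "A", "B", "X"]:
    learner.observe(p)
learner.prune()
print(sorted(learner.links))  # only ("A", "B"), seen twice, survives
```

Discarding on contradiction (as opposed to lack of reinforcement) is deferred to Part IV and is not modeled here.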
Small Things First
One of the main problems we face in sequence learning is that the interval between any two consecutive pattern signals is variable; it can change with circumstances. For example, the notes of a song can be played fast or slow, but it is still the same song. This is a problem because it makes it almost impossible to determine which pattern precedes which. The solution turns out to be rather simple: the learning system should learn the smallest intervals first, then move on progressively to longer intervals.
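The smallest-intervals-first idea can be sketched with a widening time window. This is my own toy illustration of the principle, not the author's algorithm; the event representation is an assumption:

```python
# Hypothetical sketch: learn sequence links starting with the
# smallest inter-signal intervals, widening the window over time.
def learn_by_interval(events, max_interval):
    """events: list of (timestamp, pattern_id) in time order.
    Only link consecutive pairs within the current interval window."""
    links = set()
    for (t1, p1), (t2, p2) in zip(events, events[1:]):
        if t2 - t1 <= max_interval:
            links.add((p1, p2))
    return links

events = [(0.0, "A"), (0.1, "B"), (1.5, "C"), (1.6, "D")]
# Start small: only tightly spaced pairs are linked.
print(learn_by_interval(events, 0.2))   # {("A", "B"), ("C", "D")}
# Widen the window later to capture the longer B-to-C interval.
print(learn_by_interval(events, 2.0))
```

Starting with the shortest intervals keeps the predecessor-successor assignment unambiguous before longer, more variable gaps are considered.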
In Part IV, I will go over the mechanisms of invariant recognition and short term memory. I will also explain how to catch a liar, i.e., how to detect contradictions in sequence memory.
See also:

- The Holy Grail of Robotics
- Raiders of the Holy Grail
- Jeff Hawkins Is Close to Something Big